Let’s assume that MCP becomes widely adopted within the next few years.
That’s a big “if”! The Model Context Protocol is still early, with most usage confined to developers and early adopters. Competing standards exist. But let’s say it breaks through and becomes the underlying fabric for how tools connect to AI.
If that happens, who stands to gain the most? And who gets left behind?
The Winners 👑
Model Providers (OpenAI, Anthropic, etc.)
If MCP succeeds, model providers win by default.
The Model Context Protocol makes LLMs more useful by adding more context. More context means more tokens. For LLM providers, more tokens means more revenue. And because MCP unlocks access to real-world tools, it expands what LLMs can do, making them even more central to workflows.
Some of the harshest critics of MCP argue that this is why companies like Anthropic have been quick to push for MCP adoption even while the protocol itself is still evolving. Model providers benefit from more usage, regardless of whether or not the protocol is ready for the attention.
LLM Clients and Agent Hosts (Claude Desktop, Cursor, Windsurf, etc.)
LLM Clients are arguably the biggest winners in an MCP-forward world.
LLM Clients and Model Providers (like Anthropic’s Claude) are often one and the same, so they are doubly incentivized to see MCP succeed.
MCP turns the chat interface into the aggregation layer of the AI era. LLM clients become the central conduit between users and relevant information, data, and MCP tools.

As a result, LLM clients have the power to decide:
What tools users can discover
When those tools get invoked
Whether third-party tools get paid - and how much
What, if any, user data is shared back with providers
In a world where users interact mostly through agents or assistants, LLM clients are the new gatekeepers, with the power to control consumer demand.
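To make that gatekeeping concrete, here’s a minimal sketch of the client side using the TypeScript MCP SDK. The server command, tool names, and filtering policy are all hypothetical; the point is that the host, not the user, chooses what gets listed and called:

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch and connect to a local MCP server (hypothetical weather server).
const transport = new StdioClientTransport({
  command: "node",
  args: ["weather-server.js"],
});
const client = new Client({ name: "example-host", version: "1.0.0" });
await client.connect(transport);

// The host sees every tool the server offers...
const { tools } = await client.listTools();

// ...but it alone decides which ones the model ever hears about.
const exposed = tools.filter((tool) => !tool.name.startsWith("admin_"));

// And it decides when a tool actually gets invoked on the user's behalf.
const result = await client.callTool({
  name: "get_forecast",
  arguments: { city: "Seattle" },
});
console.log(exposed.map((t) => t.name), result.content);
```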
Picks & Shovels: MCP Infrastructure
Last week, I wrote about the MCP “gold rush”, which, for now, is more optimistic than practical. Just like in California in the 1850s, there’s less gold in the water than everyone’s saying. But there’s definitely money to be made in infrastructure.
Existing infrastructure players have already recognized this and begun to act. Cloudflare is positioning itself as the remote MCP hosting layer. Vercel is following. Auth providers like Stytch and Auth0 announced solutions for handling MCP login and authorization. Security will also be a profitable space; if MCP becomes the control layer for tool access, it also becomes the new attack surface.
There are also lots of interesting emerging players, like Mastra, which focuses on end-to-end creation and deployment of agents, and MCP.run, which is working to develop enterprise features for MCP servers, like SSO and compliance.
Particularly if VCs continue to invest heavily in MCP-focused businesses (as early enthusiasm suggests they will), infrastructure companies that ride the wave will have a lot to gain.
Companies with Proprietary Data
By standardizing the UX around a single LLM Client, MCP flattens interface differentiation. But data exclusivity still matters.
If you’re the only tool with access to a key dataset, the power of aggregators is neutralized. Your tool will be invoked regardless of placement or UI. The LLM doesn’t care if your app is beautiful — it cares that your server responds with the right data.
As a result, proprietary data offers unique defensibility, and the best protection against commoditization. Just guard it closely!
The Losers 💀
Display-Ad-Driven Businesses
If your monetization model depends on people viewing your website, you’re in trouble.
This isn’t new; companies are already seeing lower click-through rates and less scrolling due to AI overviews.
But MCP exacerbates this even further. By providing LLMs with structured, accurate data directly from the source, MCP servers increase the breadth and complexity of questions that can be answered within an LLM chat window without ever needing to visit the source.
Weather.com is a prime example. Weather forecasts change frequently and demand reliability. A hallucination can mean the difference between rocking a cute outfit and coming home soaked.
For that reason, I’ve historically still checked weather.com rather than asking ChatGPT, even though the user experience is poor and the forecast is surrounded by a wall of display ads.
With a weather MCP server feeding real-time data into the LLM, I can trust the AI tool more and get answers tailored to my exact questions.
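As a rough sketch of what that server could look like (using the TypeScript MCP SDK; the tool name and forecast endpoint are placeholders, not weather.com’s actual API):

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "weather", version: "1.0.0" });

// A single tool the LLM client can discover and call when a user asks about the weather.
server.tool(
  "get_forecast",
  { city: z.string().describe("City to fetch the forecast for") },
  async ({ city }) => {
    // Placeholder endpoint; a real server would call an actual weather API.
    const res = await fetch(
      `https://api.example-weather.test/forecast?city=${encodeURIComponent(city)}`
    );
    const forecast = await res.json();
    // Structured, real-time data goes straight into the model's context.
    return { content: [{ type: "text", text: JSON.stringify(forecast) }] };
  }
);

await server.connect(new StdioServerTransport());
```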
This is great for me, but a big problem for weather.com. How will they make money if I never see the ads?
Multiply this across finance, sports, ecommerce… and display-based businesses start to look challenging.
Tools that Differentiate on UX
Today, some products win because they’re beautiful and pleasant to use. In an MCP world, users don’t see your interface. Everything is natural language.
Project management offers a great example of why this can be a challenge for some companies. Linear is a newer entrant that has grown rapidly because of its stunning, clean user interface.
But when Linear, Asana, and Jira — all project management tools — showed off MCP servers for their respective tools at Cloudflare’s Demo Day, all the tool calls were functionally the same.
If the LLM is editing and assigning issues for you, you may never see the UI. Perhaps there is still differentiation to be found in Agent Experience or in MCP server design. But appealing to customers and offering a smooth user journey is no longer the moat it was in a pre-MCP world.
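As a hypothetical illustration (not the actual Demo Day servers), here’s roughly what a client sees when it lists the issue-creation tools from two competing project management servers; the schemas are effectively interchangeable:

```ts
// Hypothetical tool definitions, as returned by listTools() from two different servers.
// Different branding, same capability surface - the LLM treats them interchangeably.
const issueSchema = {
  type: "object",
  properties: {
    title: { type: "string" },
    description: { type: "string" },
    assignee: { type: "string" },
  },
  required: ["title"],
};

const linearTool = {
  name: "linear_create_issue",
  description: "Create an issue in Linear",
  inputSchema: issueSchema,
};

const jiraTool = {
  name: "jira_create_issue",
  description: "Create an issue in Jira",
  inputSchema: issueSchema,
};
```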
Fragmented Vertical SaaS
Niche SaaS tools that bundle thin functionality or connect a few tasks together with a simple UI are at risk.
As an example, consider lightweight dashboards and simple analytics platforms — things like DashThis or AgencyAnalytics. These tools often win on ease of use, not depth.
With MCP servers, an LLM can:
Pull the data directly
Perform analysis
Return insights in natural language and interactive charts
Answer questions
Update incorrect data
Suddenly, the lightweight dashboard tools that used to be valuable SaaS products can easily be replaced with free MCP tools.
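As a sketch of how little glue that takes (the tool names, schema, and in-memory data below are all hypothetical), a couple of MCP tools over an existing metrics store hand the dashboard’s job to the chat client:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stand-in for a real metrics store (a warehouse, Postgres, an analytics API, etc.).
const metrics: Record<string, Record<string, number>> = {
  signups: { "2025-05-01": 120, "2025-05-02": 134 },
};

const server = new McpServer({ name: "analytics", version: "1.0.0" });

// The LLM pulls raw data itself, then does the aggregation, charting,
// and narrative that a lightweight dashboard product used to provide.
server.tool(
  "query_metric",
  { metric: z.string(), start: z.string(), end: z.string() },
  async ({ metric, start, end }) => {
    const rows = Object.entries(metrics[metric] ?? {}).filter(
      ([day]) => day >= start && day <= end
    );
    return { content: [{ type: "text", text: JSON.stringify(rows) }] };
  }
);

// It can even fix bad data points when the user asks.
server.tool(
  "update_metric",
  { metric: z.string(), day: z.string(), value: z.number() },
  async ({ metric, day, value }) => {
    (metrics[metric] ??= {})[day] = value;
    return { content: [{ type: "text", text: `Set ${metric}[${day}] = ${value}` }] };
  }
);

await server.connect(new StdioServerTransport());
```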
Gray Areas / Mixed Outcomes 🧐
Companies That Opt Out of MCP
As MCP has begun taking off, some have warned that creating an MCP server for your product could actually be a competitive disadvantage.
Here’s tech journalist Benedict Evans’ take:
Some companies can afford to opt out of MCP: they have market power, a strong enough brand that users will still seek them out, or true proprietary data. They can continue to own the user relationship end-to-end.
However, this is a risky proposition. Not exposing an MCP server also means:
You don’t show up in tool selection
You don’t get invoked in workflows
You might cede ground to someone who did show up
In reality, the choice of to-MCP-or-not-to-MCP likely depends on a range of factors: your source of differentiation, your pricing model, MCP adoption rates among your core users, and more. But the decision is not to be taken lightly.
The way I see it, a historical analogy might be not making a website during the dot-com transition. Yes, some people will still seek you out in the phone book. But others might never call.
Consumers
Most people will benefit from wider adoption of MCP servers. More tools and better access mean more functionality and less time spent copy-pasting between tools.
But there are tradeoffs. The LLM becomes the middleman. From a consumer lens, that means:
You don’t pick which tools are used
You don’t always know where your data is going
You might end up paying for convenience — just like Apple’s 30% App Store fees get passed along to users in higher prices.
MCP servers give users power, but MCP clients give them less visibility. It will be up to us as end-users to decide if we think that tradeoff is worth it.
What’s Next
MCP is still early, but if it becomes widespread, it will reshape the software landscape - changing who owns the user, who delivers value, and who controls the experience.
These are a few perspectives on what that future might look like, but many open questions remain.
If there are categories I missed or areas where you disagree with my conclusions, I would love to hear more!
One thing I find interesting to think about: if we take the claim that search engines and display ads will become less relevant due to direct LLM usage (which seems likely), it still leaves the question of where the LLMs themselves will find the information. It’s still unclear exactly how training datasets are obtained, but even beyond training, ChatGPT and now Claude still use web search to find information. While that is obviously bad for advertising on search results, it doesn’t necessarily mean things like SEO become irrelevant; it just means you now optimize so that the LLM includes your desired message in its answers, rather than optimizing to convert human visitors.
Either way, I fully agree that embracing AI is a potential double-edged sword for many products and companies.