The Aggregator’s Dilemma
Why AI That Helps Humans Search Better Leaves You Unprepared for Agents That Don’t Search at All
I’ve been thinking about what happens to aggregator businesses when AI agents become the primary way transactions get executed.
Not AI helping humans search better. That’s happening already and every aggregator is investing in it. I mean when the AI doesn’t just assist, but actually executes: searches, evaluates, negotiates, and completes transactions on behalf of humans, without the human clicking through your interface.
This isn’t theoretical. Perplexity bypasses traditional search. OpenAI is building browser agents and an agent marketplace. Anthropic’s Model Context Protocol lets agents discover and use services programmatically. The infrastructure is being built right now.
The conclusion I keep coming back to:
Your suppliers could become your competitors for agent traffic.
For twenty years, aggregators have succeeded by being the search layer between supply and demand. Property portals. Job boards. Comparison sites. Energy switchers. Travel booking platforms. Insurance aggregators. The pattern was always the same: fragment the internet, reunify it behind your interface, tax the reunion.
It worked because humans search like humans. Slowly, visually, clicking through pages whenever they have time. Your value was simple: be where the eyeballs are, make the experience better than visiting dozens of individual websites, charge suppliers for access to those eyeballs.
But AI agents don’t search like humans. An agent doesn’t care about your mobile app or your UX. It doesn’t get tired. It doesn’t need you to reunify the internet. It can query 47 energy suppliers, 200 hotels, or 30 insurance providers simultaneously and synthesise the results in seconds.
If your value proposition is “we make search easier for humans,” what happens when search isn’t the bottleneck any more?
Where Aggregators Are Investing Today
Major aggregators are making substantial AI investments. Rightmove’s recent announcement is market-leading: £60m over three years focused on consumer innovation, operational transformation, and R&D for new growth.
Their 2026 allocation of £12m spans:
Consumer experience: £3.5m (conversational search, AI-powered recommendations)
App development: £2.1m
Operational transformation: £2.0m (automated workflows, AI-powered support)
New growth initiatives: £3.1m
Support functions: £1.3m
They’re executing 27 AI initiatives spanning vendor prediction, online valuations, conversational search, and AI-powered styling tools.
Similar patterns appear across sectors. Energy comparison sites are investing in smart tariff recommendations. Insurance aggregators are deploying quote optimisation engines. The specifics vary, but the pattern is consistent: AI investments focused on improving the human experience.
These investments make complete sense. Better consumer experiences drive engagement. Operational efficiency improves margins. New revenue streams diversify the business.
But here’s what I notice: every investment category is either consumer facing (making the human experience better) or operational (making internal processes more efficient).
What’s less evident is infrastructure that lets AI agents transact programmatically. Maybe some aggregators are building this quietly. The visible focus remains on AI-assisted human experiences rather than agent-native infrastructure.
Why Aggregators Are Actually Well-Positioned
Here’s the counterintuitive bit: aggregators could be even more valuable in agentic economies than they are today. But only if they build the right infrastructure.
Platform intelligence compounds with transaction volume
When transactions flow through your platform, you accumulate intelligence that makes every subsequent transaction better.
For example, property platforms learn timing patterns (Bristol properties: 6 weeks offer to completion; London: 12 weeks), pricing signals (suburban asking prices negotiate 8%; city apartments rarely discount beyond 3%), and workflow optimisation (digital-first conveyancers complete 40% faster).
Energy platforms understand consumption patterns by property type, seasonal switching windows, supplier reliability scores, optimal timing to avoid exit fees.
Travel platforms know route-specific pricing patterns, hotel cancellation behaviours, optimal booking windows, supplier service levels.
Insurance platforms track risk assessment refinements, claims processing speeds by provider, renewal negotiation leverage points.
This intelligence gets encoded in your API responses, search rankings, and workflow suggestions.
When an agent queries your platform, whether it’s a cloud-based service like ChatGPT or a local open-source model running on someone’s phone, it benefits from that accumulated intelligence.
Example: Smart platform vs basic API
Agent query: “Find 3-bed properties in Bristol, £400-500k, schedule viewings this week”
A basic API returns:

```json
{
  "properties": [...]
}
```

An intelligent platform returns:

```json
{
  "properties": [...],
  "metadata": {
    "typical_offer_to_completion": "6 weeks",
    "price_negotiation_range": "6-10%",
    "optimal_viewing_times": ["Tue 2-4pm", "Sat 10am-12pm"],
    "fast_response_agents": true,
    "typical_documents_required": [...]
  }
}
```
The same pattern applies across sectors:
Energy switching: Not just tariff rates, but scenario modelling (if you charge your EV at night it looks like this, if you install solar it looks like that), supplier switching reliability (how often switches fail, how long they actually take), exit fee timing windows.
Travel booking: Not just availability, but preference-based filtering (you stayed in boutique hotels in Barcelona and Berlin, here’s similar in Lyon), cross-platform learning (your preferences travel with you, not locked in one booking site), contextual intelligence (you always book window seats, you avoid properties near nightlife).
Insurance quotes: Not just premiums, but claims processing reputation, renewal leverage points, coverage gap analysis.
The agent using your platform completes transactions faster with better outcomes because your platform is smarter, regardless of whether the agent itself is sophisticated or basic.
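To make that concrete, here’s a minimal sketch of how even a basic agent might consume the enriched response above. Everything here is illustrative: the field names (`price_negotiation_range`, `optimal_viewing_times`) come from the hypothetical example earlier, not from any real aggregator API.

```python
# Sketch: an agent planning next steps from a metadata-enriched response.
# All field names are hypothetical, mirroring the example response above.

def plan_viewings(response: dict, budget: int) -> dict:
    """Pick an opening offer and a viewing slot from an enriched API response."""
    meta = response.get("metadata", {})
    # Anchor the opening offer on the platform's observed negotiation range.
    rng = meta.get("price_negotiation_range", "6-10%")
    low, high = (int(x) for x in rng.rstrip("%").split("-"))
    opening_offer = round(budget * (1 - high / 100))
    # Prefer the platform's recommended viewing windows when present.
    slots = meta.get("optimal_viewing_times", ["any weekday"])
    return {"opening_offer": opening_offer, "first_choice_slot": slots[0]}

enriched = {
    "properties": [],
    "metadata": {
        "price_negotiation_range": "6-10%",
        "optimal_viewing_times": ["Tue 2-4pm", "Sat 10am-12pm"],
    },
}
plan = plan_viewings(enriched, budget=500_000)
# The agent opens 10% under budget and books the platform-recommended slot.
```

The point of the sketch: the agent itself is trivial. The quality of its plan comes entirely from the metadata your platform supplied.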
Your competitors can copy your API structure. They can’t copy the transaction intelligence that makes your responses valuable.
Small players won’t build this themselves
Individual suppliers (estate agents, energy retailers, hotels, insurance providers) could theoretically build agent-readable infrastructure themselves. In practice, they won’t.
Because it’s infrastructure, not differentiation. A local estate agent doesn’t win by having better API protocols than the agent down the street. They win on properties, service, and local knowledge. A small energy supplier doesn’t differentiate on API design; they differentiate on tariff innovation or customer service.
That’s your opportunity: become the shared infrastructure layer that small players plug into. Just like they use Stripe instead of building payment processing, they’ll use your protocols instead of building their own.
You can shape the protocol while it’s still forming
In open markets, protocols eventually commoditise. But the first mover gets to shape what gets standardised.
If you define the property listing schema, you ensure it includes fields you already have (transaction history, verification status, pricing benchmarks) that smaller players don’t. If you define the energy switching protocol, you embed the consumption patterns and supplier reliability data you’ve accumulated. If you define the insurance quote schema, you include the claims processing metrics and coverage analysis you’ve built.
By the time competitors catch up, you’ve already captured the transaction graph that makes those fields valuable.
You don’t need to own the standard. You need to shape it before someone else does.
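As a sketch of what shaping the schema looks like in practice, here is a hypothetical listing-schema fragment. The field names (`transaction_history`, `verification_status`, `pricing_benchmark`) are assumptions drawn from the examples above, not a published standard: the point is that an incumbent can populate these fields and a new entrant cannot.

```python
# Hypothetical property-listing schema a first mover might publish.
# Required fields keep the schema usable by everyone; the optional
# incumbent-only fields are what make responses rank well for agents.

LISTING_SCHEMA = {
    "type": "object",
    "required": ["address", "price", "verification_status"],
    "properties": {
        "address": {"type": "string"},
        "price": {"type": "integer"},
        "verification_status": {"enum": ["verified", "unverified"]},
        # Fields only a transaction-processing incumbent can fill in:
        "transaction_history": {"type": "array"},
        "pricing_benchmark": {"type": "object"},
    },
}

def missing_required(listing: dict) -> list:
    """Return required schema fields the listing fails to supply."""
    return [f for f in LISTING_SCHEMA["required"] if f not in listing]

# A supplier without verification data fails validation outright:
gaps = missing_required({"address": "1 Harbour Rd", "price": 450_000})
```

Making `verification_status` required while keeping the richer fields optional is the shaping move: every participant can comply, but only you arrive with the data that makes compliance valuable.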
What Infrastructure the Agentic Economy Actually Needs
The agentic economy requires six foundational infrastructure pillars to function. These aren’t aggregator-specific. They’re universal requirements for any market where AI agents transact autonomously.
Think of them like HTTPS and payment gateways for e-commerce. Not optional, not differentiating on their own, just foundational.
The six pillars:
Discoverability – How agents find and understand what’s available (structured schemas, multimodal search APIs, machine-readable formats)
Trust & Verification – How agents validate legitimacy and track records (verifiable credentials, queryable transaction history, third-party verification hooks)
Payments & Settlement – How value changes hands (embedded payment rails, escrow, dispute resolution, multi-party settlement)
Orchestration – How multi-party workflows execute (Model Context Protocol for tool integration, workflow state persistence, agent-to-agent messaging, parallel coordination)
Attribution – How contributions get recognised when multiple agents participate (explicit attribution rules, contribution tracking, flexible compensation models)
Privacy & Consent – How user preferences are maintained when agents act autonomously (preference APIs, consent tokens, data minimisation, revocation mechanisms)
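To make one pillar tangible, here’s a minimal sketch of the Privacy & Consent pillar: an HMAC-signed consent token that scopes what an agent may do, expires, and can be revoked server-side. The token format, field names, and scopes are all hypothetical; real implementations would use an established standard rather than this hand-rolled scheme.

```python
# Sketch of a scoped, expiring, revocable consent token (hypothetical format).
import hashlib
import hmac
import json
import time

SECRET = b"platform-signing-key"  # hypothetical platform signing secret
REVOKED: set = set()              # server-side revocation list

def issue_token(user: str, scope: str, ttl_s: int = 3600) -> str:
    """Issue a signed token granting one scope for a limited time."""
    payload = json.dumps(
        {"user": user, "scope": scope, "expires": int(time.time()) + ttl_s},
        sort_keys=True,
    )
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def check_token(token: str, needed_scope: str) -> bool:
    """Valid only if signed, unrevoked, unexpired, and scope-matched."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected) or token in REVOKED:
        return False
    claims = json.loads(payload)
    return claims["scope"] == needed_scope and claims["expires"] > time.time()

tok = issue_token("user-123", scope="energy:switch")
ok_same_scope = check_token(tok, "energy:switch")
ok_other_scope = check_token(tok, "property:offer")
REVOKED.add(tok)
ok_after_revoke = check_token(tok, "energy:switch")
```

The design choice worth noting: revocation lives on the platform side, so a user can withdraw consent even from agents that still hold a syntactically valid token.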
I’m not going to detail each pillar here. That’s a separate piece. But the key point: these need to exist for agentic economies to function at all.
Someone will build them. The question is who.
Why Aggregators Are Natural Infrastructure Providers
You’re uniquely positioned to build these pillars for your vertical because you’re already the transaction intermediary. You have structural advantages that make you the natural provider:
Transaction volume: You process enough transactions to identify patterns and build intelligence that individual suppliers can’t match. One major property aggregator processes hundreds of millions of listed items and generates billions of consumer signals annually. That data density creates platform effects. Every transaction makes the platform smarter for the next agent.
Supplier relationships: You already have commercial relationships with suppliers in your sector. Adding API protocols to existing partnerships is easier than building those relationships from scratch. Small suppliers trust you because you’re already embedded in their operations.
Trust graph: You’ve accumulated reputation data through repeated transactions. Agents need that trust layer. They can’t independently verify every supplier. Your platform becomes the verification mechanism agents rely on.
Capital to invest: Building these pillars requires upfront investment before revenue materialises. You have the balance sheet and cash flow to fund infrastructure development. Startups building agent-first from scratch don’t have that luxury.
Domain intelligence: Property aggregators understand conveyancing workflows, stamp duty calculations, and local market dynamics. Energy aggregators understand consumption patterns, switching windows, and regulatory constraints. Travel platforms understand route economics and supplier behaviours. That domain knowledge gets encoded in your APIs and makes them genuinely useful to agents rather than just data pipes.
But should you build this infrastructure yourselves, or partner with specialists?
eBay’s acquisition of PayPal (2002) wasn’t about owning payments infrastructure for its own sake. It was about controlling the trust and friction points that affected transaction completion. The eventual spin-out (2015) was financial engineering, not a statement that payments integration was strategically worthless. Seamless, trusted payments were core to eBay’s marketplace success.
The parallel question for aggregators: is transaction intelligence infrastructure you control, or a commodity service you consume? I’d argue it’s different from payments. Payments are horizontal infrastructure. The same mechanics work across every marketplace. Transaction intelligence is vertical, deeply specific to your domain, your suppliers, your transaction patterns.
If you don’t build these pillars, someone else will. But they’ll be starting from scratch while you’re already processing transactions and accumulating the intelligence that makes the infrastructure valuable.
The Competitive Clock
You’re not the only one who could build this. Your competitors see the same future. So do horizontal platforms (Google, Microsoft, Amazon) that might add vertical depth. So do startups building agent-first from scratch.
OpenAI’s emerging agent marketplace, Anthropic’s Model Context Protocol (MCP) for tool integration, Perplexity’s direct answer engines. These aren’t hypothetical. The infrastructure for agents to discover and use services is being built right now. The question is whether aggregators become preferred service providers in these marketplaces or get bypassed entirely.
What’s different about agentic competition:
Accumulated transaction intelligence might be the only moat that matters.
An agent that’s completed 500 property transactions through your platform knows which estate agents are reliable, what pricing patterns mean, what workflows succeed. An agent that’s switched energy suppliers 1,000 times through your platform knows which suppliers process switches fastest, which have hidden fees, which offer the best customer service.
Even if a competitor launches an identical protocol, it starts with an empty transaction graph: agents using it would have to relearn everything from scratch.
But that intelligence only accumulates if you’re processing transactions, not just facilitating search.
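A toy sketch of what “accumulating” means here: a per-supplier reliability score that updates only when a transaction actually completes (or fails) on the platform. Supplier names and the neutral prior are illustrative assumptions.

```python
# Sketch: transaction intelligence as a running per-supplier reliability
# score, built only from transactions the platform itself processed.
from collections import defaultdict

class SupplierIntelligence:
    def __init__(self):
        self.stats = defaultdict(lambda: {"completed": 0, "failed": 0})

    def record(self, supplier: str, succeeded: bool) -> None:
        """Update counts when a transaction completes or fails."""
        key = "completed" if succeeded else "failed"
        self.stats[supplier][key] += 1

    def reliability(self, supplier: str) -> float:
        """Completion rate; a neutral 0.5 prior when no data exists."""
        s = self.stats[supplier]
        total = s["completed"] + s["failed"]
        return s["completed"] / total if total else 0.5

intel = SupplierIntelligence()
for outcome in [True, True, True, False]:
    intel.record("supplier-a", outcome)   # 3 of 4 switches completed
intel.record("supplier-b", True)          # 1 of 1 completed
```

Search-only facilitation never calls `record`: a platform that hands off before completion returns the neutral prior for every supplier, which is exactly the point of the paragraph above.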
Brand still matters. Supplier relationships still matter. But in agentic economies where agents make decisions based on structured data and verified track records, those legacy advantages weaken. Transaction intelligence becomes disproportionately important.
The race isn’t “who builds the prettiest agent interface.” It’s “who captures the transaction graph first.”
Consumer AI AND Agent Infrastructure
The investments aggregators are making in consumer-facing AI are necessary and valuable. Conversational search, personalised recommendations, automated operations. All good moves.
Many aggregators offer search interfaces that are evolutions of 10-year-old capabilities. That work needs doing. Better search drives engagement. Operational efficiency improves margins.
But these investments create an adjacent question: as AI gets better at helping humans, when does the ‘assistant’ become an ‘agent’ that transacts autonomously? And when that happens, what infrastructure needs to exist?
This isn’t either/or. Not “consumer AI instead of agent infrastructure.”
It’s both/and.
You need consumer-facing AI to deepen engagement with humans using your platform today.
AND you need agent-facing infrastructure to ensure agents transacting on behalf of humans use your platform tomorrow.
The urgency isn’t that you’re behind. It’s that you’re investing heavily NOW in AI, and agent infrastructure should be a parallel workstream, not a sequential one.
Here’s the trap: fixing your search interface is necessary work. But if that’s ALL you’re doing, you’re still building for yesterday’s transaction model. You’re optimising for human search when the game is shifting to agent transactions.
Building consumer AI first, then agent infrastructure later, means you miss the window to shape the protocol. By the time you’re ready, the standards may already be set, and not to your advantage.
This is the difference between fixing and reimagining. Fixes keep you competitive in the current model. Reimagining prepares you for the next one.
What Happens Next
I’m publishing this openly because I think the pattern is important and time-sensitive. If you’re running an aggregator and this resonates, take it and run with it.
The six infrastructure pillars deserve detailed exploration. That’s the next piece. But the strategic insight stands alone: aggregators are investing heavily in AI to improve human experiences while the real transformation is agents transacting autonomously.
The organisations that see this early and build agent-native infrastructure won’t just survive the transition. They’ll be more valuable than they are today.
The ones that don’t will become legacy search layers in a world where search isn’t the bottleneck any more.
Mark Strefford helps organisations reimagine their business architecture for AI-enabled economies. He focuses on competitive economics and ecosystem implications, not just AI deployment. To schedule a confidential 1-to-1 session, book here.


