Why Splitting SEO and GEO Functions Doesn’t Make Sense
- Chris Green
- Aug 6
- 3 min read
Splitting SEO (Search Engine Optimisation) and what some are calling "GEO" (Generative Engine Optimisation) is, in most cases, a new solution in search of an old problem. I can't deny that GEO is proving effective at grabbing attention and budgets, sometimes winning where SEO hasn't, but creating a distinction between the two does not make sense - at least for now.
The only real exception might be a future where paid-for visibility in AI chat results becomes the norm. But for today's businesses, there's very little to gain by separating your "classic" SEO team from your AI/search optimisation efforts - EVEN if that means SEO becomes more interested in optimising feeds of data than webpages.
I’m not going to re-litigate the whole debate here, but there are a few points that are hard to ignore:
1. Trust & Citations - It’s Not Just About the Training Data
Right now, getting an AI assistant to cite high-quality or trustworthy sources based solely on its training data is risky. Large language models (LLMs) are powerful, but their sense of "trust" is a little wonky, and they struggle to cite sources accurately. Check your 404 reports where ChatGPT is the referrer if you don't believe me! To really judge quality and relevance, you need other signals - link graphs, site authority, user engagement - and traditional search engines have been refining these for decades.
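If you want to see the scale of this on your own site, a quick pass over your server logs will do it. Here's a minimal sketch in Python, assuming an Apache/Nginx "combined" log format and a placeholder log path - adjust both for your own setup:

```python
# Minimal sketch: count 404s where the referrer suggests a ChatGPT hand-off.
# Assumes a standard "combined" access log format; the path is a placeholder.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # adjust for your server

# combined log format: ... "REQUEST" STATUS SIZE "REFERER" "USER-AGENT"
LINE_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)

missing = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        if match["status"] == "404" and "chatgpt.com" in match["referer"]:
            missing[match["path"]] += 1

# URLs the assistant sends people to that no longer exist -> redirect candidates
for path, hits in missing.most_common(20):
    print(f"{hits:>5}  {path}")
```

Every broken URL that list turns up is a citation the model "remembers" but can't verify - exactly the gap that grounding in live search results is there to close.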
That’s why we’re already seeing modern AI systems “ground” their answers in search results - they fetch the latest and most relevant pages to support answers and fill gaps in their knowledge.
This hybrid approach is crucial for accuracy and reliability, and it will be for a while yet.
2. Freshness - Why Live Data Still Matters
Training an LLM is a massive undertaking. It’s expensive, slow, and, the second you stop, some of your data starts going stale.
No matter how good your model, you’ll always have knowledge gaps if you only rely on static training data. That’s why current systems often plug those holes with live data - whether that’s via direct search queries or scraping the latest content.
For businesses and site owners, this means the same thing it always has: bots still need access to your content, and making your site understandable and accessible to crawlers is just as important now as it was in the “ten blue links” era.
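A quick way to sanity-check this is to run your robots.txt against the crawlers you care about. Here's a minimal sketch using Python's standard library - the domain, page list, and user-agent tokens are examples, so confirm the current crawler names in each vendor's documentation before relying on them:

```python
# Minimal sketch: check whether search and AI crawlers can fetch key URLs,
# based on your robots.txt. Domain, pages and agent names are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"          # replace with your own domain
PAGES = ["/", "/products/", "/blog/"]     # pages you want found and cited
CRAWLERS = ["Googlebot", "Bingbot", "GPTBot", "OAI-SearchBot", "PerplexityBot"]

robots = RobotFileParser(f"{SITE}/robots.txt")
robots.read()

for agent in CRAWLERS:
    for page in PAGES:
        allowed = robots.can_fetch(agent, f"{SITE}{page}")
        print(f"{agent:<15} {page:<12} {'allowed' if allowed else 'BLOCKED'}")
```

It's the same hygiene check SEOs have run for years - the only thing that's changed is the list of user agents.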
3. Consistency - New Signals, New Sources, Old Problems
We’ve seen this in SEO already: all the different signals an algorithm can access - on-page, off-page, structured data, and external data sources - need to be consistent and accurate.
I've lost count of how many times I've seen businesses tripped up by inconsistent signals between page content, structured data, and how their site is presented elsewhere online. Most of these issues were entirely predictable if someone had stopped to look for them.
To help address these, search engines (most notably Google) have rolled out new markup and standards, ways for site owners to clarify the “ground truth.” I don’t see any reason why AI-first companies (like OpenAI) won’t start doing the same, especially since it’s far cheaper than retraining models every time the web shifts.
But don't wait for AI companies to start defining their own markup: you can proactively ensure that citations and information about your company are consistent across the web - something that SEO is MORE than capable of.
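As a starting point, audit your own structured data against the details you publish elsewhere. Here's a minimal sketch that pulls JSON-LD out of a page and compares a couple of Organization fields - the URL and expected values are placeholders, and a real audit would cover far more fields (address, sameAs links, and so on):

```python
# Minimal sketch: extract JSON-LD from a page and flag Organization fields
# that disagree with the details you publish elsewhere. Placeholders only.
import json
from html.parser import HTMLParser
from urllib.request import urlopen

PAGE = "https://www.example.com/contact"
EXPECTED = {"name": "Example Ltd", "telephone": "+44 20 7946 0000"}

class JSONLDExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

    def handle_data(self, data):
        if self.in_jsonld and data.strip():
            try:
                self.blocks.append(json.loads(data))
            except json.JSONDecodeError:
                pass  # ignore malformed blocks in this sketch

parser = JSONLDExtractor()
parser.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))

for block in parser.blocks:
    for item in (block if isinstance(block, list) else [block]):
        if item.get("@type") == "Organization":
            for field, expected in EXPECTED.items():
                actual = item.get(field)
                status = "OK" if actual == expected else "MISMATCH"
                print(f"{field}: {actual!r} ({status})")
```

Run the same comparison against your key landing pages and the profiles you control elsewhere, and most of the "inconsistent signal" problems above surface before any algorithm - classic or generative - has to guess.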
4. Content Strategy - Don’t Double Up on Work
There's a notion out there that we should be producing separate content for SEO and "GEO" - one set for classic search, another for generative engines. This isn't because "GEO content" is somehow more advanced or sophisticated, but rather because most people's assumptions about the content needed for "GEO" are actually TOO basic and TOO literal, written to tick a set of criteria. Think what the SEO industry was doing 10-15 years ago.
Whilst we may need to alter or tweak some things, we have EVERY reason to believe that AI companies WON'T want to reward overly spammy or manipulative content. But they WILL want to reward content they can understand and be confident in. The same as SEO.
Right now, any kind of content strategy split ONLY starts to make sense in a future where the chatbot/agent fully replaces the need for websites - where users never visit your site and all interaction happens in the AI's answer box.
Until that day arrives, businesses still need websites that serve both human users and AI bots. That means one high-quality, well-structured set of content, optimised for clarity and usefulness, is still the smartest strategy.
For Now
If we ever reach a future where agents replace websites entirely, then yes, SEO and GEO will both need to be redefined from the ground up.
But for now, the fundamentals of good content, accessible code, and clear markup serve both worlds equally well - and the same team can very capably do both.