It started with annoyance when Google began integrating its Gemini-powered AI summaries into search, occupying a large portion of the results screen. The web links I actually needed were now buried below this summary, Google's own result snippets, and sponsored ads. It was as if the web itself had become an afterthought.
The annoyance grew when I saw loading circles within the AI summaries. As I waited a full second or two for text to appear, I was transported back more than a decade, to when loading screens on the Internet were a thing.
The AI summary I received stumbled over itself into a jumbled, verbose mess.
I did another search: "How many kilometres are there in a mile?" The cutting-edge new Google AI that's supposed to save us from the tyranny of web search was just plain wrong here: "If you run at a speed of 6 minutes per mile, you will not run 10 km in an hour but about 16 kilometres."
This wasn't a one-off. I continue to get summaries that are mash-ups of content Gemini has stitched together from multiple pages. When not outright wrong, they are still confusingly verbose. How did getting information from the internet become so much harder and worse?
The irritation, however, has turned into despondency because I am worried about where this is going.
The web as we know it is slowly disappearing. Google Gemini, Perplexity, and other large language models (LLMs) have compressed the internet into a primordial information ooze, and that ooze is slowly being transformed into an information black box. The only window into it is through the language models, which, as we all know, are unreliable narrators.
Sorry for being dramatic, but is this the death of the web?
SEO already killed web quality
Google's tagline, "Organising the world's information," is a little too quaint and simple for what it's been doing for a while. "Shaping the world's information" is more like it. Even more accurate: "Owning the world's information".
Google organises the web, no doubt. It crawls over a trillion URLs and is the Suez Canal of web traffic: Google Search still handles about 90% of all web searches.
However, Google's organisation of the web has also significantly shaped it. The introduction of sponsored links and Google's ranking rules have changed how pages optimise themselves to please Google. Search Engine Optimisation (SEO) has not only changed how content is written but has also encouraged commercial entities to build a plethora of pages hyper-optimised to suck the sweet nectar of web traffic from Google searches.
Ranking higher means hundreds of thousands more clicks and cheaper ad bids, so the companies with the strongest incentives to monetise these landing pages spend time, effort, and money optimising them. SEO marketers and content creators have since flooded us with content pages in whatever areas are relevant to their industries.
The result? The web has gotten shittier.
A German paper published in March analysed the claim that "Google is getting worse", monitoring 7,392 product review queries for a year. It concluded that higher-ranked pages on Google were on average "more SEO optimised", "more monetised with affiliate links", and lower in content quality.
If you've felt frustrated with Google search results, you are not alone. Unfortunately, this vicious cycle actually helps Google. With no real competition, the generic nature of organic results makes Google's ads programme all the more valuable to those who can pay to have their links show up higher.
And yet, despite all this, the underlying structure of the web was preserved. You could still spend the time and effort to hunt down the link that held the information you sought. Good-quality publishers still existed. Google's ranking still treated credibility as something that couldn't be gamed.
However, AI summaries feel like a step towards finishing off the linked web.
AI summaries might just kill the web
Consider a possible end state.
The web is no longer organised as hundreds of millions of pages that you access but as one mass of information. The more of that information is synthesised for you by an LLM, the less incentive there is for web pages to exist. The future web needn't be pages you can visit but information stores that are continually updated, with the only way in being through Google or other LLMs. This goes against the tenets on which the web was originally established: open protocols and free access. It creates a loop in which we must consume information through these windows into nebulous information structures that are never exposed to us, much like a teacher or an oracle who would rather not let you have the books but tell you what's in them.
There are many benefits to this. It's far more efficient to get synthesised conclusions and specific views on a topic than to wade through dozens of web pages. You could spend less time hunting for the right pages and more time thinking through the information and asking follow-up questions. It does seem like the ideal way to access the information humanity has created, and possibly the future.
But it is also scary for multiple reasons.
The first is that it abstracts away the sources of information on the web, and with them the context. When you read a random blog by a conspiracy theorist (perhaps for entertainment) versus a credible, world-famous news publication, you carry different contexts in your head. When a site gets things factually wrong several times, you learn to avoid it. But in this future web, where everything is compressed, rehashed, and summarised (with no guarantees of accuracy), what is your context for the information?
Google's 90% monopoly
Second, it consolidates the web even further into the hands of a very few. Google is already your gateway to the web, the search engine that powers about 90% of web discovery. Even so, you are still exposed to the individual sites, publishers, and brands that build trust and bring you the information. An opaque window into the web would concentrate that power even more in the hands of Google and a few others.
The third reason is the accuracy and quality of information on the web. As publishers get abstracted away, who is held accountable for the information? The primary sources no longer have a million pairs of eyes watching them; instead, we rely on an oligopoly of top companies to ensure that what we read is accurate, balanced, and of high quality. I don't trust that they hold these ideals front and centre.
Google inserting Gemini summaries into search results is reckless, a reaction to being caught flat-footed in the narrative around winning the generative AI race. Continuing down this path will accelerate a process that Google itself set in motion: reshaping the web so that Google isn't just a conduit but a proxy for the web itself. I can only hope that this is a passing fad.
--TYAGARAJAN SUNDARESAN
The author is a product consultant and mentor, with 15 years of experience building products and businesses for some of the top e-commerce companies.
(By arrangement with livemint.com)