
It happened quietly. At the beginning of 2023, generative artificial intelligence (AI) and the chatbots powered by large language models (LLMs) were considered innocuous, capable only of writing overly formal essays and answering fact-based questions. However, as the months passed, these chatbots gained additional capabilities. First, they learned to answer subjective queries; then, they gained access to the Internet and could tell you the live score of a match or how a stock was performing in real time. Then, in May 2024, things took a dark turn, especially if you owned a content website or were a news publisher.
Google, one of the challengers to the AI throne, released AI Overviews in Search: a small window where Gemini generates a brief summary in response to users' queries. The small box quickly became an easier way to find information without clicking on any link or being redirected to another website. It was not just AI Overviews, however. OpenAI's ChatGPT followed suit with an AI-powered search engine integrated into the chatbot. Perplexity built an AI-native answer engine, while Anthropic's Claude lets users upload a file and answers queries based on it. The snowball effect was in full force.
While most people did not take notice, a paradigm shift was occurring in how the Internet was used. People were not searching the web anymore; they were asking the AI to collate the information. However, with that transition, content websites and news publishers started to feel the heat.
The Restructuring of the Internet Traffic Model
A large part of the visible Internet today — the indexable websites and web pages that appear in search results on Google, Bing, Yahoo, or DuckDuckGo — consists of content-driven websites. These include news platforms (like the one you’re reading this article on), social media, blog sites, forums, and more.
These platforms, managed by companies or individual creators, are constantly updated with fresh content, new engagement tools, and user experiences designed to attract visitors. All of this serves a single goal: drawing in as many visitors as possible. This footfall, also known as Internet traffic, helps websites rank higher on search engines and earn revenue from the ads displayed on their pages.
The rules, until now, have been straightforward: if a website has content that appeals to a large number of people, it draws more visitors, which makes it more visible on the Internet, and it can earn more money by showing them ads. In a nutshell, content dictates the direction of Internet traffic, which in turn dictates revenue generation. Even if the system appears transactional, the rules ensure that the quality of content and information remains the biggest focus. But with the rise of AI, these rules are breaking down.
The typical AI chatbot with web-searching capabilities uses bots (also known as crawlers) to scan hundreds of websites and their content in seconds. When a user submits a query, the chatbot collates the data, generates a natural language response, and presents it to them. Most of the time, these chatbots provide inline citations to their sources, but if you were already getting the answer directly in the chat, how often would you actually click through to the original website?
So, what happens when only AI crawlers are visiting these content websites? For one, these bots do not count as real traffic, since they leave no visible footprint. They do not view ads, so they generate no revenue. The result: websites get less traffic and even less revenue.
The Real Impact on Content and News Publishers
The tremors of this shift are already visible. According to Similarweb (via The Economist), global search traffic fell by 15 percent year-on-year (YoY) as of June 2025. Earlier data also shows that the no-click-through rate to news websites grew from 56 percent in May 2024 (when AI Overviews launched) to almost 69 percent in May 2025.
For the unaware, no-click-through or zero-click searches refer to situations where a user can find an answer to their search query directly on the search engine results page (SERP) without needing to click on any of the listed websites. To make matters worse, Similarweb also found that organic traffic took a downturn, falling from more than 2.3 billion visits in H1 2024 to less than 1.7 billion in June.
These are not just numbers. This is the first restructuring of the Internet since search engines arrived in the 1990s, and the world is not prepared for it. In May, the New York City–based financial and business news website Business Insider reportedly laid off 21 percent of its workforce, citing the need to “endure extreme traffic drops outside of our control.”
Business Insider is not an isolated case, either. According to Similarweb data (via the New York Post), both HuffPost and The Washington Post saw their traffic decline by as much as 50 percent between April 2022 and April 2025.
The emergence of the new technology is also reshaping how Internet users access information. As per a Financial Times report, 80 percent of users now rely on AI-generated content for around 40 percent of their search queries.
The Problem Statement
At the core of this entire restructuring lies a unique problem. Ever since search engines have existed, they have used bots to crawl websites, analysing the content to rank the most relevant web pages at the top. This arrangement benefited both search engines and website owners: if websites followed good content practices, they rose to the top of search results, and search engines earned repeat users by surfacing high-quality results.
However, AI companies are now using the same method to crawl and analyse website content with no upside for publishers. Worse, websites often cannot block these crawlers, since AI platforms frequently deploy their bots without proper disclosure. For instance, Reddit recently sued Anthropic after finding that the latter's bots accessed the social media platform as many as 1,00,000 times in less than a year. With Google, things are even worse. The tech giant uses the same bots to crawl websites for AI that it uses to index them on its search engine. So, if publishers opt out of AI analysing their data, they also drop out of Google Search, effectively killing their traffic.
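The standard opt-out mechanism publishers have is the robots.txt protocol: a plain-text file served at a site's root that well-behaved crawlers consult before fetching pages. Below is a minimal sketch of how a publisher might try to turn away known AI crawlers while still allowing classic search indexing; the user-agent names are the ones publicly documented by OpenAI, Anthropic, Perplexity, and Google, the domain is hypothetical, and — crucially — compliance with the file is entirely voluntary.

```txt
# robots.txt — served at https://example.com/robots.txt (hypothetical site)

# Ask AI crawlers to stay away
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# Google-Extended governs use of content for Gemini, not Search indexing
User-agent: Google-Extended
Disallow: /

# Still allow classic search indexing
User-agent: Googlebot
Allow: /
```

This sketch also illustrates the bind described above: Google-Extended only controls whether content feeds Gemini model training, while AI Overviews draws on the regular Googlebot crawl, so the only way to fully withhold content from Google's AI features is to block Googlebot itself and vanish from Search.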
Protests Have Begun
News and media publishers understand this situation and do not plan to remain silent. Reddit's lawsuit against Anthropic is just one of many such cases currently pending in courtrooms.
In February, Condé Nast, The Atlantic, Forbes, The Guardian, and others filed a lawsuit against AI startup Cohere, alleging unauthorised use of more than 4,000 copyrighted articles.
Publisher Ziff Davis, the parent company behind ZDNet, PCMag, CNET and IGN, has also recently filed a lawsuit against OpenAI, accusing the latter of “intentionally and relentlessly” exploiting copyrighted content. In India, ANI has filed a similar lawsuit against the ChatGPT maker.
Apart from this, the US-based nonprofit trade association News/Media Alliance, which represents the likes of The New York Times, The Washington Post, and Vox Media, issued a statement against Google’s AI Mode and called its functioning “the definition of theft.” However, most of these cases are still ongoing, and the courts have yet to rule on them and set a precedent. The AI companies argue that their use of publicly available information qualifies as transformative use and does not break any copyright laws, while publishers argue that the unauthorised access to the content itself breaks the law.
Future Outlook Is Concerning
The debate on whether a superintelligent AI system will take over the world remains hypothetical, but AI's impact on Internet search traffic and publishers is real. Website traffic is down, click-through rates are dwindling, revenue losses are widespread, and layoffs have become the norm.
Regardless of the current situation, the AI genie is out of the bottle and here to stay. Unless policymakers begin to view content copyrights through a modern lens and courts revisit the definition of transformative use, it will be an uphill battle for news and content publishers.
There are solutions. Publishers could pivot from traditional search engine optimisation (SEO) and ad-driven revenue models to subscriptions; they could invest in AI optimisation to rank better in chatbots' responses; or they could take one of several unconventional routes, such as micropayments, newsletters, AI licensing, or even SaaS.
If the laws of the Internet are being rewritten, publishers will need to evolve with the changing times. However, such a shift would lead to more layoffs and the closure of newsrooms — a reckoning the industry may not be ready for.