Wikipedia is often described as the last good website on an internet increasingly filled with toxic social media and AI slop. But it seems the online encyclopedia is not completely immune to broader trends: human pageviews have fallen 8% year-over-year, according to a new blog post from Marshall Miller of the Wikimedia Foundation.
The foundation works to distinguish between traffic from humans and bots, and Miller writes that the decline “over the past few months” was revealed after an update to Wikipedia’s bot detection systems appeared to show that “much of the unusually high traffic for the period of May and June was coming from bots that were built to evade detection.”
Why is traffic falling? Miller points to “the impact of generative AI and social media on how people seek information,” particularly as “search engines are increasingly using generative AI to provide answers directly to searchers rather than linking to sites like ours” and as “younger generations are seeking information on social video platforms rather than the open web.” (Google has disputed the claim that AI summaries reduce traffic from search.)
Miller says the foundation welcomes “new ways for people to gain knowledge” and argues this doesn’t make Wikipedia any less important, since knowledge sourced from the encyclopedia is still reaching people even if they don’t visit the website. Wikipedia even experimented with AI summaries of its own, though it paused the effort after editors complained.
But this shift does present risks, particularly if people are becoming less aware of where their information actually comes from. As Miller puts it, “With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work.” (Some of those volunteers are truly remarkable, reportedly disarming a gunman at a Wikipedia editors’ conference on Friday.)
For that reason, he argues that AI, search, and social companies using content from Wikipedia “must encourage more visitors” to the website itself.
And he says Wikipedia is taking steps of its own, for example by developing a new framework for attributing content from the encyclopedia. The organization also has two teams tasked with helping Wikipedia reach new readers, and it’s looking for volunteers to help.
Miller also encourages readers to “support content integrity and content creation” more broadly.
“When you search for information online, look for citations and click through to the original source material,” he writes. “Talk with the people you know about the importance of trusted, human curated knowledge, and help them understand that the content underlying generative AI was created by real people who deserve their support.”