December 31, 2025 - 5:00pm

As Jimmy Donaldson’s talent manager, Reed Duchscher spent six years building his client into MrBeast. That is, into the most-subscribed YouTuber on the planet, boasting a $5 billion empire spanning chocolate bars, game shows and a Saudi-backed theme park. Now Duchscher is delivering a sobering diagnosis: there will never be another MrBeast.

He argues that the algorithms have grown too sophisticated at sorting viewers into their own individual silos. If a viewer seeks out automotive content, they receive more automotive content. If they like health and beauty, their feed is largely restricted to health and beauty. The days of a single creator punching through to hundreds of millions of viewers are effectively over. Donaldson’s rise required a specific historical moment, one where recommendation engines still permitted the emergence of mass figures. That window has closed.

This would be noteworthy enough on its own, a shift from the broadcast era of online video to something more resembling the fragmentation of cable television into countless niche channels. But a study last month from the video editing company Kapwing suggests the next phase may be considerably darker. Researchers examined 15,000 of YouTube’s most popular channels and found that 278 of them consist entirely of what they term “AI slop”, low-quality content churned out by artificial intelligence and designed to exploit recommendation systems.

Put together, these channels have accumulated over 63 billion views and 221 million subscribers. Kapwing estimates they generate approximately $117 million in annual revenue. When the researchers created a YouTube account to observe what is recommended to new users, they found that 104 of the first 500 videos constituted “AI slop”. Another third qualified as “brain rot”, the broader category of low-quality material engineered to monetise attention even if a human being was actually involved in its creation.

Such slop clips can take as little as an hour or two to generate with basic editing software. Meanwhile, Screen Culture, an India-based operation employing a dozen editors, was churning out as many as 23 fake movie trailers for a single film before YouTube finally banned the channel this month. Some of its AI-generated trailers outranked the official studio marketing in search results. Warner Bros. and Sony, rather than demanding removals, had been quietly claiming the ad revenue for themselves.

This convergence of trends is worth examining closely. First, the algorithm came for the universally popular creator, sorting audiences into hyper-specific verticals where a niche influencer with 50,000 followers can sell products more efficiently than a generalist with 50 million. Duchscher himself now counsels creators to dominate such narrow categories rather than chase broad appeal. The investment firm Slow Ventures is writing cheques of up to $3 million for creators with deep expertise in specific verticals, treating them as founders of micro-brands rather than entertainers seeking fame.

But if the niche creator is the future, what happens when AI can produce niche content at essentially zero marginal cost? The slop ecosystem is already global, with low-dollar producers scattered across India, Pakistan, South Korea, Kenya, Ukraine, Nigeria, and Brazil cranking out material calibrated to game algorithms. Behind them sits a secondary economy of grifters selling courses on how to make viral AI content via Telegram, WhatsApp, and Discord. These instructors often earn more than the slop producers themselves.

The dead internet theory, which claims the internet is largely populated by bots and AI-generated content, has acquired scholarly respectability. A January 2025 paper in the Asian Journal of Research in Computer Science examined the theory’s claims and found them uncomfortably plausible. Imperva’s 2025 Bad Bot Report determined that automated systems accounted for 51% of all web traffic in 2024, the first year bots overtook humans. In September, OpenAI CEO Sam Altman posted on X that he “never took the dead internet theory that seriously, but it seems like there are really a lot of LLM-run twitter accounts now”.

YouTube insists that generative AI is merely a tool and that all content must comply with community guidelines regardless of how it was produced. This is technically true — and practically meaningless. The platform’s recommendation engine optimises for engagement, and AI slop demonstrably engages. Children watch CGI kittens being abused by CGI cat parents. Adults scroll through videos of obese people doing gymnastics. The algorithm has no compelling monetary reason to distinguish between authentic human creativity and low-budget surrealism generated by a content farm.

MrBeast and his ilk may indeed be the last of their kind. What comes next appears to be a digital landscape where algorithms shape not just what we watch but whether any human was involved in making it. The dead internet theory, it seems, was not so much an ominous prediction as a spoiler about the bleak cycles of pop culture that await us.


Oliver Bateman is a historian and journalist based in Pittsburgh. He blogs, vlogs, and podcasts at his Substack, Oliver Bateman Does the Work.
