A new New York Times investigation shows how quickly YouTube floods feeds with bizarre AI-generated videos aimed at its youngest viewers. In one session that began with a single CoComelon video, more than 40 percent of the Shorts recommended over 15 minutes contained synthetic visuals.
The algorithm pushes content from channels that claim to teach toddlers the alphabet and animals. But the clips themselves are often nonsensical, containing distorted faces, extra body parts and garbled text. None run longer than 30 seconds.
Experts say the format leaves no room for repetition or narrative structure, both of which are essential to young children’s learning with media. Still, the videos attract millions of views.
Creators, many of whom operate anonymously, have turned AI tools into a reliable source of income. The barrier to entry is low, the payout is high, and the feed never stops.
The algorithm prioritizes quantity over quality
Reporters conducted the analysis over several weeks, watching popular channels like CoComelon and Ms. Rachel in a private browser window, then scrolling the recommended YouTube Shorts for 15 minutes to see what surfaced.
In a session following a “Wheels on the Bus” video, over 40 percent of recommendations showed signs of AI generation. Some clips bore YouTube’s own “altered or synthetic content” label. Others required an AI detector to confirm, because the images were seamless enough to escape detection by eye.
The same videos and channels surfaced again and again across sessions, suggesting the algorithm is actively amplifying this content rather than filtering it out. Many accounts publish these clips several times a day, optimized for maximum views with minimal effort.
Inside the creator economy that powers the feed
Many of the YouTube accounts that produce AI-generated children’s content operate anonymously. They list no contact information and provide few identifiable details about who runs them. The barrier to entry is remarkably low.
Creators teach themselves with readily available tools like Google’s Whisk and Runway, often by following online tutorials. Some channels present themselves as educational, featuring animated animals and sing-along songs designed to appeal to parents looking for learning content for young children.
The financial incentive rewards rapid production: a Halloween video featuring scary animals has been viewed more than 370 million times. The formula works: grab attention quickly, keep it short and leave distribution to the algorithm.
YouTube responded, but parents still need to monitor the feed
After the Times shared examples with YouTube and requested comment, the platform removed the five channels mentioned from its partner program; those accounts will no longer earn advertising revenue or appear on YouTube Kids. The company also pulled three hyper-realistic videos from the children’s app and took down a clip for violating its child safety guidelines.
But the response was reactive, not proactive. YouTube requires creators to disclose realistic AI-generated content, but the rule does not apply to animated videos for children. So the burden falls on parents, a task even experts find daunting as the tools keep improving.
Some families are now creating their own playlists with verified content or removing the app entirely. The American Academy of Pediatrics advises parents to avoid AI-generated or highly sensational content. The hard part remains recognizing it.