It starts quietly. A few odd videos make their way into the feed: a flood victim who looks almost too dramatic, a staged rescue scene that isn't overtly phony. Then more show up. Faster. In greater volume. Repeating.
As this develops, there is a sense that social media has fundamentally changed, even though no one has announced it.
| Category | Details |
|---|---|
| Topic | AI-Generated Content Flood on Social Media |
| Key Platforms Affected | TikTok, Instagram, Facebook |
| Core Issue | Mass production of AI-generated misinformation, low-quality content, and harmful media |
| Notable Case | Pakistan floods misinformation surge (2025) |
| Research Reference | Digital Rights Foundation (DRF) |
| Key Risk Areas | Misinformation, gender exploitation, algorithm manipulation |
| Estimated Impact | Millions exposed to misleading or fabricated content |
| Social Context | Climate crisis, low digital literacy, high reliance on social media |
| Industry Challenge | Weak moderation systems and lack of AI labeling |
| Reference | https://digitalrightsfoundation.pk |
When floodwaters began to inundate villages in Punjab, Pakistan last year, people looked to their phones for updates. Outside, muddy water covered the roads. On their screens were videos, many of them fake. A woman in waist-deep water holding a child. A man shouting warnings. A distant explosion attributed to a dam release. Many of these clips may never have been recorded at all.
The Digital Rights Foundation found that AI-generated content was circulating alongside genuine crisis updates, often making it difficult to separate fact from fiction. That much is the reality. Harder to ignore is its finding that the more emotional a video, the farther it traveled. And once it traveled, it stuck. Platforms do not seem to have been designed for this volume of synthetic narrative. Not at this rate.
The most unsettling videos weren't political. They were quieter, and nearly passed as human stories. Women walking through floodwaters with infants in their arms, their faces strangely symmetrical, their clothes clinging in unnatural ways. Thousands of people liked the videos. Comments flooded in, expressing curiosity, sympathy, and occasionally something darker. Whether most viewers knew that what they were watching was synthetic remains an open question.
The way these clips changed focus is noteworthy. The catastrophe faded into the background. The focus shifted to the body.
This is not wholly novel. Social media has always rewarded attention. But AI alters the math: it removes the production effort. A single creator, if that term still applies, can produce dozens or even hundreds of videos in a day. That scale changes everything.
Lena, a small food creator who films recipes in her kitchen between errands, was among the first to notice it. She saw an influx of new accounts. Polished clips. Endless uploads. Comments never answered. The same captions reused. Her ideas, slightly altered, but still passable. "They don't cook," she once remarked. "They upload." There is something unsettling in that distinction. Content without experience. Output without time.
Platforms, of course, are designed to reward volume and consistency. Regular posting improves visibility. Engagement is treated as a proxy for quality, even when the quality is dubious. When AI enters this system and produces at scale, the algorithm doesn't pause to ask whether a human made the content. It only measures reaction. That is where the tension builds.
In one corner, real creators, constrained by time, effort, and attention. In the other, automated systems that can flood feeds continuously. The platform itself may no longer be able to tell the two apart in any meaningful way.
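The dynamic described above can be sketched as a toy ranking formula. This is a hypothetical illustration, not any platform's actual algorithm: the names, weights, and numbers are invented. It shows only how a score that blends upload frequency with engagement, and never asks who made the content, ends up favoring automated volume over human effort.

```python
# Toy sketch of an engagement/volume ranking signal.
# All weights and figures here are invented for illustration;
# real recommendation systems are far more complex.
from dataclasses import dataclass

@dataclass
class Creator:
    name: str
    posts_per_day: float   # upload frequency
    avg_engagement: float  # per-post engagement, normalized 0..1

def visibility_score(c: Creator, freq_weight: float = 0.6) -> float:
    """Blend posting frequency with per-post engagement.
    Note that authorship (human vs. automated) never enters the formula."""
    return freq_weight * c.posts_per_day + (1 - freq_weight) * c.avg_engagement * 10

human = Creator("careful human cook", posts_per_day=0.5, avg_engagement=0.9)
bot_farm = Creator("automated account", posts_per_day=40, avg_engagement=0.2)

# Sheer volume dominates the ranking despite far weaker
# per-post engagement.
assert visibility_score(bot_farm) > visibility_score(human)
```

Under these made-up weights, the automated account outscores the human creator by a wide margin; tuning the weights changes the magnitude but not the direction, as long as frequency carries any weight at all.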
This content wave also has a darker side. Some AI-generated videos don't just deceive; they distort tragedy. A reenacted catastrophe in Swat went viral, complete with fabricated screams layered over synthetic images. For those who were actually there, watching an inaccurate replay of their own experience must have felt strange, almost intrusive. This is a risk that seems hard to quantify. It is not only misinformation; it is memory being rewritten.
Meanwhile, strange patterns keep appearing. Videos of animated food, pasta families and anthropomorphic meals, being cooked while "screaming" have drawn millions of views. They seem ridiculous at first. But their popularity reveals something deeper: engagement doesn't always make sense. It follows feeling. Shock works. So does novelty. Platforms built to maximize attention amplify both.
It would be easy to characterize this as a failure of moderation. And in part, it is. Labeling systems are still uneven. Generation tools outpace detection tools. But there is a structural problem as well. These platforms were built in an era when creating content took labor. Effort is now optional. That alters the ecosystem.
It feels like watching a supply shock, the kind markets experience when production suddenly becomes abundant and cheap. The difference is that here the product is information, or something that resembles it. And when supply swamps demand, quality tends to decline.
Users notice it subtly. Feeds seem louder. Familiar creators vanish under layers of content. Trust erodes gradually rather than all at once. It gets harder to tell what matters. Harder to care.
Whether platforms can adapt fast enough remains an open question. They are experimenting, adding labels, adjusting algorithms, and promoting "authentic" content, but the scale of AI generation keeps growing, faster than policy can keep up.
Then there is the question of accountability. Regulators, platform companies, and AI developers all have a part to play. But coordination is slow. Incentives don't always align. Meanwhile, the feed keeps filling.
The most remarkable thing is how natural it already feels. The odd phony video. The weirdly over-polished clip. The sense that something is off, but not enough to make you stop scrolling.
That could be the true change. Not only is AI content proliferating on platforms, but users are also getting used to it.
Quietly adjusting. Filtering. Guessing. And perhaps, without even realizing it, lowering their expectations of what the internet should be.

