Iran's AI propaganda videos rack up a billion views in a month
Image: Illustration by Megaton


By Julius Robert, Sunday, April 19th, 2026

Pro-Iran networks deployed culturally fluent, meme-savvy content to mock Trump and glorify resistance during the Gulf war's opening weeks, achieving viral reach that outpaced traditional Western messaging efforts.


A video of Donald Trump morphing into a crying baby while Iranian missiles rain down. Another showing resistance fighters as anime heroes battling American mechs. These aren't amateur fakes cobbled together in someone's basement. They're part of what the Institute for Strategic Dialogue identified as a coordinated AI propaganda campaign that reached over a billion views on X in the first month of the Gulf war.

The scale marks a shift in information warfare. Where previous state-sponsored campaigns relied on bot farms pushing text posts or crude photoshops, Iran's latest effort taps directly into internet subcultures with AI-generated videos that feel native to the platforms they target. According to The Economist's analysis published April 18, these pro-Iran networks are winning the propaganda war by producing content that's witty, culturally fluent, and technically polished enough to pass as organic viral content.

The videos show careful audience research. One series uses the visual language of sigma male edits, a genre popular with young men on TikTok and Instagram, but replaces the usual entrepreneurs and athletes with Hezbollah fighters. Another deploys vaporwave aesthetics, complete with neon grids and retro fonts, to frame American military equipment as outdated relics. The Trump-as-baby video alone accumulated 47 million views before X removed it, though mirrors continue circulating on Telegram.

What makes this campaign effective isn't just the technology. It's the cultural fluency. The content creators behind these videos understand meme formats, timing, and the specific triggers that drive sharing on Western platforms. They're not translating propaganda. They're creating it natively in the visual vernacular of their target audience.

The Institute for Strategic Dialogue tracked how these videos spread through what it terms laundering networks: accounts that appear legitimate, share mixed content about sports or entertainment, then suddenly amplify political videos during critical moments. The pattern suggests months of preparation, with dormant accounts activated when needed.


Traditional counter-messaging appears ineffective against this approach. State Department videos explaining policy positions or fact-checking claims simply don't compete for attention against content designed to trigger emotional responses and shares. As one ISD researcher noted, "They're not trying to convince anyone of facts. They're trying to make resistance look cool and America look weak."

The technical quality varies across the campaign. Some videos show clear AI artifacts: temporal flickering, inconsistent lighting between frames, telltale distortion around edges. Others blend AI-generated elements with real footage seamlessly enough that detection requires frame-by-frame analysis. The Economist reports that at least twelve different AI video models appear to be in use, though specific platforms weren't identified.

Image: Stylized AI-generated video showing political figures or events in a meme-like format

X's response has been inconsistent. While the platform removed the highest-profile videos after they went viral, hundreds of smaller variations continue circulating. The company declined to comment on its detection methods or whether it's developing new systems to identify AI-generated propaganda.

Iran has attempted digital influence operations before, but this campaign represents a qualitative leap. Previous campaigns from 2019-2023 relied heavily on fake news sites and Twitter bots pushing articles. Those efforts, while persistent, rarely achieved viral reach outside Persian-language networks. The shift to AI video content fluent in internet culture changed the dynamics entirely.

The timing matters. These videos emerged just as consumer AI video tools reached a threshold of quality and accessibility. What required a visual effects team and weeks of work two years ago can now be produced in hours. The same democratization that enables indie creators also equips state propaganda operations.

The lessons are stark. AI video generation has lowered the cost of sophisticated propaganda from millions of dollars to thousands. Cultural fluency matters more than technical perfection for viral reach. Platform detection systems remain optimized for text and static images, not video. Traditional fact-checking and counter-messaging fail against entertainment-formatted propaganda. And the liar's dividend problem intensifies as audiences lose the ability to distinguish real video from synthetic.

Western governments are scrambling to develop responses, though none have emerged publicly. The challenge isn't just technical. It's conceptual. How do you counter propaganda that doesn't look like propaganda, that viewers share because they find it funny or aesthetically striking rather than politically persuasive?

The next phase may already be underway. ISD researchers identified a new cluster of accounts sharing AI-generated videos about Taiwan, using similar aesthetic strategies but targeting different cultural references. Whether these represent Iranian expansion or other actors adopting the playbook remains unclear. The information battlefield has shifted, and the old rules no longer apply.
