YouTube isn't playing around with foreign influence operations anymore. The platform recently wiped out a pro-Iran channel that spent its time uploading slick, brick-based animations designed to poke fun at Donald Trump and American military figures. It sounds ridiculous—Lego-style figures used for geopolitical warfare—but the intent was dead serious.
This wasn't just a hobbyist with a camera. The channel, which operated under names like "Kish" or "Kish Media," was part of a coordinated effort to spread Iranian state narratives through a medium that usually feels safe and playful. By the time YouTube pulled the plug, the channel had racked up millions of views. It was a classic example of "information laundering," where state propaganda gets dressed up as pop culture to slip past your mental filters.
The Strategy Behind Brick-Based Propaganda
Most people think of state propaganda as grainy news broadcasts or angry speeches. That's old school. Modern psychological operations use "militainment." They take something universally loved—like Lego—and weaponize it.
The animations were surprisingly high quality. They featured digital recreations of Donald Trump, Mike Pompeo, and various military leaders. The scripts usually focused on the 2020 assassination of Qasem Soleimani, the Iranian general. One viral clip showed a Lego-fied version of the Mar-a-Lago estate being targeted. Another depicted Trump hiding in a bunker while Iranian drones circled overhead.
It's clever. It's also dangerous. When you see a plastic yellow figure, your brain doesn't immediately scream "foreign interference." You're more likely to watch, laugh, and share. That's exactly what the Islamic Republic of Iran Broadcasting (IRIB) wants. Tech researchers have been tracking these assets for months, noting that the production value suggests a professional studio rather than a basement creator.
Why YouTube Acted Now
YouTube's official stance is that the channel violated policies regarding "coordinated influence operations." Essentially, they found evidence that the account was linked to the Iranian government while pretending to be an independent creator.
The platform has been under massive pressure to clean up its act ahead of global elections. They can't afford to let state-backed actors run wild with deepfakes or inflammatory animations. It’s not just about the content itself—which was arguably "satire"—it’s about the source. If the Iranian government wants to run ads or post videos, they’re supposed to label them. They didn't. They tried to hide.
I've seen this play out before with Russian and Chinese botnets. The goal isn't necessarily to make you love Iran. The goal is to make Americans hate each other. By mocking a polarizing figure like Trump, these channels farm engagement from his critics while enraging his supporters. It's a win-win for anyone looking to destabilize Western discourse.
The Connection to IRGC and State Media
Follow the money and the metadata. Security firms like Mandiant and Google’s own Threat Analysis Group (TAG) have spent years mapping out these networks. The "Lego" channel shared digital fingerprints with several other accounts previously linked to the IRIB and the Islamic Revolutionary Guard Corps (IRGC).
These groups use a "scattershot" approach. They launch twenty channels. Nineteen get banned. One survives long enough to hit the algorithm’s front page. This specific channel managed to stay live far longer than it should have by piggybacking on trending hashtags and using English-language voiceovers that sounded suspiciously like professional AI-generated scripts.
The irony? Lego actually hates this. The company has strict guidelines about its brand being used for political or violent imagery. While these weren't "official" Lego products—they were 3D models made to look like them—the association hurts the brand. It's a form of IP hijacking in the service of state propaganda.
Breaking the Cycle of Digital Warfare
If you think this is the end of it, you're kidding yourself. As soon as one channel goes dark, three more pop up with slightly different names. The "Lego" tactic worked because it was visual and easy to digest. Expect the next wave to use Minecraft-style blocks or Roblox environments.
The platforms are constantly playing catch-up. AI tools make it cheaper than ever to churn out this stuff. Five years ago, an animation like this would take a team of artists weeks to finish. Today? One guy with a powerful GPU and a script can do it in an afternoon.
We need to get better at spotting the "vibe" of state propaganda. If a video feels weirdly obsessed with a specific foreign policy grudge—like the Soleimani strike—and it's coming from an account with no clear history or personality, it's probably a plant.
Spotting the Signs of Foreign Influence
You don't need to be a cybersecurity expert to see the strings on the puppet. Here's what to look for when you're scrolling.
First, check the "About" section. Most of these pro-Iran channels have vague descriptions or were created years ago but only started posting "satire" recently. This is a sign of a "zombie account"—a hacked or bought channel used to bypass initial spam filters.
Second, look at the comments. They’re often filled with "praise be" messages or repetitive slogans in broken English. It’s a telltale sign of a bot farm boosting the video to trick the algorithm into thinking the content is "viral."
Third, notice the hyper-fixation. Normal creators talk about many things. State actors talk about one thing from fifty different angles. For Kish Media, that one thing was the humiliation of the United States.
Stop treating these videos as harmless memes. They are digital munitions. When you engage with them, you’re helping a foreign intelligence service reach its KPIs. YouTube did its job by nuking the channel, but the responsibility to not be a "useful idiot" falls on the viewer. Check the source before you click the share button. It’s that simple.