State actors are behind much of the visual misinformation about the Iran war

Since the outbreak of the Iran war last weekend, a surge of misrepresented and fabricated videos has flooded social media platforms, complicating the public’s understanding of the conflict. This wave of misinformation is driven in large part by state-linked propaganda campaigns aiming to influence perceptions about the war’s progress, casualties, and overall impact. The proliferation of false content is exacerbated by advances in artificial intelligence (AI), which enables the creation of highly convincing yet entirely fictional videos that are difficult for the average user to discern as fake.

One prominent example involved a video showing crowds gazing up at fire, smoke, and debris falling from the top of a high-rise building purportedly located in Bahrain. The video circulated widely on social media, accompanied by claims that the building was struck in an Iranian attack. While it is true that Iranian missile strikes have targeted buildings in Bahrain during the conflict, this particular video was not real. It was an AI-generated fabrication, shared by accounts linked to the Iranian government as part of a broader effort to amplify Iran’s military successes. Careful analysis reveals telltale signs of its inauthenticity, such as two cars on the left side of the frame appearing unnaturally stuck together and a man whose elbow bizarrely passes through his backpack.

This example is emblematic of a larger trend. Since the onset of hostilities, a flood of manipulated or entirely fabricated content has been disseminated online, often with clear political motives. Melanie Smith, senior director of policy and research on information operations at the Institute for Strategic Dialogue, notes that content originating from state actors tends to be more strategically targeted. These actors craft videos and narratives designed to support specific geopolitical messages, shaping public opinion about who is winning the war and the scale of casualties, while reinforcing their own political agendas.

Pro-Iran social media accounts have been especially active in spreading narratives that exaggerate the destruction wrought by Iranian forces and inflate casualty figures. These narratives align closely with reports from Iranian state media, which seek to present Iran as a dominant and victorious actor in the conflict. The use of AI-generated videos depicting supposed airstrikes, such as the fake Bahrain high-rise footage, is a key tactic in this effort to bolster Iran’s image of military strength.

Meanwhile, a separate but related influence operation—known as Operation Overload, or Matryoshka/Storm-1679—has been active, with ties to Russian-aligned networks. This operation uses videos impersonating intelligence agencies and news organizations to sow fear and confusion among the public. For example, it circulated a false warning attributed to Israeli intelligence, cautioning Israelis in Germany and the United States to avoid going outside, thereby undermining a sense of security. Such tactics are designed to manipulate behavior and weaken public trust in official information sources. The network behind Operation Overload has previously employed similar methods during election cycles, demonstrating a pattern of using disinformation to influence political outcomes.

The widespread use of misrepresented and fabricated videos is not unique to the Iran conflict; similar trends have been observed in recent wars, including the Russia-Ukraine and Israel-Hamas conflicts. However, experts highlight a significant difference in the current Iran war: the relative absence of direct information from Iranian citizens. Internet shutdowns and stringent censorship have severely limited the ability of ordinary Iranians to share their perspectives online. This vacuum contrasts sharply with the Ukraine conflict, where firsthand accounts and videos from Ukrainians facing attacks played a crucial role in shaping global understanding and sympathy. Todd Helmus, a senior behavioral scientist at RAND specializing in irregular warfare and information operations, points out that the lack of authentic Iranian voices leaves a gap in the narrative, making it harder to fully grasp the human impact of the conflict or to challenge government propaganda.

Apart from state-linked efforts, opportunistic social media users unaffiliated with any government have also contributed heavily to the misinformation landscape. In pursuit of clicks, these users have recycled old footage from previous conflicts, passed off video game clips as real war scenes, and posted their own AI-generated content without disclaimers. This opportunism further muddies the information environment, making it difficult for observers to separate truth from fiction.

AI technology, in particular, has transformed the scale and sophistication of misinformation. Whereas past conflicts featured mostly manipulated or recycled content, the current wave includes entirely AI-generated videos that simulate realistic scenes, complete with crowd reactions and environmental details. This advancement challenges traditional verification methods and compounds the difficulty of finding reliable information amid the fog of war.
