A wave of misinformation has inundated social media platforms following the reported capture of Venezuelan leader Nicolás Maduro. Users on TikTok, Instagram, and X circulated old videos and AI-generated content, misleadingly claiming they depicted recent events in Caracas. These posts falsely portrayed US Drug Enforcement Administration (DEA) agents and law enforcement officials apprehending Maduro.
In recent years, significant global events have spurred a proliferation of disinformation, particularly as tech companies have scaled back their moderation efforts. Numerous accounts have exploited these relaxed rules, aiming to enhance engagement and attract followers. Former President Donald Trump claimed on Truth Social early Saturday that, “The United States of America has successfully carried out a large-scale strike against Venezuela and its leader, President Nicolás Maduro, who has been, along with his wife, captured and flown out of the Country.”
Shortly thereafter, US Attorney General Pam Bondi revealed that Maduro and his wife had been indicted in the Southern District of New York on multiple charges, including conspiracy to commit narco-terrorism, cocaine importation conspiracy, and possession of machine guns. “They will soon face the full wrath of American justice on American soil in American courts,” Bondi stated in a post on X.
As news of Maduro’s alleged arrest spread, an image purporting to show two DEA agents alongside the Venezuelan president quickly gained traction across various platforms. However, WIRED used SynthID, a technology developed by Google DeepMind, to determine that the image was likely fabricated. “Based on my analysis, most or all of this image was generated or edited using Google AI,” Google’s Gemini chatbot determined after examining the widely shared image. “I detected a SynthID watermark, which is an invisible digital signal embedded by Google’s AI tools during the creation or editing process. This technology is designed to remain detectable even when images are modified, such as through cropping or compression.” Fact-checker David Puente was the first to flag the misleading image.
While X’s AI chatbot Grok also verified that the image was not genuine, it erroneously claimed that it depicted an altered version of the arrest of Mexican drug lord Dámaso López Núñez in 2017. Furthermore, WIRED previously reported that when questioned about Maduro’s capture on Saturday morning, ChatGPT strongly denied that the event had occurred.
In addition to the questionable image, numerous users have leveraged AI tools to create videos purporting to show Maduro’s arrest. Several of these seemingly AI-generated videos circulated on TikTok, garnering hundreds of thousands of views within hours of the reported capture. Many of the TikTok clips appeared to be based on AI-generated images initially posted on Instagram by a digital creator identified as Ruben Dario, which were viewed over 12,000 times. Similar content has also surfaced on X.
X, Meta, and TikTok did not respond to requests for comment by the time of publication.
As is often the case following major global incidents, such as the outbreak of the Israel-Hamas war in October 2023 or the US bombing of Iranian nuclear sites last summer, numerous disinformation purveyors have circulated old footage while asserting it depicted events in Caracas on Saturday. Pro-Trump influencer Laura Loomer was among many who shared footage showing a poster of Maduro being taken down, stating on X: “Following the capture of Maduro by US Special Forces earlier this morning, the people of Venezuela are ripping down posters of Maduro and taking to the streets to celebrate his arrest by the Trump administration.” This footage was originally recorded in 2024, and Loomer later deleted the post.
Another misleading video, claiming to show the US assault on Caracas, was shared by an account named “Defense Intelligence” shortly after Trump’s announcement regarding Maduro’s capture, amassing over 2 million views on X. That footage was actually first posted on TikTok in November 2025. As of the publication of this article, the misleading post remains available on X.
