X’s engagement-driven algorithms transformed the Minnesota shooting into a global disinformation crisis within hours, as political influencers and high-profile figures—including Elon Musk—leveraged the platform’s weakened moderation to broadcast contradictory and unverified narratives. While other social media networks grappled with the fallout, X functioned as an optimized catalyst for falsehoods, rewarding users who prioritized sensationalism over verified reporting.
The Infrastructure of Viral Misinformation
The explosive proliferation of conspiracy theories stems directly from X’s current architectural priorities. Laura Edelson, an assistant professor at Northeastern University specializing in online disinformation, notes that the platform’s feed algorithm maximizes engagement at any cost, including negative engagement. “In these conditions, conspiratorial, extreme content tends to perform very well,” Edelson explains, citing the combination of weakened content rules and algorithmic amplification as the primary driver of the chaos.
Since the 2022 takeover, the platform has dismantled the specialized teams previously tasked with mitigating disinformation. In their place, the platform now incentivizes high-volume users to share out-of-context clickbait to maximize visibility and potential revenue. When WIRED sought a response regarding these systemic failures, X declined to comment.
A Battlefield of Contradictory Narratives
Minutes after authorities identified the shooter, dozens of accounts disseminated an 11-minute video showcasing a massive cache of weapons and ammunition. The shooter adorned these items with over 120 symbols, phrases, and memes referencing a chaotic spectrum of hateful ideologies. Despite warnings from extremism researchers to avoid premature conclusions, X users immediately weaponized the imagery to serve specific political agendas.
Right-wing figures, including Representative Marjorie Taylor Greene, podcaster Benny Johnson, and Elon Musk himself, focused heavily on the shooter’s gender identity. Although X’s AI chatbot, Grok, later clarified that transgender individuals do not disproportionately commit mass shootings, the initial narrative had already achieved millions of views. Simultaneously, commentator Nick Sortor and FBI Director Kash Patel suggested the attack targeted Catholics, with Patel noting the investigation treated the event as a “hate crime targeting Catholics.” Other voices, such as Laura Loomer, alleged radicalization by leftist and Islamic ideologies, citing anti-Israel phrases found on the weapons.
Conversely, left-wing commentators highlighted the shooter’s use of racist language and praise for previous mass killers. Benjamin Dixon, a podcaster, characterized the perpetrator as a “right-wing incel.” The platform’s automated summaries further muddied the waters by highlighting “anti-Trump messages” while ignoring the broader, more complex set of symbols present at the scene.
Nihilism Over Ideology: What Researchers Actually Found
While partisan accounts raced to categorize the shooter, extremism experts analyzing the digital footprint suggest a more disturbing reality: the absence of a coherent ideology. Marc-André Argentino, a prominent researcher, argues the shooter likely belongs to a growing faction of nihilistic violent extremists. These individuals view violence not as a means to a political end, but as the goal itself.
“They are simply doing this for the sake of violence, for their desire for notoriety, to know what it feels like to be one of their idols,” Argentino wrote. He warned that such attackers often follow a “script” designed to troll journalists and trick the media into creating a “Streisand effect,” thereby immortalizing the attack through out-of-context viral spread.
The Systematic Collapse of Platform Integrity
The current environment on X rewards speed over accuracy, a phenomenon Nina Jankowicz, CEO of the American Sunlight Project, describes as “context collapse.” This tactic involves twisting real events or quotes to mislead audiences who have little incentive to read beyond a headline or a 280-character post.
The replacement of professional moderation with the crowdsourced Community Notes system and the Grok AI chatbot has proven insufficient during major breaking news events. This lack of oversight previously fueled misinformation during the Israel-Hamas conflict and recent civil unrest in Los Angeles. Mike Rothschild, an author focusing on conspiracy theories, concludes that X has become a sanctuary for disinformation accounts and grifters. “There are certain narratives about mass shootings that will instantly find homes on X, and nothing holds them back from spreading,” Rothschild observes.
