I’m a researcher of media manipulation, and watching the 2024 US election returns was like seeing the Titanic sink.
Every day leading up to 5 November, more and more outrageous claims spread across social media in an attempt to undermine election integrity: conspiracy theories about a tidal wave of immigrants plotting to undermine the right wing, allegations that millions of excess ballots were circulating in California, and rumors that the voting machines had already been corrupted by malicious algorithms.
All of the disinformation about corrupt vote counts turned out not to be necessary, as Donald Trump won the election decisively. But the election proved that disinformation is no longer the province of anonymous accounts amplified by bots to mimic human engagement, as it was in 2016. In 2024, lies traveled further and faster across social media, which is now a battleground for narrative dominance. And now, the owners of the platforms circulating the most incendiary lies have direct access to the Oval Office.
We talk a lot about social media “platforms”. The word “platform” is interesting because it means both a stated political position and a technological communication system. Over the past decade, we have watched social media platforms warp public opinion by deciding what is seen and when users see it, as algorithms double as newsfeed and timeline editors. When tech CEOs encode their political beliefs into the design of platforms, it is a form of technofascism, in which technology is used to suppress political speech and to repress the organization of resistance to the state or to capitalism.
Content moderation at these platforms now reflects the principles of the CEO and what that person believes is in the public’s interest. The political opinions of tech’s overlords, like Musk and Zuckerberg, are now directly embedded in their algorithms.
For example, Meta has limited the circulation of critical discussions about political power, reportedly even downranking posts that use the word “vote” on Instagram. Meta’s Twitter clone, Threads, suspended journalists for reporting on Trump’s former chief of staff describing Trump’s admiration of Hitler. Threads built in a politics filter that is turned on by default.
These filtering mechanisms mark a sharp break from Meta’s embrace of politicians in 2016, when candidates received personalized white-glove service and Facebook embedded employees directly in political campaigns to advise on branding and reaching new audiences. It is also a striking reversal of Zuckerberg’s free speech position in 2019, when he gave a presentation at Georgetown University claiming that he was inspired to create Facebook because he wanted to give students a voice during the Iraq war. This historical revisionism was quickly skewered in the media. (Facebook’s predecessor allowed users to rate the appearance of female Harvard freshmen; misogyny was the core of its design.) Nevertheless, his false origin story encapsulated a vision of how Zuckerberg once believed society and politics should be organized, with political discussion as his guiding reason for bringing people into community.
However, he now appears to have abandoned this position in favor of disincentivizing political discussion altogether. Recently, Zuckerberg wrote to the Republican Jim Jordan saying he regretted his content moderation decisions during the pandemic because he acted under pressure from the Biden administration. The letter itself was an obvious attempt to curry favor as Trump rose as the Republican presidential candidate. Zuckerberg has reason to fear Trump, who has mentioned wanting to arrest Zuckerberg for deplatforming him on Meta products after the January 6 Capitol riot.
X seems to have embraced the disinformation chaos and fully fused Trump’s campaign into the design of X’s content strategies. Outrageous assertions swirl across X, including false claims that immigrants are eating pets in Ohio, that Kamala Harris’s Jamaican grandmother was white, and that immigrants are siphoning disaster aid from Fema. It is also worth noting that Musk is the biggest purveyor of anti-immigrant conspiracy theories on X. The hiss and crackle of disinformation is as ambient as it is unsettling.
There is no clearer sign of Musk’s willingness to use platform power than his relentless amplification of his own account, as well as Trump’s, in X’s “For You” algorithm. Moreover, Musk bemoaned Twitter’s suppression of links to the Hunter Biden laptop story in 2020, then hypocritically worked with the Trump campaign in 2024 to ban accounts and links to leaked documents from the Trump campaign that painted JD Vance in a negative light.
Musk understands that he will personally benefit from being close to power. He supported Trump with a controversial political action committee that gave away cash to those who signed his online petition. Musk also paid millions for canvassers and spent many evenings in Pennsylvania stumping for Trump. With Trump’s win, the president-elect will need to make good on his promise of placing Musk in a position at the not-yet-created “Department of Government Efficiency” (Doge – which is also the name of Musk’s favorite cryptocurrency). While it sure seems like a joke taken too far, Musk has said he plans to cut $2tn from the federal budget, which would wreak havoc on the economy and could be devastating when coupled with the mass deportation of 10 million people.
In short, what we learn from the content strategies of X and Meta is simple: the design of platforms is now inextricable from the politics of the owner.
This wasn’t inevitable. In 2016, there was a public reckoning that social media had been weaponized by foreign adversaries and domestic actors to spread disinformation on a number of wedge issues to millions of unsuspecting users. Hundreds of studies were conducted in the intervening years, by internal corporate researchers and independent academics, showing that platforms amplify and expose audiences to conspiracy theories and fake news, which can lead to networked incitement and political violence.
By 2020, disinformation had become its own industry and the need for anonymity lessened as rightwing media makers directly impugned election results, culminating in January 6. That led to an unprecedented decision by social media companies to ban Trump, who was still the sitting president, and a number of other high-profile rightwing pundits, thus illustrating just how powerful social media platforms had become as political actors.
In reaction to this unprecedented move to curb disinformation, the richest man in the world, Musk, bought Twitter in 2022, laid off much of the staff, and handed internal company communications to journalists and politicians. Major investigations of university researchers and government agencies ensued, naming and shaming those who had engaged with Twitter’s former leadership and appealed for the company to enforce its own terms of service during the 2020 election.
Since then, these CEOs have ossified their political beliefs in the design of their algorithms and, by extension, dictated political discourse for the rest of us.
Whether it’s Musk’s strategy of overloading users with posts from himself and Trump, or Zuckerberg’s silencing of political discussion, it is citizens who suffer from such chilling of speech. Of course, there is no way to know decisively how disinformation affected individual voters, but a recent Ipsos poll shows that Trump voters believed disinformation on a number of wedge issues, holding that immigration, crime, and the economy are all worse than the data indicate. For now, let this knowledge be the canary in the coalmine, warning of technofascism, in which the US is ruled not only by elected politicians but also by technological authoritarians who control speech on a global scale.
If we are to disarm disinformers, we need a whole-of-society approach that values real TALK (Timely, Accurate, Local Knowledge) and community safety. This might look like states passing legislation to fund local journalism in the public interest, because local news can bridge divides between neighbors and bring some accountability to government. It will require our institutions, such as medicine, journalism, and academia, to fight for truth and justice, even in the face of anticipated retaliation. But most of all, it’s going to require that you and I do something quickly to protect those already in the crosshairs of Trump’s new world order, by donating to or joining community organizations tackling issues such as women’s rights and immigration. Even subscribing to a local news outlet is a profound political act these days. Let that sink in.
Joan Donovan is the founder of the Critical Internet Studies Institute and assistant professor of journalism at Boston University