The phrase “out of the frying pan, into the fire” is an incredibly apt description of the plight of the internet’s social media giants in 2020. Already struggling to settle into their increasingly large roles in democracy and culture, social networks like Facebook and Twitter suddenly became even more central to our daily lives as the coronavirus pandemic took hold. In the face of this extra pressure, they had no choice but to adapt.
While these forced adaptations were no doubt difficult for the companies involved, the resulting changes have arguably been good ones — not only for individual users, but for the world at large.
Too many fires to put out
When the COVID-19 pandemic took hold, social media was a natural fallback. People turned to their online networks for community updates, virtual hangouts, news, entertainment, and more. Giants such as Facebook and Twitter faced a fresh coronavirus-related “infodemic,” while at the same time they shouldered an urgent responsibility to police an influx of controversial political content from President Donald Trump and many others who were quickly racking up huge follower counts.
….Twitter is completely stifling FREE SPEECH, and I, as President, will not allow it to happen!
— Donald J. Trump (@realDonaldTrump) May 26, 2020
That’s not all. Three months into the pandemic lockdown, as people tried to establish normalcy in their work-from-home routines, the police killing of George Floyd spurred nationwide Black Lives Matter protests, social media activism, and yet another tide of questionable online posts that brought social networks back into the spotlight.
Social media companies were running out of options. No longer could they downplay their massive role in the spread of misinformation and sit idly by while we descended into chaos. Then, one of them took an unprecedented plunge. On May 26, Twitter for the first time took action on a tweet by Trump, who had claimed mail-in ballots would lead to “a rigged election.”
Forced moderation
The repercussions were not in Twitter’s favor. Trump soon launched an all-out assault on social networks and signed an executive order that sought to curtail Section 230, the provision of the Communications Decency Act that shields these sites from being held responsible for the content they host. With those protections potentially on the chopping block, Facebook (to an extent) and Twitter began actively taking down political posts, including Trump’s own, something they had refused to do for the previous four years.
Unlike Twitter, Facebook was remarkably slow to take a stand, and that cost the company further reputational damage. For the first time since its inception, Facebook employees publicly criticized the company and voiced disapproval of its political choices. One such employee quit and published a scathing, 24-minute video detailing how Facebook “hurts people at scale.”
With the general election looming, Facebook and its social media peers couldn’t afford a repeat of the Cambridge Analytica scandal that tainted the 2016 race. In the ensuing weeks, these companies scrambled to patch their services and ensure they couldn’t be abused for political gain or hate speech campaigns. That meant flagging misleading posts no matter which world leader published them, cracking down on political ads, blocking shares of certain content even at the cost of engagement, and curbing the spread of conspiracy hoaxes like the “Plandemic” movie. Facebook also released the results of an internal audit that concluded the company was not doing enough to protect civil rights.
In a nutshell, 2020 was the year social networks began to truly realize they aren’t actually making the “world a better place” anymore, and to an extent, these initiatives were a breath of fresh air. At the same time, with malicious movements like QAnon and Boogaloo emerging to test the new safeguards, the initiatives often felt piecemeal: reactive rather than preventative, and too little, too late.
The politics of platforms
As politics continued to seep into the tech world, the results were nearly fatal for one social app in particular: TikTok, the only remaining threat to Silicon Valley’s established social giants, was pushed to the verge of a complete ban in the United States over national security concerns. To survive, the China-based video platform had to agree to a deal with Oracle for its US operations, and it lost its ex-Disney CEO, Kevin Mayer, after a stint of just three months.
Even though TikTok survived the Trump administration, the episode showed what political interference can do to an app responsible for the livelihood of thousands of creators. Amid the mounting brouhaha around its Chinese roots, TikTok did end up getting banned in India, its largest market in terms of users.
In the midst of all this, Facebook CEO Mark Zuckerberg also testified in a landmark House Judiciary Committee hearing. Despite the excruciatingly long virtual session, lawmakers were barely able to sink their teeth into the most pressing issues, and many simply resorted to grilling the tech overlords over how the platforms treated their own parties. The hearing was not totally unproductive, however.
‘Instagram can hurt us’
For the hearing, Facebook had to hand over a trove of internal communications that unearthed the details Congress needed to launch an antitrust investigation. “Instagram can hurt us,” Zuckerberg wrote in one email from February 2012. A few weeks ago, the Federal Trade Commission sued Facebook, seeking to unwind its acquisitions of Instagram and WhatsApp.
These concerns could shape the tech industry for years to come, especially given that Google is currently also facing an unprecedented antitrust lawsuit. What would a world of several competing tech giants look like?
Another question that hangs in the balance is whether social networks will ever be able to return to normalcy. Over half of participants in a study conducted by the Pew Research Center said they were “worn out” by political posts and discussions.
The road ahead
It’s unlikely that social networks will ever be able to restore their former images as they come under growing scrutiny across the world.
And next year, social media companies face another hurdle: vaccine misinformation. Facebook and Twitter have already struggled to fully suppress pandemic fake news, and we can expect more of it once countries kick-start wide-scale vaccine rollouts.
Tech companies have begun gearing up with updated policies and information centers, but will that be enough to stop vaccine misinformation? Only time will tell. What we do know today is that 2020 has reformed social networks and their priorities in more ways than one, and that should hopefully be reason enough for them not to take their foot off the pedal just yet.