Facebook COO Sheryl Sandberg is rightfully being criticized for claiming that the US Capitol riots were “largely organized” on other social media platforms “that don’t have our abilities to stop hate and don’t have our standards and don’t have our transparency.” Despite her attempts to deflect blame, Facebook as well as other social media companies played a crucial role in fueling the mob that stormed the Capitol building.
The compartmentalization of users in echo chambers where only extreme — and often false — viewpoints can travel is probably the single most important driver of political polarization. Absent a social media landscape that incites outrage and division, the Capitol riots likely never would’ve happened.
With so much damage already done and just a handful of days left in the Trump presidency, social media and technology companies finally banned Trump and the apps some white supremacist followers used to incite violence. But these decisions were made too late and now risk emboldening conspiracy theorists who still believe the election was stolen. They also raise legitimate questions as to why we let profit-driven companies set the rules of democratic engagement online in the first place.
It is troublesome that we’ve turned over the rules of online deception, harassment and incitement to violence to powerful tech CEOs who are not accountable to the public. Our democratic institutions must take back control and introduce regulations that address incendiary online propaganda.
Here’s how Congress and the Biden-Harris administration should approach social media regulation in the months ahead:
Establish a disinformation task force
For the Biden-Harris administration to begin to address the problem, it should treat the spread of disinformation as a fundamental threat to achieving progress in all facets of policy, as a large coalition that I’m part of urged last month.
The spread of online disinformation is not only a threat to our democratic institutions; it also undermines important efforts to respond to the pandemic or combat climate change. The incoming administration should establish a task force to study the harms of disinformation on social media; launch a website to combat viral disinformation; and appoint an expert on disinformation to the Covid-19 task force — this person would be in charge of coordinating a national response.
By prioritizing the fight against disinformation, the next administration can help restore common ground between existing factions, lay the foundation for Congress to reverse the decay of our democratic institutions and fix our broken information ecosystem.
Expand financial disclosure laws
For far too long, actors that benefit from an unregulated internet have invoked the specter of the First Amendment to block any legislation that would increase online transparency, especially when it comes to political speech.
Six years ago, as vice chair of the Federal Election Commission, I called on the commission to discuss online political messaging, which was and still is largely exempt from federal campaign finance disclosure laws. These gaps allow malicious actors — both domestic and foreign — to anonymously target voters with inflammatory propaganda, and to leverage influencers, troll farms and fake accounts to amplify their messages, all outside of the public eye.
As online political disinformation morphs into physical violence, it’s past time for comprehensive reforms such as H.R.1, which was recently reintroduced in the House. H.R.1 introduces clear disclosure and disclaimer requirements for online political ads, requires organizations to reveal their top donors and mandates that tech companies maintain a public database of the online political ads shown to their users, including who each ad targeted, who bought it and the rates charged.
Improve online content moderation
Congress should also pass legislation requiring social media and technology companies to enhance the transparency of their content moderation processes; conduct periodic risk assessments of how their rules and recommendations may help spread deceptive propaganda and other harmful content; and explore suitable mitigation strategies in cooperation with government agencies, experts, consumer associations and civil society groups.
In their role as gatekeepers of the public discourse, social media companies should also be required to develop a crisis protocol that could be activated to contain the spread of harmful activities that pose an imminent threat to public security or public health. Other jurisdictions, such as the EU, are already considering options like these, and US lawmakers should have a thoughtful debate of their own on how to hold platforms accountable without stifling free speech.
The violent outburst of anger that disrupted a key step in the peaceful transfer of power and resulted in the deaths of five people was not only the outcome of white supremacy and viral conspiracy theories enabled by unregulated digital platforms — it was also preventable. We are staring into the abyss, but the erosion of our democracy is not an inevitable fate.