Donald Trump’s Facebook ban will last at least two years, the company said on Friday. Facebook said the former president’s actions on January 6, which contributed to a violent mob storming the Capitol in an insurrection that left five people dead, “constituted a serious violation of our rules,” and that it adopted the sanction as part of a new approach to public figures during civil unrest.
Facebook added that the two-year sanction is “long enough” to be a significant deterrent to Trump and other world leaders who might post similar messages, and long enough to allow a “period of safety” after his acts of incitement. However, Facebook has still not made a final decision on the future of Trump’s account: the company said that after two years it will reassess whether there is still a risk to public safety and a potential for civil unrest.
“We know that any sanction we apply – or choose not to apply – will be controversial. There are many people who believe it was not appropriate for a private company like Facebook to suspend an outgoing president from its platform, and many others who believe Mr. Trump should have immediately been banned for life,” Nick Clegg, Facebook’s vice president of global affairs, said in a blog post, later adding: “The Oversight Board is not a substitute for regulation, and we continue to call for thoughtful regulation in this space.”
The announcement comes after Facebook’s Oversight Board, a group of experts and journalists the company appointed to handle its toughest content moderation questions, upheld the platform’s suspension of the former president’s account. In May, the board ruled that Facebook should not have banned Trump indefinitely and should make a final decision within six months. The board also said Facebook should clarify its rules for world leaders and the risk of violence, among other recommendations.
“The Oversight Board is reviewing Facebook’s response to the Board’s decision in the case involving former US President Donald Trump and will offer further comments once that review is complete,” the board’s press team said in response to Facebook’s announcement on Friday. Later that day, the board said in a statement that it was “encouraged” by Facebook’s move and that it would monitor the company’s implementation.
Facebook now says it will fully implement 15 of the Oversight Board’s 19 recommendations. It also responded to the board’s request for more details on its newsworthiness exception, a policy Facebook has used – albeit rarely – to give politicians a pass to post content that violates its rules. Facebook says it will now label posts that receive these exceptions and treat politicians’ posts more like those from regular users.
This set of decisions has major implications not only for Trump’s account, but also for national politics in the United States for the foreseeable future. At the same time, the decisions signal that the company remains committed to retaining its power to decide what politicians can ultimately post on the platform. Facebook is providing more details on the rules it could use to punish politicians who violate its community guidelines, which could increase transparency. Still, it’s Facebook that has the final say, including on what counts as newsworthy and stays on the platform versus what violates its community guidelines and gets deleted.
Facebook still decides who gets a free pass from its rules
In Friday’s announcement, Facebook said it would change one of its most controversial policies: an allowance for content that breaks its rules but is important enough to public discourse to stay online, often because it was posted by a politician. Some call it the “newsworthiness exception” or the “world leader exception.” Facebook is now changing the rules to make the exemption more transparent and less unfair. But the company still retains the power to decide what happens the next time a politician posts something offensive or dangerous.
Trump was the impetus for the exemption, which Facebook first created in 2015 after the former president (then a candidate) posted a video of himself saying Muslims should be banned from the United States. The newsworthiness exception was officially announced in 2016 and has long been controversial because it creates two classes of users and posts: those who must follow Facebook’s rules, and those who don’t and can post offensive and even dangerous content.
In 2019, the company added more detail. Clegg said Facebook would assume anything a politician posted to its platform was of public interest and should stay up – “even if it would otherwise violate our normal content rules” – as long as the public interest outweighed the risk of harm.
The policy also presumably serves as a practical shield for Facebook, helping it avoid fights with powerful people (like the president of the United States).
Despite all the controversy and confusion it has generated, Facebook says the newsworthiness exception is rarely deployed. In 2020, Facebook’s independent civil rights audit reported that the company had used the exception only 15 times in the previous year, and only once in the United States. On Friday, Facebook corrected its previous statement to the Oversight Board, saying it had technically used the standard only once in relation to Trump, for a video Trump posted of one of his 2019 rallies. Although Trump was rarely a beneficiary of the policy, the board said in May that the suspension of his account meant Facebook would have to address the continuing confusion.
Now Facebook says politicians’ content will be screened for violations of its community guidelines – and weighed against the public interest – like any other user’s. While that means the official world leader exception is gone, much of the actual power remains where it started: in Facebook’s hands.
Facebook won’t study how the platform contributed to January 6
In the aftermath of the deadly January 6 insurrection, many pointed to the role social media platforms, including Facebook, played in escalating the violence. Facebook’s critics said the insurrection showed that the company should reconsider not only its approach to Trump’s account, but also the algorithms, ranking systems, and design choices that may have helped rioters organize.
Even Facebook’s Oversight Board, an independent body the company created to act as a sort of tribunal for its most difficult content moderation decisions, recommended that Facebook take such a step. Earlier this week, Biden administration allies urged the company to follow that guidance and conduct a public review of how the platform may have contributed to the insurrection.
Facebook has many reasons to believe that its platform contributed to the events of January 6. At a minimum, they have an obligation to conduct a full, independent and thorough investigation and to publish the results. It’s the least they should be doing.
– Katy Glenn Bass (@KGlennBass) June 4, 2021
But Facebook isn’t doing that, and it appears to be deflecting the responsibility. Instead, the company points to a separate research effort focused on Facebook, Instagram, and the 2020 US election, which Facebook says could include studying what happened at the Capitol.
“The responsibility for Jan. 6, 2021 lies with the insurgents and those who encouraged them,” the company said in its Friday announcement, adding that independent researchers and politicians were best placed to study the role of social media in the insurrection.
“We also believe that an objective examination of these events, including contributing societal and political factors, should be conducted by elected officials,” the company wrote, adding that it would continue to work with law enforcement. Republicans in Congress, however, have effectively blocked the possibility of a bipartisan commission on January 6.
Facebook may never make a final decision on Trump
Facebook is delaying, perhaps forever, a final decision on Trump himself. For now, Facebook plans to suspend Trump for at least two years, meaning he could regain his account in early 2023. The ban keeps Trump off the platform during the 2022 midterm elections, when his posts could have boosted (or hurt) hundreds of Republican House candidates.
Still, the two-year ban is not a final decision on whether Trump can come back to Facebook. This means that it is still not clear whether the former president will have access to the platform if he runs for president again. It also leaves open the question of what it would really take for a politician to be permanently kicked off the platform.
Many are frustrated that Facebook has not banned Trump permanently. He may return to the platform in time to run for president in 2024, and Facebook surely knows that. “If it takes 2 years, what can be done to get a lifetime ban?” wrote one employee in an internal post, according to BuzzFeed. Civil rights groups reacting to the decision said it was inadequate and that Trump’s potential social media comeback was a danger to democracy. Some believe the ruling proves yet again that lawmakers need to step in and regulate social media.
Trump, for his part, appears extremely unhappy with Facebook’s decision. “Facebook’s move is an insult to the record-setting 75 million people, as well as many more who voted for us in the rigged 2020 presidential election,” Trump said in a statement released Friday. “They shouldn’t be allowed to get away with this censorship and silence, and eventually we will win. Our country can no longer endure this abuse!”
It’s not even clear what Trump’s return to Facebook would look like. Facebook said the policy is in part aimed at deterring politicians from breaking its rules again, but the current suspension hasn’t stopped the former president from spreading electoral conspiracy theories on other platforms. Facebook has hinted that Trump could eventually return when things are more stable, but it often appears that Trump himself is a primary source of the instability.
What matters is that Trump can’t post to Facebook until 2023 at the earliest, and that the company has some shiny new rules. But overall, Facebook once again retains the power to decide what to do next.
Update, June 4, 6:10 p.m. ET: This article has been updated with further analysis.