Today, prompted by the Oversight Board’s continued push for a more proportionate penalty system, we’re updating Facebook’s penalty system to make it more effective and fair. Since nothing is changing about the underlying content removal, we don’t expect this refinement to have a material effect on prevalence. With this update, we will still be able to keep our apps safe while allowing people to express themselves.
Under the new system, we will focus on helping people understand why we removed their content, which our analysis shows is more effective at preventing repeat violations than quickly restricting their ability to post. We will still apply account restrictions to persistent violators, typically beginning at the seventh violation, after we’ve given sufficient warnings and explanations to help the person understand why we removed their content. We will also restrict people from posting in groups at lower thresholds where warranted. For more serious violations, such as posting content that includes terrorism, child exploitation, human trafficking, suicide promotion, sexual exploitation, the sale of non-medical drugs or the promotion of dangerous individuals and organizations, we will continue to apply immediate consequences, including account removal in severe cases.
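To make the escalation ladder concrete, here is a minimal sketch of the logic described above. Only the seventh-strike threshold and the severe-violation categories come from this post; the category strings, the group-posting threshold, and the function and action names are hypothetical illustrations, not Meta’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch only: category names, the group-posting threshold,
# and action names are assumptions, not Meta's implementation.

SEVERE_CATEGORIES = {
    "terrorism",
    "child_exploitation",
    "human_trafficking",
    "suicide_promotion",
    "sexual_exploitation",
    "non_medical_drug_sales",
    "dangerous_individuals_and_organizations",
}

ACCOUNT_RESTRICTION_THRESHOLD = 7  # seventh violation, as stated in the post
GROUP_POSTING_THRESHOLD = 4        # "lower threshold" -- exact value assumed


@dataclass
class Account:
    strikes: int = 0


def enforce(account: Account, category: str) -> str:
    """Return the enforcement action for a newly removed piece of content."""
    if category in SEVERE_CATEGORIES:
        # Serious violations skip the strike ladder entirely.
        return "immediate_consequences"  # up to account removal in severe cases

    account.strikes += 1
    if account.strikes >= ACCOUNT_RESTRICTION_THRESHOLD:
        return "account_restriction"
    if account.strikes >= GROUP_POSTING_THRESHOLD:
        return "group_posting_restriction"
    # Early strikes: explain the removal rather than restrict posting.
    return "warning_and_explanation"
```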
The vast majority of people on our apps are well-intentioned. Historically, some of those people have ended up in “Facebook jail” without understanding what they did wrong or whether they were impacted by a content enforcement mistake.
Our analysis has found that nearly 80% of users with a low number of strikes do not go on to violate our policies again in the next 60 days. This suggests that most people respond well to a warning and an explanation, because they don’t want to violate our policies. At the same time, some people are determined to post violating content regardless of our policies. Our analysis suggests that applying more severe penalties at the seventh strike is a more effective way to give well-intentioned people the guidance they need while still removing bad actors.
As well as being more effective, these changes will be fairer to people who may have been disproportionately impacted by our old system, particularly when we made an incorrect moderation decision or missed context. For example, someone might jokingly post, “I’m on my way to kidnap you,” about taking a friend out for dinner after that person had a rough day, not knowing that, stripped of context, the statement could violate our violence and incitement policy. Someone else might post a friend’s name and address to share details for a party; without that context, the post might appear to share personally identifiable information in violation of our policies. The implications of overenforcement are real: when people are unintentionally caught up in this system, they may find it hard to run their business, connect with their communities or express themselves.
Our previous system escalated quickly to long penalties, such as a 30-day block on a person’s ability to create content. These long blocks were frustrating for well-intentioned people who had made mistakes, and they did little to help those people understand our policies. The blocks also made it harder for us to spot violation trends, and they sometimes had the effect of letting bad actors stay on the site longer, since a person blocked from posting can’t accrue the further strikes that would trigger account-level action. Our new system, which reduces the number of restriction periods, will allow us to detect persistent violators in less time, leading to faster and more impactful action.
Today’s changes, while based on our own analysis and feedback from the Oversight Board, are also responsive to feedback from our community, including our civil rights auditors, who noted that our enforcement system needed a greater focus on proportionality. Independent experts who have offered guidance on this topic have routinely noted that our penalty system should strike a better balance between punishment and encouraging remediation through education.
“…It is positive that Meta has introduced further transparency and coherence in this area as a result of implementing prior Oversight Board recommendations, moving towards what should be a more proportionate and transparent approach with higher strike-to-penalty thresholds. Meta’s plans to issue more comprehensive penalty notifications should ensure that users are better placed to understand the reasons for the consequences of strikes and the reasons for feature-limits in the future.” – Oversight Board, Iran protest slogan case decision
External guidance is an important part of this process, and we’ll continue to work with experts in academia, civil society and the policy community to improve and evolve our policy enforcement in an ever-changing environment. We will provide more updates as this process continues.