Meta’s Ongoing Efforts Regarding Russia’s Invasion of Ukraine

  • We’ve established a special operations center staffed by experts from across the company, including native Russian and Ukrainian speakers, who are monitoring the platform around the clock, allowing us to respond to issues in real time. 
  • We’ve added several safety features in Ukraine, including the ability for people to lock their Facebook profile, removing the ability to view and search friends lists, and additional tools on Messenger.
  • We’re taking extensive steps to fight the spread of misinformation by expanding our third-party fact-checking capacity in Russian and Ukrainian. We’re also providing more transparency around state-controlled media outlets, prohibiting ads from Russian state media and demonetizing their accounts.

Our thoughts are with everyone affected by the war in Ukraine. We are taking extensive steps across our apps to help ensure the safety of our community and support the people who use our services — both in Ukraine and around the world.

The following are some of the specific steps we have taken regarding Russia’s invasion of Ukraine:

Helping to Keep People in Ukraine Safe

We’ve added several safety features in Ukraine in response to the situation on the ground.

  • Lock Your Profile: This tool allows people to lock their Facebook profile in one step. When someone’s profile is locked, people who aren’t their friends can’t download, enlarge or share their profile photo, and can’t see posts or other photos on their profile, regardless of when they were posted. Our teams are working with non-governmental organizations and civil society organizations to help ensure people know these tools are available.
  • Friends Lists: We’ve temporarily removed the ability to view and search the friends lists of Facebook accounts in Ukraine to help protect people from being targeted.
  • Instagram Privacy and Security Reminders: We’re sending everyone on Instagram in Ukraine a top-of-feed notification about privacy and account security. For public accounts, we are reminding them to check their settings in case they want to make their accounts private. When someone makes their account private, any new followers will need to be approved, and only their followers will be able to see their posts and stories. For people who already have private accounts, we’re sharing tips on how to keep their account secure through strong passwords and two-factor authentication.
  • Privacy and Safety in Messenger: We’ve expanded the tools available to Messenger users in Ukraine, such as by quickly rolling out notifications for screenshots and disappearing messages in our end-to-end encrypted chats.
  • Secure Messaging on WhatsApp: As always, your personal messages and calls are protected with end-to-end encryption by default, so they cannot be intercepted by any government; a simplified sketch of the underlying idea follows this list. You can now use “view once” media to send photos or videos that vanish after being seen, and “disappearing mode” to automatically erase all new chats after 24 hours to protect your information in case you lose your phone. We strongly recommend everyone enable two-step verification to protect against hackers who might try to lock you out of your account.
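
WhatsApp’s actual cryptography is built on the Signal protocol and is not described in this post; the snippet below is only a conceptual sketch, using the open-source PyNaCl library, of what end-to-end encryption means in practice: the keys live only on the two devices, so a server (or anyone else) relaying the message sees nothing but ciphertext.

```python
# Conceptual sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# This is NOT WhatsApp's implementation (WhatsApp uses the Signal protocol);
# it only illustrates that decryption requires keys held solely on the devices.
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; private keys never leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"Stay safe - message me when you can.")

# A server relaying `ciphertext` sees only opaque bytes; only the recipient's
# private key (combined with the sender's public key) can decrypt the message.
receiving_box = Box(recipient_key, sender_key.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```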

Enforcing Our Policies

We are taking additional steps to enforce our Community Standards and Community Guidelines, not only in Ukraine and Russia but globally, wherever content about the conflict may be shared.

  • We are enforcing our policies on hate speech, violence and incitement, and coordinating harm, among others, using technology to help us find violating content quickly – often before people see it and report it to us.
  • We’ve established a special operations center staffed by experts from across the company, including native Russian and Ukrainian speakers, who are working around the clock to monitor and respond to this rapidly evolving conflict in real time. This allows us to remove content that violates our Community Standards or Community Guidelines faster and serves as another line of defense against misinformation.
  • We have teams of native Russian and Ukrainian content reviewers to help us review potentially violating content. We’re also using technology to scale the work of our content review teams and to prioritize the content those teams should spend their time on, so we can take down more violating content before it goes viral; a simplified sketch of this kind of prioritization follows this list.
  • We’re getting feedback from a network of local and international partners on emerging risks and moving quickly to address them. We recognize that local context and language-specific expertise are essential for this work, so we will remain in close communication with experts, partner institutions and non-governmental organizations.
  • As part of this effort, our security teams are continuing to monitor for emerging threats and enforce against malicious activity.
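
The post does not say how content is prioritized for review, so the sketch below is purely hypothetical: a small priority queue that orders reported posts by a combined severity-and-reach score, so that likely-viral, high-severity items reach human reviewers first. The field names, scores and weighting are illustrative assumptions, not Meta’s actual signals.

```python
# Hypothetical sketch of review-queue prioritization (not Meta's actual system).
# Posts with higher predicted severity and reach are reviewed first.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedPost:
    priority: float                      # lower value pops first (min-heap)
    post_id: str = field(compare=False)  # excluded from ordering

def priority_score(predicted_severity: float, predicted_reach: float) -> float:
    """Combine a model's severity estimate (0-1) with expected reach,
    negated so higher-risk posts come off the min-heap first."""
    return -(predicted_severity * predicted_reach)

queue: list[QueuedPost] = []
heapq.heappush(queue, QueuedPost(priority_score(0.9, 50_000), "post_a"))
heapq.heappush(queue, QueuedPost(priority_score(0.4, 1_000_000), "post_b"))
heapq.heappush(queue, QueuedPost(priority_score(0.2, 500), "post_c"))

while queue:
    item = heapq.heappop(queue)
    print(f"review {item.post_id} (risk score {-item.priority:,.0f})")
```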

Reducing the Spread of Misinformation

We are taking extensive steps to fight the spread of misinformation on our services and continuing to consult with outside experts.

  • We’re removing content that violates our policies and working with third-party fact-checkers in the region to debunk false claims. When they rate something as false, we move this content lower in Feed so fewer people see it. 
  • In response to the crisis, we have expanded our third-party fact-checking capacity in Russian and Ukrainian languages across the region and are working to provide additional financial support to Ukrainian fact-checking partners.
  • To supplement labels from our fact-checking partners, we warn people in the region when they try to share certain war-related images that our systems detect are more than a year old, so they have more information about outdated or misleading images that could be taken out of context.
  • We’ve also made it easier for fact-checkers to find and rate content related to the war, because we recognize that speed is especially important during breaking news events. We use keyword detection to group related content in one place, making it easier for fact-checkers to find; a simplified sketch of this kind of grouping follows this list.
  • We’re also giving people more information to decide what to read, trust and share by adding warning labels on content rated false by third-party fact-checkers and applying labels to state-controlled media publishers.
  • Messenger, Instagram and WhatsApp limit message forwarding and label messages that didn’t originate with the sender, so people know the information comes from a third party.
  • We notify people who have previously shared or try to share fact-checked content so they can decide for themselves whether it’s something they want to continue sharing.
  • Facebook Pages, Groups, accounts and domains that repeatedly share false information will receive additional penalties. For example, we will remove them from recommendations and show all of the content they post lower in Feed, so fewer people see it.
  • We show a pop-up notification when you try to connect with a Facebook Page, Group or Instagram account that has repeatedly shared content rated false by fact-checkers. You can also click to learn more: the notification explains that fact-checkers said some posts shared by this Page, Group or account included false information, and it links to more information about our fact-checking program.
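
The post mentions keyword detection for grouping war-related content but gives no implementation detail, so the following is a simplified, hypothetical sketch: posts are bucketed by the set of tracked keywords they match, letting a fact-checker review similar claims together. The keyword list and matching logic are illustrative assumptions only; a production system would use curated, multilingual terms and far more robust matching.

```python
# Illustrative sketch only (not Meta's production system): group posts that
# share tracked war-related keywords so fact-checkers can review them together.
from collections import defaultdict

# Hypothetical keyword list; real systems would use curated multilingual terms.
KEYWORDS = {"kyiv", "kharkiv", "convoy", "airstrike", "evacuation"}

def matched_keywords(text: str) -> frozenset[str]:
    """Return the set of tracked keywords appearing in the post text."""
    tokens = {token.strip(".,!?\"'").lower() for token in text.split()}
    return frozenset(tokens & KEYWORDS)

def group_posts(posts: list[str]) -> dict[frozenset[str], list[str]]:
    """Bucket posts by the exact keyword set they match."""
    groups: dict[frozenset[str], list[str]] = defaultdict(list)
    for post in posts:
        keys = matched_keywords(post)
        if keys:  # ignore posts that match no tracked topic
            groups[keys].append(post)
    return groups

sample_posts = [
    "Video shows a convoy outside Kyiv",
    "Another clip of the same convoy near Kyiv?",
    "Evacuation routes announced for Kharkiv",
]
for topic, grouped in group_posts(sample_posts).items():
    print(sorted(topic), "->", len(grouped), "post(s)")
```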

Transparency Around State-Controlled Media Outlets

We provide greater transparency on accounts from state-controlled media outlets, including the Russian-based RT and Sputnik, because they combine the influence of a media organization with the strategic backing of a state. We believe people should know if the news they read comes from a publication that may be under the influence of a government.

  • We are prohibiting ads from Russian state media and demonetizing their accounts. 
  • We continue to apply labels to additional Russian state media.
  • We refused an order from the Russian authorities to stop the independent fact-checking and labeling of content posted on Facebook by four Russian state media organizations.
  • State-controlled media, like other publishers, are eligible for fact-checking, and our third-party fact-checking partners can and do rate their content.
  • State-controlled outlets must follow our Community Standards and Advertising Policies.
  • Ads and posts from state-controlled media outlets on Facebook and Instagram are labeled prominently. We also apply these labels to Instagram profiles, the “About this Account” section of Instagram accounts, the Page Transparency section of Facebook Pages and in our Ads Library.
  • We developed our definition and standards for state-controlled media organizations with input from more than 65 experts around the world specializing in media, governance, human rights and development.

We remain vigilant about emerging trends and stand ready to take additional action to meet the demands of this ongoing conflict.
