Introducing Lantern: Protecting Children Online


Protecting children online is one of the most important challenges facing the technology industry today. At Meta, we want young people to have safe, positive experiences online, and we’ve spent a decade developing tools and policies designed to protect them. As a result, we find and report more child sexual abuse material to the National Center for Missing & Exploited Children (NCMEC) than any other service today.

Many in our industry recognize the need to work together to protect children and stop predators. We use image hash-matching technology such as Microsoft’s PhotoDNA and Meta’s open-source PDQ to stop the spread of child sexual abuse material (CSAM) on the internet, but we need additional solutions to stop predators from using different apps and websites to target children.
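Both PhotoDNA and PDQ are perceptual hashing systems: each image is reduced to a compact hash (256 bits in PDQ’s case), and an upload is flagged when its hash sits within a small Hamming distance of the hash of known violating material. The sketch below illustrates only that matching step, assuming the hashes have already been produced by a real PDQ implementation; the placeholder hash values and the distance threshold are illustrative assumptions, not Meta’s production configuration.

```python
# Minimal sketch of hash-based matching in the style of PhotoDNA / PDQ.
# Assumes 256-bit perceptual hashes (64-char hex strings) were already computed
# by a real PDQ implementation; the banned-hash set and the threshold below are
# illustrative placeholders, not Meta's production settings.

BANNED_HASHES = {
    "f" * 64,  # placeholder hash of a known violating image
}

MATCH_THRESHOLD = 31  # max Hamming distance (out of 256 bits) to count as a match


def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two 256-bit hashes given as hex strings."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


def matches_known_material(candidate_hash: str) -> bool:
    """Return True if the candidate is within the threshold of any banned hash."""
    return any(
        hamming_distance(candidate_hash, banned) <= MATCH_THRESHOLD
        for banned in BANNED_HASHES
    )


if __name__ == "__main__":
    # A re-compressed or slightly altered copy produces a hash that differs by
    # only a few bits, so it is still flagged (here, 1 bit away from the placeholder).
    near_duplicate = "f" * 63 + "e"
    print(matches_known_material(near_duplicate))  # True
```

The key property is that small edits to an image change only a few bits of its perceptual hash, which is why matching uses a distance threshold rather than exact equality.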

Predators don’t limit their attempts to harm children to individual platforms. They use multiple apps and websites and adapt their tactics across them all to avoid detection. When a predator is discovered and removed from a site for breaking its rules, they may head to one of the many other apps or websites they use to target children. 

Experts in online child safety understand this behavior well, and we knew we could do more to address it alongside our peers, so we worked with our partners at the Tech Coalition to establish Lantern. As described in the Tech Coalition’s announcement today, Lantern enables technology companies to share a variety of signals about accounts and behaviors that violate their child safety policies. Lantern participants can use this information to conduct investigations on their own platforms and take action.
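What each participant does with those signals is internal to its own systems, and the Tech Coalition has not published a public technical interface for Lantern. Purely as a hypothetical sketch, a participant’s first triage step might look something like the following; Signal, InMemoryIndex, and triage are illustrative names, not anything from the actual program.

```python
# Hypothetical sketch of consuming cross-platform signals like those shared through
# Lantern. The Signal format, InMemoryIndex, and triage() are assumptions made for
# illustration; no public Lantern API is implied.

from dataclasses import dataclass


@dataclass(frozen=True)
class Signal:
    kind: str    # e.g. "url" or "hash"
    value: str   # the shared indicator itself
    source: str  # which participating company contributed it


class InMemoryIndex:
    """Toy stand-in for a platform's own store of URLs, hashes, and prior reports."""

    def __init__(self, entries: set[tuple[str, str]]):
        self._entries = entries

    def contains(self, kind: str, value: str) -> bool:
        return (kind, value) in self._entries


def triage(signals: list[Signal], index: InMemoryIndex) -> list[Signal]:
    """Return the shared signals that also appear in this platform's own data,
    so a human child-safety team can prioritize them for investigation."""
    return [s for s in signals if index.contains(s.kind, s.value)]


if __name__ == "__main__":
    shared = [Signal("url", "https://example.com/removed-content", "partner-a")]
    index = InMemoryIndex({("url", "https://example.com/removed-content")})
    print(triage(shared, index))  # the matching signal is surfaced for human review
```

In practice, matches like these are only a starting point: in the example described below, URL signals seeded a wider human investigation rather than triggering any automated enforcement.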

Meta was a founding member of Lantern. We provided the Tech Coalition with the technical infrastructure that sits behind the program and encouraged our industry partners to use it. We manage and oversee the technology with the Tech Coalition, ensuring it is simple to use and provides our partners with the information they need to track down potential predators on their own platforms.

One example of Lantern’s value is an investigation Meta conducted following information provided by Lantern partner MEGA during the program’s pilot phase. MEGA shared URLs with Lantern that they had previously removed for violating their child safety policies. Meta’s specialist child safety team used this information to conduct a wider investigation into potentially violating behaviors related to these URLs on our platforms. The team removed over 10,000 violating Facebook Profiles, Pages and Instagram accounts in the course of the investigation. In accordance with our legal obligations, we reported the violating profiles, pages and accounts to NCMEC. In addition, we shared details of our investigation back to Lantern, enabling participating companies to use the signals to conduct their own investigations.

We’re glad to partner with the Tech Coalition and our peers on the Lantern program, and we hope others in the industry will join us to expand this important work.
