Instagram added several new features on Thursday aimed at protecting teenagers from “sextortion” scams, amid mounting concerns about the role social media platforms play in child safety.
Key Facts
- Screenshots and screen recordings of disappearing messages in Instagram direct messages will be blocked, meaning that when a user tries to screenshot an image meant to be viewed only once, a black screen will appear instead.
- Instagram is testing safety notices that will notify teenagers when they’re exchanging messages with someone who may be out of the country.
- Accounts detected as showing signs of “scammy” behavior, such as newly created accounts, will not be able to see a user’s followers and following lists, which Meta says scammers commonly use to find friends and family members to blackmail victims.
- Images that have been flagged to contain nudity will be blurred by default for users under the age of 18.
- The new features follow Instagram’s introduction of designated Teen accounts last month with built-in restrictions for users under 18, including making their accounts private by default and allowing parents to see who their children are exchanging messages with.
What Is “Sextortion”?
Sextortion is a type of scam in which predators trick or coerce a person, typically a minor, into sending explicit images or videos of themselves, then threaten to release the material unless the victim sends more images or money. Financially motivated sextortion schemes are on the rise, according to the National Center for Missing and Exploited Children, with teenage boys the most common targets. As many as 79% of predators now seek money rather than more explicit imagery.
The FBI and the federal law enforcement agency Homeland Security Investigations received more than 13,000 reports of financial sextortion of minors from October 2021 to March 2023, involving 12,600 victims, primarily boys. The scams have led to at least 20 suicides. Between October 2022 and March 2023, the FBI saw a 20% increase in financial sextortion scams involving minors compared with the same period a year earlier. Predators are usually located outside of the United States, according to the FBI.
Key Background
The introduction of the features is part of a broader campaign from Meta to address mounting concerns that social media platforms, including Instagram, aren’t doing enough to protect teenagers online. The tech giant has drawn scrutiny for its handling of child sexual abuse material (CSAM) in recent years: a report from the Stanford Internet Observatory published in June 2023 found large networks of accounts advertising and selling CSAM on its platforms. The report noted that while Instagram was the most popular platform for this activity by far, the problem was a “widespread issue” across several online platforms.
At a Senate hearing on child safety in January, Meta CEO Mark Zuckerberg apologized to parents who said Instagram contributed to their children’s suicides and exploitation, after questioning from Sen. Josh Hawley, R-Mo., about child sexual abuse on Meta platforms. “Child exploitation is a horrific crime,” Meta spokesperson Sophie Vogel said in an emailed statement. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.” She also pointed to Stanford’s public comments that Meta “has fixed problems with user reporting and triage, as well as taking steps to limit discoverability.”
Chief Critic
“Meta has known sextortion is happening at scale for so long,” Annie Seifullah, a lawyer who said she has worked on sextortion cases involving Meta apps, told The Washington Post. “This feels like too little, too late,” Seifullah said, adding that automation has made it easier for predators to contact more victims at once. “We continue to develop new technology and tools designed to specifically combat these criminals’ evolving tactics,” Vogel, the Meta spokesperson, said.
Tangent
Snapchat was hit with a lawsuit by New Mexico’s attorney general last month accusing it of failing to act on “rampant” reports of sextortion and abuse involving minors and of prioritizing growth over safety. An internal analysis in November 2022 found that Snapchat was receiving 10,000 reports of sextortion each month, according to the suit. Forbes has reached out to Snap for comment.