LOS ANGELES—Instagram, the popular social media platform owned by Meta Platforms Inc., today announced a new set of trust and safety protocols meant to protect users of all ages from sextortion and intimate imagery abuse. One such protocol is that Instagram will soon begin to auto-blur images sent in direct messages that its AI flags as potentially containing nudity.
The new protocols are expected to be rolled out for testing in the days to come.
"When someone receives an image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn’t confronted with a nude image, and they can choose whether or not to view it," notes an Instagram blog post, calling the new feature "Nudity Protection in DMs."
"We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat," says Instagram.
The feature also includes safeguards, such as prompts asking users who are about to send potentially intimate images to reconsider before sending. Any user who receives an image that might contain nudity can also quickly report the sender, particularly if the image is unsolicited.
Nudity protection relies on on-device machine learning and can be turned on by adult users. The company noted that it will be enabled by default for users under 18.
Instagram says, "Nudity protection uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity. Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images—unless someone chooses to report them to us."
Other features include tools that make it harder for criminals and scammers to contact teenagers on Instagram, as well as adult users more broadly.
These features are noteworthy for adult content creators and production houses because Instagram is a popular safe-for-work marketing channel for them. For example, adult content creators who use Instagram can publish a "link in bio" that takes users to an intermediary website, like Linktree, listing links to their content elsewhere on the web.
In a similar move, X (formerly Twitter) has begun testing NSFW labels on adult-oriented communities and blurring images that could potentially contain nudity.
Despite an increasingly hostile environment toward sex workers and adult content creators, X is one of the only mainstream social media platforms that permits fully explicit sexual content. Users have increasingly seen images blurred or labeled as potentially containing nudity, including material featuring lingerie or even fully clothed subjects.
AVN previously reported that X senior software engineer Dong Wook Chung posted to the platform that the labels are intended to make X safer for everyone.
"To be clear, this setting is about making Communities safer for everyone by automatically filtering out NSFW content," Chung wrote.
"Only users who have specified their age will be able to search Communities with NSFW content."
X previously announced that it would begin verifying users' ages, though the feature is not mandatory at present.