Instagram Teen Survey Reveals Unwanted Nude Image Exposure
- tech360.tv

A Meta survey found 19% of young teenagers on Instagram reported seeing unwanted nude images, according to a court filing. Nearly one in five users aged 13 to 15 told Meta they viewed "nudity or sexual images on Instagram" they did not wish to see.

A court filing, made public on Friday as part of a federal lawsuit in California, included portions of a March 2025 deposition from Instagram head Adam Mosseri. Another document, dated January 20, 2021, and also made public through the lawsuit, showed a Meta researcher recommending the company focus on teenage users.
The researcher described teens as "catalysts" for their households, influencing how younger siblings and parents use the app. The memo stated, "If we're looking to acquire (and retain) new users we need to recognize a teen's influence within the household to help do so."
Meta, the owner of Facebook and Instagram, faces allegations from global leaders that its products harm young users. Thousands of lawsuits in U.S. federal and state courts accuse the company of designing addictive products and fuelling a mental-health crisis among minors.
Meta spokesperson Andy Stone said the statistic on explicit images came from a 2021 survey of Instagram users about their experiences on the platform, not from a review of the posts themselves.
In the same 2021 survey, approximately 8% of users in the 13 to 15 age group reported seeing someone harm themselves or threaten to do so on Instagram, Mr. Mosseri's deposition revealed.
In late 2025, the company said it would remove images and videos "containing nudity or explicit sexual activity, including when generated by AI," for teenage users, with exceptions considered for medical and educational content.
Mr. Stone remarked, "We’re proud of the progress we’ve made, and we’re always working to do better."
Mr. Mosseri said in his deposition that most sexually explicit images were sent via private messages between users. He added that Meta must consider users’ privacy when reviewing such content.
He explained, "A lot of people don't want us reading their messages."
- A Meta survey found 19% of Instagram users aged 13 to 15 reported seeing unwanted nude or sexual images.
- The findings were revealed in a court filing as part of a federal lawsuit alleging Meta's products harm young users.
- Approximately 8% of young teenage users also reported seeing self-harm content on Instagram in the 2021 survey.
Source: REUTERS


