Instagram Teen Survey Reveals Unwanted Nude Image Exposure

  • Writer: tech360.tv
  • 3 hours ago
  • 2 min read

A Meta survey found 19% of young teenagers on Instagram reported seeing unwanted nude images, according to a court filing. Nearly one in five users aged 13 to 15 told Meta they viewed "nudity or sexual images on Instagram" they did not wish to see.


[Image: Close-up of an iPad screen displaying the Instagram App Store page, with the app's logo, "Get" button, 4.7-star rating, and 12+ age rating. Credit: UNSPLASH]

A court filing, made public on Friday as part of a federal lawsuit in California, included portions of a March 2025 deposition from Instagram head Adam Mosseri. Another document, dated January 20, 2021, and also made public through the lawsuit, showed a Meta researcher recommending the company focus on teenage users.


The researcher described teens as "catalysts" for their households, influencing how younger siblings and parents use the app. The memo stated, "If we're looking to acquire (and retain) new users we need to recognize a teen's influence within the household to help do so."


Meta, the owner of Facebook and Instagram, faces widespread allegations that its products harm young users. Thousands of lawsuits in U.S. federal and state courts accuse the company of designing addictive products and fuelling a mental-health crisis among minors.


Meta spokesperson Andy Stone stated the statistic on explicit images originated from a 2021 survey of Instagram users about their platform experiences. The data did not come from a review of posts themselves.


In the same 2021 survey, approximately 8% of users in the 13 to 15 age group reported seeing someone harm themselves or threaten to do so on Instagram, Mr. Mosseri's deposition revealed.


In late 2025, the company said it would remove images and videos "containing nudity or explicit sexual activity, including when generated by AI," for teenage users, with exceptions considered for medical and educational content.


Mr. Stone remarked, "We’re proud of the progress we’ve made, and we’re always working to do better."


Mr. Mosseri said in his deposition that most sexually explicit images were sent via private messages between users. He added that Meta must consider users’ privacy when reviewing such content.


He explained, "A lot of people don't want us reading their messages."

  • A Meta survey found 19% of Instagram users aged 13 to 15 reported seeing unwanted nude or sexual images.

  • The findings were revealed in a court filing as part of a federal lawsuit alleging Meta's products harm young users.

  • Approximately 8% of young teenage users also reported seeing self-harm content on Instagram in a 2021 survey.


Source: REUTERS

As technology advances and has a greater impact on our lives than ever before, staying informed is the only way to keep up, and through our product reviews and news articles we aim to help our readers do exactly that. All of our reviews are carefully written, offer unique insights and critiques, and provide trustworthy recommendations. Our news stories are drawn from reputable sources, fact-checked by our team, and presented with the help of AI to make them easier for our readers to comprehend. If you notice any errors in our product reviews or news stories, please email us at editorial@tech360.tv. Your input is important in ensuring that our articles are accurate for all of our readers.

Tech360tv is Singapore's tech news and gadget reviews platform. Join us for our in-depth PC reviews, smartphone reviews, audio reviews, camera reviews and other gadget reviews.

  • YouTube
  • Facebook
  • TikTok
  • Instagram
  • Twitter
  • LinkedIn

© 2021 tech360.tv. All rights reserved.