Snapchat's AI Chatbot Raises Privacy Concerns for Children, Warns UK Watchdog

Snapchat's artificial intelligence (AI) chatbot, known as "My AI," may pose privacy risks to children, according to the UK's data watchdog.

[Image credit: REUTERS]

The Information Commissioner's Office (ICO) has stated that Snapchat failed to properly assess the privacy risks associated with its AI chatbot.

The ICO will consider Snapchat's response before making any final enforcement decisions. If the concerns are not adequately addressed, the chatbot could potentially be banned in the UK.

The ICO's investigation suggests that Snapchat did not sufficiently identify and assess the privacy risks to children and other users before launching "My AI." However, this does not necessarily mean that Snapchat has violated data protection laws or that the ICO will issue an enforcement notice.

Snapchat has responded that "My AI" underwent a thorough legal and privacy review before its public release. The company says it is committed to protecting user privacy and will work with the ICO to ensure it is satisfied with the risk assessment process.

The ICO is currently examining how "My AI" processes the personal data of Snapchat's approximately 21 million UK users, including children aged 13 to 17. The chatbot is powered by OpenAI's ChatGPT, a well-known example of generative AI. Policymakers worldwide are grappling with how to regulate such AI systems amid concerns about privacy and safety.

Social media platforms, including Snapchat, impose age restrictions requiring users to be 13 or older, but they have struggled to effectively prevent underage users from accessing their services.

  • The UK's data watchdog, the ICO, warns that Snapchat's AI chatbot may pose privacy risks to children.

  • Snapchat allegedly failed to adequately assess privacy risks before launching "My AI."

  • The ICO will consider Snapchat's response before deciding on any enforcement actions.

  • If concerns are not addressed, the chatbot could potentially be banned in the UK.
