Instagram has introduced new age verification tools for users in the UK and EU.
The photo-sharing app now requires people trying to change their age from under 18 to over 18 to record a video of themselves, rather than relying on photographic ID. The video is then analyzed by age-estimation technology.
The app – owned and operated by Meta, which also controls Facebook and WhatsApp – said the move would ensure users see age-appropriate content in their feeds.
The move comes after the company shared plans to expand the ways teens – who must be at least 13 to use the service – can verify their age. In the United States, it ran a trial that gave people three ways to prove their age: uploading a photo ID, having an adult confirm their age, or recording a video selfie.
Instagram already claims to use artificial intelligence and in-app reporting to verify users’ ages.
However, Ofcom data revealed that one in three children access adult content by lying about their age.
Anna-Sophie Harling, a representative of Ofcom, Britain’s communications industry regulator, said such content had a ‘huge impact’ on children, citing the Molly Russell inquest, in which a coroner found Instagram’s algorithm had pushed content related to “anxiety, self-harm, or suicide” before the 14-year-old schoolgirl died by suicide.
She told BBC News at the time: “This was a very specific case of harmful content which had very detrimental effects and tragic results on a family in the UK.
“When we talk about potentially harmful content for people under 18, that is content that could have greater negative consequences for people under 18 because they are still developing.
“When children are repeatedly exposed to certain images and videos, they are essentially tricked into acting differently or thinking differently about themselves or their friends.”