Social media app Instagram has announced that it is testing new age verification methods for users, including the use of a video selfie.

What Are The Issues? 

This latest announcement by Meta's (formerly Facebook's) Instagram is an extension of an age verification program that the company began in 2019. Finding new technology-based ways to confirm a user's age will help address several key issues, including:

  • Instagram has a duty to safeguard younger users. For example, it already makes accounts private by default for teens under 16, offers privacy settings to control who can interact with a teen and their content, lets users block and anonymously report others, and allows time spent on Instagram to be tracked. These measures help prevent unwanted contact from adults that young users don't know and limit the options advertisers have to reach young users with ads. 
  • As per Instagram's (and Meta's) terms, users in the U.S. must be at least 13 years old to sign up, in compliance with the U.S. Children's Online Privacy Protection Act. The minimum age is higher in some countries. 
  • It's not always clear how old someone really is simply from asking them questions.

What’s Happening

Instagram says that it has partnered with Yoti, a leading age verification provider for several industries around the world, including social media, gaming, and age-restricted e-commerce. Together they have developed two new ways, in addition to uploading an ID, to verify a person's age. The two new methods that the AI-based Yoti system will use for age verification are: 

  • Uploading a video selfie. Instagram (Meta) says that it will only share the image with Yoti, whose technology estimates a user's age based on their facial features. Meta makes it clear that the selfie is deleted afterwards, and that the technology can't recognise a person's identity, only estimate their age. 
  • Using social vouching. This involves asking three of the user's mutual followers, each aged 18 or over, to confirm how old the user is. The vouchers must respond within three days (see the sketch after this list). 
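
Instagram hasn't published the mechanics behind social vouching, but the rules described above (three vouchers, all 18 or over, a three-day response window) translate naturally into code. The following Python sketch is purely illustrative: the class and field names, and the rule that all vouchers must agree on the same age, are assumptions for the sake of the example, not details Instagram has confirmed.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Rules taken from the article: three vouchers, each 18 or over,
# responding within three days. Everything else here is assumed.
REQUIRED_VOUCHERS = 3
MINIMUM_VOUCHER_AGE = 18
RESPONSE_WINDOW = timedelta(days=3)

@dataclass
class VouchRequest:
    user_id: str
    voucher_ids: list[str]      # the three mutual followers who were asked
    sent_at: datetime
    # Maps each responding voucher to the age they state for the user.
    confirmations: dict[str, int] = field(default_factory=dict)

    def record_confirmation(self, voucher_id: str, voucher_age: int,
                            stated_user_age: int, now: datetime) -> None:
        # Accept a response only from an invited voucher who is old
        # enough and who replies within the three-day window.
        if voucher_id not in self.voucher_ids:
            return
        if voucher_age < MINIMUM_VOUCHER_AGE:
            return
        if now - self.sent_at > RESPONSE_WINDOW:
            return
        self.confirmations[voucher_id] = stated_user_age

    def verified_age(self) -> int | None:
        # Succeed only if all three vouchers responded in time and
        # agree on the same age (the agreement rule is an assumption).
        if len(self.confirmations) < REQUIRED_VOUCHERS:
            return None
        ages = set(self.confirmations.values())
        return ages.pop() if len(ages) == 1 else None
```

Note that any real system would also need to verify the vouchers' own ages, which is the same problem one level up; that is presumably one reason the selfie-based method exists alongside vouching.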

Meta / Instagram’s Own AI Too

Meta says that, in addition to testing the new methods of age verification, it will use its own AI system to help decide whether a user is a teen or an adult.

What Happened To ‘Instagram For Kids’? 

Back in March 2021, Meta revealed that it was developing an 'Instagram for Kids' service, but the project was shelved after US lawmakers wrote to Mark Zuckerberg demanding that the plan be abandoned over worries about children's safety on social media platforms.

What Does This Mean For Your Business?

Meta, Instagram's owner, has attracted a good deal of criticism in recent years over the safety of young people using its platforms (e.g. body-shaming and cyberbullying), and it was pressured into abandoning 'Instagram for Kids' over similar concerns.

The company has also been criticised over its past privacy record, e.g. the Cambridge Analytica scandal. These new age verification methods, and the accompanying reassurance that the selfies won't be shared beyond Yoti (and will be deleted afterwards), are a way for Meta to show that it's trying to comply with child protection laws, to make its platforms safer for young people, and to reassure parents that its Instagram platform is safe.

Last year, Meta also introduced new rules to stop advertisers from targeting teens based on their interests, although advertisers can still target teens based on age, gender, and location. 

By Mike Knight
