Instagram to ask users to verify themselves with video selfies


Instagram is asking some of its users to provide a video selfie showing several angles of their face to verify they are a real person, according to screenshots posted on Twitter by social media consultant Matt Navarra. The social network has long struggled with bot accounts, which can leave spam messages, harass people, or be used to artificially inflate like and follower counts, and it is possible that Meta (formerly Facebook), Instagram's parent company, hopes the feature will help curb the prevalence of bots on the platform.

According to XDA Developers, the company began testing the feature last year but ran into technical issues. Several users have since reported being asked to take a video selfie to verify their existing accounts.


Bettina Makalintal, another writer on Twitter, posted a screenshot of the help screen for the step where you actually record the video selfie. It reiterates that Instagram looks at "every angle of your face" to confirm you are a real person, and its wording suggests the verification screen is being shown to multiple people.



It is unclear whether this feature is currently a limited trial or is rolling out slowly. I made several attempts to set up suspicious-looking Instagram accounts and was never presented with the video selfie challenge. Meta did not immediately respond to a request for comment on the feature or its rollout.

The move may surprise some users, given Meta's recent announcement that it would be shutting down one of its facial recognition features. However, as the company has since reiterated, it was only shutting down a specific Facebook feature, not ending Meta's use of facial recognition as a whole. The message at the bottom of the screenshot also states that the feature does not use facial recognition at all and that the video will be deleted after 30 days.
