Meta has updated its facial recognition technology to combat celebrity investment scam advertisements, starting with the public figures whose pictures are most frequently misused.
For years, many Facebook and Instagram users have fallen victim to scams in which fraudsters impersonate popular figures to gain their trust. A user would see a celebrity’s picture, click on it, and be directed to another website.
But Meta has now found a way to flag suspicious accounts that parade celebrity faces to win the trust of Facebook users with the intent of defrauding them.
The parent company of Facebook and Instagram on Monday stated it would begin facial recognition trials with about 50,000 celebrities and public figures globally in December.
“Early testing with a small group of celebrities and public figures shows promising results in increasing the speed and efficacy with which we can detect and enforce against this type of scam,” Meta said in a blogpost.
Understanding Meta’s Approach to Ad Integrity
If Meta suspects an ad is a hoax, it will compare the faces in the ad’s images against the public figure’s Facebook and Instagram profile pictures and remove the ad when they match.
“This process is done in real time and is faster and much more accurate than manual human reviews, so it allows us to apply our enforcement policies more quickly and to protect people on our apps from scams and celebrities,” David Agranovich, director of global threat disruption at Meta, said on Monday.
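At a high level, the check Agranovich describes amounts to comparing a face found in a suspected scam ad against a public figure’s known profile pictures. The sketch below is a minimal illustration of that kind of comparison, not Meta’s actual system: the embedding function, similarity threshold, and function names are all assumptions made for the example.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.85  # assumed cut-off for declaring a face match

def embed_face(image: np.ndarray) -> np.ndarray:
    """Placeholder for a real face-embedding model (e.g. a network mapping a
    cropped face to a fixed-length vector). Flattening and normalising the
    pixels just keeps this sketch runnable end to end."""
    vec = image.astype(np.float32).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def is_celeb_bait(ad_image: np.ndarray, profile_images: list[np.ndarray]) -> bool:
    """Return True if the face in a suspected scam ad matches any of the
    public figure's profile pictures above the assumed threshold."""
    ad_vec = embed_face(ad_image)
    return any(
        cosine_similarity(ad_vec, embed_face(profile)) >= SIMILARITY_THRESHOLD
        for profile in profile_images
    )

if __name__ == "__main__":
    profile_pictures = [np.random.rand(64, 64, 3)]  # stand-in profile photo
    suspected_ad = profile_pictures[0] + np.random.normal(0, 0.01, (64, 64, 3))
    print(is_celeb_bait(suspected_ad, profile_pictures))  # True for this near-duplicate
```

In a production system the placeholder embedding would be a trained face-recognition model and the threshold would be tuned to balance false positives against missed scams; the point here is only the shape of the comparison.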
For celebrities to participate in the system, they need to have a profile on either Facebook or Instagram.
Meta will also use the same facial recognition technology to let users upload video selfies to regain access to hijacked accounts.
On November 2, 2021, Meta announced it was discontinuing facial recognition for photo tagging, citing privacy concerns.
Meta’s Approach to Combating Online Scams and Fraud
Agranovich stressed that, for both the scam-ad checks and the account-recovery checks, the facial data generated is deleted immediately after the comparison is completed, regardless of whether there is a match, and is not used for any other purpose.
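That deletion guarantee can be pictured as a compare-then-discard lifecycle in which embeddings exist only for the duration of a single check. Below is a small, hypothetical sketch of that pattern, reusing placeholder `embed` and `similarity` callables like those in the earlier sketch; it illustrates the data-handling idea only and is not Meta’s implementation.

```python
from typing import Callable, List
import numpy as np

def ephemeral_face_check(query_image: np.ndarray,
                         reference_images: List[np.ndarray],
                         embed: Callable[[np.ndarray], np.ndarray],
                         similarity: Callable[[np.ndarray, np.ndarray], float],
                         threshold: float = 0.85) -> bool:
    """Run a one-off face comparison and discard all derived facial data
    afterwards, whether or not a match was found."""
    query_vec = embed(query_image)
    reference_vecs = [embed(img) for img in reference_images]
    try:
        return any(similarity(query_vec, ref) >= threshold for ref in reference_vecs)
    finally:
        # Drop every embedding as soon as the comparison finishes, regardless
        # of the outcome; nothing derived from the faces is kept or reused.
        del query_vec, reference_vecs

# Example call, using the placeholder helpers from the earlier sketch:
# ephemeral_face_check(selfie_frame, profile_pictures, embed_face, cosine_similarity)
```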
The company said early testing with a small group showed “promising results” in the speed and efficiency of detecting scam ads. Celebrities in the initial rollout will see an in-app notification informing them that they have been enrolled, and they can opt out at any time, Meta said.
Recently, lawmakers and regulators have pressured Meta to crack down on investment scams that use deepfake images of public figures such as Martin Lewis, David Koch, Gina Rinehart, Anthony Albanese, Larry Emdur, Guy Sebastian and others.
Mining magnate Andrew Forrest is suing the company, alleging it did not do enough to stop scams that used his image. The Australian Competition and Consumer Commission is also suing the company.
Agranovich said the facial-recognition tool was one of several the company uses to detect scams, but acknowledged that some were likely to slip through the cracks.
“It’s a numbers game, and so while we have automated detection systems that are running against ad creative that’s being created and that do remove a very large volume of violating ads before they can be posted or shortly after they’re posted, scam networks are highly motivated to just keep throwing things at the wall in hopes that things get through, and invariably some of them do,” he said.
“Even if it is successful, scammers will probably migrate to other tactics. And so we know we’ll have to keep iterating and building new tooling to get ahead of whatever it is they do next.”