Bumble has heard your prayers and launched a new artificial intelligence-powered tool designed specifically to shield users from scams, spam and fake profiles. No more talking to hot guys named Jason only to realise they’re a bot! But also, apologies in advance to the real hotties who are often mistaken for bots.
The new “Deception Detector” was announced on Tuesday, February 6 — which also happens to be Safer Internet Day — in response to concerns from users of the dating app about the authenticity of the profiles they match with.
Safety on the internet has become a growing concern in recent months. Aside from the influx of cooked scams affecting so many of our favourite companies, there’s also the issue of women’s safety in particular amid horrific deepfakes and dodgy dating profiles.
According to Bumble Inc, the dating app’s parent company, 33% of Aussies are worried about fake profiles when they date online.
I don’t blame them — I’m still reeling from this dating scam involving restaurants and dodgy business owners. And that one doesn’t even really involve stealing your money.
Government agency Scamwatch reported more than 3,000 romance-related scams in 2023, which resulted in $33.5 million lost to fake profiles. According to its stats, 69% of the money lost to these scams belonged to women.
This is where the Deception Detector comes in: the AI-powered tool helps identify and block dodgy profiles before you even see them, essentially acting like a shield.
Bumble Inc reported that in the first two months of the Deception Detector being in use, reports of scams and fake profiles by users decreased by a huge 45%.
The tool runs in conjunction with human help too, to make sure that users are as protected as possible — while still being careful with AI.
“People on Bumble won’t see Deception Detector™ or interact with it in the way that they do with other safety tools like Block and Report, Photo Verification, Private Detector or Unmatch, but they will be protected by it regardless of its visibility to the user,” Lucille McCart, Bumble APAC Communications Director, told PEDESTRIAN.TV.
“The whole idea is to use the technology that powers Deception Detector™ to take action against fake profiles before members even see them or before they have the opportunity to do harm.”
This isn’t the first time Bumble has updated its security measures to keep women on its app safe — in 2021, the app announced it would be booting repeat body-shamers off its platform after a spate of nasty comments from little freaks on the app.
Honestly, the Deception Detector is giving protective daddy… maybe we should make it a little AI profile with a hot icon that I can chat to? Where it has a deep, raspy voice that tells me it will keep me safe? Just wanted to float that idea around.