Bumble Will Deploy AI To Detect Unwanted Nudes

Harin - Apr 25, 2019


Dating app Bumble is launching an AI-assisted "private detector" to warn users about lewd images and weed them out of the platform.

Very soon, AI will flag any NSFW photos you receive from a match on Bumble. The dating app, on which women must be the ones to start the conversation, said it is introducing what it calls a “private detector” to detect lewd images. The announcement was made in a press release on April 24.

Starting in June, the AI-assisted “private detector” will screen all images sent on Bumble. If it flags a photo as inappropriate or lewd, the recipient can choose to view it, block it, or report it to moderators before opening it.
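Bumble has not published technical details of the “private detector,” but the flow described above (score each incoming image and, if it is flagged, hold it back until the recipient chooses an action) can be sketched roughly as follows. The classifier, threshold, and action names here are illustrative assumptions, not Bumble’s actual implementation.

```python
from enum import Enum
from typing import Callable


class Action(Enum):
    VIEW = "view"
    BLOCK = "block"
    REPORT = "report"


# Assumed cutoff for flagging an image; Bumble has not published its threshold.
FLAG_THRESHOLD = 0.5


def screen_photo(image_bytes: bytes,
                 classifier: Callable[[bytes], float]) -> dict:
    """Run an incoming photo through a lewdness classifier before display.

    `classifier` stands in for a trained image model and returns the
    probability that the image is inappropriate.
    """
    score = classifier(image_bytes)
    if score >= FLAG_THRESHOLD:
        # Flagged: keep the image hidden and let the recipient decide
        # whether to view it, block it, or report it to moderators.
        return {
            "flagged": True,
            "allowed_actions": [a.value for a in Action],
        }
    # Below the threshold: deliver the photo normally.
    return {"flagged": False, "allowed_actions": [Action.VIEW.value]}


# Example with a dummy classifier that treats everything as lewd:
if __name__ == "__main__":
    print(screen_photo(b"...", classifier=lambda img: 0.97))
```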


Many women encounter unwanted sexual advances online, and the problem is even more pronounced on dating apps. A 2016 Consumers’ Research study found that 57% of women reported feeling harassed on dating apps, compared with only 21% of men.

Unlike Tinder or Hinge, Bumble lets users send photos to their matches. These images arrive blurred, and recipients must tap them to view them. That already offers some protection from unsolicited sexual photos, but the company apparently does not consider it enough.

Bumble’s Andrey Andreev also commented on the announcement in the press release.

According to Bumble, the “private detector” is accurate up to 98% of the time. A company spokesperson said the AI tool would also be able to identify pictures of guns and shirtless mirror selfies.

The spokesperson added that an AI tool for detecting harmful language and inappropriate messages is under development. Bumble will not be the first company to use AI to filter profiles and messages, though; Tinder has also been using an AI tool to detect “red-flag language and images.”
