This face recognition search engine is creepily accurate

PimEyes can be used by literally anyone, and works a little too well for our liking

By Eve Walker, 31 May 2022
4 mins read time
Software that scans your face and compiles every public image of you ever taken in a matter of seconds may sound like something straight out of a crime drama or a sci-fi thriller, but the new search engine PimEyes is very much a real thing.

And the scariest part? Absolutely anyone can search absolutely any face, without the consent of the person in question.

The good news is that PimEyes doesn’t pull photos from social media, unlike Clearview AI, a similar facial recognition tool (available only to law enforcement) that does. No need to worry about drunken photos on Instagram being unearthed by your boss whose follow request you never accepted. Those embarrassing Photo Booth thermal effect pics from high school won’t suddenly resurface either. Yet.

PimEyes does, however, scout obscure images across the internet, unearthing photos you may not have known even existed. Whether you’re in a wedding photographer's online portfolio crying your eyes out behind the bride and groom, fangirling in the front row of a concert, or stuffing food into your gob in the background of a restaurant shot on a Yelp review, PimEyes will find you.

How can PimEyes’ creepy tech be used?

Like most technology out there, facial recognition is a complicated ethical issue. A tech executive who asked to remain anonymous told The New York Times that he often used PimEyes to identify people who harass him on Twitter (as they use their real photos but fake names). Another anonymous PimEyes user admitted to using the software to find the real identities of women in porn, and extended the search to find explicit photos of his Facebook friends (yuck).

There are currently no controls on PimEyes to stop users from searching for a face other than their own – in fact, users are asked to pay a large fee to keep unwanted photos from following them forever. The PROtect service costs between $89.99 and $299.99, and provides help from professionals to get photos taken down from other sites. While this is a lot of money, it is no doubt a useful feature. There is also a free option to exclude photos from the internal search, although this is not clearly advertised on the site. In addition, users can “opt out” for free by providing their ID, completely removing their face from future searches.

How does it work?

First, you upload a face, and then check a box agreeing to the terms and conditions. After mere seconds, a collection of photos that match the original face appears alongside links to where they were found on the internet.

A dozen journalists at The New York Times consented to having PimEyes run on their faces to test its accuracy, and the results are pretty creepy. The search engine uses one original photo to find matching images, much like the ‘People’ feature in iPhone photo galleries. In the test, photos of every single person who participated were found, including some that they had never seen before themselves – even snaps in which their faces are turned away from the camera or partially covered with masks or sunglasses didn’t stop PimEyes from working its freaky magic.

The majority of the matches for the journalists’ faces were correct. One reporter found photos of themselves dancing at an art museum event a decade ago, and crying after being proposed to. Another found themselves in a sea of people at Coachella in 2011, and another living it up as the life of the party in a string of wedding photos taken over the years. One journalist’s alter ego in a rock band was discovered, and another found themselves lurking in the background of a photo taken in a Greek airport in 2019.

Takeaway

Despite the potential dangers of facial recognition software like PimEyes, its owner, Giorgi Gobronidze, believes it is ultimately a positive tool, helping people keep track of their online reputations and control the data that is already out there.

Previous generations didn’t have to deal with being hyper-aware of their image and protecting their privacy in the way that we all are today. Twenty years ago, when photos were taken, they would likely remain in a box of blurry snaps in your friend’s cupboard. Maybe they wouldn’t even get developed at all. Now, literally anyone can take a photo of us and post it for the world to see without consent.

If you’re worried about your privacy, you can brush up on your rights as laid out in the Human Rights Act and seek advice.