Megaface Challenge Evaluates How Well Facial Recognition Algorithms Really Work


In today’s digital world, people are becoming increasingly reliant on their smartphones as mobile technology continues to develop at a frantic pace.

More recently, companies have begun to develop services that use facial recognition technology, whether to automatically tag friends in photos or to let people unlock their phones just by looking at them.

And recent reports suggest that these systems have achieved “near-perfect accuracy rates”, even performing better than humans at picking out a face from a crowd.

However, these tests were performed on a dataset with only 13,000 images.

So, to test the tech on a larger scale, researchers from the University of Washington decided to increase the dataset to the size of a major US city.

The researchers have started the Megaface Challenge, a world-first test to see how well facial recognition algorithms really work.

Ira Kemelmacher-Shlizerman, an assistant professor of computer science at the University of Washington and the project’s lead investigator, said: “We need to test facial recognition on a planetary scale to enable practical applications – testing on a larger scale lets you discover the flaws and successes of recognition algorithms.

“We can’t just test it on a very small scale and say it works perfectly.”


How Megaface works

The dataset was made up of publicly available Flickr photos representing 690,572 unique individuals.

The Megaface Challenge tested how well the algorithms performed when they had to rifle through these images to find a one-in-a-million match.

The challenge looked at how well the systems can detect whether two images are of the same person – the verification step some mobile devices use to unlock a phone when the face in front of the camera matches a stored photo.
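To make that verification step concrete, here is a minimal sketch, assuming a face-embedding model has already turned each photo into a fixed-length vector. The same_person helper, the 128-dimensional embeddings and the 0.6 threshold are illustrative stand-ins, not part of the Megaface code.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def same_person(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.6) -> bool:
    """Verification: decide whether two embeddings belong to the same person.

    The 0.6 threshold is illustrative; a real system tunes it on a labelled
    validation set to balance false accepts against false rejects.
    """
    return cosine_similarity(emb_a, emb_b) >= threshold


# Stand-in embeddings. A real pipeline would produce these by running a
# face-embedding network (e.g. a FaceNet-style model) on each photo.
enrolled_face = np.random.randn(128)
probe_face = enrolled_face + 0.05 * np.random.randn(128)  # same face, slight noise
print(same_person(enrolled_face, probe_face))  # True for this toy example
```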

Kemelmacher-Shlizerman said: “What happens if you lose your phone in a train station in Amsterdam and someone tries to steal it?

“I’d want certainty that my phone can correctly identify me out of a million people – or 7 billion – not just 10,000 or so.”

Megaface also tests the ability to confirm whether two photos are of the same person when they are presented among multiple “distractors”.

This process is used by police when they have one image of a person and want to know whether that person appears in a photo of a crowd or large group.
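Below is a minimal sketch of that identification task, again assuming precomputed embeddings. The rank1_identify helper and the toy 1,000-face gallery are illustrative stand-ins for a real gallery of up to a million distractor faces.

```python
import numpy as np


def rank1_identify(probe: np.ndarray, gallery: np.ndarray) -> int:
    """Identification: return the index of the gallery face whose embedding
    is closest to the probe. In the Megaface setting the gallery mixes one
    true match in with up to a million distractor faces."""
    # Normalise rows so that dot products equal cosine similarities.
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    return int(np.argmax(g @ p))


# Toy gallery: 1,000 random "distractor" embeddings plus one true match.
rng = np.random.default_rng(0)
distractors = rng.standard_normal((1000, 128))
true_match = rng.standard_normal(128)
gallery = np.vstack([distractors, true_match])
probe = true_match + 0.05 * rng.standard_normal(128)  # another photo of the same person

# Rank-1 accuracy asks: is the single closest gallery face the right person?
print(rank1_identify(probe, gallery) == len(gallery) - 1)
```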

The challenge also asked the facial recognition systems to identify the same person across different poses and at different ages.

Kemelmacher-Shlizerman said: “You can see where the hard problems are – recognising people across different ages is an unsolved problem.

“So is identifying people from their doppelgängers and matching people who are in varying poses like side views to frontal views.”

The Megaface results


Although the algorithms proved successful on the smaller initial tests, many of them began to struggle as the number of photos increased.

The system that came out on top was Google’s FaceNet, which achieved 75% accuracy on the one-million-person test.

The challenge was funded by the National Science Foundation, Intel, Samsung, Google, and the University of Washington Animation Research Labs and is still open and accepting results.

Check it out here.