Lately, there's been a lot of buzz in the media about Facebook's facial-recognition technology--admittedly, in part thanks to us. But now we've decided to look at the technology itself: Is it as dangerous as we think it is? The short answer: Facebook's new facial-recognition feature doesn't yet work well enough to pose a significant threat to your privacy.
How Well Does It Work?
Facebook says its facial-recognition technology is convenient because it groups together multiple images of the same person; as a result, you have to type a friend’s name only once, and the tag will apply to all photos of that person. If your friend has been previously tagged in enough photos, Facebook will suggest his or her name so you don’t have to do anything. And yes, we think that would be more convenient--if it worked.
We did a few experiments to see how well Facebook's facial-recognition technology, in its current state, actually recognizes faces. We uploaded pictures of people who had no tags on Facebook, as well as photos of people who had many tags; we even reuploaded photos that already existed on Facebook with tags, just to see what Facebook could detect and what it couldn’t. Since Facebook lets you disable the facial-recognition feature in your privacy settings, we made sure to upload photos only of people who had left it on.
As it turns out, the tool is nowhere near advanced enough to be worrying.
At the moment, Facebook's facial-recognition technology seems to be good at one thing, and one thing only: recognizing that a face is a face. Considering that most point-and-shoot cameras have such technology, this isn't exactly mind-blowing.
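Facebook hasn't published how its detector works, but most face detectors follow the same basic recipe: slide a window across the image and score each patch for face-likeness, with no notion of identity at all. The Python sketch below is a toy stand-in for that idea--it uses a bright-blob "template" instead of real face features, and every name in it is ours, purely for illustration:

```python
import numpy as np

def detect_brightest_window(image, size=8):
    """Toy 'detector': slide a window over the image and return the
    position whose patch scores highest against a generic bright-blob
    template. Real detectors (e.g. Viola-Jones) scan windows the same
    way, scoring each for face-like patterns -- identity never enters
    into it."""
    best_score, best_pos = -1.0, None
    for y in range(image.shape[0] - size + 1):
        for x in range(image.shape[1] - size + 1):
            score = image[y:y + size, x:x + size].mean()
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

img = np.zeros((32, 32))
img[10:18, 14:22] = 1.0  # a bright "face-like" blob at row 10, column 14
print(detect_brightest_window(img))  # -> (10, 14)
```

The point of the sketch: the detector happily reports *where* the blob is, but it has no machinery at all for saying *whose* blob it is--which is exactly the gap between face detection and face recognition.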
As you can see below, Facebook's facial-recognition technology has no trouble recognizing face shapes in these photos. Unfortunately, it can't tell that they're all photos of the same person.
Below, Facebook is able to group photos together and determine that the top three are of one person, and the bottom two are of another person. Pretty good, right? Except for the fact that, again, these are all photos of the same person.
Facebook's facial-recognition technology appears to be able to recognize people's faces across multiple photos if the lighting is similar, the coloring is similar, the angle is similar, and the person's expression is similar. For the photos below, Facebook recognizes the first two as being of the same person--but that isn't so impressive, considering that the lighting, coloring, angle, and expression are all basically the same in both photos. But Facebook can't tell that the other two photos are of the same person, probably owing to the drastic changes in lighting and angle.
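Facebook's actual matching algorithm is proprietary, so we can only illustrate the general problem. The toy Python sketch below (with random arrays standing in for face crops--all names are our own) shows why a naive pixel-by-pixel comparison falls apart when the lighting changes, and how a standard preprocessing step, normalizing each image's brightness and contrast, recovers the match:

```python
import numpy as np

def pixel_distance(a, b):
    """Mean absolute difference between two image arrays."""
    return float(np.mean(np.abs(a.astype(float) - b.astype(float))))

def znorm(img):
    """Normalize an image to zero mean and unit variance, which
    largely cancels out global lighting/exposure differences."""
    img = img.astype(float)
    return (img - img.mean()) / img.std()

rng = np.random.default_rng(0)
face = rng.integers(0, 200, size=(64, 64))               # stand-in for a face crop
similar_shot = face + rng.integers(0, 5, size=(64, 64))  # same pose, same lighting
darker_shot = (face * 0.3).astype(int)                   # same face, much darker lighting

# Raw pixel comparison: the darker shot looks like a different person.
print(pixel_distance(face, similar_shot) < pixel_distance(face, darker_shot))  # -> True

# After normalization, the darker shot matches closely again.
print(pixel_distance(znorm(face), znorm(darker_shot)))
```

If something like this is what's going on under the hood, it would explain the pattern we saw: matches succeed when lighting, color, angle, and expression stay put, and fail when they don't.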
Apparently Facebook can still recognize faces if only one of those variables changes. In the photos below, Facebook correctly identifies them as being of the same person, despite the change in expression; the other variables (lighting, color, angle) are all the same.
We heard a rumor that Facebook's facial-recognition system wasn't too great at recognizing Asian faces. As far as we can tell, that isn't true--the photo set below was one of the few sets that Facebook grouped and identified correctly.
Sometimes Facebook is just weird. Even though Megan has over 600 tagged photos of herself on Facebook--a database large enough to give any face-recognizing technology a pretty good idea of what she looks like--we reuploaded four previously uploaded photos of her, and Facebook was able to correctly identify only the one photo in which she's wearing a fake moustache.
We can draw two conclusions here: First, Facebook's facial-recognition system is pretty bad. Second, fake moustaches are bad disguises.
Next Page: How Facebook recognizes faces, and whether you should worry