The eye of tech-artist Benjamin Males’ custom-made surveillance camera is engineered for a black and white world.
Black and white people, that is.
Males, 25, a mechanical engineer who recently graduated with a master’s degree from London’s Royal College of Art, wrote the software for a camera that determines a person’s race.
The RTS-2 (Racial Targeting System) is essentially an automated racial-profiling tool, one that governments and police have not dared touch due to privacy and human-rights concerns, even though the technical capabilities already exist.
However, Males built the camera to raise public awareness of such issues. CCTV (closed-circuit television) cameras are so prevalent, especially in the U.K., that the public often appears oblivious to how frequently it is watched.
Surveillance cameras “have a significant effect on our lives and civil liberties,” Males said. “We, as the public, aren’t really in a position to discuss them or critique them because they are developed behind closed doors.”
Males bought the CCTV camera on eBay. He wrote the software in C++, in part using the Open Source Computer Vision Library originally developed at Intel, a library of programming functions for computer-vision applications.
Males fitted the camera with a motor, so when it detects a face, it tracks the person's movements. He intended that people targeted by the camera have some indication they're being monitored.
The camera supplies an image of a person's face via a USB (Universal Serial Bus) cable to a laptop. The software then takes a color sample of the person's nose and cheeks and averages the pixel values to come up with an approximate determination of the person's race, Males said. The output is shown as a percentage, such as 90.3 percent white, 9.7 percent black — a mathematical representation of how the computer has sampled and classified the person's skin.
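The article does not publish Males' code, but the sampling-and-averaging step it describes can be sketched roughly as follows. This is a minimal illustration, not the RTS-2's actual algorithm: the reference tones, the inverse-distance weighting, and the function name `classify_skin_sample` are all assumptions made for the example.

```python
import numpy as np

# Hypothetical reference skin tones (RGB averages) — illustrative values only,
# not anything from Males' software.
REFERENCE_TONES = {
    "light": np.array([200.0, 170.0, 150.0]),
    "dark": np.array([90.0, 60.0, 45.0]),
}

def classify_skin_sample(patch: np.ndarray) -> dict:
    """Average the pixels of a sampled nose/cheek patch and express how
    close the average sits to each reference tone as a percentage."""
    mean_color = patch.reshape(-1, 3).mean(axis=0)
    # Distance from the averaged sample to each reference tone.
    dists = {name: np.linalg.norm(mean_color - tone)
             for name, tone in REFERENCE_TONES.items()}
    # Inverse-distance weighting: a closer tone gets a higher percentage.
    inv = {name: 1.0 / (d + 1e-9) for name, d in dists.items()}
    total = sum(inv.values())
    return {name: 100.0 * w / total for name, w in inv.items()}

# Usage: a synthetic 10x10 patch of near-uniform color close to "light".
patch = np.full((10, 10, 3), [195, 168, 148], dtype=np.float64)
scores = classify_skin_sample(patch)
```

In a real pipeline the patch would come from a face detector (OpenCV's cascade classifiers are the obvious candidate, given the library Males used), and the percentages would be driven by whatever classification scheme the software implements rather than this toy weighting.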
All of the RTS-2’s components run on batteries, and the setup is portable. Males has taken it to places such as Covent Garden and Kensington High Street in London, both areas busy with tourists and shoppers. Nearly everyone who passed by either didn’t notice the camera or barely paid attention, a finding that shows how accustomed people are to being monitored, Males said.
Males later mashed the color samples from people’s faces together into one big color swatch, creating a collage of the skin tones seen in a neighborhood. The London neighborhood of Brixton – the scene of violent race riots in 1981 – was “very brown and quite white.” The collage for Kensington High Street, an affluent area in the West End, showed “rich oranges and terracotta,” Males said.
Males has also displayed the RTS-2 in Japan and at London’s Royal College of Art as an art installation called “The Target Project.”
When the device is displayed in a more controlled environment, people are more curious. Males said he was asked why he would create a racial classification device and what he would do if a government asked him to develop the system further.
The second question is moot: The technology already exists, and it’s much more refined, Males said.
“The device isn’t that sophisticated,” Males said. “This software exists at a much more sophisticated and dangerous level in the commercial world. You can buy facial-recognition technology that looks at features and tries to match people.”
But using automated tools such as CCTV to target people by race raises questions about ethnic profiling, which some experts argue makes a person’s race a primary basis for suspicion, even before any suspicious behavior has been observed.
After the July 2005 terrorist bombings in London, many Asians complained of increased police scrutiny and aggression in their communities, merely because some of the attackers were Asian. The issue heightened tensions between Asians and police, potentially hurting the police’s ability to collect valuable intelligence from sources within those communities.
“Personally, I think there’s a place for these kinds of technologies,” Males said. “Technology has a role to play in our security and safety, but there needs to be proper discussion. There needs to be a bit more openness.”