With a day left in the holiday shopping season, there’s just the thing for that last-minute stocking stuffer (or party gag gift, depending on your point of view), courtesy of no less than Hewlett-Packard, a company now in a defensive crouch over the workings of a product that stands in for human experience, perhaps more closely than we might care to admit.
The Black Spin blog at AOL Black Voices announced recently that the new HP webcams, built into the latest generation of MediaSmart computers, have a little … bias problem. The cameras, intended to foster live online conversations, are supposed to follow the movements of whoever is on camera.
But not for everyone. Earlier this month, the matter was widely publicized in a video posted to YouTube by two employees of an electronics store, a black man (“Black Desi”) and a white woman (“White Wanda”), who tested the face-tracking technology with curious results.
The video shows the HP webcam working properly when White Wanda steps in front of the camera, tracking her movements without fail. But not for Black Desi: the camera, whose field of vision is the video itself, doesn’t budge when he enters the frame.
“I think my blackness is interfering with the computer’s ability to follow me,” Desi says in the clip.
"As you can see the camera is panning to show Wanda's face, it's following her around, but as soon as my blackness enters the frame ... it stops."
"I'm going on record, and I'm saying it: Hewlett-Packard computers are racist,” Desi says. “And the worst part is, I bought one for Christmas.”
HP performed timely, and seemingly sincere, damage control. Tony Welch, lead social media strategist in HP's PC division, said the company was investigating the issue. "The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose," Welch wrote on a company blog.
"We believe that the camera might have difficulty 'seeing' contrast in conditions where there is insufficient foreground lighting."
To be fair, HP has tried to correct the problem, or at least illuminate people as to why it happens: Welch linked to a webpage with information “on the impact of lighting on facial tracking software, and how to optimize your webcam experience.”
◊ ◊ ◊
But there’s no escaping the echoes of events in the recent past, other instances of intolerance by algorithm. In early December, Google came under fire when a racist image of First Lady Michelle Obama briefly topped its image search results. Much the same thing happened a few years earlier, when people searching Google for images of two prominent African Americans got pictures of animals instead.
Then, as now, Google gamely (and justifiably) defended the randomness of the search experience as fundamental to free expression. But still: there are other ways of seeing this. Joho, blogging at Hyperorg.com, poses one of the more solid arguments:
“Google’s algorithms are undoubtedly tuned by looking at the relevancy of the results. If they come up with a new wrinkle, they check it against the results it returns. So, the algorithms are already guided by Google’s own sense of what are good, useful and relevant results. If they tested a tweak of their ranking algorithm and it turned out always to put all the porn and pro-Nazi literature on top, Google would judge that algorithm as faulty.
“So, Google is already in the business of building algorithms that match its idea of what’s useful and relevant. When those algorithms occasionally turn up racist crap like that photo of Michelle, why not improve the algorithm’s results by intervening manually?”
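In code terms, the intervention Joho describes is a small final pass: rank as usual, then demote or drop results a human reviewer has flagged. A sketch, with an invented Result type and a hand-maintained blocklist standing in for whatever review process a search engine might actually run:

    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        score: float  # relevance score assigned by the ranking algorithm

    # Hand-curated flags from human review; a stand-in for any real process.
    FLAGGED = {"example.com/offensive-image"}

    def rank(results):
        # Purely algorithmic ranking: sort by relevance score.
        return sorted(results, key=lambda r: r.score, reverse=True)

    def rank_with_intervention(results):
        # The same ranking, followed by a manual pass over the output.
        return [r for r in rank(results) if r.url not in FLAGGED]

    results = [Result("example.com/offensive-image", 0.97),
               Result("whitehouse.gov/official-portrait", 0.91)]
    print(rank(results)[0].url)                    # the flagged image leads
    print(rank_with_intervention(results)[0].url)  # it no longer does

Whether such a filter belongs in the index, the ranker, or a last-mile pass is an engineering detail; Joho’s point is that a judgment call about what counts as a good result is already being made.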
◊ ◊ ◊
The HP and Google faux pas may be precursor evidence of what Ray Kurzweil calls “the singularity,” a fast-approaching and epochal transformation of the boundaries between biology and technology. Or maybe they’re examples of technology doing what machines have always done: performing a given task in minutes, then seconds, then nanoseconds, growing ever faster and more efficient. If so, it’s evidence that they’re simply faster at understanding human behavior, and its inherent biases, than we are at understanding them.
For people darker than a paper bag, what happened with HP’s product and Google’s service may or may not be racism, but it is evidence of consumer technology adopting a cultural meme of rendering people with dark skin as abstract, ahistoric, insubstantial. We have been here before: Ralph Ellison’s justly celebrated “Invisible Man” observes the pain and perils of a black man’s apparent invisibility to the wider world.
Homo ex machina: Today, that “peculiar disposition of the eyes” Ellison attributed to humans three generations ago may well be a camera lens or a passive-aggressive algorithm. The broad and longstanding cultural act of not seeing, not really seeing, blacks and minorities outside the HTML governing one’s personal, inner software has finally been distilled, albeit accidentally, into the functions of our technology.
Image credits: HP camera: Via amazon.com. Google logo: © Google. Invisible Man cover: © Random House.
Wednesday, December 23, 2009