The Reason This "Racist Soap Dispenser" Doesn't Work on Black Skin

At a Marriott hotel in Atlanta, the soap dispensers have a little bit of a race problem.

An African-American guest of the Dragon Con sci-fi and fantasy convention visited a bathroom in the event's host hotel and discovered the soap dispenser, from a British company called Technical Concepts, wouldn't sense his hands. When his friend, a white man named Larry, tried after him, out came the soap.

This ordeal was captured on video. And while there's plenty of giggling in the clip, it's because of the sheer absurdity of a technology that is meant to sense motion, not skin tone, yet is inoperable based on pigment.

"I wasn't offended, but it was so intriguing, like 'Why is it not recognizing me?'" T.J. Fitzpatrick, the narrator of the video, told Mic. "I tried all the soap dispensers in that restroom, there were maybe 10, and none of them worked. Any time I went into that restroom, I had to have my friend get the soap for me."

Fitzpatrick said he saw the humor in the situation, despite the derogatory comments that followed the video on YouTube. He recalled comments like, "We all know black people don't actually wash their hands anyways," and "Soap dispenser is for human not monkeys or subhumans."

Neither the Atlanta Marriott nor Technical Concepts responded to requests for comment.

What's actually happening: According to Richard Whitney, VP of product at Particle, the soap dispenser uses near-infrared technology, which sends out invisible light from an infrared LED bulb; hands reflect that light back to a sensor. The reason the soap doesn't just foam out all day is that the hand, more or less, bounces the light back and closes the circuit. "If the reflective object actually absorbs that light instead, then the sensor will never trigger because not enough light gets to it," Whitney told Mic.

Whitney presented two extremes. If someone were to put a mirror up to the sensor, the light would reflect easily, triggering the sensor with no problem. But a material like Vantablack, which reflects only 0.035% of the light pointed at it, would be kryptonite to an IR sensor.

"In order to compensate for variations in skin color," Whitney said, "The gain, [or] sensor equivalent to ISO and exposure in cameras, would have to be increased."
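The mechanism Whitney describes can be captured in a toy model: the LED emits a fixed amount of infrared light, the hand reflects some fraction back, the sensor amplifies the return by a gain, and the dispenser fires only if the result clears a threshold. This is a sketch for illustration, not the dispenser's actual firmware, and every number in it (emitted power, threshold, the reflectance figures except Vantablack's 0.035%) is an assumption, not measured data.

```python
def sensor_triggers(reflectance, gain=1.0, emitted=100.0, threshold=20.0):
    """Toy model of an IR proximity sensor.

    The LED emits `emitted` units of infrared light, the object
    reflects back a fraction `reflectance`, and the sensor amplifies
    that return by `gain`. It triggers only if the amplified return
    clears `threshold`. All numbers are illustrative assumptions.
    """
    returned = emitted * reflectance * gain
    return returned >= threshold

# Hypothetical reflectance values, for illustration only:
sensor_triggers(1.00)             # mirror: nearly all light returns, triggers
sensor_triggers(0.15)             # low-reflectance hand: misses the threshold
sensor_triggers(0.15, gain=2.0)   # raising the gain, as Whitney suggests, compensates
sensor_triggers(0.00035)          # Vantablack reflects 0.035%: never triggers
```

The point of the sketch is the one Whitney makes: nothing about the sensor "sees" skin tone directly, but a threshold tuned with too little gain headroom can silently fail for lower-reflectance hands while working fine for higher-reflectance ones.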

No one's skin is dark enough to absorb light like Vantablack. And precisely because the technology world is mostly made up of white people, a testing phase that measures how well a product works for people of different skin tones is crucial.

The same principle applies to security cameras. According to Whitney, such a camera is essentially using an IR flashlight, and it can't see where there's no illumination, or no reflection.

Whitney said there might be other elements in play. The sensor may have been touchy, only picking up hand movement at weird angles that Fitzpatrick didn't hit but his friend did. Or he could have been minimizing his hand exposure to not be detected as easily.

"I know how the sensor works, have pasty white skin and I still have to flail at automatic faucets on a regular basis," Whitney told Mic. "It's not that I disbelieve that the scenario could happen, but that I believe it's unlikely to be the case here. The main reason for skepticism is that the difference in coloration of the palms we see in the video is very small. Small enough that I ... would expect that narrow a range of sensitivity to cause all sorts of other problems."

The real problem: Fitzpatrick's video is far from the first of its kind.

Fitzpatrick made it, and it's clear this is a joke to him. But it introduces the more pervasive problem of technology being constructed without paying mind to the diversity of bodies it is built to serve.

In 2010, Gadgetwise reported that the Xbox Kinect did not recognize the faces of dark-skinned gamers. The company later attributed this to a tricky light sensor, since the results could only be replicated in low light — which is significant, since most people spend a Friday night playing Xbox in a dim living room and not beneath fluorescent cafeteria lamps.

Before the Xbox incident, a black man and a white woman on YouTube demonstrated Hewlett-Packard's uneven facial recognition software, in which the camera tracked the woman's movements but didn't follow the man's.

Earlier this year, Google Photos' auto-labeling system misidentified two black friends as "gorillas" in their photos together, setting off both a social media uproar and a demand for Google to step up its technology to be more sensitive about the words it uses in photos of people.

Flickr's auto-tagging feature also egregiously mislabeled an African-American man as "animal" and "ape" before the Flickr team went in to remove the tags, claiming the algorithm was still learning how to recognize images.

It remains a mystery why all these pieces of software are having a hard time acknowledging African-Americans as people who play video games or need soap, especially since, as Whitney said, the pigment differences probably weren't enough to throw the sensor out of whack. But when it results in computers only being able to identify a certain percentage of the people who use them, it goes beyond just being a viral video and becomes an issue that needs addressing.