Data shows race and sex bias is costly for gig economy workers at startups like TaskRabbit
Flex work companies like Airbnb and Uber have come under fire in recent months as evidence has piled up suggesting the platforms have problems with racial bias.
Now you can add gig-finding sites TaskRabbit and Fiverr to that list: A new paper analyzing 13,500 worker profiles on the two platforms found that race and gender bias appeared to affect both how often workers were reviewed and how positive those reviews were. Reviews can affect search result placement, and thus employment prospects, on certain platforms, the researchers found.
Researchers at Northeastern University, the University of Koblenz-Landau in Germany and ETH Zürich in Switzerland conducted the study.
The authors had Amazon Mechanical Turk workers go through the thousands of profiles and label each worker's perceived gender and race based on profile photos, a necessary step since neither platform requires workers to self-identify.
The researchers then studied the reviews for evidence of bias. On TaskRabbit, they found that workers perceived to be white women received 10% fewer reviews than those thought to be white men, and those perceived as Asian men received 13% fewer reviews than white men.
There were also statistically significant differences in where workers from different groups ranked on TaskRabbit's search results pages, said Anikó Hannák, one of the authors of the study.
White men placed relatively high, appearing at position 6.5 of 15 slots on average, while black men ranked lower, at 7.4 on average.
The same pattern did not hold for women of color: Black and Asian women actually ranked higher than white women by this measure.
The researchers found evidence of racial and gender bias in reviews on both TaskRabbit and Fiverr. But bias didn't affect search results on Fiverr the way it did on TaskRabbit, Christo Wilson, a professor at Northeastern who worked on the study, explained in a phone interview.
"On Fiverr we don't see any correlation between rank and demographics," Wilson said. "Which is what you want. But on TaskRabbit we do."
That's important because on TaskRabbit, reviews help determine where your profile appears in the search rankings. Fewer and poorer reviews can lead to fewer gigs.
"You have feedback that's coming in that has a bias that then influences an algorithm which typically promotes white male workers," Wilson said. "You have the potential for a feedback loop where certain workers are systematically getting pushed down in the search results."
One problem that perpetuates bias in the gig economy, the authors said, is a belief that computer algorithms are neutral.
"A lot of people still assume that algorithms present some sort of objective truth," Hannák said. "We still have to understand and get used to the fact that everything on the internet that is presented to us is sort of inherently biased and personalized. That has huge effects on how we see the world."
TaskRabbit disagrees with that assessment. The company believes its algorithms are unbiased because reviews aren't weighted heavily in its search rankings, a spokeswoman said in an emailed statement to Mic.
"We believe our algorithm is unbiased because of its four primary inputs: category of task, availability, geography, and fulfillment," a spokeswoman said. "We acknowledge that discrimination continues to be a challenge faced by society and are passionately taking a stand against these activities."
Hannák and Wilson both stressed to Mic that their findings only suggest a correlation; they haven't been able to prove that racial or gender bias is motivating people to leave fewer or worse reviews, although they hope to establish causation in future research.
In addition to increasing awareness, solving the problem will also require changing the way engineers are trained, Wilson said.
"Engineers don't like these kind of messy, value-driven discussions that they then have to go and quantify," Wilson said. "But if you're going to write something that's going to be used by millions of people you have a responsibility to the public."
In a statement about the study emailed to Mic, Fiverr argued that its platform doesn't pose the same problem because it doesn't require profile photos. A Fiverr spokeswoman also questioned the merits of the study, saying it failed to control for language and geography, and noted that Fiverr is used in 190 countries.
One problem with that argument: The study showed that workers lacking profile images tended to get fewer reviews, though the same was not true for workers with filler images like cartoons.
After allegations of racial bias on Airbnb's platform, the company set up a team of engineers to address the problem; a spokesman told Mic in an email that the team currently has roughly a dozen members.