“It got really, really hard. I went from making $200 a day to struggling to make $100,” Salas said.
That’s $100 a day before expenses like gas and wear and tear on her car, while she went to school for radiology and took care of her newborn baby.
Did she get low ratings because she did something wrong? Were the customers just grumpy? Or did they react negatively to who she is? Salas is part Pacific Islander, part Native American.
“It’s kind of a bummer,” Salas said. “I would have wished to know why so I could improve myself.”
Frustrated, Salas reached out to Gig Workers Rising, an advocacy group for app workers. Lead organizer Lauren Casey said she has heard this same story again and again from workers of color.
Casey said, “Their performance at work is held to a different standard and in turn they receive worse ratings.”
A representative from Instacart said it has policies to deal with overt racism, but, like other app companies, Instacart has no mechanism for detecting implicit bias, let alone addressing it.
No Data, No Context
Stanford University law professor Richard Ford said the app rating system has magnified the problem of implicit bias, making it easier for customers to hurt workers and harder for workers to prove it is happening.
“You don’t have context, and you don’t have the interpersonal reactions that might give you some clue that the ratings were based on race,” Ford said.
All you have is a number, and given our society’s increasing fetishization of data, Ford said a number without context can be very dangerous. “The difference in today’s environment is that it looks more objective. You’re getting, you know, a numerical rating. How could you argue with the numbers?”
Even if the ratings are high, it doesn’t mean they are fair. It’s possible that a person of a different race, sex or origin had to work harder to get good ratings.
UC Hastings labor law professor Veena Dubal has interviewed more than 100 Lyft and Uber drivers. She said Black and brown drivers often talk about having to perform to make white customers happy.
“There’s a lot of emotional labor and a lot of emotional performance that goes into ensuring that you’re not getting poor ratings, because otherwise you’re going to get fired. It’s almost that you have to play into the racial sensibilities of consumers,” Dubal said.
Some restaurants pool tips so any negative impacts from implicit bias are shared by the whole staff. App companies could adjust tips and ratings for Black and brown drivers to compensate for bias, but that means first figuring out how much lower they are on average.
Thanks to Proposition 22, app companies face no legal pressure to gather the necessary demographic data. Without data, individual workers are left to interpret their own experience, isolated and unprotected.
— to www.kqed.org