How Airbnb is Addressing Racism in the Sharing Economy
CJ Todd | December 27, 2016
In September 2016, Airbnb rolled out a slate of new policies to address concerns about discrimination against African American and Hispanic renters. But the home-sharing company may still be falling short of racial equality among its users.
In a recent article for CityLab, staff writer Brentin Mock presents data from Boston University economist Ray Fisman and Harvard Business School professor Michael Luca. They found that Airbnb is dropping the ball when it comes to collecting and disclosing data on the race and gender of its users.
The duo published findings based on research by Luca's group, which created 20 fake Airbnb profiles: half with commonly used African American names and half with common Caucasian names. The findings were upsetting, to say the least...
"Requests with black-sounding names were 16% less likely than those with white-sounding names to be accepted."
The fake users contacted 6,400 hosts about their properties. Requests with black-sounding names were 16% less likely than those with white-sounding names to be accepted. The discrimination was pervasive, running through all types of listings.
Most hosts who declined requests from black-sounding profiles had never hosted a black guest—suggesting that some hosts are especially inclined to discriminate on the basis of race.
According to Luca and Fisman, the problem hinges on businesses like Airbnb becoming too reliant on algorithms and big data to manage online commerce. The pair believe that algorithm-generated discrimination can occur in ways that human decision-makers would likely avoid.
"Algorithms don’t naturally launder racism out of business transactions—if data generated from a racist society is what goes into the formula, racism is what comes out."
Data that some companies depend on reflect racism that already exists in American society and institutions. "Algorithms don’t naturally launder that kind of racism out of business transactions—if data generated from a racist society is what goes into the formula, racism is what comes out," says Mock.
When a company neglects to collect data on race or gender altogether, leadership shouldn't be surprised when racism and sexism become hard to detect, let alone stop. Fisman and Luca suggest that companies can correct the problem by building algorithms that are more attuned to potential bias and designing websites with fewer opportunities for discrimination.
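In practice, being "attuned to potential bias" starts with an auditing exercise: measure outcomes by group and check whether the gaps are statistically meaningful. The sketch below is a minimal, hypothetical illustration of such a check on booking data; the function, the counts, and the two-proportion z-test are illustrative assumptions, not code from the study or from Airbnb.

```python
import math

def acceptance_rates(accepted_a, sent_a, accepted_b, sent_b):
    """Compare acceptance rates for two groups of otherwise-identical
    guest profiles and return both rates plus a two-proportion z statistic,
    a standard first check for a statistically meaningful disparity."""
    rate_a = accepted_a / sent_a          # e.g. requests from group A
    rate_b = accepted_b / sent_b          # e.g. requests from group B
    pooled = (accepted_a + accepted_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / se            # |z| > 1.96 ~ significant at 5%
    return rate_a, rate_b, z

# Hypothetical counts, not figures from the study: 3,200 requests per group,
# with group B accepted noticeably less often than group A.
rate_a, rate_b, z = acceptance_rates(1600, 3200, 1340, 3200)
print(f"group A: {rate_a:.1%}, group B: {rate_b:.1%}, z = {z:.2f}")
print(f"group B is {(1 - rate_b / rate_a):.0%} less likely to be accepted")
```

With these made-up numbers the gap works out to roughly 16 percent, the same order of magnitude the researchers reported; a platform running this kind of check routinely would at least know when its marketplace is producing skewed outcomes.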
"We're eager to work with researchers and experts who share our commitment to building a community that's fair for everyone and we will review all of these recommendations."
Airbnb spokesman Nick Papas told CityLab that the company is still in the early stages of its anti-discrimination efforts. “The proposal we put forward in September was just the beginning of our work to fight bias and discrimination,” he says. “We're eager to work with researchers and experts who share our commitment to building a community that's fair for everyone and we will review all of these recommendations.”
Airbnb is one of the few companies in the sharing economy taking steps to improve the racial diversity of its employees and leadership. Its most recent report to the Equal Employment Opportunity Commission showed that just 2.9 percent of its staff are African American, while 10 percent identify as minorities. The company has set a goal of raising that figure to 11 percent in 2017.
It's difficult to say how racial issues will be addressed moving forward, but at least Airbnb has recognized the problem and has publicly announced its dedication to fighting discrimination.
Have you experienced racism within the sharing economy? I'd love to hear your story and discuss how you think companies should address the problem. Leave a comment below and let's start exploring solutions!