- Tech startup Sanas has been accused of racism for its ‘accent translation’ technology.
- Artificial intelligence and tech industry experts say the startup’s mission legitimizes racism and is a form of “digital whitewashing.”
- But some call center agents told Insider they think the technology would improve their day-to-day work.
Accent translation startup Sanas has faced accusations of racism and discrimination in the past week, with critics saying it manipulates non-American accents to sound “whiter.” The company uses speech recognition technology to change a user’s accent in near real time; its main target appears to be overseas call center employees.
Sharath Keshava Narayana, co-founder and COO of Sanas, denied that the startup’s technology is discriminatory, telling Insider that the company has always intended to expand its translation model to include other accents. The demo on its website, in which the technology translates an Indian accent into a standard American one, shows only its initial design, according to Keshava Narayana.
“It’s not just an American who has trouble understanding someone from India, it’s the other way around,” Keshava Narayana told Insider. “As we continue to scale the product and as we start to see more and more targeted accents, we believe this will be a localized solution.”
Sanas has tested translation models in other countries, such as India and the Philippines, and plans to introduce accent translation in Latin America and Korea as well, according to the startup.
However, some experts within the technology industry have accused the startup’s product of being a form of “digital whitewashing.” Nakeema Stefflbauer, an AI and tech angel investor and the CEO of the women-led computer programming group FrauenLoop, told Insider that the problem with Sanas’ response is that “accents signal power and belonging.”
“When this is commercialized, there’s only one direction that everyone flocks to,” she said. “It’s not about understanding as much as it’s about comfort—for groups that don’t want to understand, empathize with, or engage with people who are different in any way. This technology does nothing for the comfort of the hypothetical call center employee.”
She added that unless Sanas also markets this technology to clients in the Global South as a means to better understand and communicate with Americans and Western Europeans, “it’s a one-sided ‘solution’ that reinforces racial hierarchies, whether intended or not.”
Artificial intelligence and tech industry experts, as well as call center workers, spoke to Insider about what they saw as the cultural costs and potential benefits of Sanas’ technology. While the company says the goal of its technology is to make people on the phone around the world sound more “local,” Stefflbauer and others in the AI field worry it’s another step toward a homogenized startup world, something that Silicon Valley has repeatedly been accused of perpetuating.
“What is this trying to tell us in terms of what the future sounds like and how we should all experience voices online and communicate with people?” Stefflbauer said. “Who are the people we need to communicate with and who are the people we never hear from?”
Tech industry experts say accent “translation” is a form of “digital whitewashing”
Sanas, which has raised $32 million in funding, says on its website that its goal is to help people sound “more local, more global.” In an interview with the BBC, Keshava Narayana said that 90% of the company’s employees and its four founders are immigrants, and denied criticism that the company is trying to make the world sound “white and American.”
But Mia Shah-Dand, founder of Women in AI Ethics and Lighthouse3, told Insider that as an immigrant from India with a non-American accent, she found Sanas’ announcement “very inflammatory,” especially as someone who has been “bullied and discriminated against because of [their] accent.”
She said the technology is trying to erase what makes people unique and is telling them that who they are “isn’t good enough.”
“It feels like anything in Silicon Valley, as long as it’s legitimized by the Stanfords or MIT, it’s OK,” she said. “People will accept racism, they will accept sexism, as long as the people doing it belong to one of these prestigious universities.”
Shah-Dand added that Sanas’ product reinforces a power dynamic that “resembles colonial times.” Rather than addressing the root causes of racism and discrimination, “accent translation” leans toward a form of “whitewashing,” a power dynamic seen in many historically colonized countries where people have felt pressured to lighten their skin to fit European standards of beauty.
“It’s Silicon Valley’s version of digital whitewashing,” Shah-Dand said. “Instead of technology making the world a better place, it’s reinforcing, helping, just cashing in on all that hate and racism instead of trying to fix anything.”
Stefflbauer told Insider that she found Sanas’ technology “really frustrating and disturbing,” especially amid the growing workplace culture of bringing “your whole self to work.”
“Only some people can bring all of themselves; everyone outside of this mythical norm is not invited to bring any of themselves,” she said, referring to the 2018 dark surrealist comedy Sorry to Bother You, in which a young black telemarketer finds the doors of professional success open to him only after he adopts a resounding “white” voice.
“It’s really another example of what we’re up against in terms of trying to make the tech industry and the products and services that come out of it reflect the real world,” Stefflbauer said.
She added that she doesn’t see how this technology would actually address racial bias in any way.
“It doesn’t even come close to that in its solution,” she said. “It basically provides support and cover for people who are going to misbehave with individuals with whom they have any sort of relationship, to continue to do so.”
Call center agents told Insider they face racial animosity
Sanas’ founders said they came up with the idea for the startup after a college friend from Stanford underperformed at his call center job because of his thick Central American accent.
Call center agents Insider spoke to said their jobs can be brutal — and doubly so if they have a racially distinct accent or name.
“Unfortunately in this world, there are many people who will feel like they are better than you or will choose to talk down to you when they hear your accent,” said Dafina Swann, who has worked in call centers for more than five years.
Swann, who is from Trinidad and Tobago, said she received many “hostile” and “negative” comments from callers who asked to speak to someone who is American. She had also heard of cases where colleagues were called racist names, such as the n-word, and told they were “not human, but black.”
To minimize the racial backlash they face, some call center agents told Insider they already try to imitate customers’ accents, even changing their names. Sometimes, the directive to change their names comes from agents’ managers or employers.
“After I started introducing myself as Michael O’Connor, my performance ratings from customer surveys went up — all green, green, green,” Osama Badr, a call center agent from Egypt, told Insider.
Sanas co-founder Keshava Narayana said he too had a similar experience when he worked in a call center, where he underwent six weeks of accent training and was told to change his name to “Ethan.”
“There are some incidents that stick with you for a long time and this was one of them,” he told Insider.
Some fear that manipulated voices could signal a homogenized future in technology
Shah-Dand said she is unconvinced by the company’s defense of the technology, saying people are exposed to and can understand different accents; it’s only because call center workers are treated as “less than” that they receive unfair abuse.
“There are many people who have very heavy accents, like Boutros Boutros-Ghali for example,” said Shah-Dand, referring to the former United Nations Secretary-General. “But because they are in positions of power, you make an effort to understand.”
Stefflbauer said that in her work, she’s always thinking about what digital life will be like 10 or 20 years in the future, and she’s worried about what technology like Sanas’ portends.
“I see more and more examples of a digital life where no one is black, no one is brown, no one has an accent, no one has a history outside of the mythical North American ideal,” Stefflbauer said. “And the question is: do we want to export this mentality and bring this misery to everybody? Because that’s definitely what it is.”

Other AI technologies, including facial recognition, have also faced accusations of racism and homogenization.
“Who would be comfortable taking a selfie on Instagram and having their face automatically changed to look like someone of a different race?” she said. “That’s basically what it is.”
But call center workers who have to deal with racist comments in their day-to-day work say a solution like the one Sanas offers can be a blessing.
“It definitely would have made my job easier. Everybody wants to be understood,” Swann said. “There’s a job that needs to be done and if there’s something that can be implemented to make that job easier, then that’s great.”