Chinese messaging app error sees n-word used in translation





This article was written by Alex Hern for theguardian.com on Friday 13 October 2017 at 10.31 UTC.


Chinese messaging app WeChat has reportedly apologised after an AI error resulted in it translating a neutral Chinese phrase into the n-word.


The WeChat error was reported by Shanghai-based theatre producer and actor Ann James, a black American. In a post on the service’s Twitter-like Moments feature, she wrote that it had translated hei laowai – a neutral phrase which literally means “black foreigner” – as the n-word.


“We’re very sorry for the inappropriate translation,” a WeChat spokesperson told Chinese news site Sixth Tone. “After receiving users’ feedback, we immediately fixed the problem.”


WeChat added that the translation engine it uses is a neural network-based service, which is always being tweaked for “more accurate, faithful, expressive, and elegant” output.


Such systems are notorious for incorporating the biases and errors of the data sources on which they are trained, which appears to be what occurred here.


In the example James provided, the offending sentence read “the nigger’s still late”. A series of similar tests by a local news site, That’s Shanghai, showed that the system could produce a more accurate translation for positive sentences: “Black foreigners is cool” came out more or less correct, while sentences describing black foreigners as “lazy” or “a thief” were translated using the n-word. In all the tests, the same Chinese phrase, hei laowai, was used.


The machine learning system may, then, have been trained on a corpus of Chinese and English text which contained racial slurs alongside stereotypical descriptions of black people.
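To make that hypothesis concrete, here is a deliberately crude sketch, in Python, of how skewed phrase counts in a parallel corpus can push a system towards different renderings of the same source phrase depending on the surrounding sentiment. Every number and name here is invented for illustration (the slur is redacted as “<SLUR>”), and WeChat’s actual system is a neural model, not a phrase table, but the failure mode is the same.

    from collections import Counter

    # Hypothetical aligned-phrase counts mined from an invented corpus:
    # the neutral rendering dominates in positive contexts, the slur
    # dominates in negative ones.
    counts = {
        "positive": Counter({"black foreigner": 90, "<SLUR>": 10}),
        "negative": Counter({"black foreigner": 20, "<SLUR>": 80}),
    }

    def translate_hei_laowai(context_sentiment):
        # Pick whichever target phrase co-occurs most often with this
        # kind of context in the (fictional) training data.
        return counts[context_sentiment].most_common(1)[0][0]

    print(translate_hei_laowai("positive"))  # -> black foreigner
    print(translate_hei_laowai("negative"))  # -> <SLUR>

Even this toy captures the problem: the model has no notion that one rendering is a slur; it simply reproduces whichever mapping dominates its data.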


Speaking to the Guardian, James said: “When I ran the translator, the n-word came up and I was gobsmacked.” She told Sixth Tone that she was unsurprised by the error. “If you’re a black person in China, you’ve come up against some craziness,” she said. “I know there’s a lot of curiosity and a lot of ignorance about black people [in China].”


James said she was pleased by WeChat’s response. “WeChat changed the algorithm, and now no one in 900 million daily users will ever have to translate that n-word again,” she said over email. “Pretty epic.”


Machine translation making questionable choices isn’t just a problem in China. Google’s own translation product, widely seen as the best on the market, can offer up problematic outputs. Users attempting to translate gender-neutral Turkish sentences for “they are a doctor” and “they are a nurse”, for instance, will instead find the service making sexist assumptions and translating the two sentences as masculine and feminine respectively.
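The Turkish case can be mimicked with the same kind of toy. Turkish’s third-person pronoun “o” carries no gender, so a system must guess the English pronoun; the sketch below, again using invented counts rather than Google’s real model, shows how a simple majority vote over skewed training data produces exactly the reported split.

    from collections import Counter

    # Invented counts of which English pronoun appears alongside each
    # profession in a fictional Turkish-English training corpus.
    pronoun_counts = {
        "doktor": Counter({"he": 85, "she": 15}),
        "hemşire": Counter({"he": 10, "she": 90}),
    }

    def resolve_pronoun(profession):
        # "o" is gender-neutral in Turkish, so the toy model falls back
        # on whichever pronoun is most common for the profession.
        return pronoun_counts[profession].most_common(1)[0][0]

    for turkish, english in [("doktor", "doctor"), ("hemşire", "nurse")]:
        print(f"o bir {turkish} -> {resolve_pronoun(turkish)} is a {english}")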


And just over two years earlier, Google was also forced to apologise after its machine vision system labelled a photo of two black people as “gorillas”.


WeChat did not comment further by the time of publication.


guardian.co.uk © Guardian News & Media Limited 2010





