Facebook translates ‘good morning’ into ‘attack them’, leading to arrest
Palestinian man questioned by Israeli police after embarrassing mistranslation of caption under photo of him leaning against bulldozer.
Facebook has apologised after an error in its machine-translation service saw Israeli police arrest a Palestinian man for posting "good morning" on his social media profile.
The man, a construction worker in the West Bank settlement of Beitar Illit, near Jerusalem, posted a picture of himself leaning against a bulldozer with an Arabic caption, yusbihuhum, which translates as "good morning".
But Facebook’s artificial intelligence-powered translation service, which it built after parting ways with Microsoft’s Bing translation in 2016, instead translated the word as "hurt them" in English or "attack them" in Hebrew.
Police officers arrested the man later that day, according to Israeli newspaper Haaretz, after they were notified of the post. They questioned him for several hours, suspicious he was planning to use the pictured bulldozer in a vehicle attack, before realising their mistake. At no point before his arrest did any Arabic-speaking officer read the actual post.
Facebook said it is looking into the issue, and in a statement to Gizmodo, added: “Unfortunately, our translation systems made an error last week that misinterpreted what this individual posted.
“Even though our translations are getting better each day, mistakes like these might happen from time to time and we’ve taken steps to address this particular issue. We apologise to him and his family for the mistake and the disruption this caused.”
Arabic is considered particularly difficult for many machine translation services due to the large number of different dialects in use around the world, on top of Modern Standard Arabic, the international form of the language.
The Israeli Defence Force has been open about monitoring the social media accounts of Palestinians, looking for lone-wolf attackers who might otherwise slip through the net. It reportedly does so automatically, using algorithms to look for terms such as "sword of Allah".
Machine translation mistakes are a regular occurrence for anyone using AI to translate languages, particularly between languages with little in common. Earlier this month, Chinese social network WeChat apologised after its own machine translation system rendered a neutral phrase meaning "black foreigner" as the n-word.
“When I ran the translator, the n-word came up and I was gobsmacked,” said Ann James, who had been texting back and forth with a friend when the faulty translation appeared.