Imagine debating people who speak many different languages at an international conference, or chatting with new friends from around the globe at a bar in a far-flung vacation spot, even though you only speak one language yourself. These dreams and more may yet come true with the help of a tiny smart translation program.
On September 27 Google sparked worldwide fears about artificial intelligence (AI) replacing human beings when it launched its latest Google Neural Machine Translation system (GNMT), which Google stated utilizes "state-of-the-art training techniques to achieve the largest improvements to date for machine translation quality."
A different approach
According to a Google technical report, GNMT surpasses previous translation systems in that it uses "human-rated side-by-side comparison as a metric" and "considers the entire input sentence as a unit," thus requiring fewer engineering design choices.
"The previous translation system is based on words, phrases and grammar, putting words together in order. But the GNMT system is based on recurrent neural networks and an attention mechanism," Wang Shijin, deputy dean of iFlytek Research, a research center that develops Siri-like speech and language technology, told the Global Times on Tuesday.
According to an article on the news website guancha.cn, the attention mechanism links machine translation's decoding process with its encoding process, improving the degree of parallelism and as a result accelerating translation speed.
For instance, when GNMT translates between Chinese and English, a notoriously difficult language pair, the network first "encodes the Chinese words as a list of vectors, where each vector represents the meaning of all the words read so far (Encoder). Once the entire sentence is read, the decoder begins, generating the English sentence one word at a time (Decoder). To generate the translated word at each step, the decoder pays attention to a weighted distribution over the encoded Chinese vectors most relevant to generate the English word."
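The encode-attend-decode loop described above can be sketched in a few lines. The snippet below is a minimal illustration of one attention step only: the vectors, function names and the simple dot-product scoring are assumptions made for the sketch, not Google's actual learned architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: turns raw scores into a probability distribution.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attend(decoder_state, encoder_vectors):
    # Score each encoded source vector against the current decoder state.
    # Plain dot-product scoring is used here for simplicity; GNMT's real
    # attention function is a learned network, not a dot product.
    scores = encoder_vectors @ decoder_state
    weights = softmax(scores)             # "weighted distribution" over source words
    context = weights @ encoder_vectors   # context used to emit the next target word
    return weights, context

# Toy stand-ins for three encoded Chinese words (4-dimensional vectors).
encoder_vectors = np.array([[1.0, 0.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0, 0.0],
                            [0.0, 0.0, 1.0, 0.0]])
# A decoder state that happens to "match" the second source word.
decoder_state = np.array([0.0, 1.0, 0.0, 0.0])

weights, context = attend(decoder_state, encoder_vectors)
print(weights.argmax())  # → 1: attention concentrates on the second source word
```

At each decoding step the decoder state changes, so the attention weights shift across the source sentence, which is the mechanism behind "pays attention to a weighted distribution over the encoded Chinese vectors."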
"GNMT reduces translation errors by more than 55 to 85 percent on several major language pairs measured on sampled sentences from Wikipedia and news websites with the help of bilingual human raters," the article said.
The new translation system is one of Google's newest entries to the AI field after its Go-playing computer program AlphaGo defeated South Korea's Lee Se-dol, an 18-time Go world champion, 4-1 in a five-game match in March.
However, Chen Boxing, a research fellow at the Institute for Information Technology under the National Research Council Canada, says that the GNMT system is far from a breakthrough in machine translation.
"What Google did is to just combine several of the latest techniques together into a good system based on the company's great engineering and computing capacities," Chen posted on Sina Weibo on September 3.
"Actually, there is nothing new here," Chen wrote.
Wang echoed Chen's opinion, saying that the GNMT system still fails to incorporate context into the translation process.
According to Wang, there are three development stages for AI: the intelligence to compute, for instance handling basic arithmetic; the intelligence to perceive, that is, getting AI to listen and understand; and the intelligence to cognize, which requires machines to acquire the ability to learn, remember and associate.
"The biggest difficulty we face when moving into the third stage is to provide the machine with cognitive intelligence, in other words the ability to understand, utilize, reason and forecast," Wang said.
This limitation is still the Achilles' heel of AI.
"Although image and speech recognition technologies at the second stage have reached accuracy levels as high as 99 percent, a machine still cannot recognize people viewed from behind, since it lacks the ability to associate," Wang explained.
Chen assuaged the fears of those who worry they may soon lose their translation jobs to GNMT.
"Translators may feel worried or scared by AI translation, just as cotton spinners did in the 18th century when the steam engine was introduced, but the number of cotton spinners actually doubled after that," said Chen.