Facebook neural network translates with an AI brain
While the ability to drop a chunk of text in one language into a service like Google Translate, and have it emerge in another within milliseconds, isn’t new, Facebook’s challenge is arguably more arduous. Unlike carefully written and edited news reports, technical papers, and other documents online, Facebook statuses aren’t necessarily written in polished, grammatical examples of the user’s native tongue. That has the potential to trip up a traditional system.
Until now, Facebook has been using phrase-based translation. As the name suggests, it relies on breaking down paragraphs and sentences into words or small groups of words, and then looking for the equivalent in a different language. The segments are then recombined.
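That break-and-recombine approach can be sketched in a few lines. This is a toy illustration, not Facebook's actual system: the phrase table, language pair, and greedy longest-match strategy are all assumptions made for the example.

```python
# Toy phrase-based translation: greedily match the longest known phrase,
# look up its translation, and recombine the pieces in source order.

PHRASE_TABLE = {  # hypothetical English -> French entries
    ("good", "morning"): "bonjour",
    ("my", "friend"): "mon ami",
    ("see", "you"): "à bientôt",
    ("good",): "bon",
}

def translate(sentence: str) -> str:
    words = sentence.lower().split()
    out, i = [], 0
    while i < len(words):
        # Try the longest phrase starting at position i first.
        for n in range(len(words) - i, 0, -1):
            phrase = tuple(words[i:i + n])
            if phrase in PHRASE_TABLE:
                out.append(PHRASE_TABLE[phrase])
                i += n
                break
        else:
            out.append(words[i])  # unknown word: pass through untranslated
            i += 1
    return " ".join(out)

print(translate("Good morning my friend"))  # -> "bonjour mon ami"
```

Note how the output segments simply come back in source order: nothing in this scheme can reorder the sentence or look beyond the current phrase, which is exactly the limitation described below.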
All well and good, but that route can miss important details. For a start, languages don’t necessarily share the same patterns of word placement and phrasing: the translated version might make rough sense, but be awkwardly organized. Similarly, the approach doesn’t take the overall context of what’s been written into account, because it deals only with small portions of the text in isolation.
“We need to account for context, slang, typos, abbreviations, and intent simultaneously,” Facebook’s AI team writes today. “To continue improving the quality of our translations, we recently switched from using phrase-based machine translation models to neural networks to power all of our backend translation systems, which account for more than 2,000 translation directions and 4.5 billion translations each day.”
The other benefits of a neural machine translation system can be softer and less obvious, Facebook says, but they improve the user experience nonetheless. One example is figuring out the best possible substitute word when a target language doesn’t have a precise match, without leaving the end result feeling stilted or nonsensical. To improve efficiency, Facebook trained the system on common sentences, giving it shortcuts to the most likely interpretation of a source phrase.
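One common way to pick such a substitute, sketched minimally here with toy three-dimensional vectors (the words, vectors, and vocabulary are all invented for illustration and are not Facebook's model), is to choose the target-vocabulary word whose embedding lies closest to the source word's:

```python
# Pick the closest substitute when the target vocabulary has no exact
# match, by cosine similarity between toy word embeddings.
import math

EMBEDDINGS = {  # hypothetical 3-d word vectors
    "snug": (0.9, 0.1, 0.0),
    "cozy": (0.85, 0.15, 0.05),
    "cold": (-0.8, 0.2, 0.1),
    "fast": (0.0, 0.9, 0.3),
}

TARGET_VOCAB = ["cozy", "cold", "fast"]  # "snug" has no direct entry

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def best_substitute(word: str) -> str:
    vec = EMBEDDINGS[word]
    return max(TARGET_VOCAB, key=lambda w: cosine(vec, EMBEDDINGS[w]))

print(best_substitute("snug"))  # -> "cozy"
```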
Looking ahead, the neural network could get even smarter at figuring out what’s being referred to. The AI team is exploring how to use accompanying media to better educate the machine learning system: what’s visible in a photo, for instance, could help Facebook figure out how best to translate its text description. At the same time, running multiple simultaneous translations into different languages, with the neural models coupled together, could let what are currently independent processes share inferences, improving accuracy overall, just as a group can often handle a task better than an individual.
Facebook isn’t short on ambitious projects. Earlier this year, at the company’s annual f8 developer conference, it revealed it was working on a mind-reading system by which users might eventually be able to type with their brains.