1. Machine translation sounds very sci-fi. What is it?
Machine translation does exactly what it says. A software program takes written text (or speech) in one language and outputs it in another language. Take Google Translate, for example. You type a word, sentence or paragraph in one language into a text box, choose the language you want the text translated into, and it appears in that language alongside your original text. It works by drawing on huge databases of text to find a match for your word, sentence or paragraph. It attempts to find the best possible match not just at the individual word level, but at the sentence or discourse level, in your chosen language(s).
2. Are there other programs besides Google Translate?
Several other automatic text translators like Google Translate have been around for over a decade, but more recent developments such as Skype Translator allow for automatic speech translation. Using voice recognition technology, conversations can be translated from one language to another in real time. And that does sound quite sci-fi. This video shows Skype Translator working with school children in the USA and Mexico, who are conversing in their native languages and using English–Spanish translation. Skype Translator currently supports speech translation from and to English, Spanish, Italian, German, French and Mandarin Chinese.
3. It all sounds excellent. What could possibly go wrong?
Lots. Machine translation is only as reliable as the databases it gets its information from. Despite increasingly sophisticated programs and ever larger databases of language to draw on, machine translation doesn’t always get it right. Automatic translation is good for common and formulaic phrases, but it is less able to translate nuances in meaning accurately. It doesn’t always get the grammar right or, indeed, choose the correct vocabulary in the target language. For example, a few years ago I was asked to review a text that a Spanish friend had translated from Spanish to English using an automatic translator. One phrase in the English translation – ‘a room in a century’ – stumped me completely, until I figured out that it should have said ‘a quarter of a century’. The word ‘cuarto’ in Spanish can mean a room or a quarter, and the automatic translator had chosen the wrong word for the context. Things have steadily improved since then, but machine translation is still far from perfect.
4. What has machine translation got to do with language teachers?
Also lots. Automatic speech translation seems to speak directly to the fear that many teachers have of being replaced by machines. If a German businessperson can hold a meeting via Skype with a Chinese businessperson, with each speaking their own language, and everything they say being translated in real time, why bother with language teachers? The good news for us is that, with machine translation, things can still get lost in translation; when it comes to conveying nuances in meaning, there is currently no reliable replacement for a human translator, or for learning to speak a foreign language well. Language is, after all, about communication, and there is no substitute for the human element. But automatic speech translators are certainly a very close second best, and they will continue to improve in accuracy.
In the meantime, students themselves have been using automatic text translation for years. Which of us hasn’t received written work from a student that has obviously been run through Google Translate (or similar), before being handed in as the student’s own work?
5. Could I use machine translation with my students?
Most certainly. Ever since text translators first appeared, teachers have been using them as ‘bad models’1. This approach involves giving the students machine translations of texts – for example, from their first language into English – and getting them to identify the errors and correct the English translation. This approach works well for higher levels. For lower levels, it’s important to sensitise the students first to how unreliable automatic translation can be – and how obvious it is when they use it for their own written work in English! Take a short text in English and run it through an automatic text translator (like Google Translate) into the students’ first language. Give the students the translation, and ask them to identify the errors. By working with a bad model in their first language, they will immediately see any errors and inaccuracies, and get a feel for how ‘strange’ machine translated text can sound. Point out that the same thing happens when they translate from their first language into English and then hand that work in to you!
Two-way translation can work well with lower levels. Take a short text in English (for example from your coursebook), and run it through Google Translate into the students’ first language. Take that translation and run it through Google Translate again, this time back into English. Give the class all three versions of the text, and examine the differences. It should soon be obvious to the students that text translators are not completely reliable.
Reference
1. Niño, A ‘Machine translation in foreign language learning: language learners’ and tutors’ perceptions of its advantages and disadvantages’ ReCALL 21 (2) 2009
Nicky Hockly is Director of Pedagogy of The Consultants-E, an online teacher training and development consultancy. Her most recent books are Digital Literacies (Routledge), an e-book: Webinars: A Cookbook for Educators (www.the-round.com), and Going Mobile (Delta Publishing), a book on mobile learning. She also maintains a blog on eModeration. [email protected]