Example-based machine translation (EBMT) is a method of machine translation often characterized by its use of a bilingual corpus with parallel texts as its main knowledge base at run-time. It is essentially translation by analogy and can be viewed as an implementation of a case-based reasoning approach to machine learning.
Translation by analogy
At the foundation of example-based machine translation is the idea of translation by analogy. Applied to human translation, this idea rejects the notion that people translate sentences by performing deep linguistic analysis. Instead, it holds that people translate by first decomposing a sentence into phrases, then translating those phrases, and finally composing the translated fragments into a complete sentence. The phrasal translations themselves are made by analogy to previous translations. In example-based machine translation, this principle is encoded through the example translations used to train the system.
Other approaches to machine translation, including statistical machine translation, also use bilingual corpora to learn the process of translation.
History
Example-based machine translation was first suggested by Makoto Nagao in 1984.[1] He pointed out that it is especially well suited to translation between two very different languages, such as English and Japanese, where a single sentence in one language may have to be rendered as several well-structured sentences in the other; in such cases, the deep linguistic analysis characteristic of rule-based machine translation is of little use.
Example
English | Japanese
---|---
How much is that red umbrella? | Ano akai kasa wa ikura desu ka.
How much is that small camera? | Ano chiisai kamera wa ikura desu ka.
Example-based machine translation systems are trained on bilingual parallel corpora containing sentence pairs like the ones shown in the table above. Each sentence pair consists of a sentence in one language and its translation into another. The pair above is a minimal pair, meaning the two sentences differ in just one element. Such pairs make it simple to learn translations of portions of a sentence. For example, an example-based machine translation system would learn three units of translation from the example above (a code sketch of this extraction step follows the list):
- How much is that X ? corresponds to Ano X wa ikura desu ka.
- red umbrella corresponds to akai kasa
- small camera corresponds to chiisai kamera
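The following Python sketch illustrates how such units might be extracted from a minimal pair. The tokenisation, the diff_span and learn_units helpers, and the single placeholder X are illustrative assumptions for this sketch, not part of any particular EBMT system.

```python
# A minimal sketch of learning translation units from a minimal pair,
# assuming sentences are given as pre-tokenised lists of words.

def diff_span(a, b):
    """Return the lengths of the common prefix and common suffix of two token lists."""
    pre = 0
    while pre < min(len(a), len(b)) and a[pre] == b[pre]:
        pre += 1
    suf = 0
    while (suf < min(len(a), len(b)) - pre
           and a[len(a) - 1 - suf] == b[len(b) - 1 - suf]):
        suf += 1
    return pre, suf

def learn_units(src_pair, tgt_pair):
    """Extract a sentence template (with placeholder X) and the two variable fragments."""
    (s1, s2), (t1, t2) = src_pair, tgt_pair
    sp, ss = diff_span(s1, s2)
    tp, ts = diff_span(t1, t2)
    template = (s1[:sp] + ["X"] + s1[len(s1) - ss:],
                t1[:tp] + ["X"] + t1[len(t1) - ts:])
    fragments = [
        (s1[sp:len(s1) - ss], t1[tp:len(t1) - ts]),
        (s2[sp:len(s2) - ss], t2[tp:len(t2) - ts]),
    ]
    return template, fragments

src = (["How", "much", "is", "that", "red", "umbrella", "?"],
       ["How", "much", "is", "that", "small", "camera", "?"])
tgt = (["Ano", "akai", "kasa", "wa", "ikura", "desu", "ka", "."],
       ["Ano", "chiisai", "kamera", "wa", "ikura", "desu", "ka", "."])

template, fragments = learn_units(src, tgt)
# template  -> (['How','much','is','that','X','?'], ['Ano','X','wa','ikura','desu','ka','.'])
# fragments -> [(['red','umbrella'], ['akai','kasa']),
#               (['small','camera'], ['chiisai','kamera'])]
```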
These units can later be composed to produce novel translations. For example, if the system has been trained on text containing the sentences President Kennedy was shot dead during the parade. and The convict escaped on July 15th., it could translate the sentence The convict was shot dead during the parade. by substituting the appropriate parts of those sentences.
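Continuing the sketch above, this recombination step can be illustrated as slotting a fragment pair back into a template pair. The compose helper and the hard-coded template and fragment values are again hypothetical; with a larger example base, templates and fragments drawn from different sentence pairs can be recombined into translations never seen during training.

```python
# A minimal sketch of the recombination step, with the template and
# fragment hard-coded so the snippet runs on its own.

def compose(template, fragment):
    """Slot a source/target fragment pair into a source/target template pair."""
    (src_tpl, tgt_tpl), (src_frag, tgt_frag) = template, fragment
    fill = lambda tpl, frag: [tok for t in tpl for tok in (frag if t == "X" else [t])]
    return fill(src_tpl, src_frag), fill(tgt_tpl, tgt_frag)

template = (["How", "much", "is", "that", "X", "?"],
            ["Ano", "X", "wa", "ikura", "desu", "ka", "."])
fragment = (["red", "umbrella"], ["akai", "kasa"])

src_out, tgt_out = compose(template, fragment)
print(" ".join(src_out))   # How much is that red umbrella ?
print(" ".join(tgt_out))   # Ano akai kasa wa ikura desu ka .
```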
Phrasal verbs
Example-based machine translation is well suited to sub-language phenomena such as phrasal verbs, whose meanings are highly context-dependent. Phrasal verbs are common in English; they consist of a verb followed by an adverb and/or a preposition, which is called the particle of the verb. A phrasal verb produces a specialized, context-specific meaning that may not be derivable from the meanings of its constituents, so word-for-word translation from the source to the target language is almost always ambiguous.
As an example, consider the phrasal verb "put on" and its Hindustani translations. It may be used in any of the following ways (a small lookup sketch follows the list):
- Ram put on the lights. (Switched on) (Hindustani translation: Jalana)
- Ram put on a cap. (Wear) (Hindustani translation: Pahenna)
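As a rough illustration rather than a description of any actual system, resolving such a phrasal verb in an example-based setting amounts to matching the new input against stored examples and reusing the translation of the closest match. The example store below, keyed by the object noun, is a deliberately simplified assumption.

```python
# A minimal sketch of resolving "put on" against a tiny example base.
# In a real EBMT system the match would be made over full example
# sentences, not just the object noun.

EXAMPLES = {
    ("put on", "lights"): "jalana",   # switch on
    ("put on", "cap"):    "pahenna",  # wear
}

def translate_phrasal(verb, obj):
    """Pick the Hindustani verb whose stored example object matches the input."""
    return EXAMPLES.get((verb, obj), "<no matching example>")

print(translate_phrasal("put on", "lights"))  # jalana
print(translate_phrasal("put on", "cap"))     # pahenna
```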
References
- Makoto Nagao (1984). "A framework of a mechanical translation between Japanese and English by analogy principle". In A. Elithorn and R. Banerji (eds.). Artificial and Human Intelligence. Elsevier Science Publishers.
Further reading
- Carl, Michael; Way, Andy (2003). Recent Advances in Example-Based Machine Translation. Netherlands: Springer. doi:10.1007/978-94-010-0181-6. ISBN 978-1-4020-1400-0.