Purest ever example-based machine translation: Detailed presentation and assessment

Yves Lepage*, Etienne Denoual

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

62 Citations (Scopus)


We have designed, implemented and assessed an EBMT system that can be dubbed the "purest ever built": it makes strictly no use of variables, templates or patterns, has no explicit transfer component, and requires no preprocessing or training of the aligned examples. It relies on a single operation, proportional analogy, which implicitly neutralizes divergences between languages and captures lexical and syntactic variations along the paradigmatic and syntagmatic axes without explicitly decomposing sentences into fragments. The very same implementation of this core engine was evaluated on different tasks and language pairs. First, we compared our system on two tasks of a previous MT evaluation campaign so as to rank it among current state-of-the-art systems. Then, we illustrated the "universality" of our system by participating in a recent MT evaluation campaign, with exactly the same core engine, across a wide variety of language pairs. Finally, we studied the influence of additional data, such as dictionaries and paraphrases, on system performance.
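The core operation solves analogical equations of the form A : B :: C : D over character strings. As an illustration only, and not the authors' actual algorithm (which handles far more general commutations between strings), a minimal sketch restricted to a single contiguous edit between A and B might look like:

```python
def solve_analogy(a, b, c):
    """Solve the analogical equation a : b :: c : d for d.

    Toy restriction (an assumption of this sketch): a and b may differ
    by one contiguous edit only (prefix, suffix, or internal change).
    Returns None when no solution is found under this restriction.
    """
    # Longest common prefix of a and b.
    p = 0
    while p < min(len(a), len(b)) and a[p] == b[p]:
        p += 1
    # Longest common suffix of the remaining material.
    s = 0
    while s < min(len(a), len(b)) - p and a[-1 - s] == b[-1 - s]:
        s += 1
    # Now a = prefix + core_a + suffix and b = prefix + core_b + suffix.
    core_a = a[p:len(a) - s]
    core_b = b[p:len(b) - s]
    if core_a == "":
        # Pure insertion: attach core_b on the side where it appears in b.
        return c + core_b if p == len(a) else core_b + c
    if core_a in c:
        # Replace the changed material once, at its first occurrence in c.
        return c.replace(core_a, core_b, 1)
    return None

print(solve_analogy("walk", "walked", "talk"))      # talked
print(solve_analogy("aimer", "aimons", "chanter"))  # chantons
print(solve_analogy("unhappy", "happy", "unfair"))  # fair
```

The examples hint at why no explicit transfer component is needed: the same string operation captures an English past tense, a French conjugation, and a prefixal negation without any language-specific rules.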

Original language: English
Pages (from-to): 251-282
Number of pages: 32
Journal: Machine Translation
Issue number: 3-4
Publication status: Published - Dec 2005
Externally published: Yes


Keywords

  • Divergences across languages
  • Example-based machine translation
  • Proportional analogies

ASJC Scopus subject areas

  • Software
  • Language and Linguistics
  • Linguistics and Language
  • Artificial Intelligence


