Machine Translation
By Pushpak Bhattacharyya
Three paradigms have dominated machine translation (MT): rule-based machine translation (RBMT), statistical machine translation (SMT), and example-based machine translation (EBMT). These paradigms differ in the way they handle the three fundamental processes in MT: analysis, transfer, and generation (ATG). In its pure form, RBMT uses rules, while SMT uses data. EBMT tries a combination: data supplies translation parts that rules recombine to produce the translation.
Machine Translation compares and contrasts the salient principles and practices of RBMT, SMT, and EBMT. Offering an exposition of language phenomena followed by modeling and experimentation, the text:
- Introduces MT against the backdrop of language divergence and the Vauquois triangle
- Presents expectation maximization (EM)-based word alignment as a turning point in the history of MT
- Discusses the most important element of SMT: bilingual word alignment from pairs of parallel translations
- Explores the IBM models of MT, explaining how to find the best alignment given a translation pair and how to find the best translation given a new input sentence (a brief illustrative sketch follows this list)
- Covers the mathematics of phrase-based SMT, phrase-based decoding, and the Moses SMT environment
- Provides complete walk-throughs of the working of interlingua-based and transfer-based RBMT
- Analyzes EBMT, showing how translation parts can be extracted and recombined to translate a new input, all automatically
- Includes numerous examples that illustrate universal translation phenomena through the use of specific languages
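The sketch below gives a flavor of the EM-based bilingual word alignment that the SMT chapters build on. It is a minimal Python illustration in the spirit of IBM Model 1; the toy corpus and all variable names are assumptions made for illustration, not material taken from the book.

```python
# Minimal sketch of EM-based word alignment in the spirit of IBM Model 1.
# The toy parallel corpus below is an assumption for illustration only.
from collections import defaultdict
from itertools import product

# Tiny toy parallel corpus: (foreign sentence, English sentence) pairs.
corpus = [
    (["das", "haus"], ["the", "house"]),
    (["das", "buch"], ["the", "book"]),
    (["ein", "buch"], ["a", "book"]),
]

f_vocab = {f for fs, _ in corpus for f in fs}
e_vocab = {e for _, es in corpus for e in es}

# Initialize the translation probabilities t(f|e) uniformly.
t = {(f, e): 1.0 / len(f_vocab) for f, e in product(f_vocab, e_vocab)}

for _ in range(10):  # a few EM iterations
    count = defaultdict(float)   # expected count of (f, e) co-translations
    total = defaultdict(float)   # expected count of e
    # E-step: distribute each foreign word over the English words of its pair.
    for fs, es in corpus:
        for f in fs:
            z = sum(t[(f, e)] for e in es)  # normalization within the pair
            for e in es:
                c = t[(f, e)] / z
                count[(f, e)] += c
                total[e] += c
    # M-step: re-estimate t(f|e) from the expected counts.
    for (f, e), c in count.items():
        t[(f, e)] = c / total[e]

# On this toy corpus the probability mass concentrates quickly,
# e.g. t("haus"|"house") approaches 1.0.
print(sorted(t.items(), key=lambda kv: -kv[1])[:5])
```

Phrase-based SMT, treated in the later chapters, builds on such word alignments by extracting phrase pairs consistent with them.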
Machine Translation is designed for advanced undergraduate-level and graduate-level courses in machine translation and natural language processing. The book also makes a handy professional reference for computer engineers.