Thumbnail
Access Restriction
Open

Author Zhang, Hao ♦ Gildea, Daniel
Source CiteSeerX
Content type Text
File Format PDF
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Synchronous Context-free Grammar ♦ Trigram Pass ♦ Bigram Decoder ♦ Machine Translation ♦ Trigram Language Model ♦ Performance Gap ♦ Additional Fast Decoding Pass ♦ Bigram Language Model ♦ Second Pass ♦ N-gram Language Model ♦ Trigram Decoder ♦ Translation Model ♦ Multi-pass Approach ♦ Expected Count ♦ Correct Translation ♦ BLEU Score ♦ Bigram Pass ♦ First Pass ♦ Resulting Parse Forest ♦ Efficient Multi-pass Decoding
Description We take a multi-pass approach to machine translation decoding when using synchronous context-free grammars as the translation model and n-gram language models: the first pass uses a bigram language model, and the resulting parse forest is used in the second pass to guide search with a trigram language model. The trigram pass closes most of the performance gap between a bigram decoder and a much slower trigram decoder, but takes time that is insignificant in comparison to the bigram pass. An additional fast decoding pass maximizing the expected count of correct translation hypotheses increases the BLEU score significantly.
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study
Learning Resource Type Article
Publisher Date 2008-01-01
Publisher Institution Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-08: HLT)