Yang Zhao, Jiajun Zhang, Chengqing Zong. Transformer: A General Framework from Machine Translation to Others[J]. Machine Intelligence Research, 2023, 20(4): 514-538. DOI: 10.1007/s11633-022-1393-5

Transformer: A General Framework from Machine Translation to Others

  • Machine translation is an important and challenging task that aims to automatically translate natural language sentences from one language into another. Recently, Transformer-based neural machine translation (NMT) has achieved great breakthroughs and has become the new mainstream method in both methodology and applications. In this article, we present an overview of Transformer-based NMT and its extensions to other tasks. Specifically, we first introduce the Transformer framework, discuss the main challenges in NMT, and list representative methods for each challenge. Then, public resources and toolkits for NMT are listed. Meanwhile, extensions of Transformer to other tasks, including other natural language processing tasks, computer vision tasks, audio tasks, and multi-modal tasks, are briefly presented. Finally, possible future research directions are suggested.
