6 Articles

 
Online
Wenbo Zhang, Xiao Li, Yating Yang and Rui Dong    
The pre-training and fine-tuning paradigm has been shown to be effective for low-resource neural machine translation. In this paradigm, models pre-trained on monolingual data are used to initialize translation models, transferring knowledge from monolingual data...
Journal: Information    Format: Electronic

 
Online
Yajuan Wang, Xiao Li, Yating Yang, Azmat Anwar and Rui Dong    
Both the statistical machine translation (SMT) model and the neural machine translation (NMT) model are representative models for Uyghur-Chinese machine translation tasks, each with its own merits. Thus, it is a promising direction to combine the advantages...
Journal: Information    Format: Electronic

 
Online
Yirong Pan, Xiao Li, Yating Yang and Rui Dong    
Benefiting from the rapid development of artificial intelligence (AI) and deep learning, machine translation based on neural networks has achieved impressive performance on many high-resource language pairs. However, neural machine translation...
Journal: Future Internet    Format: Electronic

 
Online
Wenbo Zhang, Xiao Li, Yating Yang, Rui Dong and Gongxu Luo    
Recently, model pretraining has been successfully applied to unsupervised and semi-supervised neural machine translation. A cross-lingual language model uses a pretrained masked language model to initialize the encoder and decoder of the translation...
Journal: Future Internet    Format: Electronic
