Amanote Research

NICT’s Participation to WAT 2019: Multilingualism and Multi-Step Fine-Tuning for Low Resource NMT

DOI: 10.18653/v1/d19-5207

Date

January 1, 2019

Authors
Raj Dabre, Eiichiro Sumita
Publisher

Association for Computational Linguistics


Related research

Long Warm-Up and Self-Training: Training Strategies of NICT-2 NMT System at WAT-2019

2019 · English

UCSMNLP: Statistical Machine Translation for WAT 2019

2019 · English

Fine-Tuning Silencing

Cell Stem Cell
Molecular Medicine · Genetics · Cell Biology
2010 · English

Multilingualism, Multi-Competence and (Limits To) the Interaction Between Language Systems

TEANGA, the Journal of the Irish Association for Applied Linguistics
2018 · English

Spanish “Fine Tuning” of Language to Describe Depression and Anxiety

Journal of Palliative Medicine
Medicine · Anesthesiology · Pain Medicine · Nursing
2009 · English

Materials: Fine-Tuning Optical Fibres

Nature
Multidisciplinary
2011 · English

Light Stops and Fine-Tuning in MSSM

European Physical Journal C
Engineering · Astronomy · Physics
2018 · English

Multi-lingual Wikipedia Summarization and Title Generation on Low Resource Corpus

2019 · English

Regularization Techniques for Fine-Tuning in Neural Machine Translation

2017 · English
