Amanote Research


Extending MT Evaluation Tools With Translation Complexity Metrics

DOI: 10.3115/1220355.1220371

Date

January 1, 2004

Authors

Bogdan Babych, Debbie Elliott, Anthony Hartley
Publisher

Association for Computational Linguistics


Related search

Extending the BLEU MT Evaluation Method With Frequency Weightings

2004, English

Representation Based Translation Evaluation Metrics

2015, English

Automated MT Evaluation Metrics and Their Limitations

Revista Tradumatica
Linguistics, Literature, Literary Theory, Language
2014, English

An Experimental Evaluation of Tools for Estimating Bandwidth-Related Metrics

International Journal of Computer Network and Information Security
2018, English

An Investigation of Machine Translation Evaluation Metrics in Cross-Lingual Question Answering

Journal of Natural Language Processing
2016, English

Extending Water Resources Performance Metrics to River Ecosystems

Ecological Indicators
Decision Sciences, Ecology, Systematics, Evolution, Behavior
2020, English

Extending Continuous Cuts: Anisotropic Metrics and Expansion Moves

2009, English

Inter-Rater Agreement Measures and the Refinement of Metrics in the PLATO MT Evaluation Paradigm

2005, English

Using Complexity Metrics With R-R Intervals and BPM Heart Rate Measures

Frontiers in Physiology
Physiology
2013, English
