Amanote Research


Latent Semantic Information in Maximum Entropy Language Models for Conversational Speech Recognition

DOI: 10.3115/1073445.1073453

Date

January 1, 2003

Authors
Yonggang Deng, Sanjeev Khudanpur
Publisher

Association for Computational Linguistics


Related search

Hierarchical Bayesian Language Models for Conversational Speech Recognition

IEEE Transactions on Audio, Speech, and Language Processing
2010, English

Latent Words Recurrent Neural Network Language Models for Automatic Speech Recognition

IEICE Transactions on Information and Systems
Electronic Engineering, Pattern Recognition, Hardware, Computer Vision, Electrical, Architecture, Artificial Intelligence, Software
2019, English

Exploration of the Impact of Maximum Entropy in Recurrent Neural Network Language Models for Code-Switching Speech

2014, English

Adapting N-Gram Maximum Entropy Language Models With Conditional Entropy Regularization

2011, English

The Latent Maximum Entropy Principle

English

Maximum Entropy Modeling in Sparse Semantic Tagging

2004, English

Generating and Evaluating Segmentations for Automatic Speech Recognition of Conversational Telephone Speech

English

Recognition of Conversational Telephone Speech Using the JANUS Speech Engine

English

Semantic Cache Model Driven Speech Recognition

2010, English
