Amanote Research


Dependency-Based Self-Attention for Transformer NMT

DOI: 10.26615/978-954-452-056-4_028
Abstract

Available in full text

Date

October 22, 2019

Authors
Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya
Publisher

Incoma Ltd., Shoumen, Bulgaria


Related research

Dependency-Based Relative Positional Encoding for Transformer NMT

2019, English

Retrosynthesis With Attention-Based NMT Model and Chemical Analysis of “Wrong” Predictions

RSC Advances
Chemistry, Chemical Engineering
2020, English

Joint Self-Attention Based Neural Networks for Semantic Relation Extraction

Journal of Information Hiding and Privacy Protection
2019, English

Attention-Guided Spatial Transformer Networks for Fine-Grained Visual Recognition

IEICE Transactions on Information and Systems
Electronic Engineering, Pattern Recognition, Hardware, Computer Vision, Electrical, Architecture, Artificial Intelligence, Software
2019, English

Syntax-Enhanced Self-Attention-Based Semantic Role Labeling

2019, English

Attention and “Visual Field Dependency” in the Pigeon

Journal of the Experimental Analysis of Behavior
Behavioral Neuroscience, Experimental, Cognitive Psychology
1973, English

NMT Looks Under the Hood (“NMT kikker under panseret”)

Norsk medietidsskrift
2020, English

Attention Dependency in Implicit Learning of Repeated Search Context

Quarterly Journal of Experimental Psychology
Neuropsychology, Cognitive Psychology, Physiological Psychology, Medicine, Physiology, Experimental, Psychology
2007, English

Benefits of Data Augmentation for NMT-based Text Normalization of User-Generated Content

2019, English
