Amanote Research


Question Answering Using Hierarchical Attention on Top of BERT Features

DOI: 10.18653/v1/d19-5825

Date

January 1, 2019

Authors

Reham Osama, Nagwa El-Makky, Marwan Torki
Publisher

Association for Computational Linguistics


Related searches

Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering

2018, English

Multi-Passage BERT: A Globally Normalized BERT Model for Open-Domain Question Answering

2019, English

Calling Attention to Passages for Biomedical Question Answering

Lecture Notes in Computer Science
Computer Science, Theoretical Computer Science
2020, English

Question Answering Using Word Associations

English

Using Coreference for Question Answering

1999, English

On the Voice-Activated Question Answering

IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews)
2012, English

Question Answering Passage Retrieval Using Dependency Relations

2005, English

Question Answering for Suicide Risk Assessment Using Reddit

2019, English

QU-BIGIR at SemEval 2017 Task 3: Using Similarity Features for Arabic Community Question Answering Forums

2017, English
