Amanote Research


Transformer and Seq2seq Model for Paraphrase Generation

DOI: 10.18653/v1/d19-5627
Abstract

Available in full text

Date

January 1, 2019

Authors
Elozino Egonmwan, Yllias Chali
Publisher

Association for Computational Linguistics


Related search

Intrinsic and Extrinsic Automatic Evaluation Strategies for Paraphrase Generation Systems

Journal of Computer and Communications
2020, English

Training Tips for the Transformer Model

The Prague Bulletin of Mathematical Linguistics
2018, English

Introduction of a New Paraphrase Generation Tool Based on Monte-Carlo Sampling

2009, English

High Frequency Short-Circuit Inductance for Model Transformer

Journal of International Council on Electrical Engineering
2017, English

Convolutional Neural Network for Paraphrase Identification

2015, English

Problems of Paraphrase: Bottom's Dream

Baltic International Yearbook of Cognition, Logic and Communication
2007, English

Molecular Transformer – A Model for Uncertainty-Calibrated Chemical Reaction Prediction

2019, English

Coupling Model for an Extended-Range Plasmonic Optical Transformer Scanning Probe

Light: Science and Applications
Optics, Molecular Physics, Optical, Atomic, Magnetic Materials, Electronic
2014, English

An Extensive Tensor Algebraic Model of Transformer

ECTI Transactions on Computer and Information Technology
Electronic Engineering, Information Systems, Computer Networks, Communications, Management, Electrical
2020, English
