ERIC Number: ED616115
Record Type: Non-Journal
Publication Date: 2020
Pages: 8
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Sequence-to-Sequence Models for Automated Text Simplification
Botarleanu, Robert-Mihai; Dascalu, Mihai; Crossley, Scott Andrew; McNamara, Danielle S.
Grantee Submission, Paper presented at the International Conference on Artificial Intelligence in Education (AIED) (21st, 2020)
A key writing skill is the capability to clearly convey desired meaning using available linguistic knowledge. Consequently, writers must select from a large array of idioms, semantically equivalent vocabulary terms, and discourse features that simultaneously reflect content and allow readers to grasp meaning. In many cases, a simplified version of a text is needed to ensure comprehension on the part of a targeted audience (e.g., second language learners). To address this need, we propose an automated method to simplify texts based on paraphrasing. Specifically, we explore the potential for a deep learning model, previously used for machine translation, to learn a simplified version of the English language within the context of short phrases. The best model, based on a Universal Transformer architecture, achieved a BLEU score of 66.01. We also evaluated this model's capability to perform similar transformations on texts that were simplified by human experts at different levels. [This work was also supported by a grant from the Romanian National Authority for Scientific Research and Innovation. This paper was published in: I. I. Bittencourt et al. (Eds.), "Proceedings of the 21st International Conference on Artificial Intelligence in Education" (AIED 2020). Lecture Notes in Artificial Intelligence (LNAI, Vol. 12164, pp. 31-36). Springer, Cham.]
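[For readers unfamiliar with the BLEU metric reported above: BLEU compares machine output against reference text using clipped n-gram precision and a brevity penalty. The sketch below is an illustrative, simplified single-reference implementation with add-one smoothing, not the evaluation code used in the paper (which would typically use a standard toolkit).]

```python
import math
from collections import Counter


def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU (0-100), single reference.

    Uses uniform weights over 1..max_n gram precisions, add-one
    smoothing, and the standard brevity penalty. Illustrative only.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clipped counts: each candidate n-gram is credited at most
        # as often as it appears in the reference.
        overlap = sum(min(count, ref_counts[gram])
                      for gram, count in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        # Add-one smoothing keeps a single empty n-gram order
        # from zeroing out the geometric mean.
        precisions.append((overlap + 1) / (total + 1))
    # Brevity penalty discourages overly short candidates.
    c, r = len(candidate), len(reference)
    bp = 1.0 if c > r else math.exp(1 - r / max(c, 1))
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)


reference = "the cat sat on the mat".split()
print(bleu(reference, reference))                    # identical text scores 100.0
print(bleu("the cat sat on a rug".split(), reference))  # partial match scores lower
```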
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED); Office of Naval Research (ONR) (DOD)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305A190063; N000141712300; N000141912424