ERIC Number: ED618427
Record Type: Non-Journal
Publication Date: 2021-Jun-15
Pages: 10
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Contextual Definition Generation
Yarbro, Jeffrey T.; Olney, Andrew M.
Grantee Submission, Paper presented at the International Workshop on Intelligent Textbooks (3rd, 2021)
This paper explores the concept of dynamically generating definitions using a deep-learning model. We do this by creating a dataset that contains definition entries and the contexts associated with each definition. We then fine-tune a GPT-2 based model on the dataset so that it can generate contextual definitions. We evaluate our model with human raters by generating definitions using two context types: short-form (the word used in a sentence) and long-form (the word used in a sentence along with the prior and following sentences). Results indicate that the model performed significantly better when generating definitions from short-form contexts. Additionally, we evaluate our model against human-generated definitions. The results are promising: the model matched human-level fluency, and while it reached human-level accuracy in some instances, it fell short in others. [This paper was published in: "Proceedings of the Third International Workshop on Intelligent Textbooks," Vol. 2895, CEUR-WS.org, 2021, pp. 74-83.]
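The abstract's distinction between short-form and long-form contexts can be illustrated with a minimal sketch of how training examples for a GPT-2 style model might be assembled. The prompt template, field names, and `format_example` helper below are illustrative assumptions, not the authors' actual data format.

```python
def format_example(term, definition, sentence,
                   prev_sentence=None, next_sentence=None, long_form=False):
    """Build one training/prompt string pairing a context with a definition.

    Short-form context: only the sentence containing the term.
    Long-form context: the prior and following sentences as well.
    (Template is a hypothetical sketch, not the paper's format.)
    """
    if long_form:
        # Join whichever neighboring sentences are available.
        parts = [p for p in (prev_sentence, sentence, next_sentence) if p]
        context = " ".join(parts)
    else:
        context = sentence
    return f"Context: {context}\nTerm: {term}\nDefinition: {definition}"


# Short-form example: context is just the carrier sentence.
short = format_example(
    term="gradient",
    definition="a vector of partial derivatives of a function",
    sentence="The gradient points in the direction of steepest ascent.",
)

# Long-form example: prior and following sentences are prepended/appended.
long = format_example(
    term="gradient",
    definition="a vector of partial derivatives of a function",
    sentence="The gradient points in the direction of steepest ascent.",
    prev_sentence="Optimization relies on local information.",
    next_sentence="Following it iteratively yields gradient ascent.",
    long_form=True,
)
```

Strings like these could then be tokenized and used to fine-tune a causal language model, with generation conditioned on everything up to `Definition:`.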
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: National Science Foundation (NSF); Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: 1918751; 1934745; R305A190448