Showing 1 to 15 of 99 results
di Gennaro, Kristen – Assessing Writing, 2013
A growing body of literature in second-language writing suggests that the writing ability of international second language (L2) learners, who attend post-secondary education abroad after having completed high school in their home countries, and the so-called Generation 1.5 population, that is, L2 learners who enter post-secondary education after…
Descriptors: Writing Instruction, Administrators, Grammar, Writing Ability
Esfandiari, Rajab; Myford, Carol M. – Assessing Writing, 2013
We compared three assessor types (self-assessors, peer-assessors, and teacher assessors) to determine whether they differed in the levels of severity they exercised when rating essays. We analyzed the ratings of 194 assessors who evaluated 188 essays written by students enrolled in two state-run universities in Iran. The assessors employed a…
Descriptors: Foreign Countries, Severity (of Disability), Essays, Gender Differences
Lam, Ricky – Assessing Writing, 2013
Research into portfolio assessment ("PA") typically describes teachers' development and implementation of different portfolio models in their respective teaching contexts; however, little attention has been paid to student perceptions of the portfolio approach or its impact on the learning of writing. To this end, this study aims to investigate how…
Descriptors: Portfolios (Background Materials), Student Attitudes, Portfolio Assessment, Foreign Countries
Fritz, Erik; Ruegg, Rachael – Assessing Writing, 2013
Although raters can be trained to evaluate the lexical qualities of student essays, the question remains as to what extent raters follow the "lexis" scale descriptors in the rating scale when evaluating, or instead rate according to their own criteria. The current study examines the extent to which 27 trained university EFL raters take various lexical…
Descriptors: Accuracy, Rating Scales, English (Second Language), Essays
Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal – Assessing Writing, 2013
This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…
Descriptors: Writing Evaluation, Scoring, Writing Instruction, Essays
Ramineni, Chaitanya; Williamson, David M. – Assessing Writing, 2013
In this paper, we provide an overview of psychometric procedures and guidelines Educational Testing Service (ETS) uses to evaluate automated essay scoring for operational use. We briefly describe the e-rater system, the procedures and criteria used to evaluate e-rater, implications for a range of potential uses of e-rater, and directions for…
Descriptors: Educational Testing, Guidelines, Scoring, Psychometrics
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater[R] and the "Criterion"[R] Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Weigle, Sara Cushing – Assessing Writing, 2013
This article presents considerations for using automated scoring systems to evaluate second language writing. A distinction is made between English language learners in English-medium educational systems and those studying English in their own countries for a variety of purposes, and between learning-to-write and writing-to-learn in a second…
Descriptors: Scoring, Second Language Learning, Second Languages, English Language Learners
Deane, Paul – Assessing Writing, 2013
This paper examines the construct measured by automated essay scoring (AES) systems. AES systems measure features of the text structure, linguistic structure, and conventional print form of essays; as such, the systems primarily measure text production skills. In the current state-of-the-art, AES provide little direct evidence about such matters…
Descriptors: Scoring, Essays, Text Structure, Writing (Composition)
Ramineni, Chaitanya – Assessing Writing, 2013
In this paper, I describe the design and evaluation of automated essay scoring (AES) models for an institution's writing placement program. Information was gathered on admitted student writing performance at a science and technology research university in the northeastern United States. Under timed conditions, first-year students (N = 879) were…
Descriptors: Validity, Comparative Analysis, Internet, Student Placement
Wardle, Elizabeth; Roozen, Kevin – Assessing Writing, 2012
This article offers one potential response to Yancey's (1999) call for a fourth wave of writing assessment able to capture writing development in all of its complexity. Based on an ecological perspective of literate development that situates students' growth as writers across multiple engagements with writing, including those outside of school,…
Descriptors: Writing Evaluation, Writing Tests, Ecology, Writing Instruction
Camp, Heather – Assessing Writing, 2012
This article reviews key developmental theories that have been adopted by writing development researchers over the last fifty years. It describes how researchers have translated these theories into definitions of writing development capable of influencing curricular design and interpretations of student writing and explores the implications for…
Descriptors: Writing (Composition), Writing Evaluation, Researchers, Theories
Slomp, David H. – Assessing Writing, 2012
This article discusses three sets of challenges involved in the assessment of writing from a developmental perspective. These challenges include defining a workable theory of development, developing a suitable construct, and overcoming limitations in technocentric approaches to writing assessment. In North America in recent years, a burgeoning…
Descriptors: Writing (Composition), Writing Evaluation, Writing Tests, Writing Ability
Plakans, Lia; Gebril, Atta – Assessing Writing, 2012
An increasing number of writing programs and assessments are employing writing-from-sources tasks in which reading and writing are integrated. The integration of reading and writing in such contexts raises a number of questions with regard to writers' use of sources in their writing, the functions these sources serve, and how proficiency affects…
Descriptors: Undergraduate Students, Protocol Analysis, Writing Skills, Reading
Johnson, David; VanBrackle, Lewis – Assessing Writing, 2012
Raters of Georgia's (USA) state-mandated college-level writing exam, which is intended to ensure a minimal university-level writing competency, are trained to grade holistically when assessing these exams. A guiding principle in holistic grading is to not focus exclusively on any one aspect of writing but rather to give equal weight to style,…
Descriptors: Writing Evaluation, Linguistics, Writing Tests, English (Second Language)
