Showing all 9 results
Shaw, Emily J. – College Board, 2011
Presented at the 23rd Annual Historically Black Colleges & Universities (HBCU) Conference in Atlanta, GA, in September 2011. Admitted Class Evaluation Service (ACES) is the College Board's free online service that predicts how admitted students will perform at a college or university generally, and how successful students will be in specific…
Descriptors: College Admission, Student Placement, Test Validity, Graphs
Shaw, Emily J.; Mattern, Krista D. – College Board, 2012
The current study explored the validity and potential of using the SAT, in conjunction with HSGPA, to arrive at a predicted FYGPA in order to improve student retention at four-year postsecondary institutions. Specifically, this study examined whether college students who did not perform as expected (observed FYGPA minus predicted FYGPA) were more…
Descriptors: College Entrance Examinations, Test Validity, Grade Point Average, High School Students
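The discrepancy measure in the abstract above (observed FYGPA minus predicted FYGPA) can be sketched as follows. This is a hypothetical illustration on synthetic data, not the study's actual model or data; variable names and coefficients are assumptions for demonstration only.

```python
import numpy as np

# Hypothetical sketch: regress first-year GPA (FYGPA) on SAT and high-school
# GPA (HSGPA), then compute discrepancy = observed - predicted FYGPA.
rng = np.random.default_rng(0)
n = 200
sat = rng.uniform(800, 1600, n)            # synthetic SAT composite scores
hsgpa = rng.uniform(2.0, 4.0, n)           # synthetic high-school GPAs
fygpa = 0.001 * sat + 0.5 * hsgpa + rng.normal(0, 0.3, n)  # synthetic outcome

X = np.column_stack([np.ones(n), sat, hsgpa])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, fygpa, rcond=None)
predicted = X @ beta
discrepancy = fygpa - predicted            # observed minus predicted FYGPA

overperformers = discrepancy > 0           # did better than the model expected
print(f"share overperforming: {overperformers.mean():.2f}")
```

Students with large positive discrepancies "overperformed" relative to the prediction; the study relates such discrepancies to retention.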
Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2011
This study presents a case for applying a transformation (Box and Cox, 1964) of the criterion used in predictive validity studies. The goals of the transformation were to better meet the assumptions of the linear regression model and to reduce the residual variance of fitted (i.e., predicted) values. Using data for the 2008 cohort of first-time,…
Descriptors: Predictive Validity, Evaluation Criteria, Regression (Statistics), College Freshmen
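A Box-Cox (1964) transformation of the criterion, as referenced in the abstract above, can be sketched with SciPy. This is a hedged illustration on synthetic, skewed, strictly positive data; the actual criterion and lambda in the study are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hedged sketch: apply a Box-Cox transformation to a positive-valued
# criterion (e.g., a GPA-like outcome) before fitting a linear regression,
# so residuals better meet the model's normality assumptions.
rng = np.random.default_rng(1)
criterion = rng.beta(5, 2, size=500) * 4.0 + 0.1   # synthetic, left-skewed, positive

transformed, lam = stats.boxcox(criterion)   # lambda chosen by maximum likelihood
# lambda == 1 leaves the shape unchanged (up to a shift); other values
# reduce skew in the transformed criterion.
print(f"estimated lambda: {lam:.2f}")
print(f"skew before: {stats.skew(criterion):.2f}, after: {stats.skew(transformed):.2f}")
```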
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2010
Presented at the national conference for the American Educational Research Association (AERA) in 2010. This presentation describes an alternative way of presenting the unique information provided by the SAT over HSGPA, namely examining students with discrepant SAT-HSGPA performance.
Descriptors: College Entrance Examinations, Grade Point Average, High School Students, Scores
Mattern, Krista D.; Shaw, Emily J. – College Board, 2010
Presented at the national conference for the American Educational Research Association (AERA) in 2010. This presentation examines the relationship between academic self-beliefs and outcomes.
Descriptors: Academic Achievement, Academic Ability, Self Concept, Correlation
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2010
There is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict first-year college performance at different institutions. This paper demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. In a model that…
Descriptors: Scores, Validity, Prediction, College Freshmen
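The between-institution variability described above can be motivated with a per-institution regression sketch. This is a simplified, synthetic illustration; a true multilevel model, as used in the paper, would estimate the slope variance directly rather than fitting separate OLS regressions.

```python
import numpy as np

# Hypothetical illustration: fit a separate SAT -> FYGPA regression per
# institution and inspect how much the slopes vary across institutions.
rng = np.random.default_rng(2)
slopes = []
for inst in range(10):                              # 10 synthetic institutions
    true_slope = 0.001 + rng.normal(0, 0.0003)      # institution-specific slope
    sat = rng.uniform(800, 1600, 100)
    fygpa = 1.0 + true_slope * sat + rng.normal(0, 0.3, 100)
    X = np.column_stack([np.ones(100), sat])
    beta, *_ = np.linalg.lstsq(X, fygpa, rcond=None)
    slopes.append(beta[1])

print(f"slope SD across institutions: {np.std(slopes):.5f}")
```

Nonzero spread in the fitted slopes is the variability that multilevel modeling then attempts to explain with institutional characteristics.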
Shaw, Emily; Mattern, Krista; Patterson, Brian – College Board, 2009
Presented at the national conference of the American Educational Research Association (AERA). This study examined whether there are distinct differences in the demographic characteristics, HSGPA, first-year college performance, and second-year college retention rates among students who have discrepant CR-W performance.
Descriptors: Critical Reading, Reading Achievement, Writing Achievement, Scores
Hendrickson, Amy; Patterson, Brian; Melican, Gerald – College Board, 2008
Presented at the annual meeting of the National Council on Measurement in Education (NCME) in New York in March 2008. This presentation explores how different item weightings can affect the effective weights, validity coefficients, and test reliability of composite scores among test takers.
Descriptors: Multiple Choice Tests, Test Format, Test Validity, Test Reliability
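The distinction between nominal and effective weights in the entry above can be sketched as follows. Effective weight of section i is commonly defined as w_i * cov(x_i, composite) / var(composite), i.e., each section's share of composite-score variance. Data and weights below are hypothetical.

```python
import numpy as np

# Hedged sketch: the nominal weight assigned to each section is not its
# actual contribution to composite-score variance. Effective weight of
# section i = w_i * cov(x_i, composite) / var(composite); these sum to 1.
rng = np.random.default_rng(3)
n = 1000
mc = rng.normal(50, 10, n)                   # multiple-choice section score
cr = rng.normal(10, 4, n) + 0.2 * (mc - 50)  # constructed-response, correlated with MC
w = np.array([1.0, 2.0])                     # nominal weights (hypothetical)

composite = w[0] * mc + w[1] * cr
eff_mc = w[0] * np.cov(mc, composite)[0, 1] / np.var(composite, ddof=1)
eff_cr = w[1] * np.cov(cr, composite)[0, 1] / np.var(composite, ddof=1)
print(f"effective weights: MC {eff_mc:.2f}, CR {eff_cr:.2f}")
```

Because section variances and intercorrelations differ, the effective weights can depart substantially from the nominal ones, which is what the presentation examines alongside validity and reliability effects.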