Peer reviewed
ERIC Number: EJ829206
Record Type: Journal
Publication Date: 2007-Aug
Pages: 12
Abstractor: ERIC
Reference Count: 0
ISBN: N/A
ISSN: ISSN-1536-6367
Assessing Elemental and Structural Validity: Data from Teachers, Non-Teachers, and Mathematicians
Hill, Heather C.; Dean, Carolyn; Goffney, Imani Masters
Measurement: Interdisciplinary Research and Perspectives, v5 n2-3 p81-92 Aug 2007
Validation efforts typically focus on what, exactly, is measured by an instrument, and whether what is measured corresponds to the theoretical domain originally specified. The authors' goal for this article is to build instruments focused on measuring the mathematical knowledge used in teaching: not only the content that teachers teach to students directly, but also the professional knowledge that helps support the teaching of that content. Following Kane (2001, 2004a), and as reported in Schilling and Hill, the authors developed two assumptions and related inferences to represent this thinking: (1) the elemental assumption, that the items reflect teachers' mathematical knowledge for teaching and not extraneous factors such as test-taking strategies or idiosyncratic aspects of the items (e.g., flaws in items); and (2) the structural assumption, that the domain of mathematical knowledge for teaching can be distinguished both by subject matter area (e.g., number and operations, algebra) and by the types of knowledge deployed by teachers, including content knowledge (CK), which comprises common content knowledge (CCK), or knowledge common to many disciplines and the public at large, and specialized content knowledge (SCK), or knowledge specific to the work of teaching; and knowledge of content and students (KCS), or knowledge concerning students' thinking around particular mathematical topics. For this analysis, the authors focus mainly on distinctions between CK and KCS items. Results from their examination suggest that it is mathematical processes, by far, that underlie answers to the CK items. Test-taking strategies and guessing occurred at relatively low rates, and inconsistencies between individuals' thinking and answers were within what the authors would probably consider to be normal bounds. Results from the KCS items, however, were more mixed.
While some teachers did use what the authors coded as KCS in their answers to these items, other teachers, non-teachers, and mathematicians also relied on mathematical reasoning in generating their answers. (Contains 3 tables and 4 footnotes.)
Psychology Press. Available from: Taylor & Francis, Ltd. 325 Chestnut Street Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Evaluative
Education Level: Elementary Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A