Bibliographic Record - Detail View
Authors | Stiller, Jurik; Hartmann, Stefan; Mathesius, Sabrina; Straube, Philipp; Tiemann, Rüdiger; Nordmeier, Volkhard; Krüger, Dirk; Upmeier zu Belzen, Annette |
---|---|
Title | Assessing Scientific Reasoning: A Comprehensive Evaluation of Item Features That Affect Item Difficulty |
Source | In: Assessment & Evaluation in Higher Education, 41 (2016) 5, pp. 721-732 (12 pages) |
Language | English |
Document type | print; online; journal article |
ISSN | 0260-2938 |
DOI | 10.1080/02602938.2016.1164830 |
Keywords | Logical Thinking; Scientific Concepts; Difficulty Level; Test Items; Scores; Item Response Theory; Thinking Skills; Scientific Principles; Higher Education; Preservice Teachers; Scientific Methodology; Teacher Education; Scientific Literacy; Science Education; Observation; Longitudinal Studies; Multiple Choice Tests; Bayesian Statistics; Comparative Analysis; Foreign Countries; Achievement Tests; International Assessment; Secondary School Students; Germany (Berlin); Program for International Student Assessment; Test Content; Higher Education System; Teacher Training; Achievement Assessment; Test Administration |
Abstract | The aim of this study was to improve the criterion-related test score interpretation of a text-based assessment of scientific reasoning competencies in higher education by evaluating factors which systematically affect item difficulty. To provide evidence about the specific demands which test items of various difficulty make on pre-service teachers' scientific reasoning competencies, we applied a general linear mixed model which allows estimation of the impact of item features on the response observations. The item features had been identified during a standard setting process. Results indicate important predictive potential of one formal item feature (length of response options), two features based on cognitive demands (processing data from tables, processing abstract concepts) and one feature based on solid knowledge (specialist terms). The revealed predictive potential of item features was in accordance with the cognitive demands operationalised in our competence model. Thus, we conclude that the findings support the validity of our interpretation of the test scores as measures of scientific reasoning competencies. (As Provided). |
Notes | Taylor & Francis, Ltd. 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Last updated | 2020/01/01 |