Bibliographic Record - Detail View
Authors | Hardcastle, Joseph; Herrmann-Abell, Cari F.; DeBoer, George E. |
Title | Comparing Student Performance on Paper-and-Pencil and Computer-Based Tests |
Source | (2017), (16 pages) |
Full-text PDF |
Language | English |
Document type | print; online; monograph |
Keywords | Academic Achievement; Computer Assisted Testing; Comparative Analysis; Student Evaluation; Multiple Choice Tests; Science Tests; Probability; Probability Theory; Scores; Grades; Item Response Theory; Elementary School Students; Secondary School Students; Quasiexperimental Design; Test Format; Test Development; Statistical Analysis; Gender Differences; Language Minorities |
Abstract | Can student performance on computer-based tests (CBTs) and paper-and-pencil tests (PPTs) be considered equivalent measures of student knowledge? States and school districts are grappling with this question, and although the body of research addressing it is growing, additional studies are needed. We report on the performance of students who took either a PPT or one of two different CBTs containing multiple-choice items assessing science ideas. Propensity score matching was used to create demographically equivalent groups for each testing modality, and Rasch modelling was used to describe student performance. Performance was found to vary across testing modalities by grade band, students' primary language, and the specific CBT system used. These results are discussed in terms of the current literature and the differences between the specific PPT and CBT systems. (As Provided). |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2020/01/01 |