Bibliographic record - detail view
Authors | Crotts, Katrina; Sireci, Stephen G.; Zenisky, April |
---|---|
Titel | Evaluating the Content Validity of Multistage-Adaptive Tests |
Source | In: Journal of Applied Testing Technology, 13 (2012) 1 (26 pages) |
PDF full text |
Language | English |
Document type | print; online; journal article |
Keywords | Computer Assisted Testing; Adaptive Testing; Content Validity; Test Content; Expertise; Test Items; Item Banks; Reading Tests; Adult Students; Adult Basic Education; Massachusetts |
Abstract | Validity evidence based on test content is important for educational tests to demonstrate the degree to which they fulfill their purposes. Most content validity studies involve subject matter experts (SMEs) who rate items that comprise a test form. In computerized-adaptive testing, examinees take different sets of items and test "forms" do not exist, which makes it difficult to evaluate the content validity of different tests taken by different examinees. In this study, we evaluated content validity of a multistage-adaptive test (MST) using SMEs' content validity ratings of all items in the MST bank. Analyses of these ratings across the most common "paths" taken by examinees were conducted. The results indicated the content validity ratings across the different tests taken by examinees were roughly equivalent. The method used illustrates how content validity can be evaluated in an MST context. (Contains 5 tables, 3 footnotes and 1 figure.) (As Provided). |
Notes | Association of Test Publishers. 601 Pennsylvania Avenue NW, South Building Suite 900, Washington DC 20004. Tel: 866-240-7909; Fax: 717-755-8962; e-mail: wgh.atp@att.net; Web site: http://www.testpublishers.org |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Update | 2017/4/10 |
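The abstract's core idea can be illustrated with a minimal sketch: subject matter experts (SMEs) rate every item in the MST bank once, and the ratings are then aggregated over the item sets that make up the most common examinee paths, so paths can be compared even though no fixed test "forms" exist. All item IDs, ratings, and paths below are invented for illustration and are not from the study.

```python
from statistics import mean

# Hypothetical SME content-validity ratings for each item in the
# MST bank (one rating per SME; values and IDs are invented).
sme_ratings = {
    "item1": [4, 5, 4],
    "item2": [3, 4, 4],
    "item3": [5, 5, 4],
    "item4": [4, 4, 5],
}

# Two common "paths" through the multistage test, i.e. the item sets
# seen by different groups of examinees (invented for illustration).
paths = {
    "path_A": ["item1", "item2", "item3"],
    "path_B": ["item1", "item2", "item4"],
}

def path_rating(path_items):
    """Mean over the path of each item's mean SME rating."""
    return mean(mean(sme_ratings[i]) for i in path_items)

for name, items in paths.items():
    print(f"{name}: {path_rating(items):.2f}")
```

If the per-path means come out close to one another, as the study reports for its real data, the different tests taken by different examinees can be regarded as roughly equivalent in rated content validity.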