Analysis of Process Data to Advance Computer-Based Assessments in Multilingual Contexts

Hlosta, Martin; Herzing, Jessica M. E.; Seiler, Simon; Nath, Sukanya; Keller Zai, Florian; Bergamin, Per; Erzinger, Andrea B. (2024). Analysis of Process Data to Advance Computer-Based Assessments in Multilingual Contexts. In: Sahin, Muhittin; Ifenthaler, Dirk (eds.) Assessment Analytics in Education. Advances in Analytics for Learning and Teaching (pp. 207-233). Cham: Springer. https://doi.org/10.1007/978-3-031-56365-2_11

978-3-031-56365-2_11.pdf - Published Version (1 MB)
Restricted to registered users only.
Available under license: Publisher holds copyright.

Unlike traditional large-scale assessments, computer-based assessments collect process data that provide information about test-takers' actions in the assessment platform while solving test items, for example, response and reaction times, number of clicks, and skipping. In computer-based assessments in multilingual contexts, the comparability of test scores across different assessment languages is essential and challenges assessment design principles regarding fairness. To achieve comparability, test items must be independent of language specificities, test-takers should be homogeneous in their ability to solve the test items, and the test setting must be standardized across test-takers. The use of process data in assessments is often limited to identifying cognitive response mechanisms; little is known, however, about how process data can contribute to the test development and administration process of multilingual assessments and, hence, to quality assurance. Based on the existing literature, we conceptualize a framework with five aspects of how process data can be used to improve assessment quality, and we set these aspects in the context of computer-based assessments in multilingual contexts. We illustrate three ways in which process data can contribute to the quality assurance of standardized educational computer-based assessments: (a) assisting in identifying suspicious items or blocks across different languages, (b) investigating potential position effects and how they might affect different languages, and (c) clustering test-takers to understand their evolving behavior. This strand of research aims to better understand how process data in computer-based assessments can contribute to quality assurance and to inform assessment developers, practitioners, and researchers in large-scale assessments.
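
As a rough illustration of point (c) above, the sketch below clusters test-takers on a few typical process-data features (mean response time, clicks per item, and item-skipping rate). The simulated data, feature names, and the choice of k-means are assumptions made for this example only and do not reproduce the analyses reported in the chapter.

```python
# Hypothetical sketch: clustering test-takers on simple process-data
# features. All data below are simulated for illustration.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Simulated process data for 300 test-takers:
# column 0 = mean response time per item (seconds)
# column 1 = mean number of clicks per item
# column 2 = share of skipped items
X = np.column_stack([
    rng.normal(55, 15, 300),   # response times
    rng.poisson(6, 300),       # clicks
    rng.beta(2, 20, 300),      # skipping rate
])

# Standardize so no single feature dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# Group test-takers into a small number of behavioral profiles.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X_scaled)

# Inspect cluster profiles on the original feature scale.
for k in range(3):
    profile = X[labels == k].mean(axis=0)
    print(f"cluster {k}: n={np.sum(labels == k)}, "
          f"mean RT={profile[0]:.1f}s, clicks={profile[1]:.1f}, "
          f"skip rate={profile[2]:.2f}")
```

In a multilingual setting, comparing such behavioral cluster profiles across assessment-language groups is one way this kind of analysis could feed into quality assurance.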

Item Type:

Book Section (Book Chapter)

Division/Institute:

09 Interdisciplinary Units > Interfakultäres Zentrum für Bildungsforschung (ICER) > Zentrum für Bildungsforschung ICER (WISO)
09 Interdisciplinary Units > Interfakultäres Zentrum für Bildungsforschung (ICER) > Zentrum für Bildungsforschung ICER (PHILHUM)
07 Faculty of Human Sciences > Institute of Education
07 Faculty of Human Sciences > Institute of Education > Sociology of Education
07 Faculty of Human Sciences > Institute of Education > School and Teaching Research

UniBE Contributor:

Herzing, Jessica; Seiler, Simon; Nath, Sukanya; Keller Zai, Florian; Erzinger, Andrea

Subjects:

300 Social sciences, sociology & anthropology > 370 Education
300 Social sciences, sociology & anthropology
300 Social sciences, sociology & anthropology > 310 Statistics

ISBN:

978-3-031-56364-5

Series:

Advances in Analytics for Learning and Teaching

Publisher:

Springer

Funders:

[UNSPECIFIED] BeLEARN Booster Funding

Projects:

[UNSPECIFIED] PANDA

Language:

English

Submitter:

Jessica Martina Esther Herzing

Date Deposited:

06 Jun 2024 07:40

Last Modified:

06 Jun 2024 07:40

Publisher DOI:

10.1007/978-3-031-56365-2_11

Uncontrolled Keywords:

Test design · Computer-based testing · Cross-cultural comparisons · Log data · Fairness and equivalence of assessment · Evidence-centered design · Assessment in multiple languages · Comparability of test scores · International LSA · Cross-lingual assessment

BORIS DOI:

10.48350/197593

URI:

https://boris.unibe.ch/id/eprint/197593
