A rubric for assessing mathematical modelling problems in a scientific-engineering context

Abstract

Mathematical modelling is a vital competency for students of all ages. In this study, we aim to fill the research gap concerning valid and reliable tools for assessing and grading mathematical modelling problems, particularly those reflecting multiple steps of the modelling cycle. We present the design of a reliable and valid assessment tool for gauging the level of mathematical modelling associated with real-world modelling problems in a scientific-engineering context. The study defines the central modelling processes and grounds them in the proficiency levels identified in PISA Mathematics. A two-dimensional rubric was developed, reflecting the combined assessment of the type and level of a modelling process. We identified criteria that enable clear comparison and differentiation among the levels across each of the modelling processes. These criteria provide concrete theoretical definitions for the various modelling processes, introducing a well-defined mathematical modelling framework from a didactical viewpoint, which can contribute to promoting modelling competencies and deepening teachers' and students' understanding of modelling. Theoretical, methodological and practical implications are discussed.