Creating an R package for a reproducible workflow in educational assessment

Abstract

The assessment of individuals’ knowledge, skills, or abilities is a fundamental aspect of any educational system or program. Without some form of assessment, it is impossible to know whether an individual has gained the skills needed for the intended purposes or goals of the educational program. The need to collect and evaluate data on individuals’ attainment of intended skills has resulted in the development of a variety of large-scale standardized assessments. These assessments target knowledge for a specific population (e.g., 8th-grade general education students) and subject area (e.g., mathematics) to determine whether individuals have acquired enough knowledge or skill to move on in their education, attain a certification or degree in their field, or become licensed to practice in their field. These tests can occur as part of regular K-12 assessment, as college entrance exams like the ACT or SAT, as graduate admissions exams for medical school (Medical College Admission Test) or law school (Law School Admission Test), or as licensure tests for medical doctors (United States Medical Licensing Examination) or nurses (National Council Licensure Examination). With this wide array of assessments, there is a growing need to produce scores and score report information that can serve many purposes, including federal requirements for assessment as well as actionable next steps for learning and instructional activities. The more complex our needs become for useful assessment results, the more complex our underlying statistical models and data requirements become. This talk outlines the process of creating an R package for the Dynamic Learning Maps® Alternate Assessment System that implements a sophisticated statistical model to meet the needs of our stakeholders, provides reproducible results, and generates automated documentation that can be used as evidence for the quality of the assessment.

Publication
Annual useR! Conference
Date