Document Type

Article

Publication Date

12-2017

Abstract

This article illustrates an argument-based approach to presenting validity evidence for assessment items intended to measure a complex construct. Our focus is developing a measure of teachers’ ability to analyze and respond to students’ mathematical thinking for the purpose of program evaluation. Our validity argument consists of claims addressing connections between our item-development process and the theoretical model for the construct we are trying to measure: attentiveness. Evidence derived from theoretical arguments in conjunction with our multiphased item-development process is used to support the claims, including psychometric evidence of Rasch model fit and category ordering. Taken collectively, the evidence provides support for the claim that our selected-response items can measure increasing levels of attentiveness. More globally, our goal in presenting this work is to demonstrate how theoretical arguments and empirical evidence fit within an argument to support claims about how well a construct is represented, operationalized, and structured.

Copyright Statement

This document was originally published in The Elementary School Journal by The University of Chicago Press. Copyright restrictions may apply. doi: 10.1086/694269
