Instructional Measurement


Our current work on instructional measurement centers on high-quality mathematics instruction and equitable teaching practices. Through a study funded by the Bill & Melinda Gates Foundation and through research-practice partnerships with school districts, we are assembling a large corpus of mathematics classroom videos, transcripts, lesson materials, student and teacher surveys, and administrative data so that we can develop robust measures of instruction. We are currently developing equity measures, such as how teachers attribute competence to students, which involves linking students’ contributions to class discussions to their demographic information and identities.
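
To make that linkage concrete, here is a minimal sketch in Python using toy data and hypothetical column names (not our actual pipeline or schema): utterance-level annotations are joined to a student roster so that rates of a practice, such as attributing competence, can be compared across demographic groups.

    # Illustrative sketch only: hypothetical column names and toy data,
    # not the project's actual data pipeline or schema.
    import pandas as pd

    # Toy utterance-level annotations: who spoke, and whether the teacher
    # attributed competence to that student's contribution.
    utterances = pd.DataFrame({
        "student_id": ["s1", "s2", "s1", "s3", "s2"],
        "competence_attributed": [1, 0, 0, 1, 1],
    })

    # Toy student roster with demographic information.
    roster = pd.DataFrame({
        "student_id": ["s1", "s2", "s3"],
        "group": ["A", "A", "B"],
    })

    # Link contributions to student records, then compare rates across groups.
    linked = utterances.merge(roster, on="student_id", how="left")
    rates = linked.groupby("group")["competence_attributed"].mean()
    print(rates)  # attribution rate per demographic group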

Over the past few years, a fast-growing literature has shown that natural language processing (NLP) techniques offer a potentially transformative approach to instructional measurement and feedback. Unlike conventional human-based scoring, NLP analysis of classroom transcripts can be done in scalable and adaptable ways (Alic et al., 2022; Demszky et al., 2021; Hunkins, 2022; Kelly et al., 2018; Liu & Cohen, 2021; Suresh et al., 2019). Scholars have so far focused on capturing dialogic instruction, or teachers’ use of talk moves that promote student thinking and activity, for instance by eliciting student reasoning (Alic et al., 2022; Suresh et al., 2019), revoicing and taking up student ideas (Demszky et al., 2021; Suresh et al., 2019), and prompting students to respond to others’ ideas (Suresh et al., 2019). Scholars have also begun to examine students’ contributions, for instance the density of mathematical language within student talk (Himmelsbach et al., 2023) and evidence of student mathematical reasoning (Hill et al., in progress). These instructional moves are indicators of rigorous instruction and, when enacted with parity across students, of equitable instruction. To date, scholars have primarily used these measures to provide private, on-demand feedback to teachers (Suresh et al., 2019; Demszky et al., in press; Demszky et al., 2023), in some cases producing positive impacts on educators’ instructional quality and selected student outcomes across different teaching contexts (Demszky et al., in press; Demszky & Liu, 2023).
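
As a rough illustration of what utterance-level talk-move tagging looks like, here is a toy, rule-based sketch in Python. The studies cited above fine-tune neural classifiers on annotated transcripts rather than using keyword rules; the patterns, labels, and example utterances below are hypothetical and only show the input/output shape of this kind of analysis.

    # Toy, rule-based stand-in for the utterance-level classifiers described above.
    # Real systems learn these distinctions from labeled transcript data; this sketch
    # only shows how each teacher utterance receives a talk-move label.
    import re

    # Hypothetical surface patterns for two talk moves.
    ELICIT_REASONING = re.compile(r"\b(why|how do you know|can you explain)\b", re.IGNORECASE)
    REVOICING = re.compile(r"\b(so you('| a)re saying|i hear you saying)\b", re.IGNORECASE)

    def tag_utterance(text: str) -> str:
        """Assign a coarse talk-move label to one teacher utterance."""
        if ELICIT_REASONING.search(text):
            return "eliciting_student_reasoning"
        if REVOICING.search(text):
            return "revoicing"
        return "other"

    transcript = [
        "So you're saying the denominator stays the same?",
        "Why does that strategy work?",
        "Turn to page twelve.",
    ]
    for utterance in transcript:
        print(tag_utterance(utterance), "|", utterance)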

If you’re interested in learning more about our instructional measurement work or participating in a related study, please contact Research Project Manager Hannah Rosenstein at hrosenst@umd.edu.

Highlighted Publications

The NCTE transcripts: a dataset of elementary math classroom transcripts
Dorottya Demszky, Heather Hill
arXiv preprint arXiv:2211.11772, May 2023


Computationally identifying funneling and focusing questions in classroom discourse
Sterling Alic, Dorottya Demszky, Zid Mancenido, Jing Liu, Heather Hill, Dan Jurafsky
arXiv preprint arXiv:2208.04715, 2022


Measuring teaching practices at scale: a novel application of text-as-data methods
Jing Liu, Julie Cohen
Educational Evaluation and Policy Analysis, vol. 43, pp. 587-614, December 2021
