Publication: Investigating Sources of Treatment Effect Heterogeneity in Intervention Research
Date
2021-05-12
Citation
Asher, Catherine Armstrong. 2021. Investigating Sources of Treatment Effect Heterogeneity in Intervention Research. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.
Abstract
Intervention research in education investigates whether and how particular educational programs help students learn. Treatment effect heterogeneity, or variation in a program’s impact, makes this effort more complicated, because it means more work must be done to understand what works, for whom, and in what contexts. This dissertation consists of three papers that explore distinct sources of treatment effect heterogeneity in the context of educational interventions.
The first paper, co-authored with Ethan Scherer, uses a factorial design to compare the effectiveness of three features of a reading-focused text messaging intervention: personalized information about students’ observed reading behaviors, goal setting for summer reading, and framing the purpose of reading as useful for skill-building (an instrumental view), as a fun activity (an entertainment view), or both. We find that personalized messages increase the amount of summer reading and students’ reading skills when they return to school in the fall. The effects of personalized information on test scores are amplified when families receive a combination of instrumental- and entertainment-framed messages. We find no evidence of impacts for goal setting.
The second paper investigates how the effects of a district-run, universal-access pre-K program vary based on the control group’s access to different counterfactual options. Using administrative records and program waitlists, I reconstruct a matched sample that was plausibly randomized to receive a space in the district pre-K program or not. While on average I find no evidence of an impact of the pre-K program on kindergarten readiness, I find large positive effects in communities where control-group students are more likely to report either not attending pre-K or attending a different subsidized pre-K program.
The third paper considers an emerging experimental design in education research: Sequential Multiple Assignment Randomized Trials (or SMARTs), which are used to develop, refine, and test adaptive interventions. I use principal stratification to identify endogenous subgroups determined by how individuals respond to each of the Phase 1 treatments, and I use these strata to highlight a latent assumption in current analytic techniques. I also present a simulation study to explore the magnitude of estimation bias when this assumption is violated.
Keywords
Education policy, Educational evaluation, Education
Terms of Use
This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth at Terms of Service