Person: Ho, Andrew
Search Results
23 results (showing 1 - 10 of 23)
Publication
Validation Methods for Aggregate-Level Test Scale Linking: A Rejoinder (American Educational Research Association (AERA), 2021-03-15)
Ho, Andrew; Reardon, Sean F.; Kalogrides, Demetra
In Reardon, Kalogrides, and Ho (2021), we developed precision-adjusted random effects models to estimate aggregate-level linking error, for populations and subpopulations, for averages and progress over time. We are grateful to past editor Dan McCaffrey for selecting our paper as the focal article for a set of commentaries from our colleagues, Daniel Bolt, Mark Davison, Alina von Davier, Tim Moses, and Neil Dorans. These commentaries reinforce important cautions and identify promising directions for future research. In this rejoinder, we clarify aspects of our originally proposed method. 1) Validation methods provide evidence of benefits and risks that different experts may weigh differently for different purposes. 2) Our proposed method differs from “standard mapping” procedures using the National Assessment of Educational Progress not only by using a linear (vs. equipercentile) link but also by targeting direct validity evidence about counterfactual aggregate scores. 3) Multilevel approaches that assume common score scales across states are indeed a promising next step for validation, and we hope that states enable researchers to use more of their common-core-era consortium test data for this purpose. Finally, we apply our linking method to an extended panel of data from 2009 to 2017 to show that linking recovery has remained stable.

Publication
Specifying the Three Ws in Educational Measurement: Who Uses Which Scores for What Purpose? (Wiley, 2022-12)
Ho, Andrew
I argue that understanding and improving educational measurement requires specificity about actors, scores, and purpose: Who uses which scores for what purpose? I show how this specificity complements Briggs’ frameworks for educational measurement that he presented in his 2022 address as president of the National Council on Measurement in Education.

Publication
HarvardX and MITx: The First Year of Open Online Courses, Fall 2012-Summer 2013 (2014)
Ho, Andrew; Reich, Justin; Nesterko, Sergiy O.; Seaton, Daniel Thomas; Mullaney, Tommy; Waldo, James; Chuang, Isaac
HarvardX and MITx are collaborative institutional efforts between Harvard University and MIT to enhance campus-based education, advance educational research, and increase access to online learning opportunities worldwide. Over the year from the fall of 2012 to the summer of 2013, HarvardX and MITx launched 17 courses on edX, a jointly founded platform for delivering massive open online courses (MOOCs). In that year, 43,196 registrants earned certificates of completion. Another 35,937 registrants explored half or more of course content without certification. An additional 469,702 registrants viewed less than half of the content. And 292,852 registrants never engaged with the online content. In total, there were 841,687 registrations from 597,692 unique users across the first year of HarvardX and MITx courses. This report is a joint effort by institutional units at Harvard and MIT to describe the registrant and course data provided by edX in the context of the diverse efforts and intentions of HarvardX and MITx instructor teams.

Publication
PH207x: Health in Numbers and PH278x: Human Health and Global Environmental Change: 2012-2013 Course Report (2014)
Reich, Justin; Nesterko, Sergiy O.; Seaton, Daniel Thomas; Mullaney, Tommy; Waldo, James; Chuang, Isaac; Ho, Andrew
In the 2012-2013 academic year, the first two Harvard School of Public Health courses were offered through HarvardX on the edX platform: PH207x: Health in Numbers and PH278x: Human Health and Global Environmental Change. They were taught by Professors Earl Francis Cook and Marcello Pagano, and Aaron Bernstein and Jack Spengler, respectively. This report describes the structure of these two courses, the demographic characteristics of registrants, and the activity of students. This report was prepared by researchers external to the course teams and is based on examination of the courseware, analyses of the data collected by the edX platform, and interviews and consultations with the course faculty and team members.

Publication
HeroesX: The Ancient Greek Hero: Spring 2013 Course Report (2014)
Reich, Justin; Emanuel, Jeff; Nesterko, Sergiy O.; Seaton, Daniel Thomas; Mullaney, Tommy; Waldo, James; Chuang, Isaac; Ho, Andrew
CB22x: The Ancient Greek Hero was offered as a HarvardX course in Spring 2013 on edX, a platform for massive open online courses (MOOCs). It was taught by Professor Greg Nagy. The report was prepared by researchers external to the course team, based on examination of the courseware, analyses of the data collected by the edX platform, and interviews and consultations with the course faculty and team members.

Publication
ER22x: JusticeX: Spring 2013 Course Report (2014)
Reich, Justin; Nesterko, Sergiy O.; Seaton, Daniel Thomas; Mullaney, Tommy; Waldo, James; Chuang, Isaac; Ho, Andrew
ER22x was offered as a HarvardX course in Spring 2013 on edX, a platform for massive open online courses (MOOCs). It was taught by Professor Michael Sandel. The report was prepared by researchers external to the course team, based on an examination of the courseware, analyses of data collected by the edX platform, and interviews with the course faculty and team members.

Publication
The Epidemiology of Modern Test Score Use: Anticipating Aggregation, Adjustment, and Equating (Informa UK Limited, 2013)
Ho, Andrew

Publication
Response switching and self-efficacy in Peer Instruction classrooms (American Physical Society (APS), 2015)
Miller, Kelly; Schell, Julie; Ho, Andrew; Lukoff, Brian; Mazur, Eric
Peer Instruction, a well-known student-centered teaching method, engages students during class through structured, frequent questioning and is often facilitated by classroom response systems. The central feature of any Peer Instruction class is a conceptual question designed to help resolve student misconceptions about subject matter. We provide students two opportunities to answer each question—once after a round of individual reflection and then again after a discussion round with a peer. The second round provides students the choice to “switch” their original response to a different answer. The percentage of right answers typically increases after peer discussion: most students who answer incorrectly in the individual round switch to the correct answer after the peer discussion. However, for any given question there are also students who switch their initially right answer to a wrong answer and students who switch their initially wrong answer to a different wrong answer. In this study, we analyze response switching over one semester of an introductory electricity and magnetism course taught using Peer Instruction at Harvard University. Two key features emerge from our analysis: First, response switching correlates with academic self-efficacy. Students with low self-efficacy switch their responses more than students with high self-efficacy. Second, switching also correlates with the difficulty of the question; students switch to incorrect responses more often when the question is difficult. These findings indicate that instructors may need to provide greater support for difficult questions, such as supplying cues during lectures, increasing time for discussion, or ensuring effective pairing (such as having a student with one right answer in the pair). Additionally, the connection between response switching and self-efficacy motivates interventions to increase student self-efficacy at the beginning of the semester by helping students develop early mastery or by reducing stressful experiences (e.g., high-stakes testing) early in the semester, in the hope that this will improve student learning in Peer Instruction classrooms.

Publication
Castles in the Clouds: The Irrelevance of Vertical Scales for Most Practical Concerns (Informa UK Limited, 2016)
Ho, Andrew

Publication
Discrepancies Between Score Trends from NAEP and State Tests: A Scale-Invariant Perspective (Wiley-Blackwell, 2007)
Ho, Andrew
State test score trends are widely interpreted as indicators of educational improvement. To validate these interpretations, state test score trends are often compared to trends on other tests, such as the National Assessment of Educational Progress (NAEP). These comparisons raise serious technical and substantive concerns. Technically, the most commonly used trend statistics – for example, the change in the percent of proficient students – are misleading in the context of cross-test comparisons. Substantively, it may not be reasonable to expect that NAEP and state test score trends should be similar. This paper motivates and then applies a “scale-invariant” framework for cross-test trend comparisons, comparing “high-stakes” state test score trends from 2003 to 2005 to NAEP trends over the same period. Results show that state trends are significantly more positive than NAEP trends. The paper concludes with cautions against positioning trend discrepancies in a framework where only one trend is considered “true.”