
Assessing the effect of online instruction on university students’ learning


With the sudden onset of the Covid-19 pandemic in early 2020, universities worldwide were abruptly forced to transition from in-person to online instruction to mitigate the spread of the disease. A key question arising from this disruption is whether the move to online instruction reduced student learning. If so, by how much? Were some groups of students disproportionately affected? And are there effective teaching methods that can be deployed to minimize the reduction in learning?

Detecting an effect of online instruction on student learning is difficult without a standardized metric. Course grades would not be an appropriate measure because instructors might have graded more leniently in the face of the sudden transition. Indeed, many universities in the US allowed students to opt in to pass/fail grading and bypass letter grades entirely, and others explicitly urged faculty to be generous to students during this difficult time. Furthermore, in many cases, tests shifted from being closed-book and proctored in person to being open-book and unproctored, making scores non-comparable across terms. The ideal assessment for measuring a change in student learning would be a standardized set of questions administered at the end of both the pandemic-affected term and a previous, unaffected term.

In recent work we made use of a battery of standardized assessments that had been administered to our students before and during the pandemic. The multiple-choice assessments were developed at Cornell University and administered in seven intermediate-level economics courses at four different US institutions. We compared the Spring 2020 assessment scores with comparable scores from Fall 2019 and Spring 2019.

Our findings indicate that the pandemic-related disruption to instruction had a negative impact on student learning. As measured by our assessments, student learning fell by nearly 19% of a standard deviation when pooled across all students in all courses in our sample. In other words, a good estimate is that a student who scored at the median before the crisis scored at roughly the 42nd percentile during the crisis. This overall effect masks substantial differences across courses: in one course student learning fell by over 80% of a standard deviation, while in another it rose by 19% of a standard deviation.
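To see where the 42nd percentile figure comes from: under the simplifying assumption that assessment scores are approximately normally distributed (an assumption made here purely for this back-of-the-envelope illustration), a downward shift of 0.19 standard deviations moves a student from the median to

$$\Phi(-0.19) \approx 0.42,$$

where $\Phi$ denotes the standard normal cumulative distribution function.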

An important question is whether students from more disadvantaged backgrounds were disproportionately harmed by the pivot to online learning, as some have speculated. We find that, for the most part, these students did not suffer disproportionately. The exception is first-generation students, whose scores were 31% of a standard deviation lower during the pandemic.

Our final set of findings shows how teaching practices affected losses in learning during the pandemic. We incorporated data on instructors' online teaching experience and their use of peer-interaction activities and in-class polling. We found that instructors' prior online teaching experience nearly offset the learning losses due to the pandemic. Other innovative pedagogical methods had smaller, but still significant, positive impacts.

Despite learning losses from pandemic-induced disruptions, our findings leave us optimistic about future student learning. First, many university instructors gained online teaching experience during the Spring 2020 term. This means that losses in student learning should not be as great during the Fall 2020 and Spring 2021 terms, when many students are or will be learning online. Second, the fact that disadvantaged students were not disproportionately affected is a positive takeaway. Third, some innovative teaching methods, such as think-pair-share and small-group activities, have low implementation costs and can significantly improve online student learning.

© George Orlov, Douglas McKee, and Tyler Ransom

George Orlov is a postdoctoral associate in the Department of Economics at Cornell University, USA.
Douglas McKee is a senior lecturer in the Department of Economics at Cornell University, USA.
Tyler Ransom is an assistant professor of economics at the University of Oklahoma, USA, and a Research Affiliate of IZA.

Find more IZA World of Labor coronavirus content on our curated topics pages: National responses to Covid-19 and Covid-19—Pandemics and the labor market.

Please note:
We recognize that IZA World of Labor articles may prompt discussion and possibly controversy. Opinion pieces, such as the one above, capture ideas and debates concisely, and anchor them with real-world examples. Opinions stated here do not necessarily reflect those of the IZA.