
Learning during the COVID-19 pandemic

Many higher education institutions have shifted to remote learning in response to the COVID-19 pandemic. Although research has found that online classes can be just as effective as in-person classes, there is evidence that disadvantaged students may perform relatively worse online. This column compares student performance on a set of standard assessments at four PhD-granting institutions in the US before and after the switch to online classes. It finds little evidence that disadvantaged groups were further disadvantaged by the pandemic in their college learning. Instructor experience with online teaching and the use of active-learning techniques both have a positive effect on student outcomes.

In the spring of 2020, the COVID-19 pandemic forced many higher education institutions around the world to rapidly switch to remote learning. Although some activities were brought back to campuses in the autumn, many classes at these institutions are still in hybrid or online form and seem likely to remain this way for some time. This situation raises two important questions: Has the transition to online teaching negatively affected student learning? And if so, how can we ameliorate these effects?

Previous research suggests that online classes can be just as effective as in-person classes, with student learning measured in terms of grades, teacher perceptions of learning, and student perceptions of learning (Swan 2003). However, this claim is not uncontested. For example, Xu and Jaggars (2014) observe that findings that online and in-person instruction yield similar learning outcomes have mostly come from studies conducted at elite institutions and seem not to generalise to community colleges. Furthermore, they find that certain demographic groups – students who are male, Black, or younger, or who have lower Grade Point Averages (GPAs) – perform relatively worse when taking online courses.

If there are pre-existing differences in performance in online classes, these could be exacerbated by the pandemic forcing classes to be taught remotely. For example, the Centers for Disease Control and Prevention (2020) reports that racial and ethnic minorities are more likely to become ill and die from COVID-19. The factors behind these disproportionate morbidity effects might create disproportionate stress on students from these demographic groups. One might also worry that when institutions went online, many students who were already disadvantaged returned to environments with relatively fewer resources to support learning, thus putting them at a further disadvantage.

There are several ways to potentially increase the effectiveness of online learning, and many of them fall under the broad category of ‘active learning’. Active learning includes any method that allows students to engage with the material through application, problem-solving, and discussion. For example, students might be asked to answer conceptual questions or solve problems during class, which has been shown to improve learning outcomes by inducing students to engage with lectures and giving both students and teachers feedback on understanding (Knight and Wood 2005, Balaban et al. 2016). Students may also be asked to work in pairs or small groups to solve problems and to engage in peer instruction (Crouch and Mazur 2001).

These active-learning elements are associated with greater effectiveness in online instruction as well. When developing the Student Evaluation of Online Teaching Effectiveness, Bangert (2005) found that ‘Active Learning and Cooperation Among Students’ was one of the factors underlying online teaching effectiveness.

In a recent paper (Orlov et al. 2020), we examine the role played by online instruction, demographics, and active learning in student learning during the pandemic. We first look at the effect the transition online had on performance on standardised assessments of economic skills. We then look more closely to see whether negative effects were concentrated in certain demographic groups (defined by gender, race, first-generation college-goer status, and non-native English speaker status). Finally, we estimate the effect on student learning of certain active-learning practices and of instructor experience with online teaching.

Data

Our data were collected during the spring and autumn 2019 semesters and the spring 2020 semester from seven courses at four US R1 PhD-granting institutions. Students in these courses took demographic surveys and multiple-choice assessments of their learning. These standard assessments were developed at Cornell University as part of its Active Learning Initiative and included the Intermediate Economics Skills Assessment–Microeconomics, the Economic Statistics Skills Assessment, the Applied Econometrics Skills Assessment, and the Theory-based Econometrics Skills Assessment. 

The use of the standard assessments allowed us to compare the performance of students in all seven courses before and during the pandemic. Furthermore, because the questions were mapped to specific learning goals, we could separate out the topics that were covered during the latter portion of the semester, when students were learning remotely.

Instructor data on teaching practices before and during the pandemic, and on the extent of material coverage during the pandemic semester, were collected through a survey administered at the end of spring 2020. We were interested in the instructors’ prior experience teaching online and their implementation of active-learning practices, particularly their use of polling and promotion of peer interaction. We considered a course to have used polling if students were asked at least two questions per class in all, or all but one or two, of the class sessions during the semester. Any course that used think-pair-share activities or small group activities, encouraged students to work together outside class in pre-assigned small groups, or allowed students to work together on exams was considered a course that encouraged peer interaction.
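To make these coding rules concrete, here is a minimal sketch of how they could be encoded. The function and field names are ours for illustration and are not taken from the paper or its replication materials.

# Minimal sketch of the course-coding rules described above; the names
# and data layout are illustrative assumptions, not the paper's code.

def uses_polling(questions_per_class):
    """A course 'used polling' if students were asked at least two
    questions per class in all, or all but one or two, class sessions."""
    sessions_below_threshold = sum(1 for q in questions_per_class if q < 2)
    return sessions_below_threshold <= 2

def encourages_peer_interaction(survey):
    """A course 'encouraged peer interaction' if the instructor reported
    any of the four listed practices on the end-of-semester survey."""
    practices = ("think_pair_share", "small_group_activities",
                 "preassigned_groups_outside_class", "group_work_on_exams")
    return any(survey.get(p, False) for p in practices)

# Example: a course that fell below two questions in three sessions
# would not be coded as a polling course.
print(uses_polling([2, 3, 0, 2, 1, 2, 0]))                      # False
print(encourages_peer_interaction({"think_pair_share": True}))  # True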

Results

We first compare assessment scores in spring 2020 to scores in spring 2019 and autumn 2019. The assessment sub-score for material learned during the remote portion of the semester dropped by a statistically significant 0.096 standard deviations. This result suggests that the switch to remote teaching did take a toll on student learning, though it is difficult to say whether this cost was incurred because the classes were online or because of the stress and other factors related to the pandemic.

We then examine student achievement on the assessments in pre-pandemic and pandemic semesters while controlling for student demographics. On average, women and underrepresented-minority students performed worse than non-underrepresented-minority men in both periods. However, this gap was not made wider by the pandemic. Our data suggest that scores of first-generation college-goers were moderately lower during the pandemic semester, but in general, the pandemic did not seem to disproportionately affect any particular demographic group.
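In stylised form, this exercise amounts to estimating an interaction specification along the following lines (the notation is ours, and the paper's exact set of controls may differ):

\[
\text{score}_{ic} = \alpha + \beta\,\text{Pandemic}_{c} + \gamma'\,\text{Demog}_{i} + \delta'\left(\text{Pandemic}_{c} \times \text{Demog}_{i}\right) + \varepsilon_{ic}
\]

where $\text{score}_{ic}$ is the standardised assessment score of student $i$ in course $c$, $\text{Pandemic}_{c}$ indicates a spring 2020 course, and $\text{Demog}_{i}$ is a vector of demographic indicators. The finding that gaps did not widen corresponds to interaction coefficients $\delta$ that are statistically indistinguishable from zero.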

Finally, we compare assessment performance during the pandemic semester across courses to determine which course characteristics allowed students to best weather the challenges posed by remote learning and the pandemic. Courses taught by instructors with experience teaching online had higher scores overall (0.611 standard deviations) and for material learned remotely (0.625 standard deviations). This effect almost completely offsets the learning loss associated with the pandemic.

Planned peer interaction also positively affected student performance: for material learned remotely, classes that incorporated peer interaction saw scores that were 0.315 standard deviations higher than classes without it. However, no such difference was found between courses that used in-class polling (e.g. PollEverywhere or iClicker) and courses that did not.

Discussion and conclusion

The pandemic has posed many challenges for students and teachers, and unsurprisingly, student performance has suffered in response. However, there is cause for optimism. We find little evidence that disadvantaged groups were further disadvantaged by the pandemic in their college learning. 

Furthermore, the factors that help ameliorate the pandemic’s negative effects are well within reach of many courses. Instructor experience is one such factor, and many instructors have already acquired some online-teaching experience during the spring 2020 switch to remote teaching. 

Increasing peer interaction in the synchronous virtual classroom will take more effort, but many instructors have already shown it can be done. Active-learning techniques like think-pair-share and small group activities have revolutionised the physical classroom, and they seem to be quite effective in an online environment too. 

These findings suggest that, although classes may remain online for some time, the quality of students’ learning has the potential to reach pre-pandemic standards.

References

Adams, W K, and C E Wieman (2011), “Development and validation of instruments to measure learning of expert-like thinking”, International Journal of Science Education 33(9): 1289–312.

Balaban, R A, D B Gilleskie and U Tran (2016), “A quantitative evaluation of the flipped classroom in a large lecture principles of economics course”, Journal of Economic Education 47(4): 269–87.

Bangert, A W (2005), “Identifying factors underlying the quality of online teaching effectiveness: An exploratory study”, Journal of Computing in Higher Education 17(2): 79–99.

Centers for Disease Control and Prevention (2020), “Health equity considerations and racial and ethnic minority groups”, CDC, 24 July.

Crouch, C H, and E Mazur (2001), “Peer instruction: Ten years of experience and results”, American Journal of Physics 69(9): 970–77.

Cornell University (c2020), “University-wide Active Learning Initiative”, Office of the Provost.

Kalaian, S A, R M Kasim and J K Nims (2018), “Effectiveness of small-group learning pedagogies in engineering and technology education: A meta-analysis”, Journal of Technology Education 29(2): 20–35.

Knight, J K, and W B Wood (2005), “Teaching more by lecturing less”, Cell Biology Education 4(4): 298–310.

Mazur, E (1997), Peer instruction: A user’s manual, Saddle River NJ: Prentice Hall. 

Orlov, G, D McKee, J Berry, A Boyle, T DiCiccio, T Ransom, A Rees-Jones and J Stoye (2020), “Learning during the COVID-19 pandemic: It is not who you teach, but how you teach”, NBER Working Paper 28022.

Swan, K (2003), “Learning effectiveness: What the research tells us”, in J Bourne and J C Moore (eds.), Elements of Quality Online Education, Practice and Direction, Needham MA: Sloan Center for Online Education, 13–45.

Xu, D, and S S Jaggars (2014), “Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas”, Journal of Higher Education 85(5): 633–59.
