In-person coaching versus technology: Proactive, constant contact matters

Philip Oreopoulos, Uros Petronijevic

13 November 2016



Policymakers and academics share growing concerns about stagnating college completion rates and negative student experiences. Recent figures suggest that only 56% of students who pursue a bachelor's degree complete it within six years (Symonds et al. 2011), and it is increasingly unclear whether students who attain degrees acquire meaningful new skills along the way (Arum and Roksa 2011). Students enter college underprepared, and those who procrastinate, do not study enough, or have superficial attitudes about success perform particularly poorly (Beattie et al. 2016).

Personalised coaching to improve outcomes

A promising tool for improving students’ college outcomes and experiences is personalised coaching. At both the high school and college levels, an emerging literature demonstrates the benefits of helping students foster motivation, effort, good study habits, and time-management skills through structured tutoring and coaching. Cook et al. (2014) find that cognitive behavioural therapy and tutoring generate large improvements in maths scores and high school graduation rates for troubled youth in Chicago, while Oreopoulos et al. (forthcoming) show that coaching, tutoring, and group activities lead to large increases in high school graduation and college enrolment among youth in a Toronto public housing project. At the college level, Scrivener and Weiss (2013) find that the Accelerated Study in Associate Programs (ASAP) – a bundle of coaching, tutoring, and student success workshops – nearly doubled graduation rates in CUNY community colleges, and Bettinger and Baker (2014) show that telephone coaching by Inside Track professionals boosts two-year college retention by 15% across several higher-education institutions.

While structured, one-on-one support can have large effects on student outcomes, it is often costly to implement and difficult to scale up to the student population at large (Bloom 1984). Noting this challenge, we set out to build on recent advances in social psychology and behavioural economics, investigating whether technology – specifically, online exercises and text and email messaging – can generate benefits comparable to those of one-on-one coaching, but at lower cost, among first-year university students (Oreopoulos and Petronijevic 2016).

Several recent studies in social psychology find that short, appropriately timed interventions can have lasting effects on student outcomes (Yeager and Walton 2011, Cohen and Garcia 2014, Walton 2014). Relatively large improvements in academic performance have been documented from interventions that help students define their long-run goals or purpose for learning (Morisano et al. 2010, Yeager et al. 2014), teach the ‘growth mindset’ idea that intelligence is malleable (Yeager et al. 2016), and help students keep negative events in perspective by self-affirming their values (Cohen and Sherman 2014). In contrast to these one-time interventions, other studies in education and behavioural economics attempt to maintain constant, low-touch contact with students or their parents at low cost, using technology to provide consistent reminders aimed at improving outcomes. Providing text, email, and phone call updates to parents about their children’s progress in school has been shown to boost both parental engagement and student performance (Kraft and Dougherty 2013, Bergman 2016, Kraft and Rogers 2014, Mayer et al. 2015), while direct text-message communication with college and university students has been used in attempts to increase financial aid renewal (Castleman and Page 2014) and improve academic outcomes (Castleman and Meyer 2016).

Can lower-cost alternatives to one-on-one coaching be effective?

We examine whether benefits comparable to those obtained from one-on-one coaching can be achieved at lower cost by either of two specific interventions (Oreopoulos and Petronijevic 2016). We examine a one-time online intervention designed to affirm students’ goals and purpose for attending university, and a full-year text and email messaging campaign that provides weekly reminders of academic advice and motivation to students. We work with a sample of more than 4,000 undergraduate students who are enrolled in introductory economics courses at a large representative college in Canada, randomly assigning students to one of three treatment groups or a control group. The treatment groups consist of:

  1. A one-time, online exercise completed during the first two weeks of class in the autumn;
  2. The online intervention plus text and email messaging throughout the full academic year; and
  3. The online intervention plus one-on-one coaching in which students are assigned to upper-year undergraduate students who act as coaches.

Students in the control group are given a personality test measuring the Big Five personality traits.

Figure 1 summarises our main results on course grades. Overall, we find large positive effects from the coaching programme, amounting to approximately a 4.92 percentage-point increase in average course grades; we also find that coached students experience a 0.35 standard-deviation increase in GPA. In contrast, we find no effects on academic outcomes from either the online exercise or the text messaging campaign, even after investigating potentially heterogeneous treatment effects across several student characteristics, including gender, age, incoming high school average, international-student status, and whether students live in residence.

Figure 1. Main effects of interventions

Our results suggest that the benefits of personal coaching are not easily replicated by low-cost interventions using technology. Many successful coaching programmes involve regular student-coach interaction, facilitated either by mandatory meetings between coaches and students or by proactive coaches regularly initiating contact (Scrivener and Weiss 2013, Bettinger and Baker 2014, Cook et al. 2014, Oreopoulos et al. forthcoming). Our coaches initiated contact and built trust with students over time, in person and through text messaging. Through a series of gentle, open-ended questions, the coaches could understand the problems students were facing and provide clear advice, ending most conversations with at least one specific action the student could take to help solve their current problems.

Our text messaging campaign offered weekly academic advice, resource information, and motivation, but did not initiate communication with individual students about specific issues (e.g. help with writing or an upcoming mid-term). The text-messaging team often invited students to reply to messages and share their concerns but was unable to do this with the same efficacy as a coach, nor were we able to establish the same rapport with students. Our inability to reach out to all students and softly guide the conversation likely prevented us from learning the important details of their specific problems. Although we provided answers and advice to the questions we received, we did not have as much information on the students’ backgrounds as our coaches did, and thus could not tailor our responses to each student’s specific circumstances.

Our coaches were also able to build trust with students by fulfilling a support role. Figure 2 provides an example of how the coaching service was more effective than the text messaging campaign in this respect. The text messages attempted to nudge students in the right direction, rather than provide tailored support. The left panel of Figure 2 shows three consecutive text messages, in which we provide a tip on stress management, an inspirational quote, and a time-management tip around the exam period. As in this example, students often did not respond to such messages. In contrast, the student-coach interaction in the right panel shows our coaches playing a supportive role rather than simply nudging the student in a specific direction. The coach starts by asking an open-ended question, to which the student responds, and the coach then guides the conversation forward. In this example, the coach assures the student that they will be available to help with a pending deadline and shows a genuine interest in the events in the student’s life.

Figure 2. Distinguishing the text-messaging campaign and the coaching programme

Coaches also kept records of their evolving conversations with students and could check in to ask how previously discussed issues were being resolved. Although we kept a record of all text message conversations, a lack of resources prevented us from conducting regular check-ups to see how previous events had unfolded, which likely kept us from helping students effectively with their problems and from establishing the trust required for students to share additional ones.

Concluding remarks

In sum, the two key features that distinguish the coaching service from the texting campaign are that coaches proactively initiated discussion with students about their problems and could establish relationships based on trust, in which students felt comfortable openly discussing their issues. Future work attempting to improve academic outcomes in higher education by using technology to maintain constant contact with students may need to acknowledge that simply nudging students in the right direction is not enough. A more personalised approach is likely required, in which coaches or mentors initially guide students through a series of gentle conversations and subsequently show a proactive interest in students’ lives. These conversations need not occur during face-to-face meetings, but the available evidence suggests that they should occur frequently and be initiated by the coaches. While such an intervention would likely cost more than the text messaging campaign in our study, it is also likely to be more effective, while remaining less costly than the personalised coaching treatment.


Arum, R and J Roksa (2011) Academically adrift: Limited learning on college campuses, Chicago, IL: University of Chicago Press.

Beattie, G, J-W P Laliberté and P Oreopoulos (2016) "Thrivers and divers: Using non-academic measures to predict college success and failure", National Bureau of Economic Research, Working Paper 22629.

Bergman, P (2016) “Parent-child information frictions and human capital investment: Evidence from a field experiment”, Columbia University Working Paper.

Bettinger, E and R Baker (2014) “The effects of student coaching: An evaluation of a randomized experiment in student advising”, Educational Evaluation and Policy Analysis, 35(1): 3-19.

Bloom, B (1984) “The 2 Sigma problem: The search for methods of group instruction as effective as one-to-one tutoring”, Educational Researcher, 13(6): 4-16.

Castleman, B and L Page (2014) “Freshman year financial aid nudges: An experiment to increase FAFSA renewal and college persistence”, Center for Education Policy and Workforce Competitiveness, University of Virginia, Working Paper No 28.

Castleman, B and K Meyer (2016) “Can text message nudges improve academic outcomes in college? Evidence from a West Virginia Initiative”, Center for Education Policy and Workforce Competitiveness, University of Virginia, Working Paper No 43.

Cohen, G and J Garcia (2014) “Educational theory, practice, and policy and the wisdom of social psychology”, Policy Insights from the Behavioral and Brain Sciences, 1(1): 13-20.

Cohen, G and D Sherman (2014) “The psychology of change: Self-affirmation and social psychological intervention”, Annual Reviews of Psychology, 65: 333-371.

Cook, P J, K Dodge, G Farkas, R G Fryer, Jr, J Guryan, J Ludwig, S Mayer, H Pollack and L Steinberg (2014) “The (surprising) efficacy of academic and behavioral intervention with disadvantaged youth: Results from a randomized experiment in Chicago”, National Bureau of Economic Research, Working Paper 19862.

Kraft, M A and S M Dougherty (2013) “The effect of teacher–family communication on student engagement: Evidence from a randomized field experiment”, Journal of Research on Educational Effectiveness, 6(3): 199-222.

Kraft, M A and T Rogers (2014) “The underutilized potential of teacher-to-parent communication: Evidence from a field experiment”, Harvard Kennedy School, Faculty Research Working Paper Series, RWP-14-049.

Mayer, S E, A Kalil, P Oreopoulos and S Gallegos (2015) “Using behavioral insights to increase parental engagement: The parents and children together (PACT) intervention”, National Bureau of Economic Research, Working Paper 21602.

Morisano, D, J Hirsh, J Peterson, R Pihl and B Shore (2010) “Setting, elaborating, and reflecting on personal goals improves academic performance”, Journal of Applied Psychology, 95(2): 255–264.

Oreopoulos, P, A Lavecchia and R S Brown (forthcoming) "Pathways to education: An integrated approach to helping at-risk high school students", Journal of Political Economy.

Oreopoulos, P and U Petronijevic (2016) "Student coaching: How far can technology go?" National Bureau of Economic Research, Working Paper 22630.

Scrivener, S and M J Weiss (2013) “More graduates: Two-year results from an evaluation of accelerated study in associate programs (ASAP) for developmental education students”, Policy Brief, MDRC.

Symonds, W C, R Schwartz and R F Ferguson (2011) “Pathways to prosperity: Meeting the challenge of preparing young Americans for the 21st century”, Pathways to Prosperity Project, Harvard University Graduate School of Education.

Walton, G (2014) “The new science of wise psychological interventions”, Current Directions in Psychological Science, 23(1): 73-82.

Yeager, D, M D Henderson, D Paunesku, G M Walton, S D’Mello, B J Spitzer and A L Duckworth (2014) “Boring but important: A self-transcendent purpose for learning fosters academic self-regulation”, Journal of Personality and Social Psychology, 107(4): 559–580.

Yeager, D, C Romero, D Paunesku, C S Hulleman, B Schneider, C Hinojosa, H Y Lee, J O’Brien, K Flint, A Roberts and J Trott (2016) “Using design thinking to improve psychological interventions: The case of the growth mindset during the transition to high school”, Journal of Educational Psychology, 108(3): 374–391.

Yeager, D and G Walton (2011) “Social-psychological interventions in education: They’re not magic”, Review of Educational Research, 81(2): 267–301.




Professor of Economics and Public Policy, University of Toronto

Assistant Professor, Department of Economics, York University