Q&A

A Milestone Study: Why the Bottom Line Report is a Win Within Evidence-Based Policy

We sat down with AV’s Kim Cassel to find out why this study was so important and what this means for research and policy in higher education.

Rich Pierre is an example of how the Bottom Line program can turn things around for at-risk students. He graduated in four years and immediately launched a career in finance. (David Degner/For Arnold Ventures)

In October, the college access and success program Bottom Line released six-year findings from a large, well-conducted randomized controlled trial showing a sizable increase in bachelor’s degree attainment. When the results came out, the Evidence-Based Policy team at Arnold Ventures, perhaps the most un-hyperbolic set of people you might ever meet, called the results a blockbuster, among the most meaningful of their entire careers. The release of the study coincided with the emergence of the College Completion Fund in the Build Back Better Act, which has funds set aside for evidence-based college completion programs like Bottom Line.

We sat down with Kim Cassel, director of evidence-based policy at Arnold Ventures, to find out why this study was so important and what this means for research and policy in higher education.

Arnold Ventures

Tell me about what you do.

I work to advance the use of evidence in policymaking. Fundamentally, that includes building the body of social interventions backed by strong, replicated evidence of important improvements in people’s lives. That involves the use of rigorous evaluation methods to identify such programs. 

I help ensure that research studies are well-designed and well-implemented, and that the research we support is useful to policymakers. As part of that work, I advise and collaborate with federal, state, and local policymakers as well as other funders on how to use research evidence in policy and funding decisions. 

Arnold Ventures

What are some of the hallmarks of high-quality research? 

A lot of research can be high-quality, provided that both the methodology used is appropriate for the question the research is meant to answer and the research itself is well-implemented. 

Because we’re aiming to grow the number of effective social programs, we’re often looking for a strong signal from prior research of various kinds that a program or intervention might be producing the hoped-for outcomes. We call it “prior evidence.” We primarily support evaluations that are capable of drawing direct, causal conclusions about a program’s impact. 

Well-conducted randomized controlled trials (RCTs) are widely regarded as the strongest method for evaluating a program’s effectiveness because they ensure an apples-to-apples comparison. By randomly assigning a large number of people to a program group or a control group, you can be sure that there are no systematic differences between the two groups in either observable characteristics (such as income or education) or unobservable characteristics (such as motivation or family support). So, any differences in outcomes observed between the two groups can confidently be attributed to the program itself.
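
To make the mechanics concrete, here is a minimal illustrative sketch in Python (hypothetical names and data, not the study’s actual analysis code) of how random assignment and a simple comparison of outcome rates produce a causal estimate:

```python
import random

# Minimal sketch of the random-assignment logic described above.
# Purely illustrative: hypothetical names and data, not the study's code.

def randomize(applicant_ids, seed=0):
    """Randomly split applicants into a program group and a control group."""
    rng = random.Random(seed)
    shuffled = list(applicant_ids)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]  # (program, control)

def estimated_impact(earned_degree, program_group, control_group):
    """Difference in degree-attainment rates between the two groups.

    `earned_degree` maps an applicant id to 1 (earned a bachelor's degree) or 0.
    Because assignment was random, this difference estimates the program's causal effect.
    """
    rate = lambda group: sum(earned_degree[i] for i in group) / len(group)
    return rate(program_group) - rate(control_group)
```

In an actual evaluation the outcomes would come from administrative records rather than an in-memory mapping, but the underlying logic is the same: randomization makes the two groups comparable, so the gap in outcomes can be attributed to the program.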

That said, we strongly believe that there’s an important role for a wide range of research methodologies, and such “prior evidence” plays a major role in informing when we might support a randomized controlled trial of a particular program. Before embarking on an RCT, we typically recommend that a program has already undergone critical, earlier-stage development and research — for example, we like to see that the program is steadily operating in a real-world setting, and that processes are in place to monitor implementation quality and fidelity to the program model. 

There may also be opportunities for program providers to partner with researchers to conduct quasi-experimental studies — like well-matched comparison group studies, where a group that receives a program is compared to a similar group that does not — to examine whether the program group appears to be experiencing improved outcomes relative to the comparison group. 
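
As a rough sketch of what such a matched comparison can look like in practice (again, hypothetical code and field names rather than any particular study’s analysis):

```python
# Illustrative sketch of a well-matched comparison group analysis.
# All names and fields are hypothetical, not from an actual evaluation.

def nearest_match(participant, non_participants, features):
    """Find the non-participant most similar to `participant` on `features`."""
    def distance(a, b):
        return sum((a[f] - b[f]) ** 2 for f in features)
    return min(non_participants, key=lambda c: distance(participant, c))

def matched_outcome_gap(participants, non_participants, features, outcome):
    """Average outcome difference between participants and their matched peers."""
    gaps = [
        p[outcome] - nearest_match(p, non_participants, features)[outcome]
        for p in participants
    ]
    return sum(gaps) / len(gaps)

# Example records might look like {"gpa": 2.8, "family_income": 41000, "earned_ba": 1},
# with features=["gpa", "family_income"] and outcome="earned_ba". Unlike an RCT,
# this only adjusts for characteristics we can observe and match on.
```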

What we learn from these various research-related efforts can help build the case that investment in an RCT is warranted: to ultimately determine whether the program is indeed producing the hoped-for effects. 

It isn’t wishful thinking that our country can improve postsecondary achievement rates, and Bottom Line is now established as a program that can be deployed for that purpose.
Kim Cassel, director of evidence-based policy at Arnold Ventures
Arnold Ventures

Why did the team choose to support an RCT of the Bottom Line program?

Bottom Line ticked a lot of boxes on our “is the program ready for an RCT” checklist. The student mentoring program was established in 1997 and was already operating in three cities — Boston and Worcester, Massachusetts, and New York City — when we became aware of it. The organization had had time to fully develop the program, establish a replicable model, and grow to serve hundreds of students per year.

The program had also partnered with an independent research team to conduct a high-quality quasi-experimental study of Bottom Line’s effectiveness. The study capitalized on the fact that Bottom Line used a strict cutoff to determine eligibility for its services: a GPA of 2.5. Students just above the cutoff (who were provided access to Bottom Line’s services) were compared to those just below it (who were not), and the study found that students deemed eligible based on that cutoff were more likely to enroll and stay enrolled in 4‑year colleges. This made a compelling case that Bottom Line might be a special program.

Additionally, Bottom Line was oversubscribed — meaning more students were eligible and interested than Bottom Line could serve with existing resources. This created the conditions for a random lottery that would create two equivalent groups: one that was given the offer of Bottom Line services, and one that was not. And the main outcomes of interest — college enrollment, persistence, and degree attainment — could all be measured at modest cost through data collected by the National Student Clearinghouse.

With support from the Laura and John Arnold Foundation (now Arnold Ventures), our team launched our Low-Cost RCT Competition in 2014, and Bottom Line applied for funding. Based on what I just shared, it was clear that Bottom Line was ready for an RCT and had a good chance of demonstrating positive impacts on important student achievement outcomes. And so they were awarded an RCT grant in that first round. 
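
To make the cutoff-based comparison described above concrete, here is a small illustrative sketch in Python (hypothetical code and field names, not the researchers’ analysis) that compares students just above the 2.5 GPA eligibility threshold with students just below it:

```python
# Illustrative sketch of the GPA-cutoff comparison described above.
# Hypothetical code and data; the 0.25 bandwidth is an assumption for illustration.

CUTOFF = 2.5       # Bottom Line's eligibility threshold
BANDWIDTH = 0.25   # hypothetical window around the cutoff

def cutoff_comparison(students):
    """Compare 4-year college enrollment just above vs. just below the GPA cutoff.

    `students` is a list of dicts like {"gpa": 2.6, "enrolled_4yr": 1}.
    """
    above = [s for s in students if CUTOFF <= s["gpa"] < CUTOFF + BANDWIDTH]
    below = [s for s in students if CUTOFF - BANDWIDTH <= s["gpa"] < CUTOFF]
    rate = lambda group: sum(s["enrolled_4yr"] for s in group) / len(group)
    return rate(above) - rate(below)
```

The idea is that within a narrow window around the cutoff, the two groups should be very similar apart from their access to the program, which is what lends this kind of quasi-experimental comparison its credibility.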

Arnold Ventures

So this was back in 2014 — and now we’re in 2021. What did the study find?

You’re reminding me of how I’ve aged! Yes, it’s been some time since this RCT kicked off, and part of what makes this one special for me is that I do feel some pride in my affiliation with it, and in the fact that it was the first RCT we supported — the first of roughly 100 RCTs since then. 

We’ve been tracking study implementation carefully since the study began, working with a truly world-class research team in Andrew Barr, associate professor at Texas A&M, and Ben Castleman, associate professor at the University of Virginia, and an incredibly dedicated and talented team at Bottom Line. 

As the study progressed, we first saw sizable gains in college enrollment, and then student persistence year-over-year. The study is now reporting outcomes six years after the point of study entry — which is five years after students’ expected high school graduation — at which point Bottom Line students were significantly more likely to earn a bachelor’s degree. Specifically, 55% of Bottom Line students had graduated with a bachelor’s degree, compared with 47% of the control group. 

We have never before seen impacts of this magnitude on bachelor’s degree receipt in a well-conducted RCT of a college access program like this. We are just thrilled and so excited to identify a program that truly helps the students it serves. 

Arnold Ventures

What do you hope to see as a result of this study?

These findings arrive at the perfect time for the current policy climate. Backing up for a moment: For evidence-based policy to deliver on its promise to improve people’s lives in meaningful ways, we essentially need two key things. We need to develop and identify programs that “truly work,” and we need policy structures that prioritize evidence and get effective programs out to those who can benefit. Policymakers — and this is a bipartisan issue — are currently focused on finding ways to improve college graduation rates. There is a range of federal programs in the policy pipeline designed to support postsecondary achievement, and to do so in a way that incentivizes the development and use of rigorous evidence from RCTs like this one.

Until the Bottom Line study results came out, there wasn’t yet a relatively low-cost program backed by such strong causal evidence on bachelor’s degree receipt. These RCT findings are critical for making the case that it’s possible to increase the likelihood of bachelor’s degree receipt through a college access and success program like Bottom Line. It isn’t wishful thinking that our country can improve postsecondary achievement rates, and Bottom Line is now established as a program that can be deployed for that purpose. 

Arnold Ventures

Why is this one of the most important RCTs of your career?

That’s a great question. For someone like me, who has spent her career in evidence-based policy, there’s just so much to the story of this RCT that I’m proud to be a part of: the mission of Bottom Line to help students get into and graduate from college; Bottom Line’s courage to put their program to a rigorous test — first with a high-quality quasi-experimental study, and then with an RCT — to determine if they are truly producing positive impacts on students’ lives; the fact — which I hadn’t mentioned before in this conversation — that the RCT was conducted at very modest cost, around $160,000 total (a fraction of the traditional cost of RCTs, especially at the time when this study was launched); the incredible integrity of the research team to commit to pre-registering the full study design, and to report the findings in a way that is completely consistent with what they set out to do; that the study found that Bottom Line really does substantially increase the likelihood of bachelor’s degree receipt; and the policy moment that’s calling for rigorous evidence in higher education.

When we launched back in 2014, this is what I hoped for. It’s incredibly rewarding to see the efforts of so many smart, talented, and dedicated people culminate in actionable knowledge about what works to improve postsecondary education.