Session Submission Summary
Rapid Impact Assessment for Adaptive Learning: Building rigorous research systems to support continuous program improvement

Mon, April 26, 6:15 to 7:45am PDT, Zoom Room 121

Group Submission Type: Formal Panel Session

Proposal

While many countries have made great strides in educational access over the last few decades, learning levels remain low. According to research from the Brookings Institution, it will take over 100 years for learning levels in many low-income countries to reach those in high-income countries (Winthrop and McGivney 2015). In early 2020, when the coronavirus pandemic hit, schools across the globe closed, threatening to slow and potentially reverse progress toward making education accessible and effective for all (Slade et al. 2017).

In this global context, we face myriad challenges in the quest to provide high-quality education to the world's children. Tackling those challenges will require innovative and flexible education programs: we must constantly pilot, evaluate, and iterate solutions tailored to the unique set of challenges we face today. The global education community needs to set a new precedent for creative solutions to the learning crisis. Alongside those creative ideas, we will also need to adapt our research approaches to generate rigorous, meaningful evidence about which programs effectively improve student learning and skills.

Researchers and practitioners have joined forces to produce rigorous evidence to meet this demand. Impact evaluations such as randomized controlled trials have proven immensely useful in determining which programs effectively lead to learning and skill formation (Banerjee et al. 2017; Evans and Yuan 2019; Kremer et al. 2013). However, the established strategies that allow us to rigorously evaluate program effectiveness are not always aligned with the need to innovate rapidly enough to make sufficient improvements to learning (R4D 2019). Many tools in our research toolbox assess static program models and take months, if not years, to return results. While we wait for results, contexts and strategies can change drastically, making it difficult to base program decisions on rigorous evidence.

Governments and implementers need a more adaptive research toolkit to collect and analyze the data necessary to build programs based on evidence. This research approach requires new systems that allow implementers and policymakers to move iteratively through the stages of program development, from problem identification to solution design to piloting and experimentation. During the experimentation phase, the program must generate rigorous, routine evidence of effectiveness, based on both qualitative and quantitative assessments.

This panel will present the perspectives of four organizations working at the nexus of research and implementation in a rapidly changing world that demands ever-evolving solutions. The presenters will discuss their work pioneering new techniques in evaluation: adaptive learning and rapid impact assessments (RIAs). Each of the four organizations will share how it has used RIAs and adaptive learning systems in its work. Specifically: an evidence-based youth NGO that works with primary and secondary students will introduce case studies outlining the principles and how-tos for maximizing the benefits of rapid, internal RCTs; a secondary skill-development social enterprise will discuss the opportunity that school closures presented to build an RIA system and how it has identified short-term indicators to measure youth skill development; an education operator will present how it has used A/B testing to evaluate a range of instructional design interventions and school-based programs on both cost and effectiveness, providing a holistic analysis of potential program designs; and a research organization will illustrate its framework for adaptive learning, which includes experimenting with new approaches to virtual data collection in the COVID world.

After the presentations, the speakers will engage each other and the audience in a deep discussion of how to build a data-driven organizational culture, how to form effective partnerships for external research collaborations, the ways in which we conceptualize measurement and the implications of those understandings for our research, and how funders can create more effective programs by investing in the process of learning via internal impact assessments. Ultimately, we hope that participants leave the session with a deeper understanding of how measuring impact can improve learning, of the most effective and rapid ways to do so in different contexts, and of how we can partner to establish a precedent of continuous learning and rapid improvement in our work.
