ExtensionEngine Blog

A/B Testing and Online Education

by omarkus

According to its public roadmap, edX will release A/B testing support sometime in 2014, a feature that will allow course teams and researchers to optimize their courses by measuring and analyzing usage data and learning about the effectiveness of their courses (learning about learning).

A/B testing has been widely used in web and mobile products over the last decade. The idea is to optimize the user experience or a certain performance metric by running an experiment: randomly serving multiple versions of the same page or functionality to different users and recording the outcome. Since the web lets us reach hundreds, thousands, and sometimes millions of users within days, it's possible to collect a dataset large enough for statistical analysis and to assess the performance of each version against the goal.
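To make the mechanics concrete, here is a minimal sketch of the two pieces an A/B test needs: deterministic random assignment of users to variants, and a statistical comparison of the outcomes. The helper names (`assign_variant`, `two_proportion_z_test`) and the seeding scheme are our own illustration, not part of any particular platform's API.

```python
import random
from math import sqrt, erf

def assign_variant(user_id, variants=("A", "B"), seed=42):
    """Deterministically assign a user to a variant.

    Seeding a private RNG with the user id means the same user
    always sees the same variant, without storing any state.
    (Hypothetical helper, for illustration only.)
    """
    rng = random.Random(f"{seed}:{user_id}")
    return rng.choice(variants)

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Returns (z, p_value). A small p_value suggests the observed
    difference between variants is unlikely to be pure chance.
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, if variant A converts 100 of 1000 users and variant B converts 150 of 1000, the test reports a z-score above 3 and a p-value well under 0.01, so we would conclude B genuinely outperforms A.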

A notable example of the power of A/B testing in the real world is the 2008 Obama campaign, which tested combinations of images, videos, and button text on its splash page; the winning combination raised the sign-up rate by roughly 40%, translating into millions of additional dollars raised.

So if companies can use A/B testing to raise money and maximize user engagement, why not use it for the nobler goal of improving people's lives, abilities, and skills? That's where A/B testing for online education comes into the picture.

With traditional in-class education, it's hard to experiment. The average course is taught once a semester to a few dozen students. The number of variables is large and the sample size is too small for a data-driven analysis of the effectiveness of a course.

But with online learning, we can test the effectiveness of the exact same teaching method on thousands of students in a very short period of time. One company already doing this today is Duolingo, a free web and mobile application for learning new languages (it won Apple's App of the Year in 2013). According to its founder, Luis von Ahn, the team didn't know much about teaching languages at the beginning, so they relied on books and experts. But once they had enough users, they started running experiments, which helped them improve their curriculum and find out, for example, when is the right time to teach plurals. Academic researchers run versions of A/B testing too, but they have to sift through reams of data months or years after the fact.

A/B testing for online learning opens the door to many interesting opportunities. Looking at usability data, we can very quickly learn if a lesson is interesting, or if a test is too hard. We can compare students' performance before and after the training and learn how fast people can learn. We can experiment with new and innovative teaching methods. We can even drill deeper and correlate that data with a bunch of other variables like the student's background, location, social and professional attributes.
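The before-and-after comparison mentioned above can be sketched as a paired test: each student supplies a pre-course and a post-course score, and we ask whether the average improvement is larger than chance would explain. The function name and the sample scores below are illustrative, not real course data.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_statistic(before, after):
    """Paired t-statistic for pre/post test scores.

    Works on the per-student differences, so each student acts
    as their own control. A large positive t suggests a real
    average improvement. (Illustrative helper, hypothetical data.)
    """
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))
```

With made-up scores like `before = [60, 55, 70, 65]` and `after = [75, 60, 80, 78]`, the statistic comes out above 4, which for a sample this size points to a genuine improvement; the p-value would then be read off a t-distribution with n - 1 degrees of freedom.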

All of this can teach us how humans learn and how we can improve the learning experience to make us smarter, happier, and more skilled. We're looking forward to getting our hands on the A/B testing features in Studio later this year.