Questioning the Numbers Behind Udacity's SJSU Experiment


Sep 3, 2013

Is it still apple juice if you're using oranges? The metaphor may come in handy when taking a closer look at the latest results from Udacity's ongoing partnership with San Jose State University to provide for-credit online classes.

Last week, Udacity released an "update" with results from its summer pilot. In contrast to the hotly criticized results from the spring pilot, in which a majority of students failed to pass entry-level courses like Entry Level Math and College Algebra, the summer results showed a substantial improvement; in some cases the passing rate jumped from 25% to 73%.

These results appear to redeem Udacity's standing; outlets like TechCrunch noted that "the failure was premature." But a deeper dive by Phil Hill of e-Literate suggests there's more behind the numbers. He noted that the student populations in the spring and summer classes were "completely different, to the point where other comparisons, such as passing rates or completion rates, should not be made." For instance, 50% of the students in the spring class were high school students, while only 15% of the summer class were high school students and 53% already held a college degree.

TechCrunch reported that the company will be releasing "more sophisticated statistical analysis of the courses." Perhaps those numbers will better illuminate whether the difference between the two classes reflects a real improvement or merely a shift in demographics.
