How Your Company Must Co-opt the Business Model Canvas


When edtech entrepreneurs hustle and undertake a fruitful customer discovery process, product pilots are easy to come by. Scaling, however, can be tough going. In five years of advising edtech startups at the Toronto-based incubator MaRS Discovery District, we have found that ventures that plan their pilots carefully for maximum impact have a better chance of scaling their businesses down the road. Education entrepreneurs, especially, are under pressure to chart not only their business models, but also their learning models.

One of the tools we use with startups, especially if their founders are educators, is Swiss business theorist Alexander Osterwalder’s Business Model Canvas. The simple framework of nine boxes makes it easier for entrepreneurs to diagram their business models by mapping out the components needed for a successful pilot and the metrics that might show any associated learning gains.

Using this framework as inspiration, we have developed the Learning Model Canvas as a way for education startups to map out the logic of their learning models. It can take years and millions of dollars of randomized controlled trials to show true cause and effect for an educational product. But entrepreneurs can use tools like the learning canvas to organize their thoughts and identify leading indicators that learning improvements might be taking place.

Box 1: The heart of the learning canvas (analogous to Osterwalder’s value proposition) is the learning proposition: a hypothesis about what learning gains teachers should see if they use the product. It pays to be as specific as possible. “Getting better at math” is too vague; “a 20% improvement on a fractions unit test” is helpful.

Box 2: Who are the end users (i.e., the students) experiencing this gain? Describe the students in as much detail as possible, starting with age, grade, and subject and, if relevant, adding information such as gender, socioeconomic status, and any special needs they might have.

Box 3: Next, entrepreneurs should consider the technology needed to properly test the product. Remember, technology can be high-tech (iPad, microscope, SMART Board) or low-tech (handout, test tube, volleyball).

Box 4: Another logistical concern is the product use schedule. Is the product designed to be used on demand, or is it best used for a couple of hours each week during social studies? Many entrepreneurs themselves haven’t thought about optimal schedules; usage ends up being driven by when and even if teachers can find free time.

Learning is very context-sensitive, and these first four boxes are variables that can be tweaked. They may also be changed to test for different conditions in order to learn how teachers and students best interact with the product. For example, the makers of the learn-to-read game Ooka Island found that classes that used the product 20 minutes per day performed better on reading assessments than classes that binged on the game in longer, less frequent stretches. And although the game was designed for K-1 students, it also worked for ESL students in Grade 4 or 5.

Boxes 1-4 on the canvas represent the front end of a learning model: the parts of a pilot that are visible to teachers or administrators. The next section represents the back end of a learning model: the behind-the-scenes structure that’s set up ahead of time to allow a pilot to run smoothly.

Box 5: Here, identify two or three metrics that can be measured during the course of the pilot. A simple way to do this is to poll the students before the pilot and then again afterwards. Grades, attendance, and qualitative observations are all valid data when used in the right way. The most important thing here is to choose metrics that the school or district understands, and can use to make decisions about scale.
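As a minimal sketch of the pre/post approach described above (the function name and sample scores are hypothetical, and it assumes paired scores for the same students in the same order), a Box 1-style metric such as percent improvement on a unit test could be computed like this:

```python
def percent_gain(pre_scores, post_scores):
    """Average percent improvement from pre-pilot to post-pilot scores.

    Assumes paired, same-order scores and nonzero pre-pilot scores.
    """
    gains = [(post - pre) / pre * 100
             for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical fractions unit test scores (out of 100) for five students
pre = [50, 60, 70, 40, 55]
post = [62, 70, 84, 50, 66]
print(round(percent_gain(pre, post), 1))  # prints 21.1
```

A result like 21.1% would support a “20% improvement” learning proposition, though as the article notes, a small pilot yields leading indicators, not proof of cause and effect.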

Box 6: Your learning proposition should be supported by educational research and theory. Get started with thought leader John Hattie’s Visible Learning scale for a simple list of which teaching practices actually lead to learning gains. Also check out the federal Institute of Education Sciences’ What Works Clearinghouse for information on a wide range of programs, products, and policies.

Box 7: Entrepreneurs often underestimate the number of partners they need to make sure their pilots succeed. It’s important to consider the people required to support the project, including parents, the IT department, union reps, caretakers, secretaries, teachers, department heads, student councils, etc.

At MaRS, we like to use the Learning Model Canvas in a lean fashion, by identifying assumptions in the canvas and fashioning rapid tests to validate them. Like the resources in entrepreneur Eric Ries’s lean startup toolkit, the Learning Model Canvas should be used iteratively. Every week during the pilot, entrepreneurs can take note of what has changed from their initial assumptions.

The canvas can also be used to help prepare a school for a pilot. “I use the Learning Canvas early on when I’m on the phone with a school,” says Meaghan Daly, founder of Forward Vision Games, a financial strategy game for aboriginal students. “When I’m talking with a pilot site, it helps me ask questions to make sure they are ready to implement,” she says. “It makes sure I know how they plan to use the game, and how they plan to judge if it fits their needs. Often they haven't thought about access to computers or how they’ll measure a successful pilot.”

Box 8: At the end of a pilot, startups should be able to confirm or reject the hypothesis that learning gains have taken place. In practice, the results will likely fall somewhere in between. The goal is to identify the early indicators demonstrating that learning is taking place, so that entrepreneurs can tweak their learning propositions.

Box 9: What went wrong? Here entrepreneurs can document the learning obstacles students faced when interacting with the product. These can include lost passwords, unclear instructions, and frustrations students had with a new learning resource.
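Taken together, the nine boxes amount to a simple structured record that a team can fill in before a pilot and update as it runs. As a rough sketch (the class and field names here are ours, not official canvas terminology), the canvas could be kept in code like this:

```python
from dataclasses import dataclass, field

@dataclass
class LearningModelCanvas:
    """Illustrative record of the nine boxes; names are not canonical."""
    # Front end (Boxes 1-4): the parts of a pilot visible to teachers
    learning_proposition: str   # Box 1: specific, measurable hypothesis
    end_users: str              # Box 2: students, described in detail
    technology: list            # Box 3: high- or low-tech tools needed
    use_schedule: str           # Box 4: when and how often it is used
    # Back end (Boxes 5-7): set up ahead of time
    metrics: list               # Box 5: two or three measurable indicators
    research_basis: str         # Box 6: supporting theory and evidence
    partners: list              # Box 7: people needed to support the pilot
    # Outcomes (Boxes 8-9): filled in as the pilot runs
    leading_indicators: list = field(default_factory=list)  # Box 8
    learning_obstacles: list = field(default_factory=list)  # Box 9

# Hypothetical pilot plan
canvas = LearningModelCanvas(
    learning_proposition="20% improvement on fractions unit test",
    end_users="Grade 4 math students",
    technology=["iPad", "printed worksheets"],
    use_schedule="20 minutes per day",
    metrics=["pre/post unit test", "attendance"],
    research_basis="Hattie's Visible Learning (feedback, deliberate practice)",
    partners=["teachers", "IT department", "parents"],
)
canvas.learning_obstacles.append("lost passwords")
```

Keeping the outcome boxes empty at the start mirrors how the canvas is used in practice: Boxes 1-7 are planned up front, while Boxes 8 and 9 accumulate observations week by week during the pilot.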

Following a pilot, entrepreneurs can use the nine-box framework to issue a clear summary of what did and did not work. Long, boring reports rarely get read. One-page infographics that highlight successes, however, can help administrators sell a product internally.

Try mapping your own learning model using the blank Canvas below; let us know what you think.

Joseph Wilson is a Senior Strategist in the Education Technology Cluster at the MaRS Discovery District in Toronto, Ontario.
