A Clever Way to Measure How Students Actually Use Edtech (and Whether It Works)

By Tony Wan     Jan 17, 2018

If you buy it, you better use it. That especially holds true for K-12 school officials who altogether spend more than $8.3 billion on education software each year, according to estimates from the Software and Information Industry Association.

Yet it can be tedious to manually keep track of how students interact with different pieces of software—or whether these tools are even being used. For many districts, it takes time and a bit of Excel wizardry to download data from each software provider, then combine and distill the information into a single dashboard.
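The combine-and-distill step districts were doing by hand is, at its core, a merge of per-vendor usage exports. A minimal sketch of that idea in Python, assuming a hypothetical export format where each vendor's report is a CSV with `student_id` and `minutes` columns (real vendor exports vary widely):

```python
import csv
import io
from collections import defaultdict

# Hypothetical per-vendor exports. In practice these would be CSV files
# downloaded one at a time from each software provider's reporting site.
vendor_exports = {
    "math_app": "student_id,minutes\ns1,45\ns2,10\n",
    "reading_app": "student_id,minutes\ns1,30\ns3,60\n",
}

def combine_usage(exports):
    """Merge per-vendor usage CSVs into one {student: {vendor: minutes}} table."""
    usage = defaultdict(dict)
    for vendor, raw in exports.items():
        for row in csv.DictReader(io.StringIO(raw)):
            usage[row["student_id"]][vendor] = int(row["minutes"])
    return dict(usage)

combined = combine_usage(vendor_exports)
print(combined["s1"])  # one student's minutes across every tool
```

Even this toy version shows why the chore doesn't scale: every new vendor means another export format to reconcile before the district sees a single dashboard.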

“We had to go into different websites, find different usage reports, talk to vendors,” says Matthew Raimondi, an assessment and accountability coordinator at U-46, a school district in the northwestern suburbs of Chicago. “You run around to a bunch of different places to get information about whether students are actually making progress through the tools.”

The days of hacking together spreadsheets may be coming to an end, however. Since last fall, Raimondi and technologists at 19 other school districts have piloted a new service that automatically provides data on how students engage with different online education programs.

Dubbed “Clever Goals,” the tool lets educators, students and parents see how long a pupil spent on different digital learning programs, along with their “progress,” which is defined differently depending on the kind of edtech software. (It could be books read, quizzes finished, activities completed.) Teachers can also set weekly usage targets for individual students and track their progress against those goals.

School dashboard showing different product usage by class

Clever Goals is the latest product from Clever, a San Francisco-based company that aims to simplify how schools and students use educational software. Its first tool, an API that syncs a school’s rostering data with software providers, allows user accounts to be created and updated automatically. Then, the company created a single sign-on solution that lets students log in to different applications with one credential (so that they don’t have to remember different passwords).

Clever claims usage in more than 60,000 U.S. K-12 schools, giving the company a strong pulse on what software teachers and students are using. On top of that, more than 300 education applications are connected to its ecosystem. So far this year, roughly 20 percent of all U.S. students have logged into an app using Clever, claims Tyler Bosmeny, the company’s CEO.

The company already shares student login data with district administrators. But school leaders wanted more granular information about how each tool was being used by individual students. With Clever Goals, Bosmeny adds, “we’re trying to help schools go beyond the click, and look at actual student progress and actual student usage.”

In fall 2016, the company dispatched its chief product officer, Dan Carroll, to visit schools that were manually pulling such data. What he found were color-coded labyrinths of spreadsheets that often took several full days of work to cobble together.

After nearly a year of development, the company invited 20 districts to pilot the platform last fall. For Raimondi and his team at U-46, where eight different educational tools are used, it’s been a time saver. “Before, we would self-collect usage data and have people at the school level report how often the students were using it or not using it,” he tells EdSurge. “Now, we don’t have to figure out how to pull data from all these applications.”

Currently, 30 of the roughly 300 apps accessible on Clever’s platform are synced to Goals. The company has to work with each of those products’ development teams to pull students’ usage data.

Getting quicker and more reliable data on usage has helped teachers in his district intervene when students hit a roadblock in a program, says Raimondi. “In a computer lab it may look like all the kids are engaged if their screens are on the right applications. But you wouldn’t know if the kids are stuck, or whether they’re doing what they’re supposed to be doing.” If a teacher sees that a student is spending a lot of time but not completing activities, that’s a sign that a check-in is needed.

Student view of their progress

Administrators can also see an overview of product usage trends at the school level, and drill down to see how software is being used by individual classes and students. They can also create reports for parents to see their child’s usage of and progress through different edtech tools.

The launch of Clever Goals also marks a potential new revenue stream for the company, which until now has only charged companies for access to its API. “This will be the first time that Clever has a premium offering that schools can purchase if they want,” says Bosmeny. He kept mum on pricing details, other than that pricing will be based on the size of a district and its professional development needs.

Clever, which has raised more than $43 million from investors including Sequoia Capital and GSV since its start in 2012, did not disclose whether it is profitable yet. The company currently has approximately 120 employees.

Broader Implications

Getting a better glimpse into how much students actually use education software can offer clues to a mystery that has long dogged the edtech industry: Does technology actually help kids learn?

Like medicine, some edtech products recommend a certain level of usage before users can expect to see improvement. For example, DreamBox, an online math software, recommends that students in grades 6-8 use it for 60 to 90 minutes a week.

Yet inconsistencies in usage make it difficult to know whether a tool “worked” as intended. In a small study of 73 schools, LearnTrials, a company that helps K-12 districts purchase technology, found that 35 percent of student software licenses were never activated, and only 9 percent of students met usage goals.

University researchers have also been frustrated in their efforts to link edtech products to student outcomes. In their study of DreamBox, Harvard researchers found that “most students did not reach the recommended levels of usage.”

The data from Clever Goals can be valuable on multiple levels, says Alexandra Resch, a senior researcher at Mathematica Policy Research. First, it can show whether a school’s usage of software matches the levels recommended by the developer. The information could also be “potentially useful,” she adds, in establishing a causal relationship between product usage and student outcomes. “My brain is spinning with all the analyses we can do.”

How students’ usage data is used or shared is determined strictly by school district officials. “Schools will continue to own all student data in Clever,” says Bosmeny, and the company promises it will not “disclose personally identifiable information to anyone, including researchers, without explicit direction to do so by a school.”

Those possibilities excite people like Resch, and she hopes that teachers “have the time, training and capacity to make good use of the data” from Clever Goals. At the very least, she says, the data can help school leaders see whether the software licenses they paid for are actually being used. “If a district is paying for 20 products but five of them are not being used, it’s a pretty clear step to then ask: ‘Why not?’”

That information can—and should—hold district purchasers more accountable, says U-46’s Raimondi. With “an easier way to identify excess purchase or under-utilization [of software], there’s definitely potential for cost-saving and more efficient use of software funds.”
