Mar 9, 2013
Earlier in the week Reuters published a story about a $100 million education database being built by inBloom, a non-profit funded by the Gates Foundation and the Carnegie Corporation. The database will house data on children across the country, and it has the potential to help educators, researchers, policy makers, and yes, for-profit businesses. For the most part, the story is couched in concerns about privacy risks and the misuse of data. Yet despite the alarmist tone, there are no specifics that go beyond boilerplate quotations from fearful parents. Alexander Russo asks for more:
Are there problems with state databases being hacked and releasing sensitive student data? Tell us about them. How do these issues compare to data security problems in general? How many states include Social Security data in their student records, and how does this compare with other public agency databases, which have their own Social Security problems? Once again, some context and comparative data would be more helpful than isolated data points suggestively linked together to convey fear.
The Hechinger Report also published a short piece on the database. Class Size Matters head honcho Leonie Haimson was tasked with making the case against the database, and her logic borders on incoherent:
“There are no limitations on the time-frame, or the kind of data. There’s no provision for parental consent or opt-out. The point is to give our kids’ data away for free, and share it as widely as possible with for-profit ventures to help them market and develop their learning products,” she says.
You may be asking yourself, “Is her argument really that we should stop the construction of data systems that could help millions of kids because we’re afraid it might make it easier for companies to create better educational products?” To answer your question, yes, it is. And it’s nothing more than mood affiliation. For-profit companies are bad, and therefore anything with the potential to help them must also be bad. In reality, there’s no reason this should be the case. Over the last few years I’ve requested and received “confidential data” from a number of schools and departments of education. It has always been for the purpose of doing research for a non-profit organization, but it’s conceivable that a for-profit company could use the same data to improve a product, and in that case it’s likely their use of the data would have a much bigger positive impact on American education than my use of it. So as long as the data is protected, why should I have access to it but not them? Perhaps Haimson is worried about these terrible things they might do (via the Reuters article):
CompassLearning will join two dozen technology companies at this week’s SXSWedu conference in demonstrating how they might mine the database to create custom products – educational games for students, lesson plans for teachers, progress reports for principals.
Progress reports? The horror! The end of Haimson’s remarks is particularly telling.
“For-profit vendors are slavering right now at the prospect of being able to get their hands on this info. and market billions of dollars’ worth of so-called solutions to our schools.”
What happens when “so-called solutions” are marketed to schools? Somebody at the school makes a decision about whether to buy them. There’s not some magic machine companies have where they input student data and a mind-control wave forces people to start buying their products. What happened to letting principals decide what’s best for their schools?
None of this is to say that privacy isn’t a legitimate concern, but there’s nothing new or exceptional here. At this moment thousands of school and government employees have access to confidential student data, and that data is protected by the same types of laws that will protect the data in the inBloom database.
Last night on Twitter Josh Barro said something insightful about Rand Paul’s filibuster, but I think it applies here too:
I spend a lot of time with large sets of education data (both public and confidential), so I’m admittedly biased toward believing the slope from “useful data sharing” to “gross violations of privacy” is not that slippery. Nevertheless, I think it’s important to investigate privacy issues and the weaknesses in data protection systems. That’s what keeps the slope unslippery. The problem with the arguments in the articles above is that they mention broad, hypothetical, worst-case scenarios instead of identifying specific problems. You can’t just yell “the slope is slippery”; you need to explain where and why the slope is slippery. Until critics come up with better reasons than “It makes me nervous” or “It might help a business be successful,” there’s no reason to believe the benefits of inBloom’s project won’t outweigh the costs.
I also want to make it clear that I’m not glossing over parent concerns about consent. The reason I didn’t spend much time addressing them is that the debate over the database essentially boils down to the moral ambiguity of benevolent paternalism. My view is that the new database creates no additional risk, because the same data is being entrusted to the same people who have the same intentions as always, and therefore collecting the data will, on net, help kids. That’s my moral reason for supporting this kind of data collection.
Parents also object on moral grounds. They hate the database because something they didn’t explicitly agree to is happening and they feel violated. And that’s a perfectly legitimate reason for not wanting data to be collected. When it comes to the database there’s not one side that has the moral high ground. It’s like Bloomberg outlawing 20 oz drinks. You can’t say that either side was objectively “right.” One side fought for public health and the other for personal freedom.
The inBloom database has the same moral ambiguity. One side wants to improve schools and the other side doesn’t want data about their family taken from them. My complaint is that even if you have the moral defense of “it’s wrong,” you should still have a practical defense, particularly if the other side can also claim morality is on their side. For example, even if you think the Constitution gives you the right to buy a gigantic soda, you should still try to explain the specific downsides of a 16 oz limit. Hating the database because it’s unjust is perfectly logical, but the argument against the database would be stronger if there were specifics about new dangers.