2016 has been a busy year for data privacy advocates. University of California, Berkeley students filed suit against Google for illegally scanning their emails. Pokemon Go launched its wildly popular location-based and data-hungry augmented reality game. And at least 14 states have passed student data privacy laws so far.
Never ones to pass up an opportunity to dig into topics that spark controversy and spotlight opportunity, EdSurge sat down with education technology lawyer Gretchen Shipley at our recent California Tech for Schools Summit. Shipley is a partner at Fagen Friedman & Fulfrost LLP—F3, as it’s popularly known—a law firm representing more than 400 California educational institutions. She co-chairs F3’s eMatters Practice Group, advising clients on the legal ramifications of implementing technology in California’s public schools.
We asked Shipley how schools can adopt new technology without running afoul of the many state and federal privacy laws. While California-specific, her insights lay out the challenges for school districts and edtech companies throughout the country.
EdSurge: Is California the strictest state in the nation when it comes to public schools and data privacy?
Shipley: One of the strictest, which is a good thing when we’re talking about keeping children safe. In one specific statute, for example, there are nine elements that need to be in any contract between a school district and a software vendor if the vendor is using pupil information. And with all of this new individualized instruction, vendors are using pupil data to deliver it.
Our job is not to say, “No, you can’t use this software.” It’s to ask, “How can we work with this software company?” We try to figure out whether we can give a school district the tools to work with a software company to either modify their agreement or, in some instances, modify their product, so that it’s legally compliant in California.
Is there still a big gulf between the [California] schools’ understanding of data privacy and full compliance with the law?
Yes. Many school districts don’t know every time a teacher is downloading an app for all of the devices in the classroom. That’s one area where we’ve put together free resources, like the Data Privacy Guide. We also made a short, five-minute video for professional development, called Ask Before You App, [to help] teachers know what to look for before they download an app in the classroom. It’s things like: Is there a social media component? Because there’s a law prohibiting that. Are they advertising to children? There’s a law prohibiting that. Are they collecting geolocation data for kids under 13? There’s a law governing that.
If they have any questions or concerns, they should probably be running it up the flagpole. Because . . . you don’t want to be a superintendent or a school board member where you have parents, community members, privacy advocates who are going to challenge the district or question the safety of their students and their privacy and all of their data.
One challenge I’ve run into recently is at the CTO and Director of Technology level. They really understand and they’re really doing everything they can. But I think they need the resources, like professional development time or the ability to work with legal counsel to vet contracts. So sometimes these issues also need to be more at the forefront for cabinet-level staff, superintendents, and school board members.
On the other side, there are the companies that sell to schools. How can companies increase awareness of privacy policies?
In my experience, vendors—both extremely large and extremely small—struggle the most with compliance. How many people know the education code inside and out? There are over 100,000 education code statutes.
Often there is a vendor salesperson who isn’t always communicating with the engineering side of the house, and engineers who aren’t communicating adequately with their lawyer. The person who engineers the product may not be able to actually build it in a way that complies with the data privacy protocol, and then the salesperson might tell you a third thing. That is the trifecta we run into when working with vendors.
School districts should not have to be working so hard to make sure that what they’re buying is a safe product. All of us want to make sure that there are safe products out there for kids and that their data is protected. We don’t need to be adversaries; we all want to work together. But right now, I feel like the school districts themselves are doing the lion’s share of the work.
Whether it’s some sort of labeling or standard, I do think more pressure needs to be put back on the industry.
A hot thing right now is augmented reality; apps like Pokemon Go seem to be everywhere. What are the privacy implications?
Pokemon Go operates off of geolocation data. Some school districts are trying to be innovative and think “Is there any way we can incorporate this into curriculum?” And I say, “Yes, you can, so long as you get that written parental consent.”
What about cameras and Skype for Classrooms?
Skype for Classrooms is interesting, because in a classroom you’re going to have other students who are raising their hands or taking tests. Is their pupil record, so to speak, being protected? It’s not just the individual student at home who is Skyping into the classroom.
So you usually need to get some sort of sign-off. Or can you configure it in a way where you’re not getting a view of every other kid in the classroom?
It seems like the law is catching up with school practices and maybe the two might clash.
Well, in some areas, law hasn’t caught up with the technology. And in some areas, the law is overly aggressive and too restrictive.
We want innovation; individualized learning is a wonderful thing. But it relies on student data. How do we balance that and still provide a safe environment for kids?