What It’s Like Navigating the Strictest Student Privacy Law in the Country

EdSurge Podcast

By Emily Tate     Jun 18, 2019

We often hear how important it is to protect the privacy of student data. But the consequences can seem pretty abstract.

When kids and adults alike are downloading a new app, or creating an online account, they tend to click “I agree” and “Allow” without a second thought. After all, they ask themselves, what’s the worst that could happen?

In Louisiana, educators don’t have that luxury. According to a student data privacy law passed there a few years ago, anyone who collects or shares students’ personally identifiable information (or PII, as the shorthand goes) can be punished by up to six months in prison or $10,000 in fines.

That certainly raised the stakes on an issue that, until then, many people hadn’t been taking very seriously.

But it also led to an environment where educators, schools and districts became so afraid of breaking the law that they curtailed their collection and use of all kinds of data, from batting averages and touchdown stats at school sporting events to Students of the Month and Honor Rolls hanging in the hallways.

When the possibility of a mistake meant steep fines and prison time, many teachers, naturally, decided it wasn’t worth the risk.

Then Kim Nesmith got involved.

Nesmith is the director of data governance, privacy, and edtech for the Louisiana Department of Education, and she recently sat down with EdSurge to talk about the strictest student data privacy law in the country—and what it takes to help Louisiana educators face their fears and offer technology services to students and families in spite of that law (and with a healthy awareness of privacy).

Striking that balance isn’t always easy, and it requires some real creativity. But one good thing that’s come from the law, Nesmith says, is that in Louisiana, there’s no mistaking the importance of student data privacy—not anymore.

Listen to the discussion on this week’s EdSurge On Air podcast. You can follow the podcast on the Apple Podcasts app, Spotify, Stitcher, Google Play Music or wherever you listen. Or read a portion of the interview below, lightly edited for clarity.

EdSurge: Anybody working in student privacy today has a difficult role, but yours, at the Louisiana Department of Education, is especially challenging—thanks to the privacy law that was passed there in 2014. Can you tell us about that law and what makes it somewhat unique?

Nesmith: Sure. We are not allowed to collect personally identifiable information at the department. So that makes the job of the department incredibly difficult because you don't have that PII. Additionally, our law, and I'm fairly sure of this, is the only law with criminal penalties for violations. So six months in jail or $10,000, and they're personal penalties, too, so a teacher is at risk. I'm at risk. Anybody that might release the data in a way that is not aligned with the law could be prosecuted.

How has that made your job difficult?

It's tough because it's scary, right? So people are afraid, and I hate that. But the good side of it is, it's caused people to be aware of student privacy. I'm not out there trying to convince people how important student privacy is anymore; the law has made it important.

But the difficult side of it is people are afraid, and people are scared, and it's a challenging situation. The consequences are what makes it scary. The thing that makes it most challenging, I think, is the fact that the department isn't able to have PII. Because, when you think about all the things that you do for districts, it is strictly, totally limited.

So to the point of putting children's names on their test booklets prior to the kid taking the test—that's an incredibly difficult job for us now because we aren't allowed to have PII. So, how do you make sure that Johnny gets Johnny's test, and then Johnny gets Johnny's score, and then Johnny's teacher, you know, for value-added, is associated with him?

Is there a lot of conversation or pushback in the state about it being so strict that it kind of ties educators’ hands and what they are able to do in ways that can be beneficial to students?

There are people that feel that way, and I think that it is challenging for educators. But I think the reason the law came about is because parents had a real fear of what was happening with their children's data. And it was a very real fear for them. They were afraid of what researchers might be doing with the kids' data. They were afraid of where we might share the data. And so they had this very real fear.

So you have to balance the two. You have to figure out how you can allow teachers to do what they do best, and be enabled and accelerate the work with technology, and at the same time work within the law. And so, that's the challenging part, is to find that framework, and to get everybody educated and come up with the structure that allows us to go ahead and move forward and still do our work, while also keeping an eye on—and making sure we are taking care of—kids' data.

I imagine you are working with schools and districts pretty closely on this. How do you help them navigate the law and not be afraid of technology? I can see it being like an either/or—either we follow the law or we take this big risk and introduce technology that, in a lot of cases, is going to collect some student data.

In the beginning, I think people definitely felt like it was an either/or and kind of steered clear of it. But as we went forward, what we did was we worked out ways to make it happen. The first thing we did was, I developed a data governance and privacy guidebook, and that outlined how you develop a team, how you can get people on board, here are some things about security, and it outlined all the laws. Our teachers aren't necessarily aware of the federal laws, even, that guard students' data. I felt like it was a really unfair place for them to be in. So the guidebook that we developed was about helping them understand, here are the different laws, here are the things that are applicable to you, and here's the way to keep data secure and private and protect kids.

But then, what we did after that, after we developed the book is, Louisiana is blessed with this structure of training districts. And so every quarter, department people go out and go to the different regions of the state and actually train. So there's this great forum that was already existing that I was able to just step into and leverage. So we went through and we trained on that guidebook. We also went through there, and one of the things that you're supposed to do is develop policies. So we talked through with districts and let them talk with each other, “OK, what kind of policy can we put in place that's going to protect teachers but also show them the path to be able to use technology?” And so letting those district leaders talk it out and come up with good solutions on their own, and work through that, I think that was probably the most valuable thing that we did.

So we came up with different ways to be able to utilize the technology, but also not necessarily give students' information. So we either are going to have a data-sharing agreement with that company on how they're going to protect kids' information and that way the district can share that information if they need to. Or we could possibly get parental consent to use a piece of technology if we think it's safe. And then the third option we came up with was somehow de-identify the kids' information. So they pick their favorite superhero, or they use their initials, or they use something that identifies them as them in the software, but does not give away any of their personally identifiable information. So those were the three ways we chose to handle using technology in the classroom but also protecting kids' privacy.

And what would determine which of those three options a classroom might pursue? Is that up to the teacher’s discretion?

In the data governance guidebook, one of the things we recommended was you've gotta have an online policy for your teachers. You've gotta have an online-tools policy. You have to be able to guide that teacher through it. They are going to use technology. And so the best method for a district is to figure out how they can help that teacher do it. And so I strongly encouraged they didn't prohibit anything, but that they set the tone of come talk to me about it and we'll figure it out.

So in some districts, they set up this online policy where maybe, when something's purchased, it has to come by the desk of the person responsible for privacy, and they have to see it. Or maybe they did trainings with the teachers and they said, “Here are all the software tools that we have picked out, that we have agreements with, that we feel are safe. So if you want something that does quizzing, here's the software that gives you a quiz. If you want something that allows collaboration, here's one that we have approved.”

Would you say the companies that have come on board with Louisiana districts have had to be pretty flexible and nimble in what they are able to accommodate, or is it not that big of a lift on their end?

To be honest, when the department is providing a service to districts, what we did is we developed a data-sharing agreement with that entity, and we established a third-party stipulation in the agreement which allowed districts to opt into that agreement unilaterally by signing one little piece of paper. And so for those I negotiated those [agreements], and worked with some district leaders that also kind of wanted to have input, and we found compromises. But it was challenging. Because some of the things that some people in the state wanted were really, really restrictive, and some of the companies just couldn't agree to those things, and so we had to negotiate and have some back and forth.

Has the law been enforced?

No one has been put in jail or fined. I’m very thankful for that because I consider that one of my jobs, to help people out and to make sure that doesn't happen.

I am not a lawyer, but I think it would have to be a district attorney wanting to take that on, probably.

Talk about the upside of the law. Do you see educators in Louisiana being more thoughtful about this, than maybe those in other states?

Absolutely. I think that's one of the benefits of the law. So there's this part that makes it hard and challenging, but there's this other side of it. I was doing data governance and privacy before the law came about, and it was a struggle to get anyone to care. People didn't even know what FERPA was; they didn't care. I mean, they care about students, but they didn't understand. This movement that has come across the nation has truly made a difference, I think, for everyone.

The last several years have caused us all to be more cognizant of keeping our information private and protecting ourselves from identity theft, and, you know, cybersecurity attacks are on the increase for educational institutions. Last year, the U.S. Department of Education even put out warnings and information. Kids' information is very valuable because they don't have a credit history yet. They are like this blank slate, and it's going to be a long time before any theft is ever found out, because they are young and they aren't even going to be entering this credit world until later. So there's this period of time, and from everything I've read, it's very valuable.

So I think that this law has helped us all become very, very aware, and has caused us to think about something. Do I think that maybe it was a little too strict in certain areas or challenging to work with? Absolutely. But do I think the benefit of protecting kids’ data is there? Absolutely.
