Insights interview: how to respect data privacy as you grow your edtech
What’s the purpose of this interview and who’s it for?
What should you as an edtech leader be aware of in terms of best practices, existing laws, and emerging laws to appropriately respect the data-privacy rights of students? How do these considerations change for designing, testing, and improving your services, or sharing results? How do you respect data privacy as you grow your edtech?
Edtech promises to provide data-driven learning. Learning that’s personalized to the individual student. Teaching that’s formatively guided by rich data insights. Curricula and pedagogies that are guided by empirical findings. And, edtech that’s been engineered for outcomes and improved through respectful data-mining and formative impact research.
But we shouldn’t forget that the deeper we tap into the data-driven potential of edtech, the better informed we need to be about data privacy. And, the more rigorous our systems and processes need to be to avoid privacy risks and ensure students are well served.
Existing laws are often onerous for edtech companies. For example, the EU's General Data Protection Regulation (GDPR), in force since 2018, comprises 99 articles detailing the rights of individuals and the obligations of businesses. And these laws are often not fully understood by your customers (educators and institutions).
Given these challenges, I thought it very timely to interview a forward-looking legal expert in this space.
Insights from an expert
Elana Zeide is the PULSE Fellow in Artificial Intelligence, Law and Policy at the UCLA School of Law. She’ll be starting as an Assistant Professor at the University of Nebraska’s College of Law in the fall as part of their new Nebraska Governance and Technology Center.
Elana’s research focuses on new and emerging edtech and how these technologies impact privacy, equality, and access for learners. From many interesting discussions with Elana, what I’m so impressed by is the breadth of her curiosity and understanding—from evidence-based teaching and learning models, through what edtech does or could enable, to the legal and ethical considerations or implications. This is wonderfully encapsulated in Elana’s infographic (I want one of these too!). A great example of how Elana synthesizes edtech capabilities, teaching practice, and legal considerations into practical guidance is 19 Times Data Analysis Empowered Students and Schools. Although published in 2016, it remains an essential read.
PLEASE NOTE: this interview is not legal advice
First, Elana, thank you so much for making time for this! Second, context. I help edtech startups to grow and big education companies to digitally transform. A key driver is helping them to use learner data to power more effective products, more actionable insights for teachers, and product improvements. With that audience and those goals in mind, I think you could provide fantastic guidance and insights in four key areas: 1. understanding current best practices and law for handling student data, 2. ethical considerations and limitations for testing, 3. emerging, special considerations for using AI, and 4. opportunities and challenges for edtech in the future. Does that sound good?
Great. These are all important concerns that edtech developers should consider as they design, market, and implement their products. It’s better to take these into account from day one than try to address them after development. This is called values- or privacy-by-design.
The law on data privacy: facts and misconceptions
In your experience, what are the biggest misunderstandings of current legislation in how edtech companies should handle student data? What’s your simplest explanation of the law for edtech serving K12 students? HED students?
In the US, many edtech vendors say that they are “FERPA-compliant,” referring to the most prominent federal student privacy law, the Family Educational Rights and Privacy Act. The law requires schools at all levels to get consent or ensure an exception applies before they share personally identifiable information with third parties—including edtech companies providing academic and administrative tools. If a school has a policy or practice of not doing so, then it risks losing all federal funding. K-12 schools must get consent from parents, and higher education institutions from students. Schools typically share information with edtech vendors under the school official exception, which, among other things, requires that the disclosure promotes a legitimate educational interest.
Edtech vendors can’t be FERPA-compliant, because the law doesn’t apply to them. They can, however, design tools to allow students and parents to correct information, control third parties’ access to student data, and provide enough information about their privacy practices to allow schools to see that they would not be violating FERPA merely by using the software as provided. Terms of service that say “FERPA compliant” are a red flag that the company or its counsel don’t understand the basics of student privacy law.
For edtech companies aiming to serve multiple countries, what advice would you give them to ensure they fully understand and adhere to local laws?
The privacy legislation landscape is constantly changing. Recently, a European court invalidated the Privacy Shield framework governing EU–US data transfers. The key here is to have well-informed counsel. Companies should ideally engage an attorney who specializes in student privacy law—and children’s privacy if their products target learners younger than 13 years old—in the country in which they plan to deliver the service.
Getting started and current best practice for handling data
For an edtech company starting out, what advice would you give them on best practice of handling student data? What resources would you point them to depending on the age group and country they’re serving?
Again, the safest route is to hire an attorney or certified privacy professional. To get a general idea of best practices and the legal landscape, companies can consult a variety of resources from the government (for example, the U.S. Department of Education), non-profits (like the Future of Privacy Forum), or trade groups (like CoSN).
Ethical and data-privacy considerations for testing and improving edtech
A key opportunity for edtech leaders is to use data to understand how their users choose to use their product to guide how they can improve it. This can range from “anonymized data mining” to IRB-approved, on-the-ground impact research. What rules of the road would you advise for these two activities?
Anonymized data mining to detect trends is permitted under most privacy regimes. Edtech companies should be careful, however, to examine the impact on particular demographics as well as the larger cohort. Researchers have found that traditionally marginalized populations often fare less well, or have different false positive and negative rates, which can reproduce existing social inequalities.
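To make this subgroup check concrete, here’s a minimal sketch of comparing false positive and false negative rates across demographic groups. The data, group labels, and outcome definitions are all illustrative assumptions, not part of any regulatory requirement:

```python
from collections import defaultdict

def rates_by_group(records):
    """Compute false positive/negative rates per demographic group.

    Each record is (group, actual, predicted), where actual and
    predicted are booleans (e.g. "flagged as at-risk").
    """
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for group, actual, predicted in records:
        c = counts[group]
        if actual:
            c["pos"] += 1
            if not predicted:
                c["fn"] += 1  # missed a true positive
        else:
            c["neg"] += 1
            if predicted:
                c["fp"] += 1  # flagged a true negative
    return {
        g: {
            "fpr": c["fp"] / c["neg"] if c["neg"] else 0.0,
            "fnr": c["fn"] / c["pos"] if c["pos"] else 0.0,
        }
        for g, c in counts.items()
    }

# Hypothetical records: (group, actually_at_risk, flagged_at_risk)
records = [
    ("A", True, True), ("A", False, False), ("A", False, True),
    ("B", True, False), ("B", False, False), ("B", True, True),
]
print(rates_by_group(records))
```

If the rates diverge substantially between groups, that’s a signal to revisit the model or the data before drawing conclusions from the “overall” trend.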
IRB approval is the gold standard for academic research, but people have to be careful that they are still conducting research in an ethical manner—particularly when one experimental group is performing much better than another. Consider setting a threshold in advance that will trigger a re-examination of the experiment or application of the better condition for all cohorts.
Demonstrating edtech impact and what to avoid
Many companies want to demonstrate the value of their edtech by researching results customers achieve and using the results in marketing. What advice would you give them on what’s legally appropriate and what isn’t?
Companies using case studies about students in non-profit or public education in America (at all levels) must be careful not to share students’ personal information without consent from parents or, for older students, the students themselves—which includes information that might identify students to members of their community. For example, a result that reveals how the only Latina student in a class performed. The U.S. Department of Education has guidance on how to handle these small-cell outcomes.
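One common way to handle the small-cell problem is to suppress results for any subgroup below a minimum size before publishing. Here’s a minimal sketch; the threshold of 10 and the data shape are illustrative assumptions, so check the applicable guidance rather than treating this as a rule:

```python
MIN_CELL_SIZE = 10  # illustrative threshold, not official guidance

def suppress_small_cells(results):
    """Withhold outcome stats for subgroups smaller than the threshold.

    `results` maps a subgroup label to (n_students, pass_rate).
    """
    return {
        group: (n, rate) if n >= MIN_CELL_SIZE else "suppressed"
        for group, (n, rate) in results.items()
    }

# Hypothetical marketing results before review
results = {"All students": (120, 0.81), "Latina students": (1, 1.0)}
print(suppress_small_cells(results))
```

The point is procedural: build the suppression step into the reporting pipeline so a one-student cell can never reach a marketing deck by accident.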
Edtech vendors also have to be careful when offering software or other incentives to social media influencers. Anything that might be considered sponsorship must identify itself as an advertisement under the rules of the U.S. Federal Trade Commission.
Anonymization: a tractable or impossible goal?
What is a legally reasonable level of anonymization edtech data architects should aim for? As hackers get more sophisticated, do you think true anonymization will become an impossible (or too expensive) goal?
Anonymization used to be a simple way out of most privacy problems, but that’s no longer true. This is not only because of data breaches and hackers, but also because of day-to-day accidents by school and vendor employees that inadvertently disclose students’ personal information. The primary solution to this is encrypting as much as possible.
Beyond that, research has shown that it’s very easy to re-identify individuals with only a few pieces of information. The best way to handle that is with data minimization and retention limits. That is, to only collect and store what is necessary to provide services and to delete information that is no longer needed. People talk about data as the new oil. However, while it can lead to amazing insights, in the context of inadvertent disclosure or reidentification it’s more like nuclear waste.
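Data minimization and retention limits can be enforced in code as well as policy. Here’s a minimal sketch; the field names, the 365-day window, and the record shape are assumptions for illustration only:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative retention window
# Collect and keep only what the service actually needs (assumed fields)
NEEDED_FIELDS = {"student_id", "activity", "timestamp"}

def minimize(record):
    """Data minimization: drop fields not required to provide the service."""
    return {k: v for k, v in record.items() if k in NEEDED_FIELDS}

def purge_expired(records, now):
    """Retention limit: keep only records within the retention window."""
    return [r for r in records if now - r["timestamp"] <= RETENTION]

now = datetime(2020, 9, 1, tzinfo=timezone.utc)
records = [
    {"student_id": "s1", "activity": "quiz", "home_address": "…",
     "timestamp": datetime(2020, 8, 1, tzinfo=timezone.utc)},
    {"student_id": "s2", "activity": "quiz",
     "timestamp": datetime(2019, 1, 1, tzinfo=timezone.utc)},
]
kept = purge_expired([minimize(r) for r in records], now)
print(kept)  # the extraneous field and the expired record are gone
```

Running minimization at ingestion and purging on a schedule means a breach or accidental disclosure exposes far less than it otherwise would.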
Looking ahead: planning for emerging legislation
What do you think will be the biggest changes in legislation on handling student data in the next five years that edtech leaders should be anticipating?
Predicting privacy regulation is a tricky business and it’s hard to project too far in the future. I think applications that try to contact-trace students to manage Covid and the increasing incorporation of sensor devices in classrooms and on campuses may lead to more scrutiny of device security and concerns about the collection of biometric information. Europe’s General Data Protection Regulation, which went into effect two years ago, has had a tremendous impact on how U.S. companies collect and process personal data. California passed similar provisions in its new California Consumer Privacy Act and several states have begun to draft their own legislation based on its model.
Emerging, special considerations for AI-powered edtech
AI holds lots of promise for personalizing learning. Equally, many are concerned that it will have built-in biases due to how the algorithms are trained. Do you have advice for how to mitigate this risk?
This will be a key issue moving forward. Companies can take technological and procedural approaches to mitigate the risk of outcomes that disproportionately hurt marginalized populations. First, they should use high-quality and comprehensive data to train models. They should also review the distribution of outcomes for various subpopulations and tweak their algorithms as necessary to ensure fair outcomes.
Second, edtech vendors should be as transparent as possible about these processes and results. They should have procedures in place for students, parents, teachers, and administrators to request an explanation and contest unexpected outcomes. Straightforward communication, as well as an acknowledgement of potential shortcomings, goes a long way to cultivating trust.
Opportunities and challenges for edtech in the future
What do you see as three of the biggest data-driven opportunities for edtech leaders to improve teaching and learning and why?
My thoughts on this have shifted with the pivot to remote learning as a response to Covid. I still think personalized learning remains the holy grail for education applications. But there are also great opportunities to help teachers create better remote courses—including curating material and offering tools that allow for group projects or small-group exercises in a virtual environment. The same is true of ways to facilitate more experiential learning, both in classrooms and online.
What have I missed that you’d want to tell any edtech CEO?
So many people who go into edtech are on a mission to improve learning and expand opportunities for students. They are fighting the good fight. But as they develop, pitch, and promote their products, people sometimes forget that not everyone knows that they have good intentions or that they are (hopefully) being scrupulous and rigorous.
It’s important to remember that the students, teachers, parents, administrators, and communities you serve need information and evidence to trust you. They want to know that you’ve considered their point of view. So be open about the limitations of your products.
Finally, don’t push for growth and visibility at the expense of privacy or pedagogy. People are much less tolerant of the “move fast and break things” mentality than they used to be.
Thank you, Elana. This has been fantastic.
Thank you, Adam!
Elana provides several career-earned lessons on data privacy that really resonate with my own experiences building edtech. In particular:
- Architect privacy into your edtech: Engineer your edtech to take into account data privacy from day one—“privacy-by-design”. It’s much more expensive and time-consuming to re-architect your product for these requirements at a later date.
- Don’t say you’re FERPA compliant! It’s a red flag that you don’t understand student privacy law. Instead, design your edtech to allow users and stakeholders to correct student data and control how it is shared. And provide prospective institutions with enough information to be confident they can remain FERPA compliant while using your product.
- Start with local advice if you want to go global. Privacy laws are changing quickly and differ significantly by geography. Enlist a legal expert who specializes in student privacy law in the countries in which you want to sell.
- Embrace the awesome responsibility of delivering educational equity. Many products don’t deliver fair outcomes for all students. Anonymized data mining may give you directional insights. But, get IRB (Institutional Review Board) approval to partner with institutions to really understand if your product helps specific populations and how to improve it.
- Be transparent if you want to win your customers’ trust. You are probably passionate about your edtech because you want to help students and teachers to succeed. But, your customers don’t know that. Be proactively transparent with your market about your technology, processes, and results and honest about your claims and their limits.
- “Move fast and break things”—but not with students. Lean design and iterative development are powerful techniques for refining technology. But, edtech comes with special considerations. Remember that pedagogies are critical. That learning takes time. That you need to engage real students and teachers. Strive to improve outcomes and respect data privacy as you grow your edtech.
If you need help on how to respect data privacy as you grow your edtech with iterative testing, data mining, impact research, and marketing results (takeaways 4–6), please contact us today with the form below. If you need legal advice on takeaways 1–3, please contact Dr. Elana Zeide at email@example.com.