Ed Tech Was a Godsend During Pandemic, But It May Have Opened a Pandora’s Box of Data Privacy and Security Issues, Says CSUN Prof

Educational technology, while a godsend during the pandemic, may have left behind "cookies" that could ultimately build a profile of a child's behavior — a profile that could inadvertently shape educational opportunities and career choices, according to a new CSUN study. Photo by VioletaStoimenova, iStock.

Educational technology appeared to be a godsend during the COVID-19 pandemic. It helped keep students on track academically, while at the same time providing teachers and parents ways to track a child’s progress.

But it may have also left behind "cookies" — small data files placed on a user's device by software or app developers to collect information about the user, ostensibly to improve the experience. These cookies could ultimately build a profile of a child's behavior as they continue to use educational technology throughout their academic career, a legacy that could inadvertently shape educational opportunities and career choices.

Kristen Walker

“When the pandemic hit, educators from kindergarten to college were looking for innovative solutions to ensure that students could continue to get access to learning materials, and many of those solutions involved educational technology,” said California State University, Northridge marketing professor Kristen Walker, an expert on technology and data privacy in the David Nazarian College of Business and Economics. “The speed at which those solutions had to occur — the speed at which educators all over began to rely on educational technology — opened a Pandora’s box that could lead to long-term repercussions for students and society.”

Walker and a team of CSUN researchers explored the risks and vulnerability of K-12 students’ personal data in a study, “Compulsory technology adoption and adaptation in education: A looming student privacy problem,” published recently in the Journal of Consumer Affairs. Her co-authors include Kiya Bodendorf, a graduate student in the Michael D. Eisner College of Education; fellow marketing professor Tina Kiesler; Georgie de Mattos, a graduate student in the College of Social and Behavioral Sciences; Mark Rostom, an MBA student in the Nazarian College; and Amr Elkordy, an electrical and computer engineering undergraduate student in the College of Engineering and Computer Science.

The researchers examined the adoption and use of educational technology by public schools in California, in part because the state has some of the nation’s strictest laws and policies when it comes to data privacy.

What they found was confusion among school district employees about the use of technology in schools. Information technology staff are focused on securing student data, while educational technology professionals (teachers and administrators) are focused on using technology to enhance learning. The research team found this confusion created a "privacy-security chasm" in schools that underscored "the growing need to understand student privacy protections as part of children's digital well-being," Walker said.

School districts throughout California have policies to protect their information technology and data from security breaches and hackers. But few, if any, have clear policies to ensure that manufacturers of educational technology do not collect or track the data of students using their software or apps.

Educational institutions have designated information technology (IT) teams responsible for procuring and securing devices such as laptops and servers. But when it comes to educational technology, those same institutions leave those responsibilities to teachers or school administrators.

“That’s where the privacy-security chasm comes in,” Walker said. “The IT folks are focused on cybersecurity, and the teachers and administrators are focused on teaching and learning technology for those specific purposes.

“The teachers on the ed tech side aren’t necessarily focused on cybersecurity or privacy,” she continued. “They’re focused on the teaching experience, and rightly so. The vendors assure them that they only share and collect student data to help develop better teaching and learning software tools and apps. But they don’t mention that they also share the data with their partners and affiliates, and not necessarily for the purpose of developing new educational technology. They’re just sharing it.”

That sharing can become particularly insidious when a vendor is a subsidiary of an international technology giant with several other subsidiaries with which the data could be shared, Walker said.

“The data a vendor collects on students could be shared all the way up the food chain, and parents and teachers would not know what was shared or how it was used,” she said. “Data about a student, initially collected when they started using EdTech in kindergarten or first grade, could be added to and built on as they continue to use educational technology throughout their academic journey — through middle school, high school and even college.

“This raises several red flags,” Walker said, noting that students often don’t confine their use of school-recommended learning apps and software to school-issued devices.

“Instead, they’ll use mom or dad’s laptop because it’s faster or they’ll use their personal cell phone because it’s easier and, whatever device they are on, the EdTech is going to leave behind cookies and collect data,” she said.

Since students and parents are told that whatever EdTech they are using is educationally necessary, they are more inclined to accept the cookies that pop up when they launch the program, Walker said.

“This is part of the learning process, so they assume someone else is protecting their privacy,” she said. “As students continue to do this during their learning experience, they adapt to the idea that someone else is going to protect their privacy, and that’s not happening.”

Walker pointed out that the data collection takes place during a critical window of time in a student’s life and can be used to predict future academic potential.

“Perhaps the student just learned a parent has cancer. Their community has just suffered a natural disaster. There were riots in their city,” she said. “All of these could impact a student’s performance, but the algorithm isn’t capable of taking these very personal experiences into account, the way a human teacher can, in building a profile of that student. But these ‘blips’ remain in that student’s digital profile and could be used, down the road, to influence college choices and even career paths. The algorithm doesn’t allow for individual whims or exploration. Your ability to learn at your own pace and with your own agency is going to be impacted.”

Walker said the move to virtual learning during the height of the pandemic led to a deluge of educational technology in the classroom.

“Educators had to do what they could to ensure their students continued to learn,” she said. “We need to take a moment now that the pressure of the pandemic has subsided and ask: Who is watching out for our children’s privacy, and what are we going to do to ensure that our children and their privacy are protected?”