ChatGPT’s Impact on Education and Student Data Privacy
Data privacy professionals have characterized the data privacy risks associated with ChatGPT as a “nightmare.” To function, generative artificial intelligence programs like ChatGPT, developed by OpenAI, require huge amounts of data to learn and evolve. Where do the developers get that data? From you and me, without our knowledge or permission. The developers of ChatGPT used 300 billion words systematically collected from the internet, including from books, websites, articles, and online posts, to train the model behind ChatGPT. As you use ChatGPT, it also saves your questions and uses that information to learn and evolve. The problem is that, in collecting the data used to build ChatGPT, personal information was necessarily gathered without permission or compensation. Even though this personal information may be publicly available, using it in this way may breach what is called contextual integrity, the principle that individuals’ information should not be revealed outside the context in which it was originally created. There is also no way for an individual to determine whether ChatGPT holds that individual’s personal information as part of its training data.
With each advancement in educational technology, especially technology that is used in an educational setting but was not designed exclusively for educational purposes, the issue of student privacy necessarily comes to the forefront. Student privacy is protected by the Family Educational Rights and Privacy Act (“FERPA”). Under FERPA, educational institutions that receive federal funding have a legal duty to protect students’ personally identifiable information (“PII”). Because the U.S. does not have a universal privacy law, educational institutions must ensure compliance with both FERPA and the privacy laws of the state or states in which they operate. Because there is no way to know what information was used to train generative AI programs like ChatGPT, and therefore no way to ask developers to delete personal information they may hold, these programs may violate state privacy laws that provide a right to delete personal information, such as the California Privacy Rights Act and the Virginia Consumer Data Protection Act.
Education experts have repeatedly warned that the increased use of technology tools and apps in the classroom puts students’ data at risk. With the rise of ChatGPT, that risk has only increased. As discussed above, there are inherent privacy risks associated with the use of ChatGPT. It was recently discovered that a majority of educational technology companies use tracking technologies and share students’ personal information with third parties. Did those third parties include the developers of generative AI programs like ChatGPT? We do not know. What we do know is that any information shared with ChatGPT or similar programs in an educational setting can then be disclosed by that program elsewhere. Additionally, these AI applications can be prompted to reveal student PII, whether that information was included in the training data or submitted to the program during use.
What, then, is the answer to the question of whether ChatGPT or similar generative AI programs should be used in an educational setting? Some advocate for an outright ban. Others favor educating educators about the benefits and risks of using ChatGPT as an educational tool. That education necessarily includes making educators aware of the inherent risks to students’ data, and teaching students about these risks if they use a generative AI program at home to assist with their schoolwork. Education experts fear that if schools ban the use of ChatGPT or similar programs outright, instead of using them responsibly, they will create a rift between educators and their students, and possibly create a digital learning gap. If you have questions about protecting your students’ PII in today’s ever-increasing technological world, please reach out to a member of Spilman’s Education Practice Group for assistance.