
AI and the Law: What Educators Need to Know

As teachers incorporate AI into their classrooms, it’s important to educate students about data security, privacy, and cyberbullying.

October 29, 2024

Illustration: Chelsea Beck for Edutopia

Artificial intelligence (AI) is rapidly transforming the world of education. As with any new technology, there can be some hesitancy about not knowing enough, how to use it, or where to begin. As an attorney and educator interested in AI, I’ve become familiar with legal issues that educators may not fully understand or even be aware of, especially with so many new tools and platforms being developed daily. With AI, we must be even more proactive, because issues such as data privacy, intellectual property rights, and cyberbullying present new challenges that require extra consideration.

This article explores the relevant laws and offers actionable tips to help educators and schools ensure compliance. When we all have a clear understanding of these issues, we can help protect students and families, educators, and the school community from potential legal risks.

First, let’s look at a few key laws.

FERPA (Family Educational Rights and Privacy Act): A federal law that protects the privacy of student education records. Schools must ensure that any AI tools used in the classroom comply with FERPA regulations, particularly when sharing student data with third-party providers. 

COPPA (Children’s Online Privacy Protection Act): This law applies to websites and online services that collect personal information from children under the age of 13. COPPA requires parental consent and imposes strict rules about how any data obtained may be used. When using AI that requires student input of data, educators must ensure COPPA compliance. 

GDPR (General Data Protection Regulation): A European Union law that governs data protection and privacy. Although it is a European law, it is relevant to U.S. schools that use tools from EU-based providers. GDPR tends to be stricter than COPPA and FERPA, and in any case you want to make sure that student information is protected. You may have noticed that many sites now display pop-ups asking whether to allow all cookies or only those that improve functionality; these consent prompts are largely a result of laws like GDPR.

When we decide to bring AI-powered tools into our teaching practice and classrooms, there are specific areas that we must focus on. In my classroom, I explain to my students the importance of closely evaluating a tool and the type of data it requires. I always say, “You should not have to go on a scavenger hunt” to find out what is happening with your information. In particular, we may share information that we think does not qualify as personally identifiable information (PII), yet it could provide just enough to lead to a problem. For example, sharing a pet’s name or a favorite restaurant in our town may seem completely innocent and untraceable, but those details can function as PII and provide enough information to compromise our data privacy.

Data Privacy and Security

AI tools require massive amounts of data, which can include PII obtained from their users. That data can then be used to train the tool or other AI models, or to improve the tool’s effectiveness and functionality. It is important to monitor what happens to the data we enter, because it can expose users to security breaches or be misused by third-party providers we may not even know about. For example, a website may have a disclaimer stating that it does not use or share our information, but other integrations on the platform may still have access to our information and be able to share it. The best we can do is continue to review the information provided on a website and also review and customize the cookie settings.

Actionable step: When bringing AI tools into our classrooms, conducting a thorough vetting process is crucial to ensure student safety and data privacy. Start by reviewing the terms of service, privacy policy, and any compliance documentation. Look for references to FERPA and COPPA compliance and any additional privacy standards or disclaimers the site may offer. Confirm that student data is used only within the tool for educational purposes and not shared with third parties. Having a team in place, along with your school’s IT department, will help ensure that the platform is fully compliant with privacy laws before you use it in the classroom.
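If your team wants to apply that vetting consistently, even a simple checklist helps. The sketch below is a minimal, hypothetical Python example (the field names and the tool name are my own illustration, not an official compliance standard): it records the review steps described above and marks a tool as classroom-ready only when every item has been confirmed.

```python
from dataclasses import dataclass, fields

@dataclass
class AIToolReview:
    """One row of a hypothetical vetting checklist for a classroom AI tool."""
    tool_name: str
    reviewed_terms_of_service: bool = False   # terms of service read in full
    reviewed_privacy_policy: bool = False     # privacy policy read in full
    states_ferpa_compliance: bool = False     # vendor documents FERPA compliance
    states_coppa_compliance: bool = False     # vendor documents COPPA compliance
    data_stays_educational: bool = False      # student data used only for educational purposes
    no_third_party_sharing: bool = False      # vendor confirms data is not shared with third parties
    cleared_by_it_department: bool = False    # school IT team has signed off

def ready_for_classroom(review: AIToolReview) -> bool:
    """True only if every checklist item has been confirmed."""
    return all(getattr(review, f.name) for f in fields(review) if f.name != "tool_name")

# Example: a made-up tool that still needs two confirmations before use.
review = AIToolReview(
    tool_name="ExampleQuizHelper",  # hypothetical tool name
    reviewed_terms_of_service=True,
    reviewed_privacy_policy=True,
    states_ferpa_compliance=True,
    states_coppa_compliance=True,
    data_stays_educational=True,
)
print(f"{review.tool_name} ready for classroom use: {ready_for_classroom(review)}")
```

A shared spreadsheet works just as well; the point is that every tool gets the same documented review before it reaches students.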

After vetting them carefully, there are some tools that I now use regularly in my classroom. I recommend Brisk Teaching, CoGrader, Diffit, Eduaide, Magic School AI, Quizizz, and Snorkl. I have found that these comply with laws to keep student data safe, but it’s always a good idea to check with your administration and IT team to make sure.

AI and Cyberbullying

There are two key considerations when it comes to AI and cyberbullying. On one hand, AI has been used within social networks to monitor, flag, and, ideally, prevent cyberbullying. Using natural language processing, AI-powered tools can detect and flag abusive language and monitor social media posts. In educational settings, platforms can offer the same functionality and alert educators to incidents in real time. This capability is highly beneficial, especially with so many of our students interacting with each other via Instagram, Snapchat, and other platforms. Some social media platforms immediately remove questionable or harmful content, and that is largely a result of AI-powered moderation.
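To make the flagging idea concrete, here is a deliberately oversimplified sketch. Real moderation systems rely on trained natural language processing models, context, and human review; this toy example, with a made-up watch list, only illustrates the basic scan-flag-review pattern.

```python
# A toy illustration of automated flagging, NOT a real moderation system.
# Real platforms use trained NLP classifiers, context, and human review.

FLAGGED_PHRASES = {"loser", "nobody likes you", "shut up"}  # made-up watch list

def needs_review(message: str) -> bool:
    """Flag a message if it contains any phrase from the watch list."""
    lowered = message.lower()
    return any(phrase in lowered for phrase in FLAGGED_PHRASES)

def review_queue(messages: list[str]) -> list[str]:
    """Collect flagged messages for an educator or moderator to review."""
    return [m for m in messages if needs_review(m)]

sample = [
    "Great job on your project today!",
    "Nobody likes you, just quit the group.",
]
for message in review_queue(sample):
    print("Needs review:", message)
```

In practice, the classifiers behind these platforms are far more nuanced, which is why their alerts still call for an educator’s judgment.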

However, there is an increase in using AI to cyberbully. One example is the use of deepfakes: AI-generated media (audio, images, video) that simulate real people and have been used to harass or defame students. If you have seen deepfakes, you know it can be difficult to determine whether an image or video is real. We have to ensure that students know how to evaluate these media and keep themselves safe. Under close examination, asymmetrical features are common giveaways: eyes that are not the same shape, ears that are not aligned, or extra fingers, arms, or other body parts joined in awkward and unlikely ways.

Actionable step: We must be as proactive as possible. Take time to teach all students about the ethical use of AI. Guide them as they learn about the short- and long-term impacts of creating or sharing deepfakes and how to spot them. When there is a shared understanding of how to interact responsibly, a system for reporting issues, and intervention protocols in place, we can keep our students safe and avoid potential legal consequences.

Protecting Our Students

To keep students and their data safe, educators should focus on transparency and informed consent. Before using any AI tools, it’s important to communicate clearly with families and students about the data being collected, how it will be used, and what protections are in place. When working with students under the age of 13, compliance with COPPA may require written parental consent. Be sure to provide parents with access to informational resources and opportunities to ask questions about the data privacy measures and security protocols in place.

It is also important to continue teaching digital citizenship to all students, especially with so many new technologies available to them. Focusing on ethical, safe, and responsible use will help students better evaluate, recognize, and respond to risks such as deepfakes, cyberbullying, or potential biases in AI-generated content. Schools should have clear policies on managing student data and encourage students to actively participate in conversations about online safety. By building a culture of transparency and digital responsibility, educators can ensure that AI tools are used safely and effectively, empowering students to navigate these technologies with confidence.
