This item originally published on the CoSN blog and is republished here with permission.
In recent years, school districts have shown growing interest in the potential of generative AI (GenAI) to revolutionize education. GenAI offers the promise of enhancing personalized learning, streamlining administrative tasks, and providing innovative educational resources. However, as districts rush to adopt these cutting-edge technologies, they must carefully select the right AI tools for their particular needs. This rapid adoption carries significant risks, particularly in terms of data privacy and accessibility.
Ensuring that AI tools protect student data and meet accessibility standards is crucial to creating an inclusive and secure educational environment. This blog post will explore expert recommendations for selecting GenAI tools, helping districts effectively address these challenges.
Data Privacy Considerations and Recommendations for Adopting GenAI in Schools
Linnette Attai, project director for CoSN's Student Data Privacy Initiative and president of compliance consulting firm PlayWell, LLC, shares her insights on the data privacy risks associated with GenAI tool adoption and offers guidance for responsible implementation.
While security breaches are a common concern, Linnette emphasizes that protecting student privacy and data is about more than preventing breaches. There is a broader responsibility to protect the emotional well-being and personal information of students, or as she calls it, a “responsibility of care.” Key privacy considerations include:
Data ownership and control:
District leaders should be careful when using large language models not specifically designed for educational purposes. These models may use student data to train or refine the underlying AI, raising concerns about the commercial use of personal information and the potential exposure of sensitive data. Additionally, for some districts, any commercial use of students' personal information is illegal.
Linnette advises districts to adhere to core practices when adopting new tools:
- Have a clear objective: Despite the growing popularity of GenAI tools, districts should identify a specific reason for their use. This approach ensures the tool aligns with district needs and maximizes its impact on student outcomes.
- Be informed before testing: Districts should fully understand the tool, including its privacy practices, security measures, and contract terms, before committing. In particular, districts must ensure that the tool is used only for educational purposes.
- Start with the staff: Testing AI tools with staff rather than students helps avoid premature exposure of student data. Some companies offer beta testing or sandboxes that allow staff to simulate student experiences, which can be a valuable way to gauge the effectiveness of the tool.
A practical example: Hinsdale Township High School District 86
Keith Bockwoldt, chief information officer for Hinsdale Township High School District 86 in Illinois, shares his district's thoughtful approach to GenAI. Keith's "Reimagining Learning Through Innovation" program allows teachers to test new tools funded by the district's IT budget. Teachers submit proposals for evaluation, which are assessed for compliance with data privacy policies prior to pilot implementation. Teachers must then provide evidence of the tool's impact by the end of the year, and the department considers whether the tool should be adopted more widely.
Keith highlights two key considerations:
- Supplier compliance: Keith ensures that vendors are aware of and comply with applicable data privacy laws, such as Illinois' Student Online Personal Protection Act (SOPPA), and discusses data protection measures with them, including data deletion and storage practices.
- Ongoing supplier engagement: Continued communication with vendors is crucial to maintaining compliance with data privacy standards over time.
Ensuring accessibility
Jordan Mroziak, project director for AI and education at InnovateEDU, highlights the need for a deliberate approach to adopting new technologies. He warns against an educational arms race of adopting unproven or potentially harmful AI products. Instead, districts should focus on meeting the needs of all students, especially those who are underserved or disadvantaged. As a helpful resource, Jordan has shared his and his colleagues' work with the EdSAFE AI Industry Council, which aims to provide trusted guidance and standards for districts exploring GenAI tools. Companies join this alliance by demonstrating how their products adhere to the SAFE Framework for AI, which emphasizes safety, accountability, fairness, and efficacy. This collective effort helps ensure that AI tools are developed with these essential principles in mind, thereby promoting responsible and effective use.
Additionally, the recent update to ADA Title II requires accessibility to be a priority from the start. Districts should choose AI tools that meet ADA standards and ensure equitable access for all students. This process includes evaluating tools for compliance with accessibility guidelines, engaging a range of stakeholders in testing, and making adjustments to meet diverse learning needs. By proactively addressing these requirements, districts can ensure their AI tools are inclusive, effective, and legally compliant, maximizing the benefits of the technology for every student.
For more recommendations on accessible implementation of GenAI, read blog 5 in this series: Adapting to ADA Title II: Effective Strategies for Accessible AI in Education.
Integrating generative AI tools into education offers significant opportunities to improve learning and efficiency. However, this also poses challenges related to data privacy and accessibility. Thoughtful implementation and ongoing evaluation are essential to maximize the benefits of these tools while ensuring the protection and support of all students.