Artificial intelligence in the form of ChatGPT burst onto the educational scene in late 2022. As AI use continues to grow, many predict that it will revolutionize the way we work across all education disciplines. School board members have been tasked with creating policy for AI usage in their classrooms and schools.
It is easy in these situations to take a wait-and-see approach. But this leaves school districts flat-footed and puts them in a position to play an arduous game of catch-up when the rubber (undoubtedly and quickly) hits the road. Some state legislatures, such as California's, have already enacted legislation around AI that affects K-12 education. Vendors are reaching out to school districts to pilot new AI tools. And students and educators can already access powerful AI tools on their own.
We want to help you get ahead of the AI policymaking curve. In fact, we want to help you shape the AI curve in the education space through policymaking. That work starts through a better, collective understanding of the AI landscape. Below are six steps that can help your board start or continue that journey together.
1. Decide on your scope
Before you dig into your district’s AI activities, your board and administrative team should be aligned and clear on the direction for the work. This agreed-upon direction is best captured as a sentence or bulleted list that will serve as a foundation for more extensive goal setting and planning as a board or committee. Consider it your North Star.
School boards need to draft and approve AI policy language, provide effective oversight of those policies in action, and adjust where necessary. Boards are on the front lines of vetting which AI technologies will benefit students and educators and of ensuring that AI policies remain appropriate as the technology evolves.
2. Convene a task force
AI policymaking is an interdisciplinary effort. To this end, we recommend convening an AI task force that draws from a variety of knowledge areas to advise on AI policy.
Your task force should:
- Help you understand the possibilities and repercussions of your AI policymaking and related efforts from a variety of angles.
- Identify resources that you might not have otherwise found on your own.
- Give you a sense of the pulse on the ground before decisions or content are released.
- Amplify your efforts by acting as ambassadors to their own communities.
These folks shouldn’t just be data and technology experts, though you will want representatives with those knowledge sets at the table. Consider also including the following areas of expertise: data privacy, professional development, technology literacy, English Learner equity, special education equity, racial equity, and technology accessibility. You may want to include parents on this task force.
And before you convene your task force, we advise you to clarify the role of each member up front. What are their specific responsibilities? Where does their decision-making authority start and end? What is their expected time commitment?
The task force will need to know the district’s budget, the internet connectivity of the community, and the capacity of the district’s IT and data staff.
3. Ground your task force and your colleagues
Not everyone on your task force or the team supporting its efforts will have the same level of expertise in AI. The task force should devote its first meetings to establishing a baseline understanding of:
- How AI is used and could be used in education.
- Common terms used in the AI field.
- Existing state legislation, policies, or guidance that must be acknowledged and/or addressed. Note that these may be general and not education-specific.
- Basic education problems of practice in implementing AI tools successfully.
The baseline can be established through conversations, invited guest speakers, and assigned reading, videos, and webinars on select topics.
4. Build from existing resources
The education world is awash in AI resources, making it unnecessary to start from scratch. But before you start downloading other state or district policies, make sure you have identified, read, and understood any existing laws, policies, and guidance related to AI (either specifically or generally) in your own state.
Once you know your own AI regulatory landscape, we recommend starting with TeachAI, which offers a comprehensive set of resources (including a customizable AI overview presentation template, policy resources, instructive webinars, and a guidance toolkit for schools). Its resources were developed in collaboration with leading national, state, and nonprofit education organizations (including Code.org; CoSN; Digital Promise; ISTE; UNESCO; AASA, The Superintendents Association; and the National School Boards Association).
State-level guidance on AI also can inform your policy discussions and is worth considering. As of May 2024, the following states have published AI guidance: California, Kentucky, North Carolina, Ohio, Oregon, Virginia, Washington, and West Virginia.
Let your AI task force help identify and decide together which resources are most appropriate for your organization’s efforts. And then—this is the important part—debate and adapt them to fit your specific context.
5. Update many policies, not just one big policy
A misconception about AI policymaking is that the result is one policy documenting all considerations around AI. It is better to think of the work as updating a series of policies, expanding their language to include factors related to AI. TeachAI provides a starting list of policies you should consider revising, including responsible use, privacy, academic integrity, equity, safety, procurement, and security policies.
There is a tendency for education AI discussions to focus heavily on student use of generative AI tools, like ChatGPT. The concerns range from plagiarism to accessing inaccurate and potentially harmful content. That can be a helpful place to start your discussions, because it is a highly recognizable and discussed topic in the education community. But we encourage you to discuss AI’s use in education comprehensively.
Additional topics include educator use of generative AI to enhance curriculum and the use of predictive AI by educators and administrators to produce products like enrollment projections and student outcome analyses (among many other things). TeachAI’s Sample Guidance on the Use of AI provides a useful example of the topics that ought to be considered during policy discussions.
6. Embrace rapid evolution with AI use cases
The education policy world must help shape its future proactively and in real time, and we must be willing to engage in this work as a long-term, iterative effort. To this end, your AI task force may eventually turn into a smaller, ongoing advisory committee once initial policies have been released.
Existing AI guidance and policy language provide a great jumping-off point to gain quick wins. We encourage you to use your state school boards association, education organizations, and other school districts and states as resources. But we are each responsible for helping to create new AI pathways in educational settings. AI use cases for your district can help. Use cases are descriptions of the ways in which a user interacts with a system or product. They are meant to help flesh out future AI issues that can be addressed through policymaking, guidance, or other structures. Your task force may be able to help create these potential scenarios for your consideration.
Use cases typically contain information on:
- The challenge AI is trying to solve.
- The stakeholders most affected by the challenge and its potential AI solution.
- The opportunities AI presents in solving that challenge.
- The risks of using AI in this context.
- Solutions or ways to mitigate those risks.
AI has the potential to benefit the education space if it is actively and creatively embraced, thoughtfully examined, proactively planned for, purposefully managed, and appropriately used. And we are excited about what each of our futures holds.
Ann Willemssen (awillemssen@crocusllc.com) and Paul Butler-Nalin (pbutler-nalin@crocusllc.com) are consultants with Crocus, LLC, a technology firm based in Washington, D.C., that guides social impact in education. They recently worked with the California School Boards Association’s AI Task Force to set the direction of its AI policymaking work.