Picture this: Tomorrow’s graduates walk into workplaces where AI tools are as common as email — diagnosing patient symptoms, analyzing market trends, optimizing supply chains or designing new infrastructure. From healthcare to marketing to engineering, nearly every field is being transformed. Are our schools preparing them for this new reality? And do we have an effective method of assessing such readiness?
At Gwinnett County Public Schools (GCPS), educators are determined to make sure both answers are “yes.” Their mission is to ensure every student is “AI ready” — prepared to use emerging technologies, like generative AI, in an ethical and responsible manner in school, life and future work, regardless of where those careers take them. To support this goal, GCPS led the development of both an AI readiness framework and a companion diagnostic assessment.
In 2019, GCPS, in collaboration with multiple partners, created an AI readiness framework that focuses on six core areas: Data Science, Mathematical Reasoning, Creative Problem Solving, Ethics, Applied Experiences and Programming. The framework was developed with input from district subject matter experts (including computer science, math and science teachers) and external partners.
To make the framework informative and actionable, the district partnered with the ISTE research team in 2025 to develop a diagnostic assessment tool that measures student AI readiness across select skills outlined in the framework. Unlike summative assessments, diagnostic assessments measure students’ current knowledge and skills, helping educators identify gaps and areas for growth and guiding teachers and school leaders toward where students might need additional instruction, resources or support to meet learning outcomes.
A Systematic Approach to Test Design
Here’s how the district and the research team brought the AI readiness diagnostic assessment to life:
Defining objectives and developing the framework
The team had to account for practical considerations: Who would take the test? How would it be delivered? What time constraints existed?
While the AI readiness framework covers preK-12, the team began by designing a diagnostic for high school students in grades 9-12. They knew the assessment needed to be digital (to maximize flexibility) and quick, ideally 10 to 15 minutes. These factors influenced the types of questions used. To support automatic scoring, the team included multiple-choice and Likert scale questions.
Creating draft questions
First, the ISTE research team and GCPS partners collaborated to identify framework constructs they wanted to measure within each of the six core areas. This ensured consistent coverage across all areas.
Once the constructs were defined, the team worked with subject matter experts — both district educators and external specialists in AI and education — to draft three to five items for each construct aligned with their expertise.
Reviewing and revising
After drafting the items, the research team reviewed them for consistency and ensured that each measured only one skill. Through the refinement process, they narrowed the set to two items per construct across 26 constructs total, creating two versions of the pilot assessment. The school district then built the pilot assessments in their survey platform, Qualtrics, for ease of distribution.
Putting the pilot to the test
Students from Seckinger High School — about 1,200 total — participated in the pilot. They were split into two groups alphabetically by last name to evaluate the two “parallel” sets of items. The district confirmed that the two groups had similar demographics. Students completed the pilot during their homeroom period.
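Assigning forms by an alphabetical split is simple to implement. The district's actual split point isn't published, so the letter "M" below is purely illustrative; a minimal sketch:

```python
def assign_form(last_name: str, split_letter: str = "M") -> str:
    """Assign pilot form "A" or "B" by the first letter of the last name.

    split_letter is a hypothetical cutoff; the district's actual split
    point is not published.
    """
    first = last_name.strip().upper()[:1]
    return "A" if first <= split_letter else "B"
```

In practice, a district would then compare the demographics of the two resulting groups (as GCPS did) before treating them as comparable samples.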
Analyzing the results
While expert input ensured strong construct validity, there was still a need to evaluate the reliability of both the items and the overall test. The research team conducted a series of psychometric analyses, including test reliability, empirical item analysis and item response analysis. These analyses helped identify which items performed well and which needed refinement or removal.
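The specific statistics GCPS and ISTE computed aren't detailed here, but two checks commonly used in this kind of item analysis are internal-consistency reliability (Cronbach's alpha) and per-item difficulty with corrected item-total correlation. A minimal sketch over a students-by-items matrix of 0/1 scores, assuming dichotomously scored multiple-choice items:

```python
import numpy as np

def cronbach_alpha(scores) -> float:
    """Internal-consistency reliability for a students x items 0/1 matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

def item_stats(scores):
    """Per-item difficulty (proportion correct) and corrected
    item-total correlation (item vs. total score excluding that item)."""
    scores = np.asarray(scores, dtype=float)
    stats = []
    for j in range(scores.shape[1]):
        rest_total = np.delete(scores, j, axis=1).sum(axis=1)
        r = np.corrcoef(scores[:, j], rest_total)[0, 1]
        stats.append((scores[:, j].mean(), r))
    return stats
```

Items with very low (or negative) item-total correlations, or difficulties near 0 or 1, are the usual candidates for refinement or removal.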
Before analysis, the research team cleaned the data to eliminate questionable response patterns, such as students who completed the assessment unusually quickly and likely didn’t carefully read the items.
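Screening out "speeders" is typically done with a completion-time cutoff. The team's actual threshold isn't reported; the sketch below uses an illustrative rule (a fixed floor, or a fraction of the median completion time, whichever is larger):

```python
import statistics

def drop_speeders(records, min_seconds=120):
    """Remove likely non-effortful responders by completion time.

    records: list of dicts with a "seconds" key (time on the assessment).
    min_seconds and the median/3 rule are illustrative assumptions,
    not the cutoffs the research team actually used.
    """
    times = [r["seconds"] for r in records]
    cutoff = max(min_seconds, statistics.median(times) / 3)
    return [r for r in records if r["seconds"] >= cutoff]
```

Time-based screens are often paired with pattern checks (e.g., flagging students who chose the same option for every item), since speed alone doesn't catch all low-effort responses.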
Where This Work Is Headed
With the item and test analyses in hand, the research team and school district collaborated to produce a final version of the diagnostic assessment designed for high school students. They are now exploring ways to adapt the tool for other grade levels and to incorporate more complex items, such as performance-based tasks that allow students to demonstrate their skills in real-world contexts.
Moving forward, the district hopes the results from this diagnostic will contribute to a more comprehensive picture of a student’s AI readiness, alongside other data points like teacher evaluations, computer science coursework and capstone projects. These combined learnings will inform curriculum development and student support strategies across the district.
Reflections
Diagnostic measures of AI readiness can provide districts with crucial data for strategic planning and resource allocation, ensuring students are prepared for a world saturated with AI. The collaboration between district leaders and the research team demonstrates the importance of thoughtful design and rigorous assessment practices. GCPS and ISTE+ASCD hope their work can serve as a model for other districts preparing students for a future with generative AI.