A new group of classroom leaders will enter a school year shaped by fast-moving debates about artificial intelligence, instructional technology, and student learning.
On May 6, 2026, ISTE+ASCD announced the 2026-27 Voices of Change Fellows, a cohort of six educators from across the country who will share firsthand accounts of how AI, technology, and instructional innovation are changing K-12 education.
As AI tools become more common in classrooms, schools are rethinking teaching methods, educator roles, student support systems, and long-standing assumptions about how learning should happen.
Supporters say AI can personalize instruction, reduce administrative burden, and help students prepare for a future filled with new technologies.
Critics warn that early and careless use of AI could weaken student thinking, creativity, privacy protections, and social-emotional growth.
Program Purpose and 2026-27 Fellows
Meet the 2026-27 @ISTE_ASCD Voices of Change Fellows.
Six K-12 educators sharing first-person stories on how AI, technology, and innovation are reshaping classrooms — published throughout the year on EdSurge. 🔗: https://t.co/otPxrbbRgW
— EdSurge (@EdSurge) May 6, 2026
Voices of Change is designed to empower the people closest to the classroom by giving educators a platform for first-person essays, reported reflections, and multimedia stories on EdSurge. For 2026-27, the program enters its sixth cohort.
Six educators will contribute perspectives grounded in their professional roles and communities:
| Fellow | Role | School | State |
|---|---|---|---|
| Tambra Clark | Technology integration facilitator | Birmingham City Schools | Alabama |
| Nathan Kraai | Director of Innovation and Design Thinking | The Fenn School | Massachusetts |
| Pattie Morales | Instructional technology specialist | Indian Community School | Wisconsin |
| Court Shuller | Middle school ELA teacher | Gloucester Township Public School | New Jersey |
| Monika Vereb | Principal | Herndon Elementary School | Virginia |
| Beth Yirga | Assistant head of school | Freire Charter School Wilmington | Delaware |
Risks, Resistance, and Parent Concerns

Many educators, parents, and cognitive scientists reject the idea that AI-aided education is inevitable.
Resistance has grown as AI tools enter more K-12 settings, sometimes through devices and platforms students already use.
One national survey found that around 80% of K-12 teachers said their districts use Chromebooks, a figure that may give Google an advantage in spreading Gemini through school devices.

Major concerns center on cognitive offloading, emotional influence, privacy, and the purpose of schoolwork. AI tools may encourage students to hand over thinking tasks before they have built foundational skills. Chatbots may mimic emotional closeness and authority in ways that shape children's social development. AI products may reward fast, polished outputs over the slower process of learning. Built-in tools may also expose students to AI even when families feel uncomfortable with that exposure.

Critics argue that assignments are not only about final products. They are also about the experience of doing the work: struggling with ideas, testing approaches, making mistakes, and building confidence. In that view, a polished AI-assisted answer can miss the point of learning. This concern also connects to broader debates about academic support services, as schools continue to define the line between support, originality, and authentic learning.
Educators As Field Observers
Practitioners can track AI’s school-level impact in ways that outside observers often cannot. A teacher can see how an AI tool changes lesson planning on a Monday morning.
A principal can see how a new data dashboard affects intervention meetings.
An instructional technology specialist can see which tools teachers actually use after a pilot ends.
Their work can also reveal whether AI improves instruction or simply adds another technology layer to already crowded classrooms.
School-based observation matters because AI adoption is not only a question of software. It is also a question of trust, training, judgment, equity, privacy, and well-being. Educators must decide when AI helps students think more clearly and when it creates shortcuts that weaken learning. Families must decide how much exposure feels appropriate for their children. School leaders must decide how to balance innovation with safeguards.
Dispatches tied to real classrooms can test a major claim made by AI supporters: that AI should keep educators at the center and help create stronger learning experiences rather than replacing professional judgment.
Personalized Learning and New Instructional Models
AI tutoring and personalized learning are among the most visible technology changes in K-12 schools. One example is Alpha School, an AI-supported personalized learning model in which students follow individualized paths shaped by goals, interests, and academic needs. At Alpha School, mixed-age classrooms allow older students to mentor younger ones, and students move at different paces based on readiness.
Potential benefits include real-time feedback, flexible pacing, personalized learning plans, project-based work, AI dashboards that let students monitor progress, and more teacher time for mentorship, collaboration, and hands-on learning. AI can also ease rigid classroom pacing, in which some students feel bored while others feel overwhelmed.
Personalized models may help teachers shift more attention to small-group support and applied learning. A teacher could review AI-generated reports in the morning, identify students who need a small-group lesson on fractions, and allow students who are ready to begin a more advanced math project.
Equity, Access, and Bias
AI adoption is not only a technology issue. It is also an equity issue.
Algorithms trained on narrow datasets may disadvantage minority students, which means schools may need bias audits, careful review processes, and culturally responsive content. Accessibility also matters.
Several fellows bring direct experience tied to these concerns. Tambra Clark's work on AI literacy and STEM equity connects to questions about access and future readiness. Pattie Morales's focus on equitable access to AI and digital learning addresses gaps that can appear when new tools enter schools unevenly. Beth Yirga's focus on educational equity, environmental justice, and whole-child success connects AI adoption to broader student needs.
A key question will follow the fellowship throughout the 2026-27 school year: Can AI help close learning gaps, or will it widen divides between well-resourced and under-resourced schools?
Changing Teacher Roles
AI may redefine teachers' work, but it does not have to eliminate the teacher's role. In a teacher-centered model, educators move past delivering standardized lessons to every student at the same pace and toward orchestrating authentic learning experiences. Teachers may use AI dashboards to identify which students have mastered a concept and which need more support.
Professional development should go beyond one-time workshops. Ongoing professional learning communities can help teachers analyze AI reports, share strategies, discuss classroom evidence, and co-design interventions. Without that support, AI dashboards can become another source of pressure rather than a tool that improves teaching.
More radical AI schooling models raise additional questions about the profession.
Alpha School has been described as using "guides" instead of traditional teachers and claiming that students can complete academics in two hours per day through AI-powered personalized learning. That model challenges long-standing assumptions about teacher expertise, instructional time, and the social purpose of school.
Fellows can help track how educators respond to these changes. Their observations can show whether AI expands teacher capacity, narrows the job, or creates pressure to accept models that reduce professional judgment.
Summary
During the 2026-27 academic year, the Voices of Change fellows will document a school system in transition. Their work can show whether AI becomes a meaningful instructional support, a risky classroom disruption, or both at once. Careful reporting matters because AI's role in schools should not be treated as predetermined. AI may personalize learning, support teachers, and expand project-based instruction. It also raises serious questions about student development, equity, data privacy, family choice, and the meaning of learning itself.