How High School Should Change for an Era of AI and Robots


By Jeffrey R. Young     Nov 29, 2022


Public high school in America is a product of the era of its invention, way back in 1821. But in this period of rapid technological change, marked by artificial intelligence and robots moving into more aspects of work and social life, maybe the way teaching is done in high school needs a reboot.

That’s the thesis of the book “Running with Robots: The American High School’s Third Century,” which is framed around a thought experiment: What would an ideal high school of the year 2040 look like?

The tour guides of this imagined school of the future are the book’s two authors: Jim Tracy, a senior advisor at the nonprofit Jobs for the Future who has led private K-12 schools and served as a college president; and Greg Toppo, a longtime education journalist.

Surprisingly, these future-looking experts don’t talk much about robots or other high-tech tools in the book. Instead, they focus on how coming technological change will shift the relationship between people and machines, and therefore between students and teachers.

But while the book paints an idealized, almost utopian picture of this high school of tomorrow, we learned in our conversation that the authors think it will take real work to avoid the possible downsides of the technology that promises to enrich schools and learning.

Listen to the episode on Apple Podcasts, Overcast, Spotify, Stitcher or wherever you get your podcasts, or use the player on this page. Or read a partial transcript below, lightly edited for clarity.

EdSurge: In your book you imagine a scenario of a high school in 2040 that is designed to take advantage of a world more-heavily infused with artificial intelligence and robots. What is the biggest difference folks would see if they toured this futuristic school?

Greg Toppo: One of the big changes is that even though we are sort of obsessed with this idea that technology is going to be a big deal in future high schools, [we think] that the humanities will play a bigger role than they even do now. And we need people to sort of see that before they see anything.

Jim Tracy: One of the things that strikes me about this future is that [we predict] the technology [will] become integrated into the creative processes of students. So the technology will allow [a resurgence of] constructivism, so that the students are driving their own learning, following their own passions in any direction that it brings them. And the technology will allow that interface with their classroom … to be infinitely malleable.

Having said that, one of the revelations for our chief protagonist at the end is when his guide … explains to him that learned teachers, master teachers, are more central than ever. Because the landscape is so infinitely malleable, [we’ll really need] the presence of a learned guide.

Why did you name your book “Running with Robots”?

Toppo: We love the image, which is sort of counterintuitive to what so many people are fearful of. The received wisdom is that robots will take our jobs and we're gonna be left penniless and jobless and destitute. We wanted to kind of flip it and see what the possibilities were.

And we do this now, you know: we run with robots all day. I just took a load of laundry out of the washing machine, and I'm using a robot basically to get my clothes clean, right? And so we are already running with robots. We're already using them to our advantage, and it will be even more of a mutual relationship 20 years from now. And it was a reference to a book we really admired.

Tracy: Yeah it was from a book by [Andrew McAfee and Erik Brynjolfsson]. The image that they used was that if you think about optimal chess playing, the best human chess player in the world will lose today to the best chess algorithm. By the same token, the best chess algorithm in the world will lose to a combined team of a mid-level algorithmic chess [system] coupled with a human chess player. So we're better together than either is apart.

In your research you also visited actual high schools that are trying innovative practices that you say move toward this future. What’s an example in the real world today?

Toppo: The examples we use in the book are not really technologically focused. The book focuses on new ways of seeing the relationship between teachers and students, and between students and the work they do. So one of the things that we were really interested in was this idea that the biggest change we need to think about is the students' relationship to their work and why that work matters.

One of the examples that I liked was a school in Iowa called Iowa Big, which is this experimental high school. One of the students we end up talking to there came from a traditional, several-thousand-student, four-year high school. She didn't really like it, though she was doing fine and was college-bound. And then she sort of drops into this experimental school and realizes that she had no agency in that previous school, that nobody trusted her, and that nobody was really focused on what she was interested in. Nobody really asked her the essential questions that were important to her.

And [at Iowa Big], one of the very first questions that one of her teachers asked her was, ‘What makes you mad?’ And that opened up for her this sort of new world of, ‘Oh my God, I'm mad at a lot of things.’ And that was for her at least, this sort of entryway into accessing what was important to her. And she ended up organizing this huge conference about young women in careers. And she actually ended up cold calling the lieutenant governor of Iowa, who is now the governor, actually. And just really doing some amazing stuff that I don't think she would've done otherwise.

What's the model or the mechanism that the high school used to get that to happen?

Toppo: They were just super focused on kids actualizing themselves: finding what they're interested in, finding what they like to do and how they can contribute to the world, and really relying on students themselves to figure it out.

Tracy: One example is something I did at a school that I ran, Rocky Hill School. In that work we were trying to ask the question, ‘What is the technology inflection going to mean for the role of humans in 10 to 20 years?’ And the answer that we kept coming up with, whether we were talking with educators or with some of the best software engineers in the world, was that we can't really know exactly what the capacity of AI is going to be in 10 to 20 years, but we can say, with a high degree of confidence, certain things that it won't yet be able to do.

And if we look at that, then we can reverse-engineer the human domain that looks like it's going to be pretty safe as part of the workforce and the social sphere and so forth. And the domains that we kept seeing were the domains that are associated not with the intellectual knowledge economy, but rather with the more-compassionate, empathic economy.

In other words, we have for the last century and a half of the knowledge economy been educating our students to become repositories of information, whether they're lawyers or doctors [or engineers] and so forth. And then somebody pays them a great deal of money to extract some of that knowledge from their heads. What's happening now is that knowledge is increasingly being reposited in algorithms, and that's only going to be more the case going forward, so that the most intelligent, capable medical diagnostician, I predict, will be a computer sometime in the next 20 years.

What is the role of the doctor then? The doctor's role is to be a knowledgeable interpreter of that algorithmic diagnosis—to check it, to make sure that there wasn't a snafu, and to make sure that there is no social bias in the outcome. And also to help interpret that into a regimen for treatment and healing on the part of the patient in a human-connected, empathic way.

How then will we train doctors? That's the critical point for schools: given that technological breakthrough, the knowledge economy is now going to be owned by the algorithms. How do we train humans to be the empathic partners to that algorithm? The way we do that is to train them toward knowledge sufficiency, so that they can understand what the algorithm is doing and interpret it for the layperson, but with empathic fluency.

Creativity is another domain that we felt would still be uniquely human.

So if you think about how you then translate that into, say, K-12 or higher education: doctors, for instance, will be trained in content literacy rather than content fluency, and in empathic and creative fluency. You would spend less time in high school training every student to take calculus and more time on portfolio-type collaborative endeavors to solve problems.

The book portrays a very optimistic 2040. But if new AI tools need to keep students within a courseware system to get the benefits of the algorithms, you could also imagine a more-dystopian version of what happens—where there's less diversity of teaching materials and less control by educators because of that. What advice do you have for curbing some of those impulses that may be inherent in the technology or the market forces?

Tracy: I actually think that that's more likely. I think we're leaning heavily in the direction of the more-dystopian outcome, and I’m rather pessimistic. The book was an act of will, to assert: ‘Here's a vision that could be, with the exact same technology, if we assert a kind of agency of Paideia [a system of schooling from ancient Greek times meant to give a well-rounded education].’

On a more practical level, what do you see that educators can do to counteract that?

Tracy: I don't know that I have the answer to that. I think that there are strong market and social and historical forces that are driving us toward less-desirable outcomes right now. And so everybody has to play their part. My part was to try to present a vision [for a positive future]. My role was more that of a visionary.

Toppo: As I look at the edtech landscape, the one thing that worries me the most is privacy. I feel like we need to get privacy right, and I don't know what it will take to make that happen other than just cataclysmic disaster. My sense is that's gonna need to happen more broadly, that we're gonna have to get to a point where people really are hurting, that we will have to hit rock bottom before a more optimistic vision starts to kick in.

Educators as a group don’t get into it to get rich; they get into it to make a difference. And my feeling is that once teachers are maybe more comfortable and familiar with the technology, they can have a hand in its development. To me that's a positive thing, and that opens up the possibility that they'll be in control.

Tracy: The systems that we have for public education are becoming more rigidified, not more experimental and resilient. And they're becoming increasingly non-functional. And I believe they're going to face some sort of systemic collapse. But what I do see that's hopeful is on the margins—and we highlight some of these in our chapters—there are all sorts of experiments that are going to provide new paradigms that can be adopted when that breach, when that opening really happens in society.

Hear the complete interview on the EdSurge Podcast.
