Robots Take Over The Classroom

Special Education

By Christina Quattrocchi     Apr 8, 2014

True confession that I’m not proud to share: Robots in the classroom have long terrified me. They represented a line in the edtech world that I wasn’t willing to cross. Virtual reality? Sure. Adaptive technology? Absolutely. But robots, or particularly robots tricked up to be teachers? No way.

Sure, I worry as much about the inevitable robot takeover as anyone. But the real reason I shied away from the robot hype was that at the heart of the learning experience is a deeply human connection, the sharing of knowledge from one mind to another. And robots, cold, metallic, and distant, seemed to sever that connection. So I drew my line, and I wasn’t willing to cross it.

But two things happened to change my mind. First, I saw a video of what it was like to be a robot student. Then, I became a robot myself.

This all started with a video submitted for the White House Student Film Festival. The 3-minute film portrays the life of a boy who uses a robot to go to school. Hmm, I thought when a colleague mentioned the video: there’s no way this boy truly connects with his school when everything is intermediated by a box of gears.

Here’s the video.

In it, seventh grader Kyle Weintraub, who has lymphoma, talks about leaving his hometown in Florida to get treatment in Philadelphia. But he left a robot in his place. As a result, Kyle still attends classes, goes to lunch, and interacts with his friends in Florida.

The telepresence robot, which basically looks like a Segway with an iPad attached, makes it possible for Kyle to Skype into classes. Kyle also controls where the robot rolls. The robot’s camera lets Kyle see everything around him.
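For the technically curious, here is a rough idea of what “controlling where the robot rolls” can amount to on the software side. This is not the Double Robotics API, just a hypothetical sketch in Python: a few keypresses mapped to drive commands and serialized as JSON for an imaginary robot endpoint.

# A hypothetical sketch (not the Double Robotics API): how a remote student's
# keypresses might be turned into drive commands for a telepresence robot.
import json

# Imaginary command schema: forward/turn velocities, plus a pole control for
# "standing up" (raising the screen) and "sitting down" (lowering it).
KEY_TO_COMMAND = {
    "w": {"forward": 0.5, "turn": 0.0},   # roll forward at half speed
    "s": {"forward": -0.5, "turn": 0.0},  # roll backward
    "a": {"forward": 0.0, "turn": -0.5},  # rotate left
    "d": {"forward": 0.0, "turn": 0.5},   # rotate right
    "u": {"pole": "up"},                  # raise the screen to "stand"
    "j": {"pole": "down"},                # lower the screen to "sit"
}

def command_for_key(key):
    """Translate one keypress into a JSON payload the robot could consume."""
    payload = KEY_TO_COMMAND.get(key, {"forward": 0.0, "turn": 0.0})  # unknown key: stop
    return json.dumps(payload)

if __name__ == "__main__":
    # Simulate the remote student tapping a few keys.
    for key in ("w", "a", "u"):
        print(key, "->", command_for_key(key))

The real system obviously does far more (live video, safety limits, and so on), but the core idea is that simple: a remote user sending a stream of small movement commands to a rolling screen.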

Without the robot, Kyle would be isolated in Philadelphia, without his friends, without his teachers, absorbed by a life filled with cancer. Instead, this cold, metal, robot allows him to stay in touch with a community that has been an important part of his recovery. How incredible.

After seeing this video, I conceded. Robot students are great - if they allow students to learn and to connect with their community when they otherwise couldn’t. But robot teachers - no way.

While I learned to accept student robots, my robot concerns were only magnified by a video of a robot teacher. So I called up the creators and demanded answers.

Greg Sutton, co-founder of Saskatchewan-based company TinyEYE Therapy Services, answered the phone. For the past nine years, TinyEYE and Sutton have helped speech therapists reach students in rural and remote schools by using video calls. TinyEYE currently serves 300 to 400 schools in 12 different countries.

“We had been doing online speech therapy -- like Skype on steroids with games included in the software and all sorts of bells and whistles to make the therapy session magical. But we couldn’t get into the child’s classroom to see them interact in class, and we couldn’t be in the IEP meetings in the same way, or attend open house nights. Having a speech therapist live on a screen is not the same as being there,” explains Sutton.

In autumn 2013, he began incorporating robots. TinyEYE connects schools with ESSDACK, an education materials cooperative, which sells Double Robotics robots.

One robot costs $2,300; that covers the hardware but not the iPad, and TinyEYE provides the software for free. Schools pay additional fees for the remote speech therapist on the other end, based on the therapist’s specialty and the hours worked, and TinyEYE takes a cut of those fees. The specialist uses the robot to visit students in their classrooms, watch their behavior, and interact with them in that environment.

So, using robots to pipe in services like speech therapy that students otherwise might not get makes sense. Still, I wondered: could a student really connect with these robots, and is it really any better than a Skype call?

In the middle of my phone interview with Sutton, he offered me the chance to take one of the robots for a spin to see for myself. Tentatively, I agreed. I wheeled around his office, saying “Hi!” to his coworkers. I parked my robot across a table from Sutton.

Christina as a robot.

I had a 360-degree view of his environment. I could see his hands, and tell he wasn’t emailing anyone while he was speaking to me. I used my keyboard to move my robot body left and right. I could stand up and tower over him or sit down and see him, camera-to-eye. I felt “present” with him in a way that I had never been during other phone or Skype interviews.

Something was different about this. And it wasn’t half bad.

As I signed off our call and left my robot body, I could feel the lines I’d drawn around the robot world fade away. I became a robot, and, truly, I liked it. Now that could have been because of Sutton’s natural Saskatchewan charm - but I also think something else happened.

I was a part of his world. I was more present because I could see more and interact with more. In turn, he could show me around his environment and I could react to his surroundings, making my visit to his office more “real.”

If this could give students access to professional services or even mentors that they wouldn’t have otherwise, then maybe this line is worth erasing.

As the edtech world continues to develop and push the boundaries of what we see as possible and acceptable, there will be a slew of new technologies that fall at various points on our spectrum of comfort. It’s right for us to draw lines. However, it is also right to be willing to erase them, to try things out, and to side with what’s best for kids, not what’s best for comforting our own fears.
