From faculty who deliver classroom lectures to copywriters who create recruitment brochures, colleges employ plenty of professional communicators.
These roles often shift as institutions adopt new technologies to better convey information to students. Accordingly, the spread of communication tools powered by artificial intelligence has created a new kind of higher ed job: college chatbot writer.
These are the wordsmiths who craft dialogue for chatbot “scripts,” the curated conversations that unfold when algorithms correspond with humans. In selecting words, images and emojis, writers not only deliver information, but also establish the voice, identity and character of a college chatbot.
This “personality” matters in part because a chatbot can come to represent a university, like a mascot during a college sporting event. But it also has a serious job to do, and students’ perceptions of the tool affect how successfully it can draw them into meaningful, interactive conversations about admissions, enrollment, academics and campus life.
“We really focus on making sure that we are being as helpful as possible in our communications,” says Kevin Kovaleski, executive director of provost and enrollment communications at Arizona State University, who helps write scripts for the university’s chatbot, Sunny. He says he wants to avoid “falling into the trap of just being an ASU cheerleader.”
Here’s a look at what it takes to bring college chatbots to life.
Diverse Expertise Required
Chatbot writers come from a variety of backgrounds. Creating scripts requires an understanding of narrative convention—such as how to initiate a conversation, take turns speaking and ask for feedback—so writing experience can be useful for the role, says Marissa Keech, a doctoral student at Georgia Institute of Technology and a user experience researcher for AdmitHub, a company that develops chatbots for colleges.
Developing the voice of a chatbot also draws on research and skills from fields including psychology, linguistics, sociology and human-computer interaction. “It’s incredibly diverse, a junction of so many different areas,” Keech adds.
Of course, college knowledge helps, since chatbots are primarily used to help students navigate the bureaucracy of higher education. Both during recruitment and after enrollment, bots at some campuses prompt students to meet deadlines and submit missing documentation and also offer personalized information about housing assignments and scholarships.
With all this expertise required, writing for a chatbot sometimes takes a whole team, plus support and information from offices across the institution. At Arizona State, three people are primarily responsible for writing Sunny chats. They draw from an array of research sources, including psychology texts (like “Mindset” by Carol Dweck), student surveys and Reddit threads popular with students. The messages the writers craft go through an editing process to clean up grammar and to double-check emoji selections.
“It’s never one person’s whim that goes from gut reaction to a live text,” Kovaleski says.
The chatbot at Georgia State University and its partner Perimeter College shares its identity with the institution’s mascot, a blue panther named Pounce. Five university employees write for Pounce, and they often run message ideas by students before finalizing them.
It’s up to chatbot writers and their teams to figure out the most relevant periods of the academic calendar to deploy these messages. Students “definitely like the just-in-time reminders,” says Matt Lopez, assistant vice president of enrollment services at ASU.
These writers also have to figure out who most needs which messages, and how many is too many to send. At Arizona State, most chatbot campaigns target specific subgroups of students—say, several hundred who still haven’t selected dorm rooms for next year—rather than the entire campus, so that each student receives at most one or two messages a week.
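The targeting-plus-frequency-cap rule described above can be sketched as a simple filter. This is a hypothetical illustration only; the field names, the two-message cap, and the dorm-room campaign are stand-ins drawn from the article, not ASU's actual system:

```python
from dataclasses import dataclass
from typing import List

# Invented cap, reflecting the "one or two messages a week" guideline.
MAX_MESSAGES_PER_WEEK = 2

@dataclass
class Student:
    name: str
    has_dorm_assignment: bool
    messages_sent_this_week: int = 0

def dorm_reminder_audience(students: List[Student]) -> List[Student]:
    """Target only students who still need a dorm room and are under the weekly cap."""
    return [
        s for s in students
        if not s.has_dorm_assignment
        and s.messages_sent_this_week < MAX_MESSAGES_PER_WEEK
    ]

students = [
    Student("Ana", has_dorm_assignment=False, messages_sent_this_week=0),
    Student("Ben", has_dorm_assignment=True),
    Student("Cam", has_dorm_assignment=False, messages_sent_this_week=2),
]
audience = dorm_reminder_audience(students)
print([s.name for s in audience])  # only Ana qualifies: no dorm yet, under the cap
```

The point of the sketch is that targeting and rate-limiting are decided per campaign, per student, before any message is written.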
However, some messages do go out to a broad audience, like a note of encouragement the Sunny chatbot sent across campus during finals last fall.
“We’ll get students thanking the chatbot for that engagement,” Kovaleski says.
Developing a chatbot’s tone and personality requires a strategy, Keech explains. If the messages are primarily fun and playful, students may perceive the chatbot like a peer or friend and may not take it as seriously as intended. If the bot converses in a very formal way, it could come across like a superior, such as a professor or an administrator. While that authority can command respect, it can also create a power differential that serves as a barrier to engaging some students.
For college bots that engage via text message, a medium typically reserved for informal, intimate communication, Keech often recommends corresponding like a “near-peer,” a classmate who understands the nuances of campus culture but doesn’t assume the familiarity of a best buddy.
One way chatbot writers can indicate a level of formality is by using, or not using, a student’s name in a message. It’s the difference between a text that says “Hey, happy Friday!” and one that says “Happy Friday, Rebecca!”
“How often does a friend call you by your first name in a text? A lot of people don’t reference your formal name unless it’s a formal scenario,” Keech says. “If you’re a professor or in a position of power talking to a student, you very often will say their name.”
Another indicator is the use of emojis and multimedia. Emojis and GIFs have conventions and connotations, and Keech cautions chatbot writers to learn those meanings before deploying a stream of images in a mass text.
At Arizona State, Sunny is designed to be “friendly, helpful, with a little bit of an edge, but it’s never mean,” Kovaleski says. To establish and maintain that personality, the chatbot team developed a lexicon with language do's and don’ts. It offers guidelines about whether Sunny uses colloquialisms or esoteric higher ed words like “matriculate.”
“We don’t do slang; we’re a university,” Kovaleski says.
Because of the power of peer influence to affect student behavior, Georgia State’s Pounce chatbot communicates with students like a supportive buddy who possesses extra knowledge.
“It’s like your friend who’s a little bit smarter than you,” says Razi Shadmehry, a public relations specialist at Georgia State. “It’s still a cool friend who understands the memes that you understand.”
When choosing memes and GIFs for Sunny to use, ASU writers try to pick ones with proven staying power rather than experimenting with brand-new jokes. The goal is conveying empathy, not getting laughs.
For example, if Sunny asks a student whether he or she will attend a particular event, and the student says yes, Sunny may respond by sending an animated happy dance. Or if a student texts Sunny, “You’re so helpful, I love you,” the chatbot may respond, “The feeling is mutual,” and include a heart emoji.
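A chatbot “script” of the kind described above can be thought of as a mapping from a student's intent to a canned, on-brand response. This minimal sketch is illustrative only; the keyword matching is deliberately naive (production bots use trained language models), and the intents and replies are invented, not Sunny's actual script:

```python
# Scripted branching: classify the student's reply into an intent,
# then look up the pre-written response for that intent.
SCRIPT = {
    "confirm_attendance": "Awesome, see you there! [happy-dance GIF]",
    "express_thanks": "The feeling is mutual ❤️",
    "unknown": "Hmm, I'm not sure I follow. Could you rephrase that?",
}

def classify_intent(message: str) -> str:
    """Very naive keyword matching, standing in for a real NLP classifier."""
    text = message.lower()
    if "yes" in text or "i'll be there" in text:
        return "confirm_attendance"
    if "love you" in text or "thank" in text:
        return "express_thanks"
    return "unknown"

def respond(message: str) -> str:
    return SCRIPT[classify_intent(message)]

print(respond("Yes, I'm coming!"))
print(respond("You're so helpful, I love you"))
```

Every response a student can receive is written and edited in advance, which is why the writers, not the algorithm, own the bot's personality.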
“While we aim to be relatable, some of us are approaching 40 and some are past 40, far from the traditional college student experience, so we don’t try too hard with that,” Kovaleski says. “You know it when you see it when a university is trying too hard. We’ve messed up a couple of times, but we try not to go there.”
Staying on the right side of relatable is important because a chatbot may easily slip into being annoying or condescending—and if students get frustrated, they may opt out of receiving messages. For example, ASU chatbot writers know from feedback that transfer and graduate students typically prefer Sunny to be more business-like than fun. Even incoming 18-year-olds aren’t monolithic; some may not respond well to overly chatty texts.
“We’re not treating them like they’re young tech-enabled kids that only are going to respond to memes and emojis,” Kovaleski says. “We don’t trivialize anything. We’re never talking down to students.”
And Sunny abides by the golden rule of artificially intelligent tools.
“We never attempt to hide the fact that this is a chatbot,” Kovaleski says. “That honesty is important.”
Building Human Relationships
With chatbots able to tackle so much communications work at a large scale, colleges’ human employees may worry about being replaced. But at least in theory, chatbots may actually free college staff to do more meaningful work than chasing down students who haven’t submitted registration forms.
There are “questions that we are frequently asked over and over and over, in person and via email, that now the chatbot can respond to so that we can redeploy people in different ways,” says Scott Burke, associate vice president of undergraduate admissions at Georgia State University. The chatbot can answer thousands of questions, in fact. “We can have admissions counselors work with students who really and truly need their time and attention.”
At Arizona State, for example, Sunny’s reminders to new students about submitting immunization records have proven so effective that staff no longer need to cover the topic in their orientation programs, freeing time to discuss more substantial matters.
And the cluster of staff who used to cold call students about enrollment issues now “warm call” only those who have indicated through the chatbot that they need assistance or would like to have a conversation.
“Now we’re actually putting all of that same attention into a smaller pool that has actually told us how we can help them,” Lopez says. “This is allowing us to get deeper into the relationship-building.”