As a reporter, I’m always looking for feedback on my writing. Sometimes I need it fast. But editors aren’t machines; they can do only a few things at a time.
Maybe a machine would be the best proofreader? To be fair, I’m also a little scared of technology replacing me. Machines can already beat me at Go. (In my defense, I don’t know how to play Go.) Can they best me at writing—or best my editor at editing?
Teachers are under similar pressure to grade papers quickly. But correcting essays takes a long, long time—sometimes into weekends. Can technology help? Can the correction and feedback engines being developed today teach kids how to improve their writing?
Two California companies—one old, the other a startup—are building tools that they claim can produce better writers. Turnitin, a 19-year-old company that’s best-known for its anti-plagiarism tool, recently launched Revision Assistant, an essay assignment, composition, feedback and submission platform. WriteLab, an essay feedback program, is much younger; Forbes 30 Under 30 honoree Matthew Ramirez created it at the end of 2013.
I wanted to compare these tools, with the following two questions in mind:
- Do these programs provide useful feedback?
- Do they give the same feedback? That is, do machines agree on what constitutes bad writing? Every editor and teacher has a different definition of poor writing, so divergence wouldn’t necessarily be a knock against them, just an important thing to know. Journalists prefer certain editors; students like certain teachers; writers may favor this robot over that.
Turnitin Revision Assistant
Revision Assistant works as an end-to-end essay platform. Teachers choose from a bank of pre-written prompts and assign them to students through the tool. Students then compose and submit essays through Revision Assistant’s browser-based word processor.
Throughout the drafting process, students can ask Revision Assistant for “Signal Checks” on their essays’ language, focus, organization and evidence. The feedback is presented as signal strength icons that indicate how well the essay fares on these four criteria. This information is intended to be low-stakes and help students improve their writing. The company also claims the program will train itself to offer more precise feedback as essays accrue in its database.
Students can ask for Signal Checks as many times as they like—but only if Revision Assistant deems the essay to be on topic. This is an important caveat: I submitted another letter to the editor that I felt was pertinent to the prompt, but Revision Assistant felt otherwise. The “off-topic” message does not offer much helpful feedback on how far my piece strayed from the topic guideline or how to correct its errant ways.
To test the tool, a representative from Turnitin assigned me this prompt:
To answer, I turned to someone whose expertise I’ve trusted since childhood: Google. I searched for “letter to the editor social media” and found an op-ed in Iron County Today, titled “Letter to the Editor: Social Networking a Disease.” I pasted the column into Turnitin’s Revision Assistant—ironic, I know—and asked for a signal check.
In my opinion, this editorial falls short on several levels. The language is rudimentary. The piece is repetitive, deals mainly in hypotheticals and lacks focus. Its points are scattered, decrying the ill effects of social media on teenage behavior, hiring practices and psychological well-being.
The piece got very high marks from Revision Assistant, which gave it full bars for Language and three bars for each of the other criteria. Here’s what the tool suggested in order for the essay to receive full marks across the board:
Turnitin proffered a suggestion for sharpening the essay’s focus: “Stay focused on your claim and explain your reasons fully to show how they support your position.” The comment is general to the point of being impossible to disagree with. Yes, focusing on your claim and explaining your reasons fully is a good rule to abide by. It is one of the basic tenets of supporting a position in an argumentative essay.
The program also gave sensible feedback on the analysis and organization of the essay, asking the writer to strengthen their stance in this instance:
I, a human writer, agree with this evaluation. Is this essay about venting on social media? Somewhat. That’s a part of it. That isn’t the main idea, though, so why does it arrive so early in the essay? Their later statement, “Social Networking sites such as Facebook, MySpace, and Twitter need to be shut down,” sounds much more like a strong, argumentative thesis.
For organization, it offered this general feedback:
“Clearly explain how the evidence and reasons you are using support your argument. Work on making your conclusion a powerful call to action for your reader.”
I don’t know if “Work on making your conclusion a powerful call to action for your reader” is advice the program gives to every writer. It’s a good tip, though, for the writers of this letter. The conclusion needs some more “oomph.”
And this somewhat specific suggestion on evidence:
“The essay question asks you to convince an audience of other students, teachers, and/or parents to agree with you. What kind of evidence will best appeal to your audience? Revise with them in mind.”
These questions and recommendations sound too much like the general guidelines English teachers give before their students write a paper. I am unsure what revising with the audience in mind would change about the sentence above.
Turnitin Revision Assistant delivered some individualized feedback, some general. Overall, did it do enough to improve a simplistic letter to the editor? I’m not convinced.
WriteLab
WriteLab, by contrast, does not need a prompt. Users can paste any text into the platform and click “Analyze Draft” to get feedback on the clarity, concision, logic and grammar of the piece, which, at first, seems like an imitation of Revision Assistant’s Signal Checks. Rather than giving users an overall signal strength, though, WriteLab tallies up the number of things it quibbles with. Revision Assistant’s feedback focuses on full sentences and the whole piece, whereas WriteLab analyzes individual words and phrases.
WriteLab will analyze any text you put into it, which works to both its advantage and disadvantage. It’s easier to use, and it’s free and open to the public. It’s a national park to Revision Assistant’s private ranch—schools must purchase a $10 per student per year license for the latter. WriteLab also gives more specific feedback.
I pasted the same letter to the editor about social media into WriteLab. It flagged eight instances where there was a problem with clarity.
The common thread through the advice on clarity was a distaste for the passive voice.
I agree with these edits. The writers veil their subjects and fail to be direct in their statements, and their arguments lose steam. I found each of the eight recommendations on clarity sound. I would have made them, had this been my letter.
Not all WriteLab’s advice is sound, however. On the issue of concision, the program recommends that “Social networking sites can bring out the worst in people, which can affect some to the point of a psychological disorder” become
“Social networking sites can bring out the worst in people, which can affect some to a psychological disorder.”
It runs afoul of the sentence’s idiom, confusing “to the point of,” which delineates the progression of something into a certain state, with “The point of this is that,” a common writing crutch (I’m guilty of it, too).
WriteLab also made no recommendations about the cohesion and logic of the essay. Because it lacks a prompt and a database of past essays, WriteLab does not have as strong a framework as Revision Assistant for questioning the letter writers’ logic. The writers received no demerits for the logic of their argument, which Revision Assistant flagged as unsubstantiated. The letter’s argument could have, in my opinion, used a booster seat.
On the flip side, Turnitin did not flag any superfluous language, instead commending the essay’s “formal and appropriate” diction. Where Revision Assistant offers feedback that is often too general—sometimes to the point of not being useful—WriteLab tries to be specific and precise with its suggestions, at the risk of being inaccurate or changing the meaning of a sentence. Can I have a fusion of both?
Until writing engines can provide feedback that pushes students to think critically about their writing, these programs will not replace teachers (or editors. Thanks for the help, Tony!). That’s a big ask, one that the creators of these tools are only beginning to address. Despite my witty criticism, I am a proponent of these tools. Students should receive more feedback on their writing; they will be better for it.
For my own amusement and bemusement, I threw this article into WriteLab for evaluation.
A fun fact: my writing is, apparently, neither clear nor concise. If you’ve gotten this far, I must have duped you into thinking otherwise. Sorry.
Things got very metacognitive when WriteLab asked me to refocus a sentence about the focus of the letter in Iron County Today. The suggestion, however, didn’t make sense:
Would the sentence then become “The essay and its focus are analyzed by the program, which asks questions like ‘Are my ideas stated clearly?’” That’s sending me straight into the pit of passive voice.
WriteLab flags common writing errors, but it often fails to discriminate between the correct and incorrect usage of common phrases.
Robots may replace me as a journalist, but English teachers can rest assured that machines have a long way to go before they can offer quality feedback. There seems to be little possible harm in students receiving more feedback, and Revision Assistant and WriteLab could be far worse. When I need an editor, though, I’m sticking with Tony Wan.
Turnitin’s Content Specialist, Jamie Calhoun, offered thoughtful feedback on this review. Here’s what she says Revision Assistant is—and is not—designed to do.