BETTER TESTS FOR LESS? Multiple choice tests may not be popular when it comes to assessing understanding, but at least they're easy and cheap to grade. Not so for essays and open responses. However, a white paper published by GettingSmart suggests that using machines to score essays can be up to 75 percent cheaper than human scoring. Take a look at the full paper (PDF) for more about the methodologies and assumptions behind the pricing models. The study is a follow-up to the Hewlett Foundation-funded work that showed machine graders can produce scores similar to humans' on long essays (but not on short ones). Expect this to stir up some controversy too, especially as both PARCC and Smarter Balanced are considering turning to robo-"readers" to help with their new assessments.