PARCC Shares 'Lessons Learned' from Common Core Field Tests

column | Common Core

By Mary Jo Madda (Columnist)     Dec 1, 2014

Not everything's a walk in the PARCC.

During the spring of 2014, more than 1.1 million students in approximately 16,000 U.S. schools took field tests of a Common Core assessment developed by PARCC, short for the "Partnership for Assessment of Readiness for College and Careers." According to a recently released report from PARCC summarizing key findings from these tests, the ultimate goal of the pilots "was to confirm that PARCC is a quality assessment program and to make improvements based on the experience" before the formal administration of the exam to "an estimated 5 million students" in 2015.

So how did the field tests fare, according to the test administrators, test coordinators, and students who tested the test?

The report found that overall, most students reacted positively to the look and feel of both the computer-based (CBT) and written versions of the test. Ninety-four percent of students said they had sufficient time to complete the English language arts (ELA) portion of the CBT, and 87 percent reported the same for the math CBT. As for ease of use on the computer-based assessment, 89 percent said they were comfortable typing their ELA answers--but only 65 percent said the same for the math exam.

The PARCC report also found that no "system-wide" technology breakdowns occurred during the pilot, meaning no problems occurred across all or at a majority of PARCC test sites. Rather, most tech issues concerned individual devices, such as firewall or computer settings that needed to be changed.

However, the report cited several areas for improvement, including issues with test directions, training manuals and some of the testing procedures overseen by Pearson--issues that EdWeek's Sean Cavanaugh also points out.

  • Only 39 percent of test coordinators agreed that the process for setting up test sessions and registering students through PearsonAccess (the CBT delivery system) was straightforward and simple.
  • Worse, only 28 percent of coordinators reported that TestNav8, Pearson's cloud-based system for delivering tests via Web browser, worked well during the test administration. (40 percent disagreed or strongly disagreed.)
  • 42 percent of test administrators found the administration manuals user-friendly, while a third disagreed or strongly disagreed and 19 percent were neutral.

The report also suggested educators could better prepare students for the contents of the test: 63 percent of students reported that the math CBT was "harder than school work," while 33 percent said the same of the ELA CBT exam.

Will these changes get implemented before the next round of testing? One thing's for sure--there's accountability on both sides to improve this process, and hopefully before more than 5 million students take the official PARCC exam.
