
Employment Outcomes Data Is All Over The Place. This Report Suggests Ways To Standardize It.

By Sydney Johnson     Dec 13, 2018


Students don’t always choose where to go to school based on where (or if) it might land them a job. But many do. And for those students, accurate employment outcomes data isn’t always easy to come by.

Take Brightwood College, a for-profit institution that shut its doors this month. At the school’s vocational nursing program in Corpus Christi, Texas, 75 percent of graduates were reported to have landed jobs as licensed practical and licensed vocational nurses—but that’s only under the state’s definition of employment. Under the employment standards of the school’s national accreditor, ACICS, that rate dips to 59 percent.

Reporting requirements vary across educational accrediting agencies and government bodies, from what counts as in-field placement to how long a graduate must remain employed, and some bodies do not require employment data reporting at all. A recent report from the nonprofit Institute for College Access and Success (TICAS) outlines the challenges of assessing and accessing employment outcomes data, calling the issue a “patchwork of uncoordinated data.”

“Getting a job is the most cited reason students choose where and whether they go to college,” says Neha Dalal, a co-author of the report. “We can’t reduce college to economic outcomes, but it’s bizarre that we can’t actually tell you, if you go to this program, what the odds are that you will get employed.”

The report, released Tuesday, examines three entities that oversee higher education—accrediting agencies, state governments and the federal government—and breaks down the pros and cons of the ways each collects college and university employment data.

To satisfy national accrediting requirements, some institutions survey alumni. But the report points out that verifying these surveys can be costly, and without verification, schools can collect and submit inaccurate or misleading data.

That’s particularly concerning to the report’s authors, who list examples of fraud at for-profit institutions such as DeVry, which previously claimed that 90 percent of its job-seeking graduates found jobs in their field of study within six months. The Federal Trade Commission later found the claims to be false, and the school agreed to a $100 million settlement.

At the state level, employment data is often gathered by matching state higher education records against state workforce data, such as unemployment insurance records. According to the report, 35 states currently use this process and another seven are moving in this direction. But state-level employment metrics don’t capture everything: a table in the report shows that only nine states have data systems that include employment data for all institutions, and many states leave out private schools.
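
To make the matching process concrete, here is a minimal sketch in Python. The file names, column names and matching key (“graduates.csv”, “ui_wages.csv”, “ssn_hash”) are hypothetical, invented for illustration; real state systems use secured identifiers and far more careful record linkage.

```python
import csv

# Hypothetical illustration of the record matching the report describes:
# graduates from a state higher-ed data system are matched against state
# unemployment-insurance (UI) wage records. All file and column names
# here are invented; real systems use secured identifiers.

def load_rows(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

graduates = load_rows("graduates.csv")   # one row per program completer
wage_rows = load_rows("ui_wages.csv")    # one row per worker per quarter

# Index wage records by the hashed identifier used as the matching key.
wages_by_id = {}
for row in wage_rows:
    wages_by_id.setdefault(row["ssn_hash"], []).append(row)

# Count a graduate as employed if any wage record appears in the calendar
# year after graduation -- one of the many definitions states disagree on.
employed = 0
for grad in graduates:
    matches = wages_by_id.get(grad["ssn_hash"], [])
    if any(int(m["year"]) == int(grad["grad_year"]) + 1 for m in matches):
        employed += 1

rate = employed / len(graduates) if graduates else 0.0
print(f"Employment rate, one year out: {rate:.1%}")
```

Note that every definitional choice in the sketch, such as which year counts and what counts as a match, is exactly the kind of detail that varies from state to state.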

In addition to differences in how data is gathered, states also vary in what they require to be reported. For instance, Florida’s employment rate metric is measured by “the percent of graduates employed or continuing education one year out,” the report reads, while Minnesota defines its employment rate as whether or not there was a five percent increase in “related-employment.”

The report’s authors recommend that the federal government oversee a standardized method for gathering and reporting employment metrics for graduates. Currently, federal education data can be linked with federal-level workforce data, such as records from the IRS. But there’s a trade-off here, too: IRS data are “only collected annually and do not record employment intensity, occupation, and a number of other variables needed to calculate cohort exclusions, making it difficult to determine employment timing,” the report reads.

Ironing Patches

Standardizing outcomes reporting and employment data has been an uphill battle for advocates. Efforts such as the Quality Assurance Taskforce, made up of think tanks, colleges, coding bootcamps and government agencies, have formed in an attempt to bring consistency and help students find accurate employment information about the programs they are considering.

That effort is still moving forward. But agreeing on what the employment metrics should be and how to collect them has been a point of friction, according to Paul Freedman, a principal consultant at Entangled Solutions, which formed the taskforce. For example, agencies differ on questions such as whether students should be counted if they started a program with no intent to get a job in that field, but simply because they wanted to learn.

“The challenge is what should be included and not included,” says Freedman. “It’s a little bit of a difficult process to herd the cats and get universal agreement.”

The report’s authors suggest two major ways that outcomes reporting could be streamlined. First, to standardize definitions of job placement rates across accrediting agencies and state and federal regulators, they recommend that Congress authorize the U.S. Department of Education to create employment data standards for accrediting agencies. Under the Higher Education Act, the federal government is currently restricted from setting these standards, the report explains.

For vocational programs that are explicitly tied to a career, such as dental assisting, employment metrics can be helpful. But for programs such as those in the liberal arts, the connections are not so clear cut. Occupational codes, which are used to classify workers and occupations, are harder to assign to a degree in English or history than to a vocational nursing program. “We think the threshold earnings rate gives us a window into all of those liberal arts degrees and if graduates are succeeding,” says Dalal.

Second, the report recommends that the federal government calculate and publish a national threshold earnings rate, which measures the portion of a program’s graduates who are employed and earn more than a specified annual amount. The rate would be applied at the program level for all higher education programs.
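
As a rough illustration of how such a rate could be computed, here is a short Python sketch. The threshold amount and the earnings figures are invented for illustration; the report leaves the actual annual amount for the federal government to set.

```python
# Toy sketch of a "threshold earnings rate": the share of a program's
# graduates who are employed and earn above a specified annual amount.
# The threshold and the earnings below are invented for illustration.

THRESHOLD = 28_000  # hypothetical annual earnings floor, in dollars

# (employed, annual_earnings) for each graduate of one program
graduates = [
    (True, 31_000),
    (True, 24_500),
    (False, 0),
    (True, 45_200),
]

above = sum(
    1 for employed, earnings in graduates
    if employed and earnings > THRESHOLD
)
rate = above / len(graduates)
print(f"Threshold earnings rate: {rate:.0%}")  # 50% in this toy cohort
```

Because the rate depends only on employment status and earnings, not on occupational codes, it can be applied to an English program as readily as to a vocational nursing one, which is the appeal Dalal describes.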

Freedman says many of the recommendations align with the efforts that he and the Quality Assurance Taskforce want to see. But he isn’t confident that the federal government will oversee employment regulations and reporting for higher ed anytime soon. “From a practical standpoint, we found it’s more likely that individual states will change their policy than the federal government or accreditors at this point,” he says. “But I don’t disagree with the long-term conclusion that federal involvement would be great.”
