What you see when it comes to for-profit school job placement rates may not necessarily be what you get.
Two recent examples:
- Education Management, one of the largest operators of for-profit schools, claimed in its IPO filing with the SEC a year ago that 87 percent of its students in 2008 got jobs in their field of study or “a related field” within six months of graduation.
According to its recent 10-K, the number had dropped to a still impressive 85 percent. More than half of Education Management’s students come from the Art Institutes, which includes culinary programs.
- The Coalition for Educational Success, whose members appear to be mostly schools run by Education Management, claimed in a “fact sheet” two weeks ago that “more than 75 percent of two-year for-profit college graduates were placed into jobs within six months.”
Where do those figures come from? Interesting question.
And wait until you see the answer.
I started wondering about these placement rates when I saw Education Management’s job placement claims, which appeared unrealistic to me for a company top-heavy with art school enrollees.
By contrast, according to the Association of Independent Colleges of Art & Design, around 60 percent of the students who graduated in 2008 from the Art Center College of Design in Pasadena, among the country’s most rigorous and selective art schools, had full-time jobs (or were self-employed) within one year of graduation. (That doesn’t include part-timers and freelancers.)
Education Management’s claim appeared even more dubious after I saw one of its former job-placement officials, Kathleen Bittel, testify two weeks ago in front of the Senate HELP Committee.
Among the claims in her written testimony: that employees at her school in Pennsylvania “were expected to convince graduates that skills they used in jobs such as working as waiters, payroll clerks, retail sales, and gas station attendants were actually related to their course of study in areas like graphic design and residential planning.”
That prompted me to ask a few simple questions of Education Management: What constitutes “placement”? And what is the meaning of “related field”?
I started asking on September 17—and have asked at least four more times since. Other than the company asking what my deadline was (I told them it was noon two Thursdays ago) I have yet to get an answer. I phoned the spokeswoman again on Monday. She asked me the context of the question and said she would get back to me. (I’m not holding my breath.)
As for the industry’s 75 percent number:
Turns out that number is just that: a number. And not just any number, but, I believe, a misleading number that doesn’t tell the complete story.
So where did it come from?
At first, an outside spokeswoman for the Coalition for Educational Success referred me to a sentence from a 2007 report by the Imagine America Foundation (formerly known as the Career College Foundation), which cited a footnote from an Education Department-National Center for Education Statistics study. Imagine America is affiliated with the Association of Private Sector Colleges and Universities, which until recently was called the Career College Association.
The Education Department study cited in the footnote to Imagine America’s report, however, did not contain the 75 percent figure.
Which gets back to the question: Where did the 75 percent come from?
The Coalition for Educational Success spokeswoman said she would put me in touch with someone who could walk me through the numbers. That never happened. I then called Charles River Associates, which cited the 75 percent in a report it prepared for the Coalition.
Meantime, I did hear from the Association of Private Sector Colleges and Universities, which put me in touch with JBL Associates, the firm that crunched data for Imagine America’s 2007 study.
JBL said the 75 percent was the result of data it culled from the National Center for Education Statistics database using JBL’s own parameters.
JBL even sent me a copy of the findings—and here’s where it gets good:
As the “source” it cited the same Education Department study cited in a footnote by Imagine America.
When I mentioned that to JBL, the researcher apologized and said the correct source should be JBL, which used the Data Analysis System at the Education Department’s National Center for Education Statistics.
But backing into the numbers, using available data, tells a somewhat different story. Without getting into the gritty detail, no matter how the data is pulled, it’s based on a tiny population: only students who both took the survey and received a certificate or degree.
(Even the exact population on which the surveys were based is subject to debate given the wide number of variables that can be used.)
But more than that: The 75 percent doesn’t distinguish whether students got their job before, during or after they got their degree. And JBL concedes it does not specify whether the employment is full or part time. “Thus,” the researcher says, “I assume that means any work,” regardless of whether it is tied to any field of the student’s study.
Yet a deeper dive into the numbers was clearly possible, given the data available on the Education Department's website, including not just whether the student had a job, but the type of employer, the industry and even whether it is related to coursework. And, yes, there is even data on whether the student had the job while enrolled.
That’s not all: The Charles River report said it was unable to compute the same placement rate for public two-year colleges because “the data is not available.”
Not only is it available, it was included in JBL’s own findings, which show that 88 percent of public two-year college graduates were employed.
“This means public two-year graduates reported a measurably higher employment rate than graduates of for-profit two years,” a spokeswoman for the NCES says.
All of which, if you ask me, proves the 75 percent is meaningless.
The kicker: That number appears to be so meaningless, and even misleading, that after I started asking questions, it disappeared from the fact sheet posted on the Coalition’s website. Makes me wonder what else isn’t quite right.