Old North

Education, public life, and the Tar Heel State

A collection of writing, mostly about North Carolina.

Business Can Pay to Train Its Own Work Force

The Chronicle of Higher Education  |  June 22, 2015

In the spring of my senior year, I interviewed for a contract negotiation job at a law firm.

My college major was in peace, war, and defense, which may have sounded intriguing to professional litigators. But I had no legal training. My chief assets were literacy, an eagerness to please, and a pressing need to pay rent.

The interview got right to the point. “How would you organize a thousand retransmission consent contracts?” asked a stone-faced lawyer, looking across a conference table.

Having never heard of a retransmission consent contract, I offered up the only sensible response.

“Alphabetically?” I asked back.

This was not the right answer. 

But they hired me anyway, and trained me to do the job. This cost them in the short run, while I puzzled my way through FCC regulations and Nielsen ratings, but paid off nicely over time. My contracting knowledge earned the firm solid revenue.

This is how employment is supposed to work. Companies hire broadly educated workers, invest in appropriate training, and reap the profits of a specialized workforce.

Increasingly, however, employers have discovered a way to offload the nettlesome cost of worker training. The trick is to relabel it as education, then complain that your prospective employees aren’t getting the right kind.

“Business leaders have doubts that higher education institutions in the U.S. are graduating students who meet their particular businesses’ needs,” read the first sentence of a Gallup news release last year. Barely a third of executives surveyed by Gallup and the Lumina Foundation agreed that “higher education institutions in this country are graduating students with the skills and competencies that my business needs.”

Bemoaning the unpreparedness of undergraduates isn’t new. Today, however, those complaints are getting a more sympathetic hearing from the policymakers who govern public higher education. 

“We’ve got to adapt our education to what the marketplace needs,” North Carolina Governor Pat McCrory said earlier this year at a conference on innovation. “People are ready to get the work. Let’s teach them these skills as quick as possible.”

The governor spoke shortly after a panel on “New Delivery Models for Higher Education.” Moderated by the head of the state’s chamber of commerce, the session highlighted a particularly innovative approach to education at a tech startup called The Iron Yard.

The Iron Yard is a for-profit code school — it teaches people how to program computers, build applications, and design websites. A 12-week course costs $12,000, promising quick proficiency in one of the tech industry’s in-demand skills.

I don’t object to this, except the part where policymakers and business leaders call it a new model for higher education. It is actually a new model for worker training, one in which the workers bear the costs and risks of their own job-specific skill acquisition, while employers eagerly revise the curriculum to meet their immediate needs.

--------------------------

Critics of contemporary higher education lament the decline of a broad, humanistic education, but often misidentify the cause. To the extent that such a curriculum is on the wane, the culprit is not ‘60s-vintage faculty radicalism or political correctness run rampant, but the anxiety-driven preference for career-focused classes and majors. 

Most faculty would love to have more students delving into the classical canon—or any canon. But they’re up against policymakers and nervous parents who think average starting salaries are the best metric for weighing academic majors.

Private-sector imperatives also threaten to dominate extracurricular time. I now work at a large public university, where I serve as a staff mentor to a cohort of freshman undergraduates. Inevitably, I spend the first few weeks of the fall semester tamping down anxiety about summer internships. Students who haven’t yet cracked a textbook or met a professor worry about finding summer programs to boost their resumes.

My university recently began offering grants to low-income students who otherwise can’t afford to take internships. It’s a great program, and I’m glad we have it. But it means that academe and its donors are now responsible for subsidizing profitable companies that want future employees to have work experience, but don’t want to pay students for a summer’s work. There are many ways society could choose to address the inequity of unpaid internships. Having universities collect and distribute tax-deductible grants to the private sector’s trainees is perhaps not the most straightforward.

This blurring of the distinction between education and job-skill training isn’t simply a fight over academic priorities. It’s a fight about who pays the cost of doing business—the companies that profit, or some combination of workers and taxpayers. The more we’re willing to countenance a redefinition of job training as education, the more we ask society to shoulder what were once business expenses.

--------------------------

This same tension between public investment and private returns is playing out in the realm of research. 

As state funding for research universities has ebbed, pressure has increased for academic institutions to more efficiently monetize their discoveries. Policymakers talk of shortening the pipeline from the laboratory to the marketplace, putting ever-greater emphasis on the kind of applied research that might yield quick returns.

This is all well and good—no one begrudges the discovery of a breakthrough drug or a valuable new material. But with finite campus resources, more emphasis on marketable products will inevitably mean less focus on the foundational, long-range science that may not yield tangible results for decades. This has already happened in the private sector, where a relentless focus on short-term returns has crowded out spending on fundamental research. Sending universities down the same path risks eroding one of our most important bastions of basic science.

I sat through an economic development workshop recently — titled “Research to Revenue” — in which a successful startup CEO spoke with admirable bluntness about the need to keep university researchers involved in product development but off the company payroll.

“The salaries of these people are often significant,” noted the private-sector executive. “As a company, you really don’t want to take that on unless you absolutely have to.”

Of course not. Much better to let taxpayers, through universities and federal grant dollars, pick up the tab while private-sector “partners” guide faculty efforts toward privately profitable ends. This is what a more entrepreneurial campus means, after all — a campus more attuned to profit.

--------------------------

“The thought now and then assails us that material efficiency and the passion to ‘get on’ in the world of things is already making it so that the liberal arts college cannot exist,” UNC President Edward Kidder Graham wrote in 1916. “But this is a passing phase,” he continued, advising colleges to keep their focus on creating and teaching “the true wealth of life.”

If Graham’s confident vision feels today like a hopeless anachronism, then we begin to measure the distance of our retreat. Faced with recessionary state finances and lawmakers who regard “public good” as an oxymoron, university leaders have reached for the language of investment and return. The consequences of that narrow view are mounting.

Celebrating the intrinsic value of public higher education is not a nostalgic indulgence, but a joyful duty. We spoke that language once; we should try it again. 

Eric Johnson works in student-aid communications at the University of North Carolina at Chapel Hill. The views expressed here are his own.

 

Originally published in The Chronicle of Higher Education at chronicle.com/article/Business-Can-Pay-to-Train-Its/231015/

Made in Chapel Hill.