Home Depot Syndrome, the Purple Squirrel, and America’s Job Hunt Rabbit Hole
Waiting for Superman
In the olden days, as Cappelli sketches them, HR departments served as “reality testers.” Say a line manager at a big firm got permission to hire a new worker. “He’d say, ‘We need somebody with an MBA for this.’ And the HR people would say, ‘You really need an MBA degree for that? Are you sure? What’s important in this job?’ … They’d be pushing back a bit in terms of the job description.”
This injected a degree of flexibility into job criteria.
“Those guys are gone now,” Cappelli continues. “Now the requisition often goes automatically to somebody who inserts it into the applicant-tracking system. So they kind of take the wish list from the hiring manager, who is often looking for Superman—the Purple Squirrel, as they say in IT—something that doesn’t exist.”
Unless maybe it does, what with all the talent among the ranks of the unemployed. When there are three or four job-seekers for every vacancy—and some postings draw applicants by the hundreds—firms have an understandable incentive to wait for a dream candidate to show up. And ideally, a dream candidate willing to work for a low salary.
In that Manpower survey, 11 percent of the employers reporting skill shortages attributed them to applicants' unwillingness to accept job offers at the wages the companies were willing to pay.
“Given what we know about the difficulty all respondents have in recognizing problems that are actually their own fault,” Cappelli writes, “the real percentage of employers who have difficulty hiring because they are not offering adequate wages is likely to be much, much higher.”
He cites the case of a parts-supply company whose inability to fill 40 machinist vacancies had been estimated to be dragging sales down by 20 percent. “The jobs reportedly paid $13 per hour, which might sound good. But the Bureau of Labor Statistics reports that the average wage for such jobs is more than $19 per hour,” Cappelli notes. “Would that have had some effect on the company’s ability to find candidates? You bet.”
But, again, that has little to do with a skills shortage. “A real shortage means not being able to find appropriate candidates at market-clearing wages,” Cappelli noted in The Wall Street Journal. In his book he adds, “When I hear stories about the difficulty in finding applicants, I always ask employers if they have tried raising wages, which in many cases have not gone up in years. The response is virtually always that they believe their wages are high enough.”
As the case of that parts-supply company shows, there is often a cost associated with letting positions remain vacant. But most of the time it’s hard to quantify—whereas the benefit of not paying another salary is as clear as day. That’s another source of grit in the gears of hiring.
In one sense, you can trace it all the way back to what typically spurs companies to post job openings in the first place. It is not, as an economist of the rational-choice school might suppose, a determination that a new worker will add more profit than expense to the balance sheet. Arguments to hire, Cappelli says, “mainly come from people complaining about overwork.”
In some cases—perhaps especially when high unemployment keeps existing workers docile—the complaints come from a company’s own customers. Cappelli recently came across a financial-services company whose retail customers were actually signing petitions asking the firm to deploy more service people.
The prevailing approach among American companies, as Cappelli characterizes it, has been, “Let’s just cut staff until we notice blood. But part of the problem, too, is that the accounting systems internally have not been sophisticated enough to notice when there is blood.” He suggests that the development of performance metrics capable of measuring the cost of lost opportunities, or of burnout among existing workers trying to do two jobs at once, would redound to the benefit of job-seekers and bottom lines alike.
Chris Ittner, the Ernst & Young Professor of Accounting at Wharton, has a more skeptical view.
“In accounting, we don’t account well for intangible assets, period—but with people especially,” he agrees. “Firms have a hard time even finding out what kind of payback they get from training their employees.”
But he deems it unlikely that American companies are suffering from a collective failure to realize the moneymaking potential of ramping up their payrolls. If profits in a given sector were being held down by overly lean workforces, he suggests “there would be an arbitrage opportunity.” In other words, you’d expect one or several companies to accelerate hiring as a way to gain an advantage over their competitors—and if that strategy panned out, their competitors would likely follow suit. But there’s little evidence of that happening.
Ittner observes further that hiring is inherently risky.
“If you choose to hire, you’re stuck with that decision—because it’s not that easy to fire people. And firms don’t want to make an investment when there’s so much uncertainty about what will happen in the future,” he says. “So you might be willing to accept the cost of lost productivity or lost sales.”
A Blind Alley
In virtually every discussion about America’s jobs crisis, a familiar solution is trotted out. If only we could get more people through college—whether that means a bachelor’s or two-year associate’s degree—we’d have a workforce matched to employers’ needs.
On an individual basis, this advice holds water. College graduates have substantially more success in the job market than their less-schooled peers. (For Penn alumni, the picture is rosier still; for a snapshot of how Quakers have been faring in recent years, see chart. The Penn Alumni Career Network is another potentially valuable resource: www.vpul.upenn.edu/careerservices/pacnet.)
On a societal basis, however, there are good reasons to doubt the efficacy of that prescription. Cappelli worries about over-education. Citing survey data, he points out that many American workers have—and have paid for—more education than what’s required by the jobs they are doing. That can be viewed as a deadweight loss for the economy at large, and it’s getting worse.
“In order to prove to an employer that you can do this job,” he explains, “maybe you get an extra degree. So rather than two years’ experience as a pharmaceutical rep, I go get a master’s degree in pharmaceutical rep work. In Philadelphia, there’s a local business school that has an MBA degree in pharmaceutical marketing, a highly specific degree which you never would have seen a generation ago. And it’s an expensive way to get the experience. It’s a time-consuming way. And I’m sort of over-relying on academic institutions to get this.”
From an employer’s perspective, this approach beats “growing all your own talent,” as the IBMs and GEs of yesteryear did. After all, a company invests in training its employees at the peril of having them hired away by a competitor who avoids the expense. So even paying a premium for a candidate who arrives with an equivalent academic credential can make more sense.
The problem, as Cappelli sees it, is that this dynamic has fueled an “explosion of certifications” in the US, the majority of them provided by for-profit organizations and vocational schools. Aside from the incentive this creates for people to pursue more credentials than jobs require, just to “muscle out” other job-seekers, Cappelli wonders if there may be an underappreciated downside for the employers themselves.
The rise of credentials as a basis for hiring “makes people more plug-and-play,” Cappelli points out.
“You see, for example, in places like nursing and health care, where credentials are everything now, that it makes it easier to pop new people in and pop other people out. Maybe that’s a good thing. It does mean for an employer, though, that everything becomes more like a profession. And your ability to get things done differently might become a little harder. And to get practices that are unique to you, a little harder.
“For example, say I’m an IT person working in your company, and you’d like me to get good at your legacy computing system. Should I do that? What’s in it for me? It’s risky, because I’m going to spend a couple years working on this system, and I don’t get any credential out of it that’s useful elsewhere. So maybe I don’t even take that job—I’d much prefer to work for less for somebody else where I get a credential at the end, which is transferable.”
Part of the appeal of a “plug-and-play” approach to employers is the sense that it affords them flexibility. “If we’re going to change our products or change our strategy,” Cappelli explains, “we just get rid of everybody, and then we’ll hire in a new group, with different skills.”
Yet “some of this is not completely demonstrated by evidence,” he adds. “A generation ago you would have heard companies like IBM talk about how lifetime employment gave them flexibility—because people internally didn’t resist changes. You would change your products, you would retrain the people—off you go.”
Still, the rise of vocational certification leaves the brunt of the risk on job-seekers. Cappelli has learned this the hard way, which is perhaps ironic considering his area of expertise. When his own son couldn’t find a job with his college degree in classics, he “looked to one of the technical fields in health care that had been identified as hot, where employers (the media assured) were struggling to hire,” Cappelli writes. “He went back to school, at a community college, and got a skills certificate in that field—only to discover that it was not hot.”
If that fate can befall the college-educated son of a world-renowned expert in human resources, it may be time for the rest of us to look for another solution.