
A Fair Day’s Wage


On February 9, 2015, James Surowiecki wrote in The New Yorker:

It’s no secret that the years since the Great Recession have been hard on American workers. Though unemployment has finally dipped below six per cent, real wages for most have barely budged since 2007. Indeed, the whole century so far has been tough: wages haven’t grown much since 2000. So it was big news when, last month, Aetna’s C.E.O., Mark Bertolini, announced that the company’s lowest-paid workers would get a substantial raise—from twelve to sixteen dollars an hour, in some cases—as well as improved medical coverage. Bertolini didn’t stop there. He said that it was not “fair” for employees of a Fortune 50 company to be struggling to make ends meet. He explicitly linked the decision to the broader debate about inequality, mentioning that he had given copies of Thomas Piketty’s “Capital in the Twenty-first Century” to all his top executives. “Companies are not just money-making machines,” he told me last week. “For the good of the social order, these are the kinds of investments we should be willing to make.”

Such rhetoric harks back to an earlier era in U.S. labor relations. These days, most of the benefits of economic growth go to people at the top of the income ladder. But in the postwar era, in particular, the wage-setting process was shaped by norms of fairness and internal equity. These norms were bolstered by the strength of the U.S. labor movement, which emphasized the idea of the “living” or “family” wage—that someone doing a full day’s work should be paid enough to live on. But they were embraced by many in the business class, too. Economists are typically skeptical that these kinds of norms play any role in setting wages. If you want to know why wages grew fast in the nineteen-fifties, they would say, look to the economic boom and an American workforce that didn’t have to compete with foreign workers. But this is too narrow a view: the fact that the benefits of economic growth in the postwar era were widely shared had a lot to do with the assumption that companies were responsible not only to their shareholders but also to their workers. That’s why someone like Peter Drucker, the dean of management theorists, could argue that no company’s C.E.O. should be paid more than twenty times what its average employee earned.

That’s not to imply that there aren’t solid business reasons for paying workers more. A substantial body of research suggests that it can make sense to pay above-market wages—economists call them “efficiency wages.” If you pay people better, they are more likely to stay, which saves money; job turnover was costing Aetna a hundred and twenty million dollars a year. Better-paid employees tend to work harder, too. The most famous example in business history is Henry Ford’s decision, in 1914, to start paying his workers the then handsome sum of five dollars a day. Working on the Model T assembly line was an unpleasant job. Workers had been quitting in huge numbers or simply not showing up for work. Once Ford started paying better, job turnover and absenteeism plummeted, and productivity and profits rose.

Subsequent research has borne out the wisdom of Ford’s approach. As the authors of a just published study of pay and performance in a hotel chain wrote, “Increases in wages do, in fact, pay for themselves.” Zeynep Ton, a business-school professor at M.I.T., shows in her recent book, “The Good Jobs Strategy,” that one of the reasons retailers like Trader Joe’s and Costco have flourished is that, instead of relentlessly cost-cutting, they pay their employees relatively well, invest heavily in training them, and design their operations to encourage employee initiative. Their upfront labor costs may be higher, but, as Ton told me, “these companies end up with motivated, capable workers, better service, and increased sales.” Bertolini—who, as it happens, once worked on a Ford rear-axle assembly line—makes a similar argument. “It’s hard for people to be fully engaged with customers when they’re worrying about how to put food on the table,” he told me. “So I don’t buy the idea that paying people well means sacrificing short-term earnings.”

That hardly seems like a radical position. But it certainly makes Bertolini an outlier in today’s corporate America. Since the nineteen-seventies, a combination of market forces, declining union strength, and ideological changes has led to what the economist Alan Krueger has described as a steady “erosion of the norms, institutions and practices that maintain fairness in the U.S. job market.” As a result, while companies these days tend to pay lavishly for talent on the high end—Bertolini made eight million dollars in 2013—they tend to treat frontline workers as disposable commodities.

This isn’t because companies are having trouble making money: corporate America, if not the rest of the economy, has done just fine over the past five years. It’s that all the rewards went into profits and executive salaries, rather than wages. That arrangement is the result not of some inevitable market logic but of a corporate ethos that says companies should pay workers as little as they can, and no more. This is what Bertolini seems to be challenging. His move may well turn out to be merely a one-off, rather than a harbinger of bigger change. But inequality and the shrinking middle class have become abiding preoccupations on Main Street and in Washington. It’s only fair that these concerns have finally reached the executive suite. 

http://www.newyorker.com/magazine/2015/02/09/fair-days-wage

We continue to fail to connect the dots. Yes, tectonic shifts in the technologies of production are destroying jobs and will continue to do so, devaluing the worth of labor and resulting in a “share-the-scraps” economy.

The solution is not to focus on “share-the-scraps” job creation but to empower EVERY citizen to acquire personal ownership shares in wealth-creating, income-producing capital assets, which are now the province of the wealthy ownership class.

We cannot blame businesses. Full employment is not an objective of businesses. Companies strive to keep labor and other input costs to a minimum in order to maximize profits for their owners. To stay competitive with other companies racing to innovate technologically, they strive to minimize marginal cost, the cost of producing one additional unit of a good, product, or service once fixed costs are in place. Reducing marginal costs enables businesses to increase profits, to offer goods, products, and services at a lower price, or both. Increasingly, new technologies are enabling companies to grow at near-zero marginal cost without having to hire people. Thus, private-sector job creation in numbers that match the pool of people willing and able to work is constantly being eroded by physical productive capital’s ever-increasing role.
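To make the marginal-cost arithmetic concrete, here is a minimal Python sketch. The function names and every figure in it are hypothetical illustrations, not data from the article or this commentary; it simply contrasts a labor-intensive cost structure with a capital-intensive one under those assumptions.

    # Minimal sketch of the marginal-cost point made above, using hypothetical numbers.
    # All figures are illustrative assumptions, not data from the article or this commentary.

    def total_cost(units, fixed_cost, variable_cost_per_unit):
        """Total cost = fixed costs (plant, machinery, software) + per-unit variable costs (labor, materials)."""
        return fixed_cost + variable_cost_per_unit * units

    def marginal_cost(units, fixed_cost, variable_cost_per_unit):
        """Marginal cost = the cost of producing one additional unit once fixed costs are in place."""
        return (total_cost(units + 1, fixed_cost, variable_cost_per_unit)
                - total_cost(units, fixed_cost, variable_cost_per_unit))

    if __name__ == "__main__":
        # Labor-intensive process: modest fixed cost, high per-unit labor cost.
        print(marginal_cost(10_000, fixed_cost=100_000, variable_cost_per_unit=8.00))   # -> 8.0
        # Capital-intensive, automated process: high fixed cost, near-zero per-unit cost.
        print(marginal_cost(10_000, fixed_cost=1_000_000, variable_cost_per_unit=0.05)) # -> ~0.05

In the second case, once the automated operation’s fixed costs are sunk, each additional unit costs almost nothing to produce; that is the near-zero-marginal-cost dynamic described above, and it is achieved without adding workers.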

Over the past century there has been an ever-accelerating shift to productive capital, reflecting tectonic shifts in the technologies of production. The mix of labor worker input and capital worker input has been changing at an exponential rate for over 235 years, in step with the Industrial Revolution (starting in 1776), and it had been changing long before that, with man’s discovery of the first tools, though at a much slower rate. Up until the close of the nineteenth century, the United States remained a working democracy, with the production of products and services dependent on labor worker input. When the American Industrial Revolution began and subsequent technological advance amplified the productive power of non-human capital, plutocratic finance channeled its ownership into fewer and fewer hands, as we continue to witness today in government by the wealthy at every level.

People invent tools to reduce toil, enable otherwise impossible production, create new highly automated industries, and significantly change the way in which products and services are produced, from labor intensive to capital intensive; this is the core function of technological invention. Most changes in the productive capacity of the world since the beginning of the Industrial Revolution can be attributed to technological improvements in our capital assets, and a relatively diminishing proportion to human labor. Capital does not “enhance” labor productivity (labor’s ability to produce economic goods). In fact, the opposite is true: it makes many forms of labor unnecessary. Yet the virtually universal call is for government to artificially elevate labor through minimum wage legislation, overtime laws, and collective bargaining legislation, or through government employment and government subsidization of private employment, solely to increase consumer income.

The reality is that productive capital is increasingly the source of the world’s economic growth and, therefore, should become the source of added property ownership incomes for all. Think about it: if both labor and capital are independent factors of production, and if capital’s proportionate contribution is increasing relative to that of labor, then equality of opportunity and economic justice demand that the right to property (and access to the means of acquiring and possessing property) be extended to all. Yet, sadly, the American people and their leaders still pretend to believe that labor is the ONLY productive input, and ignore the necessity to broaden personal ownership of wealth-creating, income-producing capital assets simultaneously with the growth of the American economy.

 
