For decades, the US labor market has operated on the assumption that workers are mostly expendable, with replacements easily hired from a large, self-replenishing pool. This can be seen in both pay and policy. At the federal minimum wage of $7.25 an hour, a person working a full 2,080-hour year, without a single hour off, would earn $15,080, short of the poverty threshold of $15,225 a year. Wage theft remains troublingly prevalent, along with the exploitation of children in industries ranging from food service to auto manufacturing.
Meanwhile, the government has no comprehensive strategy to maintain an able and well-trained labor force. There’s no national paid leave program, and even unpaid leave covers only half of workers — meaning that workers can get fired for calling in sick at a time when a communicable disease is still killing thousands each week. Unemployment benefits last as little as eight weeks, with no extensions for enrollment in retraining or education.
This seemingly hard-nosed approach has advantages, allowing companies to reduce labor costs rapidly when needed. But it also comes at the cost of human capital — the knowledge, skills and abilities that businesses and the entire country need to be competitive. Any executive can recognize that employing an experienced manager as a cashier would be a waste of human capital, yet the country does something similar all the time, when seasoned workers drop out of the labor force because they can’t get paid leave for a medical emergency, or because they can’t afford child care.
Now, though, the trade-off between short-term cost-cutting and human capital appears to be changing, as qualified workers become harder to find and hire. At the end of 2022, US employers had more than 11 million unfilled job openings, yet many potential workers remained on the sidelines. At 83%, the share of the 25-to-54-year-old population in the labor force is below the peak years of the late 1990s, and well below peer countries. Researchers at Amazon.com Inc. have reportedly raised concerns that the company could soon run out of people to hire for its warehouses.
Tech-industry layoffs aside, some companies appear to be changing their workforce strategy. Despite the Federal Reserve’s efforts to slow growth and nudge up the unemployment rate, layoffs overall remain below pre-pandemic levels. Perhaps employers have waded through a shallow talent pool recently and found it unpleasant, or are hesitant to cut anyone loose after putting so much time and money into hiring.
A greater emphasis on human capital could vastly benefit companies and the entire country, in part through higher labor force participation. A host of policies the US pursues piecemeal — from training and placement services for displaced workers to basic parental support like paid leave and child care — have a proven track record of boosting employment. Given the remarkably consistent relationship between labor force participation and gross domestic product, and the 70-year rise in labor productivity, more workers means more economic growth.
While Corporate America might be taking some initial steps in the right direction, a proper shift would require much more ambitious policies to maintain the largest and most productive workforce possible. Ultimately, the investment would be worth it.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Kathryn Anne Edwards is a labor economist and independent policy consultant.