By Matthew Shaw and Drake Palmer
Recent jobs data sparked excitement as news reports talked of how America is finally going back to work. The optimism is understandable, based as it is on a concurrent rise in labor-force participation and a drop in the government’s preferred measure of unemployment. Here, we assess whether the Fed’s “solid” and “very well performing” economy has finally allowed low- and moderate-income (LMI) households to share the prosperity rapidly pooling at the very top of the income and wealth distribution. In short, and sad to say, it hasn’t – hourly pay for low-wage/low-skill workers has declined in real (i.e., inflation-adjusted) terms over the past four decades and has been essentially flat since 2010. As we noted in our last blog post, wealth concentration has soared since the financial crisis. Even if a corner has now been turned for everyone else, it’s just a very tight one at the bottom of the equality canyon.
Unemployment and Wages
Since the financial crisis, signs of recovery in the labor market have continually been undercut by the large number of discouraged potential workers. Reflecting this, data from the Bureau of Labor Statistics (BLS) taking into account both unemployed and sidelined workers (as well as those working part-time for economic reasons) are now back to pre-crisis levels. Is this good news for low-wage workers? Not exactly.
To show why optimism may be premature, we examine trends in worker real compensation, paying attention not only to trends in wages, but also to real-life costs for low-skilled workers. Having a job is of course better than not having one. But, if wage increases do not keep pace with increases in living costs, then workers are working longer for less and falling ever further behind their higher-skilled peers. A recent article in the Washington Post shows this all too clearly for one household in Ames, Iowa – the Metropolitan Statistical Area with the nation’s lowest unemployment rate.
The decreased unemployment and increased labor-participation data from the BLS are considered by many to be a sign of an improving economy not only because having a job is better than not having one, but also because economic theory posits that low unemployment and high participation lead to wage increases as employers compete for a smaller pool of laborers. If true, there should be a correlation between improving unemployment and labor-participation rates and rising wages. The unemployment rate that takes discouraged and part-time workers into account has fallen from its crisis high of over 17 percent to 7.5 percent. However, inflation-adjusted weekly wages for the bottom quartile of U.S. workers averaged less than 0.4 percent annual growth over the same period.
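That 0.4 percent figure is a compound average annual growth rate. As a minimal sketch of how such a rate is derived from two inflation-adjusted observations – the wage levels and year span below are hypothetical, not BLS data:

```python
def avg_annual_growth(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two real (inflation-adjusted) values."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical real weekly wages for a bottom-quartile worker, ten years apart.
growth = avg_annual_growth(start_value=490.0, end_value=510.0, years=10)
print(f"{growth:.2%}")  # prints 0.40% – a roughly 4 percent total gain over a decade
```

Even a seemingly meaningful total gain, compounded over a decade, amounts to well under half a percent per year.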
Even worse, low-skill workers – i.e., the majority of low-wage workers – have seen their hourly wages decline in real terms for close to four decades. According to the Congressional Research Service, median hourly wages for workers with a high school education or less fell in real terms by 14.3 percent between 1979 and 2017 from $19.05 to $16.25 for those with a diploma and from $16.48 to $12.50 for those without (all in 2017 dollars). Even the highest earners among the high-school-educated work force saw wages decrease in real terms over this time, with those at the 90th percentile seeing a 7.5 percent drop. Given that weekly wages are essentially flat and hourly wages are down in real terms, the average low-skill wage earner is likely working more hours for essentially no additional money.
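Comparisons like these depend on restating nominal wages in constant dollars before computing the change. A rough sketch of that CPI adjustment, assuming illustrative index values rather than the deflator the Congressional Research Service actually used:

```python
def to_constant_dollars(nominal: float, cpi_then: float, cpi_base: float) -> float:
    """Restate a nominal wage in base-year dollars by scaling with the CPI ratio."""
    return nominal * (cpi_base / cpi_then)

# Illustrative only: a $6.00/hour wage earned when the price index stood at 72.6,
# restated in dollars of a base year when the index stood at 245.1.
real_wage = to_constant_dollars(nominal=6.00, cpi_then=72.6, cpi_base=245.1)
print(f"${real_wage:.2f}")  # prints $20.26

# Real decline is then the percent change between two constant-dollar figures.
pct_decline = (real_wage - 16.00) / real_wage
```

The point of the adjustment is that two wages decades apart can only be compared once both are expressed in the same year’s purchasing power.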
Perhaps surprisingly, these low-skill/low-wage workers have in fact seen household incomes increase over the same period. How can households increase overall income while real wages are falling? They can add wage earners to the household, increase the number of hours each earner works, or both. Regardless of the method, the bottom line is that members of low-skill households have to work more total hours just to tread water financially.
Robust data on the number of wage earners per household over time, segmented by household income, are scarce, so overall wage-earner trends among low-income households are murky. Data from the Census Bureau, which go back only to 2000, show a slight drop in the number of earners per household for all income quintiles while also showing a correlation between more earners and higher household income. However, the data clearly show that low-wage workers have significantly increased the number of hours they work since 1979. Between 1979 and 2016, the bottom fifth of all prime-age wage earners increased their annual hours worked by 24.3 percent. In comparison, the middle quintile of prime-age wage earners increased their annual hours by only 9.4 percent and the top fifth by only 3.6 percent.
The High Cost of Low Wages
What added benefits are low-skill workers getting for their increased hours and incomes? Not many.
While bottom-fifth household incomes increased as a result of the increase in hours worked, essential household costs have also risen, eating away at any increased consumption ability resulting from higher real incomes. The Congressional Research Service study noted above points out that, while the overall median wage fell in real terms between 2007 and 2014, total compensation – i.e., wages plus other employer-provided benefits – was statistically unchanged due in large part to the rising cost of health insurance.
Employers have therefore maintained in real terms what they spend on employees, but employees have seen a decrease in their take-home pay as employers shift the balance of overall compensation away from wages and toward health-care coverage. Notably, health-care costs as a percentage of GDP have more than doubled since 1980, from 8.9 percent to an estimated 18.2 percent this year.
Child-care costs have also risen in recent decades while wages have fallen for low-skill workers on an inflation-adjusted basis. Estimates of the cost of child care vary dramatically based on inputs and methodology, but all agree that child-care costs are increasing faster than inflation. According to the USDA – which uses a generous cutoff of before-tax income of $59,200 – low-income families on average spend approximately 27 percent of their income on child care. Low-income single parents have it even worse, with the USDA estimating that the average low-income single parent spends almost 40 percent of his or her income on child care. And because falling real wages force low-wage parents to work more hours to meet rising living costs, those additional hours often mean additional hours of child care and thus additional costs to cover.
More people with jobs is always a good thing, but the unemployment rate alone does not tell us how those with jobs, particularly at the bottom of the wage/income distribution, are faring. Data on wages, income, and health- and child-care living costs show clearly that low-skill/low-income workers are working more hours for essentially the same pay while their living costs continue to rise. Other unavoidable household costs are also dragging down the buying power of wages gained from more jobs: increases in housing costs, for example, show that just getting by is even harder now than it was when the Fed recognized that the nation was in the “Great Recession.”
Our earlier blog post on rising wealth inequality also shows that even these high costs are not the only ones borne by low-wage households during these “good times.” As that post notes, the aggregate wealth of the bottom half of American households in 2016 was less than half of what it was in 2007, due in no small part to a sharp increase in these households’ debt burdens. For more on this issue, see Joe Nocera’s interview on August 6 with Karen Petrou.