Half-Empty or Half-Full, Part I

by Russ Roberts on September 6, 2006

in Standard of Living

There are data collected by the government suggesting that the average worker in America has not shared in the progress of the last few years or maybe even the last thirty years. Policies to solve this problem range from raising the minimum wage to tax reform.

My claim is that if you really believe that the average worker has failed to benefit from the astounding economic growth of the last 30 years, then you have to also believe in more fundamental economic change than raising the minimum wage or getting rid of the mortgage deduction.

I prefer instead to believe that the data supporting the conclusion of stagnation are flawed. That’s an easy claim to make. All data are imperfect. So in this post, which I hope is the first of a series, I want to look carefully at a key piece of economic data and see how reliable it might be for drawing the conclusions people often draw about stagnation.

Let’s begin with a quote from Paul Krugman in a recent column ($) that nicely summarizes the gloomy view that the glass is at best half-empty:

There are still some pundits out there lecturing people about how great
the economy is. But most analysts seem to finally realize that
Americans have good reasons to be unhappy with the state of the
economy: although G.D.P. growth has been pretty good for the last few
years, most workers have seen their wages lag behind inflation and
their benefits deteriorate.

The disconnect between overall economic growth and the growing
squeeze on many working Americans will probably play a big role this
November, partly because President Bush seems so out of touch: the more
he insists that it’s a great economy, the angrier voters seem to get.
But the disconnect didn’t begin with Mr. Bush, and it won’t end with
him, unless we have a major change in policies.

The stagnation of
real wages — wages adjusted for inflation — actually goes back more
than 30 years. The real wage of nonsupervisory workers reached a peak
in the early 1970’s, at the end of the postwar boom. Since then workers
have sometimes gained ground, sometimes lost it, but they have never
earned as much per hour as they did in 1973.

The data on stagnating real wages are summarized in the chart below. I’ve taken the chart from the Economic Report of the President, Table B-47 (which in turn takes it from the Bureau of Labor Statistics (BLS)), so if you want to follow along at home to make sure I’m playing fair, go right ahead.

Krugman’s claim looks pretty convincing, doesn’t it? Real wages do indeed peak around 1973. (Actually, real wages in 1972 are a penny an hour higher than in 1973, but that’s probably due to some revision since Krugman last looked at this series. Besides, it’s only a penny. Close enough.) They do fall pretty steadily between 1973 and 1993, then rise, though not enough to make up for the fall, then fall again. The bottom line appears to be pretty clear. The average worker’s wages have not kept up with inflation over the last 30-plus years. In fact, the average worker in 2005 made 9% less in real terms than the average worker in 1972. In other words, the living standard of the average worker in 2005 was only 91% as high as that of the average worker in 1972. BTW, these data are in 1982 dollars, so ignore the levels and focus on the changes from year to year.

Just for fun, here are the exact same data:

The only difference between the two charts is the scale of the vertical axis. In the first chart, the vertical axis started at $7.40 an hour. In this chart, the vertical axis starts at zero. It’s interesting how the first chart fools your eye and exaggerates the swings over time. When you look at the data in the second chart, the picture is a little clearer: stagnation. The ups and downs have been smoothed out and it’s pretty clear that the apparent standard of living of the average worker is about the same as it was 40 years ago.
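
To see how much a truncated axis can exaggerate things, here’s a quick back-of-the-envelope sketch in Python. The $9.00 and $8.20 wage figures are made up for illustration, not taken from the actual series:

```python
# How much of a chart's vertical span does a given wage swing occupy?
# Hypothetical numbers: wages fall from $9.00 to $8.20 an hour.
def visual_drop(high, low, axis_min, axis_max):
    """Fraction of the plotted vertical range covered by the drop."""
    return (high - low) / (axis_max - axis_min)

# Axis starting near the data (as in the first chart) vs. starting at zero:
truncated = visual_drop(9.00, 8.20, 7.40, 9.20)  # drop fills ~44% of the chart height
full      = visual_drop(9.00, 8.20, 0.00, 9.20)  # same drop fills under 10%
```

The same 80-cent decline looks several times larger on the truncated axis, which is exactly the eye-fooling effect described above.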

Is that true? Are these data an accurate portrait of economic reality for the average American? I was a little boy in 1964. I was a high school senior in 1972. I remember those years. America certainly seems dramatically more prosperous than back then on virtually every basic dimension—housing, cars, health care, luxuries. Now it’s possible that those memories are selective. And it’s possible that my observations of the world around me today are selective—selective geographically, for example. I live and teach in the affluent Washington, DC area. You have to be careful not to draw conclusions based on a biased sample. But still. Is it really plausible to conclude that the average worker has been treading water for 40 years at a time when real GDP and real per-capita GDP are up dramatically?

To believe that, you have to believe there is some secret mechanism that keeps the fruits of growth as the exclusive delight for upper-income Americans.

Or maybe these data don’t capture the standard of living of the average American. Maybe these data are hopelessly flawed. All data have problems. Let us count the ways with real average hourly earnings:

1. What does "real" mean? Corrected for inflation. Is the inflation correction accurate?
2. What is the definition of wages? Does it include benefits?
3. How are the data collected? Do they capture the entire economy or only a part of it?
4. How is the average computed? What exactly is the "average" in average hourly earnings?

Questions 2 and 3 are easiest to answer. The measure here does not include benefits. Benefits have become an increasingly important part of compensation over time, so looking at average hourly earnings alone is surely misleading. The sample of workers is non-supervisory and production workers. It excludes self-employed individuals, an increasingly important part of the work force. It also excludes managers, who tend to be paid at higher rates than those they manage. That would seem to be good: it takes out the high end of the earnings distribution, which might pull the average upward, making it a bad measure for, say, the median worker’s standard of living. According to the BLS, production and non-supervisory workers make up about 80% of the work force. So it’s still a very big slice.

In the rest of this post, I want to focus on questions 1 and 4, because I think the answers to those questions are a bit surprising and not widely known. I didn’t know of them until recently.

When I hear the phrase "average hourly earnings," I think of taking a bunch of different wage rates, some high, some low, and averaging them. But that’s not how these data are collected. The data on average hourly earnings are taken from the Current Employment Statistics (CES), a survey not of individuals but of establishments. The BLS gathers data from 160,000 different businesses and agencies covering 400,000 work sites. They ask each business, for each work site, to provide total wages paid and total hours worked, and they divide the two. That’s the average hourly earnings at that establishment.
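
The establishment calculation described above can be sketched in a few lines. The payroll and hours figures below are invented for illustration; the point is that dividing aggregate wages by aggregate hours gives an hours-weighted average, which generally differs from the naive average of each site’s wage rate:

```python
# Each establishment reports (total wages paid, total hours worked).
# Hypothetical reports from three work sites:
sites = [
    (50_000.0, 2_000.0),  # high-wage site: $25/hr
    (30_000.0, 3_000.0),  # $10/hr
    (20_000.0, 2_500.0),  # $8/hr
]

# CES-style average: aggregate payroll divided by aggregate hours
# (equivalent to an hours-weighted average of the site wage rates).
total_wages = sum(w for w, h in sites)
total_hours = sum(h for w, h in sites)
ces_average = total_wages / total_hours            # about $13.33/hr

# Naive alternative: an unweighted mean of the site wage rates.
site_rates = [w / h for w, h in sites]
naive_average = sum(site_rates) / len(site_rates)  # about $14.33/hr
```

Because the lower-wage sites account for more hours, the hours-weighted CES figure comes out below the simple average of the rates, so shifts in who works how many hours move the published number even if nobody’s wage changes.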

That’s a little bit weird. Do most businesses really know total hours worked? My employer, George Mason University, certainly doesn’t. There’s no way the President of the University or the head of human resources or any department chair or any individual knows how much people actually work at the University. I don’t even know how many hours I’m supposed to work. I work at home. I work at night. I come in later than 9:00 am. I come in earlier. Sometimes when I’m "working," that is, when I’m sitting at my desk in my office, I’m checking the Red Sox box score or chatting with my wife. Sometimes, while I’m driving my kids to a baseball game, I’m thinking about real wages. I have no idea what my actual hours worked might be. Certainly no one else at the University knows either.

Sure, there are places where people punch a clock. But those places have become a lot less common than in 1964. How do businesses report their total hours worked to get an accurate number? How much leisure takes place on the job today compared to 1964? How much work gets done at home that the company has no knowledge of? I have no idea. Neither does the BLS.

How does the BLS adjust the weights in the survey? When a manufacturing plant closes and a graphic arts company opens, how do they re-weight the sample? I’m sure they do re-weight, and I’ll try and find out how they do it, but does their method result in any systematic bias?

Now consider the measure of inflation. I’m told by someone at the BLS that average hourly earnings in that standard B-47 table are deflated by CPI-W, a price index for urban wage earners and clerical workers. But that series has many flaws. (BTW, it’s the series used to calculate Social Security benefits, so changing how it is calculated has political implications.) A more standard measure would be CPI-U, a price index for urban consumers. But that series is flawed as well. Over the years, the BLS has made many improvements to CPI-U, as the BLS explains:

The Bureau of Labor Statistics (BLS) has made numerous improvements to
the Consumer Price Index (CPI) over the past quarter-century. While
these improvements make the present and future CPI more accurate,
historical price index series are not adjusted to reflect the
improvements. Many researchers, however, expressed an interest in
having a historical series that was measured consistently over the
entire period. Accordingly, the Consumer Price Index research series
using current methods (CPI-U-RS) presents an estimate of the CPI for
all Urban Consumers (CPI-U) from 1978 to present that incorporates most
of the improvements made over that time span into the entire series.

What would real average hourly earnings look like if we used CPI-U-RS? Well, let’s take a look. We can only go back to 1978, so let’s take the previous chart we used and look at 1978-2005. That is, this is the same picture as the first chart of this post, but with a shorter data period:

I have converted the data to real dollars using 2005 dollars as the base to help make the comparison that I’m about to make. But the pattern is exactly the same as in the first chart which was in 1982 dollars—a glass half-full at best. According to these numbers, over the last quarter of a century, from 1978-2005, the average worker is worse off. The 2005 wage number is only 94% of the 1978 number.

And of course, we know that it’s not the same people being sampled in 1978 and in 2005. So we certainly don’t want to conclude that the average worker is literally worse off. But it is still surprising that the average person in 2005 is worse off than the average person in 1978. Now let’s look at the same picture using CPI-U-RS:


There’s a similar pattern as in the series deflated by CPI-W: a decline followed by an upturn, then a slight decline. But notice that when you use CPI-U-RS, the "best" price index that the BLS has, the state-of-the-art series, you get a different conclusion for progress over the period. Using CPI-U-RS to convert nominal earnings into real earnings, you find that workers today are better off than workers in 1978. Even ignoring benefits, workers’ wages are 1% higher. Now 1% higher is a pitiful performance over 27 years. But it’s better than a 6% decline. (Though of course if I stretched the vertical axis down to zero, these differences would again look small.)
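
The arithmetic behind that reversal is simple. Here’s a sketch with illustrative index values (not the actual BLS series, though they’re chosen to roughly reproduce the 6%-decline-versus-1%-gain contrast above): deflating the same nominal wages by a faster-rising index turns a small real gain into a real loss.

```python
def real_growth(nominal_start, nominal_end, cpi_start, cpi_end):
    """Percent change in real wages between two years,
    deflating each year's nominal wage by that year's price index."""
    real_start = nominal_start / cpi_start
    real_end = nominal_end / cpi_end
    return (real_end / real_start - 1) * 100

# Illustrative (made-up) values: the nominal wage triples, but the two
# indices disagree about how much prices rose over the same span.
growth_w  = real_growth(5.40, 16.20, 65.0, 207.0)  # faster-rising index: real decline
growth_rs = real_growth(5.40, 16.20, 65.0, 193.0)  # slower-rising index: real gain
```

A roughly 7% difference in measured cumulative inflation over a quarter-century is enough to flip the sign of the whole conclusion, which is why the choice of deflator matters so much here.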

Is it possible that even CPI-U-RS is flawed? Of course it is. It doesn’t accurately take account of quality changes. Mark Bils has shown that it’s off by quite a bit. Jerry Hausman and Ephraim Leibtag have shown that the CPI overstates the price of food, and hence inflation, because it ignores the phenomenon of Wal-Mart. The CPI dramatically overstates inflation, which means it understates improvements in the standard of living. Is it possible that if we corrected the CPI for quality, even wages would show a dramatic improvement over time in real terms? I think they would, but that calculation is still a work in progress.

The bottom line: the standard measure the pessimists use to indict the American economy’s treatment of the little guy, the average worker, is a strange measure. It’s a snapshot at different points in time that looks at different people; it is sensitive even to which definition of inflation the BLS uses; it doesn’t include benefits, an increasingly important part of compensation; and it’s based on a survey of establishments where hours are difficult to measure.

Would you use that measure as a basis for making policy conclusions?

In the next post of this series, I’ll look at other measures of compensation collected by the BLS.






David September 6, 2006 at 4:12 pm

Oakland A's team batting average: 13th of 14 teams in AL
Oakland A's total runs scored: 11th of 14

Oakland A's in AL West standings: 1st place, 5.5 games ahead of my home team. Thanks, Paul, the Angels season is looking better already!

pawnking71 September 6, 2006 at 4:37 pm

I'm concerned about data which treats workers as if they had no upward mobility. As I understand the study you refer to, it assumes that a man on a widget line in 1972 is still making widgets in 2006. Or if he retired, his son is on the line, making widgets. That's obviously nonsense.

The fact is, this grandson of a welder is now a white collar worker, making quite a bit more in both relative and absolute terms than my grandfather ever did, even if you accept the argument that welders make the same as they did, and that accountants do, too.

Also consider what your money buys. Maybe a car costs a lot more in absolute terms today than in 1972, and possibly more in relative terms, as well. But my car just passed the 150,000-mile mark, which is remarkable because it's so unremarkable. Remember when hitting 100,000 was a big deal? Let's not even discuss the fact that my car is safer, has better gas mileage, is more environmentally friendly, etc.

My point is that by almost any measure, Americans are better off today than they were 30 years ago, or 10 years ago, or even 5. Only through great effort can we seem to be worse off. Such logic does not hold up to scrutiny.

Chris September 6, 2006 at 4:49 pm

Don't changes in worker wages drive inflation – at least in part? If you increased everyone's wages by 100%, wouldn't things like food, housing, energy, etc. go up in kind as price sensitivity is decreased?

Being concerned that average wages haven't gone up seems, to me, like a silly argument to be making.

Don Lloyd September 6, 2006 at 5:08 pm

It seems to me that the lagging minimum wage could be part of the explanation, but as a positive, not a negative.

Assume a continuing supply of sub-entry level workers, both high school dropouts and immigrants.

Assume that they can find something to do for pay, and that they are always paid their marginal revenue product by whoever can make use of their limited skills.

Also assume that their real productivity and skill levels increase at a steady 5% per year rate.

If there were no price inflation and no change in the minimum wage, then each worker would at some point in time become productive enough to be able to seek work in the official and reported part of the economy that is subject to the minimum wage.

Under these conditions, changes in the minimum wage relative to price inflation change the year in which any individual breaks out into the official economy and, presumably, becomes a part of the statistical workforce.

The more the minimum wage lags price inflation, the earlier each given worker can become part of the official workforce.

The overall statistical result of a lagging minimum wage would tend to be lower statistical wages, but a benefit to individual workers.

Regards, Don

Kevin Nowell September 6, 2006 at 5:47 pm

Don't changes in worker wages drive inflation – at least in part? If you increased everyone's wages by 100%, wouldn't things like food, housing, energy, etc. go up in kind as price sensitivity is decreased?

I don't think so. Price inflation increases wages but an increase in wages does not cause price inflation. Increased real wages are caused by increased production so they wouldn't necessarily decrease price sensitivity.

I really enjoyed today's post. It literally blows Krugman's argument out of the water.

It's interesting that the assumption made by Krugman et al. is that since the 70s we have had less government, and if it can be proven (however fallaciously) that we are worse off now than then, then that proves the case for increased government. I really don't think the case can be made that government intervention in the economy has actually diminished since the 70s.

JohnDewey September 6, 2006 at 6:18 pm

Thanks for the great post, Professor Boudreaux.

I had learned from Virginia Postrel that CPI-W increases transfers to seniors. She explained that political considerations (Seniors vote!) wouldn't allow use of an accurate index. But I didn't realize economic reports presented by the Executive Branch were also distorted.

Don Boudreaux September 6, 2006 at 6:41 pm


Right you are — save for your assumption that I, rather than Russ, wrote this superb post!

Don Bx

JohnDewey September 6, 2006 at 7:02 pm

Oops! Sorry about that.

Offa Rex September 6, 2006 at 7:22 pm

Reminds me of the quote by Ernest Benn:

"Politics is the art of looking for trouble, finding it whether it exists or not, diagnosing it incorrectly, and applying the wrong remedy."

Robert September 6, 2006 at 9:20 pm

Can someone explain why Krugman's columns, and many others, treat the composition of the American workforce as if it were the same as it was 30 years ago? Typically I am thinking about the number of factory workers now compared to 30 years ago, and the other part of that is why guys like Krugman try to insinuate that the number of blue collar workers itself is the same. I guess this goes under the heading of income mobility, but I was curious if anyone else thought of that.


Marc September 6, 2006 at 11:10 pm

One thing I would like to see is this data corrected for demographic patterns like average experience of the workforce and women as a percentage of the workforce. We know that the way they calculate the average wage is misleading from the above exposition. After all, every demographer would tell you that the baby boomers entering the workforce in the early 70s would have increased the youth of the workforce. We also know that less experienced workers, especially ones that haven't gone through 27 raise cycles, are going to get paid less than experienced workers in general for the same work. Also, women entering the workforce in the 70s would have also increased supply and depressed wages. I would think these are at least as important as the greater move towards self-employment in the last 25 years.

Also, one thing I would emphasize more, Professor, is that it by design excludes those who got a promotion to management, something that would clearly be an improvement in standard of living in most cases.

Ammonium September 7, 2006 at 1:34 am

It's interesting that the assumption made by Krugman et al. is that since the 70s we have had less government, and if it can be proven (however fallaciously) that we are worse off now than then, then that proves the case for increased government. I really don't think the case can be made that government intervention in the economy has actually diminished since the 70s.

Government spending as a percentage of GDP actually peaked in 1992 (it was only higher during WWII). Real government spending, of course, increases to a new high every year.

It's interesting that in the 1980s and the first part of the 1990s government spending as a percentage of GDP eclipsed the percentages in the 1970s, or any time since WWII. During the Clinton administration these percentages fell to 1970s levels, but they've increased in the 2000s. We're still quite a bit below where we were during the 1980s, however.

I guess we can call the 1980s the Decade of Big Government. However, since real government spending is now over twice what it ever was in the 1980s, maybe we should call 2006 The Year of the Biggest Government Ever (So Far).

If Krugman thinks that too little government spending is the problem, it seems like he ought to be rooting for the Republicans to be in power.

Ammonium September 7, 2006 at 1:48 am

Oops… I was looking at nominal government spending. Real spending is now only a bit over 50% more than what it ever was in the 1980s.

Slocum September 7, 2006 at 8:38 am

There's an interesting tension here. On the one hand, analyses like these (excellent as they are) get very much less publicity than the gloom and doom in the NY Times (news and op-ed).

But on the other hand, there are a lot of Americans around who are old enough to remember the 1970s and 80s and who realize how much nicer our cars are now, how much bigger our houses are, how much better and cheaper our home entertainment is, how much more able we are to travel by air, and so on. And even our much more expensive medicine is much better. (Would you rather pay today's prices for today's medicine or 1973 prices for 1973 medicine–assuming you'd just torn your ACL, or had a blockage in a heart artery?)

And, because people are aware of all this improvement in living standards, there's a natural skepticism with respect to all the gloom mongering and the claims that the average American was richest in 1973 and has gotten steadily poorer since. You could almost say that the question is, "Who are you going to believe, Paul Krugman or your lyin' eyes?"

Aaron Krowne September 7, 2006 at 9:00 am


There are at least two major canards in your arguments:

1. You essentially argue that benefits represent compensation "dark matter," where there is lots of hidden, shadow income being collected by the worker.

But I know this to be false from my own experience. The majority of my benefits are in health care, which is provably dramatically over-priced, and systemically inflated (it is certainly not 40% more valuable than it was five years ago, but that's what the prices would have you believe). Thus, most of this "benefit" is not actually redeemable by workers. It doesn't exist.

(See http://br.endernet.org/~akrowne/writings/us_health_care/us_health_care/)

2. You assume that the inflation series are flawed, which is fine, but conveniently do so only in the *upwards* direction. My own view is that the inflation series were /more/ correct before the 1990s, not less. Geometric averaging and hedonic adjustments are objective errors in the series and philosophically have no place in it, but serve to significantly tweak it downwards. Neither have any sort of empirical grounding.

In general, I don't see how this supposed bounty of "dark matter wealth" squares with the negative savings rate and an increasing reliance on an increasing quantity of consumer credit. You've really got some 'splainin' to do here.

Also, the Wal-Mart effect seems like a distraction to me. Wouldn't this be zero-sum, since it largely relies upon global labor arbitrage? Domestic manufacturers and retailers put out of work have incomes of zero.

JohnDewey September 7, 2006 at 10:08 am

Aaron Krowne: "But I know this to be false from my own experience. The majority of my benefits are in health care"

Perhaps your experience is not entirely representative of the population. According to the BLS, private employers pay for these benefits in addition to health care:

retirement – available to 60% of employees
paid holidays – 77%
sick leave – 58%
paid vacations – 77%
disability insurance – 39%
child care – 14%

I don't know how much these have changed since 1979. BLS data do show that availability has increased for all these benefits since 1999.

Aaron Krowne: "health care, which is provably dramatically over-priced"

I don't understand how this can be. Health insurance plans are subject to competitive bidding. If anything, the value of group-negotiated medical insurance should be higher to employees than the amount expensed by the company. I think it is the latter figure that gets included in total compensation estimates.

JohnDewey September 7, 2006 at 10:41 am

Aaron Krowne: "(health care) is certainly not 40% more valuable than it was five years ago"

Perhaps not 40% more valuable to you, but probably to some workers. The quality of health care has sharply increased over my wife's 30-year RN career. Here are a few innovations her patients have benefitted from:

1. minimally invasive surgery – has proven to be far less traumatic to patients initially, with much shorter recovery times;

2. magnetic resonance imaging (MRI) – has vastly improved diagnostic abilities, and no doubt saved many lives;

3. cochlear implants – have provided the sense of sound to over 100,000 deaf or near-deaf persons worldwide.

We could list a few thousand improvements – in equipment, medicine, and knowledge – that we've gained since 1979. You may not yet have benefitted from them. But you are likely to before you die.

Bill Conerly September 7, 2006 at 11:51 am

"How does the BLS adjust the weights in the survey? When a manufacturing plant closes and a graphic arts company opens, how do they re-weight the sample? I'm sure they do re-weight, and I'll try and find out how they do it, but does their method result in any systematic bias?"

The most recent information on wages and hours is based on nearly universal information from large employers and a sample of small employers. The data are annually benchmarked to the unemployment insurance tax forms that all employers file. This is a universal census except for a few scofflaws trying to avoid the tax. As a result, no change in sample is necessary.

The resulting average hourly wage captures the average wage in a year; the change in the hourly wage captures the change in the average, which is not the same as the average of the changes. If we are adding more low-wage workers to the economy (as through immigration), each worker could be receiving a five percent annual raise while the average wage does not change.

Chuck September 7, 2006 at 12:10 pm

What is with this reverse-nostalgia?

How many of you guys are, or hang out with, or are related to our "median worker"?

I have a lot of friends from school that I hang out with, and we all like to think we're regular guys. And we are, for our self-selected subset of the population.

But we're all mostly in the top 75%ile for income, with bachelors degrees or better. (We're also handsome and intelligent!) But how could that be? We *feel* perfectly average!

I'm looking for people who know *well, today* an adult who makes $8 per hour and who thinks that person is better off than they would have been in 1967 or 79 or whatever.

kebko September 7, 2006 at 12:30 pm

"I'm looking for people who know *well, today* an adult who makes $8 per hour and who thinks that person is better off than they would have been in 1967 or 79 or whatever."

If you're assuming that the average person makes $8/hour, then you're kind of answering your own question, I think. You & your pals with bachelor's degrees might actually be closer to average, than the $8/hour worker that you are imagining.

But, for starters, the $8/hour worker today is almost certainly living in a centrally heated & cooled home. The equivalent, barely-over-minimum-wage worker in 1970 would almost certainly have not.

Chuck September 7, 2006 at 1:33 pm


Maybe you are right that me and my buddies are closer to average, but that would be a tribute to how skewed incomes are. The average is meaningfully higher than the median. The median income is probably pretty close to the average labor wage.

The post says that, in 1982 dollars, the average labor wage was $8ish. Maybe it's $9 an hour in 2006 dollars, whatever. The question remains: since we're comparing the life of our average laborer, who here knows one and has some sense of what his life is like?

He has AC today in his home? Great! But I honestly have to say that I have no idea if it is likely that a single person could get a home loan on a $10/hr job that may or may not have benefits. Anyone know someone who's done it? One of your kids maybe? What kind of down payment would I need to save up (while I'm paying rent on an apartment already).

Maybe our average laborer is married, and so the household gets $20/hr. So no problem on the house, right? Wait, maybe there's some medical bills from a car accident in the past?

Who knows? Who among us can say that we hang with our "average laborer" and his friends and know them well?

Finally, even if their standard of living is rising in some ways, why shouldn't their income show benefits of rising productivity in some relative proportion to the rich over the last 30 years?

Kevin Nowell September 7, 2006 at 3:35 pm

One thing I'd like to see is the difference in black market income between the 1970s and now. I'd bet there is a huge difference.

Slocum September 7, 2006 at 5:32 pm

"I'm looking for people who know *well, today* an adult who makes $8 per hour and who thinks that person is better off than they would have been in 1967 or 79 or whatever."

I have a sibling in this category. He makes more like $12/hr, which is still well below the median. He owns his own house (a smaller ranch). He also has more than one computer, broadband internet, a digital camera, a large TV, and so on. He drives a late 90s luxury car that he bought used for a few thousand dollars (and that's a lot better than any 1970s car). Occasionally, he flies to Florida for a week's vacation. He has a non-union blue-collar job with basic health insurance. Does he live better than somebody in the equivalent job did in the 1970s? Yes, I'd say that he does.

Stephen September 8, 2006 at 5:42 am

One terrible part about the way these statistics are collected is that if a dramatic increase in the minimum wage happens, it will appear that the average real wage has increased quite a bit. The data wouldn't take into account all the low-productivity people who would be completely shut out of the market and earning zero.

It will then be trumpeted by the leftists that it is "proven" that raising the minimum wage makes the average worker better off.
