Half-Empty or Half-Full, Part I

by Russ Roberts on September 6, 2006

in Standard of Living

There are data collected by the government suggesting that the average worker in America has not shared in the progress of the last few years or maybe even the last thirty years. Policies to solve this problem range from raising the minimum wage to tax reform.

My claim is that if you really believe that the average worker has failed to benefit from the astounding economic growth of the last 30 years, then you have to also believe in more fundamental economic change than raising the minimum wage or getting rid of the mortgage deduction.

I prefer, instead, to believe that the data supporting the conclusion of stagnation are flawed. That’s an easy claim to make. All data are imperfect. So in this post, which I hope is the first of a series, I want to look carefully at a key piece of economic data and see how reliable it might be for drawing the conclusions people often draw about stagnation.

Let’s begin with a quote from Paul Krugman in a recent column ($) that nicely summarizes the gloomy view that the glass is at best half-empty:

There are still some pundits out there lecturing people about how great
the economy is. But most analysts seem to finally realize that
Americans have good reasons to be unhappy with the state of the
economy: although G.D.P. growth has been pretty good for the last few
years, most workers have seen their wages lag behind inflation and
their benefits deteriorate.

The disconnect between overall economic growth and the growing
squeeze on many working Americans will probably play a big role this
November, partly because President Bush seems so out of touch: the more
he insists that it’s a great economy, the angrier voters seem to get.
But the disconnect didn’t begin with Mr. Bush, and it won’t end with
him, unless we have a major change in policies.

The stagnation of
real wages — wages adjusted for inflation — actually goes back more
than 30 years. The real wage of nonsupervisory workers reached a peak
in the early 1970’s, at the end of the postwar boom. Since then workers
have sometimes gained ground, sometimes lost it, but they have never
earned as much per hour as they did in 1973.

The data on stagnating real wages is summarized in the chart below. I’ve taken the chart from the Economic Report of the President, Table B-47 (which in turn takes it from the Bureau of Labor Statistics (BLS)), so if you want to follow along at home to make sure I’m playing fair, go right ahead.

Krugman’s claim looks pretty convincing, doesn’t it? Real wages do indeed peak around 1973. (Actually, real wages in 1972 are a penny an hour higher than in 1973, but that’s probably due to some revision since Krugman last looked at this series. Besides, it’s only a penny. Close enough.) They do fall pretty steadily between 1973 and 1993, then rise, though not enough to make up for the fall, then fall again. The bottom line appears to be pretty clear. The average worker’s wages have not kept up with inflation over the last 30-plus years. In fact, the average worker in 2005 made 9% less in real terms than the average worker in 1972. In other words, the living standard of the average worker in 2005 was only 91% as high as that of the average worker in 1972. BTW, these data are in 1982 dollars, so ignore the levels and focus on the changes from year to year.
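The arithmetic behind that 91% figure is just deflation: divide each year’s nominal average hourly wage by that year’s price-index level to put both years in the same dollars, then compare. Here’s a minimal sketch; the wage and CPI values below are made up for illustration and are not the actual Table B-47 figures:

```python
# Sketch of the real-wage comparison (illustrative numbers, not actual BLS data).

def real_wage(nominal_wage, cpi, base_cpi):
    """Convert a nominal hourly wage into base-year dollars."""
    return nominal_wage * base_cpi / cpi

# Hypothetical nominal wages and CPI levels for the two years being compared:
wage_1972, cpi_1972 = 3.90, 41.8     # illustrative values
wage_2005, cpi_2005 = 16.13, 191.0   # illustrative values

base = 100.0  # the base year's index is set to 100 (1982 = 100 in Table B-47)
real_1972 = real_wage(wage_1972, cpi_1972, base)
real_2005 = real_wage(wage_2005, cpi_2005, base)

ratio = real_2005 / real_1972  # a ratio of ~0.91 is a ~9% real decline
```

Note that everything hangs on the CPI series chosen for the denominator, which is exactly the question taken up later in the post.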

Just for fun, here are the exact same data:

The only difference between the two charts is the scale of the vertical axis. In the first chart, the vertical axis started at $7.40 an hour. In this chart, the vertical axis starts at zero. It’s interesting how the first chart fools your eye and exaggerates the swings over time. When you look at the data in the second chart, the picture is a little clearer: stagnation. The ups and downs have been smoothed out and it’s pretty clear that the apparent standard of living of the average worker is about the same as it was 40 years ago.
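You can quantify the optical trick: the same peak-to-trough swing fills most of a chart whose axis starts at $7.40 but only a small fraction of one whose axis starts at zero. A quick sketch, with hypothetical high and low values standing in for the series:

```python
# How a truncated vertical axis magnifies the same swing (illustrative values).
low, high = 8.20, 9.30        # hypothetical trough and peak of the real-wage series
swing = high - low            # the $1.10 movement both charts display

axis_floor_truncated = 7.40   # the first chart's axis starts at $7.40
axis_floor_zero = 0.0         # the second chart's axis starts at zero

# Fraction of each chart's vertical range that the swing occupies:
frac_truncated = swing / (high - axis_floor_truncated)  # ~58% of the chart height
frac_zero = swing / (high - axis_floor_zero)            # ~12% of the chart height
```

Same data, but the truncated chart devotes roughly five times as much of its height to the swing, which is why the first chart looks so much more dramatic.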

Is that true? Are these data an accurate portrait of economic reality for the average American? I was a little boy in 1964. I was a high school senior in 1972. I remember those years. America certainly seems dramatically more prosperous than back then on virtually every basic dimension—housing, cars, health care, luxuries. Now it’s possible that those memories are selective. And it’s possible that my observations of the world around me today are selective—selective geographically, for example. I live and teach in the affluent Washington, DC area. You have to be careful not to draw conclusions based on a biased sample. But still. Is it really plausible to conclude that the average worker has been treading water for 40 years when real GDP and real per-capita GDP are up dramatically over the same period?

To believe that, you have to believe there is some secret mechanism that keeps the fruits of growth as the exclusive delight of upper-income Americans.

Or maybe these data don’t capture the standard of living of the average American. Maybe these data are hopelessly flawed. All data have problems. Let us count the ways with real average hourly earnings:

1. What does "real" mean? Corrected for inflation. Is the inflation correction accurate?
2. What is the definition of wages? Does it include benefits?
3. How are the data collected? Do they capture the entire economy or only a part of it?
4. How is the average computed? What exactly is the "average" in average hourly earnings?

Questions 2 and 3 are easiest to answer. The measure here does not include benefits. Benefits have become an increasingly important part of compensation over time, so looking at average hourly earnings alone is surely misleading. The sample of workers is non-supervisory and production workers. It excludes self-employed individuals, an increasingly important part of the work force. It also excludes managers, who tend to be paid at higher rates than those they manage. That would seem to be good—it takes out the high end of the earnings distribution, which might pull the average upward, making it a bad measure for, say, the median worker’s standard of living. According to the BLS, production and non-supervisory workers make up about 80% of the work force. So it’s still a very big slice.

In the rest of this post, I want to focus on questions 1 and 4, because I think the answers to those questions are a bit surprising and not widely known. I didn’t know of them until recently.

When I hear the phrase "average hourly earnings," I think of taking a bunch of different wage rates, some high, some low, and averaging them. But that’s not how these data are collected. The data on average hourly earnings are taken from the Current Employment Statistics (CES), a survey not of individuals but of establishments. The BLS gathers data from 160,000 different businesses and agencies covering 400,000 work sites. They ask each business to report, for each work site, total wages paid and total hours worked, and they divide the two. That’s the average hourly earnings at that establishment.
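It’s worth seeing why that distinction matters. Total wages divided by total hours is an hours-weighted average of wage rates, not a simple average over workers, so a full-time low-wage worker counts for more than a part-time high-wage one. A small sketch with hypothetical workers:

```python
# Establishment-style average hourly earnings vs. a simple average of wage rates.
# The workers below are hypothetical; the point is the weighting, not the numbers.
workers = [
    {"rate": 30.0, "hours": 20},  # a high-paid part-timer
    {"rate": 12.0, "hours": 40},  # two full-time lower-wage workers
    {"rate": 10.0, "hours": 40},
]

total_wages = sum(w["rate"] * w["hours"] for w in workers)  # payroll the firm reports
total_hours = sum(w["hours"] for w in workers)              # hours the firm reports
establishment_avg = total_wages / total_hours               # what the CES computes

simple_avg = sum(w["rate"] for w in workers) / len(workers)  # naive average of rates
# The hours weighting pulls the establishment figure toward the wages of
# whoever works the most hours, so the two averages can differ substantially.
```

Here the establishment average comes out at $14.80 while the simple average of rates is over $17, purely because of who works more hours. Shifts in hours across the wage distribution can therefore move the published series even if nobody’s wage rate changes.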

That’s a little bit weird. Do most businesses really know total hours worked? My employer, George Mason University, certainly doesn’t. There’s no way the President of the University or the head of human resources or any department chair or any individual knows how much people actually work at the University. I don’t even know how many hours I’m supposed to work. I work at home. I work at night. I come in later than 9:00 am. I come in earlier. Sometimes when I’m "working," that is, when I’m sitting at my desk in my office, I’m checking the Red Sox box score or chatting with my wife. Sometimes, while I’m driving my kids to a baseball game, I’m thinking about real wages. I have no idea what my actual hours worked might be. Certainly no one else at the University knows either.

Sure, there are places where people punch a clock. But those places have become a lot less common than in 1964. How do businesses report their total hours worked to get an accurate number? How much leisure takes place on the job today compared to 1964? How much work gets done at home that the company has no knowledge of? I have no idea. Neither does the BLS.

How does the BLS adjust the weights in the survey? When a manufacturing plant closes and a graphic arts company opens, how do they re-weight the sample? I’m sure they do re-weight, and I’ll try and find out how they do it, but does their method result in any systematic bias?

Now consider the measure of inflation. I’m told by someone at the BLS that average hourly earnings in that standard B-47 table are deflated by CPI-W, a price index for urban wage earners and clerical workers. But that series has many flaws. (BTW, it’s the series used to calculate Social Security benefits, so changing how it is calculated has political implications.) A more standard measure would be CPI-U, a price index for urban consumers. But that series is flawed as well. Over the years, the BLS has made many improvements to CPI-U, as the BLS explains:

The Bureau of Labor Statistics (BLS) has made numerous improvements to
the Consumer Price Index (CPI) over the past quarter-century. While
these improvements make the present and future CPI more accurate,
historical price index series are not adjusted to reflect the
improvements. Many researchers, however, expressed an interest in
having a historical series that was measured consistently over the
entire period. Accordingly, the Consumer Price Index research series
using current methods (CPI-U-RS) presents an estimate of the CPI for
all Urban Consumers (CPI-U) from 1978 to present that incorporates most
of the improvements made over that time span into the entire series.

What would real average hourly earnings look like if we used CPI-U-RS? Well, let’s take a look. We can only go back to 1978, so let’s take the previous chart we used and look at 1978-2005. That is, this is the same picture as the first chart of this post, but with a shorter data period:

I have converted the data to real dollars using 2005 dollars as the base to help make the comparison that I’m about to make. But the pattern is exactly the same as in the first chart which was in 1982 dollars—a glass half-full at best. According to these numbers, over the last quarter of a century, from 1978-2005, the average worker is worse off. The 2005 wage number is only 94% of the 1978 number.

And of course, we know that it’s not the same people being sampled in 1978 and in 2005. So we certainly don’t want to conclude that the average worker is literally worse off. But it is still surprising that the average person in 2005 is worse off than the average person in 1978. Now let’s look at the same picture using CPI-U-RS:


The pattern is similar to the series deflated by CPI-W: a decline, followed by an upturn, then a slight decline. But notice that when you use CPI-U-RS, the "best" price index that the BLS has, the state-of-the-art series, you get a different conclusion about progress over the period. Using CPI-U-RS to convert nominal earnings into real earnings, you find that workers today are better off than workers in 1978. Even ignoring benefits, workers’ wages are 1% higher. Now 1% higher is a pitiful performance over 27 years. But it’s better than a 6% decline. (Though of course, if I stretched the vertical axis down to zero, these differences would again look small.)
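The sign flip is pure arithmetic: the same nominal wage growth is divided by two different amounts of measured inflation. A sketch of that calculation, with wage and index values invented to reproduce the roughly minus-6%-versus-plus-1% contrast (they are not the actual CPI-W or CPI-U-RS levels):

```python
# Same nominal wages, two deflators (hypothetical values for illustration only).
wage_1978, wage_2005 = 5.70, 16.13   # illustrative nominal hourly wages

# Suppose the older index rises faster than the research series, which
# incorporates later methodological improvements over the whole period:
cpiw_1978, cpiw_2005 = 65.0, 195.7   # grows ~3.01x over the period
rs_1978, rs_2005 = 100.0, 280.2      # grows ~2.80x over the period

wage_growth = wage_2005 / wage_1978  # nominal wages grow ~2.83x either way

# Real growth = nominal growth divided by the growth of the chosen deflator:
growth_cpiw = wage_growth / (cpiw_2005 / cpiw_1978) - 1  # negative: real decline
growth_rs = wage_growth / (rs_2005 / rs_1978) - 1        # positive: slight gain
# The verdict on 27 years of "progress" flips with the choice of index.
```

A difference of a fraction of a percentage point per year in measured inflation, compounded over 27 years, is enough to turn a 6% decline into a 1% gain.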

Is it possible that even CPI-U-RS is flawed? Of course it is. It doesn’t accurately take account of quality changes. Mark Bils has shown that it’s off by quite a bit. Jerry Hausman and Ephraim Leibtag have shown that the CPI overstates food prices and food inflation because it ignores the phenomenon of Wal-Mart. The CPI dramatically overstates inflation. Which means it understates improvements in the standard of living. Is it possible that if we corrected the CPI for quality, even wages would show a dramatic improvement over time in real terms? I think they would, but that calculation is still a work in progress.

The bottom line: the standard measure the pessimists use to indict the American economy’s treatment of the little guy, the average worker, is a strange measure. It’s a snapshot of different people at different points in time, it is sensitive even to the BLS’s choice of inflation measure, it doesn’t include benefits, which are an increasingly important part of compensation, and it’s based on a survey of establishments where hours are difficult to measure.

Would you use that measure as a basis for making policy conclusions?

In the next post of this series, I’ll look at other measures of compensation collected by the BLS.

