At lunch today with Robin Hanson, guest blogger Bryan Caplan, and blogging competitor Tyler Cowen, someone brought up the subject of computers and productivity. I am scanning Brad DeLong’s reading list for a course in American Economic History, and naturally he points to something on the topic, by Dale W. Jorgenson, Mun S. Ho, and Kevin J. Stiroh.
Of the 1.57 percentage point increase in ALP [Aggregate Labor Productivity] growth after 1995, 0.86 percentage point was due to capital deepening and 0.80 percentage point due to faster TFP [Total Factor Productivity] growth, with a small decline in labor quality growth of –0.09 percentage point. IT [Information Technology] production accounted for more than 35 percent of the increase in aggregate TFP, far exceeding the 5 percent share of IT goods in aggregate output. This sizable contribution reflects the exceedingly high rates of technological progress in IT production and is manifest in the 9.2 percent per year decline in the price of IT output in 1995-2003. Similarly, 60 percent of the increased capital deepening in 1995-2003 was attributable to IT, although information processing equipment and software accounted for only about one-quarter of private fixed investments in this period. This large contribution reflects both the rapid accumulation of IT capital as prices fell and IT capital’s high marginal product.
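The arithmetic of that decomposition is easy to verify; here is a quick back-of-envelope check (the figures are from the passage above, the script itself is mine):

```python
# Quick check of the Jorgenson-Ho-Stiroh decomposition quoted above.
# Figures are percentage-point contributions to the post-1995 acceleration
# in aggregate labor productivity (ALP) growth.

capital_deepening = 0.86
faster_tfp_growth = 0.80
labor_quality = -0.09

total = capital_deepening + faster_tfp_growth + labor_quality
print(f"Sum of contributions: {total:.2f} percentage points")  # 1.57, matching the quoted total

# The paper's point about IT: it supplies more than 35 percent of the TFP
# acceleration and 60 percent of the extra capital deepening, despite being
# only about 5 percent of output and a quarter of private fixed investment.
```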
For Discussion. How much of what computers routinely do for us now could not have been done by computers prevalent in the 1980’s?
READER COMMENTS
Lawrance George Lux
Jan 20 2005 at 5:17pm
Relatively nothing, but that is not the point! The point is training costs and the length of time required to perform tasks. The investment in training ordinary labor in program functions would exceed current costs by about 7 to 1. Sourcing and creating programs in a computer language, rather than in Windows, would take about 8 times as long, with a 400% increase in chance mistakes. lgl
John F. Opie
Jan 21 2005 at 10:26am
Hi –
I’m an economist working in a commercial environment.
It actually surprises me somewhat that this continues to be a question: I think people aren’t thinking it through.
I’ll give you two examples from my work.
1) A process in which three people spent 2 weeks converting model tables into production-quality, nicely formatted tables in Word was converted to macro-driven production in Excel, after which the job was done by one person in 4 hours.
2) The creation of 100 reports that took one person 6 weeks (3 weeks of writing, 2 weeks of modelling work, and 1 week of collating and printing/conversion to PDF) was converted to macro-driven production in Excel and now takes 16 days (5 days of modelling work, 10 days of writing, one day of production).
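To put rough numbers on those two examples (this is my own back-of-envelope arithmetic; the 8-hour day and 5-day week are assumptions, not figures from the original workflows):

```python
# Rough person-hours comparison for the two workflows described above.
# Time figures come from the comment; the 8-hour day and 5-day week are assumptions.

HOURS_PER_DAY = 8
DAYS_PER_WEEK = 5

# Example 1: hand-built Word tables vs. macro-driven Excel
before_1 = 3 * 2 * DAYS_PER_WEEK * HOURS_PER_DAY   # 3 people x 2 weeks = 240 person-hours
after_1 = 1 * 4                                     # 1 person x 4 hours
print(f"Example 1: roughly {before_1 / after_1:.0f}x fewer person-hours")

# Example 2: 100 reports, 6 weeks before vs. 16 days after
before_2 = 6 * DAYS_PER_WEEK * HOURS_PER_DAY        # 240 person-hours
after_2 = 16 * HOURS_PER_DAY                        # 128 person-hours
print(f"Example 2: roughly {before_2 / after_2:.1f}x fewer person-hours")
```

The first is roughly a sixty-fold reduction in person-hours; the second is closer to a two-fold reduction, before counting the quality gains mentioned below.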
Now, part of these productivity increases was due to programming work (not strictly speaking IT investment) and to reorganization of tasks to avoid redundancy, but the major increase in computer speeds from, say, 1991 to 2004 also led to massive reductions in the processing time of computer-supported work, such as the production of economic forecasts and analyses.
And the quality of the work is also significantly better, since we can now run better-integrated and more sophisticated models than we could in previous generations. Further, programming tools have become more advanced, allowing mere economists such as myself to become system programmers in order to free themselves up to actually get product out the door while increasing the time available to do economics (and read economics blogs!).
And more to the question: little or nothing that I do now couldn’t have been done by, say, 1987 or so (that’s when I started working as an economist). However, I certainly remember model runs lasting all night and into the next day that now take little more than a minute or two to run under ceteris paribus conditions.
John
spencer
Jan 21 2005 at 10:42am
I would say a lot. I am an economic researcher.
In the 1970s I worked in Washington, and if I wanted to put a chart in a report I had to do all the data by hand and take it to an art shop for them to do the chart. One chart probably took at least one man-day, and maybe two. At that time, when I did a regression, I still used punch cards and would find numerous errors as I went through the process. So I would take a stack of cards to the computer shop and it would run overnight.
The next day I would get the cards back with an error maybe one third of the way through the stack of punch cards. I would create an additional set of punch cards to insert into the program at that point to correct that error.
The next day I would find an error maybe halfway through, and would have to repeat the process. Generally, it would take about a week to do one regression that now takes a few seconds.
Your fear was that you would drop the box of punch cards and have to start all over. So you numbered the cards with a felt-tip pen, and on the revisions you would end up with card numbers like 100a, 33b, 17.
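For comparison, a single regression like that can now be specified and run in a fraction of a second; here is a minimal sketch in Python (the data are synthetic and the library choices are mine, just to illustrate the turnaround time):

```python
# A minimal illustration of how cheap a single regression is today.
# The data are synthetic; the point is the turnaround time, not the model.
import time
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(size=(n, 3))
y = x @ np.array([1.5, -0.7, 2.0]) + rng.normal(size=n)

start = time.perf_counter()
X = np.column_stack([np.ones(n), x])          # add an intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
elapsed = time.perf_counter() - start

print("Estimated coefficients:", np.round(beta, 3))
print(f"Run time: {elapsed * 1000:.1f} ms")   # typically well under a second
```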
In the early 1980s, when I worked for an investment firm, my DRI budget was larger than my salary.
Now, for about $5,000 a year, I get more data than I did in the 1980s, and I can download it over the internet in a format that is easy to import into data analysis programs.
Without the advances in computer power I could not have the business I have now.
This would be true for almost everyone that works out of a home office.
Edge
Jan 22 2005 at 12:05am
I think software is much more refined now. You can complete work with less labor input.
In terms of hardware, the big increases in memory scale were necessary for much of what we’re able to do now.
In terms of computation speed, I think a lot of computing power is being burned these days because it is cheap. If computation speed had hit a wall in the 1980s, I think we would have worked around many of the issues and still have gotten work productivity up. Some of that is true for memory, as well. User interfaces wouldn’t be as pretty; we wouldn’t have motion video like we’ve got; web pages would be simpler and more reliant on text.
But on reflection, a lot of productivity is in the user interface and the ability to see lots of data. This is all dependent on the specific applications. It seems an interesting question to ask: which of the computing gains have resulted in real productivity increases, and how? Which of the gains that we now burn through like air could we have worked around, and how would both the tech industry and the broader economy have changed if we had?
spencer
Jan 22 2005 at 11:18am
Because we are researchers, we tend to think of the computer in terms of the way we use it. But the really big applications are not from the PC; rather, they come from the way computers let firms manage their operations.
For example, since 1995 the biggest gains in productivity have been in retail, and a lot of these gains stem from the ability that universal product codes give management to eliminate waste, etc. And this is just one example. Look at the way banks have improved productivity. The reason banks used to close at 3:00 in the afternoon was that they needed all that time to reconcile their balances and keep up with the bookkeeping. Now these tasks are done by computer on a real-time basis, and the labor required to keep an account up to date has dropped to a fraction of what it used to be.