The rate of productivity growth is a fundamental determinant of long-run living standards. Yet when it comes to understanding or predicting this variable, economics has been sadly deficient, especially at the turning points.
He refers to a paper that forecasts a slowdown in productivity growth. I have not read the paper, but I’m guessing that it talks about multifactor productivity, which is another way of saying the error term in a regression of output on capital and labor. Predicting the error term strikes me as a fool’s game.
But the larger issue is that the productivity number that matters for the standard of living is not multifactor productivity, but labor productivity. Labor productivity is almost certain to grow rapidly, because growth in capital per worker is now being influenced by Moore’s Law. Computers are a sufficiently large share of the capital stock that improvements to computers now raise the amount of capital per worker in meaningful increments.
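To make the arithmetic concrete, here is a minimal growth accounting sketch in the spirit of the argument above; the capital share and growth rates are assumed, illustrative numbers, not measurements or forecasts.

```python
# Growth accounting sketch with made-up illustrative numbers.
# Assume Y = A * K^alpha * L^(1 - alpha). In growth rates:
#   g(Y/L) = alpha * g(K/L) + g(A)
# where g(A) is multifactor productivity growth, i.e. the residual
# ("error term") described above.

alpha = 0.3                    # assumed capital share
g_capital_per_worker = 0.04    # hypothetical 4% growth in capital per worker
g_mfp = 0.01                   # hypothetical 1% multifactor productivity growth

g_labor_productivity = alpha * g_capital_per_worker + g_mfp
print(f"Labor productivity growth: {g_labor_productivity:.1%}")  # 2.2%

# Run backwards, the same identity is how the Solow residual is measured:
g_residual = g_labor_productivity - alpha * g_capital_per_worker
print(f"Implied multifactor productivity (residual): {g_residual:.1%}")  # 1.0%
```

The point of the decomposition is that labor productivity can keep growing through the capital deepening term even when the residual itself is hard to predict.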
UPDATE: pushback from Gabriel Mihalache:
I don’t know much but I know this… a common trait of all good economists, regardless of other characteristics, is that they worship at the altar of TFP, at least at low frequencies. It seems to me that the self-titled Masonomics movement doesn’t.
I absolutely do agree that TFP is important over decades and centuries. My view is that TFP is determined by the pace of innovation. Again, I have not read the paper, but I don’t think that calibrated business cycle models are good tools for predicting future innovation.
Some commenters disagree that Moore’s Law implies advancing productivity. They argue that computers are not becoming more capable. But don’t just look at the PC. Look at iPods, cell phones, E-ZPasses, RFIDs, and other devices that benefit from shrinking transistors.
READER COMMENTS
Jared
Dec 3 2007 at 9:09pm
As a computer scientist, I’d love it if computers could be responsible for increased growth, and I think they will, but not because of Moore’s Law. I just don’t see how more transistors alone are going to make workers more productive, because so few of our tasks now are processor limited. How often is the average American worker forced to sit idle while their CPU cranks away at full speed, and are they spending half as much time this way as they were in 2005? How much faster is Joe Officeworker going to be able to push out a memo, or file a purchase order, or arrange a conference call, if we double his processor power?
Yeah, computers certainly have increased our productivity and will continue to do so, but it’s not really accurate to peg that to Moore’s Law. Take into account Amdahl’s and Wirth’s Laws, and you’ll see where I’m coming from. Other aspects of hardware, and all of software, need to improve at the same rate that transistor counts do for us to see the exponential gains implied when you invoke Gordon Moore.
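As a rough illustration of the Amdahl’s Law point, here is a small sketch with a made-up task profile showing how little a faster CPU helps when only a small fraction of a worker’s day is actually processor-bound:

```python
# Amdahl's Law sketch with a hypothetical office-work profile.
# overall speedup = 1 / ((1 - p) + p / s), where p is the fraction of the
# task that the faster component touches and s is that component's speedup.

def overall_speedup(p: float, s: float) -> float:
    return 1.0 / ((1.0 - p) + p / s)

p_cpu_bound = 0.05  # assume only 5% of the workday is spent waiting on the CPU
for cpu_speedup in (2, 10, 100):
    print(f"{cpu_speedup}x faster CPU -> {overall_speedup(p_cpu_bound, cpu_speedup):.2f}x overall")
# Even an infinitely fast CPU tops out at 1 / (1 - 0.05), about 1.05x overall.
```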
In short, I think it’s going to be much more important to figure out how to use computation more effectively than it will be to gain extra transistors. That’s when we’ll see extra productivity.
But then again, maybe I’m confusing your statements about “labor productivity” and “capital per worker” in some way?
Scott McNeally
Dec 3 2007 at 9:28pm
The network is the computer.
Ajay
Dec 4 2007 at 12:56am
I think Jared raises a valid point: the bottleneck isn’t processing power anymore, it’s our ability to harness it by writing software that takes advantage of that power. In other words, processing power has grown so fast and so consistently that the human institutions that write software to make use of all that computation have been far outrun. For example, it is now possible for people to download HD video, which requires a fast processor to watch, over broadband, but we have not yet created software that makes it easy to find and pay for those videos (we do have HD video in homes in the form of HD cable channels, but that limits the selection to content with mass appeal, a restriction broadband content doesn’t have). However, this bottleneck is only a short-term one and we’ll figure it out, rendering Arnold’s broader point true: we have a lot more headroom for long-term growth because of all that processing power.
Brandon Berg
Dec 4 2007 at 1:36am
Jared:
Moore’s Law improves productivity not so much by speeding up current applications as by making new kinds of applications feasible. For example, digital video recording and editing are not possible without the powerful microchips needed to compress video in real time.
To give an example in manufacturing, I’ve heard that Boeing’s plane designs are hundreds of gigabytes in size; obviously, manipulating something like that requires a great deal of computing power.
What specific productivity enhancements will be enabled by future increases in available computing power, I don’t know. I think simulation of biological processes (e.g., protein folding) is a pretty good bet, and probably also simulation of physical processes for engineering purposes.
Channeling the gains from Moore’s Law into efficiency rather than performance can reduce power consumption, which will in turn enable a wider variety of embedded computing applications.
Also, better hardware makes software development easier and faster, because manual optimization is no longer as important as it once was. Writing programs in Java or C# is much easier than writing them in C, which in turn is much easier than writing them in hand-coded assembly. This enables the creation of a wider variety of software applications better tailored to specific needs.
Doubling processing power doesn’t do much for most people, but increasing it tenfold or a hundredfold will open up new possibilities we can’t imagine now.
Gabriel M.
Dec 4 2007 at 1:52am
It’s a fool’s game to try to predict an i.i.d. normally distributed error term, but the Solow residual is highly autocorrelated and shows decade-long trends. Beyond that, we know that TFP has been THE engine of growth for the last 300 years. Not capital per worker.
Also, it’s a black box, and looking into the black box is where it’s at. Nobody’s suggesting that we stick to atheoretical ARIMA-ish forecasts for TFP. Successfully endogenizing TFP is one of the greatest challenges in economics.
Re: computers being special qua capital… they’re not (I’m a former IT professional). They’ve hit decreasing returns, along several dimensions, as early as the ’90s. This is why many industrial applications still use DOS and other old stuff. They don’t need more. Computational power is blind and useless without insight, i.e. human capital, a.k.a. TFP in some circles.
Jared
Dec 4 2007 at 12:03pm
I’m not trying to argue that “computers are not becoming more capable”; I’m merely saying that those increases cannot be pegged to the exponential growth Moore’s Law implies. And Moore’s Law definitely increases productivity, but not nearly as much as people seem to think.
The extra transistors we get will help, but they’re far from the only thing that needs to occur to make us more capable. Yeah, Boeing needs serious processing power to do their CAD, but they also need serious disk space, and clever graphics algorithms, and well designed data buses, and so on.
The implication in the original post was that exponential growth in transistor counts will lead to similar growth in productivity, and I don’t think that’s accurate. Moore’s Law alone is no panacea.
Paul
Dec 4 2007 at 12:16pm
The best way to think of Moore’s law is not that it doubles computer power every couple of years, but rather that it makes any electronics cheaper by a factor of two every couple of years, or a factor of a thousand in about 20 years.
So something today that would cost $10,000 will be in Costco for $10 in 20 years. This is how mainframe computers that used to cost millions are so cheap that we attach radios to them and call them cellphones.
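For what it’s worth, that arithmetic checks out under the stated assumptions: halving every two years compounds to roughly a thousandfold over twenty years.

```python
# Cost of a fixed piece of electronics, assuming it halves every two years.
initial_cost = 10_000.0
halving_period_years = 2
years = 20

cost = initial_cost / 2 ** (years / halving_period_years)
print(f"After {years} years: ${cost:.2f}")  # 10,000 / 2**10 is about $9.77, roughly the $10 above
```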
Matt
Dec 4 2007 at 12:55pm
I was once a computer scientist, until I discovered it ain’t rocket science.
Arnold is right: digital systems have only just begun to penetrate the economy, and many of the software algorithms are transferable across sectors.
Jared
Dec 4 2007 at 2:13pm
Paul, that is a very good way of looking at it, and I had only partially considered the question from that angle. But that analysis is still only strictly true if transistors are the limiting factor on cost.
Matt, I think you’re right about why we’ll see benefits from technology. I think it will be due more to widely applied machine learning techniques, or stronger network effects, or more use of semantic organization, or new architectures like processing-in-memory, or new highly parallel algorithms, than to better or faster or even cheaper chipsets.
I’m very, very optimistic that technology will improve the economy and our lives in general, just like Arnold. Let’s just not peg it to what is essentially a forty-year-old motivational tactic by Gordon Moore.
ArtD0dger
Dec 5 2007 at 12:01pm
Well, predicting the rate of productivity growth of individual companies, as opposed to the whole economy, often goes by the name of stock-picking. Whether or not you think that is a fool’s game, you must admit that there are a lot of fools in the game. Why should predicting productivity growth for a whole economy be any easier?