Part 7 of a #ReadWithMe series on Matt Ridley’s How Innovation Works
Who invented the computer? It looks like an easy enough question to answer. It is not.
For Matt Ridley, four features make the computer different from a calculator: it must be digital, electronic, programmable, and capable, at least in principle, of carrying out any logical task. Walter Isaacson finds the turning point in the ENIAC, which began operations in 1945 at the University of Pennsylvania. Ridley disagrees: a better candidate would be Colossus, the computer built in Britain during World War II to crack the German codes. Who should take the credit, then? “The construction was led largely by an engineer named Tommy Flowers, a pioneer of using vacuum tubes in complex telephone circuits, and his boss was the mathematician Max Newman, but they consulted Alan Turing”.
Ridley’s chapter on the origin of the computer is a delight. He goes all the way back to Charles Babbage to prove that ENIAC (and, for that matter, the modern computer) was not “so much invented as evolved through the combination and adaptation of precursor ideas and machines”.
Ridley writes: “the deeper you look, the less likely you are to find a moment of sudden breakthroughs, rather than a series of small incremental steps”. We tend to think otherwise because, as I mentioned earlier, we tend to search for visible hands, for clear moments of change, for Innovators with a capital “I”. What escapes this picture is how much innovation is a matter of feedback mechanisms. In a sense, I think this is perhaps the greatest accomplishment of Ridley’s book: he uses plenty of interesting stories of innovators, but he never tires of explaining that their brilliant undertakings needed to be received by consumers – and so they were sometimes adapted, used in different contexts, and became useful in previously unpromising and unforeseen circumstances. In this sense, “innovation is the child of freedom”, Ridley explains, “because it is a free, creative attempt to satisfy freely expressed human desires”.
READER COMMENTS
Phil H
Aug 22 2020 at 2:39am
I like the feedback mechanisms idea a lot. Lots of institutions could maybe be understood as ways of sorting signal from noise within this feedback. Markets would be the classic and probably the best such signal booster.
john hare
Aug 22 2020 at 5:46am
My name is John and I am an inventor. To cure my condition, I have tried the pill, the patch, and the twelve step program. They all failed and I will just have to live with my condition.
In seriousness, I have a passion for designing and building tools and equipment. Most of them apply to the construction trades that are my business. In spite of being quite prolific, I have successfully avoided wealth for decades. A large part of that success is based on the expectation that people who use the tools have a similar mindset regarding learning and productivity as I do. This is demonstrably false, as the majority of employees, and even the competition, want to do the work in whatever way they learned so they can get a paycheck and go home.
It is only recently that I have started trying to change my focus to tools and techniques that are suitable for people who are not career focused in my field. It is my opinion that understanding the target users a few decades back would have made me far wealthier with less innovation.
In short, if I had been reading the feedback properly, I would have done quite well financially. I haven’t done badly, and I certainly enjoy the stuff I work on, BUT my innovations have not been generally adopted due to my inability to read the feedback properly. So I agree with the article that innovation and feedback cycles are partners that need to work together for true progress.
Bill Mauchly
Aug 22 2020 at 9:59am
From the small example you cite, the book seems to stretch the truth quite a bit. The Colossus did not meet the criteria for a computer that he states; it was NOT capable of performing any logical task. Its important, and only, ability was to decipher.
In my book, and most of the ones I’ve read, producing a general-purpose computer that was one thousand times faster than any other (the ENIAC) was not a “small incremental step.” It was the Big Bang of the Information Age.
mike shupp
Aug 23 2020 at 4:36pm
Uhhh … the “customer” for COLOSSUS was the British government. Similarly, ENIAC was built for the US Army, which had developed a need in the early 1940s for creating artillery ballistic tables.
You’re quite sure this demonstrates the free market at work?
Michael Pettengill
Aug 24 2020 at 3:14pm
The history of the computer illustrates the difference between invention and innovation.
Invention is writing down a new idea.
Innovation is learning by doing over and over incrementally better.
Patents are the means to stop innovation. Fortunately, not even a hundred thousand new ideas are enough to cover all the ideas used by innovators making computer systems better.
The origins are clearly in weaving, where patterns in cloth were produced by algorithms, which were then mechanized to remove the drudgery of the work. Workers, even unskilled ones, i.e., those who had not spent years in drudgery, could thus focus on quickly producing new patterns of much higher quality.
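The weaving point has a well-known concrete instance: the Jacquard loom of 1804 read its pattern from punched cards, one card per row of cloth, and those cards later inspired Babbage’s Analytical Engine. Purely as an illustrative sketch of that idea (not from the book, and using a made-up pattern encoding), the Python toy below treats a weave pattern as binary “cards” and “runs” them the way a loom would:

```python
# Illustrative sketch only: a woven pattern treated as a binary program.
# Jacquard-style looms read one punched card per row of cloth; each hole
# tells one warp thread to lift. Here, cards are tuples of bits.

# A simple repeating pattern: X = lift the warp thread, . = leave it down.
PATTERN = [
    "X..X..X.",
    ".X..X..X",
    "..X..X..",
]

def punch_cards(pattern):
    """Encode each pattern row as a tuple of bits, one bit per warp thread."""
    return [tuple(1 if ch == "X" else 0 for ch in row) for row in pattern]

def weave(cards, rows=6):
    """Run the cards cyclically, as a loom would, printing rows of cloth."""
    for i in range(rows):
        card = cards[i % len(cards)]
        print("".join("#" if bit else "." for bit in card))

if __name__ == "__main__":
    weave(punch_cards(PATTERN))
```

The point of the toy is the separation it makes visible: the pattern (data on cards) is independent of the machine that executes it, which is the germ of programmability the comment is pointing at.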
Note, computers are not digital, but binary.