Reference Quote

The speed and density of computation have been doubling every three years (at the beginning of the twentieth century) to one year (at the end of the twentieth century), regardless of the type of hardware used. ...Despite many decades of progress since the first calculating equipment was used in the 1890 census, it was not until the mid-1960s that this phenomenon was even noticed (although Alan Turing had an inkling of it in 1950).

Similar Quotes


Exponential Growth in Computation How about computation? In 1971, Intel put out its first computer chip, the Intel 4004. It had 2,300 transistors on it, at $1 each. Intel no longer actually tells you how many transistors are on their chips, but the recent Core i9 had 7 billion transistors at less than a millionth of a penny each. This represents a 27-billion-fold increase in price performance in forty-five years.

Since 1954, the raw speed of computers, as measured by the time it takes to do an addition, increased by a factor of 10,000. That means an algorithm that once took 10 minutes to perform can now be done 15 times a second. Students sometimes ask my advice on how to get rich. The best advice I can give them is to dig up some old algorithm that once took forever, program it for a modern workstation, form a startup to market it and then get rich.
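The arithmetic in the quote above can be checked in a few lines (a sketch; the figures are the quote's, and its "15 times a second" appears to round down the exact result):

```python
# Check the quoted claim: a 10,000-fold speedup since 1954,
# applied to a task that once took 10 minutes.
old_seconds = 10 * 60            # 10 minutes, as in the quote
speedup = 10_000                 # quoted speed factor
new_seconds = old_seconds / speedup
runs_per_second = 1 / new_seconds
print(new_seconds)               # 0.06 seconds per run
print(round(runs_per_second, 1))  # 16.7 runs/s, near the quoted "15 times a second"
```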

Computer science is still a young field. The first computers were built in the mid-1940s, since when the field has developed tremendously. Applications from the early years of computerization can be characterized as follows: the programs were quite small, certainly when compared to those that are currently being constructed; they were written by one person; they were written and used by experts in the application area concerned. The problems to be solved were mostly of a technical nature, and the emphasis was on expressing known algorithms efficiently in some programming language. Input typically consisted of numerical data, read from such media as punched tape or punched cards. The output, also numeric, was printed on paper. Programs were run off-line. If the program contained errors, the programmer studied an octal or hexadecimal dump of memory. Sometimes, the execution of the program would be followed by reading, in binary, the machine registers at the console.

Faster-than-exponential growth also occurs in computing power, as measured by the evolution of the number of MIPS per $1,000 of computer from 1900 to 1997. Thus the so-called Moore's law is incorrect, since it implies only an exponential growth. This faster than exponential acceleration has been argued to lead to a transition to a new era, around 2030, corresponding to the epoch when we will have the technological means to create superhuman intelligence.

In 1936 the notion of a computable function was clarified by Turing, and he showed the existence of universal computers that, with an appropriate program, could compute anything computed by any other computer. [...] In some subconscious sense even the sales departments of computer manufacturers are aware of this, and they do not advertise magic instructions that cannot be simulated on competitors' machines, but only that their machines are faster, cheaper, have more memory, or are easier to program.

Looking back, I think the computer age did not really start until this moment, when computers merged with the telephone. Stand-alone computers were inadequate. All the enduring consequences of computation did not start until the early 1980s, that moment when computers married phones and melded into a robust hybrid.

A computation is a physical process in which physical objects like computers, or slide rules or brains are used to discover, or to demonstrate or to harness properties of abstract objects—like numbers and equations. How can they do that? The answer is that we use them only in situations where to the best of our understanding the laws of physics will cause physical variables like electric currents in computers (representing bits) faithfully to mimic the abstract entities that we’re interested in.


...the 'size' of science has doubled steadily every 15 years. In a century this means a factor of 100. For every single scientific paper or for every single scientist in 1670, there were 100 in 1770, 10,000 in 1870 and 1,000,000 in 1970.
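The doubling claim above is internally consistent, as a quick computation shows (a sketch using only the quote's own numbers):

```python
# de Solla Price's claim: science doubles every 15 years.
# A century therefore contains 100/15 doublings.
doublings_per_century = 100 / 15
factor = 2 ** doublings_per_century
print(round(factor))  # ~102, matching the quoted "factor of 100" per century
```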

The emergence of computer technology in World War II and its rapidly growing power in the second half of this century made it possible to deal with increasingly complex problems, some of which began to resemble the notion of organized complexity. Initially, it was the common belief of many scientists that the level of complexity we can handle is basically a matter of the level of computational power at our disposal. Later, in the early 1960s, this naive belief was replaced with a more realistic outlook.

The total number of multiplications involved in the practical solution of our problem exceeds 450,000. This task alone would mean a two-year job, at 120 multiplications per hour. Fortunately, the recent invention of the Simultaneous Calculator by Professor Wilbur of the Massachusetts Institute of Technology has made it possible to perform all the necessary computations in a small fraction of the time they otherwise would have required. This apparatus solves nearly automatically a system of nine simultaneous linear equations.
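The "two-year job" figure in the quote can be reproduced from its own numbers, under an assumed working schedule (the 40-hour week and ~48 working weeks per year are our assumptions, not stated in the quote):

```python
# Check the quoted workload: 450,000 multiplications at 120 per hour.
multiplications = 450_000
per_hour = 120
hours = multiplications / per_hour   # 3,750 hours of hand computation
# Assumed schedule: 40-hour weeks, ~48 working weeks per year.
years = hours / (40 * 48)
print(hours, round(years, 2))        # 3750 hours ~ 1.95 years: the quoted "two-year job"
```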


The desire to economize time and mental effort in arithmetical computations, and to eliminate human liability to error is probably as old as the science of arithmetic itself.

As I understand the theory of period information doubling, this states that if we take one period of human information as being the time between the invention of the first hand axe, say around 50,000 BC and 1 AD, then this is one period of human information and we can measure it by how many human inventions we came up with during that time. Then we see how long it takes for us to have twice as many inventions. This means that human information has doubled. As it turns out, after the first 50,000-year period, the second period is about 1500 years, say around the time of the Renaissance. By then we have twice as much information. To double again, human information took a couple of hundred years. The period speeds up—between 1960 and 1970, human information doubled. As I understand it, at the last count human information was doubling around every 18 months.

Further to this, there is a point sometime around 2015 where human information is doubling every thousandth of a second. This means that in each thousandth of a second we will have accumulated more information than we have in the entire previous history of the world. At this point I believe that all bets are off. I cannot imagine the kind of culture that might exist after such a flashpoint of knowledge. I believe that our culture would probably move into a completely different state, would move past the boiling point, from a fluid culture to a culture of steam.

... Most people find the word "Apocalypse" to be a terrifying concept. Checked in the dictionary, it means only revelation, although it obviously has also come to mean end of the world. As to what the end of the world means, I would say that probably depends on what we mean by world. I don't think this means the planet, or even the life forms upon the planet. I think the world is purely a construction of ideas, and not just the physical structures, but the mental structures, the ideologies that we've erected, THAT is what I would call the world.

Our political structures, philosophical structures, ideological frameworks, economies. These are actually imaginary things, and yet that is the framework that we have built our entire world upon. It strikes me that a strong enough wave of information could completely overturn and destroy all of that. A sudden realization that would change our entire perspective upon who we are and how we exist. History is a heat, it is the heat of accumulated information and accumulated complexity. As our culture progresses, we find that we gather more and more information and that we slowly start to move almost from a fluid to a vaporous state, as we approach the ultimate complexity of a social boiling point. I believe that our culture is turning to steam.
