As the natural sciences have developed to encompass increasingly complex systems, scientific rationality has become ever more statistical, or probabilistic. The deterministic classical mechanics of the enlightenment was revolutionized by the near-equilibrium statistical mechanics of late 19th century atomists, by quantum mechanics in the early 20th century, and by the far-from-equilibrium complexity theorists of the later 20th century. Mathematical neo-Darwinism, information theory, and quantitative social sciences compounded the trend. Forces, objects, and natural types were progressively dissolved into statistical distributions: heterogeneous clouds, entropy deviations, wave functions, gene frequencies, noise-signal ratios and redundancies, dissipative structures, and complex systems at the edge of chaos.
The emergence of computer technology in World War II and its rapidly growing power in the second half of this century made it possible to deal with increasingly complex problems, some of which began to resemble the notion of organized complexity. Initially, it was the common belief of many scientists that the level of complexity we can handle is basically a matter of the level of computational power at our disposal. Later, in the early 1960s, this naive belief was replaced with a more realistic outlook.
Can we use programs instead of equations to make models of the world? ...[I]n the beginning of the 1980s ...I did a bunch of computer experiments. ...It took me a few years to really say, "Wow, there's a big important phenomenon here that lets... complex things arise from very simple programs." ...[A] bunch of other years go by [and] I start off doing ...more systematic computer experiments ...and find ...that ...this phenomenon ...is actually something incredibly general... [T]hat led me to this... principle of computational equivalence... [A]s part of that process I said, "OK... simple programs can make models of complicated things. What about the whole universe?" ...and so I got to thinking, "Could we use these ideas to study fundamental physics?" ...I happened to know a lot about traditional fundamental physics. ...I had a bunch of ideas about how to do this in the early 1990s. I made... technical progress. ...I wrote about them back in 2002.
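Wolfram's best-known instance of this phenomenon is the elementary cellular automaton. A minimal Python sketch (the rule number, grid width, and step count below are illustrative choices, not details from the quote) shows a very simple program, Rule 30, producing an intricate pattern from a single starting cell.

```python
# A minimal sketch of an elementary cellular automaton (here Rule 30),
# the kind of "very simple program" Wolfram describes: each cell's next
# state depends only on itself and its two neighbours.

RULE = 30          # illustrative choice; any rule number 0-255 works
WIDTH = 64         # illustrative grid width
STEPS = 32         # number of generations to print

# Rule table: neighbourhood (left, centre, right) -> new cell value.
rule_table = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
              for a in (0, 1) for b in (0, 1) for c in (0, 1)}

# Start from a single live cell in the middle of the row.
row = [0] * WIDTH
row[WIDTH // 2] = 1

for _ in range(STEPS):
    print("".join("#" if cell else "." for cell in row))
    # Wrap around at the edges so every cell has two neighbours.
    row = [rule_table[(row[i - 1], row[i], row[(i + 1) % WIDTH])]
           for i in range(WIDTH)]
```

Running it prints a growing triangle whose central column looks effectively random, the sort of behaviour Wolfram later generalized into his principle of computational equivalence.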
A computation is a physical process in which physical objects like computers, slide rules, or brains are used to discover, to demonstrate, or to harness properties of abstract objects—like numbers and equations. How can they do that? The answer is that we use them only in situations where, to the best of our understanding, the laws of physics will cause physical variables like electric currents in computers (representing bits) faithfully to mimic the abstract entities that we’re interested in.
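Deutsch's claim can be made concrete with a toy construction. In the sketch below (a textbook ripple-carry adder, not something taken from the quote), the Boolean operations stand in for physical gates; because they obey laws that track abstract arithmetic, manipulating the bits reliably mirrors the numbers they represent.

```python
# Sketch: abstract addition recovered from nothing but bit-level logic.
# XOR/AND/OR stand in for the physical gates whose currents
# "faithfully mimic" the abstract numbers we care about.

def full_adder(a: int, b: int, carry: int) -> tuple[int, int]:
    """Add three bits; return (sum bit, carry-out bit)."""
    s = a ^ b ^ carry
    carry_out = (a & b) | (carry & (a ^ b))
    return s, carry_out

def add_8bit(x: int, y: int) -> int:
    """Ripple-carry addition of two 8-bit numbers using only bit logic."""
    result, carry = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# The gate-level computation agrees with abstract arithmetic
# (modulo 256, since only 8 bits are represented) on every input pair.
assert all(add_8bit(x, y) == (x + y) % 256
           for x in range(256) for y in range(256))
print(add_8bit(37, 85))   # 122
```

The final assertion is the point: the "physical" bit manipulation and the abstract sum coincide exactly within the range the representation covers.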
The beginning of the twentieth century witnessed the breakdown of the mechanistic theory even within physics, the science where it was the most successful... Relativity took over in field physics, and the science of quantum theory in microphysics... In view of parallel developments in physics, chemistry, biology, sociology, and economics, many branches of the contemporary sciences became... ‘sciences of organized complexity’ — that is, systems sciences.
Computer science is still a young field. The first computers were built in the mid-1940s, since when the field has developed tremendously. Applications from the early years of computerization can be characterized as follows: the programs were quite small, certainly when compared to those that are currently being constructed; they were written by one person; they were written and used by experts in the application area concerned. The problems to be solved were mostly of a technical nature, and the emphasis was on expressing known algorithms efficiently in some programming language. Input typically consisted of numerical data, read from such media as punched tape or punched cards. The output, also numeric, was printed on paper. Programs were run off-line. If the program contained errors, the programmer studied an octal or hexadecimal dump of memory. Sometimes, the execution of the program would be followed by reading machine registers, in binary, at the console.
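The "octal or hexadecimal dump of memory" mentioned above is easy to reproduce today. The short sketch below (the bytes are invented sample data, used only for illustration) shows the kind of raw listing an early programmer had to read line by line to locate an error.

```python
# Sketch: what an old-style hexadecimal memory dump looks like.
# The byte string below is invented sample data for illustration only.

memory = bytes(range(0x20, 0x60))   # pretend contents of a memory region

for offset in range(0, len(memory), 16):
    chunk = memory[offset:offset + 16]
    hex_part = " ".join(f"{b:02X}" for b in chunk)
    text_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
    print(f"{offset:08X}  {hex_part:<47}  {text_part}")
```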
The speed and density of computation have been doubling every three years (at the beginning of the twentieth century) to one year (at the end of the twentieth century), regardless of the type of hardware used. ...Despite many decades of progress since the first calculating equipment was used in the 1890 census, it was not until the mid-1960s that this phenomenon was even noticed (although Alan Turing had an inkling of it in 1950).
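The compounding Kurzweil describes is easy to quantify. A back-of-the-envelope sketch (the decade-by-decade doubling times are illustrative assumptions interpolating between the three-year and one-year figures in the quote, not data from it):

```python
# Back-of-the-envelope sketch of growth with a shrinking doubling time.
# The decade-by-decade doubling times are illustrative assumptions only,
# interpolating from about three years early in the century to one year at its end.

doubling_time_years = [3.0, 2.8, 2.6, 2.4, 2.2, 2.0, 1.8, 1.6, 1.3, 1.0]

# Each decade contributes 10 / doubling_time doublings.
total_doublings = sum(10 / dt for dt in doubling_time_years)
print(f"About {total_doublings:.0f} doublings over the century,")
print(f"a growth factor on the order of {2 ** total_doublings:.2e}.")
```

Under these assumed figures the century accumulates roughly 54 doublings, a factor on the order of 10^16, which is why the trend became unmistakable once anyone looked for it.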
A confusion of even longer standing came from the fact that the unprepared included the electronic engineers that were supposed to design, build and maintain the machines. The job was actually beyond the electronic technology of the day, and, as a result, the question of how to get and keep the physical equipment more or less in working condition became in the early days the all-overriding concern. As a result, the topic became – primarily in the USA – prematurely known as 'computer science' – which, actually, is like referring to surgery as 'knife science' – and it was firmly implanted in people's minds that computing science is about machines and their peripheral equipment. Quod non [Latin: "Which is not true"]. We now know that electronic technology has no more to contribute to computing than the physical equipment. We now know that a programmable computer is no more and no less than an extremely handy device for realizing any conceivable mechanism without changing a single wire, and that the core challenge for computing science is hence a conceptual one, viz., what (abstract) mechanisms we can conceive without getting lost in the complexities of our own making.
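Dijkstra's description of the computer as "an extremely handy device for realizing any conceivable mechanism without changing a single wire" can be illustrated with a toy interpreter. In the sketch below (the miniature instruction set is invented for illustration), the "machine", the run function, never changes; only the programs handed to it do.

```python
# Sketch: one fixed "machine" (an interpreter whose code never changes)
# realizing different mechanisms purely by being given different programs.
# The miniature stack-based instruction set is invented for illustration.

def run(program, inputs):
    """A fixed interpreter: PUSH n, ADD, MUL, DUP operate on a stack."""
    stack = list(inputs)
    for op, *arg in program:
        if op == "PUSH":
            stack.append(arg[0])
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "MUL":
            stack.append(stack.pop() * stack.pop())
        elif op == "DUP":
            stack.append(stack[-1])
    return stack[-1]

# Two different "mechanisms", with no change to the machine (run) itself:
doubler   = [("PUSH", 2), ("MUL",)]                      # x -> 2x
square_p1 = [("DUP",), ("MUL",), ("PUSH", 1), ("ADD",)]  # x -> x*x + 1

print(run(doubler, [21]))     # 42
print(run(square_p1, [6]))    # 37
```

The same fixed interpreter realizes a doubler and a square-plus-one mechanism; all the conceptual work has moved into the programs, which is exactly the shift Dijkstra identifies as the core challenge of computing science.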