Notes from The Information Technology Revolution, The Rise of the Network Society, Manuel Castells

Thus, computers, communications systems, and genetic decoding and programming are all amplifiers and extensions of the human mind. What we think, and how we think, become expressed in goods, services, material and intellectual output, be it food, shelter, transportation and communications system, computers, missiles, health, education, or images. The growing integration between minds and machines, including the DNA machine, is canceling what Bruce Mazlish calls the fourth discontinuity (the one between human and machines), fundamentally altering the way we are born, we live, we learn, we work, we produce, we consume, we dream, we fight or we die... - pg. 31

Note on biotech - The first edition of The Rise of Network Society was written in 1996 and the heavily revised second edition in 2009. In 1996, the first mammal clone was born - a sheep named Dolly. In October 2018, the world's first known gene-edited human babies were born. Dr. He Jiankui "had worked toward this for two years, altering their genes as embryos to try making them resistant to their father’s HIV infection. Dr. He (pronounced “huh”) gave them pseudonyms, Lulu and Nana."

EyeWriter (2009) is a collaborative artist project using a small camera and eye-tracking software to allow a paralyzed graffiti artist, Tempt1, to generate visual artwork through his eye movements.

...Crispr-Cas9 "holds the promise of new disease treatments, has made ethics questions more urgent. The tool acts like molecular scissors that can target specific genes, cutting and splicing them to prevent or cure diseases."
How a Chinese Scientist Broke the Rules to Create the First Gene-Edited Babies by Preetika Rana, May 10, 2019

Some current "breakthroughs"


"fourth discontinuity" (1993) - "Mazlish argues that just as Copernicus, Darwin, and Freud overturned our illusions of separation from and domination over the cosmos, the animal world, and the unconscious, it is now necessary to relinquish a fourth fallacy or discontinuity--that humans are discontinuous and distinct from the machines they make."

Mazlish identifies three previous mental barriers or "discontinuities" that the human race has had to overcome. The First Discontinuity started to be crossed when Nicolaus Copernicus proposed his theory of heliocentricity, which stated that the Earth rotated around the Sun and not the other way around. The Second Discontinuity was then breached when Charles Darwin popularised his theories of evolution and natural selection. The Third Discontinuity then started to be dispelled when Sigmund Freud bridged the divide between our "conscious" and "subconscious", in the process highlighting human beings as psychological as well as physiological creatures.

Mazlish's proposition that we are now crossing the Fourth Discontinuity is presented in two parts. Firstly, he states that it is no longer realistic to think of humans without machines. Secondly, he suggests that the same paradigms or concepts now explain the very workings of both human beings and many artificial mechanisms. With the development of artificial intelligence, organic biocomputers, genetic engineering, nanotechnology and xenotransplantation -- not to mention progress in life extension and the advocacy of transhumanism -- both of these propositions seem at least as reasonable (if not still as uncomfortable) as they did when Mazlish published his book back in 1993. - Fourth Discontinuity


Yet this is precisely a confirmation of the revolutionary character of new industrial technologies. The historical ascent of the so-called West, in fact limited to Britain and a handful of nations in Western Europe as well as to their North American and Australian offspring, is fundamentally linked to the technological superiority achieved during the two industrial revolutions. Nothing in the cultural, scientific, political, or military history of the world prior to the industrial revolution would explain such disputable 'Western' supremacy between the 1750s and 1940s. China was a far superior culture for most of pre-Renaissance history; the Muslim civilization dominated much of the Mediterranean and exerted a significant influence in Africa and Asia throughout the modern age; Asia and Africa remained by and large organized around autonomous cultural and political centers; Russia ruled in splendid isolation... - pg. 34


Microchips double in performance at a given price, according to the generally acknowledged "Moore's Law."

NOTE: In 1965, Gordon Moore made a prediction that would set the pace for our modern digital revolution. From careful observation of an emerging trend, Moore extrapolated that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace. The insight, known as Moore’s Law, became the golden rule for the electronics industry, and a springboard for innovation. As a co-founder, Gordon paved the path for Intel to make the ever faster, smaller, more affordable transistors that drive our modern tools and toys. Even over 50 years later, the lasting impact and benefits are felt in many ways.

Moore's Law refers to Moore's perception that the number of transistors on a microchip doubles every two years, though the cost of computers is halved. Moore's Law states that we can expect the speed and capability of our computers to increase every couple of years, and that we will pay less for them. Unfortunately, Moore's Law is starting to fail: transistors have become so small (Intel is currently working on readying its 10nm architecture, approaching atomic scales) that simple physics has begun to block the process. We can only make things so minuscule. ... Like it or not, change is coming to Intel.
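The exponential doubling described above is easy to see with a little arithmetic. A minimal sketch, assuming the commonly cited two-year doubling period and the Intel 4004 (1971, roughly 2,300 transistors) as an illustrative starting point -- the function name and starting figures here are examples, not anything from Castells:

```python
def moores_law_projection(start_count, start_year, end_year, period=2.0):
    """Project a transistor count forward, assuming a doubling
    every `period` years (the classic Moore's Law rule of thumb)."""
    doublings = (end_year - start_year) / period
    return start_count * 2 ** doublings

# Example: projecting from the Intel 4004 (1971, ~2,300 transistors)
# to the year 2000 gives a count in the tens of millions -- the same
# order of magnitude as actual chips of that era.
projected = moores_law_projection(2_300, 1971, 2000)
```

The point of the exercise is that a fixed doubling period compounds: fifteen doublings multiply the starting count by more than 30,000, which is why the eventual collision with physical limits at atomic scales was inevitable.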


The frontier of information technology at the turn of the millennium appeared to be the application of a chemically based and/or biologically based nanotechnology approach to chip making... Based on these technologies, computer scientists envisage the possibility of computing environments where billions of microscopic information-processing devices will be spread everywhere 'like pigment in the wall paint.' If so, then computer networks will be, materially speaking, the fabric of our lives. - pg. 57

Without necessarily surrendering to historical relativism, it can be said that the information technology revolution was culturally, historically, and spatially contingent on a very specific set of circumstances whose characteristics earmarked its future evolution. - pg. 63

The conclusion to be drawn from these colorful stories is twofold: first, the development of the information technology revolution contributed to the formation of the milieux of innovation where discoveries and applications would interact, and be tested, in a recurrent process of trial and error, of learning by doing [Move fast and break things]; these milieux required (and still do in the early twenty-first century, in spite of on-line networking) the spatial concentration of research centers, higher-education institutions, advanced-technology companies, a network of ancillary suppliers of goods and services, and business networks of venture capital to finance start-ups. Secondly, once a milieu is consolidated, as Silicon Valley was in the 1970s, it tends to generate its own dynamics, and to attract knowledge, investment, and talent from around the world. Indeed, in the 1990s Silicon Valley benefited from a proliferation of Japanese, Taiwanese, Korean, Indian and European companies, and from the influx of thousands of engineers and computer experts, mainly from India and China, for whom active presence in the Valley is the most productive linkage to the sources of new technology and valuable business information... In the 1990s, when the Internet was privatized, and became a commercial technology, Silicon Valley was also able to capture the new industry. - pg. 65