Accelerating developments in computing, information production and storage, and artificial intelligence are rapidly approaching a point of inflection. This point is commonly known as "the Singularity."
However, advances in nanotechnology and biotechnology are also potential pathways toward the Singularity. As with computing, artificial intelligence, and the rise of "Big Data," progress in these areas has been exponential in nature and has kept pace with growth in those fields.
If things continue at this rate, the result could be sudden and explosive. It would feel like a "revolution" to people looking back on it, and it's generally agreed that, beyond the point of this anticipated Singularity, nothing will ever be the same again.
In fact, it could bring an end to death itself.
Engines of creation
Another anticipated pathway is the ability to manipulate matter at increasingly smaller scales. Eventually, this would allow humans to engineer materials at the atomic or even quantum level, leading to a new era in fabrication and medicine. Such technologies fall under the general heading of "nanotechnology," which describes machines that measure a few nanometers (10⁻⁹ m) in scale.
Theoretical physicist Richard Feynman first described the concept in his 1959 Caltech lecture, “There’s Plenty of Room at the Bottom.” Borrowing from John von Neumann’s concept of self-replicating “Universal Constructors” (the basis of the von Neumann probe idea), he spoke of machines that would be capable of endlessly reproducing themselves, but on smaller and smaller scales.
This concept was elaborated on further by K. Eric Drexler, an engineer who studied under Marvin Minsky — a cognitive and computer scientist considered one of the "fathers of AI." In 1986, he released his book, Engines of Creation: The Coming Era of Nanotechnology, in which he described the concepts of "molecular nanotechnology" and "molecular manufacturing."
In the first chapter of the book, Drexler provides a straightforward description of how rearranging atoms can allow for immense possibilities:
"COAL AND DIAMONDS, sand and computer chips, cancer and healthy tissue: throughout history, variations in the arrangement of atoms have distinguished the cheap from the cherished, the diseased from the healthy. Arranged one way, atoms make up soil, air, and water; arranged another, they make up ripe strawberries. Arranged one way, they make up homes and fresh air; arranged another, they make up ash and smoke."
Here too, advancements in the field have been subject to acceleration, concurrent with advancement in other fields. In 1981, five years before the publication of Engines of Creation, Gerd Binnig and Heinrich Rohrer developed the Scanning Tunneling Microscope (STM), which allowed researchers to view atoms on the surfaces of materials for the first time.
In 1989, Don Eigler and Erhard Schweizer of IBM's Almaden Research Center used an STM to position 35 individual xenon atoms to spell "IBM" on a surface. In 2013, IBM researchers used a similar technique to create the first atomic-scale movie, titled "A Boy and His Atom: The World's Smallest Movie."
In the early 1990s, scientists discovered ways to synthesize buckminsterfullerenes and carbon nanotubes, nanometer-scale carbon structures whose walls are a single atom thick. By 2004, Andre Geim and Konstantin Novoselov had isolated and characterized two-dimensional sheets of carbon just one atom thick (known as graphene), which earned them the 2010 Nobel Prize in Physics.
In 2005, scientists at Rice University (led by Professor James Tour) created the "nanocar," a molecular structure consisting of fullerene "wheels" of carbon atoms, and a body composed of complex chains of hydrogen, carbon, and oxygen. Once placed on a heated gold surface, the nanocar was induced to move back and forth.
In 2012, IBM scientists created the world's smallest magnetic memory out of only 12 atoms. By 2017, they took things a step further by storing one bit of data on a single atom. Since conventional hard disk drives require about 100,000 atoms to store a single bit, these experiments could lead to vastly smaller and denser storage devices.
Because of the variety of potential applications, nanotechnology has become one of the fastest-growing areas of technology in the world. In 2010, the value of the nanotechnology and nanomaterials sector was estimated at $15.7 billion. By 2024, this was projected to grow to $125 billion, a compound annual growth rate of roughly 16%.
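As a quick sanity check of these market figures, the growth rate can be computed directly. This is a minimal sketch; the 14-year span from 2010 to 2024 is an assumption.

```python
# Back-of-envelope check of the nanotech market figures cited above.
# Assumed: $15.7B in 2010 growing to $125B by 2024 (a 14-year span).
start, end, years = 15.7, 125.0, 14

# Compound annual growth rate: the constant yearly rate that turns `start` into `end`.
cagr = (end / start) ** (1 / years) - 1

print(f"Compound annual growth rate: {cagr:.1%}")  # -> 16.0%
```

Roughly 16% compounded each year is enough to grow the sector eightfold over that span, which illustrates how quickly exponential growth outpaces intuition.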
In the future, machines could interact directly with DNA and viruses at the nanometer scale, allowing for revolutionary new medical treatments that repair genetic damage, correct genetic diseases, kill viruses, and reverse cell decay and even death. The growing availability of such technology could therefore offer everything from cures for all known diseases to life extension.
Following Feynman's logic, nanomachines could be used to fashion machines that are even smaller in scale, such as robots that measure a few picometers (10⁻¹² m) or femtometers (10⁻¹⁵ m) in diameter. At these scales, machines would be comparable in size to individual atoms, and capable of manipulating them.
This could lead to a new age in which we synthesize precious materials out of simpler, more common elements, as well as materials with special properties (so-called "supermaterials") to fulfill all sorts of tasks. Examples include room-temperature superconductors and materials resistant to the extreme heat and pressure encountered in space, deep-sea, and planetary-interior exploration.
Several futurists and speculative thinkers have postulated that continued advancement in the fields of medical science and biotechnology would eventually usher in an age of "clinical immortality." As Feynman indicated in his 1959 lecture, a former grad student and collaborator (Albert Hibbs) suggested that tiny repair machines could be created in pill form, and people could simply "swallow the doctor."
In his book, Engines of Creation, Drexler also postulated that it would be possible to create medical nanomachines ("medimachines") that can operate within cells. He also ventured that biological machines assembled out of cells could be created that could repair living tissues and even DNA strands.
In his 2005 book, The Singularity is Near, Kurzweil suggested that medical science would eventually allow people to extend their lives indefinitely, ushering in an age of "post-mortality." Kurzweil's prediction came down to somatic gene therapy, in which engineered viruses repair damaged or defective DNA strands with synthesized genes.
In 2010, computer scientist and futurist Jaron Lanier published his manifesto titled You Are Not a Gadget. Going beyond the idea of life extension in the physical sense, Lanier described the notion of achieving immortality through "Digital Ascension." As he put it, this would consist of "people dying in the flesh and being uploaded into a computer and remaining conscious."
Today, millimeter-scale medical devices are already available, allowing for targeted medical treatment for conditions like cancer and HIV. The invention of CRISPR and other gene-editing techniques has opened the door to true genetic engineering. And bioprinting is rapidly advancing to the point where internal organs can be replaced with on-demand printed tissue.
With these and other biotech advances entering mainstream medicine, life expectancy is expected to increase considerably. Granted, the possibility of uploading one's mind or extending life indefinitely with anti-aging cures is still highly speculative. But the creation of research projects and companies dedicated to these very things shows that they are entering the realm of possibility.
In 2013, tech giant Google announced the launch of a new anti-aging research venture, Calico Life Sciences LLC, an organization tasked with studying the underlying biology of aging and how we can combat it. It is joined by companies like Bulletproof, Human Longevity Inc., and the Silicon Valley Health Institute.
In that same year, the European Commission established the Human Brain Project, and the Obama administration founded the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative to advance neuroscience and AI. In 2014, Silicon Valley investors and numerous research institutes founded The Palo Alto Longevity Prize to develop life-extension treatments.
Soft or hard take-off?
Another point of contention regarding the Singularity is how it will happen: gradually or rapidly. These possibilities are known as the "soft takeoff" and "hard takeoff" scenarios. The former is based on the belief that certain technologies will keep advancing while others plateau, while the latter assumes concurrent and mutually reinforcing advancements.
In various soft takeoff scenarios, anti-aging cures and human augmentation gradually extend human lifespans and intelligence, leading to significant change over time. In this scenario, advancements in artificial intelligence are also not expected to lead to runaway change, as the machines in question will still be dependent on humans and their limitations.
It has also been suggested that breakthroughs in quantum computing and the development of a superintelligent AI (alone or in combination) would be constrained by interacting with existing "slow" infrastructure. Similarly, there is no guarantee that artificial intelligence will augment human intelligence at all or in an exponential way.
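The contrast between these two scenarios can be illustrated with a toy model (my own illustration, not drawn from the text): suppose capability grows at a rate proportional to some power of current capability. Sub-linear feedback yields steady, gradual gains; super-linear feedback compounds on itself and runs away.

```python
# Toy model of takeoff dynamics: dI/dt = k * I**p
#   p < 1 -> "soft takeoff": growth continues but never outruns its own scale
#   p > 1 -> "hard takeoff": growth feeds on itself and diverges in finite time

def simulate(p, k=0.1, steps=100, dt=1.0, cap=1e9):
    """Euler integration of dI/dt = k * I**p, starting from I = 1."""
    intelligence = 1.0
    history = [intelligence]
    for _ in range(steps):
        intelligence += k * intelligence**p * dt
        if intelligence > cap:   # runaway growth: stop the simulation early
            break
        history.append(intelligence)
    return history

soft = simulate(p=0.5)   # sub-linear feedback: modest, steady gains
hard = simulate(p=1.5)   # super-linear feedback: explosive, runaway gains

print(f"soft takeoff after {len(soft) - 1} steps: {soft[-1]:.1f}")
print(f"hard takeoff diverged after {len(hard) - 1} steps")
```

In the soft run, a century of steps yields only a few dozenfold improvement; in the hard run, the same rule with a slightly stronger feedback exponent blows past any fixed ceiling in a fraction of that time. The disagreement between the two camps is, in effect, a disagreement about the exponent.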
Intelligence Explosion: If and when humanity can realize a state of "superintelligence" — either through the creation of AI or neural augmentation (or both) — all forms of research and development will undergo a massive shift. The resulting applications are likely to be immense, with "supersmart cities" and entire nations run by sophisticated and predictive AIs.
Humanity stands to benefit from this immensely, especially where resource use is concerned. With highly advanced intelligence overseeing energy, transportation, water usage, waste management, and the like, major cities are likely to become places where net-zero carbon emissions and zero waste is actually possible.
Post-Scarcity: The ability to alter elements and compounds at the cellular, atomic, and even subatomic scale is projected to lead to an age of abundance. Since long before the rise of "civilization," scarcity has been a basis of human economies: the scarcer a resource, and the more labor needed to produce an item, the more it is valued.
But in an age where precious metals and other valuable commodities can be synthesized out of base materials, energy can be generated endlessly from renewable sources, and food and water can be created out of thin air (a la replicators from Star Trek), the situation will change drastically.
The economic order, and the poverty, hunger, and underdevelopment that come with it, would cease to exist. As George Orwell famously wrote in 1984 (in Goldstein's manifesto):
"In a world in which everyone worked short hours, had enough to eat, lived in a house with a bathroom and a refrigerator, and possessed a motor-car or even an aeroplane, the most obvious and perhaps the most important form of inequality would already have disappeared. If it once became general, wealth would confer no distinction."
Post-Mortality: The ability to extend one's own life indefinitely with anti-aging medicines, replacement organs, stem cell therapy, medimachines, and neural uploading will lead to the first "immortals." The ability to keep on living and reverse the scourge of time and aging will also mean that people will be able to remain active, healthy, and productive indefinitely.
Consider the Baby Boom generation. In their parents' generation, people stayed in the workforce until mandatory retirement at 65, and the average life expectancy was in the 60s or 70s. In contrast, many Boomers today continue to work well past retirement age and can live well into their 80s, thanks in large part to recent advances in medical science and nutrition.
The takeaway from this is rather obvious: the ability to live longer, healthier lives raises the possibility of having longer, more productive careers. For Generation Z and "Gen Alpha," the situation is likely to be even more radical as aging, retirement, and even death are something that could be put off again and again.
Posthumanism: The ability to transcend biology and enhance oneself endlessly is also likely to lead to what is commonly referred to as the "Transhuman Era." This term refers to a process in which humans begin transitioning to a higher form of life by replacing or augmenting their physical bodies with synthetic parts.
Beyond this transitional state, there is also what is referred to as "Posthumanism." In this context, it refers to a state where humanity is no longer constrained by any physical or biological limitations and exists in various forms that allow it to explore the Universe, live in simulated realities, or inhabit otherwise uninhabitable spaces where there is abundant energy to draw from.
Disruption and destabilization: An undeniable and not altogether positive aspect of the Singularity is how it will lead to widespread obsolescence. Today, the term "disruption" is used frequently to describe innovations that force people to rethink various habits or behaviors, often concerning work, transportation, consumption, and other daily rituals.
The term, however, is double-edged, implying that while certain innovations can lead to new ways of doing things, it also entails change that may not be entirely comfortable. Rideshare services, cryptocurrencies, and the ability to rent out rooms online allow for all kinds of things but could also result in entire industries going under, and do not necessarily lead to better conditions.
Just imagine the effect if healthcare, manufacturing, banking, commerce, work, supply and demand, and all the metrics by which we measure our lives were revolutionized overnight. Destabilization is one way of putting it, but it might not be an exaggeration to say that the effect would be more akin to chaos and anarchy.
An immediate takeaway from this is how those who have access to these technologies will have an advantage over those who don't. Especially where biotechnological advancements are concerned, people in developed nations will benefit before those living in developing countries. And in both cases, it is the wealthiest citizens who will have access before anyone else does.
However, eliminating scarcity also entails the possibility that all forms of want, inequality, and underdevelopment will also be eliminated. In a world where all basic needs can be met by possessing tools that cost virtually nothing to make, wealth, distinction, and the gap between the rich and the poor will cease to exist.
Of course, there is no shortage of naysayers, skeptics, and doubters regarding the Technological Singularity and similar predictions. In one camp, you have those who cite past claims such as flying cars, floating cities, and other futuristic visions that were predicted to come true by the 21st century.
While some speculative thinkers have proven to be bang on in the past, predicting the future has always suffered from a rather high failure rate. Second, some challenge the rather utopian outlook of thinkers like Ray Kurzweil, Peter Diamandis, and other Singularitarians who believe that this event will usher in a future of abundance and limitless opportunities.
Consider the technological advancements of the High Middle Ages, when European society incorporated innovations from the East like blast furnaces (which produced better cast iron), the compass, the astrolabe, and gunpowder. By the 15th century, Spain was well-positioned to inherit these inventions after expelling the Moorish rulers from the Iberian peninsula.
These inventions (and European diseases like smallpox) allowed the Kingdom of Spain to conquer and murder millions of Indigenous Peoples all across Latin America from 1493 to 1898. Similarly, the Industrial Revolution (mid-18th to early-20th century) led to a dramatic increase in population, production, and agricultural output in Europe and North America.
In addition, it led to the industrialization of warfare and the birth of weapons like the machine gun and modern repeating rifles, which enabled Europeans to conquer and pillage their way across Africa and Asia. Further inventions like modern artillery, airplanes, tanks, barbed wire, and weaponized gas led to the brutality of World War I and World War II.
And who can forget how nuclear science gave us nuclear weapons and the "Atomic Age"? In short, history is replete with examples of how technological progress was not paralleled by similar advancements in human wisdom or empathy.
Arthur Koestler summarized this view in his famous book The Sleepwalkers: A History of Man's Changing Vision of the Universe (1959). In it, he stated that human history could be visualized by plotting two curves (similar to temperature charts), where one curve measures physical power and the other "spiritual insight, moral awareness, charity, and related values":
"[I]n the course of the last two hundred years - a stretch less than one-thousandth of the total on the chart - the curve would, for the first time in the history of the species, suddenly rise in leaps and bounds; and in the last fifty years - about a hundred-thousandth of the total - the curve rises so steeply that it now points almost vertically upward...
"Compared to the first, the second curve will show a very slow rise during the nearly flat prehistoric miles; then it will undulate with indecisive ups and downs through civilized history; finally, on the last dramatic fraction of the chart where the power curve shoots upward like a cobra stabbing at the sky, the spiritual curve goes into a steep decline."
Last but not least, there is the rather repetitive claim that an event like the predicted Singularity is inevitable. Not only does this seem fatalistic to some, but it is a rather suspicious claim when spoken by Singularitarians, individuals who desire to see it happen and are even actively working to make sure it does.
As famed British historian A.J.P. Taylor remarked when addressing the causes of World War I and World War II: "Nothing is inevitable until it happens."
Will it happen (soon)?
Skepticism, as always, is healthy and even necessary. But dissenting opinions aside, an examination of human history and technological trends suggests that the rate of change has been accelerating since the beginning of human history. Arguably, it has only been since the development of modern industry and automation that we have become aware of this trend.
The trend has become even more pronounced since the 20th century, when multiple technological revolutions happened within the lifetime of a single generation. When the onset of change becomes this rapid, it can lead to a state of "Future Shock," a term coined by famous futurist and entrepreneur Alvin Toffler in his 1970 book of the same name.
Toffler described this phenomenon as "the shattering stress and disorientation that we induce in individuals by subjecting them to too much change in too short a time." A futurist himself, Toffler was essentially describing the downside that too much acceleration and "disruption" can have while stressing the need for adaptive strategies in the future.
"The acceleration of change in our time is, itself, an elemental force," he said. "This accelerative thrust has personal and psychological, as well as sociological, consequences." Humanity has always had a somewhat complex relationship with technology, but that relationship is timeless and undeniable.
What's more, a full and fair assessment will surely lead one to the conclusion that the net effect of technological change has been positive overall. Sure, technology gave us nukes, terrorism, spam emails, pandemics, and climate change. Still, it also gave us (for the most part) literacy, science, education, the internet, modern medicine, electricity, air conditioning, heating, alternative energy, recycling, and food security.
While it is always proper to treat the idea of "inevitability" or the promise of utopia with skepticism, it would also be irresponsible to ignore what is fast becoming an undeniable trend. From all outward appearances, technological change is an anthropogenic trend subject to acceleration, and the speed at which changes are coming is reaching a critical point.
While it remains to be seen whether the Technological Singularity will be all that it's advertised to be, it seems safe to assume at this point that it will happen. What's more, it may happen in our lifetimes!