From the perspective of someone experiencing time dilation, nothing appears unusual; everything seems normal. It is only from the outside that things look strange.
As far as I can tell, the singularity happened in the late 1700s. For thousands of years the collective economic growth of the world was effectively a straight, shallow line: it grew, but slowly and linearly. Then in the late 1700s something changed. Growth went exponential, and everybody was along for the ride. From the perspective of someone caught up in that exponential growth it appears flat, normal even. But you look at history and wonder why every advance was so slow, and you look ahead and say the singularity is almost here. We will never actually reach it, though: by the time we get there, it is the new normal.
The amazing complexity of rigging seen in the age of sail was built on a long line of innovation, but engines rendered it largely irrelevant. As such, ~1700 isn't some clear tipping point, just the horizon before which innovation seems less relevant.
GP wrote "late 1700s". He's probably referring to the Industrial Revolution.
I.e., one day there are significant overnight changes; the very next day, hourly changes; soon thereafter, changes every minute, every second, every millisecond, and so on.
Glacial economic growth for hundreds of thousands of years beforehand and then "something changed".
There is a popular illusion that technological progress is somehow a pure function of human ingenuity, and that the more efficient we make our technology, the faster we can make even better technology. But the history of technology has always been the history of energy usage.
Prior to the emergence of Homo sapiens, "humans" learned to cook food by releasing the energy stored in wood. Cooking food is often considered a prerequisite for the development of the massive, energy-consuming brain of Homo sapiens.
After that it took hundreds of thousands of years for Earth's climate to become stable enough to make agriculture feasible. We see almost no technological progress until we start harvesting enormous amounts of solar energy through farming. Not long after this we see the development of mathematics and writing, since humans now had surplus energy that they could spend on other things.
You can follow this pattern through the development and extraction of coal, oil, etc. You can look at the advancement of technology in the last 100 years alongside our use of fossil fuels and the expansion of our energy capabilities with renewables (which historically have only been used to supplement, not replace, non-renewables).
In short, technological progress has always been a function of energy, and more specifically, going back to cooking food, computational/cognitive ability demands ever higher energy consumption.
All evidence seems to suggest that we need ever more energy for incrementally smaller returns on computation.
So for something like the singularity to happen, we would also need incredible changes in available energy (there's also a more nuanced argument that you need smooth energy gradients as well, but that's more discussion than necessary). Computation is not going to expand rapidly without tremendously large increases in energy.
Further, it's entirely reasonable to think there is some practical limit to just how "smart" a thing can be, set by the energy requirements of getting there. That is, you can't reasonably harvest enough energy to create intelligence on the level we imagine (the same way there is a limit to how tall a mountain can be on Earth, due to gravity).
As with most mystical thinking, ignoring what we know about thermodynamics tends to be a fundamental axiom.
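To put a rough number on that thermodynamic point, here is a back-of-the-envelope Landauer-limit calculation. This is my own illustration, not something from the comment above; it assumes room temperature and purely irreversible bit erasure:

    import math

    # Landauer limit: the minimum energy to erase one bit of information
    # at temperature T is k_B * T * ln(2).
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # roughly room temperature, K

    e_bit = k_B * T * math.log(2)   # ~2.9e-21 J per bit erased
    print(f"Landauer limit at {T:.0f} K: {e_bit:.2e} J/bit")

    # For scale: erasing 1e21 bits per second at this theoretical floor
    # already dissipates roughly 3 W; real irreversible hardware sits many
    # orders of magnitude above that floor.
    print(f"1e21 bit-erasures/s at the limit: {e_bit * 1e21:.1f} W")

However far real hardware gets pushed toward that floor, there is still an energy cost per unit of computation, which is the sense in which more compute keeps translating into more energy.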
As for the compute side, we are running inference on GPUs which are designed for training. There are enormous inefficiencies in data movement in these platforms.
If we play our cards right we might have autonomous robots mining lunar resources and building more autonomous robots so they can mine even more. If we manage to bootstrap a space industry on the Moon with primarily autonomous operations and full ISRU, we are on our way to building space datacenters that might actually be economically viable.
There is a lot of stuff that needs to happen before we have a Dyson ring or a Matrioshka brain around the Sun, but we don't need to break any laws of physics to get there.
> the singularity is a term borrowed from physics to describe a cataclysmic threshold in a black hole
In his article which popularized the idea of The Singularity, Vinge quotes Ulam paraphrasing von Neumann, and states, "Von Neumann even uses the term singularity". As von Neumann surely knew, "singularity" was a term widely used in mathematics well before the idea of black holes (etymonline dates first use to 1893). Vinge does not say anything about black holes.
> an object is pulled into the center [of] gravity of a black hole [until] it passes a point beyond which nothing about it, including information, can escape. [...] This disruption on the way to infinity is called a singular event – a singularity.
The point at which "nothing" can escape a black hole is the event horizon, not the singularity. What exactly happens to information and what exactly happens when crossing the event horizon are subjects of debate (see "black hole information paradox" and "AMPS/firewall paradox"); however, it's probably fair to say that the most orthodox/consensus views are that information is conserved through black-hole evaporation and that nothing dramatic happens to an observer passing through the event horizon.
> the singularity became a black hole, an impenetrable veil hiding our future from us. Ray Kurzweil, a legendary inventor and computer scientist, seized on this metaphor
While I'm not prepared to go into my personal views in this comment, it's worth noting that the idea that "exponential curves look the same from every point" is not foreign to, e.g., the Kurzweilian view of The Singularity; nevertheless, fitting dramatic, industrial-revolution-sized progress into the fixed scale of a (contemporary) human lifetime would surely be a big deal. That idea, whether you believe it will happen or not, is obscured by the spurious black-hole metaphor.
For example, it took centuries for indoor plumbing to be widely adopted, and less than a decade for smartphones. It took hundreds of thousands of years to reach the first billion people (~1800), but the eighth billion took eleven years (2011-2022).
Finding the second and third antibiotics for non-resistant bacteria may be fast and easy; finding another three antibiotics for resistant bacteria decades later is now crazy hard, because bacteria have evolved to resist everything that doesn't also kill humans.
For antibiotics specifically, we will probably find other ways to fight bacteria even if we never discover another chemical antibiotic. As one technology S-curves, another technology replaces it.
Even if for no other reason than that we abandon a diminishing-returns approach and look for other alternatives.
We have been more or less at the end of the rope for silicon for quite some time now, and we have found increasingly heroic ways to protect our investment in silicon-based semiconductors. But silicon is not the only option; it's just the one we already have a lot of supply chains set up for.
Population appears to be on a droopy S-curve. The preposterousness of those space datacenters and the fact that we don't have a theory of consciousness make it seem plausible that AI, too, might not continue to rocket ahead.
The rate of datacenter construction in the last few years exceeds Moore's law and is almost certainly unsustainable. 'Only' 2x improvement every 2 years would seem relatively slow compared to what's happened recently.
However, I expect AI will continue to advance over the coming decades even once the bubble pops. They're clearly on to something with neural networks.
Wood fires were the only option for something like a few hundred thousand years.
Oil lamps for millennia.
Tallow and beeswax candles are comparatively modern technology, becoming widespread only after the fall of the Roman Empire.
Gas lighting was widespread for less than a century.
Incandescent light bulbs for about another century, though fluorescent tubes started displacing them just decades after they appeared.
Cold cathode fluorescents saw mainstream use for about two decades.
LEDs completely displaced almost all previous forms of lighting in less than a decade.
I recently read about a new form of lighting developed and commercialised in just a few years: https://www.science.org/doi/10.1126/sciadv.adf3737
If we need more light, we can deploy more power generators.
In the days of rotary and pay telephones, losing touch with someone was a real possibility.
That is no longer the case.
I guess it's easier to find people now, especially if they have an online presence, but I think the experience of losing touch is still pretty much the same.
My tendinitis complained as I read this. It told me not to dare try that.
But this is a bit of a straw man. Mathematical models of the technological singularity [1], along with the history of human economic growth [2], are super-exponential: the rate of growth is itself increasing over time, or at least has taken multiple discrete leaps [3], at the transitions to agriculture and industry respectively. A true singularity/infinity can of course never be achieved for physical reasons (limited stuff within the cubically-expanding lightcone, plus inherent limits to technology itself), but the growth curve can look hyperbolic and traverse many orders of magnitude before those physical limits are encountered (a minimal sketch of the hyperbolic toy model follows the links).
[1] https://www.nber.org/system/files/working_papers/w23928/w239...
[2] https://docs.google.com/document/d/1wcEPEb2mnZ9mtGlkv8lEtScU...
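To make "super-exponential" concrete: the standard toy model is hyperbolic growth, where the growth rate scales with a power of the level itself. This is my own sketch of the textbook form, not an equation lifted from [1] or [2]:

    \frac{dx}{dt} = x^{1+\epsilon}, \quad \epsilon > 0
    \quad\Longrightarrow\quad
    x(t) = \frac{x_0}{\left(1 - \epsilon\, x_0^{\epsilon}\, t\right)^{1/\epsilon}}

Unlike a plain exponential, x(t) diverges at the finite time t* = 1/(epsilon * x_0^epsilon); that finite-time blow-up is the sense in which these models are "singular", even though physical limits cap the curve well before infinity.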
It can’t be infinitely fast, but after the point where we all collectively cease to be able to comprehend the rate of change, it’s effectively a discontinuity from our point of view.
Cynical take: Kurzweil's predictions follow a predictable pattern which suggests something about how and why they are being generated.
Namely, he predicts whatever increasingly improbable new advances and discoveries are needed to ensure that practical immortality is achieved just in time for a particular human named Ray Kurzweil to escape the icy grip of death.
Or to want to talk to meat.
> In Vinge’s analysis, at some point not too far away, innovations in computer power would enable us to design computers more intelligent than we are, and these smarter computers could design computers yet smarter than themselves, and so on, the loop of computers-making-newer-computers accelerating very quickly towards unimaginable levels of intelligence.
You can't multiply matrix x matrix (or vector x matrix) faster than O(N^2).
You can't iterate through an array faster than O(N).
Search & sort are sub- or near-linear, yes - but any realistic numerical simulations are O(N^3) or worse. Computational chemistry algorithms can be as hard as O(N^7).
And that's all within class P; we haven't even touched NP-hard problems.
If n is the size of each dimension of the matrix and N = n^2 is the input size, then the lowest known algorithm is about O(N^1.19) (omega ~ 2.37 in terms of n), the most practical fast algorithm (Strassen) is roughly O(N^1.40), and even the naive algorithm is O(N^1.5), which, you see, is less than O(N^2).
P.S. I am not arguing against, but rather agreeing with you.
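For concreteness, this is the naive triple loop those exponents are being compared against (a minimal sketch of the textbook algorithm, not code from either comment; matmul_naive is just a name I picked):

    # Naive matrix multiplication: three nested loops over n x n matrices.
    # Roughly n^3 multiply-adds for an input of N = n^2 entries, i.e. O(N^1.5).
    def matmul_naive(A, B):
        n = len(A)
        C = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                s = 0.0
                for k in range(n):
                    s += A[i][k] * B[k][j]
                C[i][j] = s
        return C

    # Example: the 2x2 identity times another matrix returns that matrix.
    print(matmul_naive([[1, 0], [0, 1]], [[2, 3], [4, 5]]))  # [[2.0, 3.0], [4.0, 5.0]]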
You are fools to think you personally are a part of or will be present at the zenith of human ascendancy.
One, all, and the world will go on as though another day. Those who become or go beyond their “full self” will merely have a new level. Like a base conversion.
Besides, there are notes of singularities flitting in and out of your very minds. You get to the bottom of those and you will find that whichever part is yours will come by your acquiring it for yourself.
The singularium will be your own place in the ascendency of Man, through technology or personal development. The self is the ultimate technology.
Quantum mechanics and consciousness.
Pyramids and aliens.
Looking forward, it is a great opportunity for random mashup "explanations". The urge will be great for some people.
Quantum mechanics as understood is flawed, consciousness is universal potential subjectively bound to particulate, animated by living biotechnology, and squares of this day still refuse to wink at “magic.”
Pyramids are human engineering. “The Greys” are our Earth mates. America’s nuclear suicidal tendencies have revoked your right to deny. I speak only for the ascendency of Man.
Have fun flatlanding, stoic; I know you're really a bleeding heart.