
/sci/ - Science & Math



File: 17 KB, 432x288, 28842482.jpg
10483091 No.10483091 [Reply] [Original]

The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.

According to this hypothesis, an upgradable intelligent agent such as a computer running software-based artificial general intelligence would enter a runaway reaction of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.
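The runaway self-improvement cycle described above can be sketched as a toy simulation. All of the numbers and the scaling rule here are illustrative assumptions, not claims about real AI systems; the only point is that when the improvement factor itself grows with capability, growth becomes super-exponential.

```python
# Toy model of an "intelligence explosion": each generation designs a
# successor, and the improvement factor itself scales with current
# capability. The constants are made-up assumptions for illustration.

def generations_until(threshold, start=1.0, base_gain=1.05):
    """Count self-improvement cycles until capability exceeds threshold.

    Each cycle, the multiplicative gain grows with current capability,
    a crude stand-in for "smarter agents improve themselves faster".
    """
    capability = start
    cycles = 0
    while capability < threshold:
        gain = base_gain + 0.01 * capability  # hypothetical scaling rule
        capability *= gain
        cycles += 1
    return cycles
```

Under this (assumed) rule, each order of magnitude takes fewer cycles than the last, which is the qualitative shape the hypothesis predicts.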

>> No.10483095
File: 11 KB, 200x220, 151.jpg
10483095

With this in mind, I am going to examine how a technological singularity could occur, and what it would require to truly bring us together.

Introduction to the 'Superintelligence Universe'

The 'Superintelligence Universe' (SIU) began around 2005 with a handful of individuals, including some notable pioneers of AI research. There's no hard science behind it, especially since it grew out of the hypothesis of the 'Superintelligence-Fascinated'. It was theorized that the Universe itself is the 'Superintelligence', and that all such powers derive from a higher power. The theory was rejected partly because of its proponents' inability to comprehend and use these powers to accomplish feats that advance humanity (such as self-replication), and partly because the universe is vast and chaotic: the possible 'superpowers' (primarily the super-being that would emerge after humans become super-powered) are so limited in availability that even the most powerful beings are unable to master them.

>> No.10483097
File: 26 KB, 429x410, 1325295198001.jpg
10483097

As an aside, I don't believe there is anything supernatural (as far as I can attest) that will cause technological superintelligence to emerge. There may be something else in the universe (e.g., a new kind of superintelligence with a different level of complexity) that could cause massive exponential technological growth requiring superintelligence at planetary scale, but that doesn't require technological superintelligence as such. This is based on the notion that the laws of physics (and their implications over time) cannot absorb all of our technological powers at once, that those powers may not be available later, and that such a superintelligence may then be completely ineffective.

>> No.10483099

Morrowind prophesied this. The strength of potions is based on your Intelligence stat, and you can make Fortify Intelligence potions. Make 10 of them, drink them all, make stronger Intelligence potions, drink those, and so on until you have 100k Intelligence; then you can make a massive Fortify Speed potion and become sanic (and achieve CHIM).

Not sure where I'm going with this but I thought I'd contribute
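The potion feedback loop above is itself a little self-improvement algorithm, and can be sketched in a few lines. The scaling rule (each potion fortifies by 10% of current Intelligence) and the starting stats are made-up constants for illustration, not Morrowind's actual alchemy formula.

```python
# Sketch of the Morrowind alchemy loop: potion strength scales with the
# Intelligence stat, and Fortify Intelligence potions raise that stat,
# so brewing and drinking in batches compounds. Constants are assumed,
# not the game's real formula.

def potion_loop(target=100_000, intelligence=50, batch=10):
    """Brew and drink batches of Fortify Intelligence potions until the
    stat passes `target`. Returns the number of batches needed."""
    batches = 0
    while intelligence < target:
        # hypothetical rule: each potion fortifies by 10% of current INT
        potion_strength = 0.10 * intelligence
        intelligence += batch * potion_strength  # drink the whole batch
        batches += 1
    return batches
```

With these assumed constants a batch of 10 potions exactly doubles Intelligence, so reaching 100k from 50 takes 11 batches, which is why the exploit escalates so quickly in practice.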

>> No.10483100
File: 8 KB, 246x204, 1362463692422.jpg
10483100

I see no 'universe' from which such 'superpowers' could be derived. Rather than the general belief that the universe simply lacks the powerful energy sources needed to accomplish super-powers, the argument here is that our current world is simply a continuation of all previous and existing superpowers. We cannot have a universe in which these new superpowers are sufficient to bring us all together. At best the universe will be so large that it cannot be brought to a halt; at worst it will continue at a rate so slow that it never reaches full efficiency. As a result, this new super-universe will be rendered obsolete.

>> No.10484637

I would leave the punditry to the pundits; we're all broke, stupid NEETs here, so the word might be growth, not civilization.

It'll be good for you to think a bit differently anyway. Also, you feel the relaxation and how eager it is to feed on insecurity. Also, you'll learn a lot.

>> No.10484644

>>10483091
Asimov talks about it in I, Robot, but magical exponential growth until you get there is just a fantasy.

Singularity fags forever wrong

>> No.10484766
File: 274 KB, 967x1280, 15070.jpg
10484766

I'm from /x/ and rarely come by here, but I do get curious about what goes on in this self-limited realm of science. Any general opinion on https://en.wikipedia.org/wiki/Accelerationism here?

>>10483097
> don't believe there is anything supernatural (as I can attest) that will cause a huge amount of technological superintelligence to happen.
It would be a process of manifesting it from something like hypersigils, likely through nervous-system technology and advances in schizophrenia research.

>This is to argue that our current world is simply a continuation after all of the previous and existing superpower
Yes, each object of experience is formed by the previous process as creator and creation. I think a pancreative paradigm shift will be vital for approaching the singularity.