
/sci/ - Science & Math



File: 43 KB, 720x480, lain.jpg
No.2622801

Is Computer Science, if we look forward in time, really the study of technological transcendence?

TL;DR: http://en.wikipedia.org/wiki/Technological_singularity
A technological singularity is a hypothetical event occurring when technological progress becomes so rapid that it makes the future after the singularity qualitatively different and harder to predict. Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence, and allege that a post-singularity world would be unpredictable to humans due to an inability of human beings to imagine the intentions or capabilities of superintelligent entities. Some writers use "the singularity" in a broader way to refer to any radical changes in our society brought about by new technologies such as molecular nanotechnology, although Vinge and other prominent writers specifically state that without superintelligence, such changes would not qualify as a true singularity. Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.

The term "technological singularity" reflects the idea that such change may happen suddenly, and that it is difficult to predict how such a new world would operate. It is unclear whether an intelligence explosion of this kind would be beneficial or harmful, or even an existential threat, as the issue has not been dealt with by most artificial general intelligence researchers, although the topic of friendly artificial intelligence is investigated by the Singularity Institute for Artificial Intelligence and the Future of Humanity Institute.


What does /sci/ think?

>> No.2622822
File: 36 KB, 398x398, phobos.jpg

I think that Transcendent Man is coming out today/tomorrow.

My body is ready.

>> No.2622838

It's really tough to have anything intelligent to say about this subject. It's all speculation. Though none of it falls outside the laws of physics, that's for damn sure. Which would lead one to imagine that it really is only a matter of time, and that's pretty awesome to think about.

>> No.2622971

>>2622838

Right. Which is why I posted about it. I personally think it is inevitable and I believe in the idea. I certainly want it to happen in any case. The sooner the better. The nihilistic side of me doesn't give a fuck if it leads to our extinction either. Because you have to look at it the other way around as well and recognize that if we merge with computers then we won't be humans anymore. We'll be some new composite being of some sort that can't even be described.

>> No.2625641

>>2622838
>Which would lead one to imagine that it really is only a matter of time
The Singularity is just becoming a philosophy and a sales pitch. My concern is somebody using that guise to rush out a premature, poor product. The more responsibility you give this new entity, if you will, the more liability and risk we take on. Sure, if you want that superintelligence to take over traffic lights, the rewards would definitely outweigh the potential risks, but put it in charge of functions necessary for life on a MASS scale and we have a problem.

Ugh, I just think the people pushing this need assloads and assloads of proof-of-concept work done instead of assloads and assloads of sales pitches and "OH BUT IT'S COMING EVENTUALLY ANYWAYS GUYS, JUST ACCEPT IT NOW (FOR AN EXTREMELY LOW INTRODUCTORY OFFER)".

I also think that all these singularists are just human-hating liberals, and that's why they are so enthusiastic about this crap. I hate (and enjoy) life/the human race just as much as, if not more than, the next guy, but I don't keep my feelings about it secret, nor do I hope for any miracle, wipe-my-ass technologies. Radio, TV, computers, and cell phones have done enough damage to the evolution of man, and the next phase of technologies is not going to be any better just because it strives for "super-intelligence".

>> No.2625660

as a computer science student i got the impression computer science was the study of algorithms, programming, math, and corporate slavery


please tell me more about how it relates to becoming a god

>> No.2625678

i never got the hype about the singularity.
yes a copy of you will live forever on a computer.
not you, you will die.
plus i'd like to keep my balls.

>> No.2625687

I think we'll have another dark age before reaching that point.

>> No.2625689

Computer science is the study of why java applets simulating horse races with threads make you want to suicide.
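For anyone who never suffered through that assignment: below is a minimal sketch of the kind of threaded horse-race exercise being joked about, written as a plain console program rather than an applet. The class and variable names are made up for illustration, not taken from any real coursework.

// Hypothetical sketch of a threaded horse-race exercise (names invented).
// Each horse runs on its own thread; the first to reach the finish line
// flips a shared flag so only one winner gets announced.
import java.util.Random;
import java.util.concurrent.atomic.AtomicBoolean;

public class HorseRace {
    private static final int FINISH_LINE = 100;
    private static final AtomicBoolean raceOver = new AtomicBoolean(false);

    public static void main(String[] args) {
        for (int i = 1; i <= 4; i++) {
            final String name = "Horse " + i;
            new Thread(() -> {
                Random rng = new Random();
                int position = 0;
                while (!raceOver.get()) {
                    position += rng.nextInt(10); // gallop a random distance
                    if (position >= FINISH_LINE
                            && raceOver.compareAndSet(false, true)) {
                        System.out.println(name + " wins!");
                    }
                    try {
                        Thread.sleep(50); // pace the race
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }).start();
        }
    }
}

Each horse thread advances by a random amount per tick; the AtomicBoolean compare-and-set is the usual trick for letting exactly one thread declare victory.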

>> No.2625716

I just can't wait until everything is wireless and we can control computers etc. with our minds. I'd predict this will be accessible to the public within the next 20 or 30 years.

>> No.2625728

>>2625716
http://www.youtube.com/watch?v=iDV_62QoHjY&feature=player_embedded

>> No.2625744

>>2625716
Won't be long!

captcha: Herbert ubercent

>> No.2625759

>>2625678
don't confuse the singularity with becoming a virtual entity, the singularity is just an unpredictable point of technology rapidly advancing, no guarantee of what we will be on the other side

>> No.2625766

>>2625728
that's a bullshit waste of money and time right there

>> No.2625830

>>2625766
> doesn't realize BCIs have applications other than driving cars

fullretard.jpeg