
/sci/ - Science & Math


>> No.10886315
File: 136 KB, 1260x560, 2F0DDBB7-763A-4DEB-B7F1-5E3B6065A8AB.jpg

It is well established that a technological singularity will occur at some point:
>Four polls, conducted in 2012 and 2013, suggested that the median estimate was a 50% chance that artificial general intelligence (AGI) would be developed by 2040–2050.
>The first person to use the concept of a "singularity" in the technological context was John von Neumann.[4] Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".

>Source: Wikipedia, "Technological singularity"
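
(Aside, not from the quoted article: one minimal way to make "essential singularity" precise is hyperbolic growth, where the growth rate of some capability x scales with x squared. Unlike exponential growth, this diverges at a finite time. A sketch, with k and x_0 as assumed constants:)

% Toy model (illustrative assumption, not from the Wikipedia article):
% capability x(t) grows at a rate proportional to its square.
\[
\frac{dx}{dt} = k x^{2}, \qquad x(0) = x_0 > 0, \quad k > 0.
\]
% Separating variables and integrating gives
\[
x(t) = \frac{x_0}{1 - k x_0 t},
\]
% which diverges at the finite time
\[
t^{*} = \frac{1}{k x_0}.
\]
% Exponential growth (dx/dt = kx) stays finite for every finite t;
% hyperbolic growth reaches infinity at t = t^*: the "singularity".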

How do we prevent it from occurring? Obviously we want to push technology right up to the edge of a singularity, to extract the maximum efficiency and gains for humanity, but not so far that we trigger an actual singularity, which would mean the EXTINCTION of Homo sapiens.

Is a Nanny AI the only option? (A friendly AI that monitors everyone on the planet to ensure that nobody, neither a lone hacker nor a government, triggers a singularity, and that notifies the authorities well beforehand.)
