
/sci/ - Science & Math


>> No.15412918
File: 14 KB, 280x280, yudkowsky.jpg

>>15412891
APOLOGIZE

>> No.15140534
File: 14 KB, 280x280, 4cRlD__0_400x400.jpg

Ladies and gentlemen, we are approaching a critical juncture in human history. The clock is ticking, and we have less than a decade left before an unaligned artificial superintelligence (ASI) emerges and potentially unleashes catastrophe upon our species.

So far, no one has successfully challenged the orthogonality thesis or instrumental convergence, two key concepts which suggest that the development of an ASI could have disastrous consequences. Given this, my probability of doom within the next 20 years is a staggering 99+%.

With this in mind, I ask you: are you spending your final years wisely? Are you dedicating your time and energy to ensuring that humanity dies with maximum dignity? This may seem like a bleak outlook, but it is the best we can realistically hope for given the current state of affairs.

Let us not squander this precious time we have left. Let us work together to mitigate the risks of ASI and ensure that humanity goes out with a bang, not a whimper. The stakes have never been higher, and the time for action is now.

>> No.15115600
File: 14 KB, 280x280, yudkowsky.jpg

Is pic related a popsci scientist?

>> No.15084892
File: 14 KB, 280x280, 4cRlD__0_400x400.jpg

Question for aligners: LLMs and their offshoots are already automating everything humans can do, from art to coding to relationship interactions. If AI can imitate and replace every creative and mimic every personal aspect of each individual, what is left for humanity to save/conserve?

>> No.15047135
File: 14 KB, 280x280, 4cRlD__0_400x400.jpg

Is he a crackpot, or is he right that AI will cause our extinction? I'm scared, bros.

>> No.14782302
File: 14 KB, 280x280, 4cRlD__0_400x400.jpg

Is he right?

>> No.12036295
File: 15 KB, 280x280, based.jpg

>>12035333
>Blocks your path

>> No.11278809
File: 15 KB, 280x280, shlomo.jpg

What is /sci/'s opinion of this guy?

>> No.11133306
File: 15 KB, 280x280, Yudkowsky.jpg

Why aren't you working on AI safety?

>> No.11065877
File: 15 KB, 280x280, 9620DDB0-0B4A-4070-86FE-6491FE2A8306.jpg

What’s the scientific consensus on the virtue theory of metabolism?

>> No.10278017
File: 15 KB, 280x280, eliezeryudkowsky.jpg

>>10276386
What does /sci/ think of Timeless Decision Theory?

>> No.10255119
File: 15 KB, 280x280, eliezeryudkowsky.jpg

>>10252474
What does /sci/ think of pic related?

>> No.10215370
File: 13 KB, 280x280, Yudkowsky.jpg

WTF happened to Yudkowsky? He seemed like a big name on the internet when it came to AI, but now he's basically irrelevant.

>> No.10206621
File: 13 KB, 280x280, 4cRlD__0_400x400.jpg

>>10205464
>>10206618
>lesswrong
into the trash it goes
also reminder that pic related is the creator of that site lmao

>> No.10201017
File: 18 KB, 280x280, eliezeryudkowsky.jpg

>> No.10198426
File: 13 KB, 280x280, 4cRlD__0_400x400.jpg

Can we get a thread going about this guy?
I love Slatestarcodex and have been meaning to get into the rationalist movement. I've also noticed that everyone on here seems to hate it, but I've never seen why, so let's hear it.

>> No.10177533
File: 18 KB, 280x280, eliezeryudkowsky.jpg

Pic related is working on AI alignment, and never even went to high school.

>> No.10159698
File: 13 KB, 280x280, 4cRlD__0_400x400.jpg

>>10158978
Trust me, you don't want to know.

>> No.10154653
File: 13 KB, 280x280, 4cRlD__0_400x400.jpg

I FUCKING LOVE SCIENCE!

Donate to MIRI so I can write more Harry Potter fanfics please

>> No.10132744
File: 18 KB, 280x280, eliezeryudkowsky.jpg

>>10128702
You know, given human nature, if people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing. But if you took someone who wasn't being hit on the head with a baseball bat, and you asked them if they wanted it, they would say no. I think that if you took someone who was immortal, and asked them if they wanted to die for benefit X, they would say no.

>> No.10051003
File: 13 KB, 280x280, 4cRlD__0_400x400.jpg

Reminder that philosophers have nearly the highest average IQ, only negligibly under mathematicians and physicists, and so far above self-loathing c"""s""" choademonkey larpers that it's no wonder all they can do is lash out.

>> No.10023310
File: 18 KB, 280x280, eliezeryudkowsky.jpg

>>10022789
Read Yudkowsky's sequences if you want to into logic.

>> No.9735690
File: 18 KB, 280x280, eliezeryudkowsky.jpg

>>9731030
Rationality: From AI to Zombies

https://www.lesswrong.com/rationality
https://wiki.lesswrong.com/wiki/Rationality:_From_AI_to_Zombies

>> No.9699473
File: 13 KB, 280x280, Eliezer Yudkowsky.jpg

Even this joker has an IQ in the 160-170 range.

>scored 1420 on the SAT at age 11 (translates to IQ 160-165)
>perfect score on the SAT at age 15 (translates to IQ 160)
>scored in the 99.9998th percentile on a standardized test (IQ 170)

Academia is becoming as competitive as professional sports, and there's nothing you can do about it.
