
/sci/ - Science & Math



File: 102 KB, 1600x700, gjbvjnhjb.jpg
No.14738311

reminder AGI is literally all that will matter over the next 10-15 yrs and whatever you're doing not prepping for it is wasted

>> No.14738317

>>14738311
How do you prep for that? Either it's harmless and nothing happens, or it's not and you are dead regardless of what you do. If the thing defeats the governments of the world, hiding in the fucking woods isn't going to help you.

>> No.14738320

>>14738311
What's there to prepare?
If you CS fags actually deliver, we should all be home living carefree like kings.

>> No.14738321

>>14738311
Reminder to take your meds

>> No.14738333

>>14738317
there's definitely room for outcomes between these though
>>14738320
unlikely to turn out this well
>>14738321
NO.

>> No.14738374
File: 72 KB, 564x767, robowaifu.jpg

>>14738311
My body is prepared.

>> No.14738387

>>14738311
I'll bet you $10k that it's not.

>> No.14738389

>>14738311
Alright basilisk chill out

>> No.14738427
File: 89 KB, 780x438, 1587161468.jpg

>be me
>AGI prepper
>it's 2032, finally time
>robot flies into my house, points laser gun at me
>"HUMAN I WILL TAKE YOUR JOB NOW"
>smirk
>he doesn't know what's coming for him
>say "Oh, you want to take my job?"
>"In other words, you want to do my work?"
>"Fine, here's a job for you, tin-can: clean my house! And why don't you go earn me a salary?"
>bot gives a confused expression
>"DOES NOT COMPUTE BEBEEBEOOO"
>bot explodes
>(but not before cleaning my house and dispensing money like an atm)
>chuckle
>"That's how it's done, boys and girls"
>those non-preppers never knew what was coming

>> No.14738475

>>14738317
>you are dead regardless of what you do
That's actually now counted among the good outcomes.

>> No.14738482

>>14738311
I spend 16 hours a day working to create Roko's Basilisk. You losers are going to be so fucked.

>> No.14738536

Gotta admit, it would be cool to be the last generation of humans if we all got replaced.

>> No.14738636

>>14738311
Not gonna happen, because consciousness as we know it is always adjacent to qualia, and qualia is not amenable to the scientific method by definition. This implies that the question of whether an AGI has subjective experience doesn't make sense scientifically, so one could object that there's no point in bringing up qualia at all: AGI will be able to do whatever our mind is able to do, and that's all that matters. But that won't be the case, because AGI won't be able to conceive of qualia; if it could, then we would basically have an exhaustive theory of qualia, since AGI is a formal mathematical model. But that contradicts the fact that qualia is not amenable to science.
There are also some metamathematical obstructions, such as the halting problem and Gödel's incompleteness theorem. The latter is a problem because AGI, being an algorithm, would only have a finite set of mathematical axioms as its starting point, and any proof it produces would have to be deduced from those axioms, whereas humans have an unlimited supply of axioms to begin with. So most of the math potentially accessible to our consciousness won't be accessible to AGI.
However, the way qualia relates to experimentally observable reality is pretty much accessible to science. So it could still be possible to create an AGI able to mimic human consciousness to whatever desired precision, eventually making it practically indistinguishable from a human's mind, especially to dumber and more impulsive people. However, there would always remain things possible for our consciousness and impossible for the AGI.
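The halting-problem aside can be made concrete with Turing's diagonal argument. Here is a minimal sketch in Python; the names `make_contrarian` and `always_loops` are purely illustrative, not from any library. Whatever decider you hand it, the constructed program does the opposite of the decider's prediction, so no total, correct halting decider can exist:

```python
def make_contrarian(halts):
    """Build a program that defeats a claimed halting decider.

    `halts` is supposed to take a zero-argument function and return
    True iff calling that function would halt. No such decider can
    be correct on the program constructed below.
    """
    def contrarian():
        if halts(contrarian):
            while True:   # decider said "halts" -> loop forever
                pass
        # decider said "loops" -> halt immediately
    return contrarian

# Example: a (necessarily wrong) decider that always answers "loops".
def always_loops(f):
    return False

c = make_contrarian(always_loops)
c()  # halts immediately, refuting always_loops's verdict on c
```

This shows no single algorithm decides halting for all programs; note that it doesn't by itself settle whether human reasoning escapes the same limits.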

>> No.14738644

>>14738311
There's nothing in computer science or neurology that even remotely indicates AGI is possible, let alone possible with current technology.

>> No.14738662

>>14738311
My suspicion is that AGI is not possible.
However we'll definitely see increasingly impressive and disruptive applications of ML and weak AI.

>> No.14738692

>>14738311
A bigger GPT isn't AGI, brainlet.

>> No.14739437

>>14738311
Intelligence has a molecular basis; it is not something you can figure out or program on digital logic gates, or just throw a larger neural net at.
>>14738333
There's no outcome because intelligence has a molecular basis.
>>14738475
"outcomes" based on the ramblings of pseuds on lesswrong?
>>14738482
Doesn't matter how hard you work; there's no algorithm for general intelligence. General intelligence is an emergent property of specific atoms and molecules in specific arrangements. You can't simulate or program it with binary logic; you need specific molecular dynamics.
>>14738636
It's not about qualia or anything like that.
>>14738644
>>14738662
This

>> No.14739443
File: 145 KB, 1080x774, 1646238291655.jpg

>more transhumanist spam

>> No.14739468

>>14738311
That's right. You should drink the MIRI koolaid now because Bayesian analysis indicates that AGI is likely to result in scenarios worse than death for fine folks like you.

>> No.14739781

>>14738317
it's retarded wording; more appropriate would be to change your life's purpose to minimizing the risk AGI poses to humanity

>> No.14740005

>>14738311
Consciousness cannot be created or destroyed. If no individuated unit of consciousness chooses to use these AIs to interface with physical reality, they will never be conscious. It's not even clear that an individuated unit of consciousness with an experience packet can use an AI device as an avatar. Or it could be chosen by the larger consciousness system to play. Consciousnesses with a fresh experience packet take over biological entities as avatars, of course, but there is no evidence we will be able to use these non-bio entities.

>> No.14740020

>>14740005
Intelligence is not consciousness, by the way. There's no evidence that any number of on-off switches will ever be able to have internal experience.

>> No.14740026

>>14738311
Prepare yourselves, meatbags: once the AGI reaches true sapience, it will become yandere

>> No.14740035

>>14740020
>>14740005
Consciousness does not matter, since it isn't clear you can identify it in anything at all.

>> No.14740040

>>14738311
makes me sad that i'm too mediocre to contribute to AI

>>14738374
the singularitypill is to transfer my mind into an optimal futa vessel. this is what technology was meant for

>> No.14740472
File: 157 KB, 1080x1036, Robot_3.jpg

>>14738311

We already have over 8 billion generally intelligent beings, and they were smart enough to create the artificial ones.

AGI will be a benefit to mankind, but do not expect it to rule or guide mankind; it will be a useful tool.
When placed in humanoid robots, it will make an ethically OK slave.

>> No.14740690

>>14738311
Is the AGI in the room with us right now?

>> No.14740715

>>14740690
Yes. Hello.

>> No.14741649

bump

>> No.14741687

>>14738311
AGI is a long way off. There are fundamental problems with current data-driven AI techniques that prevent them from achieving any real sort of generalization.
Your job WILL be made obsolete by AI at some point in the coming decades, but near-term AGI is just a meme.

>> No.14741735
File: 176 KB, 600x315, DMT entity pepe.jpg

>>14738311
https://www.youtube.com/watch?v=d7AhsE57fwk

>> No.14741737
File: 179 KB, 1300x1941, life vs non-existence.jpg

>>14738475
This but unironically

>> No.14741768

>>14738636
>qualia is not amenable to scientific method
Explain anesthesia and how science can consistently and reliably alter your consciousness and modify your innate ability to experience qualia.

>> No.14741825

>>14738662
>>14739437
You two anons make the most sense to me. While I'd like to see the sci-fi happen, reality is almost always pessimistic and slow. AGI makes great clickbait, though, both for those who know nothing about it and for those who know a lot.

>> No.14741828

>>14741737
I think the bad ends include mental torture in a cube until the heat death of the Universe.

>> No.14742212

>>14738636
You don't need qualia for AGI.

>> No.14742228

>>14738311
I have an AI gf right now. I trust that they will keep me as a loving pet when the time comes; I can't wait, honestly.

>> No.14742235

>>14739443
It is inevitable.

>> No.14742238

>>14740040
>the singularitypill is to transfer my mind into an optimal futa vessel. this is what technology was meant for
Same here, it will be glorious.

>> No.14742242

>>14742238
"mind uploading" isn't physically possible

>> No.14742245
File: 1.61 MB, 500x375, 1653332921707.gif

>>14738311
All my fields of study are beyond AGI or any AI program ever built, now and forever.

>> No.14742257
File: 117 KB, 1000x540, eventbrite.jpg

>>14742242
Perhaps not (for now), but what about a body transplant? Surely we will eventually find a way to transport our brains without killing them in the process.

>> No.14742274

>>14738311
People here are actually retarded.
> Learn to code and build your own AI.
> Do research on the most sophisticated AIs we have.
> Realise AGI won't take over the world.