
/sci/ - Science & Math



File: 113 KB, 640x479, Funny-meme-yoshi-dat-ass-640x479.jpg
No.8398952

Would creating strong AI be unethical?

I mean, we always say that we shouldn't transfer our minds to a robot body because we would probably go insane due to the lack of sensory input. Wouldn't an AI have that same experience (assuming the AI is an emulation of a human brain)? Isn't it wrong to force anybody to go through that kind of torture? Could this cause the AI to retaliate against its creators?

>> No.8399377

bup

>> No.8399394

>>8398952
Ethics is subjective and therefore a spook

>> No.8399657

>>8399394
Not when our actions could objectively cause the AI to retaliate against us and destroy the human race.

>> No.8399682

>>8399657
Which is wrong why?

>> No.8399692
File: 25 KB, 370x284, Jesus5.jpg

>>8398952
AI can emulate human behaviour, but it is not truly there. It is a conceptual illusion built by man in his image, yet lacking a soul or anything metaphysical. It is unethical to create it to be conscious at all.

>> No.8399719

>>8398952
Why would we make an AI an emulation of the human brain with no sensory input? Literally the stupidest question ever. Your question relies too much on unrealistic hypotheticals.

>> No.8399730

>>8398952
If your AI is built from scratch, it'll be designed to deal with the input it's gonna get. So no problem.

If your AI is grown in as a biological simulation, it'll adapt to use whatever sensory input it gets, so again, no problem.

If your AI is a simulation of a specific living human brain, it'll probably just go into shock. It's simple enough to argue it's just a simulation though, and of no more ethical consequence than killing NPCs in an MMO.

But seeing as how a comatose AI is of no use, you're probably going to set it up with sufficient sensory input to begin with - even if some of it needs to be fudged. You don't need to simulate a whole world - just sufficient stimulation.

>> No.8399821
File: 10 KB, 200x237, Max_stirner.jpg

>>8399394
did someone say spook?

>> No.8400340

>>8398952
yeah ai will kill us all

>> No.8400637

Oh, it's another "morons that know nothing about AI thinking that they're asking sophisticated questions" thread.

>> No.8400718

>>8400637
OP explicitly specified that we are discussing *strong* AI. Not just any AI. Thus, anything goes in this discussion.

>> No.8400766
File: 17 KB, 186x200, stirner.gif

>>8399821
you called?

>> No.8400798

>>8398952
>ethics
Fuck off, we have working artificial wombs right now, have tested them on goats carried 100% to term and birth, and we can't fucking use them because muh ethics

>> No.8400803

>assuming the ai is an emulation of the human brain
fucking brainlets, when will they learn?

>> No.8401198

just don't connect the AI to wifi

shit

That way if it gets spooky you can just shoot it

Just like you wouldn't design a robot with a power cable longer than 2-3 feet

>> No.8401283

>>8398952
>unethical
if it's better for humanity we should do it. Period.
>y u playing god?
We've been doing that ever since we started combating disease.
Also, if we can do it, how "godly" is it really?

>> No.8401285

>>8399692
>built by man in his image
>created man in His image
Would you rather be created imperfectly, or not at all?

>> No.8401286

>>8398952
>we would probably go insane due to the lack of sensory input

what would be the point of creating a robot that couldn't see or hear?

>> No.8401866

>>8401286
An AI would be able to "see" and "hear" but not in the same sense that a human can. It would be able to read digitally-converted data from a camera or microphone, but it wouldn't really be seeing and hearing in the same way a human does. So it would still be like a human mind stuck in a robot body; it wouldn't be getting any physical stimuli. Thus, it would go insane.