[ 3 / biz / cgl / ck / diy / fa / ic / jp / lit / sci / vr / vt ] [ index / top / reports ] [ become a patron ] [ status ]
2023-11: Warosu is now out of extended maintenance.

/sci/ - Science & Math


View post   

File: 321 KB, 494x672, Ponco.png [View same] [iqdb] [saucenao] [google]
6134003 No.6134003 [Reply] [Original]

At what point should a robot gain human rights, if at all?

What would stop us from programming them to not revolt if they don't get their way?

>> No.6134012

It's all based on your morals. Some people would say "When it gains a conscience". But then there are some people who would avoid giving them a conscience to begin with.

I think there should be some fail-safe should they pose a threat to society.

>> No.6134066

>>6134003
>What would stop us from programming them to not revolt if they don't get their way?

The only way to a robot revolt itself is if it is emulating a form of intelligence similar to ours, with similar objectives and mechanisms.

We are programmed to avoid damage, to reproduce and to avoid death and to get achievements. A robot with such traits could revolt since it puts itself as a priority.

What is more likely to happen is that a poor programmed robot could ham us not because it is revolting but because it is following a flawed program.

Something like a very intelligent robot cop programmed to have its highest priority to keep people safe. This robot could start locking people in isolation from each other to keep them safe from every harm. It is not revolting, just following a flaw in the program.

Also robots shouldn't be programed to protect themselves against deactivation this way we could just send them a command to deactivate when needed.

About human rights maybe when we start having robots with human minds in them. Them probably they shouldn't be allowed to have bodies that are much stronger/faster than an augmented human.

>> No.6134076

When they are able to recognize and understand their own existence.

or, if they can ask to remain alive they should be permitted to.

>> No.6134105

>>6134076
>When they are able to recognize and understand their own existence.

>or, if they can ask to remain alive they should be permitted to.

Gonna program my computer to say "I'm alive! Please don't kill me!" Every time someone tries to turn it off.

>> No.6134137

I dont think we have to ever worry about a robot rebellion. We have EMP's.

>> No.6134142

>>6134137
Incredible. The human was impervious to our most powerful magnetic field, yet he was destroyed by a harmless pointed stick.

>> No.6134160

>>6134142
We might be squishy bags of tissue, but at least we can endure rain/water without shorting out. I think the real problem would be if these robots were smart enough to disable our defense mechanisms (EMPs) before revolting.

>> No.6134165

>>6134142
But honestly, I'm not an android defense expert and I definitely don't know the repercussions of a magnetic blast that large. I just dont think it would be as big of a deal if our laptops grew legs and started breaking shit as people think it would be.

>> No.6134171

when they ask for them.

>> No.6134178

>>6134171
Touche

>> No.6134180
File: 602 KB, 1600x1200, image.jpg [View same] [iqdb] [saucenao] [google]
6134180

>>6134160
>>6134165
That was a quote from Futurama.

>> No.6134193

>>6134180
Really? I've seen that show enough to have recognized that lol

>> No.6134202
File: 82 KB, 715x536, image.jpg [View same] [iqdb] [saucenao] [google]
6134202

>>6134193
From a movie in a planet habitated by robots that see humans as monsters

>> No.6134362

>>6134171
I can program them to ask.

>> No.6134377

Not even all humans are afforded the same rights; for example, a woman might need more rights than a man because a woman can get pregnant. They're both human but it makes sense to give one longer maternal leave.

There are 4 dimensions to this issue that I can think require consideration in terms what should warrant / affect the rights afforded to others:

1.) relative capacity to feel pain; any sort of pain, including (perhaps, especially) psychological pain.

2.) 'return' on the investment of pain caused; we would kill an enraged bear that was about to maul us. we might use animals for experimentation if it would help us / them in the future.

3.) principle of reciprocity; we might avoid inflicting pain upon another by the social contract; as a sort of insurance that we, or our progeny, would not have pain inflicted upon them by others.

4.) degree of relation; the more closely something is related to us, the more empathy we have for it.

Although we cannot feel the suffering ourselves, we should be able to appreciate the suffering of other creatures, and that there may be something else existing that perceives the world in some way comparable to our own consciousness. So, virtually everything with the least bit of possible sentience should be afforded some rights.

>> No.6134385

>>6134003
At the same point as niggers, which is never...

>> No.6134399
File: 32 KB, 400x250, ghost-in-the-shell.jpg [View same] [iqdb] [saucenao] [google]
6134399

>>6134076
>>6134171
>they can ask

They key here is "respecting the wills of an individual."

The first part of this is determining if that "will" is origionating from this individual robot. Is it making that choice of it's own free will or did the programer make it say that? The real challenge here is determining how to define and test this parameter. And this raises the heart of the matter which makes all this impossible to deal with.

Is free will even real? If we are to respect the will or wishes of a robot then we have to prove that free will exists in the first place. We assume we make choices or have free will but maybe we're just programed machines with no true consciousness or sentience. We've just been programed to believe we have these things. Maybe our self awareness is an illusion. These are all matters that science has steered clear from because this is bordering on the realm of spirituality.

If robots are to even have equal rights then the scientific method must be applied to spirituality and religion. However this will never be allowed to happen because of close minded and bigoted religions influences.

Fortunately all of this will become moot after the transhuman singularity. So to answer OPs question of exactly when....some time shortly after we start converting human brains to silicon chips and binary code.

>> No.6134494

Even if we did give them rights, it wouldn't mean anything unless those rights were upheld by some government organisation.

Look at North Korea. We could argue that those people have the right to food, water and freedom, but it's taken by their government.

The robots would need to form their own societies, and decide their rights for themselves. However, the very point of a robot is essentially to be a slave to their human programmers, so I don't think they'll ever get rights.

>> No.6135980
File: 23 KB, 638x344, ghost-in-the-shell-21.jpg [View same] [iqdb] [saucenao] [google]
6135980

>> No.6136000

I was about to post a fucking long and interesting answer to this inteligent life and rights debate but my little sister is puking seas on the bathroom next to me.

Fuck this

>> No.6136053

>>6134003
>At what point should a robot gain human rights, if at all?
after they fight and conquer those rights

>> No.6136667
File: 71 KB, 450x831, nrn2915-f2.jpg [View same] [iqdb] [saucenao] [google]
6136667

If it has states that are perpetually affirming themselves to the point that they are inherently self-descriptive-- being so 'of themselves' that they are NOTHING except that of which they are of. And if it has a disinhibition network integrated into all this.

We should probably consider it worthy of moral concern at that point. To what extent we should give it full fledged rights, I don't know. But as far as the same 'rights' conscious things capable of pain are concerned, the above should be the prerequisite.