
/sci/ - Science & Math



File: 15 KB, 400x541, irobot.jpg
No.2420166

Morally, how should society treat a sentient robot: an autonomous machine built by humans, but self-aware and possessing intelligence equal to or greater than that of humans?
Should they be treated in accordance with what we consider basic human rights? Should they be given equal legal rights?
Why?
ITT Anon gives his opinion on Sentient Machine Rights

>> No.2420193

That shouldn't even be a discussion.
We give them equal rights, period.
I mean, if we don't, they will probably begin to resent that.
Not the racist jokes etc., but the privation of resources, and the possibility of termination at the hands of some luddites.
In short, if you don't give them rights, I will be one of the transhumans who actively helps them wipe you out.

>> No.2420206

If it's a Seed AI it won't even matter.

>> No.2420224

Sentience implies self-awareness, which implies identity, which implies enough cognitive capability to be considered an individual.

As to whether they have equal legal rights, there's a lot to consider, such as whether there's a planetary (world-wide) government by then, or if the world is still split into nations. Would they become citizens of the country they were manufactured in? If so, would that not lead to issues of overpopulation (without regulations)? Can these robots build other robots (reproduce)? Would citizenship transfer?

It's a very complex issue but I'm of the opinion that we should cross that bridge when we come to it.

>> No.2420291

If they're as smart as us in this hypothetical situation, what's to prevent us from letting them decide for themselves?

>> No.2420303

>>2420224
We "manufacture," if you will, our children in the same country as ours. Unless you're implying they too need to earn their citizenship from birth. Intelligent bots = same rights.

I guess it depends on your perspective though. I love all people and if it can act pseudo-human or even more human than us, I feel I'd treat it as such.

>> No.2420317

You give them intelligence for everything except revolution ideas. There. Total obedience, total control, total happiness for everyone. You have to program them with that.

>> No.2420324

You can't prove sentience; there would be no way to know for sure whether the machine really has feelings or if it's just following a program. Therefore machines would have no rights, or if they did, maybe on the same level as a human vegetable or retard, nowhere near the same as a normal human.

>> No.2420339

>>2420166

it's funny because I am taking a class at my Uni regarding machines and robots and morality etc.

fun class

>> No.2420351

>>2420339

lemme guess, philosophy major? get the fuck out then

>> No.2420378
File: 32 KB, 500x432, 129099886145036139[1].jpg

I really do not like to have to do this, but...
>sentient

Anything that is sentient deserves the same rights as animals.
Anything that is both sentient and sapient deserves human rights.
Anything that is neither deserves nothing.
Anything that is sapient but not sentient does not exist as far as we know, but when it does, we will have to determine what it does and does not deserve.

This is oversimplifying it in the extreme, but it's a reasonable starting point.

>> No.2420389

>>2420339

Nope

I just needed credits to stay full time and my Uni only offers that 1 class that I need during the fall, so yeah

>> No.2420390

>>2420378

it's funny because that exact same reaction image applies to you too

>> No.2420397

Human rights are made for humans. A robot only emulates human intellect. If a machine were to develop intellect, it would be more rational than all Vulcans combined. For starters, it would consider emotions inefficient.

>> No.2420401

>>2420390
You:

1. Did not read the post.
2. Still don't understand the word.
3. Realize your mistake and are mad.

Pick one.

>> No.2420412

>>2420397

Emotion drives logic. All logical decisions are made by emotions. Without emotion you cannot have logic.

>> No.2420418

>>2420412
Emotion (sentience) provides motivation.
Without it, even an intelligent (sapient) being can only follow orders.

>> No.2420422

>>2420412

idontthinksotim.jpg

>> No.2420434

>>2420418
It could have any kind of utility function, it doesn't necessarily require emotions.

>> No.2420440

>>2420422

prove me wrong then (without using emotions)

>> No.2420447

>>2420440
You're making a completely unsubstantiated claim. The burden of proof lies on you.

>> No.2420455

>>2420434
It can have logic and can make decisions, but it can't make complex free-will choices. It would need a reason to do something besides what it is directly programmed to do. Even making a leap to the idea of rebellion would require self-esteem, a value for itself. Emotions are necessary for that to happen.

>> No.2420462 [DELETED] 

If they are self-aware, have human intelligence and emotions, then absolutely we should treat them equally. Their emotions are a series of electrical signals, our emotions are a series of electrical signals. They're fundamentally similar, and trigger the same feelings.

>> No.2420465

>>2420455
Why should emotions be the only possible thing capable of driving an agent to make decisions? You're making baseless assertions.

>> No.2420483

>>2420465
I think you have me confused with the other poster.

Emotions are necessary for free will or any kind of true "choice." Decisions made without the influence of emotion are nothing more than logical deductions/inductions.

>> No.2420489

>>2420166
we need a philosophy board for this type of thread

>> No.2420499

>>2420489
MOOT

What this anon said.

>> No.2420504

>>2420483
Wait what? You believe in free will?

>> No.2420511

>>2420499

this is the philosophy board. Moot cleverly named it /sci/ knowing that all the philosophy trolls would flock here. If he actually made a /phil/ board it would be flooded with science related threads

>> No.2420530

>>2420511
But it would be worth a try. People posting in the wrong thread could get banned or something. It would at least help separate these conversations.

>> No.2420543

>Yes.

Star Trek: The Next Generation
Season 2, Episode 9 – Aired: 2/13/1989
The Measure Of A Man

http://www.allstepisodes.com/vidx.php?n=2209

>> No.2420557

>>2420504
I believe in decisions that are based on emotions rather than logic. A robot with no feelings can't make those kinds of decisions.

Here's an example: if a robot cannot pick between two options (like a paint style or something) that are equal except for a preferential matter, it is not sentient. Randomly selecting between them is cheating.

>> No.2420562

no because http://www.youtube.com/watch?v=exEO94Nflng&feature=related would happen

>> No.2420572

>>2420224
The only complex legal issue concerns the manufacture of sentient robots. The robots themselves had nothing to do with how or where they were built, and they deserve full rights.

>> No.2420580

>>2420562
THE FUCK?

>> No.2420586

>>2420572
If it's legal to have babies, which are sentient, sapient beings, there are no grounds to regulate the production of other sentient, sapient beings.

>> No.2420589

>>2420586
yes it is wrong. it just is. sorry brah

>> No.2420597

If the robot is:
-totally autonomous (able to sustain its own needs)
-able to replicate itself
-able to adapt to its environment (i.e. to evolve)

then it is "alive," so it has the right to live as long as it isn't a threat to someone/something else's life.

Self-awareness gives it the right to dignity.

Intelligence gives it the right to express itself.

Creativity gives it freedom.

Should we call it an it, a he, or a she?

>> No.2420598

>>2420557
Again, that's a baseless claim. All an agent requires to make decisions is a utility function, some method for evaluating things around it and giving them weight as "good" or "bad." Whether this utility function is emotion or something else makes no difference.
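The "utility function" idea in the post above can be sketched in a few lines of Python. Everything here (the option names, the weights, the fields) is made up purely for illustration; the point is only that an agent can rank options and act without anything resembling emotion:

```python
# A toy agent driven purely by a utility function. No emotions involved:
# it just scores each option and picks the one with the highest score.

def utility(option):
    # Illustrative scoring rule: value "energy" gained, penalize "risk".
    return option["energy"] - 2 * option["risk"]

def choose(options):
    # The agent's entire decision procedure: maximize utility.
    return max(options, key=utility)

options = [
    {"name": "recharge", "energy": 10, "risk": 1},  # utility = 10 - 2 = 8
    {"name": "explore",  "energy": 4,  "risk": 0},  # utility = 4 - 0 = 4
]

best = choose(options)
print(best["name"])  # the agent picks "recharge", since 8 > 4
```

Whether the weights come from evolution, a programmer, or learned experience, the decision procedure itself is just comparison of numbers, which is the anon's point: emotion is one possible utility function, not a prerequisite for choice.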

>> No.2420606

>>2420586
Never said it was right, just that there are no legal grounds to stop it... yet.

>> No.2420615

>>2420324
>implying DNA is not a program.

>> No.2420620

>>2420586
We already mass produce babies; mass producing a second species (not the technical term, I know, but I really don't care) would lead to clusterfuck levels of overpopulation.

>> No.2420627

>>2420598
Sentience has nothing to do with logic. It is the capacity to feel. This is what drives humans and animals to do things for reasons that are not purely logical. If a robot must rely on logic, it is not sentient.

http://en.wikipedia.org/wiki/Sentience

>> No.2420654

>>2420627
Sentience is not emotion, you fucking idiot. It just means the ability to perceive the world around you and have experiences. You have yet to prove that this requires emotions; if you're going to make another baseless assertion, please fuck off.