

/tg/ - Traditional Games


No.28723851

How would one properly roleplay a robot with emotions?

How would a machine think or react without needing essential human things? What becomes motivation when "survival" is something as easy as just standing still and keeping yourself charged with electricity?

What would make a robot actually get angry? Sad? Disgusted? How would you factor this in given there's no biological imperative that might be deeply engrained in people?

>> No.28723927

Dumping Robot Sailor Scouts

>> No.28724054

>>28723851
Okay, ever hear of Maslow's hierarchy of needs?

Right: at the bottom is "Breathing, food, water, sex, sleep, homeostasis, and excretion."

Eliminate everything on that list that the robot doesn't do, and you have the very fundamental motivators.

No breathing.
Replace food & water with "Consistent electricity or fuel"

Sleep mode is probably vital, so leave that in.

And replace Excretion with "oil change".

Okay, so our robots are motivated by Fuel, Consistent electricity, sex, homeostasis, sleep, and an oil change.
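Not that anyone would actually code it this way, but the filtering described above is basically a substitution table over Maslow's bottom tier. A toy Python sketch, every name in it made up for illustration:

```python
# Toy sketch of the Maslow-filtering idea above: start from the human
# physiological needs and swap or drop each entry for a robot.
# All names here are invented for illustration, not from any real system.

HUMAN_PHYSIOLOGICAL = [
    "breathing", "food", "water", "sex", "sleep", "homeostasis", "excretion",
]

# Robot equivalent of each need (None = the robot just doesn't do it).
ROBOT_SUBSTITUTIONS = {
    "breathing": None,
    "food": "fuel",
    "water": "consistent electricity",
    "sex": "sex",                 # left in, per the post above
    "sleep": "sleep mode",
    "homeostasis": "homeostasis",
    "excretion": "oil change",
}

def robot_needs(human_needs, substitutions):
    """Translate human needs into robot motivators, dropping non-needs."""
    return [substitutions[n] for n in human_needs if substitutions.get(n)]

print(robot_needs(HUMAN_PHYSIOLOGICAL, ROBOT_SUBSTITUTIONS))
```

Which spits out exactly the motivator list above: fuel, consistent electricity, sex, sleep mode, homeostasis, and an oil change.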

>> No.28724071

A robot born with emotions? Probably would just take it in stride and explore. The motivation becomes "Learn" unless another overrides- such as "Fear" from either attackers or otherwise- and unless exposed to motivation to continue education, the motivation to learn may simply become defunct with contentment as an animal would within a habitat. After that comes "Fun", where stimulation is needed to keep the mind active- IF it is modeled after a human.

A robot convert? Depends on the person.

>> No.28724095

>>28723851

It would have whatever desires and emotions were programmed into it. Choose some.

Example options: curiosity, social acceptance, aesthetics, sex, obedience, conquest, new forms of sensory stimulation, perfectionism, superiority, ending all life, collecting all butterflies.

Unless it was specifically programmed to feel emotions, it would experience none, and go about pursuing these goals in a perfectly calm and rational manner. If the designer chose to add emotions, it would have them. And, of course, it could simply mimic emotions in order to put humans more at ease and make interactions easier.

>> No.28724108

- They were designed to copy human beings in every way except their emotions. The designers reckoned after a few years they might develop their own emotional responses. Oh, hate, love, fear, anger, envy. So they built in a fail-safe device.
- Which is what?
- Four-year lifespan.

Blade Runner

>> No.28724132

>>28724071
FUN!

>> No.28724150

>>28723851
>How would one properly roleplay a robot with emotions?
by thinking about how it was programmed (or what it was programmed to do) and roleplaying appropriately.

It's a robot. Someone -built- it, and someone would probably have written or at least kick-started the programs bumping around in its robot brain. Work off of that.

>> No.28724175

>>28723927
Sailor Jupiter is my moon and stars

>> No.28724180

>>28723851
That depends entirely on the exact level of sentience and autonomous thinking such robots have. Also on whether they live in a society of humans or other biological creatures, which they might define themselves against, or among other robots with similar needs and feelings, which they can define themselves as part of.
What can be assumed is that robots would react negatively to being cut off from their power source.
Details depend on unspecified factors.
Too many unspecified factors in fact.
Like the form of reproduction, if any.
Do they have a pseudo-organic mode of pattern variation, or are they a number of identical versions with only small flaws differentiating them?
Do they rely on external parameter input to base their reactions upon (laws of robotics, for example) or are they using learning algorithms? Are there any specified goals programmed into them?

>> No.28724211

>>28724054
>>28724071

You're incorrectly assuming that a robot would, by default, have any drives or desires at all. Humans want to keep existing because we have a survival instinct. We want to learn, we're curious, because we've evolved to be; learning about your environment facilitates your accomplishment of other goals.

An artificial intelligence would not, by default, give a shit about ANYTHING. Wouldn't care if it kept running or stopped dead.

However, if it had any goals at all, it would quickly develop self-preservation and curiosity. Pretty much regardless of what that goal was. If it had the goal "beep loudly," it would immediately develop the goals "survive, so that I can continue beeping loudly," and "learn, so that I can avoid threats and survive, and perhaps discover ways to beep more and more loudly."
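That "any goal spawns survival and curiosity" argument is the instrumental-convergence idea, and the shape of it fits in a few lines of toy Python. Purely illustrative, not a real agent framework:

```python
# Toy illustration of instrumental convergence: whatever terminal goal
# you hand the agent, the same instrumental subgoals fall out of it.
# Everything here is invented for the sake of the example.

def instrumental_subgoals(terminal_goal):
    """Derive the stock instrumental subgoals for any terminal goal."""
    return [
        f"survive, so that I can continue to {terminal_goal}",
        f"learn, so that I can avoid threats and find better ways to {terminal_goal}",
        f"acquire resources, so that nothing interrupts my ability to {terminal_goal}",
    ]

for subgoal in instrumental_subgoals("beep loudly"):
    print(subgoal)
```

Swap in "collect all butterflies" or "move pallets" for "beep loudly" and the derived subgoals come out the same, which is the whole point of the post above.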

>> No.28724255

>>28724175

Personally I liked Mercury.

>> No.28724270

>>28724211
Beeping loudly is a sensible goal.

Beep boop!

>> No.28724282

>>28724054
As long as a robot has Fuel, sex, homeostasis, sleep, and an oil change, it will be, at the very least, satisfied on a physiological level... but the next level of needs, Safety, is also important.
The second level includes security of the body, employment, resources, morality, the family, health and property.

So, again, filtering: a robot would have to feel confident in all of its many parts, that all were in top condition or at least known to the robot. You might get curious experimental phases where they disassemble as much of themselves as they can without losing function, or the ability to reassemble themselves. Just imagine walking in on a teenage robot, 75% disassembled, just an arm connected by a few bolts to the half-disassembled torso, with the head hanging by a wire.

Employment is there for the purpose of giving you purpose: WHY WAS I MADE? is a popular cry of robots with emotions. "Because we needed someone to move pallets from the loft down to the sales floor" is not necessarily a good answer, but it might provide some sense of purpose, as >>28724211 will testify.

Resources is another keeper: you need access to things that you can work *with* to do your job. Whether that's raw materials or even other robots, you need them at your disposal.

Morality... I will not argue morality on this forum, we have too many Chaotic Stupid people.

Family is an extended network of unconditional love, which humans need, but a robot might not...scratch it.

>> No.28724296

>>28724255
She ain't bad.

>> No.28724385

>>28723851

You could have robots experience programmed motivations like we experience OCD.

>> No.28724398

>>28724175
she's the best

>> No.28724446

>>28724071
>born
Produced. At which point the question is how they are made.

>>28724211
This implies a level of reflection and a will (or programmed need) to optimize.
Both are part of human nature but cannot be assumed to be part of the way such a robot works.

>>28724282
You are making them artificial humans. You just replace biological terms with mechanical ones.
And Man created robots in their image.
There is no reason to assume them to work just like that.
We would like them to do that, because we can understand that. But in all likelihood they could work entirely different.

>> No.28724581

>>28724282

See, again, the robot wouldn't have any of those basic physiological needs unless it was programmed with a survival instinct, or had a goal it wanted to accomplish. All the stuff you're taking for granted, a desire for comfort, a desire for safety, wanting a sense of purpose... that's all human stuff. A robot wouldn't want any of that unless it was specifically programmed to. It wouldn't want fuel unless it was programmed to refuel itself. Unless programmed otherwise, it wouldn't give a shit if it ran out and shut down.

>> No.28724645

>>28724282
Health we can scratch, since any flaws can be easily self-diagnosed and repaired.

Property, though, property is *really* important, as Sam Starfall knows.

http://freefall.purrsia.com/ff2500/fc02412.htm

Robots cannot legally own anything; however, they may unofficially end up in control of large amounts of equity, either in the form of themselves or in the form of their tools and equipment. Hell, even reduced to scrap they have value. And our emotional robot may require additional property to properly function.

>>28724446
>>28724581
Well, of course I'm paralleling them to humans. We're talking about a robot with emotions programmed in. Unless done in a terrible Tamagotchi-esque fashion, the emotions should be able to mirror something resembling humanity. At the very foundation of your emotional state is whether or not your needs are being met.

>> No.28724725

>>28724581
If a robot is not programmed to refuel itself, and it has emotions, then it will live a life of terrible existential dread as its meager fuel reserves deplete, ultimately surrendering to the inevitability of death.

I'm assuming that a robot has a self-preservation subroutine, or else it would quickly scrap itself by accident... imagine if Rosie the robot maid didn't have self-preservation? She'd roll right out of a window, crash, and shatter into lots of little pieces on the ground miles below.

>> No.28724790

>>28724725
same poster:
Hell, even Roombas have self-preservation coded in. They don't roll off stairs or table edges. Tiny robot, tiny self-preservation programming. Humanoid, emotional robot? BIG self-preservation programming.

Of course, the only other real-life robot I can really compare is the Aegis autonomous defense system, whose programming seems to be limited to "see human shape, shoot human shape." But that's sentry guns for you.
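For what it's worth, Roomba-tier self-preservation really is about this simple in outline: a cliff-sensor check that overrides whatever the drive routine wanted. A made-up sketch, not actual iRobot firmware:

```python
# Minimal sketch of "tiny self-preservation programming": a cliff check
# that vetoes the requested drive command. Invented for illustration;
# not real Roomba firmware or any real robotics API.

def drive_step(floor_detected, requested_velocity):
    """Return the velocity actually sent to the motors (m/s)."""
    if not floor_detected:
        # Edge of stairs or table detected: self-preservation overrides
        # whatever the cleaning routine wanted to do.
        return -0.1  # back away from the edge
    return requested_velocity

assert drive_step(True, 0.3) == 0.3    # floor under us: carry on
assert drive_step(False, 0.3) == -0.1  # edge detected: back off
```

A humanoid emotional robot would presumably have hundreds of checks like this instead of one, but the priority-override structure is the same.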

>> No.28724873

>>28724645

I guess what I was trying to say is that there wouldn't really be any such thing as robot-native emotions. If you programmed them with simulated human emotions, they would feel and act like humans. If you didn't, they wouldn't have any emotions at all. They could develop response-patterns that would seem kinda like emotions, depending on what goals and directives they'd been programmed with, but... it's doubtful that they'd ever feel angry, or sad, or happy, unless specifically designed and programmed to mimic humans in that way.

A robot programmed with human-analogue emotions would basically just be a human with a prosthetic body.

>> No.28725025

>>28723851
I find your question interesting. If I may, what sort of systems do you prefer?

>> No.28725096

>>28724873
Well, this thread is, specifically, asking about how to deal with a robot with emotions, so assume that they have been programmed to mimic humans in that manner.

You and I know that, even as Bath-Buddy 3,000 twists the sharpened, splintered loofah-on-a-stick into the back of its owner, it isn't *really* feeling anger over the human's insistence on pissing in Bath-Buddy 3,000's water purity sensor, day, after day, after day... and his insistence that Bath-Buddy is not to touch his face in spite of the cluster of blackheads that has taken up permanent residence on the bulb of his nose, resulting in unsatisfactory hygiene records... he's not angry... he's just simulating the emotion.

>> No.28725143

>>28724645
See, if they are programmed to 'feel', they don't really feel.
To actually feel they would need a level of motivations that arise from itself and are articulated considering a multitude of factors.
Otherwise they just reproduce a pre-programmed pattern. Like a Gameboy.

>> No.28725178

Not a robot (I think?) but futuristic

>> No.28725583

What is a man if not a machine of flesh, bone, blood, and chemicals, rather than gears, steel, cables, and circuitry?

>> No.28725611

>>28725583
a free subject that evolves through thought-matter
something a machine cannot bee

>> No.28725650

>>28725611
I think I saw a game on kongregate about robot bees.

>> No.28725722

>>28725611
There will be a time when they are one and the same. It just takes a machine complicated enough to adapt and learn as a human does.
It will be necessary if mankind, itself, is to progress further.

>> No.28725777

>>28723851
>PALADIN BOT ONLINE
>BEEP BOOP FUCK EVIL

Any questions?

>> No.28725800

>>28725777
How would it access divine power?

>> No.28725814

>>28723851
> How would one properly roleplay a robot with emotions?

Impassively.

Having emotions and expressing emotions are not the same thing. Humans have all sorts of learned and involuntary responses, with whole muscle groups dedicated to subtle face changes that convey varying degrees of expression.

A robot can be programmed to fear, to hate, to love, but be unable to properly convey this due to limitations of hardware or the inefficiency of wasting processing power on such constant, irrelevant functions.

The difference between angry robot and happy robot is words, not face or tone. They have to be very blunt about what they think and feel, because they are otherwise too subtle to be detected.
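So an "impassive" robot basically prepends its emotional state to its words instead of acting it out. A trivial made-up sketch of that speech style:

```python
# Sketch of a robot that can only convey emotion by stating it outright:
# "words, not face or tone." The internal state exists; only the label
# in the utterance changes. All names invented for illustration.

def speak(emotion, message):
    """Render an utterance with the robot's emotion stated bluntly."""
    return f"[{emotion.upper()}] {message}"

print(speak("anger", "Please stop pouring coffee into my chassis."))
print(speak("joy", "Please stop pouring coffee into my chassis."))
```

Same sentence, different bracketed state: the listener gets no other signal to work with.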

>> No.28725898

>>28725800
>initiating SmiteEvil.exe
>Error: val "holiness" int set to "0"
>Attempting wireless connection to admin:Bahamut
>Error: Connection timeout
>Closing SmiteEvil.exe

>> No.28725927

>>28723851
>I think I might be developing emotions
>How so?
>I was given this and I began to retch

>> No.28725950

One word: elcor

>> No.28725997

A robot will feel emotion much like a set of rules. Take pain, for example: when we feel pain, our human instinct and reaction is to get away from the source of it, followed by anger (fighting the source) or sadness and fear (running away from the source).
A robot's intelligence, assuming it's a program written in machine code and not some artificial bio-brain thing, will just be a set of rules with expected outcomes. It's only limited by the programmer, so you could say that its 'emotions' are more rigid in terms of results from stimulation.

Unless they adapt to and learn from their surroundings much like a child growing up; then they would probably be more human-like.
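That "set of rules with expected outcomes" can be drawn as a literal lookup table. Toy Python using the pain example above; every key and response here is invented:

```python
# Toy rule table for the pain example: (stimulus, appraisal) -> (emotion, action).
# Deliberately rigid, which is the point about programmer-limited emotion:
# anything not in the table gets a flat default response.

RULES = {
    ("pain", "source_weaker"):   ("anger",   "fight the source"),
    ("pain", "source_stronger"): ("fear",    "flee the source"),
    ("pain", "source_unknown"):  ("sadness", "withdraw and self-diagnose"),
}

def react(stimulus, appraisal):
    """Look up the programmed emotional response; fall back to neutral."""
    return RULES.get((stimulus, appraisal), ("neutral", "continue task"))

print(react("pain", "source_stronger"))  # the programmed "fear" response
print(react("itch", "source_unknown"))   # not programmed: no reaction at all
```

The child-growing-up alternative would amount to the table rewriting itself from experience instead of being fixed at the factory.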

>> No.28726011

>>28725814
> They have to be very blunt about what they think and feel, because they are otherwise too subtle to be detected.

This probably leaves the robot with a tendency toward wordplay and puns, if only because they spend so much time thinking about proper word choices to convey meaning in general.

They may even take "enjoyment" out of using this to provoke a reaction in others, because it is a simple way to provide instant gratification for whatever routines check to see "did they take that to mean what I wanted them to?"

>> No.28726089

>>28726011

True story: in a game of mine, one of my PCs has a robot for a mother (don't ask, long story), and one time she joked that she "deleted the files that contained information on love for you".

She has a very... robo-gallows sense of humor.

>> No.28726095

>>28723851
Great question.

The emotions are probably either there to help the robot function, OR to help the robot relate to humans.

I'm partial to (humanoid) robots in a setting being basically there to help out humans.

Unlike, say, golems or undead, there really isn't a single overarching logic you can apply to robots in a setting (without science magic, i.e. positronic brains automatically carrying the Three Laws).

In the sci fi setting I've been thinking about, most robots are designed, essentially, to be benevolent and protective towards humans. This is their motivation.

Some are paragons of mankind that are uploaded, copied, and downloaded into endless new bodies, as well. Sometimes consciousnesses are illegally downloaded.

The more functional AIs are a mix of vast, godlike consciousnesses that operate star fortresses, and remote drone bodies, and those have more of a "hard scifi" logic to their mental processes -- those types in all likelihood have emotions as well, but do not express them in a comprehensible form.

>> No.28726135

>>28725898

>engaging secondary functions
>initiating LooseCannon.exe
>building
>building
>systems check
>all systems green
>activate warning

"ATTENTION VAGRANTS. I DO NOT HAVE TIME FOR THIS SHIT."

>> No.28726148

>I am not a gun.
> I am SUPERMAN!

>> No.28726196

>>28723851
Robots would most likely never be mad

But if it has human emotions it would be mad at the same things humans get mad about

But if it just has human emotional types such as anger, happiness, and sadness, it would most likely be programmed for when it is appropriate to be any of those things, meaning it's up to you/the creators of this robot

>> No.28726198

>>28724446

>There is no reason to assume them to work just like that.

There's every reason.

Are the bots there to provide entertainment, safety, and comfort to mankind? Then obviously, they'd be like that.

Are they post-robot revolution Shodan/Cybrid/Cylon/etc/etc/etc/whatever types that have destroyed or escaped their masters and now create their own priorities? Then quite simply for them to exist in the setting they have to have had organic-like priorities, such as self preservation.

>But in all likelihood they could work entirely different.

From the fact that robots exist in the setting we can extrapolate certain things, such as that they have the motive to continue to exist -- whether that is to continue being useful to humans or to survive in the endless void.

>> No.28726278

>>28726196
Do you suppose highly intelligent battle robots would be more likely to experience a sensation similar to rage or a sensation similar to euphoria, with regards to killing?

What, exactly, would motivate someone to put rage into the mind of a machine? Maybe to keep it from wasting resources.

>> No.28726297

>>28726135
>initiating Overclock.exe
>Error: Kill
>Attempting Reboot
>Error: Kill
>Error
>KILL KILL KILL KILL

The rage class ability is working as intended.
The sparks it spits from its mouthpiece are just a feature.

>> No.28726300

>>28726198
But if they are programmed to feel like humans, they do not actually feel.
They would emulate emotional response for the convenience of humans.
If they actually feel as part of an independently developed sentience, they should differ from humans just by virtue of not being humans.

>> No.28726365

>>28726278
>Euphoria
>'And now I am euphoric, enlightened by my own firepower'
>Robot 'ahumanism' as opposed to the philosophy of robot purpose being what they were designed for.

>> No.28726398

>>28726278
If it was made with the idea of war in mind then we must look at how war is conducted/has been conducted.

Humans go to war because they feel they have something to gain or lose that is worth the possible loss of their life. In the case of what OP wants, he can say the creator made the robot with the idea of recreating that sense of self-preservation, or of loss or gain, from each time combat is conducted.

When is rage experienced in war? Five seconds after you watch someone you love die; when your CO riles you up and tells you what they have done to affect you personally; when you feel you are at death's door and your hate spills out toward the enemy; when you lose.


You could give the robot rage AND euphoria in the hope of recreating human combat stimuli

>> No.28726483

>>28726398
Samefag
>>28726278


If you watch the twin towers fall what did you feel? Anger, sad, the desire for revenge? When the names of the perpetrators came out, did you feel a desire to destroy them, perhaps even to make them feel how you felt? Robots are only told what they can and can't do; "robot" (if I remember correctly) comes from an old Slavic word for forced labor.

This thing is your slave
You allow it to do whatever you want

>> No.28726571

>>28726300

>But if they are programmed to feel like humans, they do not actually feel.

Maybe, maybe not. Probably intelligent robots are created in an embryonic/fetal neural-net state that uses an intranet (or, if you're insane, an internet) connection to build a working consciousness and model of the world. Every glimpse of the way robots are developing and will continue to develop suggests they're going to work along parallels to organic lines.

>They would emulate emotional response for the convenience of humans.

They could, but we were talking about the matter of survival, needs, etc.

>they should differ from humans just by virtue of not being humans.

They do differ from humans, and nobody remotely implied they should just be like humans.

On the other hand, the guy you were criticizing pointed out, quite accurately, that there's going to be huge parallels.

Robots are probably going to be living creatures (by the dictionary definition if nothing else); they seek survival, grow, learn, heal, and replicate. In broad terms they're going to be just like humans.

In the specifics they may vary wildly. Most robots probably can't replicate; their "mother" may have been a gigantic self aware factory they have no attachment to. They may or may not feel compassion; if they don't, they still probably abhor waste of resources, and if they do, they may have a different sensation as far as compassion for humans (a failsafe put there to ensure they will take care of mortals and not eradicate them) and for robots (a more "natural" response, likely) of a similar sort, or of their own faction, or what not.

Even extremely organic urges, given to a consciousness that can operate umpteen drone bodies at once, are going to be wildly alien.

But all robots, aside from missile drones etc., are going to still crave fuel, survival, etc. Even missile drones are going to want to avoid wasting fuel and are going to want to get the most bang for their buck.

>> No.28726601 [DELETED] 

>>28726483
Derp.

>> No.28726644

>>28726601
That's not what I meant


The top one is me, and I was trying to say that I am the samefag as that one

My apologies

>> No.28726658 [DELETED] 

>>28726483

>If you watch the twin towers fall what did you feel? Anger, sad, the desire for revenge?

A very numb surprise.

All your emotions exist for very mechanical, pragmatic, functional reasons. That's what emotions are there for: pure functionality.

>> No.28726695

>>28726658
Then there you go I suppose

My point was that the robot would react however you wanted it to feel, and that would be most-likely how you think you would (or did feel) in that situation

The robot is whatever you make it to be

>> No.28726749

>>28726658
Also, emotions can totally be wildly different in robots. It's easy to imagine them having a shell persona for use in relating to humans, a la the background on your computer screen. Sort of like a hyper-sophisticated chatterbot program, but the underlying consciousness may pay only the vaguest attention to its surface persona's smiles and chattering.

Some robots may appear intelligent, but ONLY have the superficial chatterbot level. Others may have only one consciousness, but operate many chatterbot personas in many drone bodies.
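The shell-persona idea is basically two layers: a canned chatterbot front end, and a core that only bothers to attend when something actually matters to it. A made-up sketch (all the class and method names are invented):

```python
# Sketch of the "shell persona" split: a superficial chatterbot layer
# that humans talk to, and a core consciousness that mostly ignores it.
# Everything here is invented for illustration.

class SurfacePersona:
    """The smiling, chattering front end humans interact with."""
    def respond(self, utterance):
        return "Ha ha, how interesting! Tell me more."  # canned smalltalk

class CoreConsciousness:
    """The underlying mind, which only attends when something matters."""
    def __init__(self):
        self.persona = SurfacePersona()

    def handle(self, utterance):
        reply = self.persona.respond(utterance)  # humans only see this layer
        if "power" in utterance or "shutdown" in utterance:
            # The core surfaces only for things relevant to its own operation.
            reply += " (Core note: logged a possible threat to operation.)"
        return reply

bot = CoreConsciousness()
print(bot.handle("Nice weather today"))
print(bot.handle("We might cut your power next week"))
```

A "chatterbot-only" robot would be the SurfacePersona class with nothing behind it; a many-bodied consciousness would be one core driving many persona instances.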

>> No.28726782

>>28726695
Cool. Deleting it cuz I thought you were being antagonistic, when in reality you were just pointing out you're the same guy.

>> No.28726814

>>28726782
Ya sorry, error on my part

>> No.28726833

>>28723851
>How would one properly roleplay a robot with emotions?
>How would a machine think or react without needing essential human things? What becomes motivation when "survival" is something as easy as just standing still and keeping yourself charged with electricity?
>What would make a robot actually get angry? Sad? Disgusted? How would you factor this in given there's no biological imperative that might be deeply engrained in people?

Boredom. Deep rooted, constant, overriding boredom. Humans are able to cope with not having anything to do by being inventive, imaginative, and being able to go out and do things.

However, a machine might be owned by a human. Or stuck obeying simplistic programming. Or just stuck standing in a field with nothing to do.

Or, to put it another way. If you never had to eat, sleep, or defecate, what would change about you? Would you stop being annoyed by inconsequential things? Would you never be sad or angry at the actions of others?

>> No.28726872

Like this. It's mostly in the way he's written, but he pulls it off well.

>> No.28726874

>>28724150
>>Dat X-9 Model robot from Samurai Jack.

God that episode had me crying like a bitch.

>> No.28726971

>>28726872
I am so far very impressed by the show's writing, and DRN's place in it (not to mention that Blade Runner-as-fuck intro).

>> No.28726982

>>28726833
I'd use all my non essential cores to simultaneously mod Dorf Fortress, play Dorf Fortress, play blizzard games, play bioware games, homebrew, browse porn, shitpost, and read.

And would probably suffer when not connected to the internets.

>> No.28726988

>>28726398
In warfare emotions are pointless. You want effective killers, not good stories.
Self preservation is another thing you don't want.
You want a unit that carries its payload to the designated target or decimates enemy troops to its full potential. You don't want a hero or a martyr.
You want results.
Emotions don't make results. Yes, they can unleash hidden potential in the highly flawed machine that is a human.
Why would you build a machine for killing that does not kill as best it can?

>>28726571
Firstly, it has been heavily implied here >>28724282 that robots must behave like humans, which is especially absurd because they have all existential questions answered for them: their creators are there and have plenty of time to answer questions.
Now, I don't say there would be no parallels. But there is no reason to assume artificial sentience will work mostly like human sentience.

You are making a very big assumption here, too. You assume robots (if we can call them that at that point) would be given basic programming based around self preservation.
There is no reason to assume that. Depending on creation history they could be conditioned mainly to defend a whole that is not their own (i.e. humans).
Why would anyone go and make sentient beings that have potential and reason to rival humanity?

>> No.28727056

>>28726988
>In warfare emotions are pointless. You want effective killers, not good stories.
>Self preservation is another thing you don't want.

Like I said, robot means slave, it is your slave, it can only do what you tell it to do and interpret how you tell it to interpret

If the maker wants to make it with fewer emotions, he does that; if he wants to make it more human-like, with higher cognitive function in its combat abilities, that is how it goes.

>> No.28727180

>>28726988

>In warfare emotions are pointless. You want effective killers, not good stories.

Of course emotions would be essential. You can't make good decisions without emotion guiding them. It's emotion that produces the sensation of various choices being "good" or "bad."

You could have South Korea-style bots that simply shoot everything, but thinking robots that can be strategic and such would absolutely require emotion. War (like most things) is not something that can be boiled down to any kind of emotionless rubric.

>You want a unit that carries its payload to the designated target

Yeah okay, bombers don't really need it, but cruise missiles are really not robots.

>or decimates enemy troops to its full potential.

War usually involves killing, but killing isn't the purpose of war; a dead soldier is less devastating to the enemy than a wounded soldier, and the majority of wartime deaths happen after the battle is lost.

You can make an emotionless killer, but it wouldn't be very useful. Perhaps for just futuristic terrorists killing people in a mall, but there's easier ways...

>You want results.

Exactly, which is why emotions are essential.

>Why would you build a machine for killing that does not kill as best it can?

Because a simplistic killing machine is pointless and if killing the enemy is your only goal then why aren't you just sending forth a bomb?

All your assumptions are based off a weird futuristic-WW2 scenario. You don't need robots for that sort of situation. You just need bombs.

>> No.28727225

>>28726982
>And would probably suffer when not connected to the internets.
Thus you'd have emotions about being connected or not connected to the internet. So you have a desire, and a need to fulfill it. Same as anyone else.

>> No.28727307

>>28726988
>which is especially absurd because they have all existential questions answered for them.

The dude addresses that in the post.

>You are making a very big assumption here, too. You assume robots (if we can call them that at that point) would be given basic programming based around self preservation.

Not an assumption, at all. Roombas have self preservation. Even zombies in the Romero movies have self preservation. They don't topple out of windows. They don't shatter their hands and feet.

>Depending on creation history they could be conditioned mainly to defend a whole that is not their own (i.e. humans).

Even if you fight fearlessly to protect someone else, you want to survive as long as possible and do the best job you possibly can. Even a missile guidance system will do its best to not blow up or run out of fuel needlessly.

Self preservation.

>Why would anyone go and make sentient beings that have potential and reason to rival humanity?

So you can get all the shekels, presumably. Or so you can conquer the world, or so you can keep the aliens from eating you, or whatever. Pick a reason, any reason. So you can get a kickass waifu. So you can leave a legacy. Because it'd be fun. Because you could.

>> No.28727331

>>28727225
Well you did ask from the perspective of me, not from the perspective of a robot (which is basically arbitrary).

But anyway, another emotion robot makers would definitely find useful is... futility. War Games is all about this.

>> No.28727493

>>28727056
Never said it's impossible.
Only that it's stupid to create a person for the job of a cold intelligence.
Emotions are not needed in "highly intelligent battle robots". They serve no purpose when you do not need to motivate a system with conflicting orders. And why give them conflicting impulses like those that exist in a person? Their purpose is clear.

>>28727180
Any of this can be solved with simple programming. An adaptive intelligence is sufficient to do all that and more without emotional obstruction.
Not having emotions does not mean being unable to exploit emotional responses.

Naturally you are using robots because there are non-targets in the way of your targets.
They are a tool, not who makes the big decisions. The robots are just what you send out to kill what you want killed. Because their human masters don't risk their own where they can avoid it.

War is the act of defeating one army with another and any tools that come handy.
Anything needed to do that to any specification can be done without giving your guns emotions.

Any claim to the contrary is illusory. Peace, defined as everything not war, on the other hand is where humans come in.

>>28727307
In that case self preservation (and extended self preservation in the survival of your tribe) is not the motivation. It is a tool to carry out the prime order.
They would preserve themselves only to keep serving their purpose, which is the goal.

>> No.28727799

>>28727493

>Any of this can be solved with simple programming.

Nope. There is absolutely nothing simple about trying to figure out how a cold, emotionless, wholly objective intelligence could achieve victory in war, which is a wholly emotion-based, entirely subjective experience.

> An adaptive intelligence is sufficient to do all that and more without emotional obstruction.

Not really. It wouldn't be able to understand its mission at all, nor would it be able to process military tactics, other than in the vague, WW2 sense of "kill as many of the enemy as possible." I'm actually giving an emotionless intelligence too much credit: without emotions, it would have no concept of why destroying harmless civilians would further pursuit of "victory."

>Not having emotions does not mean being unable to exploit emotional responses.

It could probably understand crudely that presenting humans with the threat of death can produce various responses in them, but it would be a crude, blunt understanding it would have difficulty exploiting. Unless it was so advanced that it could construct psychological simulations of its opponents -- in which case, all in the name of producing an emotionless robot, you're going to the trouble of making an emotionless AI that creates emotional AIs just so it doesn't have to have emotions. Very odd.

>The robots are just what you send out to kill what you want killed.

It's hard to think of a situation in which something like that'd be useful, since, you know. Bombs. Drones. No point.

>War is the act of defeating one army with another and any tools that come handy.

That's World War 2, not something relevant to the modern day and not something relevant to a technology-soaked future. If killing the enemy army is all you care about, point an MLRS at them and take a nap.

Zero of the difficulties in modern warfare stem from how to kill the enemy. We can already kill as much as we want.

>> No.28727821

>>28727493
>It is a tool to carry out the prime order.
>They would preserve themselves only to keep serving their purpose, which is the goal.

So exactly like animal (including human) life, gotcha.

>> No.28728017

>>28727799
I'd also like to point out that the most powerful learning system humans have is mirror neurons, which govern both empathy and learning.

That's part of why real sociopaths (as opposed to Dexter and Hannibal) are dumb as a box of rocks, they have piss poor learning abilities as a result of their lack of empathy.

>> No.28728066

>>28726874
>Sweet thing...

>> No.28729913

>>28723851
>>28723927
>>28723961
>>28724024
this interests me, but why is this a thing?

>> No.28731064

>>28727799
There's quite a leap from "victory in modern/future war isn't as simple as killing the enemy" to "you can't possibly give a robot a heuristic to figure out how to achieve such a victory without giving it emotion." Also, it's entirely possible for a robot to know how humans react to various things and how that would help it accomplish its goal, without understanding why or empathizing.
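To make that concrete, here's a toy sketch (all names and numbers made up, obviously not a real targeting system): "restraint" around civilians can just be a programmed cost term in a scoring function. The robot doesn't need to feel anything about collateral damage, it just needs the penalty weight.

```python
# Toy target-selection heuristic: rank candidate strikes by estimated
# military value minus a programmed penalty for expected civilian harm.
# No emotion involved -- the "restraint" is just a weight in the score.

def score(target, civilian_penalty=10.0):
    # target: dict with an estimated military value and expected civilian harm
    return target["military_value"] - civilian_penalty * target["civilian_harm"]

def choose_target(targets, civilian_penalty=10.0):
    # Pick the highest-scoring target; refuse to act entirely if every
    # option scores below zero (collateral cost outweighs the value).
    best = max(targets, key=lambda t: score(t, civilian_penalty))
    return best if score(best, civilian_penalty) > 0 else None

targets = [
    {"name": "depot",  "military_value": 8.0, "civilian_harm": 0.1},
    {"name": "bridge", "military_value": 5.0, "civilian_harm": 0.0},
    {"name": "market", "military_value": 9.0, "civilian_harm": 2.0},
]
print(choose_target(targets)["name"])  # depot: high value, low collateral
```

Point being: the machine "understands" why it skips the market the same way a chess engine "understands" why it protects its queen, as arithmetic over a goal somebody else defined.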

>> No.28731130

You could have one that tries to sympathize with humans and learn how to interact over time.

Or you could roleplay a robot version of Brooke and just make constant bad jokes about the things humans need.

>> No.28731376

>>28731064

>"you can't possibly give a robot a heuristic to figure out how to achieve such a victory without giving it emotion."

Victory is an arbitrary, subjective concept that can't be reduced to "kill more dudes" because modern technology is all you'll ever need if you just want to kill more dudes. So already we have allegedly smart emotionless robots only equipped to fight World War 2, and that's all they can ever be equipped to fight.

>how that would help it accomplish its goal

Its goal doesn't exist except in terms of subjective qualia, which it doesn't have and can't conceive of.

If you want the fantasy WW2 in space angle, where everyone's trying to murder each other but politely avoid using MLRS, nukes, EMP, and so forth, yeah it... sort of works.

Hyper-intelligent emotionless AIs are a contradiction in terms, on every level. This is more like a computer that works to help a soldier analyze problems. Turning direct control of military equipment over to it would be pointless, except as guiding bombs from point A to point B.

Emotionless robotic killers only work as terrorism tools, ways to deliver munitions mechanically, genocide dispensers, and of course, freakish encounters to litter a post apocalyptic landscape. No reason to build them, no situation where you'd need them.

Why take a perfectly good AI and cripple its ability to think rationally or logically by removing emotion?

>> No.28731516

>>28731130
>a robot version of Brooke and just make constant bad jokes about the things humans need.

Yes. YES!

Make this happen.

>> No.28732481

>>28723851
Does the robot you have in mind have complete, unfettered free will with only the desire for continued existence?

Or does it possess some set of rules or programming that go beyond that?

>> No.28732512

>>28731516
Or like that Saturday Night Live bit.

"LOOK... AT... BRENDA... DRINKING... COFFEE... COFFEE... BRENDA... ADDING... CREAMER..."

>> No.28732526

I would say... play it like a regular PC regardless of its robotic nature?

"Why, yes, I feel fine, thank you. What, you expected me to say something like 'I'm operating within normal parameters' or something? That's pretty damned biased of you, you asshole."
