
/lit/ - Literature



File: 409 KB, 600x410, archai.png
No.13327873

Is strong AI possible? Will the AIs of the future ever be capable of comprehending semantic - and not just syntactic - properties?

>> No.13328001

>>13327873
It probably is possible, but eventually we'll realize we don't really need it.

>> No.13328016

The more important question is whether it will be capable of distinguishing coffee without cream from coffee without milk.

>> No.13328023

>>13328016
go home zizek, you're drunk

>> No.13328030

To truly solve the problem of strong AI would require solving the hard problem of what consciousness is in the first place. Anything less is a bundle of algorithms, mechanical in principle. We should be more worried about an "algorithmic" being of sufficient complexity being enough to kill us anyway, when retards stupidly trust it to "make decisions" for them, or worse, such a being getting coopted by demonic/ahrimanic forces.

>> No.13328037

We don't have complete proof that it is impossible. While that doesn't mean it's possible, it does leave the door open. Maybe some kind of theorem or experiment in computer science can demonstrate a proof, but we're not nearly there.

>> No.13328048

>>13328030
>trust it to "make decisions" for them
There's a difference between having it make decisions and having it inform decisions. There may be issues of such complexity that the only way we can act on them is through external help. Perhaps climate change is one such issue idk.

>> No.13328058

>>13328030
>consciousness isn't just a bunch of algorithms

>> No.13328060

>>13328058
prove that it is, given that algorithmic complexity is still not sufficient to produce a system as conscious as a human being

>> No.13328071

>>13328060
if you mean 'complexity' is one point of exception handling then yeah

>> No.13328081

>>13328071
consciousness is the algorithm that knows it is an algorithm, hence why it is consciousness and not just an algorithm

gas all reductionists

>> No.13328095

>>13328081
>it is an algorithm

fair call

>> No.13328112

>>13328081
>gas all reductionists

I'd like to enlist in your crusade

>> No.13328120

>>13328081
what about the physical brain do you think is so special that it has consciousness but other physical systems don't?

>> No.13328149

>>13328120
the brain has self-consciousness, other physical systems have (more or less) some form of consciousness, or internality, yes.

read whitehead. consciousness is a member of the set called "relationality", it is not itself that set (which is the mistake whitehead accuses modernists like descartes of, that is, they ontologize the immediacy of self-consciousness)

>> No.13328160

>>13328149
>the brain has self-consciousness, other physical systems have (more or less) some form of consciousness, or internality, yes.
So what does the brain have physically speaking, that creates this subset of relationality that is self-consciousness?

>> No.13328176

>>13328160
>>13328149
>>13328120
>>13328081
The more I think about it, the more panpsychism makes sense. Whatever we call consciousness is just being, and our knowledge of the world (including what we decided is "ourself") must be an abstraction.

>> No.13328186

>>13328160
neurons

an amoeba has no neurons, and yet clearly relates to its environment. i have neurons, and relate to my environment on a far more finely "gradated" level.

i have to agree with searle, then, that self-consciousness is a specifically biological phenomenon, though i dont discount the possibility of my robot vacuum cleaner having a dim kind of internality.

is there a certain threshold of algorithmic complexity that robots will eventually be able to breach into self-consciousness? maybe, but i doubt it, or rather, whatever threshold they do breach, the other side will not be recognizable as anything like self-consciousness

>> No.13328196

STOP

before you proceed you must not conflate consciousness with self-consciousness or 'bunch of algorithms' with 'algorithm'

>> No.13328198

>>13328186
>maybe, but i doubt it, or rather, whatever threshold they do breach, the other side will not be recognizable as anything like self-consciousness
But why? The brain has a structure, so you could make an analogous structure out of a different material no? Even if you couldn't and the substrate is itself important, could we not eventually make artificial brains out of neurons?

>> No.13328199

>>13328176
i agree, though i have to qualify your statement a bit. whitehead wasn't exactly a panpsychist; if anything he's something like a "pan-internalist". he doesn't believe an amoeba is conscious of its surroundings, but nonetheless, there is absolutely something like a "prehensive" or otherwise "self-relating" center present in the amoeba that any child can observe

>> No.13328212

>>13328198
this is the exact thought process i had. if consciousness is substrate-independent, then i could conceivably construct a brain out of wet dish towels, napkins, an electric current, etc. you get the idea

i don't believe this is possible

>> No.13328218

>>13328199
yes, imo the awareness of surroundings and decision-making are very specific add-ons, only necessary in the context of a macroscopic animal such as ourselves.

>> No.13328220

>>13328081
>self-hosted compilers are sentient

>> No.13328224

>>13328212
>then i could conceivably construct a brain out of wet dish towels, napkins, an electric current, etc. you get the idea
It would have to be colossal but I don't see why not in theory.

>> No.13328234

>>13328218
right, it's more that consciousness is a phenomenon of a particular gradient of biological complexity than that it is present "dimly" in everything.

though i suppose, concretely, the difference between a relational center and a self-conscious one doesn't seem very clear or intuitive to me, and yet, whitehead's arguments themselves are extremely intuitive (subjectivity isn't a hierarchy, human subjectivity is one term in a heterogeneous series, not the capstone of a pyramidal chain of being)

>> No.13328239

>>13328224
that's true, which is why i made this thread. i can't discount the possibility but it seems extremely unlikely. what a dish towel brain would produce would be nothing comprehensible to a good ol' biological one, anyway. just like the "center" of an amoeba is obviously existent, but not translatable to my own

>> No.13328251

the average home computer is already smarter than a modern christian

>> No.13328252

>>13328239
A dish towel is not a more foreign or banal object than the molecular components of a neuron. They are both absurdly different from our conscious experience.

>> No.13328258

>>13328252
true, but like i said, what it would produce would be something totally unlike human consciousness.

>> No.13328277

>>13328258
How do you know though? It's basically impossible to test this until we start trying to replace bits of brains with hardware and see what people consciously report back

>> No.13328311

>>13328277
i know it because i see it in animals. i look into their eyes and i sense an internality i can't easily slot into some hierarchical arrangement

you could say there's something more "personalized" about human eyes compared to the doll's eyes of, like, a shark, but i also see that personalization in elephants, some primates, even big cats, and they're not living like us humans, that's for sure.

there's no reason to assume AI will be a difference of degree, and not kind

>> No.13328322

>>13328311
Maybe it's the use of the terms degree and kind that is vague here. Whatever is going on in a dog's awareness strikes me as fundamentally similar to human awareness, but missing some functions we have, and probably having other ones we don't, as well as the prominence of those functions we share being different from dog to human.

>> No.13328348

>>13328322
trust me, I know the intuition you're communicating, but given the sensory apparatuses of a dog or cat, i struggle to determine what it is that human consciousness is the "amplification" of. however, whatever that property would be, it is present in chimps (look at their eyes), and i don't think it's a coincidence that a propensity for war and violence seems to correlate with the intensity of self-consciousness.

i would argue there are properties that, within certain phyla, become "hierarchized", but still exist in a flat plane.

>> No.13328403

>>13328348
Im not saying that human consciousness is the amplification of a dog's. Im saying humanbrains have entirely novel or repurposed structures, functions that dogs don't have like language. But we share lots of functions like sight and hunger, and probably experience them relatively similarly.