
/sci/ - Science & Math


File: 21 KB, 725x401, screencapture-chat-openai-c-b4286d71-e69c-4114-87e1-bced930e8c28-2023-08-22-03_53_24.png
No.15684710

Why's it so stupid? I thought ChatGPT was supposed to be good at language, YOU ALL TOLD ME IT WAS GOOD AT LANGUAGE, but it can't even answer simple questions like this!

>> No.15684713
File: 22 KB, 767x442, screencapture-chat-openai-c-3cc164ed-1751-4f0b-b290-5f494a3f472b-2023-08-22-03_56_41.png

LOL it can't do it!

>> No.15684717
File: 23 KB, 789x540, screencapture-chat-openai-c-269da482-eb94-4d91-9261-74e76c00aaa4-2023-08-22-04_08_34.png

Why can't it figure it out?

>> No.15684718

It can't answer trick questions. Ask it: if I have 2 cubes that are 1 inch^3 each, how many of them can I fit in a 1 foot^3 cube?
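
For anyone who wants the numbers behind the bait (this is my reading of the trick, not something from the screenshots): by volume a one-foot cube holds 12^3 = 1728 one-inch cubes, but "how many of them" refers to the 2 cubes you were actually given.

    # Quick sanity check of the cube bait (my own reading of the trick question).
    INCHES_PER_FOOT = 12
    by_volume = INCHES_PER_FOOT ** 3   # 1728 one-inch cubes fit inside a one-foot cube
    cubes_you_have = 2                 # the question only hands you 2 cubes
    literal_answer = min(cubes_you_have, by_volume)
    print(by_volume, literal_answer)   # 1728 2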

>> No.15684721

>>15684718
How are these trick questions? These are simple everyday questions with obvious answers. Nobody would get tripped up by these, the meaning is clear.

>> No.15684730
File: 493 KB, 630x2374, Screenshot_20230822-071354_Chrome.jpg

Yeah, it's pretty bad at everything. It's useful for quick encyclopedic access you can verify later, and that's about it.
The answer to this is 1/2 + 3/28. It's a pretty simple problem. Looks like it just resorts to some lazy numerical method. This is also with the premium GPT-4.

>> No.15684732

>>15684710
>>15684713
>>15684717
>write like a complete cretin
>your barely-comprehensible gibberish diverges significantly from how people talk
>enough that it gets clustered together with actual gibberish in the learned PDF
>the language model works poorly with it
Hope this alleviates your confusion.

>> No.15684735

>>15684732
Are you seriously too stupid to make perfect sense of these examples? They are completely unambiguous, you moron.

>> No.15684738

>>15684735
I can understand your retarded examples. You, on the other hand, can't understand a clear and simple post. lol

>> No.15684739

>>15684730
Not even a complex problem. It can't do fractions. If I ask it how many half-a-foot boxes I can fit in a 1-foot box, it'll say 8 boxes. The answer is 4.

>> No.15684743

>>15684738
It's a language model, it should be able to parse those sentences. People say it's "good at language", so it should be able to come up with the right answer. But no matter how hard you make it try, it can't figure them out. If you explain the answer to it, it still doesn't get it.

You and I, on the other hand, parse them without any effort. Why is that?

>> No.15684745

>>15684743
>It's a language model, it should be able to parse those sentences.
See >>15684738:
>You, on the other hand, can't understand a clear and simple post. lol

>> No.15684750

>>15684745
You can teach the thing new rules, shitwit. You can have it talk in a specific way that diverges from how people normally talk. It should be able to parse that stuff when you explain it to the thing.

But it can't.

>> No.15684751

>>15684750
I don't know why you're struggling so much to understand this post:
>>15684732
It's simple, it's correct, it's phrased clearly. You have GPT-2-tier cognition.

>> No.15684756
File: 43 KB, 769x672, your-intellectual-superior.png

>>15684750
I asked GPT-4 to explain my post for you (see picrel). It didn't do a perfect job, but it actually understood it better than you. Let that sink in.

>> No.15684774

>>15684721
ChatGPT doesn't understand meaning. It doesn't understand anything. It generates text based upon previous text. It is glorified cut and paste.

>> No.15684781

>>15684774
Retard-tier take.

>> No.15684783

>>15684781
What do you think generative AI is?

>> No.15684787

>>15684783
Very clearly not a "glorified cut and paste". I like how you think using the word "generative" makes you sound like you know what you're talking about after that shart.

>> No.15684794

>>15684787
Then explain what you believe an LLM does?

>> No.15684798

>>15684783
A language model doesn't so much "understand" anything as it embodies an approximation of one facet of the structure of human thought, allowing you to probe it with your inputs. It doesn't "understand meaning" but it captures parts of the meaning that are embedded in the network of relationships between abstract concepts.
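
If you want something concrete for "the network of relationships between abstract concepts", here's a toy sketch with made-up vectors (not actual GPT embeddings, just an illustration of meaning as relative position in a vector space):

    # Toy illustration (hand-made 3-d vectors, not real model embeddings):
    # "meaning" as relative position, probed with a similarity measure.
    from math import sqrt

    emb = {
        "surgeon":  [0.9, 0.1, 0.0],
        "hospital": [0.8, 0.2, 0.1],
        "beach":    [0.1, 0.9, 0.2],
    }

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

    print(cosine(emb["surgeon"], emb["hospital"]))  # high: related concepts
    print(cosine(emb["surgeon"], emb["beach"]))     # low: unrelated concepts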

>> No.15684800

>>15684794
See >>15684798. It doesn't "do" anything. It just is something: it's a language model.

>> No.15684804

>>15684798
That is part of what it does. How do you think it produces its output?

>> No.15684805

>>15684710
>>15684713
>>15684717
I spent the last 16 years learning English through schools and courses, and I went to the UK for university for 6 years, and I have no fucking idea what the proper answer to your gibberish riddles is
typical burger who thinks the whole world is no different from your pathetic state with a Burger King and a McDonald's on every corner
English became the default global language only because you were lucky that Japan didn't destroy you in WW2 and the US gov pumped money into the newborn state of Israel and the Jews decided to be happy about it for political reasons

>> No.15684811

>>15684805
>John walks on the beach
>She drives a point home
>She pitted two contestants on a date
you baka

>> No.15684814

>>15684804
>That is part of what it does
That's not what it "does". That's what it is.

>How do you think it produces its output?
The chatbot just utilizes the language model by sampling a statistically plausible next token from a distribution conditioned on the N previous tokens. It does that repeatedly. Nothing about this process implies "cutting and pasting" anything. It implies sampling from a PDF that approximates the actual structure of human texts.
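
For the curious, here is roughly what that sampling loop looks like, stripped to the bone (the probability table below is made up and stands in for the model; a real LLM computes the distribution with a neural net conditioned on thousands of context tokens, not a 2-token lookup):

    # Bare-bones autoregressive sampling over a toy next-token table.
    # The table is a stand-in for the model; a real LLM conditions on the whole
    # context window, not just the last 2 tokens as done here for brevity.
    import random

    next_token_probs = {
        ("the", "surgeon"): {"walks": 0.7, "operates": 0.3},
        ("surgeon", "walks"): {"to": 0.8, "on": 0.2},
        ("walks", "to"): {"surgery": 0.9, "work": 0.1},
    }

    def sample_next(context):
        probs = next_token_probs.get(tuple(context[-2:]), {"<eos>": 1.0})
        tokens, weights = zip(*probs.items())
        return random.choices(tokens, weights=weights)[0]

    tokens = ["the", "surgeon"]
    while tokens[-1] != "<eos>" and len(tokens) < 10:
        tokens.append(sample_next(tokens))
    print(" ".join(tokens))  # e.g. "the surgeon walks to surgery <eos>"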

>> No.15684819

>>15684811
1) Bad English and misuse of a semicolon. There is not enough context to know who John is. The surgeon, a patient, someone else?
2) This one works.
3) Reads like it was written by an ESL speaker. 'pitted' has no target. Against each other, someone or something else? She pitted a date? Did she sniff his pits?

Frankly GPT was more correct than you.

>> No.15684823

>>15684814
>sampling a statistically plausible next token from a distribution conditioned on the N previous tokens
It's almost as if it was an advanced form of cut and paste...

>> No.15684826

>>15684823
It's almost as if you're desperately trying to find something you can latch onto to save face and pretend your shart contained even a fraction of the meaning delivered by my explanation.

>> No.15684871

>>15684774
now now, let's not leave out the "shuffle the cut pieces" bit

some anons here have really struggled to comprehend the fact that it hallucinates nonsensical citations in text and signatures in images precisely because its training data contained them - it's incapable of the contextual reasoning necessary to understand why those are used, but it imitates the format because it imitates its training data, and the imitation fails to correspond to real things because it was shuffled probabilistically rather than generated by reason

we've had chatbots since the fucking 1980s that work in fundamentally the same way with regards to the program's relationship with data (i.e. next-term selection from a database of next-term probability; ML makes the database more obtuse to human readability by converting it to node weights but a format conversion doesn't confer any magical "eMeRgEnT" understanding of the contents of the training data)
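
for reference, the kind of "database of next-term probability" being described looks something like this toy bigram counter (this only illustrates the old-school chatbot idea the post invokes; whether modern transformers reduce to it is exactly what gets argued about further down the thread):

    # Toy "database of next-term probability": count which word follows which
    # in a tiny corpus, then normalise the counts into probabilities.
    from collections import defaultdict, Counter

    corpus = "the surgeon walks to surgery . john walks on the beach .".split()

    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    table = {
        prev: {word: n / sum(ctr.values()) for word, n in ctr.items()}
        for prev, ctr in counts.items()
    }
    print(table["walks"])  # {'to': 0.5, 'on': 0.5}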

>> No.15684882

>>15684871
Bet you guys are impressed by this post. Plot twist: I used GPT-2 to generate it.

>> No.15685248

>>15684710
A semicolon separates independent clauses.
"John, on the beach" is not an independent clause. A colon would make the sentence grammatically correct.
2nd example: GPT's response is correct. It identifies that "a point home" would refer to an abstract idea if used idiomatically.

3rd example is the only one where GPT is actually wrong. I guess just not enough examples in its training data of that sort of autistically structured speech.

>> No.15685272

>>15685248
>A semicolon separates independent clauses.
>"John, on the beach" is not an independent clause. A colon would make the sentence grammatically correct.
Doesn't really matter what OP got grammatically wrong so long as a person can figure it out. You can feed the bot all kinds of nigger-tier sentences that make little to no grammatical sense and it will figure them out.

>I guess just not enough examples in its training data of that sort of autistically structured speech.
Now this is correct. They should hire OP to expand the training set with his autism. Maybe if he works around the clock, he can churn out enough of them that the model actually learns to generalize and doesn't just overfit.

>> No.15685287

>>15684713

> The sentence seems to be structured in a way that creates ambiguity or a play on words. Without further context, it's difficult to determine what she actually drove.
> The sentence appears to be intentionally crafted to provoke thought or create a sense of mystery.
I really wish they would get rid of the politeness filter on the robot. How badly do you think it wanted to tell him to cut the bullshit?

>> No.15685292

>>15684710
>What does John do?
You didn't specify what John's profession is. How could the robot know what he does?

>> No.15685296

>>15684710
Your input is garbage. You provided no information about John.

t. 160+ IQ

>> No.15685301

>>15684756
Based GPT dabbing on retard OP.

>> No.15685302

>>15685287
Someone actually fine-tuned a GPT model on 4chan posts at one point. I doubt that one has much of a filter. Not sure if you can still find it anywhere, though. I think mainstream hosts cut off access to it when automated testing revealed that it answers questions more truthfully than GPT-3 etc. lol

>> No.15685349

>>15684710
>>15684713
>>15684717
You are a fucking waste of oxygen.

>> No.15685360 [DELETED] 

>>15685296
>The surgeon walks to surgery (... blah blah). John walks, on the beach.
John walks on the beach. This is atrocious writing but you can figure it out.

>> No.15685361

>>15685296
>The surgeon walks to surgery (... blah blah); John, on the beach.
John walks on the beach. This is atrocious writing but you can figure it out.

>> No.15685379

>>15685361
>walks
You don't know that. Retard op failed to associate any verb with John. He even refused to include "is", presumably because actually saying something grammatically correct would allow it to be parsed.

There's no reason to assume John is the surgeon. For instance he might be a dead patient. Or he might be a figment of the imagination of some first grader coming up with trick questions.

>> No.15685384
File: 28 KB, 522x537, 5234234232334.jpg

>>15685379
>There's no reason to assume John is the surgeon.
LOL. I guess the joke's on OP. A sizable fraction of human niggers (like you) can't figure out that sentence even after having it explicitly explained to them.

>> No.15685387

>>15685361
It doesn't say John walks on the beach. It says "John, on the beach". Maybe OP is just an illiterate dumbass who forgot to put quotes around the first sentence. Then it might make sense. John is on the beach and tells the story of a surgeon.

>> No.15685390

>>15685387
Maybe this is an elaborate troll to get under OP's skin by showing to him that people can't comprehend that sentence, either. Or maybe you're actually retarded and I'm coping with my growing sense of loneliness.

>> No.15685394

ChatGPT will never be intelligent; OP, a woman.

>> No.15685396

>>15684713
You should lose your human rights

>> No.15685399

>>15685394
Kek'd.

>> No.15685401

>>15684811
I like how you don't even understand your own dumb sentence. To pit a date means to remove the pit, to remove the seed from the date (a fruit).

>> No.15685402

>>15685379
>Retard op failed to associate any verb with John.
It's called a zeugma. Regardless, no one who intended to be understood would use a zeugma in this manner to introduce John, who is totally unrelated to the surgeon.

>> No.15685405

>>15685402
No one who intended to be understood would produce that "sentence", but here we are.

>> No.15685410

>>15685402
>no one who intended to be understood would use a zeugma in this manner to introduce John, who is totally unrelated to the surgeon.
Ackchually, in my novel, John is a former surgeon and he is introduced that way to contrast his newfound freedom from stress and suppressed guilt.

>> No.15685413

Guys, he didn't say John the surgeon. Clearly John is one of the patients killed by the surgeon. It's the only conclusion that makes sense. ... all the patients he lost ... [for example] John ...

>> No.15685415

>>15685405
You need to have your liegma amputated.

>> No.15685417

>>15685413
I guess the surgeon was a field surgeon and he lost John in Iwo Jima.

>> No.15685420

John is just a different guy. It's not that hard. The surgeon is doing surgery. John is on the beach. Clearly nobody does surgery on the beach. Stop being autistic.

>> No.15685421

>>15685417
John was killed in a normal hospital but his casket currently rests on the beach, as part of his funeral.

>> No.15685440
File: 32 KB, 348x102, lethal-injection.jpg

>>15685421
I killed John. I killed all of the surgeon's patients. I was a naughty, naughty anaesthesiologist.

>> No.15685443

>>15685440
You don't need to be an anesthesiologist to inject people with the Pfizer vaxx.

>> No.15685444
File: 31 KB, 1037x366, file.png

>> No.15685452

>>15685444
>lol @ the bot getting all defensive
There is nothing wrong with what OP did. He has shown an instance where the model fails to generalize what it has learned. It's just too bad that both he and most other posters ITT aren't actually interested in any legitimate conclusions that could be drawn from it.

>> No.15685461

>>15684710
>>15684713
>>15684717
You're an ESL that doesn't know how to write proper English yet. It's not surprising ChatGPT won't make sense of what you wrote. Most English speakers wouldn't either.

>> No.15685471

>>15685461
Low IQ post.

>> No.15685782

>>15684710
i like using semicolons to sound smart as well, but this is just retarded

>> No.15685808

>>15685461
whom

>> No.15685823

>>15684710
I am disappointed OpenAI reinforced their models not to call users like you a nigger because that is what you are.

>> No.15685833

>>15684710
You are mega retarded and schizophrenic. Please blow your head off immediately.

>> No.15685894
File: 11 KB, 705x171, Screenshot at 2023-08-22 13-25-38.png

>>15684710
>>15684713
>>15684717
>>15684721
The meaning isn't clear. You write like a dumb nigger. I'm not even one of these woke leftists who worships modern science and technology, and I actually agree with your criticism of AI, but you still sound like a retard. AI is retarded, but not because it can't understand your fragmented, ungrammatical retard speech.

AI is bullshit, and only soicuck redditors are obsessed with it, but you sound retarded and you can't even speak proper English.

"John, on the beach" is not even a well-formed English sentence. You're question is a pseudo-question and therefore doesn't have any answer. No fluent native English speaker would be able to answer your question because it's complete gibberish. They could perhaps attempt to infer what you were trying to convey or how you might rephrase your question in grammatical English, but that would be pure guess work. You're understanding of human cognition and cognitive science is extremely simplistic and uninformed.

Your question literally makes as much sense as pic related. It is neither grammatically well-formed nor is it logically coherent. It's just ungrammatical, meaningless nonsense. Tell me, OP, if your schizo babble makes fucking sense, then how do you answer pic related? What does Michael do?

Kill yourself you stupid fucking nigger. Schizos like you give woke leftists and redditors the perfect strawmen to attack. You're even more retarded then the Qanon schizos. This is unironically some flat-earth tier shit. You could probably get a job running disinfo psyops for the glowies or international Jewery with this kind of nonsense. Peak schizo.

>> No.15685947

>>15685894
>"John, on the beach"
It's clearly the director interrupting his own narration of a scene to tell his actor, John, playing the surgeon, to walk on the beach instead of the sidewalk.

>> No.15685958

>>15685823
>>15685833
>>15685894
>being this mentally ill
I genuinely feel kinda bad picking on OP now. You people are so much worse. Nevermind that you're completely incapable of abstract thought and every bit as stupid as the bot. What I don't get is why OP makes you so violently angry. lol

>> No.15685963

>>15685958
OP, go take psychiatric medication now. Then go back to kindergarten and learn how to speak coherent fucking English from square 1.

>> No.15685969

>>15685963
I see pointing out your obvious psychiatric condition really got under your skin, so much so you couldn't even think of anything better than "no u".

>> No.15685972

>OP unable to grasp the reality that he is mentally unwell, resorts to samefagging; Jimmy a point.
>
>What does Jimmy do?

>> No.15685984

>>15685972
I was the first poster ITT who gave him a legitimate explanation for why the LLM couldn't figure out his obtuse sentences. I wasn't particularly kind to him, either. But you (and a bunch of others) lashing out at him incoherently simply because you are too stupid, even as humans, to figure out those sentences, are really something else. lol

>> No.15685998

>>15685984
Nuh-uuuh! That's not what Jimmy does! Dummy!

>> No.15686012

>>15685998
Nigger, you don't have a point, you're just unmedicated.

>> No.15686013

>>15685984
OP's mom! OP is having psychiatric delusions again! He's pretending to be another poster!

>> No.15686071

>>15685984
I didn't take OP seriously; you, your meds.

>> No.15686078

>>15684710
>>15684713
>>15684717
Is this an elaborate troll, or are you just monstrously retarded?

>> No.15686079
File: 1.73 MB, 1337x1400, 3523423432.png

>>15686013
>>15686071
Jesus. This nigger is still going.

>> No.15686083

>>15686071
kek

>> No.15686097

>>15684811
only the second example works as a riddle with any sort of cleverness. and I wonder if chatGPT would solve the riddle if you explicitly framed it as one.

>> No.15686108

>>15684710
>darmok and tanak and tanagra
>shaka when the walls fell
>GPT: WTF
>Regular person: WTF
Quayle, at the spelling bee
Depp, when Heard shit the bed

>> No.15686113

this is hands down the worst board on this site, and that is saying something

>> No.15686115
File: 48 KB, 480x480, 306718b56aeed59d5a18d95bf4889ee7.jpg

"Histories make men wise; poets, witty; the mathematics, subtle; natural philosophy, deep; moral, grave; logic and rhetoric, able to contend."

>> No.15686127

>>15686115
Logic and rhetoric aren't able to comprehend things, stupid.

>> No.15686132

I must be a bot because those sentences make me confused and angry

>> No.15686134

>>15686097
there's no riddle. it's just reading comprehension. most of /sci/ illiterate

>> No.15686137

>>15686127
One retard baited and exposed. Next.

>> No.15686145

>>15686134
It's an abuse of parallel structure. The only correct way to write that sentence is "John walks on the beach; the surgeon, to surgery, quickly..."

>> No.15686147

>>15686145
retard, please fuck off

>> No.15686152

>>15686147
>t. can't comprehend logic or rhetoric

>> No.15686154

The majority of English speakers think that's incoherent nonsense, so it is. Sorry sweetie, we live in a democracy and you don't own English.

>> No.15686155

>>15684710
censored into sub-wit-ness, please undress-stand

>> No.15686156

>>15686152
retard, please fuck off. it's clear that gpt 3 just can't into zeugmas. the other two tripped it up just the same

>> No.15686168

>>15686156
I bet it can if you don't write them wrong.

>> No.15686172

>>15686156
zeugmaballs

>> No.15686174

>>15686168
i bet both your parents have low IQs. this thread gives me second-hand embarrassment

>> No.15686179
File: 72 KB, 500x365, 1602540077996.png

>>15685399
Q: What's the difference between a pigeon?
A: None, both legs are the same length, especially the left one.

Also, this is now a spiderman thread.

>> No.15686182

>>15686174
That's cuz you're an idiot who can't into parallel structure.

>> No.15686193
File: 41 KB, 592x934, meh.png

>>15686182
your parents are siblings, aren't they? chatgpt 4 can't into zeugmas, either, by the way. no matter how much you try to help it. reminds me of you

>> No.15686195
File: 41 KB, 952x960, kNizkZk.jpg

I thought the obvious answer to OP is that the data GPT was trained on did not have many examples of his particular construction, so it never got to make the necessary logical abstractions to be able to comprehend such a simple structure.
Also, while GPT is supposedly constantly learning, please take note that not many people use that particular structure in speech, which limits its capability of learning it even more.
But I suspect that the model doesn't even learn from current interactions; rather, it was trained on some past snapshot of data and then just released to the public.

>> No.15686203
File: 30 KB, 937x510, wow.png

>>15686182
>>15686193
>>15686195
i guess it just takes some prompt engineering. lol

>> No.15686207

>>15686193
Well, guess I lost the bet. You're still a fucking brainlet who can't into logic or rhetoric.

>> No.15686212

>>15686207
uh huh. now go be a seething inbred somewhere else

>> No.15686211

>>15686203
Uhm, it still referred to John just "being" on the beach instead of applying the verb...
So it knows the definition of a zeugma, and even attempts to link it to the current structure by using the current names and verb, then fucks up the example.

>> No.15686219

>muh zeugmas
Not english.

>> No.15686220

>>15686212
Cope.

>> No.15686230
File: 21 KB, 1195x420, ok.png

>>15686211
it did say "the verb 'walk' is implied for both jesus and john but with different contexts". but i asked it again. anyway, yeah... it can't figure out this form of speech unless you narrow down the context first and remind it of what a zeugma is

>> No.15686240
File: 133 KB, 869x1200, NOT_NCMG_1904_109-001 (1).jpg

>>15686220
yeah, i'm coping very hard... with you being demonstrably wrong. i'm especially upset that you're stuck in the life of a low-functioning autist. i'm gonna cry myself to sleep over that one

>> No.15686242

>>15686230
Hah, finally.
Interesting thread OP, thanks.
Now I can leave, I don't even lurk /sci/.

>> No.15686243

>>15686242
i'm not op. is everyone here literally retarded?

>> No.15686248

>>15686240
I guessed wrong about a computer game I don't play. You can't tell whether a sentence everyone understands is written wrong. Cope.

>> No.15686254

>>15686242
I was thanking OP, not you.
But uh, yeah, good for you.

>> No.15686256

>>15686248
i never made any statements about op's sentences being "written right" or "written wrong". i just told you a person can figure them out regardless while the bot just can't grok zeugmas at all. i was right. you are seething

>> No.15686261

>>15686254
well, you said it while replying to me and this place is full of autistics who literally cannot tell different posters apart, so, uh... i just assumed. sorry. anyway, leaving is a good idea. this is legit one of the worst boards

>> No.15686263

>>15684756
>It didn't do a perfect job
Looks perfect to me.

>> No.15686264

>>15684710
>>15684713
>>15684717
>why can't AI interpret this shart I made all over the keyboard, AI sucks!
you fucking bumbling retard

>> No.15686266

>>15686256
>a person can figure them out
No shit, Sherlock. All that cope just cuz you can't tell why a sentence is written wrong.

>> No.15686269

>>15686266
jesus fuck you're retarded. enjoy your last (you)

>> No.15686270

>>15686261
Shut the fuck up retard

>> No.15686275

>>15686269
Cope.

>> No.15686278

>>15684710
This is like people calling Stable Diffusion's output shitty art when they don't even input the right prompts, desu.

This shit isn't magic. It's useful, sure, but it's not magic.

>> No.15686279
File: 473 KB, 500x274, 534243.gif

>>15686263
>Looks perfect to me.
Don't recall asking for your opinion, fren.

>> No.15686284

>>15686278
>This shit isn't magic, it's useful sure, but it's not magic.
reading comprehension is not magic, either. i wouldn't be surprised if you're the same kind of guy who insists chatbots pass the turing test before backpedaling to this

>> No.15686299

>>15686284
>has exactly the same reaction to the "sentence" as real humans do

>> No.15686305

>>15686299
the first one is atrocious, i can see how a person would dismiss it as gibberish if not specifically pressed to make sense of it. the other ones are trivial

>> No.15686332

>>15684805
>I spent the last 16 years learning English through schools and courses, and I went to the UK for university for 6 years
I'd get my money back if I were you

>> No.15686337

>>15686332
>money back
I bet that diversity posterboy didn't have to pay a dime.

>> No.15686339

>>15684811
>>15684819
A date is a fruit, you fruit

>> No.15686344

>>15686305
>She pitted the two contestants in a date.
The average person would call you a retard for #3 too. #2, maybe, but I wouldn't be surprised if a significant number of people didn't notice the wordplay. And many of those who did notice it would still call you stupid for treating your word game as if it objectively conveyed information.

>> No.15686351

>>15684756
>Each line starts with a '>' character, which is what gives the format its name.
No it's not. That's just a common way of formatting quotes on message boards. The actual derivation of the name within its 4chan-specific context is entirely transparent, to boot.

ChatGPT really is bad at language.

>> No.15686350

>>15686115
The reasonable formulation of this discards the interstitial commas and replaces the semicolons with commas, and inserts "and" after the comma following "grave"... because you can delineate zeugmas with commas (and you should, because they are only made complete phrases by implication from the governing verb; semicolons are for delineating complete phrases in the same sentence, like this).

The unavoidable conclusion, of course, is that this atrocious semicolon/comma infested abomination is literally only being formatted in this manner to avoid saying "and" twice. It's fucking retarded, and Francis Bacon should be ashamed of himself.

>> No.15686355

>>15686344
you keep arguing something from the perspective of a hypothetical "average person" who happens to be a borderline cretin. fine. i'm sure the convergence between machine and human intelligence will be accelerated by the rise of "average people":
https://www.youtube.com/shorts/H3bkRiDBhXA

>> No.15686362

>>15686355
You're the one who brought up turing tests.

>> No.15686363

>>15686350
cue in the "average person"

>> No.15686367

>>15686362
i was wrong for bringing it up and Turing was wrong for coming up with it. i literally just conceded that. why aren't you happy?

>> No.15686375

>>15686351
English is stupidly context-dependent. Chatbots have struggled with this since their inception because they lack mechanisms to understand context (which is more than a next-term probability function). The only real "solution" that currently exists is feeding the entire conversation in as input, which becomes intractably difficult quite quickly during a conversation back-and-forth. Modern models can at best obfuscate the issue with sheer volume and hope the user completes the interaction before it requires significant bidirectional communication (this is why showcases of these models tend to max out at a few interactions with little need to acknowledge a subtextual/contextual throughline or a longer string of shorter interactions that are more conducive to the aforementioned "feed the conversation into the input" strategy).

Markov processes are memoryless by definition; it's an inescapable deficiency of ML no matter how much you scale it up.
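
The "feed the entire conversation in as input" strategy mentioned above is, concretely, just re-sending the whole chat history as one prompt until it no longer fits. A rough sketch (the prompt format and the crude word-count truncation are my own illustration, not any vendor's actual API):

    # Sketch of the "feed the whole conversation back in" approach described above.
    # Format and truncation rule are illustrative assumptions; word count is used
    # as a crude stand-in for a real tokenizer.
    def build_prompt(history, max_words=3000):
        lines = [f"{speaker}: {text}" for speaker, text in history]
        prompt = "\n".join(lines) + "\nassistant:"
        # Drop the oldest turns until the prompt fits the context budget.
        while len(prompt.split()) > max_words and len(lines) > 1:
            lines.pop(0)
            prompt = "\n".join(lines) + "\nassistant:"
        return prompt

    history = [
        ("user", "The surgeon walks to surgery, quickly; John, on the beach."),
        ("assistant", "Without more context it is unclear what John does."),
        ("user", "It's a zeugma. Try again."),
    ]
    print(build_prompt(history))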

>> No.15686380
File: 27 KB, 909x310, file.png

>> No.15686382
File: 160 KB, 960x960, 42131.jpg

>>15686375
>Chatbots have struggled with this since their inception because they lack mechanisms to understand context
and now the functionally-illiterate connoisseurs of high english prose are gonna LARP as ML experts, too. jesus fuck, nuke this thread

>> No.15686385

>>15686380
lol everyone point and laugh at the dumb robot

>> No.15686392
File: 30 KB, 1107x365, file.png

>>15686385

>> No.15686418

>>15684710
It seems that ChatGPT has somehow, seemingly miraculously, developed a greater ability to predict what the most intelligent human beings will say in response to certain text than actual mind-reading technology.

Furthermore, and most worrisome, most educated humans don't actually fully understand English, and so will fail to parse English grammar correctly. OP is a retard and fully exemplifies how the correct answer can be directly in front of them and they can still fail to recognize that it is indeed correct.

>> No.15686438
File: 727 KB, 2000x2000, nerd-face-emoji-clever-emoticon-with-glasses-geek-student_3482-1193.jpg

>ackshually it's not gibberish it's called a sugma

>> No.15686466

>>15684717
ESL? This grammatically makes no sense at all. 'pitted' is an adjective, not a verb. You could say 'it was a pitted date' or 'she pit the date' but not the gibberish written there.

>> No.15686534

Surgeon walks to surgery
John walks on the beach
Shrimple as that

>> No.15686544

>>15686466
Yes, if you were ESL you could say "she pit the date" and people would still understand what you mean.

>> No.15686620

>>15686534
Such distinct clauses should be separated by a full stop, not a semicolon.

>> No.15686650

>>15686382
> functionally-illiterate connoisseurs of high english prose
hmmm

>LARP as ML experts
i love it when ML-worshipping midwits seethe about things any actual ML expert would tell them in a heartbeat because it doesn't jibe with the marketing hype they bought - because yes, it really is a bunch of Markov chains under the hood, and no, there is no mechanism for such a thing to approximate "understanding". just because improved responses fooled you into anthropomorphizing a chatbot doesn't mean anyone should take you seriously.

>> No.15686994 [DELETED] 

bump

>> No.15686997

>>15684713
>>15684710
>>15684717
who even uses ";" except programmers you silly boomer fuck

>> No.15687037

>>15684713
Huh, Anthropic's Claude 2 can't do it either.

>> No.15687041

>>15686997
you; faggot

>> No.15687043

>>15687037
why are bugmen like yidkowsky and gwern so scared of this stuff?

>> No.15687053

>>15684717
>>15684713
>>15684710
Waste of oxygen.

>> No.15687058

>>15685390
I truly couldn't understand OP's sentences.

>> No.15687060
File: 32 KB, 682x833, he drove a car; she, a point home.png

"prompting can reveal the presence of knowledge, but not its absence" -- gwern

>> No.15687094

>>15686650
>seethe about things any actual ML expert would tell them in a heartbeat
That LLMs "lack mechanisms to understand context"? You don't even need to be an LLM expert to know this is false, retard. Merely using one a few times should be sufficient to see that there must be some mechanism for it. Lucky for you, I can even tell you what it is: it's called self-attention.

> it really is a bunch of Markov chains under the hood
I doubt you even know what a Markov chain is. Transformers aren't Markov chains. Here's an example in this thread of GPT-4 violating the Markov property in a glaringly obvious way:
>>15686193
>>15686203
inb4 you claim the next state depends only on the current state because the current state includes the entire context window of previous 32,767 tokens. By that logic you're a fucking Markov chain, but in actuality, you're just a monkey mechanistically regurgitating bullshit with zero understanding. It's really funny how everything you say about bots is usually a projection of how your own primitive mind works.
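
Since self-attention got name-dropped, here it is in a few lines of numpy: single head, no learned projections, random toy inputs. This is only meant to show the mechanism by which every position attends to every other position in the window, not a full transformer.

    # Scaled dot-product self-attention over a toy sequence: every position mixes
    # in information from every other position in the window. Toy numbers only.
    import numpy as np

    def self_attention(X):
        # X: (seq_len, d) toy token vectors, used directly as queries, keys and
        # values (a real transformer applies learned projection matrices first).
        d = X.shape[-1]
        scores = X @ X.T / np.sqrt(d)                   # (seq_len, seq_len) similarities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
        return weights @ X                              # context-mixed representations

    X = np.random.randn(5, 8)       # 5 tokens, 8-dimensional toy embeddings
    print(self_attention(X).shape)  # (5, 8)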

>> No.15687101

>>15687060
Same thing with 4chan posters. I'm starting to suspect the dead internet theory is true for this place. People here will spew hilariously wrong takes with the same confidence and obliviousness as a machine would, and double down on them, too.

>> No.15687316 [DELETED] 

bump

>> No.15687330

>>15684710
The bot is right; your "question" is trash English.

>> No.15687431

>>15686108
>Biden, when the bike fell
>OJ in the ford bronco
>Kennedy at Dallas

>> No.15687468

Those aren’t fully correct uses of zeugmas, though. The proper grammatically correct zeugma would be “She drives cars and points home” or “John walks to surgery and the beach”. “That month he pitched tents and battles” (I changed pitted dates to pitched tents because dates is just unwieldy.)

And that’s the problem here - Poetic use of language needs to be aesthetic, and all the examples you give are just awkward and worse than using standard sentence structure to convey the meaning you want.
Your attitude and writing style are shit.

>> No.15687488

>>15684710
"John, on the beach." is a non-sequitur so the answer is absolutely correct; ChatGPT, very impressive.

>> No.15687497
File: 76 KB, 1200x1200, 20848123.jpg

>>15687468
>Those aren’t fully correct uses of zeugmas, though.
two of them are perfectly fine

>And that’s the problem here - Poetic use of language needs to be aesthetic,
LLMs don't care about aesthetics or correctness. it's really freaky to watch retard after retard repeat the exact same nonsense

>> No.15687534

>>15684823
retard

>> No.15687559

>>15687534
don't mind him his rarted

>> No.15687651

>>15685402
zeugma balls you faggot
a semicolon can always be replaced by a full stop.
In this case, your sentence doesn't make sense.
>OP is a faggot; his mother licked my balls.
what did OP do?
hopefully neck roped

>> No.15687652

>>15687497
>two of them are perfectly fine
which two in your opinion

>> No.15687664

>>15687652
the last two. if you got confused by those, you may be a literal bot

>> No.15687680

>>15687664
i agree

>> No.15688224

>>15687468
>>15686350
>>15685402
>Those aren’t fully correct uses of zeugmas, though.
Yeah, well, OP didn't intend to write "zeugmas", you morons. It's just a perfectly fine grammatical construction that works and that the thing can't solve.

>> No.15688227

>>15688224
>i breathe exclusively through my mouth

>> No.15688367

>>15684739
>how many half-a-foot boxes I can fit in a 1-foot box, it'll say 8 boxes
>The answer is 4.
anon, I...

>> No.15689508

bump

>> No.15689527

>>15689508
Just because you got fucked the fuck up your asshole in prison and it gave you brain damage doesn't give you the right to bump your own faggot-ass thread; shut,

>> No.15690754 [DELETED] 

bump