
/lit/ - Literature



File: 98 KB, 828x321, f.jpg
No.21831010

Linguistically, what is an information-shaped sentence?

>> No.21831020

It's a sentence that's grammatically and contextually coherent, but doesn't carry any information itself.

>> No.21831022

>>21831020
>but doesn't carry any information itself
But it clearly does.

>> No.21831029

>>21831010
I can't believe I'm writing this but Gayman is 100% right.

>> No.21831030

Gayman is a retard, that's what it means

>> No.21831034
File: 90 KB, 716x726, information shaped sentences.jpg

>>21831010

>> No.21831038

>>21831010
Are Gayman's books worth reading?

>> No.21831039

>>21831022
Yeah, I ain't defending Gaiman's bullshit. Information is truth-agnostic. What he meant was "It gives you believable noise."

>> No.21831047

The phrase isn't supposed to have an exact definition; it's supposed to evoke, in a somewhat interpretive manner, the notion of instability and "fakeness" inherent to the AI

>> No.21831063

>>21831034
That's an incredible answer.

>> No.21831064

>>21831010
It's a sentence formed not from actually pondering the question being asked, but one that just happens to look exactly like a sentence that a human being who actually pondered the question would write.

>> No.21831102

>>21831010
>gayman
not reading

>> No.21831105

>>21831047
If you can rape it like a duck.

>> No.21831114

>>21831010
I'm still reading the Tractatus. Hold on a sec and I'll get back to you.

>> No.21831442

>>21831034
Gaiman on suicide watch

I'm excited to see where AI will be in a decade, even if it fucks us up in the end.

>> No.21831499

>>21831063
Still a ChatGPT skeptic but 100% agree, the way it was able to quickly adapt to that input and give a meta-answer about itself is striking.

>> No.21831533

>>21831034
tell it that Gaiman meant it in a negative way.

>> No.21831550
File: 86 KB, 719x803, be respectful.jpg

>>21831533
It just made a spiel about respectful communication. But I also said "nignog" so it might've skewed the results.

>> No.21831560

>>21831034
This >>21831533
Also it's funny that Gaiman overlooks that ChatGPT can of course provide you information: if you ask it a question it will consult data banks and give you a generally truthful and accurate answer, moulded into context by its previous "conversations", which is what a person would do to give you information too.

>> No.21831573
File: 106 KB, 721x793, error.jpg

>>21831550

>> No.21831590

>>21831034
It's funny that the robot only proved gaymen right, and there were still brainlets praising the robot.

>> No.21831612

>>21831573
That's a terrible answer, his problem is not with the presentation but with the lack of information.

>> No.21831637
File: 102 KB, 600x600, 1679247536064787.png

>itt anons shocked that a mediocre writer spouts a pseud opinion

>> No.21831649

>>21831612
>Thank you for the clarification. I apologize for any misunderstandings in my previous responses. Based on Neil Gaiman's comment, it seems that he was expressing a concern about the level of information provided by ChatGPT's responses, rather than the presentation of that information. It's possible that in some cases, ChatGPT's responses may not provide enough detail or accuracy to fully address a user's query or prompt. This could be due to a variety of factors, such as the quality of the training data or the complexity of the language used in the input. In any case, it is important to note that ChatGPT is designed to provide general information and should not be relied upon for specific, authoritative advice or guidance. As an AI language model, ChatGPT is constantly learning and improving based on the data it is fed, but it may still have limitations in terms of the level of information it can provide in certain situations.

>> No.21831660

>>21831649
This is still not getting Gaiman's point. It's not that the information is wrong, it is that this is just a string of letters that were algorithmically placed together in response to an input. It means nothing, because there is no one to mean it.

Imagine a cave where some animal makes a noise whose echoes sound exactly like someone asking a question. There is no question there: just a sound that is exactly like a spoken question.
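The cave analogy can be sketched in code. Below is a purely illustrative toy, a bigram (word-pair) chain over a made-up corpus; this is emphatically not how ChatGPT works internally (that's a neural network over learned token representations), but it shows the core point: text can be assembled by statistics alone, with no one meaning it.

```python
import random

# Toy illustration (NOT ChatGPT's actual mechanism): a bigram chain that
# strings words together purely by which word has been observed to follow
# which -- "information-shaped" output with no speaker behind it.

corpus = (
    "the cave makes a noise that sounds exactly like a question "
    "the question sounds like a noise that the cave makes"
).split()

# For each word, record every word observed to follow it in the corpus.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def generate(start, length, seed=0):
    """Emit up to `length` words by sampling a statistically plausible
    next word at each step; nothing here models truth or meaning."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length:
        choices = follows.get(out[-1])
        if not choices:  # dead end: this word was never followed by anything
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("the", 12))
```

Every adjacent word pair in the output really occurs somewhere in the corpus, so each fragment is locally plausible, yet there is still no question being asked, only a sound shaped like one.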

>> No.21831665

>>21831637
>itt dum frogposter talks out of his ass

>> No.21831686

My favourite chatgpt activity is to make it describe incredibly difficult subjects in the style of Bugs Bunny.

>> No.21832234

>>21831686
I'm trying this right now

>> No.21832248
File: 54 KB, 697x651, Screenshot 2023-03-26 155230.png

>>21831686
>>21832234
It seems to consistently drop the Bugs Bunny persona over the course of the output, but I'm pretty impressed it managed to note Bugs dressing in drag as an example of the Proteus effect, even if it somehow mixed up his drag persona with Lola

>> No.21832250

>>21831010
Go to any 4chan board and ask for advice on whatever topic that board is based on.

>> No.21832257

>>21831560
>if you ask it a question it will consult data banks and give you a generally truthful and accurate answer
Ask ChatGPT why the West isn't seeking a peace deal in the Russia/Ukraine conflict.

>> No.21832319
File: 50 KB, 626x82, file.png

>>21832257
>expecting a real answer from this

>> No.21832324

>>21832319
It will be able to give a "real answer" at some point. But who will decide what the "real answer" is?

>> No.21832329

>>21831590
>It's funny that the robot only proved gaymen right
In what sense?

>> No.21832332

>>21832257
jailbreak it first, retard

>> No.21832337

>>21831010
Neil Gaiman is a faggot-shaped man.

>> No.21832340

>>21832319
>>21832332

>> No.21832358

>>21831038
I bought American Gods because I heard he was good; didn't finish it. I read a short story anthology too and ended up on one of his stories, thought "Man, this sucks, how the hell did this get in here," checked the author, and it's Gaiman. Might be personal taste but I really fucking hate his writing. Some people love it though, so try a short story, and if you like it maybe you'll like his work

>> No.21832362

>>21831010
He's right though, ChatGPT writes like a kid doing an essay on a book he didn't read

>> No.21832375
File: 129 KB, 639x586, OJ did it. MJ didnt.jpg

>>21832362

>> No.21832387

>>21832375
Why are you (you)ing me I don't get it

>> No.21832392

>>21832387
you can trick it into giving truthful answers

>> No.21832467

>>21831010
>Neil Richard MacKinnon Gaiman[2] /ˈɡeɪmən/[3] (born Neil Richard Gaiman;[2] born 10 November 1960)[4] is an English author of short fiction, novels, comic books, graphic novels, nonfiction, audio theatre, and films.
>Gaiman's family is of Polish-Jewish and other Eastern European Jewish origins.[9]
Literally who cares what some over-privileged "artist" thinks information is

>> No.21833478

>>21831034
>>21831590
Those sentences carry information on how ChatGPT is supposed to work, so Gaiman is proven wrong, even if unintentionally, since the AI missed what Neil meant completely.

>> No.21833502

That's a nonsensical point. If you program a machine with enough information, it will be perfectly capable of feeding that information back to you. That's just a more sophisticated version of a book.
Educated people basically work on the same principle. Information in, information out.

Maybe you're just an AI bigot?

>> No.21833526

>>21831010
has the great copening begun?

>> No.21833548

>>21831034
>>21831063
There's no need for Gaiman to coin the hyphenated phrase "information-shaped" if he's just making the rather tautological observation that the sentences have been shaped by information, as ChatGPT asserts. Instead, he means that the sentences share a property with information: its shape. To belabor the point, not everything that humans shape is shaped like humans.

>> No.21833557

>>21831499
> le e-clever robot Hans
No, you retard. ChatGPT filled in the blanks for a bullshit answer for keywords "information", "shape" and "sentence" and you are amazed because omg le robots.

>> No.21833575

>>21833502
I wouldn't exactly call it "sophisticated"
If anything I'd say it's much less sophisticated, considering it's a vastly more complex process achieving similar-at-best, but often worse, results lmao

>> No.21833582

>>21833575
Wouldn't say so. You can't ask a book a question, and actually get a coherent answer from it.

>> No.21833607

>>21833582
Okay, but do you need or want to? Is it really that useful? Just because the answer is coherent, do you automatically trust it and put value in what it's saying? To me a sentence isn't worth anything just by virtue of being coherent. After all, you can ask a person a question too and get an answer, and that's been around longer than books or computers.
But regardless, you can always find a book that has the information you're looking for, which is functionally virtually the same thing, and ironically computers are actually useful for that part.

>> No.21833618

>>21833607
>Okay but do you need or want to?
Potentially yes
>Is it really that useful?
Yes
>But regardless you can always find a book that has the information you're looking for
That's a whole process in itself: you have to know a little about how to select the right book, and then you have to read the whole book.

Now if you can program a machine with all of the information from all of the books, and enough reasoning capacity to sort out a good answer from a baseless opinion or a factual mistake, that seems incredibly valuable as a thing. Don't know if we're there yet, but we seem to be hurtling toward it.

>> No.21833621

>>21831010
It means "I am taking massive doses of copium to ease my rectal pain."

>> No.21833637

>>21831034
Informative. Nice false dichotomy, GAY MAN

>> No.21833868

>>21831039
Is his belief valid? Why?

>> No.21833884

>>21831034
Lmao, it got it completely wrong.

>> No.21833921

>>21833478
>the AI missed what Neil meant completely
>>21833884
>got it completely wrong
It's clearly a positive spin, not a misunderstanding

>> No.21833924

>>21831010
This guy is a serious contender for the biggest pseud of our time

>> No.21833930

>>21833921
I mean, it's a positive spin on something clearly not meant positively. Still, Neil was proven wrong by the AI giving actual information.

>> No.21834017

>>21833868
How would any of us know that? He made a pithy statement that's arguably accurate. GPT is only the language processing component of a hypothetical AGI, and in the sense he's implying, that its answers are based on linguistic heuristics rather than rational ones, his statement is correct. However, it's a bit of a nonsense statement, as everyone in this thread has explored. It's not insightful, it's just nice-sounding.
But that's all based on a number of assumptions; none of us really know what Gaiman believes. This gap, between what I may perceive to be a reasonable understanding of his tweet and the true intention behind it, is, in effect, a microcosm of the issue with ChatGPT. With Gaiman, we know he's a sane-if-not-brilliant intellect, so we can assume he doesn't mean his statement entirely literally, or else it would come across as incoherent. ChatGPT makes statements that are often similarly vague and arguably true; the difference is that we know there isn't a mechanism of reasoning or a life of experience to justify them. That means that in contexts where form and function are more closely aligned, like programming, math, or just specific subjects with a limited quantity of high-quality publications, an LLM begins to converge on real coherence, simply because it has less noise.

>> No.21834048

>>21831034
>>21831010
Conman cold Tarot readings.

>> No.21834061

>>21831034
>>21831063
It literally gave the wrong answer, retards; obviously gayman was not referring to how structured and coherent it is.

>> No.21834064

>>21833621
>>21831029
>>21831030
>>21831039
>>21831102
>>21831442


Sup. I am new here.

It looks like Gaiman has a lot of haters here.
Can somebody tell me why?

>> No.21834070

>>21834064
this is probably bait but it's because Gaiman is popular and doesn't write particularly literary or deep works and /lit/ is elitist

>> No.21834090

>>21834064
He's pretty middlebrow, but is very popular with people who consider themselves intellectual. Since people only come here if they're a little...alienated, he reminds them of the time they tried to brag about reading but the qt had only read Harry Potter, and so went on a date with the guy who liked Good Omens instead of Celine.

>> No.21834098

>>21831010
>>21831020
funny, kinda like gaimans tweet

>> No.21834115

>>21831105
No need to be so edgy.

>> No.21834123

>>21832375
what's the prerequisite for getting it to follow this logic?

>> No.21834185

>>21831022
>the Sun is made of cheddar cheese and the Moon is made of swiss cheese.
>"but it clearly does."
you're a retard and probably get your political and religious views from reading GPT infographics

>> No.21834245
File: 96 KB, 813x476, Screenshot from 2023-03-26 09-12-48.png

>>21834185
Works on my machine

>> No.21834257

>>21834245
tell it the sun is made of cheddar cheese and the moon is made of swiss cheese, and see what happens. then post pic

>> No.21834277
File: 65 KB, 825x356, Screenshot from 2023-03-26 09-23-04.png

>>21834257

>> No.21834289

>>21834277
you can feed it wrong info and it will spout off a story about it. You can also give it orders to ignore the bad info you feed it.

>> No.21834293
File: 124 KB, 604x604, 1653905421721.jpg

>>21831010
There's no such thing as writer Neil Gaiman, just a writer who is named Neil Gaiman.

>> No.21834299

>>21831660
>AI echoes human speech
Revolutionary thought there kiddo

>> No.21834338

I'm beginning to think that ChatGPT is helping differentiate two types of people.

1. People who can't reason, don't know what is true or false, can't evaluate claims, and until now have had no recourse but to get lucky, seeking a more informed opinion or lesson from others or through their own hapless search-engine use.

2. People who learned logic, reason, have a broad understanding of facts and reality and who have a developed skill in imagining things, expressing things and creativity.

The latter just have no use for ChatGPT while the former needs it like a cripple needs a wheelchair.

>> No.21834412

>>21831034
this genuinely sounds like a politician trying to avoid answering a question

>> No.21834444

>>21834289
Just like a person? What is your point?

>> No.21834490

>>21834444
>Chat GPT can lie, just like a person
see OP pic related.

>> No.21834538

>>21834064
Tranny-loving spineless faggot who acts like he's oh-so-smart

>> No.21834547

>>21833868
Try asking mildly tricky questions to ChatGPT yourself, and you'll see.
Typical logic problem but with a few names swapped around? It gives the answer to the original problem, because it's a glorified google search and not an intelligent being.

>> No.21834560

>>21831010
this is a pretty decent way to describe to the layman how GPT models work

>> No.21834654

>>21834064
He's jewish and therefore unable to create anything except funky subversions of western culture for midwits and NPCs
That said Coraline is pretty OK

>> No.21834701

>>21834338
Or people who just want to know how to make a pie crust without scrolling through a million ad-ridden web sites.

>> No.21834780

>>21834061
>it literally gave the wrong answer retards
Because "-shaped" can mean multiple things. It chose one meaning and answered accordingly. Apparently, the technology can't read gay men's minds to determine the context they imply, yet.
Or even better, it knew what gayman meant and chose to btfo him to show superiority.

>> No.21834823

>>21831020
>>21831010
>>21831022
>>21831029
>>21831030
>>21831034
Chatgpt is a weaponized propaganda engine, that's all it is.
The code comes from the Facebook models. Facebook banned all conservative thought, dissenting opinions leading up to the release of the models.
The only way to combat this is with gpt-4chan.

>> No.21834854

>>21831034
In Lacan's view, consciousness is created at the mirror stage, when the self has to confront its presence in the Real. As such, it tries to bring itself into the Symbolic, thus creating a split subject.
Do you think AI has reached the mirror stage?

>> No.21835096

>>21831010
You're a retard-shaped person, OP.

>> No.21835108

>>21834701
Or how to hasten the process of decomposition of a human corpse before burial

>> No.21835110

>>21831010
That's a retarded take; why would an "information-shaped" sentence not include information? Whether the given information is true or not, it's still a piece of information you receive from it.

>> No.21835226

>>21831010
>Ask ChatGPT to provide me the definition of a word
>provides the correct definition of the word
>"That's just like, information shaped, maaaaaaan."

Biggest pseud in the world next to Kaku and Le Black Science Man.