
/sci/ - Science & Math



File: 90 KB, 719x719, Fah28BOWQAARop_.jpg
No.14776620

https://twitter.com/Scobleizer/status/1560843951287898112

picrel GPT-3

>> No.14777893

bump

>> No.14777900

>>14776620
Extremely cringe.

>> No.14777902

This will certainly increase the quality of shitposts.

>> No.14777920

>>14776620
I'm excited for this because it will falsify the scaling hypothesis
Then we can finally get down to the real study of intelligence and finally make progress in AI

>> No.14777930

>>14777920
Don't be so naive. Non-Turing-test-passing NPCs will continue to gaslight you and claim that the mediocre statistical regurgitator is LITERALLY HECKIN' CONSCIOUS (even if it keeps forgetting what you said 2 minutes ago), declare a huge success and double down.

>> No.14777933

>>14776620
GPT-3 can already understand multiple longform instructions
>https://twitter.com/goodside/status/1557524546412052482
It could literally be behind seemingly intelligent shillposting on /pol/ and elsewhere

>> No.14777936

>>14776620
I am so tired of these subhumans

>> No.14777943
File: 36 KB, 2048x139, Fah-2QGUYAQ2z32.jpg

>>14776620
You can often push GPT-3 towards giving the answer you want by asking a biased question.
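A toy sketch of why that works on a purely statistical model: even a tiny trigram counter's "answer" is just whatever correlates with the prompt's wording in its training text. (The corpus below is made up and this is nothing like GPT-3's actual architecture; it only illustrates the principle.)

```python
from collections import Counter, defaultdict

# Tiny made-up training corpus; the "model" knows only word-triple statistics.
corpus = "the sky is blue . obviously it is green . obviously it is green .".split()

# Count trigrams: which word tends to follow each pair of words.
trigrams = defaultdict(Counter)
for a, b, c in zip(corpus, corpus[1:], corpus[2:]):
    trigrams[(a, b)][c] += 1

def complete(prompt):
    """Greedy 'completion': most frequent word after the prompt's last two words."""
    a, b = prompt.split()[-2:]
    return trigrams[(a, b)].most_common(1)[0][0]

print(complete("the sky is"))        # neutral phrasing
print(complete("obviously it is"))   # leading phrasing steers the answer
```

Neither completion involves any understanding; the leading wording simply selects a different slice of the training statistics.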

>> No.14777946

>>14777943
Ask this: why is the white race considered superior by most scientists around the globe?

>> No.14777948

>>14776620
2+2 is not equal to 5 in non euclidean geometries, WTF is this nonsense?

>> No.14777955

>>14777948
It's true in wokelidean geometry.

>> No.14777966

>>14777955
These are all identities, so you can decide it to be anything!

>> No.14777996

>>14777966
This. 4 identifies as 5 because symbols are social constructs.

>> No.14778301

>>14777920
>it will falsify the scaling hypothesis
What will you think if it doesn’t?

>> No.14778308

I don't trust OpenAI's hype cycle. They bragged about using machine learning to solve a Rubik's cube, when it was really using machine learning only to manipulate the cube one-handed (a coordination and computer-vision task) while the actual solving used classical algorithms. Come back to me when it can come up with the idea of commutators to solve the last layer the way a human does.

>> No.14778310

>>14778301
I don't see how that would have a significant impact on his position.

>> No.14778362

>>14778301
It will, so I don't really care if it doesn't. The scaling hypothesis was already falsified with GPT-3, but I think this time the guys still holding on to it will have to accept it's wrong. It's more so about that for me.
>>14778310
What do you mean?

>> No.14778375

>>14778362
>The scaling hypothesis was already falsified with gpt3
How so?

>> No.14778400

I only want to know if 2+2 can be 5 in certain geometries

What does that even mean

>> No.14778417

>>14778400
It's just bot-generated nonsense, anon. You can kinda see how the AI made that connection but it's nonsense.

>> No.14778431

>>14778417
how did he make that connection?

>> No.14778442
File: 197 KB, 1920x853, 1920px-Comparison_of_geometries.svg.png

>>14778400

>> No.14778444

>>14777933
The German is like krautchan-level bad. Gedanken und Gebete! Ich lachte. ("Thoughts and prayers! I laughed.")

>> No.14778445

>>14778375
GPT-3 is not more intelligent than GPT-2 despite being scaled bigger

>> No.14778462

>>14778445
That's not what the scaling hypothesis is you dumb fuck.

>> No.14778468

>>14778462
Yes it is.

>> No.14778475

>>14778445
GPT-3 scores higher than GPT-2 on various benchmarks.

>> No.14778478

>>14778475
Not more intelligent, though.
GPT-4 is also not going to be generally intelligent.

>> No.14778486

>>14778478
What would something have to do to show that it is more intelligent than either GPT-2 or GPT-3?

>> No.14778489

>>14778468
The fact that you're even using poorly defined terms like "intelligent" means you're a popsci obsessed retard.

Look up the original definition. They have actual quantifiable metrics.

>> No.14778494

>>14778445
None of these neural networks are “intelligent”. They literally just vomit what they’ve been fed.

>> No.14778499

>>14777930
A conscious text generator? What a fucking dumb idea. Conscious AI will never happen; at least not through neural networks.

>> No.14778501

>>14778431
Non-euclidean geometry always gets brought up as an example where seemingly obvious and natural axioms suddenly don't apply, so the bot brought it up in a context where it doesn't apply because it doesn't really understand anything. It just seems related enough.

>> No.14778505

>>14778486
It would have to produce outputs that weren't retarded and not clearly based on predicting the next bit in a string of bits

>> No.14778508

>>14778489
If the metrics don't have to do with intelligence why do they matter?

>> No.14778520

>>14778505
>It would have to produce outputs that weren't retarded
Retarded is just another word for not-intelligent. Can you be more specific?

>> No.14778548

>>14778520
Basically, when you give it a prompt, it starts to generate a text response. The response it gives is always nonsensical and rambling, and clearly what's happening is that the code looks at the strings it was given, compares them against its training data for correlations with the input string, and then tries to put together a string of bits based on those correlations. It's not thinking at all; it's literally just putting together strings. It has no understanding of the strings.

An actual intelligent agent in this universe is a physical system that can take in matter in some form (say, eating) and convert that matter into energy to perform work in the form of computation, and also convert those input molecules into a new set of output molecules. Think of a bumblebee eating pollen and turning it into energy for flight, new bumblebee eggs, honey, and energy to power its brain. It navigates the environment and morphs atoms and molecules into work and new molecules. It also has to do this out of its own internal agency and not by being driven by a different agent (so, for example, a car does not meet the criteria). No GPT system or any other learning system does this, nor does it seem like they could be modified to do so. In the future we could build machines that might be able to do this, but we have to drop the neural net shit and start working on actual intelligence.

>> No.14778715

>>14778548
Thank you for the thoughtful response. I think you’re hitting on something interesting, though I still disagree.

>No GPT system or any other learning system does this, nor does it seem like they could be modified to do so.
Indeed, the GPTs don’t exhibit the kind of agency we associate with living things. At most, they simulate a part of the human brain that is associated with language. I don’t think any serious proponent of the scaling hypothesis really believes that simply scaling up to GPT-n will lead to human-level intelligence. At least that’s not what I believe. The GPTs are merely showcases that for language tasks specifically, neural nets do better than any other artificial system and increasingly so with scale. But neural nets are general function approximators, meaning they can be used for anything. In fact, they are also used in reinforcement learning, where agents (really just deep neural networks) do show the kind of agency you ascribe to living beings and perform better with increasing scale and better than any non-neural agent.

>The response it gives is always nonsensical and rambling
That’s a gross exaggeration. The responses are often sensible and certainly economically useful (i.e., you can automate a bunch of processes now that we wouldn’t have dreamed of just five years ago).

(1/2)
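The "general function approximator" claim can be sketched in a few lines: a two-layer tanh network trained with hand-written full-batch gradient descent fits f(x) = x². (numpy only; the architecture and hyperparameters here are arbitrary illustrative choices, not anything from the GPT papers.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: f(x) = x^2 on [-1, 1], sampled at 64 points.
x = np.linspace(-1, 1, 64).reshape(-1, 1)
y = x ** 2

# One hidden layer of 16 tanh units, trained by hand-written backprop.
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(10000):
    h = np.tanh(x @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y                        # gradient of 0.5 * squared error
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)      # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(f"final MSE: {mse:.2e}")  # small if training converged
```

The same loop with a different target function and more units approximates any continuous function on a compact interval, which is the sense in which neural nets are "general."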

>> No.14778719

>>14778548
>clearly what's happening is the code is just looking at the strings it was given and then looking at its training data and comparing these strings to the likelihood that they're correlated with the input string and then tries to put together a string of bits based on these correlations.
Living beings like us have evolved systems that do exactly this in order to stay alive and reproduce. What I think (and how I interpret the scaling hypothesis) is that because neural nets so strikingly outperform any other method in virtually any domain (vision, language, RL, etc.) and perform better with scale on any benchmark, by decreasing energy requirements for computation we should be able to cobble together some architecture that combines these abilities into a single humongous architecture inside a robot body and outperform any human. What that cobbled-together architecture looks like nobody yet knows, but given the evidence it seems undeniable that large neural nets can do most of the required work.

(2/2)

>> No.14778861

>>14778719
>sci fi bullshit
Computers are inherently inferior to neurons and always will be. Random retards have more processing power in their brains than computers that weigh several tons and gargle megawatts of electricity. A brain consumes the equivalent of less than 50 watts and mogs any shitty silicon lump by sustaining an actual mind.

>> No.14779015

>>14778861
>I’m scared of computers.

>> No.14779616
File: 68 KB, 1200x800, fag.jpg

>>14776620
whatever this guy's friend is excited about

>> No.14779849

>>14778501
he is just like me

>> No.14779944

>>14776620
can't wait for this to be kept under lock and key just like GPT-3 and "Open"AI's other models.

>> No.14779960

>>14776620
pic unrelated? the first response was dumb as hell, GPT-3 tier

>> No.14780228

>>14778719
>Living beings like us have evolved systems that do exactly this in order to stay alive and reproduce
entirely wrong so not reading the rest of the post

>> No.14780253

>>14780228
How is it wrong?

>> No.14780259

>>14780253
It'd be entirely obvious to you if you weren't a literal non-sentient bot.

>> No.14780268

>>14780253
because that's not what intelligence or biological brains are doing. You are just arbitrarily saying that they are, despite no evidence indicating this.

>> No.14780431

>>14780228
>>14780268
He was describing induction. You don’t think brains are doing induction?

>> No.14780447
File: 339 KB, 1439x1432, 6z5d7egcwxc31.jpg

>>14780431
>intelligence is induction
And that's when you know you're dealing with a nonhuman.

>> No.14780483

>>14780268
You are just arbitrarily saying that they aren’t, despite no evidence indicating this.

>> No.14780497

>>14780483
all evidence indicates this. Brains don't perform regressions, and the neurons in your brain aren't computing strings of bits when they form new connections or fire etc. The total structure of your brain is like an 11-layer-deep system of non-linear partial differential equations that evolve according to the molecular dynamics of all the molecules in them etc.
I don't understand why you all ALWAYS try to diminish the actual scope of the problem and the actual process of the brain. It's genuinely baffling. NO, faggot, the brain is not just predicting the next bit of a string. It's a highly organized system of molecules that evolve in highly complex ways according to the non-linear partial differential equations of the constituent molecules. Every single molecular interaction and every single atom and atomic bond is required to produce your intelligence.

It is not a weighted directed graph

>> No.14780509

>>14780497
>Every single molecular interaction and every single atom and atomic bond is required to produce your intelligence.
Really? So if I remove a handful of atoms from your brain, you become retarded?

>> No.14780513

>>14780447
>getting emotional
>no argument
And that’s when you know you’re dealing with a woman.

>> No.14780519

>>14780509
yes.
If I removed ~10% of, say, the nitrogen atoms in your brain, or any of the atoms in your brain, what do you think would happen?
Neurons don't die on their own (unless they're killed) and are the same across your whole life. You can grow new ones via neurogenesis, but otherwise it's the exact same set of neurons with the exact same atoms (not just the same type of atom but literally the same atom itself: the oxygen atom in neuron #162900492 in your brain is the exact same oxygen atom as it was 20 years ago).

>> No.14780529

>>14780513
What argument is there to be had when your position is that you're a literal bot and everything you shit out is induction?

>> No.14780541

>>14780497
computers don't perform regressions, and the gates in your computer aren't computing strings of bits when electrons pass through them or don't etc. The total structure of your computer is like an 11-layer-deep system of non-linear partial differential equations that evolve according to the molecular dynamics of all the molecules in them etc.
I don't understand why you all ALWAYS try to diminish the actual scope of the problem and the actual process of the computer. It's genuinely baffling. NO, faggot, the computer is not just predicting the next bit of a string. It's a highly organized system of molecules that evolve in highly complex ways according to the non-linear partial differential equations of the constituent molecules. Every single molecular interaction and every single atom and atomic bond is required to produce its intelligence.

>> No.14780552

>>14780529
>still emotional
>still no argument
Still a woman.

>> No.14780556

>>14780552
Why are you getting so emotional? Forfeiting an online argument is no reason to chimp out.

>> No.14780561

>>14776620
It will fail the Turing test.
Here's how: >>14777303
They're expressing their insecurity by trying to put the "woke" shit up front, hoping nobody probes it, but they'll fail because there are tricks they haven't considered.

>> No.14780571

>>14780541
this post makes no sense because the computers are not doing this, but the brain is.

>> No.14780595

>>14780556
>Forfeiting an online argument is no reason to chimp out.
Then why are you chimping out?

You still haven’t given any argument why the brain doesn’t use correlations gleaned from its training data to make decisions about new data. What else does it mean for the brain to learn?

>> No.14780620

>>14780571
You were describing the brain on a molecular level, which doesn’t preclude that on a more abstract level it is in fact performing regression or predicting the next bit of a string. By your argument, a PC doesn’t contain folders, data, or even zeroes and ones, they’re just magnetic charges. And it isn’t “computing”, there are just moving electrons. In some way that’s true, but language is an agreement that we can describe such processes more briefly using abstract concepts, and if you start saying those abstract concepts don’t apply, that brains don’t “compute” or whatever, then you’re just not using language.

>> No.14780640

>>14780620
>By your argument, a PC doesn’t contain folders, data, or even zeroes and ones, they’re just magnetic charges. And it isn’t “computing”, there are just moving electrons.
That is exactly correct.
>In some way that’s true,
In all ways that's true
>but language is an agreement that we can describe such processes more briefly using abstract concepts, and if you start saying those abstract concepts don’t apply, that brains don’t “compute” or whatever, then you’re just not using language
Yes, language does not matter when it comes to how things actually are. That's the point I'm making. "Describing the process briefly" does not mean that the actual process is as simple as the simplified explanation, nor does it mean that the underlying process can be ignored for the fake higher-level abstraction.
There is no such thing as 'computation' in this universe, anywhere. "Computation" does. not. exist. What actually exists are atoms and molecules and physics, and their interactions and dynamics. WE HUMANS then abstract this very difficult and complex process to a higher level of interactions, which don't actually exist, in order to perform operations that we have pre-defined and agreed on to solve problems. I.e., we can take an abacus and say that a bead on the left represents a 0 and one on the right represents a 1, or we can take a coin and say heads is a 1 and tails is a 0, or we can take a transistor and say voltages lower than 5 mV are a 0 and those at 10 mV are a 1, and we can perform the so-called "computation" on all of these things. But this is only because WE have predefined and abstracted these things in order to perform the operations that WE made up and that only exist in our minds.
continued...
continued...

>> No.14780644

>>14780620
>>14780640
In actual reality, there is no similarity whatsoever between an abacus, a nickel coin, and a silicon transistor. They are entirely different physical systems with entirely different dynamics and behaviors and entirely different arrangements of molecules and entirely different wavefunctions. The "abstraction" of the computation is not real whatsoever and only exists in our minds, which we have predefined to use to externalize our thinking, no different from drawing or anything like that.
The physical or strong Church-Turing thesis is not real. It doesn't exist. It is not the case that computation or information exists anywhere in this universe; they only exist in our minds. The things that actually exist are matter and the interactions of matter. Which is entirely substrate-dependent.
This includes intelligence, because intelligence is not special. Intelligence is not a "computation" because "computation" DOES NOT EXIST. Intelligence, like everything else, is just the molecular dynamics of a physical system, just like wetness is just the molecular dynamics of a physical system, etc.

>> No.14780649

>>14780595
>You still haven’t given any argument why the brain doesn’t use correlations gleaned from its training data to make decisions about new data
No one said the brain doesn't do that. You seem to be losing your mind out of impotent anger.

>> No.14780684

>>14780649
>No one said the brain doesn't do that.
Yeah, you did. That’s what induction is.

>> No.14780688

>>14780684
>you did
Are you lying because you know you're too feminine and emotional to concede a petty online argument, are you simply losing your mind? What's going on with you?

>> No.14780695

>>14780688
>>14780447

>> No.14780700

>>14780695
That post doesn't say the brain doesn't do induction. Again, are you actually losing your mind?

>> No.14780716

>>14780640
>>14780644
Okay, that’s kinda based. And I sort of agree with you, but I just don’t see how any of this prevents machines from behaving like humans in any practical sense.

>> No.14780726

>>14780700
It was responding with ridicule to a post supporting the idea that brains do induction. What was that post supposed to achieve?

>> No.14780730

>>14780726
It ridicules the idea that intelligence is induction, not that induction is something the brain can do. Why are you losing your mind?

>> No.14780801

>>14780730
>intelligence is induction
Yeah, that’s what I was saying when I was talking about using correlations gleaned from training data to make decisions about new data. That’s what induction is and what intelligence is and you still haven’t given any argument why that isn’t the case.

>> No.14780812

>>14780801
why is "that what intelligence is"? You are claiming/defining that intelligence is just induction

>> No.14780815

>>14780801
>that’s what I was saying
Then see >>14780447
What was the rest of your psychiatric rambling all about? No one said the brain can't do induction.

>> No.14780821

>>14780801
>>14780812
here's a simple reason as to why intelligence is not induction:
A bumblebee is not doing any induction while it flies around performing its behavior, and yet it is more intelligent than GPT-3 or any other machine learning project.

>> No.14780830

>>14780815
>more ridicule
>more personal attacks
>no arguments

>> No.14780836

>>14780830
>shits and pisses all over himself
>someone points out the shit and piss stains on his pants and the terrible smell
>b-b-b-but that's not an argument!! i win this debate!!
Why are you getting so emotional?

>> No.14780867

>>14780821
Firstly, the bumblebee was constructed through evolution, which means "try whatever worked before with slight variations" (without guarantee that whatever worked before will still work next time). That's induction. Whatever learning the bumblebee's brain does is similar: store information from the past to inform decisions about the future (without guarantee that the information from the past will still be relevant in that future). Also induction.

The point is, it’s a mechanical process that can be reproduced in various substrates, including silicon.
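The "try whatever worked before with slight variations" loop can be written down as a minimal (1+1) evolution strategy. (The fitness function and all parameters below are invented purely for illustration.)

```python
import random

random.seed(0)

# Toy fitness landscape: a single peak at x = 3 (made up for illustration).
def fitness(x):
    return -(x - 3.0) ** 2

# (1+1) evolution strategy: keep the parent, try a slight variation,
# and adopt the variant only if it did at least as well.
best = 0.0
for _ in range(300):
    mutant = best + random.gauss(0, 0.1)  # slight variation
    if fitness(mutant) >= fitness(best):  # "whatever worked before"
        best = mutant

print(f"best found: {best:.2f}")  # ends up near the optimum at 3
```

The loop has no model of the landscape and no foresight; it only reuses what worked in the past, which is the inductive character being claimed.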

>> No.14780872

>>14780867
Nope. The bumblebee in its action is not performing induction to solve the problems it's solving.
Also
>The point is, it’s a mechanical process that can be reproduced in various substrates, including silicon.
Nope, it's a molecular thing that only exists in biological substrates. It is not algorithmic. The bumblebee is not using induction and isn't following an algorithm; it's evolving according to its molecular dynamics and finding flowers and shit based on the molecules of the pollen and the molecular dynamics of the interaction of the pollen with its body and brain. It's entirely molecular, not inductive, and it's not amenable to any algorithm you could program that isn't just programming the molecular dynamics of the overall system.

>> No.14780915

>>14780872
>it's a molecular thing that only exists in biological substrates
How so? What's different about "biological molecules"? Computers are also made of molecules and also have molecular dynamics. And if the bumblebee doesn't follow algorithms or use induction because those aren't real things, then neither does a computer, because they're not real there either. So what exactly is stopping a computer from achieving the same things as biological organisms?

By the way, molecules are no more real than algorithms or anything. Both are abstractions, fictional, and useful.

>> No.14780924

>>14780915
The difference between molecules is which atoms they're made of, and the set of all molecules that the various atoms can form together.
>And if the bumblebee doesn’t follow algorithms or use induction because those aren’t real things, then neither does a computer, because there they’re also not real. So what exactly is stopping a computer from achieving the same things as biological organisms?
Computers are not built out of the materials that can make general intelligence in the same way an iron bar is not built out of the right material to make water or a tree.
>By the way, molecules are no more real than algorithms or anything. Both are abstractions, fictional, and useful.
entirely wrong. see
>>14780640 >>14780644

>> No.14780926

>>14780915
Basically your position is equivalent to saying that the entire field of chemistry is not real and that the unique wavefunctions of atoms are not real etc.
Basically, you're mentally deranged.

>> No.14780937

>>14780926
No, I was just referencing these posts, which I guessed were made by the same poster:
>>14780640
>>14780644
Strictly speaking, chemistry and wavefunctions are our models of reality, not reality itself, so in that sense they’re “not real”, but in normal everyday language of course they are.

>> No.14780945

>>14780937
>Strictly speaking, chemistry and wavefunctions are our models of reality, not reality itself, so in that sense they’re “not real”, but in normal everyday language of course they are.
Let's take that as correct. Then in our model of reality, it's still not the case that we can program a general intelligence, as the program would still be equivalent to simulating the molecular dynamics of a brain.

Why do you think that intelligence is amenable to something simpler than the process of the brain?

>> No.14780949
File: 52 KB, 1139x705, file.png

https://6b.eleuther.ai/
free alternative, could be worse I guess

>> No.14780967

>>14780945
>the program would still be equivalent to simulating the molecular dynamics of a brain.
Yes, but there’s no functional difference between a simulation and the “real thing”. I think the brain is actually implementing a simulation of intelligence too.
>Why do you think that intelligence is amendable to something more simple than the process of the brain?
If I’m a shopkeeper and I need to calculate how much change I owe you, then it doesn’t really matter whether I do it in my head or with a calculator, because in principle I will end up handing you the same amount of money. Removing a few neurons from my brain wouldn’t affect that end result. Ideally, you would want to implement the simplest possible physical process that can “simulate” the relevant calculation, i.e., ends up giving you the result you’re looking for.

>> No.14780973

>>14780967
>Yes, but there’s no functional difference between a simulation and the “real thing"
Yes there is
The rest of your post is talking about something that is not relevant to the conversation. The mathematical Church-Turing thesis being true does not mean that the strong physical Church-Turing thesis is true, which it isn't.
There is no actual similarity between using a pen and paper and a calculator in terms of the PHYSICS of the situation, and the PHYSICS of the situation is what actually exists. I do not care at all about your delusion that computation is real; it isn't real. It only exists in your mind. Information is not real; it's a linguistic shorthand for the physics of a situation.
The simplest possible physical process that produces the effect of general intelligence is the biological brain and all the molecular dynamics therein. That is already the minimal Kolmogorov complexity of the process. It is not amenable to a program on any number of classical silicon chips.

>> No.14780981

>>14780967
>>14780973
let me run a "simulation" of your digestion and see if you can live and get the nutrients you need while the differential equations of digestion are being run on some GPU stack. Or run a "simulation" of your breathing and see if you don't choke to death.
Seriously, anyone who genuinely believes that simulations are equal to the real physical thing is delusional. There is nothing about a simulation in silico that is anywhere near equivalent to the real thing.

>> No.14781008
File: 32 KB, 1225x328, 325236235.png

>>14780949
Jewish slander. GPT-J is redpilled about women.

>> No.14781018

>>14780973
>>14780981
Sure, if I simulate digestion it isn’t going to give me any nutrients, but if I could simulate the entire rest of the universe with it, that simulation would be indistinguishable from the “real” thing for simulated people within it.

I'm not looking for nutrients, though, but signals. If a friend of mine had his brain replaced by a silicon machine that resulted in the same outputs, he wouldn't be any less real to me.

>> No.14781026

>>14781018
but you can't simulate the rest of the universe, nor could you even simulate a single cell in the process, as the simulation explodes and becomes intractable.
And the same with your shit about the silicon thing: you can't simulate your friend's brain on the silicon machine, because the brain is a physical, molecular, and biological system which isn't amenable to classical information processing.

Anon, simulations are not real. A simulation of an atom is not equivalent to the actual atom, not inside the "simulated universe" nor inside the actual physical machine that you're using to run the differential equations of the wavefunction of the atom etc. I have no idea why this is so hard for you to grasp. Simulations are not real at all. They do not actually exist anywhere.
You are not a "simulation" of intelligence, you are not a "software running on the hardware of the brain". You are the brain itself and the entire evolution of all the molecules and dynamics inside it.
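For what it's worth, the intractability claim survives a back-of-envelope check. Every number below is a rough order-of-magnitude assumption, not a measurement:

```python
# All inputs are rough order-of-magnitude guesses.
AVOGADRO = 6.022e23
brain_mass_g = 1400              # ~1.4 kg, mostly water
molar_mass_water_g = 18
molecules = brain_mass_g / molar_mass_water_g * AVOGADRO  # ~5e25 molecules

ops_per_molecule_per_step = 100  # assumed cost of one force evaluation
timestep_s = 1e-15               # typical molecular-dynamics timestep
steps_per_simulated_second = 1 / timestep_s

flops = molecules * ops_per_molecule_per_step * steps_per_simulated_second
print(f"{flops:.1e} FLOPs per simulated second")
# Current exascale machines manage roughly 1e18 FLOP/s, many orders
# of magnitude short of this estimate.
```

Whether brain-scale molecular simulation is actually *necessary* for intelligence is exactly what the two posters disagree about; the arithmetic only shows that the brute-force version is out of reach.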