
/sci/ - Science & Math



File: 27 KB, 500x313, tumblr_pk8riy9gaP1r42s8u_og_500.jpg
No.10239966

https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.

(…)

Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?

>> No.10239967

>>10239966
what is
neural
networks

>> No.10240007 [DELETED] 

>>10239966

Oh look another thread that will inevitably devolve into non-mathematical AI/consciousness/philosophy

Okay I'll bite, but sage.

>Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

What makes you think there isn't an algorithmic representation of the brain's operations? What makes you think any part of that statement isn't also true of a brain?

Also, I won't even begin to describe the reasons that
>>10239967
is irrelevant.

>> No.10240029

>>10239966
Just because our brains don’t work exactly like electronics doesn’t mean that we don’t process information in some general sense

>> No.10240031

>>10239966
>They really have physical memories
Why do people with physical brain trauma lose memories?

>> No.10240052

What a horrible article. Restating your assertion over and over and over does not constitute an argument.

>> No.10240058

>he doesnt understand that you can approximate symbolic processing through distributed representations

>> No.10240062

>>10239966
What meaningless drivel. Process doesn't denote a specific thing. We know the brain receives information, runs it through several areas, formulates a response, bounces that around a couple of areas and actions it. That's a process.

>> No.10240076

I once knew this CS major who finished all his courses, became really disenchanted with STEM, and then finished an English degree in a year after all his CS requirements were filled. This article reminds me of him and all the anti STEM gobbledygook you find in the humanities. Now he’s getting an English PhD

>> No.10240089

>>10240076
A person gets a degree, then gets another degree.

Somehow you, who can't even get one, think that makes him less qualified?

cope
o
p
e

>> No.10240094

Haven't read the article yet but is it going down the "field of consciousness" approach to human cognition? Or is it discussing something else?

>> No.10240099

>>10240089
I have a degree, bucko

>> No.10240102

>>10240094
It's not discussing much of anything. There are a few shallow references, but it mostly just keeps repeating that brains are not computers. The main arguments seem to be "brains are not like computers because brains and computers are different" and "brains are not like computers because that's preposterous".

>> No.10240120

>>10240099
do you have two?

>> No.10240206

>>10240120
I could have 10 degrees and it wouldn’t change the antiscientific worldviews of some academics

>> No.10240300

>>10240007
To me it's clear when you look at the brain's hardware that it doesn't use algorithms or process things like a computer does.
Computers are designed; brains self-organize, like e.g. a snowflake.

snowflakes don't use symbols.

>>10240062
This is not actually how the brain works at all imo.


And for everyone: the long, shitty article is about enactivist cognition. We don't process incoming information into symbols but through how we act in the world.

>> No.10240310

>>10240300
oh and to clarify, i endorse embodied cognition but think the article's shit.

>> No.10240347

>>10240102
>>10240094
>>10240076
>>10240062
>>10240058
>>10240052
>>10240029
>>10240031
you have autism. the article is arguing for an even more physicalist account of consciousness that's entirely biological and based in mechanics, not in information and algorithms. The idea is to move away from abstraction towards purely embodied descriptions of "consciousness" or "thought" or "memory", which are metaphors for what an organism is doing when interacting with itself and its environment. The idea is that information is a much more useful metaphor for computing than it is for living systems. Similarly in genetics we use "information" when we really mean macromolecules and their geometry, spatial relationships and the physical systems that they constitute.

The problem is that CS and information theory people don't understand that there are literal physical correlates which yield more explanatory power as to the why and how of the behavior and structure of these biological systems like the CNS and the genome than do representational and abstracted notions of these systems like "information". You're not adding anything to anyone's knowledge by saying algorithm; it doesn't help us understand the structure and mechanics of the brain or the genome, using biochemical and physical descriptions does. it's a near 1:1 model of what's happening for our purposes and doesn't need to be translated again when we move away from theory and engineering applications (which also need to be translated into the specifics of the machinery being assembled using that theory).

I say you have autism because you are the one's who don't ask "what is information" "does it have a physical basis" "what structures does it correlate with" "what is an algorithm as an embodied process". This is more interesting to scientists than to engineers because it has actual potential to reveal why a system behaves the way it does rather than merely using symbols to derive general heuristics for mimicking the system

>> No.10240356

>>10240310
>>10240300
Based. I was going to respond to a few of the idiots ITT, until I found you, at least one person that seems to know what they are talking about.
I want to read some contemporary radical approaches to cognition. I don't know much about psychology, but I am very into philosophy and the life sciences; CS Peirce and to a lesser extent biosemiotics are the backbone of my current understanding of cognition. I'm going to read Gibson and Varela after reviewing Mereology and Location. I don't know what to think about Varela, I don't understand what is supposedly animating Autopoiesis. It seems like it's using dynamical systems to cop out of explaining the teleological nature of life. I don't see the logic in it; now Terrence Deacon did a good job. I'm a big semiotics guy. Gibson's Affordance theory seems closer to the truth. Then again I'm not sure how affordance is claimed to work in Ecological Psychology.
What do you think about Ecological Psychology?
Reading recommendations?

>> No.10240375

>>10240347
Sorry but noone will gonna read all that shit wall of text

>> No.10240382

>>10240375
TLDR: information is a metaphor, at a certain point especially when dealing with living systems you have to go beyond it. the same problems that plague genetics are present in neuroscience, where physical explanations are superior to one's from information science. Author is not arguing for woo-woo consciousness, the complete opposite. Memory is just a change in brain structure and activity, no information is stored, stimuli get paired with one another and then change is elicited as a positive function in the brain. Structure and nature of the mind is entirely dependent on past brain states and the context of the organism and its genome, cannot transfer this to a different medium or platform. That's the core of the author's thesis. They're arguing for embodied consciousness moving away from information theory.

>> No.10240396

>>10240347
I completely agree with you but that’s not really what the article said. Also I think the author could be the autistic one. He’s just saying scientists lose track of what they’re studying because of metaphors, but I think they all know that the brain isn’t made of silicon.

>> No.10240417

>>10240300
Yes, but your opinion is wrong. The brain literally runs information through loops: parts adjust, communicate, and pass it along again until something is executed.

>> No.10240581

>>10240356
havent read any of them really. just know gibson, only heard of some of the others. downloaded a book on radical embodied cognition by anthony chemero that ive read a few pages of, thats all, plus a couple of articles and 'pedia pages.
Dynamical systems is a big part of embodied cognition though and imo embodied cognition is actually antiteleological. Isnt semiotics contradictory to embodied cognition?
I dont know anything about ecological psychology desu.

>> No.10240689

>>10239966
oh my god someone actually posted this faggot's drooling article on /sci/. i fuckin hope it's for the purpose of "look at this fool". I'm not going to bother writing out all the refutations of this guy's babble. "brain don use *algorivm* to catch ball" etc. you can google 'epstein brain retard' or some such.
the main mistake if i remember is he thinks (as many people on sci do) the only way of 'representing data' is like, a bitmap
and the only way of 'processing data' is von Neumann architecture

>> No.10240739

>>10240689
The influential Hopfield ANN paper describes a dissatisfaction with views of the brain as something that works with algorithms and operations like a computer. The brain's hardware doesn't seem to work like that. In the paper he says one difference is that computers need to be programmed; the brain isn't, so it cannot work the same way.
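For anyone who hasn't read it, the contrast can be illustrated with a minimal Hopfield-style associative memory: a pattern is "stored" by a Hebbian weight rule rather than by writing an explicit program, and recalled by letting the dynamics settle. This is just a rough sketch, not the paper's exact model, and the patterns below are made up purely for illustration:

```python
import numpy as np

# Minimal Hopfield-style associative memory (illustrative sketch only).
# "Storing" a pattern means adjusting weights with a Hebbian rule; nothing is
# programmed in the sense of writing an explicit lookup table.

def store(patterns):
    """Build a weight matrix from +/-1 patterns via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)          # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    """Run the network dynamics from a (possibly corrupted) cue until it settles."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(w @ s)          # synchronous update; sign() drives states to +/-1
        s[s == 0] = 1
    return s

# Two arbitrary 8-unit patterns (hypothetical data, just for demonstration).
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
w = store(patterns)

noisy = patterns[0].copy()
noisy[0] *= -1                      # flip one unit to corrupt the cue
print(recall(w, noisy))             # typically settles back onto the stored pattern
```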

>> No.10240767

When they invented telephone exchanges, everyone said the brain was like a telephone exchange.

When they invented holograms everyone said the brain was like a hologram.

When they invented computers, everyone said the brain was like a computer.

Is this telling you something?

>> No.10240854

>>10240767
...the brain is like a fidget spinner? Some kind of blockchain?

>> No.10240884

>>10240854
more like a heteroclinic cycle

>> No.10240886

>>10240347
>The problem is that CS and information theory people don't understand that there are literal physical correlates which yield more explanatory power as to the why and how of the behavior and structure of these biological systems like the CNS and the genome than do representational and abstracted notions of these systems like "information".

I love this sentiment so much and it is part of why I left cog sci (and I did mathematics research in cog sci - it's possible to do that without being a computationalist!). Epstein really is right about the complete inability for people in these fields to speak without reference to IP, which is a dangerous habit for a scientist. I would like to buy you a beer or tea or whatever. The response articles were appropriately daft, barely addressing his points.

>> No.10240914

>>10240739
>in the paper he says computers need to be programmed
yes. because he doesn't know what the fuck he's talking about. this is not only wrong but irrelevant. it's like saying
>the brain runs on gluten, computers run on electricity. therefore it is stupid to talk about the brain handling information
and as an aside it's very lax and cringe to say 'the brain' all the time. bees have brains. an ant has a brain.

>> No.10240916

>>10239966
So the brain is like quantum mechanics then?

>> No.10240918

If I'm not mistaken, memory recall can be induced via electrode stimulation. So if you memorized a 12 digit number, you very much are "accessing information". Computing a new number from that memorized 12 digit number could be called "information processing".

>> No.10240927

>>10240918
>no dude thAt's noT inFormaTion becAuse you diDn't enCode it into binAry nEceSsarily

>> No.10240931

>>10240914
I don't think brains are computers because brains don't perform operations on information, they just look like they do.

>> No.10240938

>>10240918
Taking the analogy further, memories are stored in a different part of the brain than the part that does "higher order thinking". For instance, the 12 digit number could be stored in region A of the brain. You can have two scenarios, one where someone asks you to add 1 to the memorized number, and another scenario where someone asks you to draw a similarity between the memorized number and some other given number. In both scenarios, the same memorized number is recalled (resulting in the same firing pattern in brain region A both times), and the number is manipulated by some other mental circuitry, presumably in the frontal lobe. This is quite reminiscent of storage, memory and processing in computer hardware language. The important point is that in both instances, the same number is recalled from presumably the same neural firing pattern. So if a certain number X of neurons is responsible for the memory, it could be possible to quantify the information in terms of shannon entropy.
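If you did want to try quantifying it that way, the Shannon entropy of a set of firing patterns is just H = -sum(p * log2(p)) over their probabilities. A toy sketch with invented probabilities (not real data):

```python
import math

# Toy illustration of Shannon entropy H = -sum(p * log2(p)) for a set of
# hypothetical neural firing states; the probabilities below are made up.
def shannon_entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Suppose 4 distinct firing patterns occur with these (invented) frequencies:
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))  # 1.75 bits
```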

>> No.10240961

>>10240938
I would argue that that place actually isn't solely for memory, and memory as it appears in a human isn't actually the same as memory on a computer. I believe our own concept of human memory is a faulty metacognition which we then applied to computers. I'll still call human memory memory but functionally it isn't computer memory.

>> No.10240963

>>10240961
Memory and storage aren't the same thing in computer hardware. Following the analogy, computer memory would be more like human working memory and computer storage would be long term memory in humans.

>> No.10240970

>>10240931
elaborate on what it means to appear to have operated on some information without really having done so

>> No.10240975

>>10239967
a mathematical function that is optimised to give a certain output for a set of inputs. sorry to burst your bubble but current ai memes have nothing to do with actual intelligence.

>> No.10240985
File: 82 KB, 645x729, 1508983875615.png

is it possible for the same neuron to be responsible for multiple memories? one neuron, let's say related to animals, that would fire if i hear cat or dog? i think it's a dumb question

>> No.10240987

>>10240963
my point still stands i think.

>> No.10240989

>>10240854
it's a sphere formed in the likeness of the cosmos and filled with marrow that holds the soul, which is a union of the everlasting with the always changing

>> No.10240992

>>10240987
Yeah just being pedantic sorry. The distinction between memory and storage in computer hardware breaks down at some point anyway.

>> No.10240993

>>10240975
his point is that those run on computers but at the relevant level of abstraction they don't store 'memory' in registers etc. - i mean some do, but if you look at a basic net that does recognition, the 'memory' of what '7' looks like is 'stored' in the 'weights' which is the extent to which each neuron is wired to the other

>> No.10240996

>>10240985
Highly doubt a single neuron stores a single memory, probably a net of neurons. It's possible multiple nets corresponding to different memories overlap with one another. This is assuming a single set of neurons has a 1 to 1 correspondence to a memory anyway. Just speculating at this point. I've heard a lot from various popsci sources that memories are changed every time they are recalled.

>> No.10241014

>>10239966
>2016
The article is hopelessly pedantic in some points, and arguing against non-issues in others.
Nobody ever seriously thought that data was stored in a physical location; the entire article is just the author trying to feel smart.

Data being stored in the structure and networks of neurons is far different than data being stored as bits, but it's data being stored all the same.

>> No.10241030

>>10240007
>What makes you think there isn't an algorithmic representation of the brains operations?

Not that anon but just to bring forward what's at stake here, a system can be non algorithmic but still deterministic, even one that's a classical computer

A common example: it's mathematically proven that no algorithm can tell whether an arbitrary Turing machine program will end with a result or continue to run forever

https://en.wikipedia.org/wiki/Halting_problem

So imagine there's a lightbulb connected to a turing machine that turns off and on depending on if the program has completed or is still running. There's no algorithm to determine the state of the light bulb. The same way, there may be no algorithm to determine the state of a single neuron.
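The usual way to see why no such algorithm can exist is the diagonalization argument, sketched below. The halts() function here is a purely hypothetical oracle; the whole point is that it can't actually be written:

```python
# Sketch of the standard diagonalization argument behind the halting problem.
# Assume, hypothetically, that someone hands us a perfect decider:
#   halts(program, argument) -> True if program(argument) terminates, else False.
# (No such function can actually be written; that's the point.)

def contrarian(program):
    if halts(program, program):   # hypothetical oracle call
        while True:               # if the oracle says "halts", loop forever
            pass
    return                        # if the oracle says "loops", halt immediately

# Now ask: does contrarian(contrarian) halt?
#  - If halts(contrarian, contrarian) is True, contrarian loops forever: contradiction.
#  - If it is False, contrarian halts immediately: contradiction.
# So no total, correct halts() can exist, and "deterministic" does not imply
# "decidable by an algorithm", which is the point about the light bulb above.
```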

>> No.10241058

>>10239966
>What is the problem? Don’t we have a ‘representation’ of the dollar bill ‘stored’ in a ‘memory register’ in our brains? Can’t we just ‘retrieve’ it and use it to make our drawing?

>Obviously not, and a thousand years of neuroscience will never locate a representation of a dollar bill stored inside the human brain for the simple reason that it is not there to be found.

With this reasoning, the US mint could change the design of the dollar bill, specifically the parts that people can't recall from memory, and no one would notice.

Just because recall sucks doesn't mean we don't have detailed memorization.

>> No.10241068

>>10239966
Based. The concept of "mind as computer" is strangely new in society. Read Technopoly by Neil Postman.

>> No.10241089

>>10241058
yeah - i mean obviously there is a representation but it is structural, symbolic, semantic, etc. rather than bitmapped. Obviously, because you can visualise a dollar. I mean you can draw one, but in your mind you can 'draw it' part by part by visualising it, recalling details as you focus. A chopped up, encoded type of representation which, because it isn't indexed by trivial geometric axes, is beyond epstein's sputtering imagination.

>> No.10241091

>>10240996
the entire structure of the cortex, with its modules and the connections between them, reflects the world's attributes and the relations between them in the actual world.

if you can get activity to be asymmetric across that structure (e.g. one reason would be input patterns), you will be activating unique instances of that world structure, whether that be via memory or perception.

>> No.10241108

>>10241089
Well how would you know if it's bitmapped or not?

I hate this article's double language. How memory is "changes to the brain" and not "stored". The states of binary digits "change" on a computer too. We call it storage because it's so precise. The brain may work the same way despite being imprecise in both the way the medium is affected and the way information is retrieved.

>> No.10241113

>>10241108
uh.. i mean the way i store it isn't bitmapped. i can't tell you what the first pixel in top left corner of a dollar is ALTHOUGH
i could memorise it that way if i wanted to
AND
i believe one type of place/space memory has a spatial structure that reflects the real world geometry it represents in mice, i.e. neurons next to each other, which is a generalisation of the mapping concept.

>> No.10241119

>>10241108
>>10241113
Disregard actually, I was a retard and didn't realize bitmap specifically describes an array

>> No.10241162

>>10240417
what the fuck do you mean by that? what executes? what information, where, and what do you mean by loops? you're still speaking in metaphors. no one is suggesting the brain isn't subject to mechanics and the laws of nature; the contention is that you don't know what the fuck you're talking about, are grafting information theory jargon onto neurobiology, and that neuroscientists have naively allowed information theory and CS specialists to think for them by lending them loaded language. You need to be extremely specific when you speak about biological processes - what, how and where - otherwise you're full of shit.
>>10240581
Semiotics has nothing to do with biology and is another jungle of confusing symbolism to get lost in.
>>10240970
elaborate on what you mean with physical correlates as to how information is stored, and then explain to me why we wouldn’t be left with models from biophysics, cell and molecular biology instead of information theory. If you can’t do this you’re as autistic as i accused you all of being above.

>> No.10241167

>>10239966
>Brain works like a computer
Who? Who the fuck outside of popsci has ever claimed this?

>> No.10241174

That whole part about the dollar bill is the most retarded thing I've ever read.

>> No.10241178

>>10241162
>explain to me why we wouldn’t be left with models from biophysics, cell and molecular biology instead of information theory. If you can’t do this you’re as autistic as i accused you all of being above.
Why do you act like the two are mutually exclusive? In the realm of computer science (or engineering more specifically), one deals with both physical implementations of computations and information-theoretic descriptions of them. They're both useful.

>> No.10241192

>>10239966
Humans "process" information. We have very automatic responses. Vision, hearing, basic reflexes, hunger, etc. That is your brain processing messages from some sensory organ.

High level thinking is more complex. In my opinion, the way a human figures out 7 x 14 is VASTLY more complicated than the way a computer figures out 7 x 14. By vastly, I mean it's the difference between the intellect of a rock and a dolphin. 7 x 14 for a computer is a series of pre-determined bitwise operations, while a human has to use mathematical tricks and some guessing and checking to work out what the answer is. Even then, a person is prone to error because the methods of coming to that answer involve, to a certain degree, subjectivity, while a computer uses pre-programmed knowledge.

Honestly, until a computer gives an incorrect answer to 7 x 14, I will consider computers to have less intellect than a human, because as long as they always give a correct answer, they're just relying on human-made logic to supply that answer.
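To make "a series of pre-determined bitwise operations" concrete, here's a rough sketch of shift-and-add multiplication. Real ALUs use dedicated multiplier circuits, so this is only the flavor of the mechanical procedure, not how any particular CPU does it:

```python
# Shift-and-add multiplication built from bit tests and shifts, roughly the kind
# of fixed, mechanical procedure a processor follows.
def multiply(a, b):
    result = 0
    while b:
        if b & 1:          # lowest bit of b set?
            result += a    # add the shifted copy of a
        a <<= 1            # shift a left (multiply by 2)
        b >>= 1            # shift b right (drop the processed bit)
    return result

print(multiply(7, 14))  # 98
```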

>> No.10241195

>>10241178
At some point you have to reduce the description to a language that is as accurately representative of the structures themselves as possible; if this means biochemistry and physics then so be it. Information science as a way of building maps to forage for new structures and to ferret out function is fine. Actually taking your heuristics seriously and thinking brains are computers with data storage is too far; you've forgotten that you're merely using analogy and metaphor to aid you in the darkness.

>> No.10241227

>>10241192
>le when computers start giving wrong answers i'll le start paying attention
no one cares about your little aphorism. it's very tired at this point in history

>> No.10241230

>>10241227
I mean, my point is that a computer figuring out what 8102938019480175124 x 10293871208347192837 is, is far less impressive than a human figuring out what 3 + 5 is.

>> No.10241377
File: 159 KB, 639x1051, D7EE595DC2C54C25A731487144F58D07.jpg

>>10240347
Tldr

>> No.10241757
File: 780 KB, 1024x576, Epstein.png

Fucking idiot doesn't even know the definition of information. Any particle, wave or field interacting with any other processes information.

>> No.10241763

>>10239966
Doesn't really matter. We could still simulate it on a computer, unless you're claiming there is some unphysical process involved there (in which case you need to GTFO).

>> No.10241831

>>10240996
Not exactly changed every time they're recalled, but whenever you remember a memory, you're recalling your last recollection of it as opposed to the original stimulus. This tends to yield incremental corruption in recall accuracy, as interference in any one recollection accumulates over the course of several recollections

>> No.10241834

>>10240031
it scares away the memory fairies

>> No.10241859

>>10239966
Yeah, it probably operates on qubits and not only simple electrochemical processes.

Semi pop-sci but really well written:
https://www.quantamagazine.org/a-new-spin-on-the-quantum-brain-20161102/

Paper:
https://www.sciencedirect.com/science/article/pii/S0003491615003243

Wiki (sadly both are shitty and completely outdated):
https://en.m.wikipedia.org/wiki/Quantum_brain_dynamics
https://en.m.wikipedia.org/wiki/Quantum_mind

>> No.10242021
File: 121 KB, 1462x2046, 3c7639f37ed1045fe471bf8a209ababe95ea397e9fe10ca6e8a9d87f24e3e0b6-b.jpg

>>10240375
>noone

>> No.10242088
File: 58 KB, 645x729, 80c.png

>>10241763
>Doesn't really matter. We could still simulate it on a computer,

>> No.10242781

>>10240347
Your points are valid. Calling others autists just makes you a double autist though, chill.
Post some links to your favorite scholarly articles to educate your brothers

>> No.10242872

>>10241195
this is very fair but i would also say that on the embodied view, which is not so clearly stated in the paper (causing much confusion in the thread),

the argument is that whilst computers use representations, brains do not.

an interesting consideration is: do you think of an action you perform as a representation? And what if all brain processing/states were equivalent to those types of dynamics.
>>10241831

>> No.10242925

>>10240347
You create a false dichotomy strawman about information to knock it down. Fuck you.

>> No.10243551

>>10242021
>he doesn't know who Peter Noone is

Is that pic your selfie?

>> No.10243703

>>10239966
Your brain may not process information, but a lot of brains process information.

>> No.10243714
File: 67 KB, 326x294, 1545695508745.png

The Egyptians must have been right, the brain is just stuffing for the skull cavity, information is processed in the heart

>> No.10244046

>>10240382
>Memory is just a change in brain structure and activity
How does he think computers store information?

>> No.10244113

>>10240382
So structure is not information. Information is just an imaginary term, isn't it? There is no information in the universe at all.

>> No.10245721

>>10241859
Perhaps brainlet post, but my interpretation of the brain would agree with anon above.

As I see it, our brains are parallel processing stimuli from the world and respond actively and passively. Experiences are "stored" via the ability to relive events. For example, with occipital lobe damage, one loses access to visual working memory. Obviously we have it down to neurons, but when looking at the neurons I believe things correspond much like the retinotopic portions do. We probably have corresponding neurons that are hard 1:1 ratios IN ADDITION to winner-take-all and cumulative processing going on; these processes are mediated by the levels of potentiation between a neuron and its networks?

I realize I am using a lot of computational language, much to OP's point, but some portions of the brain we know and understand to be computational and corresponding. The computer is definitely the most accurate model we have had of the brain as of yet, I would think, no? If that changes to quantum computers, I would say it would be even more accurate, but unless we get biological computers, I don't think we would have any accurate analogies, because I don't think anything translates 1:1 at that complex a level.

>> No.10245732

>>10245721
In addition to this, part of our memory is the understanding of processes. For example you do not need to memorize every single number. You just need to know "directions" on how to produce different numbers, i.e. calculate. Depending on your cognitive abilities, you will be able to do that to a relative degree. Of course you vary. Similarly, the London taxi cab study found that after memorizing the roads in London for the taxi cab test (one of the most intensive spatial memorization tests for humans), the drivers' posterior hippocampus area encroached on the area of the anterior hippocampus, making the latter smaller. They also found it harder to learn new routes.

I believe there to be a competition between the information correspondences in the neurons. Yet only in areas with plasticity, i.e. memory.

I'm VERY interested to hear what OP and people of his kind- namely the alternative narratives- think about this topic.

>> No.10246226

What a trash article, I hate philosophers.
There are many errors to point out.
What is even his definition of intelligence if not processing information?
What the fuck is that dollar drawing experiment supposed to mean? Just because we can't draw a pixel-perfect dollar bill we don't have a representation of it stored either? This just shows how little algorithmic and mathematical intuition the author has; the experiment perfectly shows that we store a compressed representation of the dollar bill using familiar simple shapes and patterns so we don't have to remember each fucking pixel.
It is also mathematically perfectly reasonable that recognizing memories is easier than recalling them - problems can be asymmetrical.
I hate this fucking shit, fuck philosophers.

>> No.10246227

>>10240382
information is not a fucking metaphor it's quite a rigorous concept

>> No.10246286

>>10246226
>philosophers
he's a psychology professor and some kind of behaviourist no less

>>10246227
yeah i think most cunts in fields outside Inf theory and physics have no inkling of this. tragically they don't even know they don't know

>> No.10246296

>>10246286
yea but this article is 100% philosophy and fuck all science

>> No.10246330

>>10240347

information is any thing that interacts in a system, so you could probably say anything is information

prove my ass wrong

protip: you can't

>> No.10246343

>>10239967
It's essentially a multi-layered logistic regression. Awesome and flexible classifier that actually deserves the hype it gets, would recommend. Has nothing to do with neuroscience though.
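For anyone who wants the "stacked logistic regressions" point in code, here's a tiny two-layer forward pass with random (untrained) weights, purely for illustration; a real classifier would learn its weights by gradient descent:

```python
import numpy as np

# A neural network viewed as stacked logistic regressions: each layer is a
# linear map followed by a sigmoid squashing function. Weights here are random,
# just to show the structure.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # hidden layer: 3 inputs -> 4 units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # output layer: 4 units -> 1 unit

def forward(x):
    h = sigmoid(W1 @ x + b1)     # first "logistic regression"
    y = sigmoid(W2 @ h + b2)     # second one, stacked on top
    return y

print(forward(np.array([0.2, -1.0, 0.5])))      # a single probability-like output
```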

>> No.10246635

>>10246226
but his point is we can explain all behaviour and cognition without resorting to stored representations which is an empty homunculus concept. one small interesting argument is that you can model cognitive tasks without recourse to any kind of representation.

>>10246286
hes not a behaviourist but an embodied cognitivist, and ironically it's a field that considers dynamical systems theory an important tool.

If i were him, what i would ask to all anons: in our purely physical world, what does it mean that something physically is a representation; like in a brain area?

>> No.10246656

>>10246635
>If i were him, what i would ask to all anons: in our purely physical world, what does it mean that something physically is a representation; like in a brain area?

a representation is an encoded model along with the encoding and decoding machinery

>> No.10246665

>>10240975
it's literally what your brain is

a neural network, made of neurons. artificial neural networks are mathematical representations of it.

>> No.10246679

>>10245721
>but some portions of the brain we know and understand to be computational and corresponding.

like what?

>> No.10246709

>>10239966
I'm not processing the visual information to reply to this thread.
Thanks anon, I'm convinced.

>> No.10246744

>>10246656
This is the thing. All descriptions of representations talk about some abstract thing identifiable by us as observers. But you dont tell me anything about how things exist in the physical world. Your definition has benefit to a reader but not to physical processes that actually exist.

>> No.10246761

>>10246744

i agree, it's a little circular to describe a model with another model or equally abstract language. consider a Boltzmann machine. its parameters encode information which can be retrieved by setting a few of its visible units and letting it run for a while. information is encoded by adjusting these weights and then decoded by running it until you reach a low energy state.
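A rough sketch of that encode/decode loop for a small Boltzmann-machine-like network of +/-1 units: clamp a couple of "visible" units, stochastically sample the rest, and read out the low-energy state it settles into. The weights here are random rather than learned, so treat it as an illustration of the procedure, not a working memory:

```python
import numpy as np

# Illustrative energy-based recall: the weight matrix and the clamped units are
# invented; a real Boltzmann machine would learn its weights from data.
rng = np.random.default_rng(1)
n = 6
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2                    # symmetric couplings
np.fill_diagonal(W, 0)

def energy(s):
    return -0.5 * s @ W @ s

def run(clamped, steps=200, T=1.0):
    """Sample the free units while the clamped ones stay fixed."""
    s = rng.choice([-1, 1], size=n).astype(float)
    for i, v in clamped.items():
        s[i] = v
    for _ in range(steps):
        i = int(rng.integers(n))
        if i in clamped:
            continue
        dE = 2 * s[i] * (W[i] @ s)   # energy change from flipping unit i
        if dE < 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
        T = max(0.1, T * 0.99)       # crude annealing toward low energy
    return s, energy(s)

print(run({0: 1.0, 1: -1.0}))        # clamp two "visible" units, read out the rest
```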

>> No.10246784

>>10246761

cont. in my opinion, these are the most interesting types of neural networks. i wish i could research them. a Boltzmann machine is the most basic design for something like this; energy-based models are very under-researched.

most other networks are just deterministic one-way filters with fairly predictable (if not well-understood theoretically) characteristics.

>> No.10246789

"Worse still, even if we had the ability to take a snapshot of all of the brain’s 86 billion neurons and then to simulate the state of those neurons in a computer, that vast pattern would mean nothing outside the body of the brain that produced it"-can someone explain this quote from the article?

>> No.10246800

>>10240347
I have always assumed that to upload a human consciousness into a computer, the computer would have to 100% simulate the movement of every atom and subatomic particle inside the whole body of the person being uploaded, in real time, in addition to air pressure and food intake.

I have never assumed we would be represented by simple algorithms but by essentially duplicating what our whole body and brain would be doing

>> No.10246816

>>10241162
>Semiotics has nothing to do with biology
>jungle of confusing symbolism to get lost in.
Hello pseudo; you have no idea what you are talking about.

>> No.10246818

>>10246800

exactly. this isn't to say you couldn't build a machine that would do something like this, but our current hardware is very inefficient when it comes to nondeterministic, asynchronous models like the boltzmann machine. they do not fit neatly into serial or SIMD computing machinery

>> No.10246837

>>10246761
Interesting you say this. Some say the brain might work in a similar way to this. Information is thought to travel upward from our sensory input via prediction error. This is analogous to the difference between the end states of those two processes you describe. A potential hypothesis is that things like dreaming, imagination, planning, remembering and daydreaming are all examples of the latter process, as a means to optimise the information we've already learned.

>> No.10246838

>>10239966
wat

>> No.10246855

>>10246837
>planning,

planning and executive functions may be based on a different sort of process, but certainly play a role in memory. you're always trying to optimize your discounted reward. humans have hard-coded objectives. reproduce whenever you can, eat when you're hungry, sleep when you're tired, and a set of higher-level imperatives and processes that balance exploration and exploitation. these are probably genetic and only able to be modified at a very young age. from that point on, the only way to modify someone's behavior is to modify their environment.

>> No.10246863

>>10246837
>via prediction error.

and yes, i've read a bit about this too. deltas between expected rewards and actual rewards stimulate learning

>> No.10246869

>>10246761
I also think connectionism captures some interesting aspects of my view. it works on the mechanistic interactions of neurons rather than symbols.

>> No.10246871

>>10246869

the boltzmann machine seems very fundamental and important, somehow. that was my impression when i first read about it.

>> No.10246888

>>10246855
>a different sort of process
different to what?

forgive me but i dont see what youre getting at quite though its interesting.
>>10246863
no not reward prediction errors, thats completely different. perceptual ones. like errors in knowledge of a pattern.

>> No.10246895

>>10246871
well they were designed as a model of how the brain might work.

>> No.10246906

>>10246888
>different to what?

>>10246888
>no not reward prediction errors, thats completely different

you said it yourself.

errors in the knowledge of a pattern. the more you're exposed to a pattern, the more likely it will impress itself into your memory. you may have false associations between similar patterns, but your encoding will always be a reflection of your perception itself. no "error" is made, necessarily.

>> No.10246918

>>10246906
>errors in the knowlege of a pattern. the more you're exposed to a pattern, the more likely it will impress itself into your memory. you may have false associations between similar patterns, but your encoding will always be a reflection of your perception itself. no "error" is made, necessarily.

and this may be a lower-level process than planning/executive-function/motivation related processes, although they're certainly related and inter-dependent

>> No.10246943

>>10246906
>>different to what?
you said different sort of process.. different to what?
>no "error" is made, necessarily
What im explaining is from a theory of the brain which is modelled mathematically. The way patterns are impressed can be modelled by it too, i imagine. What i meant is that in the theory we have internalised knowledge of the world. we use that knowledge to predict patterns and regularities and to categorise novel scenes. Prediction errors, if you will, are like evidence for hypotheses about the world. they either confirm or reject them; then we change the hypothesis.


>>no not reward prediction errors, thats completely different
Reward prediction error is different from the predictive coding im referring to in the psychological/neuro literature.


>>10246906
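To make the perceptual prediction-error idea concrete, here's a minimal sketch: the system keeps an internal estimate, compares it with each incoming observation, and the error nudges the estimate. The input stream and learning rate are invented for illustration:

```python
# Minimal sketch of the predictive-coding idea: carry a prediction, compare it
# with incoming input, and let the prediction error update the internal estimate.
def predictive_update(prediction, observations, learning_rate=0.2):
    for x in observations:
        error = x - prediction            # prediction error: evidence against the current hypothesis
        prediction += learning_rate * error
    return prediction

stream = [1.0, 1.2, 0.9, 1.1, 1.0, 1.05]  # hypothetical sensory signal hovering around ~1.0
print(predictive_update(0.0, stream))      # the estimate converges toward the signal's level
```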

>> No.10246948

>>10246906
>>10246918
cont.

but i think reward-prediction errors likely play a big role in what you probably mean by "errors in the knowledge of a pattern". we like exploring; we are all rewarded slightly for learning new things and finding common patterns in our observations. it's also sometimes necessary to find these patterns in planning out a course of action to achieve other rewards.

what does the mind do when, in this process, it recalls seemingly contradictory information? if your motivation is high enough, you go into learning/exploration mode and try to correct this error. if you cannot do so, your mind gives up to save energy and capacity for other things.

>> No.10246959

>>10246789
>can someone explain this quote from the article?
He is just saying that we have no way for us to interpret what the brain is thinking / doing, as every brain is wired differently, with its own memories and thoughts.

The brain is basically a self contained system, with no output for us to read.

>> No.10246973

>>10246943
>Reward prediction error is different to predictive coding that im referring to in psychological/neuro literature.
see
>>10246948

they sound very much related.

>> No.10246985

>>10246948
cont.

and i believe that much of the social engineering we're subjected to is intended to disrupt and disincentivize exploration. this is why it's best to ignore it and let your instincts guide your exploratory efforts.

>> No.10246994

>>10246985

this is why true exploration is severely punished by the engineered culture of modern society.

>> No.10247011

>>10246948

>but i think reward-prediction errors are likely play a big role
no doubt, just wasnt relevant to what i was saying so the distinction was needed.

>>10246918

hypothetically these processes should all work similarly even if they are on different levels since they all work using the same neural circuit structure.

executive functions and motivation are also part of a prediction-error-related system.
And as i said in my original post, planning (but not all executive processes) is part of that latter decoding process of the network running on its own.

>> No.10247032

>>10247011

it's getting difficult to understand you. can you summarize your point?

>> No.10247053

>>10246959
>He is just saying that we have no way for us to interpret what the brain is thinking / doing, as every brain is wired differently, with its own memories and thoughts.
>The brain is basically a self contained system, with no output for us to read.
Dan Dennett would disagree

we are forgetting how limited our tech is

>> No.10247056

>>10246973
Unrelated to my original post...

Prediction error in a perceptual process.

>>10246959
no hes saying that we interpret our own neural process based on our own experience. its uninterpretable otherwise which could only be thereafter analysed in terms of dynamics and causal interactions, particularly in terms of the environment and the body; the brain is for controlling an animal in its environment therefore thats essential.

>> No.10247058

If the human brain does not process information how did the human brain go from stone tools to advanced mathematics.

>> No.10247062

>>10247056
>no hes saying that we interpret our own neural process based on our own experience. its uninterpretable otherwise which could only be thereafter analysed in terms of dynamics and causal interactions, particularly in terms of the environment and the body; the brain is for controlling an animal in its environment therefore thats essential.
If you knew what every atom and subatomic particle, down to as-yet-undiscovered quarks, were doing in my brain, and had other brains to compare it to, and I started thinking about a dancing pink elephant, you would be able to figure out I was thinking of a dancing pink elephant if you had enough observation time of my brain logged

>> No.10247082

>>10246226
>What is even his definition of intelligence if not processing information.
It's clear he is talking about processing information in the way computers do it: algorithms and memory, basically. You're not going to get anywhere attacking him with semantics.

>What the fuck is that dollar drawing experiment supposed to mean?
He is showing that it's absurd to say we have a representation of the bill stored in our memory. The recalled version of the bill is not a compressed version of the real image of a dollar bill; it's a "performance" by the student, doing a dollar bill.

It's not a bulletproof point, but it's easy to see what he means: the simplified recalled dollar is not in any way similar to how we would normally talk about compressed images. If we could truly see the image of a dollar bill in our heads when we imagine it, it makes sense to believe the two results would be a lot more similar. The recalled dollar bill is not what we have in our head.

>> No.10247087

>>10247056
>no hes saying that we interpret our own neural process based on our own experience. its uninterpretable otherwise which could only be thereafter analysed in terms of dynamics and causal interactions, particularly in terms of the environment and the body; the brain is for controlling an animal in its environment therefore thats essential.

No? You are basically just repeating what I said with more words.

>> No.10247096

>>10247032
Well i dont know if you are who i was talking with, but i was describing how a boltzmann machine (a neural network) might relate to a brain that uses predictive coding. Anon seemed to be saying planning is a different kind of process but im just saying it isnt.

>> No.10247119

>>10247062
you would but an alien wouldnt.

>>10247087
but you said it badly and didnt convey the point in the article.

>> No.10247140

>>10247058
>If the human brain does not process information how did the human brain go from stone tools to advanced mathematics.

need this answered desu

>> No.10247143

>>10247119
>but you said it badly and didn't convey the point in the article.
No, even:
>He is just saying that we have no way for us to interpret what the brain is thinking / doing
would be enough to get to the heart of what the article is getting at. The rest was just added to make the point clear, and explain why. What you rambled about is really beside the point.

And you started in disagreement; that's not really how you add to or clarify something that has already been said.

>> No.10247145

>>10247140
He is not using "information processing" in the absolute general sense. He is talking about the way a computer processes information. Read the article.

>> No.10247156

>>10247145
>He is not using "information processing" in the absolute general sense. He is talking about the way a computer processes information. Read the article.
Well I think the answer is clearly: sometimes it does, sometimes it does not. It's certainly demonstrably capable of processing information. But yeah, I can see how if you trip or something your brain could just be reacting, the way a pendulum with an electrical connection to control a servo could stop a rudimentary robot from falling without "thinking" about it.

headline is trolly

>> No.10247157

>>10247145
So his point boils down to "brain isn't structured according to von Neumann architecture". What a ground breaking insight. If only somebody told those unsuspecting neuroscientists and AI researchers.

>> No.10247165

>>10247157
He does go into where and how this metaphor is being used and what he considers to be wasted research based on this. How about you find something in the article to disagree with instead of falling for easy clickbait.

I'd think you were new on the internet or something if I didn't already know so well how you dumb fucks never read anything.

>> No.10247168

>>10247143
>The rest was just added to make the point clear, and explain why
but it didnt; you used a very poor choice of words, like "brain output"
plus what has the fact that people's brains are wired differently got to do with anything?

>would be enough to get to the heart of what the article is getting at.
no, because that's not the point of the article. it's about embodied cognition, hence the ramble. The fact you don't know what the article's about means i added something.

keep seething.

>> No.10247175

>>10247156
>doesnt even know

>> No.10247201

>>10247168
>brain output
Explain how I was wrong in using "brain output" in my post. It clearly states what I meant.

>plus what has the fact peoples brains are wired differently got to do with anything?
It means we can't map different brain states to different thoughts or actions. I'm starting to think that maybe you shouldn't be the judge of whether something is clear or not, if you can't get this basic implication.

>no because thats not the point of the article.
Jesus, you are actually autistic, aren't you? We are talking about a very small sentence within the article, did you forget this? If not, then try to interpret what is being said charitably, in the right context, and for more than 2 seconds.

I don't know how someone who could write this abomination:
>which could only be thereafter analysed in terms of dynamics and causal interactions, particularly in terms of the environment
Has so little self reflection that he would attack someone else's choice of words.

>> No.10247228

>>10247201
>It clearly states what I meant.
it's clumsy. you shouldn't use the negative of a fantastical example that doesn't exist. you should instead relate it to what humans can't do.

>It means we can't map down different brain states to different thoughts or actions.
say we looked at one human brain... why would this matter.

>Jesus
you mentioned "point of the article" first so seethe at yourself not me.

>dynamics and causal interactions, particularly in terms of the environment
but this is the point of the article, and it uses clearly defined terms that are related to the academic area and relate to the point.

you should learn to take criticism.

>> No.10247271

>>10246679
Like the motor cortex portions that direct movement. They are literally a winner-take-all correspondence to action. If there are more go neurons for your diagonal movement, then you go diagonal.

That can be categorized as computational, can't it?

>> No.10247274

>>10247228
>it's clumsy. you shouldn't use the negative of a fantastical example that doesn't exist. you should instead relate it to what humans can't do.
This is so far outside of the original point, I'm convinced you are just arguing for the sake of arguing. It's really hard to believe that this was important enough to point out. And I don't even agree that it's a fantastical example, by the way.
"The brain does not have an output we can read" Should be changed to what exactly?

>you mentioned "point of the article" first
No, I didn't say "point of the article". I talked about what that specific sentence within the article was talking about. Which was the original topic. Normal people don't need every sentence to be laid out like a math formula for them to understand.

>but this is the point of the article
No, its a word salad that barely gets a simple point across.

>you should learn to take criticism.
I'm not buying this cop-out. This was never about criticizing my original post for you. That is incredibly transparent.

>> No.10247288

>>10247228
>say we looked at one human brain... why would this matter.
It means we have no way to interpret it even after making this incredible simulation.

>> No.10247293

>>10247274
>outside of the original point
I thought it was difficult to understand.
>Should be changed to what exactly?
Something about interpreting data.


>No, I didn't say "point of the article".
you did. you literally wrote those words in a post i replied to.
no its not - see >Normal people don't need every sentence to be laid out like a math formula for them to understand.


>No, its a word salad that barely gets a simple point across


>I'm not buying this cop-out.
I didnt think it was good enough.

>> No.10247308

>>10247293
>you did. you literally wrote those words in a post i replied to.
Quote it then?

>> No.10247319

>>10247293
>>10247274
Quit bickering.

>>10247082
So this example says that we can only remember things to the extent to which we can perform them? This then distinguishes recognition from recall.

So what is recognition? The evocation of familiarity? This has interesting implications with illusory conjunctions (i.e. false recall, and also false recognition).

Also, demonstrably, some neural network models are supported by phenomena such as priming; what is the rebuttal against that?

>> No.10247325

>>10247288
what im saying is that interpretation is irrelevant to whether it's one person or two, so why would differences matter?


>>10247271
Hmm, this isn't seen as contrary to the article's viewpoint though.

>> No.10247334 [DELETED] 

>>10247319
>So this example says that we can only remember things to the extent with which we can perform them? This then distinguishes recognition from recall.
The point isn't that the recalled version doesn't look like the real version. The point is that the recalled version isn't at all like the 2nd picture they drew. They had the ability to draw from picture to paper. But when they were drawing from memory to paper the drawing became a different kind of drawing. Not at all like they were drawing after something.

Now clearly how a dollar looks has to be stored in some form in our head. I think the point was simply that the way it was stored is not like how we think about stored images.

>> No.10247348

>>10247319
Let me try this again.

>So this example says that we can only remember things to the extent with which we can perform them? This then distinguishes recognition from recall.
The point is that the recalled version isn't at all like the 2nd picture they drew. They had the ability to draw from picture to paper. But when they were drawing from memory to paper, the drawing became a different kind of drawing. Not at all as if they were drawing from a reference.

Now clearly how a dollar looks has to be stored in some form in our head. I think the point was simply that the way it is stored is not like how we think about stored images.

>> No.10247360

>>10247325
Yes, it is not contrary, but it does support the notion that our models are zeroing in on something. The article makes it seem as if we are choosing flavor of the day theories, when in reality our understanding of the whole world is becoming more refined, not merely changing in some Kuhnian sense.

I do agree it is time to drop info theory as THE model, but as A model the article does not give it enough credit.

>>10247348
I agree. I would hazard a guess that we network a connection between dollar, rectangle, small, green, etc. But those networks are only so far connected as to recognize a dollar bill, i.e. to re-perform one's purpose with it and define it to others. Our brains are cheap, they don't want to do more than what is required of them.

>> No.10247370

>>10247325
>what im saying is that interpretation is irrelevant of whether one person or two so why would differences matter?
If you could compare different brains to map general brain states to actual thoughts, then you could use the simulated brain and get something out of it, because it would respond similarly to how our brains work. But every brain is different, so we wouldn't even be able to do that.

>> No.10247397

>>10247360
To expand on this, we do not have dollar bill neuron, but we have construct using neurons that are tethered to different concepts. So if one were to be asked to draw a dollar bill, one would activate one's dollar bill mapping. I do not know the exact way this would happen, but I would assume it would begin with the way it is cued and then processed. So, if someone asks you to draw a dollar bill, then you HEAR dollar bill, then that automatically activates all other associations with it through summations, potentiations, etc.

Now, HOW those fundamental concepts are encoded, I do not know, but my guess would be the same (at most, and at the very least analogous/homologous) as the retinotopic, tonotopic, and homunculus mappings we have for other parts. So, if you are not familiar with -topic mappings on the cortex, I would familiarize yourself with that, because it would make sense for it to be in the same vein or similar for semantic mapping.

Also, what I do not know, is what the smallest unit is for these concepts. For example, sound is encoded in syllables/phonemes, so I would assume words are phonemes networked together. But how is a phoneme represented? How many neurons per a phoneme? What is the "information" -burden of a single neuron?

I feel that a concept from the article is going over my head. Does the article not agree with these previously mentioned concepts?

I am not OP, but if someone can explain where my thinking is going wrong then that would be much appreciated.

>> No.10247426

>>10247397
The article is basically just saying we have started thinking about the brain like a computer, and thinking in computer terms about how it works. As in memory and information processing.

It would maybe be something like: image in > brain processes image > stores image. And the same with thoughts.

He shows us why it's faulty to think of the brain this way, and how it's perhaps influencing science badly. It's not revolutionary or anything, I think most neuroscientists would agree, but pointing these things out could help stop a bad trend.

>> No.10247457

>>10240916
What isn't quantum mechanics??

>> No.10247514

>>10247319
>>10247348
I think recognition and recall would be "performances". Everything in cognition would be some kind of performance.

The idea is not to have a concrete representation in someone's head but to think in terms of how brain states at one time predict states at another time.

To me this is quite close to the philosophical idea of meaning as relational/holistic. Objects, real or otherwise, have no meaning without their relations to other objects and vice versa.

In a similar way, all meaning of objects in our heads is in how we react to them. The meaning of "book" is in how you react to it in the context of seeing a book.
Think of it like Wittgenstein's theory of language. But with all objects.

>> No.10247595

>>10247514
>I think recognition and recall would be "performances". Everything in cognition would be some kind of performance.
I guess, but that also makes everything pretty vapid. We are really talking about an analogy here, and "performance" just leaves us with everything in the middle having to be filled in anyway.


>To me this is quite close to the philosophical idea of meaning as relational/holistic. Objects, real or otherwise, have no meaning without their relations to other objects and vice versa.
I'm really not that into neuroscience, but from all I know I think the current consensus would be something like this.

>> No.10247596

>>10247514
Ok, I've reread the article and it makes more sense now. Sorry, I was going off a late-night reading of it, which led to some shameful comments. I've now found the blog the article references and am understanding the ideas being put forth in embodied cognition.

If
>"performances"

then why use react instead of interact?:
>all meaning of objects in our heads is in how we react to them
>The meaning of "book" is in how you react to it in the context of seeing a book

Are you using them synonymously?

Also, regarding the "storing" of information: if one's perspective is embodied cognition, then what would that mean for memories? What theories and frameworks does this perspective offer to replace the old ones?

How does embodied cognition solve the problem of representations? Neurons "storing" information: how and where?

>> No.10247611
File: 34 KB, 553x317, 315589_1_En_7_Fig4_HTML.gif [View same] [iqdb] [saucenao] [google]
10247611

>>10247397
Bear in mind the smallest unit of information isn't the neuron. It's the column: a system of neurons across six layers that acts as a whole unit. It's also unlikely there is a set network or region per phoneme. Rather, you would have an area for phonemes which is itself divisible into many smaller areas related to inputs lower down the hierarchy, and so each phoneme would correspond to a specific distribution of activity across all the smaller areas in the whole phoneme region.
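
To make "distribution of activity" concrete, here's a toy Python sketch (the number of sub-areas, the stored patterns, and the phoneme set are all invented for illustration, nothing anatomical): each phoneme is just a different pattern over the same pool of feature areas, and readout is whichever stored pattern the current activity best matches.

import numpy as np

# Toy "phoneme region": 20 feature sub-areas, each with some activity level.
# A phoneme is a distributed pattern over ALL of them, not a dedicated cell.
rng = np.random.default_rng(0)
stored_patterns = {p: rng.standard_normal(20) for p in ["b", "a", "k", "s"]}  # hypothetical patterns

def perceive(activity):
    # Read out the phoneme whose stored pattern best matches the current activity (cosine similarity).
    def cos(u, v):
        return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return max(stored_patterns, key=lambda p: cos(activity, stored_patterns[p]))

# A noisy version of "k"'s pattern is still read out as "k".
noisy = stored_patterns["k"] + 0.2 * rng.standard_normal(20)
print(perceive(noisy))  # recognized as "k" despite the noise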

>> No.10247728

>>10247611
This is unfamiliar and somewhat new info for me. Thank you. I do not understand your image entirely, but I appreciate the figure.

Mind expanding on your phoneme explanation?

My understanding of conceptual phonemes (ignoring all the motor processes attached to them), based on your explanation, is that there is/are phoneme column(s) where all(?) phonemes reside on the top layer of the column or group of columns, and each activates different connections depending on which is being used?

>> No.10247730

>>10239966
>like a computer
Emphasis on the "like".
I mean, of course things are more abstract in the human mind. It's a consequence of the chemical makeup of our systems and the way they developed.
Machines involve a reduction of what we understand about ourselves. It's a consequence of not knowing the thing-in-itself, one might say.

>> No.10247737

https://www.youtube.com/watch?v=LXFFbxoHp3s

>> No.10247739

>>10239967
The human mind is trickier than you think. We see the networks and are constantly asking - are we the literal physical phenomenon occurring in that network? Or are we differential to that system?
Who are "we" inside that body?

>> No.10247754
File: 68 KB, 600x788, what_the_fuck_am_I_casting.jpg [View same] [iqdb] [saucenao] [google]
10247754

>>10240347
>I say you have autism because you are the one's who don't ask "what is information" "does it have a physical basis" "what structures does it correlate with" "what is an algorithm as an embodied process". This is more interesting to scientists than to engineers because it has actual potential to reveal why a system behaves the way it does rather than merely using symbols to derive general heuristics for mimicking the system
But all of these concepts are based on ideas that began in the broader philosophical and religious understanding of our society in the past.
We've merely looked at the detail with science. The answer is quite easy to find if you think more broadly, or at a more macro level.

"Magic" (or "magick"?) was a term once used to describe early modern science and mathematics.
So you know the memes about "meme wizards", right? This is why that meme manifested: people realised something about science by studying the history of its development.

>> No.10247759

>>10240396
Many scientists probably do think more broadly, but limit the scope of the concepts they use as a tool for discovering new details of the "big picture". Science is the process of discovering detail, and you can only do that when you narrow the scope of the concepts you are using. It is essentially language that helps make science pragmatic.

>> No.10247760
File: 27 KB, 343x250, Anne Boleyn.jpg [View same] [iqdb] [saucenao] [google]
10247760

>>10241162
>what the fuck do you mean by that? what executes?
It's a bit like pic related.

>> No.10247765

If you look at computer hardware you won’t find any symbolic representations either.

>> No.10247767

>>10246226
>What a trash article, I hate philosophers.
>Science is a philosophy
derp.

>> No.10247770

>>10247765
A chip is named so because it's like a wood chip or whatnot, a smaller fragment of the much larger entity.

>> No.10247778

>>10247754
Huh?

>> No.10247788

>>10247765
this is true

>> No.10247806

>>10247778
tl;dr - you can't do science without understanding the scope of the concepts you use for your experiment.

>> No.10247835

The brain is literally an information processor. It doesn't work like a modern computer. But both are physical. It could still be possible to simulate or generate consciousness on a computer. At a certain point it will be complex enough to be a real person.

The differences will become trivial at some point.

>> No.10247838

>>10247835
please see
>>10247737
Consciousness is supposedly a quantum computation facilitated by microtubules in the brain.

>> No.10247845

>>10241195
The point that both you and the author seem to misunderstand, and that completely turns the article into meaningless drivel, is that information processing is NOT an analogy or a metaphor. It is a strongly predictive abstract model, which very much applies to things the human brain does.

Interpreting the human brain as being composed of the parts that are in my laptop, THAT is a metaphor that one should not be taking seriously. A brain is not a Von Neumann machine, it does not execute encoded instructions, it does not store sequences of bits to encode information, and most definitely does not store my understanding of a dollar bill as a JPEG bit sequence.

But that does not mean it is not processing information. A human brain definitely processes information. It performs computational processes, as described by information theory, which by the way does not mention DDR4 memory or hard disks anywhere. Those computational processes are implementations of certain abstract algorithms, even if my brain never executes anything like a for-loop. And it definitely stores a representation of a dollar bill, even if that representation is (1) low fidelity, (2) not in the form of a pixel grid, and (3) much better at recognizing things than at generating a printout.

This is all well within what the words "representation" and "information processing" mean in information theory. If you disagree with this, I strongly suspect you don't actually know what these words mean. It's not "whatever it is my Dell laptop does". Until the author (and you) can demonstrate a basic understanding of the difference between (1) the abstract theory of information processing, and (2) the details of modern artifacts that actually accomplish this, there is very little insight to be had here.
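
To make the recognition-vs-printout point concrete, a minimal Python sketch (the 32x64 "bill", the four summary statistics, and the tolerance are all made up for illustration): the stored representation is a lossy summary that is useful for re-identifying the thing but could never regenerate it.

import numpy as np

def summarize(image):
    # Lossy "representation": a few coarse statistics, nothing like the pixels themselves.
    h = image.shape[0] // 2
    return np.array([image.mean(), image.std(), image[:h].mean(), image[h:].mean()])

# Stand-in for the remembered bill: darker at the top, lighter at the bottom.
dollar = np.linspace(0.0, 1.0, 32)[:, None] * np.ones((32, 64))
memory = summarize(dollar)          # all that gets "stored": 4 numbers, not 2048 pixels

def looks_like_dollar(image, tol=0.05):
    # Recognition compares summaries, not pixel grids.
    return bool(np.linalg.norm(summarize(image) - memory) < tol)

rng = np.random.default_rng(1)
print(looks_like_dollar(dollar + 0.02 * rng.standard_normal(dollar.shape)))  # True: recognised despite noise
print(looks_like_dollar(rng.random(dollar.shape)))                           # False: a random texture is rejected
# But nothing here can reproduce the original 32x64 image from those 4 numbers:
# the representation carries information about the bill without being a copy of it.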

>> No.10247895

>>10240347
>>10239966
She's literally arguing that human brains cannot be described abstractly, i.e. they are fucking magic.
Her whole argument consists of comparing different algorithms to each other without realizing she is comparing algorithms, and then sorting them into two mutually exclusive categories: things computers do and magic brain stuff. I mean, at one point she straight up compares one algorithm for catching a ball to another, with the hidden assumption that you couldn't make a computer use the human method of repeatedly approximating where the ball will go. She's probably some boomer fuck who looked at the files on her Windows XP desktop and thought "I guess this is all there is to computers".
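
For what it's worth, both strategies are trivially programmable. A toy 1-D Python sketch (idealised drag-free physics, made-up launch numbers): strategy 1 solves the trajectory once; strategy 2 just keeps re-estimating from the last two sightings, closer to what a fielder does, and lands on the same answer.

import numpy as np

g = 9.81
x0, y0, vx, vy = 0.0, 0.0, 12.0, 15.0       # made-up launch state

def landing_by_physics(x, y, vx_, vy_):
    # Strategy 1: solve the kinematics in one shot.
    t_left = (vy_ + np.sqrt(vy_**2 + 2 * g * y)) / g
    return x + vx_ * t_left

def landing_by_reestimating(p_prev, p_now, dt):
    # Strategy 2: re-derive the velocity from the last two sightings and re-predict from there.
    vx_est = (p_now[0] - p_prev[0]) / dt
    vy_est = (p_now[1] - p_prev[1]) / dt - g * dt / 2   # midpoint correction
    return landing_by_physics(p_now[0], p_now[1], vx_est, vy_est)

dt, prev = 0.05, (x0, y0)
x, y, vy_t = x0, y0, vy
while y >= 0:                                # simulate the flight, updating the estimate each step
    x, y, vy_t = x + vx * dt, y + vy_t * dt - 0.5 * g * dt**2, vy_t - g * dt
    est = landing_by_reestimating(prev, (x, y), dt)
    prev = (x, y)

print(f"one-shot prediction: {landing_by_physics(x0, y0, vx, vy):.2f} m")
print(f"running re-estimate: {est:.2f} m")   # converges on the same spot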

>> No.10248100
File: 187 KB, 850x507, An-autoradiograph-from-the-primary-visual-cortex-in-the-left-side-of-a-macaque-monkey.png [View same] [iqdb] [saucenao] [google]
10248100

>>10247728
The picture is just what a column looks like, with the neurons in its six layers. The column is just the smallest possible division of the cortical sheet, but as you increase the scale you will find increasingly broader divisions. We only really know the broadest ones (e.g., in order of scale: whole brain, vision vs other senses, what vs where, objects vs places, faces vs chairs). Ultimately any division emerges purely from the patterns at the smallest scale, i.e. each tiny column will have its own unique connections and will sit next to columns with similar connections, so as you zoom the scale out you'll find broad patterns of connections over the larger divisions too, at every scale.
E.g. all the columns in the face area will have more similar connections to each other than to those in the chair area, but both of those areas are nested in the object area. Any column in the object area will have more similar connections to other object columns than to the place area, but both areas are nested in the what area, and so on.
Ultimately connection pattern defines function.

Bearing that in mind, whole areas of the brain that seem specialised for certain things because of their connections, e.g. faces, will be divided into smaller areas that (seem to) specialise in the different features that define what a face is. Different faces have different combinations of features, so if you activate different combinations of those feature areas (networks) you are basically perceiving different faces. Much like on a primary visual retinotopic map, activating different combinations of areas might correspond to different spatial shapes (e.g. the picture is a monkey's primary visual cortex after seeing that shape, when they inject chemicals that leave activation patterns). I'm guessing it's similar with phonemes.
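
A toy way to see "connection pattern defines function" and the nesting in code (Python; the "connection vectors" and division names are fabricated, this is just the geometry of the idea): give every column the shared wiring of each division it sits inside plus some private wiring, and the face/chair/place ordering of similarities falls out on its own.

import numpy as np

rng = np.random.default_rng(2)

def make_column(shared_parts):
    # A column's "connection pattern" = the shared wiring of every division it is
    # nested in, plus its own idiosyncratic wiring.
    return sum(shared_parts) + 0.3 * rng.standard_normal(50)

# Fabricated shared connection components for each (nested) division.
what_ = rng.standard_normal(50)                               # the 'what' stream
object_, place_ = rng.standard_normal(50), rng.standard_normal(50)
face_, chair_ = rng.standard_normal(50), rng.standard_normal(50)

face_cols  = [make_column([what_, object_, face_])  for _ in range(3)]
chair_cols = [make_column([what_, object_, chair_]) for _ in range(3)]
place_cols = [make_column([what_, place_])          for _ in range(3)]

def sim(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

print("face-face :", round(sim(face_cols[0], face_cols[1]), 2))   # highest: same face area
print("face-chair:", round(sim(face_cols[0], chair_cols[0]), 2))  # lower: only the object area is shared
print("face-place:", round(sim(face_cols[0], place_cols[0]), 2))  # lowest: only the what area is shared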

>> No.10248144
File: 23 KB, 231x170, 1-s2.0-S1364661304002153-gr4.jpg [View same] [iqdb] [saucenao] [google]
10248144

>>10247728
Cont.
Now also bear in mind there is a hierarchy in the cortex which runs from the primary sensory areas to the limbic cortex. So you might get an area for faces and an area for shapes which may be divisions of similar scale but not at the same level of the hierarchy. The face area will be above, and will probably get some of its connections from the shape area below, as obviously faces have shape. (If we zoom the scale in, we would say the smaller feature areas of the face area get information from the smaller feature areas for shape.)
Above the face area in the hierarchy would be increasingly more conceptual things.
I'd suspect that words would be above phonemes in the analogous auditory hierarchy and be connected similarly.


And a note on the column picture i sent before.

The brain works by predicting sensory information. It needs to because sensory information is ambiguous and so needs to be constrained.

Each area in the brain predicts activity in the areas below it in the hierarchy, and errors from those predictions are sent back the other way. Basically, errors enter L4 from the area below, get resolved in L2/3, and the information then goes to L5, from where predictions are sent down to the L2/3 of the area below. If those predictions don't work well, then it is that L2/3 in the area below which sends the error up to L4, as at the beginning. The red neurons are inhibitory (as opposed to the black excitatory ones, which make up the whole pathway I just described). They're basically there to regulate the whole thing and control how it behaves over periods of time through inhibition. It's thought seizures happen when those inhibitory neurons aren't working properly and activity just reverberates unchallenged. On the other hand, Xanax and alcohol basically exaggerate what they do, since inhibitory neurons use GABA, which is what those drugs act on.
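
Stripped right down, that error/prediction loop looks like this in Python (one scalar signal from the lower area, one higher area, a made-up update rate; nothing about the real laminar biophysics):

sensory_input = 0.8        # what the lower area is actually being driven to
higher_estimate = 0.0      # the higher area's current prediction of it
rate = 0.3                 # made-up "resolution" rate

for step in range(10):
    prediction = higher_estimate               # sent down (the L5 -> lower L2/3 path)
    error = sensory_input - prediction         # computed below and sent back up (L2/3 -> L4)
    higher_estimate += rate * error            # higher area updates until the error is resolved
    print(f"step {step}: prediction {prediction:.3f}, error {error:.3f}")
# the error shrinks each pass as the prediction settles onto the input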

>> No.10248153

>>10247611
And btw, when I said "picture" in the first line of >>10248100, I meant the picture in the first post that you replied to, the one with the diagrams.

>> No.10248392

>>10248100
Regarding this image, it seems as if the pattern viewed is mirrored down into the lower layers of the cortex? Is this coincidence?
Also, what does the epsilon mean? Fascinating picture, thank you.

>chair area, face area
So this is a network on the physiological level. Very cool.
>the smaller feature areas of the face area get information from the smaller feature areas for shape.
By this do you mean that the little rectangles in the face area are covered by little rectangles in the shape area? If so, I follow.

>Above the face area in the hierarchy would be increasingly more conceptual things. Id suspect that words would be above phonemes in the analogous auditory hierarchy and be connected similarly.

In what way conceptual? Conceptual objects I am assuming? Conceptual "whats"?

So, if Face=word, then shape=phoneme? Interestingly, this might be supported by the fact that syntax is the first thing to register semantically, even before grammar! I believe that would fit the model of the brain's hierarchical structure that we are discussing.

>errors enter L4, resolved in L2/3, then goes to L5, where predictions are sent to the area below's L2/3

So the error enters a lower layer, becomes resolved in a higher layer, then goes to a lower layer? This sentence confused me a bit, but I get the gist and it makes sense.

Really interesting, thanks for sharing, this is greatly appreciated. Have a blog or anything? Wouldn't mind staying in contact to pick your brain on things I may come across. You seem knowledgeable.

>> No.10248872

>>10242088
Not him, but explain why it couldn't be simulated. You are literally an /x/ retard.

>> No.10248911

>>10239966
Both article author and OP are attention seeking whores spewing pretentious bullshit out of their ass, wouldn't be surprised if they're the same person. Not worth discussing.

>> No.10248950

>>10240300
>imo
Oof. No one cares about your opinion

>> No.10249174
File: 103 KB, 813x648, 1-s2.0-S1364661315000923-gr2.jpg [View same] [iqdb] [saucenao] [google]
10249174

>>10248392
>mirrored down into the lower layers of the cortex? Is this coincidence?

What do you mean by this?

I'm not sure what the epsilon is; just part of the reference system of the visual field, I guess.

>>the smaller feature areas of the face area get information from the smaller feature areas for shape
Yeah, I meant the smaller feature areas of the face area retrieve information from the shape ones, which are hierarchically lower down. I'm just trying to get across the idea that the brain has a hierarchical structure for things across its surface, but also that this structure can be looked at on different scales.

>In what way conceptual? Conceptual objects I am assuming? Conceptual "whats"?
Yeah, exactly. So e.g. abstract personal information about someone might be mediated higher up.
But yeah, I think you're right in what you said before about -topic maps. The whole cortex is pretty much just one big map.

>So this is a network on the physiological level. Very cool
other objects with similar properties/features will activate each of those same areas broadly and there is some overlap between the areas.

>So, if Face=word, then shape=phoneme?
I'm just saying their relative order in the hierarchy would be the same.

Syntax is thought to be controlled through interactions between the frontal cortex and those posterior auditory areas, so that is actually further up the hierarchy. If you think about it, this makes sense, because further up the hierarchy the brain maps more abstract, temporally extended information, e.g. sounds to words to phrases and sentences.

>So the error enters a lower layer, becomes resolved in a higher layer, then goes to a lower layer?
I've always thought of it as mid to high to low, but yeah. And then it's that last, lower level that projects the prediction back down. Quite a weird circuit, but it makes sense. And it's just a recurring structure across the whole cortex.

And no, I don't have a blog, unfortunately.

>> No.10249246

>>10247739
>We see the networks and are constantly asking - are we the literal physical phenomenon occurring in that network? Or are we differential to that system? Who are "we" inside that body?
But we don't, anon. We already know that ANNs are just linear algebra, and any 'physical phenomena' are just numeric values that change as the network model gradually processes the input data.

Although you are probably more interested in the practical uses/applications of ANNs.
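
To that point, a whole "forward pass" of a small network really is just a couple of matrix products (numpy; arbitrary random weights, a toy sketch rather than anyone's trained model):

import numpy as np

rng = np.random.default_rng(3)
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)    # layer 1 weights and biases
W2, b2 = rng.standard_normal((8, 2)), np.zeros(2)    # layer 2 weights and biases

def forward(x):
    # The "physical phenomenon" is nothing but matrix products and a pointwise nonlinearity.
    h = np.maximum(0.0, x @ W1 + b1)    # ReLU hidden layer
    return h @ W2 + b2

print(forward(rng.standard_normal(4)))  # two output numbers, nothing mysterious going on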

>> No.10249726

>>10249174
>what do you mean by thus?
I mean that it seems as if the pattern that the monkey viewed was represented 1:1 in the cortex. Perhaps I'm misunderstanding, but this seems as if we showed the monkey a square, and opened the cortex to find a square in the autoradiograph. This is fascinating, if true.

This also makes sense because, analogously, muscles are recruited in much the same way: smaller ones are recruited first and then larger ones as necessary. Perhaps something like this works within brain areas. This is also interesting because, if the brain is organized into columns, it fits a larger picture of the body being organized the same way throughout the muscles (i.e., muscles are fibers containing fibers containing spindles, etc.), much as the neural doctrine asserted.

What are your personal thoughts on this thread and OP's/the article's assertions, btw? And also your thoughts on the other far or slightly out-of-reach topics and problems of neuroscience and psychology today, i.e. consciousness, intelligence, neurogenesis?

>> No.10249795

>>10249726
Yeah, I'm pretty sure it's 1:1. Everything in the brain should be; it's just that in these primitive visual areas we have the luxury of seeing it explicitly.

What's the neural doctrine?
With the brain I'm not sure how much primacy I'd give to small vs. large. Essentially it's an ongoing, balanced struggle between the top-down and bottom-up sides of hierarchical processing, as seen in the error vs. prediction interaction in the circuit I described. I might even go as far as to say that the top-down side, the information at the top of the hierarchy, is more important. You can alter the top-down balance though; anticholinergic drugs and schizophrenia are classic examples.

I'll give some thoughts in the next post.

>> No.10249802

>>10249726
Oh, forget what I was saying about top-down/bottom-up re: what you said; I think I misinterpreted what you meant, but I get you now.

maybe you could clarify about
>They recruit smaller muscles first and then larger ones as necessary. Perhaps this works within the brain areas

>> No.10249825

>>10249802
>maybe you could clarify about

Well, it shows that we are hierarchical creatures, I suppose. If muscles are recruited hierarchically, then I suppose the brain's hierarchical structure is in line with the neuron doctrine. Although the doctrine is defined as "the concept that the nervous system is made up of discrete individual cells", it came about because Camillo Golgi posited that the brain was continuous, while Ramón y Cajal insisted that its structures are still cellular. In essence, Golgi tried to make the case that the brain was an exceptional circumstance at the cellular level and would thus redefine our understanding, whereas Cajal took the position of sticking with the fundamental understanding we already had. Perhaps this way of thinking carries over today, where we treat the brain as more distinct from other tissue, like muscle, than it actually is; obviously, the brain is distinct in a way, though.

Thoughts on the big questions of neuro? Also, are you a student?

>> No.10249841

>>10249726
The article is really long, but from what I've read I don't like it much; it seems a bit sensationalist, and a problem, as seen in this thread, is that the definition of "being like a computer" is quite broad. I generally accept the view it is supposed to be arguing for, but at the same time I feel like he's just talking about problems that haven't been solved generally, as opposed to some kind of novel viewpoint everyone else is missing.

The hard problem of consciousness is not a subject for neuroscience, but I think the easy problems of the neural correlates, how they work and why, are within reach.

Neurogenesis: I don't know what the issue is with that. I just know it only occurs in two non-cortical brain areas.

Intelligence: people often say IQ doesn't measure intelligence when they have no idea how to define intelligence, so they have little grounds to evaluate ANY type of test someone puts in front of them. IQ's advantage is its predictive ability, but then again the questions as to why are speculative (even if seemingly common sense), and other traits are good predictors too. My suspicion is that IQ, or g, isn't measuring a definite, definable trait but rather the effects of variations in types of brain structure. People here also emphasize IQ way too much, when any individual's life events are far too complicated to be predicted or valued based on IQ.

>> No.10249884
File: 258 KB, 771x578, Bild51721.png [View same] [iqdb] [saucenao] [google]
10249884

>>10249825
Oh yeah, the whole body is necessarily hierarchical in terms of scale: cells, tissues, organs, body. The cortex is hierarchical both in scale and across its connections.

You would find the recent use of Markov blankets in neuroscience interesting: how to define a living thing, and how those things are nested within multicellular life like people. It may be a difficult concept though. And no, I'm not a student.

>> No.10249931

>>10239967
no backward propagation of errors in the brain.
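
Right; for contrast, a toy Python sketch (single linear unit, made-up learning rate, not a claim about real synaptic plasticity): a backprop-style step needs the error fed back from outside the synapse, while a Hebbian-style step only uses the locally available pre- and post-synaptic activity.

import numpy as np

rng = np.random.default_rng(4)
x, target = rng.standard_normal(5), 1.0     # input activity and a desired output
w = 0.1 * rng.standard_normal(5)            # synaptic weights
lr = 0.05

output = w @ x

# Backprop-style (delta rule) update: requires the error signal (target - output) to reach the synapse.
w_backprop = w + lr * (target - output) * x

# Hebbian-style update: only the local pre-synaptic x and post-synaptic output are used.
w_hebbian = w + lr * output * x

print("error used by the backprop step:", round(float(target - output), 3))
print("the Hebbian step never saw the target at all")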

>> No.10249955

>>10239966
Literally just lies. Bad troll.

>> No.10250001

>>10249841
>>10249884
Interesting. Thanks for the suggestion, I will definitely check it out. Ya, disregard neurogenesis. It's an interesting tool we may be able to introduce to the rest of the brain. But only through genetic engineering, I believe.

Regarding intelligence, I agree with you, and want to point out that IQ has also been traced to the PFIT network. Although Jung and Haier got a lot of flak for equating IQ with intelligence, it seems this may be the real deal. While IQ may be a tad off the mark in assigning numbers to individuals, I think the finding that much of the variation in IQ test performance comes down to the connectivity and density of the BAs involved in the PFIT network is a huge deal; from what I remember the correlations ranged from about .3 to .7.

I'm interested in what causes the individual variation in this (something I would have liked to research if neuro weren't so damn low-paying). Another interesting finding is that fMRI activation in women solving math (I think it's math, but perhaps analytical problems) was broad across both frontal lobes, while in men it was more localized to the PFIT areas. Of course, this is all very speculative, with the subject being politically charged, open to bias from either direction, and lacking much replication.

How is it that you've become knowledgeable without formal education? Textbooks and self-study? Is it related to what you do?

>> No.10250072

>>10250001
>It's an interesting tool we may be able to introduce to the rest of the brain.
I'm not sure how useful it would be.

>PFIT
What's that?

I did go to school and I used to work in a lab, but desu most of my knowledge is from my own reading of papers. You don't really get taught anything interesting in school imo, apart from practical stuff I guess. Courses just aren't deep enough, rarely teach groundbreaking or novel material, and half the time the lecturers don't even like teaching or aren't even covering their own field. Textbooks are bad too; thin and overconservative. You never get more than a superficial understanding.

>> No.10250202

>>10249955
I suspect it's idiocy rather than deliberate lying. But yeah, most of the stuff written in that article is just false.

>> No.10250223

>>10250072
>>>10250001
>>It's an interesting tool we may be able to introduce to the rest of the brain.
>Im not sure how useful it would be.
It would be very practical for CTE. Also would be a door for cortical implants.

>>PFIT
>whats that?
It is a cogneuro theory, the parieto-frontal integration theory, proposed in 2007 from a meta-analysis of ~30 fMRI studies. The theory says that you relay info from parietal to frontal via the arcuate fasciculus. The book by Haier also outlines the specific BAs involved, but I forget them exactly. I believe the theory suggests that you "hold" info in your frontal lobe, modulate it in your parietal and superior parietal areas, and work out solutions between the two areas through the AF. It's very interesting, but I think rather nascent. You can find it online, but I'm phoneposting so I cannot link it, unfortunately.

>I did go to school and I used to work in a lab, but desu most of my knowledge is from my own reading of papers. You don't really get taught anything interesting in school imo, apart from practical stuff I guess. Courses just aren't deep enough, rarely teach groundbreaking or novel material, and half the time the lecturers don't even like teaching or aren't even covering their own field. Textbooks are bad too; thin and overconservative. You never get more than a superficial understanding.

Ya, I bought several neuro textbooks (Kandel, Bear, Barich) and realized that my familiarity with concepts is extended more by papers. Right now I'm trying to read a review of working memory studies involving Ana Shoehler. Fascinating stuff, also in line with the PFIT via the dlPFC. I currently work in a physiology lab and it's somewhat unfulfilling. It seems things can look better from the outside sometimes.

Have you read anything on creativity? Also interesting is Dr. Syed's creation of the neurochip (he used a silicon chip and got snail neurons to potentiate!).

What do you do now?

>> No.10250382

>>10250223
>It would be very practical for CTE. Also would be a door for cortical implants
this is true

>PFIT
Ah, I've heard of the frontoparietal relation but not specifically of that theory.

>Ana Shoehler
Can't find anything about her. Guessing you're most interested in intelligence and similar functions, then? Prefrontal stuff is quite interesting.

Read a couple of things on creativity but not much: stuff on the default mode network, mind wandering, flexibility, that kind of thing.

Yeah, those chips sound awesome. Technology is perhaps the most important thing for advancement right now.

Nothing really at the moment; bar work, just in between the last thing and the next.

>> No.10250465

>>10250382
Lmao, I'm sorry. It's Patricia Goldman-Rakic; idk why I thought her name was Ana Shoehler. Ya, she did some pioneering work on working memory; the current theory is continuous firing from cells while looking away from the stimulus, though I've not read much of it. Somewhat related to the dorsal and ventral streams of the occipito-parietal and occipito-temporal lobes, except these streams project to the dorsal and ventral dlPFC for visuospatial (dorsal) and object (ventral) memory. Also known as the where/how and what pathways respectively (when referring to the projection to the temporal lobe). I'm not too knowledgeable about these streams, just somewhat familiar.

I'd be pretty interested to see the variation between people on this, much like the article's suggestion that you would need to understand someone's entire life to read their neurons. How individual life experiences change one's cognitive neurobiology: physiology, circuitry, etc.

Ya, I'm curious as to what giftedness is and how it develops. I'd guess it's a brain structure that is advantageous relative to the current set of problems a society faces; granted, this hypothesis would be contrasted by one where giftedness is a consistent structure that allows one to pioneer in all endeavors. I'd also be interested in some sort of typology of brain structures, wiring, etc., something I think we all experience socially.

Ahh, bar work, ya. C'est la vie, man. We were born too early for the neuro revolution; probably missed it by about 25 years?

>> No.10250555

>>10240347
quality post

>> No.10250581

>>10250465
Ah yeah, I've read her, especially when I was in school. Very important use of animals for prefrontal + dopamine work.
D'Esposito has a very good general review of the cognitive neuroscience of working memory (2015).

>Somewhat related to the dorsal and ventral streams
You'll find every sensory modality + prefrontal & motor has a dorsal/ventral divide.

>I'd be pretty interested to see the variation between people on this, much like the article's suggestion that you would need to understand someone's entire life to read their neurons

>giftedness
Guess it's a very socially loaded term. I've read IQ is strongly related to working memory updating performance and relational integration; other than those connections it's hard to guess what it would be related to generally. I guess people just don't know how it develops. Genes must be a massive part; I've read Robert Plomin suggesting it is based on pleiotropic genes that contribute to all cognitive tasks.

>probably missed it by about 25 years
Ha, or 25 years too late. Will probably just end up in a lab again. I would do more education but can't really afford it.

>> No.10250650

I feel things pretty hard behind my eyes