
/lit/ - Literature



File: 39 KB, 628x314, 5cc16afd2400003200250663.jpg
No.14763136

There is no such thing as 'consciousness'.

Every being is driven solely by its need to perform actions that benefit it. Take an ant, for example. An ant is quite content simply performing its duties: foraging, building, etc. It has no need for the ability to muse on a piece of literature or debate philosophy with its peers. Why would it? Such things do not benefit it in any way; it simply needs to perform its functions to survive, and it is quite content in doing so. The same cannot be said for a family of chimps. Chimps are highly social creatures; while still performing their roles to forage and survive, they also choose to play with each other and, in some instances, assert social dominance. Does this social role mean that a chimp is at a higher state of 'consciousness' than the ant? I would argue not: the social aspect of chimp life is a function/job and isn't any different from an ant foraging for food. Both of these roles have the same end goal, to ensure that the being is content with its existence.

Even now, as I write this, I am simply performing a role to make myself content. Pondering this subject makes me happy and I enjoy doing it, which in turn will make me perform my other functions better in general. Happiness is a great stimulant for success: happier humans tend to perform better in the workplace, in social settings, etc. But does this make me any different from the ant? While I agree that musing on the subject of consciousness is far more complex than foraging for food, the end goal is largely the same. So even with a more complex function, can I really assert that I am at a higher level of consciousness than any other sentient creature going about its day performing its many functions? I don't think so.

>> No.14763161

>>14763136
You're right, 'consciousness' doesn't exist. (And even if it did, it would be completely irrelevant.)

What we should be talking about is free will.

>> No.14763163

>>14763136
It's not about "the end result". It's about the computational complexity of brains. An ant's brain has fewer neural connections than a chimp's brain. They are both sentient creatures, but the latter has more sophisticated cognitive resources than the former.

>> No.14763164

>>14763161
Free will definitely doesn't exist.

>> No.14763174
File: 189 KB, 1200x1555, 30901b9046459449c2aa7c574733cf85-imagepng.png

>>14763163
Like I said in the OP, there is no doubt that certain tasks demand higher brain complexity, but what I am talking about is the level of consciousness we as humans seem to label different creatures with. We look at chimps and see great similarities with ourselves: social aspects intermingling with our more 'primal' needs to hunt/forage, etc.

I would agree with >>14763164 on this. Our level of consciousness is dictated solely by the complexity of the tasks we need to perform. Human beings need more complex brains because our lives are built around more complex functions. But I'd argue that because the end result is the same, we have no right to say we are at a higher level of 'consciousness' than an ant; we might be smarter in some respects, but we are both simply performing our functions.

>> No.14763181

So your point is that every animal's behavior is goal-directed and therefore consciousness doesn't exist?

>> No.14763207
File: 6 KB, 300x168, 90978182.jpg

>>14763181
I'm talking more about the end goals of our functions than the functions themselves. There are more intelligent beings and less intelligent beings, but this is dictated only by the roles that we need to perform, i.e. smarter beings tend to have more complex roles. But does this mean the level of consciousness is any different between these beings? I don't think so, which is why consciousness is completely irrelevant. You can't measure or quantify it, or even prove its existence, like you can with intelligence.

Take religion, for example. It is a powerful stimulant irrespective of whether or not a higher metaphysical plane exists. If the believer simply holds it to be real, then to him it is, and to that end it lends its beneficial effects. Faith makes the religious content, just as in >>14763136 I said that musing on this subject makes me happy. Whether or not the function is complex, the end goal remains the same. So can we really say we are at a higher level of 'consciousness' than an ant colony performing its various functions?

The point I am trying to make is that 'consciousness' is a pointless, arbitrary term and should be done away with.

>> No.14763233

>>14763207
So we are conscious but only gradually more so than an ant and not categorically? What would make for a categorical difference in consciousness, in your opinion?

>> No.14763238

>>14763136
>he thought as he typed it out

>> No.14763273

>>14763136
>>14763174
None of this is relevant to the shitskin invasion of Europe, and the ongoing global genocide of the White Race.

>> No.14763290
File: 95 KB, 720x538, Ivanov-art-6.jpg

>>14763233
No, consciousness does not exist. We can say we are more intelligent than an ant; that we can measure, thanks to the complexity of our functions. But we as humans (not you or anyone specifically) tend to see creatures such as ants as less conscious than ourselves simply because our functions are different, even though the end result is the same: contentment. The only factor that creates a difference in our roles is the intelligence required to perform them. We are not more 'conscious' than an ant because we can muse on the divine; more intelligent perhaps, but not more conscious. This is because consciousness isn't real, and I'd like to do away with the concept.

The only benefit we as humans can get from trying to figure out what consciousness is, is the happiness we get from trying to figure it out at all. It is no different from the religion analogy. Consciousness is not real and there is no way of proving that it is, but despite this we do benefit from discussing the topic. We are content when discussing topics that interest us, just as the ant is content when it is foraging for food. These are both 'functions', and the end result is the same irrespective of the intelligence required to perform them.

So in that respect, the ant has it far better off: it can be content performing only simple functions, whereas we must put more effort and intelligence into achieving the same result. Contentment is contentment. You cannot be more content or less content; you are either in that state or you are not.

>>14763273
The relevance is actually much the same. Racism is a function that you perform because it is significant to you, in the same way that talking about consciousness is significant to me. Musing on these subjects achieves the same outcome of contentment for both of us.

>> No.14763312
File: 274 KB, 1908x1146, crow solving puzzle.jpg

>>14763136
You're just playing a semantics game.
You recognize humans have a more complex set of functions to strive for "being content". Most people call that more complex set of functions "consciousness". You don't, but you still recognize it exists. You just don't like to think of that as being consciousness.
You kind of destroyed your own argument.

>> No.14763325
File: 48 KB, 1080x743, m5v9ns4.jpg

>>14763136
This is the same conclusion I reached: no one is conscious (except me). You see, I can clearly perceive a self separate from my experience of reality itself. However, I find no evidence for this in others. Of course everyone will assert they have a conscious experience of reality, but this is as reliable as a clockwork toy saying so; it's all just biomechanical gears primed for survival, so of course they will all assert their consciousness, as they have no frame of reference to compare against. My conclusion is that I have a soul, but I have no idea about anyone else. All medical and biological research has been done on other humans, so of course the models apply to those humans. But no one has ever researched ME, so how can I trust the assertion that my conscious experience is the same as all the other flesh-bots'?
tl;dr I agree with 99.99% of what you said anon, just with one small change.

>> No.14763336
File: 452 KB, 670x363, ghost-5.png

>>14763312
You're right, most people do call a more complex set of functions 'consciousness', but those people are wrong; that is the point I am trying to make. Why do we link complex functions with an arbitrary term like 'consciousness' when in fact it correlates more with intelligence?

I'm trying to say that rather than spend time figuring out what consciousness is and labelling beings as having a certain level of it, as if we were on some sort of fucking Joe Rogan podcast, we should take a more materialist view and look at what the input and outcome are. The mental input changes depending on the complexity of the task, but because the output of contentment is the same across the board, are we really that different from other beings?

>> No.14763347

>>14763290
I don't quite understand what conception of consciousness you're arguing against. Consciousness, as I (and most people) understand it, is a useful term to designate people who are conscious as opposed to unconscious or who are conscious of something, e.g. a pain in their foot or whatever. But you seem to be referring to something completely different, more related to meaning or higher purpose. Could you try to spell out what exactly you mean by consciousness?

>> No.14763358
File: 87 KB, 830x960, 87427644_135339817973956_5036831091458048000_n.jpg

>>14763325
But are you really conscious? Or are you just performing the same functions as me and everyone else? Your perspective on this is irrelevant and, as you said, you can't measure it in others. So, using your own analogy, from my perspective I cannot prove that you are conscious, so how can you assert that you are?

It makes more sense to me that none of us are conscious, the only thing driving us is completing our various functions in order to reach contentment.

>>14763347
I'm not referring to the physical state of consciousness and unconsciousness, but rather to the question of free will and sentience. I thought that was clear from the beginning.

>> No.14763370

>>14763358
Being conscious (as opposed to unconscious) presupposes being sentient, as does the second use of the term I mentioned. So you agree that there is, in fact, use for the term "consciousness", namely in designating people who are conscious/unconscious and in describing the fact that you're conscious of certain sensations, yes? And what you're really arguing against, then, is free will (and presumably that human beings, as opposed to animals, have some sort of higher purpose), yes? Then do yourself a favor next time and use the appropriate terms from the beginning, if you want people to understand what you mean.

>> No.14763381
File: 14 KB, 480x360, octopus solving puzzle.jpg

>>14763336
>Why do we link complex functions with an arbitrary term like 'consciousness' when in fact it correlates more with intelligence.
Because part of our set of functions is designing words for arbitrary terms. Language is one of those functions that we developed far more than other animals did, apparently, and using it apparently makes us content.
>but because the output of contentment is the same across the board
In what sense? "We are all content"? But don't you ever feel like certain things bring you more of this feeling than some other things? How do we know the amount of "being content" is the same for different people? How do we extrapolate this for different animals?
I'm not sure what conception of "consciousness" you had beforehand, but the little I have studied about this tells me the current understanding of consciousness is along the lines of what you say. Animals are considered to have different levels of consciousness, much like you say they have different levels of intelligence.
As in, an animal with eyes has an extra dimension of consciousness compared to an animal without eyes, because it can use visual information to locate food, etc.

The main problem when it comes to humans is that we don't have a full grasp of all of our functions, or how they relate to how content we feel, or even the exact number of inputs/outputs coming to/from the brain. I also believe we could talk about all human beings having complex sets of functions, and all human beings being content, and still have human beings with different levels of intelligence. So it seems to make sense to have a separate word for the "set of functions", besides intelligence. Maybe just give the word "consciousness" a try as that, instead of whatever it is you currently think it is.

>> No.14763387

>>14763358
I experience qualia, and I have read David Chalmers's paper on the subject. He ascribes qualia to organisational invariance, but organisational invariance as a justification for qualia as an evolutionary trait is weak, because it depends on one of two arguments:
>a priori materialism/physicalism
This argument presumes materialism to be the de facto basis for the properties of the universe, but the point of qualia is that non-material properties (if they existed) could only be investigated non-materially. So you're essentially saying that materialism is true because it's definitely true. You need to explain qualia in its causal relationship with materialism.
>evolutionary selection
The next part of the argument is that organisational invariance is a result of selective evolutionary pressure. This is weak because it relies on the idea that a 'p-zombie', a non-qualia organism, cannot evolve, i.e. that through emergence, as an organism reaches a certain complexity, it stitches sensory experience into a unified consciousness. The problem is that it IS perfectly possible to imagine a human that operates as a deterministic cascade of non-unified senses, e.g. no qualitative perception of any sense, just immediate reaction based on separately processed data points. This leads to the final key argument: emergence.
The idea is that processing data CAUSES sensory experience of that data, the two being inseparable. But this is literally just an assertion. I can very easily imagine a biological machine that does this, but it's impossible to test for, because again, if it's an emergent property, then a machine built to copy human behaviour perfectly, according to emergence, EXPERIENCES that reality much as I do.
In essence it's a long, circular argument that basically says qualia are physical because everything is physical.
In truth we must each operate on the data we receive; to do otherwise is to live in bad faith. I know I'm conscious because I experience it.

>> No.14763400
File: 89 KB, 976x549, p01bfbdy.jpg

>>14763381
>But don't you ever feel like certain things bring you more of this feeling
Yes I do; I can feel somewhat content or extremely content. I have had this inner monologue with myself, as I knew this question would come up; I've asked it myself. But as far as I can tell there is no way of measuring contentment past the point at which YOU specifically feel it. I can only surmise that, because of this, contentment is a state of being. You may feel like you are more or less content, but in reality you are just that: content.

I have no issue with humans using arbitrary words for concepts in which we understand little, but at the end of the day consciousness, however useful a word, is still the wrong word.

>> No.14763439

>>14763400
I'm not sure what other people think of consciousness, but you seem to assume everyone has the same definition and understanding of it.
I'm telling you my definition seems to overlap a lot with whatever it is you call "set of functions" or "levels of intelligence". I'm not sure what you mean by it being "the wrong term".

There are plenty of subjective experiences we can't easily measure. Take the Scoville scale, for instance. It's supposed to measure how hot/spicy something is, yet two different people might eat the same pepper and have two completely different reactions. I'm willing to bet even genetically identical twins would not have exactly the same reaction. Everything related to our subjective experiences is hard to measure, even if we ascribe numbers to it.

In much the same way spiciness has different levels, and these perceptions vary for different people suffering identical stimuli, I have no reason to believe consciousness is different, especially if what we call consciousness is the sum of all those input/output relationships.

>> No.14763456

>>14763164
>Free will definitely doesn't exist.
Of course it does.

Information complexity (which is the same thing as information entropy) exists in nature and is something that can be measured.

Another thing that exists in nature are various singularities - like, for example, black holes.

A singularity of information complexity is what we call 'free will'.

Your bugman response is pure ideology and definitely is not science.
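Whatever one makes of the 'singularity' claim, the measurable quantity this anon invokes is real: Shannon entropy of an observed distribution can be computed directly. A minimal Python sketch (the function name and examples are my own illustration, not anything from these posts):

```python
from collections import Counter
from math import log2

def shannon_entropy(observations):
    """Shannon entropy, in bits per symbol, of the empirical
    distribution of a sequence of observations."""
    counts = Counter(observations)
    total = len(observations)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A uniform four-symbol source carries exactly 2 bits per symbol;
# a skewed source carries less, because it is more predictable.
print(shannon_entropy("ABCD" * 100))  # 2.0
print(shannon_entropy("AAAB" * 100))  # ~0.811
```

Whether a number like this says anything about 'free will' is, of course, exactly what is in dispute in this thread.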

>> No.14763459

>>14763456
Can you show how this is calculated for a simple organism like a bacterium? I'm pretty sure they don't have free will when it comes to reacting to environmental stimuli.
Can you then show at exactly which level of biological complexity this "non free will" becomes "free will"?
(I know you can't and you're probably some kind of theoretical-physicist/mathematician who never measured a single thing in your life, so get the fuck out of here with your spherical chicken models)

>> No.14763478

>>14763459
> Can you then show at exactly which level of biological complexity this "non free will" becomes "free will"?

You're using the word 'complexity' in some loose bugman reddit sense of the word, even though in this context it has a strict mathematical (and physical) definition.

> Can you show how this is calculated for a simple organism like a bacteria?

I'm sure statistical observation and/or modeling can give some bounds for an information-complexity estimate for a bacterium.

If anybody actually cared to do this, that is. Obviously nobody cares, because this is all about bugman ideology and not actual science.

> I'm pretty sure they don't have free will when it comes to reacting to environmental stimuli.

An interesting question to ponder: this isn't about biology. For example, does the stock market, as an aggregate sum of human behavior, have free will? Is its information complexity greater or less than the information complexity of its constituent parts?

>> No.14763493
File: 212 KB, 1710x2048, nYTAFuN.jpg

>>14763439
You're neglecting to consider the additional mitigating factors different people have in relation to the Scoville scale. While I'd agree that this scale is largely arbitrary, it is still far easier to quantify and measure than consciousness. The Scoville scale sets an average (for lack of a better term), but it can never be 100% correct. People are different and will react differently relative to their tolerance of hot food. Also
>hot sauce

C'mon lad

>> No.14763518

>>14763478
>You're using the word 'complexity' in some loose bugman reddit sense of the word
Nice try, but I'm not gonna let you run away that easily. You told me it has a very strict mathematical definition. I'm pretty sure you're gonna come at me with some information theory shit about average amounts of information and possible outcomes. So please go ahead and apply it to a unicellular organism, the simplest form of life on earth. Give me the equation for the information entropy of a single cell.
>I'm sure statistical observation and/or modeling can give some bounds for a bacteria information complexity estimate.
Well, I'm """sure""" it can only give an averaged account of what bacteria do on average. That's not what's being asked here.
>Obviously nobody cares, because this is all about bugman ideology and not actual science.
If you could demonstrate rigorously that consciousness can be predicted/measured by information entropy in any sort of reliable or meaningful way I can guarantee to you you'd publish that shit on Nature/Science/PNAS or whatever the fuck. You're just 100% full of shit. The matter of consciousness is one of the oldest unsolved problems both for soft and hard sciences.
>this isn't about biology.
This is 100% about biology.
>does the stock market, as an aggregate sum of human behavior, have free will?
The stock market is not a living being. Free will is ill defined for non-living beings. I believe you might be the first idiot to even go in that direction. So no, the stock market does not have free will. The individual humans in it might have it, although I'm inclined to believe they don't.

Let me ask you something too then. Can you tell me about a macroscopic phenomenon that is truly random? Say, when you tell me information complexity of 5 successive coin throws, and you assume them to be random, are you telling me the outcomes of those macroscopic events are not 100% pre-defined by their initial conditions?
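For concreteness on the coin-throw example: if you model n flips as independent with heads-probability p, the sequence entropy is just n times the per-flip entropy, and it reflects the model's uncertainty, not any metaphysical randomness, which is the point being made above. A small Python sketch (my own illustration, with a hypothetical function name):

```python
from math import log2

def sequence_entropy_bits(n_flips, p_heads=0.5):
    """Entropy, in bits, of n independent coin flips with bias p_heads.
    A deterministic coin (p = 0 or 1) carries zero information."""
    if p_heads in (0.0, 1.0):
        return 0.0
    per_flip = -(p_heads * log2(p_heads) + (1 - p_heads) * log2(1 - p_heads))
    return n_flips * per_flip

print(sequence_entropy_bits(5))       # 5.0 bits for a fair coin
print(sequence_entropy_bits(5, 0.9))  # less: a biased coin is more predictable
```

Note that if the flips really were fully determined by their initial conditions, a model with access to those conditions would assign the sequence zero entropy; the number measures ignorance, not physics.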

>> No.14763537

>>14763493
Taste is one of the input/output functions humans have (I'm trying to stay within your terminology).
Spice is one of the types of things we can taste.
I'm trying to attack your notion that because something related to subjective experiences can't be measured easily, it must mean it's a state of being. I just couldn't arrive at that same conclusion.

>> No.14763539

>>14763518
> Well, I'm """sure""" it can only give an averaged account of what bacteria do on average. That's not what's being asked here.
Your use of the word 'on average' is incorrect in this context.

But yes, some aggregate knowledge of statistical distributions for behavior is exactly what we're looking for here.

> You told me it has a very strict mathematical definition.
Go educate yourself. Start with Wikipedia, might be easy enough even for you to understand:

https://en.wikipedia.org/wiki/Entropy_(information_theory)
https://en.wikipedia.org/wiki/Kolmogorov_complexity
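A side note on the second link: Kolmogorov complexity is uncomputable in general, so in practice people fall back on compressed size, which only upper-bounds it. A toy Python sketch of that stand-in (my own illustration, not something either anon proposed):

```python
import os
import zlib

def compression_proxy(data: bytes) -> int:
    """Crude upper-bound proxy for Kolmogorov complexity:
    the zlib-compressed size of the data, in bytes."""
    return len(zlib.compress(data, level=9))

structured = b"ab" * 500       # highly regular: a short program generates it
random_ish = os.urandom(1000)  # incompressible in expectation

# The regular string compresses far better than the random bytes.
print(compression_proxy(structured) < compression_proxy(random_ish))  # True
```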

> If you could demonstrate rigorously that consciousness...
Again: there is no such thing as "consciousness". Please stop using this meaningless bugman term.

> Free will is ill defined for non-living beings.
No it is not, you absolute mongoloid retard.

I already gave you the correct definition: what we call 'free will' is properly defined as an information complexity singularity.

> Can you tell me about a macroscopic phenomenon that is truly random?
No such thing as 'macroscopic', please stop using bugman keywords you don't understand or know the meaning of.

>> No.14763552

>>14763537
I get what you're getting at. I'm not trying to state that all subjective experiences become a state of being, but I feel that is certainly the case with contentment.

>> No.14763556
File: 24 KB, 661x492, 300A1317-4001-41E0-A590-0683722F8A43.jpg

An analysis of free will begins with simple objects being manipulated in simple ways. When a reasonable degree of orderliness appears, the arrangements can be made more complex.

>> No.14763576

>>14763539
>Your use of the word 'on average' is incorrect in this context.
You really should start justifying this random shit you say. Your "corrections" are otherwise meaningless.
You could have every single human being react to the same stimuli in the same way and still be unable to determine whether they have free will or not. The fact this simple methodological flaw seems to escape you stems from the fact that, again, I'm 99% sure at this point you're some sort of theoretical scientist that has never done a single measurement in your life. Go ahead, correct me on how "99% sure" is incorrectly used.
>Go educate yourself.
Am always doing so. You should try it too.
>Please stop using this meaningless bugman term.
Please demonstrate it, preferably with a model that can be confirmed by measurements.
>No it is not, you absolute mongoloid retard.
Did staying too long in some statistics department deprive your brain of basic human contact? Do you honestly think "the stock market" entity decides things, for real? Do you think it's a living being? Do you think earth is a living being too? Do you think two people deciding something together form a new organism? Are you some kind of well programmed chat bot?
>information complexity singularity.
You can't even begin to explain to me how to calculate it for a single celled organism, and you're trying to convince me human decision making processes represent a singularity. This is why no one likes fucking theoretical shit eaters like you.
>No such thing as 'macroscopic', please stop using bugman keywords you don't understand or know the meaning of.
Please study statistical physics and read papers on semi-classical experiments and make a single fucking measurement in your fucking useless piece of shit inbred mouth breathing chalk eating life.

>> No.14763613

>>14763136
Then kys

>> No.14764077

>>14763136
dont compare me to an ant bro i dont like it

>> No.14764270

>>14763136
>There is no such thing as 'consciousness'.
BEEB BOOP

>> No.14764454

>>14763136
You're such a faggot, lol.

>> No.14764546

>>14763456
moron

>> No.14764833

>>14763136
>posits a Thing as not-Being-the-case
>Thing
>Being
Yeah, I'm thinking you're spooked beyond absolution

>> No.14764837

Wait...

>> No.14764946
File: 26 KB, 464x429, A063E33F-3F64-4892-8558-39ABBA46DDFD.jpg

The difference between you and the ant is a matter of complexity. Most of the decisions made by an ant colony as a whole are the queen's: she decides where to settle and start pumping out offspring. Her offspring are machinelike because they are genetic extensions of her, especially the first few generations. You, on the other hand, are a member of a species that adopted a more flexible approach to social behavior, which gave you the comfort and investment to develop an UNSUSTAINABLY huge brain.

That brain is motivated by reward, sure, but it’s so densely forested with neurons that have so many complicated, plastic pathways between act and reward that you can find a million novel ones.

You can want a reward and take a winding, esoteric mental route to get there, informed and shaped by factors like “will this be as good as a delayed reward with greater intensity?” or “will this hurt my standing in the group?” (again, your amazing brain can consider the “group” to be anything from your family to the human species to some metaphysical gestalt “life”).

Being disturbed that your brain is motivated by reward is like being disturbed that DNA has only four nucleotides. It's simple at the base, sure, but complexity gives the system its identity. It's a toolset.