
/sci/ - Science & Math



File: 99 KB, 500x375, 611VJSyJsnL.jpg
No.4278211

>mfw there are people who believe things without evidence or direct experience

If you don't know something, know that you don't know. Don't just make something up or believe any nonsensical explanation just so you can feel better about the fact that deep down you don't know anything for sure about your own existence.

I'm a skeptic.

>> No.4278231

The hard line of Aspergerism

>> No.4278234

>implying you can ever be 100% certain of anything

>> No.4278235

>>4278231
>Aspergerism
Are you retarded?
Also, I like how every moron on /sci/ uses Asperger's as an insult. What is so bad about being an aspie? Why do you have a problem with autistics? We are not that different from you, idiot.

>> No.4278241

>>4278234
I am 100% certain that you posted what you just said and that I will post this.

>> No.4278245

>>4278241
No you aren't. No you aren't.

>> No.4278253

>>4278234
* This is a hand.
* Here is another hand.
* There are at least two external objects in the world.
* Therefore an external world exists.

>> No.4278254 [DELETED] 
File: 190 KB, 900x939, Mandelbulb_Icosahedron.jpg

Ontological nihilism

>> No.4278259

>>4278253
herp derp
therofre HERP EDERP

>> No.4278263

Materialism mofo

>> No.4278279

>>4278234
This is true.
Although I think humans can't be sure about anything, truth has to exist in some form or another.

>> No.4278280

>>4278211

Some people would rather feel comforted. The value of truth and comfort is subjective. As a result, you can't make any claims championing either.

>> No.4278282

What if you have evidence for something but not the logical capacity to process it?
>psychedelic experience

>> No.4278283

>>4278234
Suppose you can never be certain of anything, but then you're certain of not being certain of anything, which is a contradiction.

Therefore, there exist statements that we can be certain of.

>> No.4278299

>>4278280

Philosophies are essentially a method of determining some form of personal paradigm for truth and understanding. There is no transcendental truth. You privilege a more skeptical and "scientific" outlook on the development of knowledge. You value a certain view of what evidence is and therefore piece together an "empirically" based list of things that can be valued as evidence. A person in New Guinea who believes in spirits has absolutely as much "evidence" as you do, based upon their own personal paradigms, and uses it to validate their beliefs.

You are no different than any other in the sense that the construction of meaning is universally arbitrary.

>> No.4278310

Not sure how to describe it, but it goes a little something like this:

"Knowledge" is beyond human comprehension. It's not that we don't know, it's that we have no means of verifying the truths we reach. The only things we have are:
1) A goal
2) How well we achieve that goal.

God may or may not exist. The soul may or may not exist. Reality may or may not exist. "You" may or may not exist. The only assumptions humans can make are conventional ones.

While I do believe knowledge is unobtainable, I also believe that the search for knowledge is of intrinsic worth. What makes us special is our constant attempts to explain or understand reality in a conventional manner. An animal might see a stick long enough to know to avoid it, but a human might look at it longer and try to make use of it.

Long story short:
Reality is subjective. But believing you can pray a shuttle into space doesn't mean you'll be sending anyone up there soon.

>> No.4278338

>>4278310

For HUMAN BEINGS reality is subjective. Not reality in itself. Even if the reality is that all reality is subjective, that would make it objective.

>> No.4278345

>>4278283
It's actually just a paradox.
You can't be certain you can't be certain about anything.
You can't be certain you can't be certain you can't be certain about anything.
And so on...

>> No.4278348

>>4278345
To be clear, each step adds another "You can't be certain" to the beginning, so you can go on forever.
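
A minimal way to see the regress these two posts describe: each level of doubt just prepends another "You can't be certain" clause, so the sentence can be generated mechanically to any depth and never bottoms out in a ground-level certainty. A throwaway Java sketch; the class and method names are invented for illustration only.

public class CertaintyRegress {
    // Builds the nth level of the regress: n prefixes in front of the base claim.
    static String level(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("You can't be certain that ");
        }
        return sb.append("anything is certain.").toString();
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 4; n++) {
            System.out.println(level(n));
        }
    }
}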

>> No.4278355

>>4278299
i like what you're saying. keep talking.

whose philosophies do you subscribe to?

>> No.4278360

>>4278280
So you're a comfortist?

>> No.4278362
File: 36 KB, 442x500, 1326324784552.jpg

>mfw there are people who think they don't do anything irrational

>> No.4278391

>>4278299
So evidence has no intrinsic value
So there is no real way to "prove" something
So nothing can be proven
So nothing is certain

Interesting

>> No.4278394

I'm certain that I'm thinking.
The only way to get around that would be to define thinking away from the Cartesian sense.

>> No.4278405

>>4278394
But how are you certain that it's "you" that's doing the thinking?
I mean, a Buddhist would say that the thinking is simply "being done", that there is no "you", that the thinking is just a function of the mind which is not separate from the whole of the universe, reality, etc...
and if there is no certainty in the universe...

>> No.4278413

Atheist.
Saying you're agnostic is just fucking silly. By agnostic logic you can't even be sure you aren't in the Matrix.

>> No.4278415
File: 13 KB, 265x272, yes and.jpg

>>4278338
>Not reality in itself.
You are human. You have no means of verifying that truth. You agreed just a sentence before.

>Even if the reality is that all reality is subjective, that would make it objective
Yes? And? All your statement achieves is turning this into a debate of semantics. I won't disagree with your claim that it's objective, but I don't hold that reality is exclusively subjective or objective to begin with.

Additionally, saying that it's simply "objective" or "subjective" is an over-simplification of the point I'm trying to make. While I do find convention in holding objective truths, that doesn't mean the objective truths I hold are actually "objective" or "truths". It only means that they're conventional.

>> No.4278417

>>4278413
Read the rest of the thread? You can't be.

>> No.4278419

>>4278405
Even if you were to say my thinking is produced by physical laws, neurons firing, blah blah blah (the way we always dance around it), or that my thinking is a product of a computer program running the universe, "I" am still thinking (although it is not being "done" by me).

Now if you were to dispute the "I"...

>> No.4278427

>>4278415
Even though you're roasting me, that picture made me laugh really hard. I guess I was laughing at myself.
And I totally agree with you.

>> No.4278435

>>4278413
>By the agnostic logic you can't even be sure you aren't in the Matrix
The only people who are bothered by the existence or the non-existence of God are the people who care in the first place.

If I happen to be living in the real world; awesome. If I happen to be living in the matrix; big deal. It's not caused me any problems before, has it?

The day you finally become mature will be the day you realize that it's not a question of whether or not God exists; it's a question of what difference it makes and why we should care. Additionally, it's not an issue of whether reality is composed of particles or waves; it's an issue of what difference it makes and why we should care. And I happen to find more convention in exploring scientific concepts than I do exploring the nature of God. How about you?

>> No.4278447
File: 7 KB, 217x165, yeah.jpg

>>4278413
Here's a mindfuck for ya'

Your brain IS the matrix. All of your perceptions of reality are nothing but reflections of a much larger truth. Your senses inherently deceive you to at least some degree. Once you realize this, you can interpret reality however you want to achieve happiness or sadness. Women are suddenly happy in their abusive relationships. Basement dwellers are suddenly happy they never leave their house. Major league baseball players are suddenly convinced they actually contribute something to society.

But just because you interpret reality in a certain way, it will never change the "True" reality. And that, my friend, is why you are eternally in the matrix regardless of what you think.

>> No.4278457

>ctrl+f
>no existentialism

ok then

>> No.4278494
File: 357 KB, 810x1410, wildchild.jpg

>>4278415
>>4278427

>> No.4278507

Occult/esoteric philosophies of various cultural heritages.

>> No.4278908
File: 26 KB, 500x347, 1325672282983.jpg

>>4278211

>What philosophy do you subscribe to, if any?

Reality perception: Christian Theism.

Personal life: Individualism (somewhat contradictory to the former)


>pic related

>> No.4278924

I believe that nobody knows, and modern physics is just a string of falsities strung together to create a big mesh of bullshit.

>> No.4278984
File: 28 KB, 400x311, billandted2.jpg

>>4278211
Be excellent to one another, one and all.

If you've got the capacity to better realize how excellent you are able to be to those around you, enjoy that realization and continue your excellence unto others.

Do not let the non-realization-of-excellence in those around you convince you that they are not deserving of excellence - they are.

Those who refuse to be excellent should be allowed to make their own choices - but they should not be allowed to negate the excellence of another.

The Universe is vast and we are very small - get high, enjoy it, and try to ensure the excellence goes on.

Party on /sci/

>> No.4278994

Humanism. It's not like I can put my trust in anything besides humanity.

>> No.4278999
File: 57 KB, 298x298, enlightened.jpg

zen

>> No.4279009

Solipsism: I have no reason to believe anything other than myself exists.
Logical rejection of logic.
All proofs of logic must be logical, but all logical proofs beg the question, so logic undermines itself.

>> No.4279015
File: 27 KB, 83x275, SetteAdmiresThatStache.jpg

Transhuman Immortalist Extropian.

>> No.4279017

>>4279015
When you say "immortalist", what do you mean?

The underpinnings of a human mind are often its computational weakness.

>> No.4279019

>>4278924

Science is pretty reasonable in that if you can suggest an alternate theory with better evidence disproving the current line of thought, your understanding of physics would be accepted as the new 'truth'.
> Truth being the accepted theory

We don't understand physics on a subatomic level very well, but if you want to explain it you can go ahead and try to do better.

>>4278994
Look up Epicurus if you haven't already, that guy's a bro.

>> No.4279021

>>4278211
>claims to be a skeptic
>claims it's possible to know something
casual
Pyrrhonian here, the rest of you are probably incorrect.


>>4278279
>Although while I think humans can't be sure about anything, truth has to exist in some form or another.

Why? Not saying it doesn't or does, but why can't everything just be a huge pile of shifting uncertainties, with nothing actually nailed down?

>> No.4279022

>>4279017
>>4279015

How could anyone be immoral? I could understand being amoral. But immoral? It sort of implies you recognize what is good and intentionally don't do that.

>makes no sense

>> No.4279026

>>4279022
ImmorTalist

That's true though. If you decide that what you see as good is bad and bad good, then you still do good things, just with a different definition of the concept.

>> No.4279028
File: 31 KB, 300x331, 1240296813385.jpg

>>4279026

Oh oops. I can't read. Everything makes sense now.

>> No.4279029
File: 46 KB, 299x168, HeavySecondsBrobot.jpg

>>4279017
Precisely, hence the Transhuman part.

A species capable of changing itself through an enhanced understanding of its own workings is what we are.

Humans are a transitional species... 200,000 years... even the Neanderthals lasted longer.

>> No.4279030

>>4279029
...
but then why the human part? We could build a better intelligence from the ground up.

>> No.4279032
File: 33 KB, 460x276, 1313530970439.jpg

Anyone know Peter Singer?

I don't know that I subscribe to a philosophy, but I certainly think about ethics a lot.

>> No.4279035

>>4279019
Wow Epicurus was incredibly bro.
But I just believe that people need to be treated like people. Everyone is guaranteed rights by virtue of being human, regardless of any other condition; just being a member of the human species is enough to qualify. Although I like Epicurus' Four-Part Cure and his three criteria of truth, both of which make a lot of sense, to me at least.

>> No.4279036
File: 412 KB, 950x950, DIPSHIT.jpg

>>4279030
Here, I'll just post this.
http://www.youtube.com/watch?v=qnreVTKtpMs

Enjoy.

>> No.4279039
File: 649 KB, 900x1218, 2007-07-30-she_is_the_very_model_of_a_singularitarian.jpg

>> No.4279040

Daoism...if anything.

>> No.4279043

Autism is the best philosophy.

>> No.4279044

>>4279036
>http://www.youtube.com/watch?v=qnreVTKtpMs
Superb as that was, it didn't actually answer the question.

>> No.4279046
File: 31 KB, 560x360, brain.jpg

>>4279044
Then I'm afraid I must not have understood the question, sorry to play the autist, but could you kindly reword it?

My brain is made of meat, sadly. D:

>> No.4279050

>>4279046
I'm trying to understand why you would change a human (given we're not designed for logical thought, at most we're very simple simulating machines with a nifty path finding system) instead of working from the ground up (maybe mapping neuron paths and activity, then virtualising that at thousands of times the speed

>> No.4279053

>>4279050
)

Because fuck bad syntax.

>> No.4279055

>>4279050
why do you think logic is so great?

It's just a set of axioms, a useful tool for understanding some things, but I think this very thread has demonstrated it's ill-equipped to handle things beyond the scope of our immediate limited existence.

>> No.4279056

>>4279050

Couldn't it spell your own doom to build a creature that could destroy you?

Just asking.

>> No.4279058

>>4279055
I don't, I'm >>4279009.
Logical deductive ability, whether I can prove it valid or not, is vital for any intelligent creature to survive. We don't want it deciding "I saw a star, so let's self-destruct".
>>4279056
To spell my doom it would necessarily either have to be better than me mentally, in which case I'm happy with it doing so, or better than me physically, in which case I'm a fool for not bolting it to the floor.

>> No.4279064

>>4279058

>To spell my doom it would necessarily either have to be better than me mentally

Well this is an interesting difference of values.

>> No.4279065
File: 45 KB, 504x504, Awesome_alcohol.jpg

>>4279050
Why virtualise a new mind when you can simply move your consciousness to a faster, scalable, more durable substrate?

Your body doesn't have a single cell in it older than several months, and on the molecular level, your body is VASTLY different from the state it was in when you began reading the last word of this very sentence.

So where's the continuity?
Biological chemical processes? Neuron discharge patterns? A soul?

Gradual, nanobot-assisted conversions of individual neurons should be sufficient to maintain an uninterrupted flow of consciousness during the process of moving from a biological substrate to an artificial substrate.

We can BE the robots, Anon.
Yes, I'm asking for this.
Be brutal.
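
A toy illustration of the gradual-replacement argument above, in Java. It is not a claim about neuroscience or nanotech, and every name in it is invented for the sketch: the "mind" is just a pipeline of identical units, each swapped for a functionally identical "artificial" unit one at a time, with the overall behaviour checked after every swap.

import java.util.function.IntUnaryOperator;

public class GradualReplacement {
    // A "unit" is just something that maps an input to an output.
    static final IntUnaryOperator BIOLOGICAL = x -> (x * 3 + 1) % 97;
    static final IntUnaryOperator ARTIFICIAL = x -> (x * 3 + 1) % 97; // functionally identical substrate

    // The "mind": a fixed pipeline of units applied in order.
    static int run(IntUnaryOperator[] units, int input) {
        int v = input;
        for (IntUnaryOperator u : units) v = u.applyAsInt(v);
        return v;
    }

    public static void main(String[] args) {
        int n = 10;
        IntUnaryOperator[] units = new IntUnaryOperator[n];
        java.util.Arrays.fill(units, BIOLOGICAL);
        int baseline = run(units, 5);

        // Swap one unit at a time; behaviour is identical after every single swap.
        for (int i = 0; i < n; i++) {
            units[i] = ARTIFICIAL;
            if (run(units, 5) != baseline) throw new AssertionError("behaviour changed at step " + i);
        }
        System.out.println("All units replaced; output unchanged: " + baseline);
    }
}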

>> No.4279067

>>4279058
Well, humans already don't explode when they see a star, so I think saying that making something better at logic than we are is inherently a useful/good thing is kind of jumping the gun. We may be good enough at it to use it for all its applicable practical purposes, and anything better at it may become dangerously reliant on it.

>> No.4279069

>>4279065 here.
Maybe I'm being selfish, but I really REALLY want to be a T-1000... in composition, not interpersonal relations with respect to the timeline.

>> No.4279072
File: 474 KB, 527x476, T-1000_002.png

>>4279069
Sup, bro?

>> No.4279075

>>4279064
I know, rite? It's hard to explain how I reached the conclusion, but I try to think philosophically as if I'm not a person and am looking at things objectively. And objectively, I value the intelligence of Hawking over that of [insert dumb person]. If 500,000 smart people came into existence at the expense of 500,000 dumb people, we would make a net gain.


>>4279065
>Why virtualise a new mind when you can simply move your consciousness to a faster, scalable, more durable substrate?
-the human mind isn't made for thinking, it's made for fucking and eating, and, if you're female, looking after children.
-that approach is vastly less efficient and would hold us back several years
>Your body doesn't have a single cell in it older than several months,
Ah, but it does have roughly the same configurations of cells. It all works the same way it did 3 months ago (with the exception of my brain, but changes in that were pretty minor).
>We can BE the robots, Anon.
Why would I want to be?

>>4279067
We have a general idea of logic, we just aren't computers meant for parsing information and deriving the logical truth (or rather we are, and we suck at it compared to a PC).

>> No.4279077

>>4279075
>-the human mind isn't made for thinking, it's made for fucking and eating

Neither requires anything like the mind humans have, so your claim is highly doubtful.

>> No.4279082

>>4279075
But you assume that's a bad thing. Too much logic stifles innovation. Great leaps happen because people continually do stupid things; every now and then one of those stupid things turns out to have been a good idea after all.

>> No.4279083

>>4279077
Intelligence is just a side effect.

>> No.4279086

>>4279075

It seems very strange. What's the point in creating a being capable of understanding everything? That which is to be understood already exists, regardless of whether it's understood.

Don't get me wrong, I value science and understanding. But those are among many values, including security, respect, solidarity, happiness, etc.

>>4279077

You certainly have a point.

>> No.4279087
File: 88 KB, 456x252, spideyfingers.jpg

>>4279075
>-the human mind isn't made for thinking, it's made for fucking and eating, and if you're female looking after children.
>-that approach is vastly less efficient and would hold us back several years
Time well spent, if it means personally seeing where it's all headed.

>Ah, but it does have roughly the same configurations of cells. It all works the same way it did 3 months ago (with the exception of my brain, but changes in that were pretty minor).
Granted.

>Why would I want to be?
Why would you not? Immortality, for starters. Then there's disease immunity, enhanced perception, memory, strength, etc., etc.

>> No.4279094

>>4279082
int chaos = new java.util.Random().nextInt(101); // a "chaotic" value in [0, 100]
Stupidity can be emulated by intelligence. Intelligence can't be consistently emulated by stupidity.
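
A small sketch of the asymmetry claimed here: a completely deterministic rule can imitate "chaos" (a pseudo-random generator is exactly that), while there is no analogous construction going the other way. The constants are the commonly cited 32-bit linear congruential generator parameters from Numerical Recipes; the class name is made up for the example.

public class FakeChaos {
    private long state;

    public FakeChaos(long seed) { this.state = seed; }

    // Pure arithmetic, nothing random about it, yet the output looks noisy.
    public int next() {
        state = (1664525L * state + 1013904223L) & 0xFFFFFFFFL; // 32-bit LCG step
        return (int) (state % 101); // map into [0, 100], like the "chaos" variable above
    }

    public static void main(String[] args) {
        FakeChaos gen = new FakeChaos(42);
        for (int i = 0; i < 10; i++) {
            System.out.print(gen.next() + " ");
        }
        System.out.println();
    }
}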


>>4279086
>Whats the point in creating a being capable of understanding everything?
Why do you want to be immortal? I think our answers will be the same.

>>4279087
Agreed on the first two. On the third: Why would I want those things if a machine smarter than me could get them instead?

>> No.4279097

>>4279094

... well what would your answer be?

>> No.4279101

>>4279097
I really don't know. I can't personally find an accurate objective answer, because I'm an idiot and biased on top of that. But any machine I make will probably have the same biases, AND will be unable to derive that information anyway. I was hoping you'd say so I could nod aloofly.

>> No.4279104

>>4279101

I suspect at some point, fundamental values become arbitrary.

>> No.4279106

>>4279104
But we can't just ignore the question. There might be a reasonable goal we'd be missing.

>> No.4279108
File: 18 KB, 480x348, The-Firstborn-1-.jpg

>>4279094
Sorry for the delay.
We could do that but...

We haven't even scratched the surface of the potential of human augmentation. There's no evidence that we cannot make ourselves as capable as any machine after moving into a new artificial substrate.

But if, for some reason, the dream is impossible and I'm relegated to being "second best" in terms of intelligence and ability forever, then I'll concede that title happily.

>> No.4279110

>>4279108
Excuse my misspellings, it's late, and I hate myself.

>> No.4279111

>>4279083
Ad hoc.

>> No.4279112
File: 29 KB, 250x352, Sartre.jpg

>>4279106

What goal could exist that is not self-imposed? If you don't want there to be intelligent computers, that is your choice and your choice alone.

>pic related

>> No.4279115

>>4279108
>We haven't even scratched the surface of the potential of human augmentation. There's no evidence that we cannot make ourselves as capable as any machine after moving into a new artificial substrate.
Not saying there isn't. It's just that doing so will probably need a far bigger computer than building one from the ground up, not least because our current processor technology is inappropriate for life (it has high speeds and low numbers of cores; the brain is the opposite). It also just misses the point for me; maybe I'm missing something, but I really can't see why we would want to proliferate me instead of some superior entity.
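
For a rough sense of the mismatch being described, here is a back-of-envelope Java sketch. All of the figures are commonly cited order-of-magnitude estimates (about 10^11 neurons, roughly 10^3 synapses each on the low end, average firing rates on the order of 10 Hz) plus round numbers for a desktop CPU; none of them are measurements, and the only point is the shape of the comparison, not its precision.

public class BrainVsCpu {
    public static void main(String[] args) {
        // Order-of-magnitude assumptions, not measured values.
        double neurons = 8.6e10;         // ~86 billion neurons
        double synapsesPerNeuron = 1e3;  // low-end average; often quoted up to ~1e4
        double firingRateHz = 10;        // very rough average firing rate

        double synapticEventsPerSec = neurons * synapsesPerNeuron * firingRateHz;

        // A contemporary desktop CPU, also in round numbers.
        double cores = 8;
        double clockHz = 3e9;
        double cpuOpsPerSec = cores * clockHz;

        System.out.printf("Brain: ~%.0e slow units in parallel, ~%.0e synaptic events/s%n",
                neurons, synapticEventsPerSec);
        System.out.printf("CPU:   %.0f fast cores, ~%.0e ops/s%n", cores, cpuOpsPerSec);
        System.out.printf("Roughly %.0e times more parallel units, each ~%.0e times slower.%n",
                neurons / cores, clockHz / firingRateHz);
    }
}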

>> No.4279117

>>4279112
>What goal could exist that is not self-imposed?
What self-imposed goal would warrant our efforts?
>If you don't want there to be intelligent computers, that is your choice and your choice alone.
I never said that; self-aware graphene would be brilliant.

>> No.4279123

>>4279117

>What self-imposed goal would warrant our efforts?

All of them? I don't think you will find a rational answer to this, because reason is not the source of our motivation. I think by definition a goal is something we strive to meet. It's a bit of a tautology.

>I never said that; self-aware graphene would be brilliant.

Sorry, I didn't mean to imply that you didn't want intelligent computers. I meant to suggest that the decision as to what is worth pursuing originates from you.

>> No.4279130

>>4279123
OK then, why should I want to put effort into achieving one? I dislike committing to things without a rational basis.

>> No.4279135
File: 466 KB, 2550x1685, FORWOLF.jpg

>>4279115
Superior is a highly subjective term. From the perspective of efficiency, an A.I. "may" be superior. And perhaps that's what we need. But even if that is the inevitable outcome, humans can still reap the benefits of advanced technology.

Myself? I'd catch up on my reading and travel.

Even if the advent of A.I. can give me everything save relevance, I'd be cool with that. Not like any of us are being particularly useful at this point in time.

>> No.4279140

On the topic of goals.

A goal is justified by its very existence.

If you didn't want to do it, it wouldn't be a goal.

Then again, that's all relative.

Philosophy is hard, guys. D:

>> No.4279142

>>4279135
>But even if that is the inevitable outcome, humans can still reap the benefits of advanced technology.
But why not let them reap the benefits and peacefully concede they're better? I agree it would be nice to have some intelligence to help me out, and I could definitely cope with being pampered by technology, but if we have a goal beyond self-indulgence (which I assume you do), that shouldn't really be what happens. Once we are superseded, we become parasites rather than hosts.

>> No.4279144
File: 16 KB, 340x297, jean-paul-sartre-5.jpg

>>4279130

>OK then, why should I want to put effort into achieving one?

Well, it's really up to you, isn't it? I think it's part of having a mind that people want to do things, regardless of whatever we decide those things to be.

>I dislike committing to things without a rational basis.

You are going to have to get used to this, I think. Rationality is only a means of actualizing something; it is not necessarily a justification for that something. For example, I don't think you could come to a rational reason for being alive. If you have an answer like "So that I could learn as much as possible about the cosmos", then your life is just a means to an end, that end being as arbitrary and irrational as treating my life as an end in itself.

>>4279140

This guy knows what I am talking about.

>> No.4279152

>>4279144
>Well, it's really up to you, isn't it? I think it's part of having a mind that people want to do things, regardless of whatever we decide those things to be.
Seems like a weak reason to me; I could use it to justify anything I wanted, like punching kittens or spreading viruses to take down copies.
>For example, I don't think you could come to a rational reason for being alive.
Agreed. A few years ago I was considering suicide, but if there IS a reason then I'd be disadvantaging myself, and if there isn't it wouldn't change anything for better or worse.

>> No.4279156

>>4279152
Excuse the weird capitalisation, I don't normally type as if I'm schizoid.

>> No.4279159

>>4279142
An interesting notion.

I myself would love to assist that A.I. in making new discoveries, exploring, and learning. The only conceivable way I see that happening is by altering our own composition to match it.

Which is the Good End, if I'm still making any sense.

>> No.4279164

>>4279152

>I could use it to justify anything I wanted, like punching kittens or spreading viruses to take down copies

That's right, you could.

>>4279156

No problem.

>> No.4279165

>>4279159
The problem then comes down to resource allocation. Of course, having you to assist would make things easier for the AI, but if the cycles could support two AIs of your intelligence, then you would probably be doing more good by not becoming one of them.

I agree partially with the idea that humans are needed to shake things up and invent new things. Our creative ability is something I don't think we can emulate.

>> No.4279168

>>4279164
>That's right, you could.
In my head that makes it useless though. If doing something other than what I want to do is just as valid, I have no reason to do what I want to do other than that I'm chemically biased towards doing it.

>> No.4279170

Average negative utilitarianism

>> No.4279171

>>4279165
>Our creative ability is something I don't think we can emulate.
Here's an interesting read, even if you don't normally like these kinds of articles. _http://www.cracked.com/article_19273_6-shocking-ways-robots-are-already-becoming-human.html

I suppose a sufficiently advanced A.I. would be intelligent enough to do some very extraordinary things, including making me more like itself than a human, and thus a genuine asset and ally.

>> No.4279174

>What philosophy do you subscribe to, if any

edgy teenage atheism

>> No.4279175

>>4279168

> If doing something other than what I want to do is just as valid

Why would you do it if you didn't want to? I don't think you can separate desire, motivation, and responsibility.

I think I am playing a bit of devil's advocate from an existentialist perspective, one that I think becomes absurd when taken to its extreme. Nonetheless I think it is correct in the assertion:

What is good or bad originates from your own decisions.

>> No.4279177

>>4279170
Negative utilitarianism? That's an interesting one.

>>4279171
Thanks
>cracked
well there goes my afternoon.

And I think it probably would. The problem still comes down to the fundamentally different methods of computation of us and machines though, and I don't see how we could change that.

>>4279175
Why would what I want to do be better than what I don't want to do? I know for a fact I'm an idiot prone to making bad decisions.

>> No.4279182

This statement is unprovable.

'Rational' people can go to sleep now; you are defeated.

>> No.4279183

>>4279177

>Why would what I want to do be better than what I don't want to do?

Is it appropriate to rephrase this as:

Why would I do what I want versus what I don't want?

By definition, what you want is what you would do.

>> No.4279184

>>4279183
>implying humans have wants and aren't just products of their environment reacting to stimuli like a bacterial organism

>> No.4279189
File: 73 KB, 800x600, sketch.jpg

>>4279177
Heh, sorry about your afternoon.

We'll see where technology leads us. One way or another, I intend to see it to its end...

The greatest story ever told.

All the better because I'm in it. :)

Sleep well, Anons. It's bedtime for this hopeful meat-creature. (yet another drawback of biology, sigh)

>> No.4279190

>>4279183
I don't know. That doesn't really say anything else; it's tautological.

>> No.4279196

>>4279190

Yeah, I guess you are right. Do you think that's a bad thing?

>> No.4279197

>>4279189
They're made of MEAT
sleep well, anon
>>4279190
>almost 2queer
that would have been too good, I guess.
>>4279196
I do, yes.

>> No.4279668

>>4278345
It's not a paradox. There are statements we can be certain of. Of course, we don't know which statements they are, but they do exist.