
/sci/ - Science & Math



File: 604 KB, 400x514, 1925_kurt_gödel[1].png
No.2396720

Do Gödel's incompleteness theorems mean that a Turing machine can never become singularity?

Does anyone here understand Gödel's stuff?

>> No.2396729
File: 159 KB, 912x624, rapture-for-nerds01.jpg

Um... Why do you suppose one would have anything to do with the other?

>> No.2396734

>singularity
Define the term please.

>> No.2396737

I understand Godel's Incompleteness Theorems. I created a thread outlining the proofs of the theorems and their correct implications within formalized mathematics.
By reading your question I can tell you do not understand Godel's Incompleteness Theorems, nor what a Turing machine is.

>> No.2396759

>singularity
We really need a new word. I hate the use in the "zomg, computers get smarter than us!" sense.

>> No.2396778

>>2396734
I should have used another word. I have to go to Buffy-speak to explain:

Do Gödel's incompleteness theorems mean that an intelligent machine that is, like, super smart and always right about everything is impossible?

I was thinking it does.
Let's say there is a subsystem A that is consistent and complete for some things, and a subsystem B that is consistent and complete for other things. Then there is a subsystem C that selects which things go to A, which go to B, and how they interact.

Taken together this could be described as system ABC, and it can't be both consistent and complete.

And so on to infinity, so eventually some selection system must use random values to decide how to combine inputs from the different subsystems.

>> No.2396783

I'd assume the fact that Turing machines can't exist in reality would be plenty proof that they can't become much of anything.

>> No.2396789

>>2396783
Only the infinite tape part, and that is just nitpicking. I tried to explain it better here >>2396783

>> No.2396794
File: 112 KB, 912x624, rapture-for-nerds02.jpg

>>2396783
I'd assume the fact that the singularity is total pie-in-the-sky bullshit wishful thinking by science fiction faggots would mean that this whole thread is nonsense, so Godel's Theorems could do anything, like prove black holes must exist or something.

>> No.2396792

>>2396778
>Do Gödel's incompleteness theorems mean that an intelligent machine that is like super smart and always right about everything is impossible.
No. It simply won't answer some questions.

Now, if you demand that it always answer, then we don't even need Godel's incompleteness theorems; we can just use the Halting Problem.
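[Editorial sketch, not from any post in the thread: the Halting Problem says no program can decide, for every program and input, whether it halts. What a machine *can* do is semi-decide by simulating for a bounded number of steps. All names here, like `halts_within`, are made up for illustration.]

```python
import itertools

def halts_within(make_gen, steps):
    """Semi-decision sketch: simulate a computation (modeled as a
    generator that yields once per step) for at most `steps` steps.
    True means it halted; False only means it didn't halt *yet*."""
    gen = make_gen()
    for _ in itertools.islice(gen, steps):
        pass  # burn through up to `steps` computation steps
    try:
        next(gen)        # still producing steps -> budget ran out
        return False
    except StopIteration:
        return True      # generator finished -> the computation halted

def finite(n):
    """A computation that halts after n steps."""
    def gen():
        for _ in range(n):
            yield
    return gen

def forever():
    """A computation that never halts."""
    def gen():
        while True:
            yield
    return gen

# Why no *total* decider exists (Turing's diagonal argument, in comments):
# suppose halts(p, x) always answered correctly. Define d(p) = "loop forever
# if halts(p, p), else stop". Then d(d) halts iff it doesn't: contradiction.
```

So `halts_within(finite(3), 100)` returns True, while `halts_within(forever(), 100)` returns False, and that False can never be upgraded to a general proof of non-halting.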

>> No.2396801

>>2396778
Godel's theorems in layman's terms basically state that if a system of axioms is strong enough to talk about arithmetic (like PA), then there is a true statement G that is unprovable in PA. If PA is consistent, then it cannot prove its own consistency. A Turing machine is a precise definition of an algorithm.
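[For concreteness, an editorial sketch with my own toy encoding, not anything from the posts: a Turing machine is just a finite transition table acting on an unbounded tape, which is easy to simulate.]

```python
from collections import defaultdict

def run_tm(transitions, tape_input, start="q0", halt_state="halt", max_steps=10_000):
    """Run a toy Turing machine.
    transitions: {(state, symbol): (new_state, write_symbol, move)}
    with move in {-1, +1} and "_" as the blank symbol.
    Returns the non-blank tape contents when the machine halts."""
    tape = defaultdict(lambda: "_")
    for i, symbol in enumerate(tape_input):
        tape[i] = symbol
    state, head = start, 0
    for _ in range(max_steps):  # step bound so a runaway machine still stops
        if state == halt_state:
            break
        new_state, write, move = transitions[(state, tape[head])]
        tape[head] = write
        state = new_state
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example machine: flip every bit, then halt at the first blank cell.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", +1),
}
```

Running `run_tm(flip, "1011")` gives `"0100"`. The point of the formalism is only that "algorithm" gets a precise meaning, so statements like the Halting Problem can be proved about it.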

>> No.2398757

bump for discussion on godel

>> No.2398844

Godel simply says your system can't be both complete and consistent. Neither condition is necessary for self-awareness as evidenced by the fact that most people have neither.

>> No.2398866

Of course not OP, we already have computers that are "like super smart and always right about everything", they're called human brains. Granted we're not "always right about everything", but we're certainly close enough to it to get useful work done. A computer as intelligent as the average human being would be plenty enough AI, leave notions like right and wrong to the philosophers.

>> No.2398868

>>2396801
The best part is that WE can definitively say we know something that the system does not. Namely, that G is true. So the mathematical system will always be ignorant of some things that are obvious to you.

We are barking up the wrong tree. Intelligence is not mathematical, nor is it algorithmic.

>> No.2398885

Question: if a proposition is undecidable with a Turing machine, does that mean it's a Godel sentence?

>> No.2398936

>>2398885
No, there are other problems that are undecidable. See
http://en.wikipedia.org/wiki/Halting_problem

>> No.2398942

>>2398936
They are somewhat related though.