
/sci/ - Science & Math



File: 6 KB, 351x270, top.jpg
No.5075309 [OP]

Hi, /sci/. After teaching myself maths for a while (several years),
mainly topology and geometric stuff, some philosophical questions
appeared in my retarded brain.

What is the future of maths? Some place their hopes in motive theory; some
would say the only real question is the growing power of computers.

What happens to the meaning of maths under a utilitarian approach,
where everything is solved numerically?

Do you perceive mathematical structures as something existing "outside"
our brains, or are they just temporarily useful tools before we build
a supercomputer capable of simulating, say, a human brain?

Ultimately, what is the main difference between somebody creating a proper
algorithm capable of finding an *arbitrarily* accurate solution (e.g. the cohomology groups of some manifold) and somebody trying to find it explicitly?

Here are a couple of examples of software related to the question:
gap-system.org/
math.rice.edu/~andyp/papers/PicardGroupLevel.html

And what is the point of finding a rigorous mathematical basis for things
that already work well, like GR or QFT?

I hope that after the obligatory shit-flinging in the first few posts we'll get some useful
thoughts from anons.

>> No.5075809

>>5075309
first and last bump

>> No.5075842

Future computers will be able to prove all maths theorems.

One day, mathematicians will become obsolete.

>> No.5075846

>>5075309

You are trying to find meaning or purpose in math, but math is just a form of measurement and nothing more.

>> No.5075850
File: 713 KB, 975x1441, cutey_Emma-80s_sad.jpg

>What is the future of maths?
Strange question, I don't understand the context.

>Some place their hopes in motive theory; some would say
Don't know much about motive theory, but why would it be more important than any other mathematical subject people work on?

>What happens to the meaning of maths under a utilitarian approach, where everything is solved numerically?
Utilitarian? I feel the problems of the world should be solved politically. There was an interesting article some months ago bashing how TED is developing and questioning its value and approach.

>Do you perceive mathematical structures as something existing "outside" our brains, or are they just temporarily useful tools
This is the question of Platonism vs. Fictionalism vs. etc.

>before we build a supercomputer capable of simulating, say, a human brain?
And then? What is the question again?

>Ultimately, what is the main difference between somebody creating a proper algorithm capable of finding an *arbitrarily* accurate solution (e.g. the cohomology groups of some manifold) and somebody trying to find it explicitly?
There are interesting papers on what math is and what we really want to achieve.
I remember reading "What is good mathematics?" by Terence Tao, which was interesting (although I don't always agree with it), and some other good articles on the subject.

>And what is the point of finding a rigorous mathematical basis for things that already work well, like GR or QFT?
I guess getting better ideas for new theories, a better understanding of the old ones, and also work for academics to do - it's interesting at least. If you look at big problems, their solutions usually reveal some new math.

>> No.5076060

>>5075842
>thinks computers can resolve even basic Diophantine problems
>>5075846
>thinks math can only deal with quantifiable things
>>5075850
>physicsfag
>2012

OP: probably working out stuff in non-axiomatized mathematics, as well as working across several axiomatizations at once, and computers being able to handle mathematics in more symbolic and abstract ways (the only people really doing this are the Mathematica dudes, though).

People will have to start thinking about how to pose questions and about the importance of bounds as opposed to just numbers. Other abstractions may come to the forefront.

While a perfect Platonist mathematical system may exist, we deal more in terms of man-made mathematics (I'm not talking about different foundations such as ZFC, constructivism, etc.). This has very little to do with whether or not we're capable of simulating a human brain on a computer. The larger problem is how algorithms scale and how little real mathematics has to do with computation.

Algorithms are created based on techniques developed by mathematicians. Wolfram has done a number of talks on this.

Personally I don't think the real world works like mathematics, but I suppose if you really wanted to you could probably find some abstraction in some axiomatization to fit anything.

>> No.5078062

The expected asspain of pure mathsfags.

Guys, I didn't want to hurt your feelings; I don't understand why you're so aggressive.

>thinks computers can resolve even basic Diophantine problems
You're confusing the question. It is just a matter of time and
computational power.

If you still argue that pure maths is something BEYOND what is algorithmically
resolvable, then you are implying Platonism anyway.

Moreover, if you don't assume Platonism and still say something can't be
algorithmically resolved, then you will in fact run into problems with
Gödel's theorem.

And also ----- >>5075842

>> No.5078071

>>5076060
What can you say about the fact that many problems in group theory and topology, in particular homology and cohomology groups of Lie algebras, can be calculated,
and all the algorithms are just linear algebra? I saw a lot of source code
for such stuff (at least what is available). It's all about linear algebra; the
only problem with it is sparse matrices - but it's only a question
of either optimisation or using a better computer :3
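
To show what kind of linear algebra I mean, here is a minimal sketch (my own illustration in Python/numpy, not code from GAP or any real package): the Betti numbers of a hollow triangle (a circle, up to homotopy) fall out of nothing but ranks of boundary matrices. Over Q ranks are enough; over Z you would need Smith normal form, which is exactly where the big sparse integer matrices start to hurt.

import numpy as np

# Boundary matrix d1: rows = vertices {v0, v1, v2}, columns = edges
# {v0v1, v0v2, v1v2}; entries are +1/-1 according to orientation.
d1 = np.array([
    [-1, -1,  0],
    [ 1,  0, -1],
    [ 0,  1,  1],
])
rank_d1 = np.linalg.matrix_rank(d1)
rank_d2 = 0  # no 2-simplices, so the next boundary map is zero

b0 = d1.shape[0] - rank_d1              # dim C0 - rank d1
b1 = (d1.shape[1] - rank_d1) - rank_d2  # dim ker d1 - rank d2
print(b0, b1)  # 1 1: one connected component, one 1-dimensional hole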

>> No.5078079

>>5075309
I think that teaching yourself has given you a skewed view of current mathematics. While computers may play a bigger role (and this is exciting), there is still plenty of stuff left to be done the old-fashioned way. I don't know what the landscape of modern research looks like; when I got out of math, the Langlands programme was still a big thing, and that's about all I know.

>And what is the point of finding a rigorous mathematical basis for things that already work well, like GR or QFT?
Well, the mathematicians would do it for its own sake, but I think the process of making it rigorous would be very likely to lead to new insights.

>> No.5078124

>>5078062
>>5078071

To solve a Diophantine problem you have to have an algorithm that can run off basic intentions and do anything; in other words, an algorithm of everything. (In fact, by the MRDP theorem there is provably no general algorithm that decides whether an arbitrary Diophantine equation has integer solutions.) Some might argue that this could be a really sophisticated system of learning algorithms, but you're not really any closer to a solution at that point, because it's too broad. It has all the same problems as a human, and worse, it's more prone to overtraining and undertraining.

This is just talking about problems with quantifiable mathematics and mathematics that can be modeled with addition. You can't even do real multiplication on a computer from a computational standpoint, how can you expect to use computation as a foundation for everything else?

>but it's only a question of either optimisation or using a better computer
You should actually study your own field because if you did you'd understand that algorithms fundamentally have limits that aren't resolved by optimizing said algorithm or switching to a better computer.

>> No.5078148

>>5078124
You're right that simply increasing the speed/memory of our current computing technology won't help, but there are other kinds of computers we could build: computers that manipulate their own hardware, chemical computers, biological computers, quantum computers. Many of these may be able to address some of those problems.

>> No.5078161

>>5078148
While this is true, it also turns out it's not as simple as just making a 'computer B' that can do 'all that computer A can do and more'. Even with quantum computers you only trade strengths: they speed up some problems and not others, and despite the hype it is not known that they can efficiently solve NP-complete problems; in terms of what is computable at all, they are no stronger than classical machines. Of course you could argue that you could then go on to make hybrid computers and so on, but you're still avoiding the problem of 'an algorithm for everything'.

>> No.5078176

>>5078079
> I don't know

okay

> I think that the process of making it rigorous would be very likely to lead to new insights

Actually it has always been the other way around - new USEFUL mathematical structures
have appeared from physical evidence.

>> No.5078179

>>5078161
But we have such a computer. We have billions of them. There's proof of concept all over the place. With greater understanding and optimization of chemical and biological computation, hybridized with classical and quantum computing for the problems that are trivial for them, we could surely create a computing machine that can generate proofs of these problems, no?

>> No.5078184

>>5078161
So you're implying that our brain contains an algorithm of everything. Great.

>> No.5078191

I guess all pure mathfags eventually just assume that our brain has
some metaphysical nature and that mathematical structures live
on their own in a parallel universe, waiting to be "discovered".

Ok, it's just a viewpoint; it also has a right to exist.

>> No.5078201

>>5078179
You could, but I argue that in order for it to be able to discover novel proofs by novel means it would need to be 'an algorithm of everything'.

>>5078184
Yes, and much like learning algorithms they're prone to several problems. I figure an 'algorithm of everything' would likely just be a very sophisticated system of learning algorithms and would thus have all the same shortcomings that a normal human does. It would waste time on fruitless endeavors and produce incorrect but correct-looking results in oh so many ways. Ultimately, I don't think it would be much of a step forward.

>> No.5078206

>>5078191
lolno gb2/1800s

>> No.5078274

>>5078206
> Use more obscure shitty words
> Try to look like an oldfag

>>5078201
Ok, so you have just confirmed it: the human brain is something
supernatural compared to computers, and hence mathematical structures
live in some subspace that is supernatural relative to computers.

I don't say it's bullshit, I just have another outlook on such things.
For me the mathematical point of view is just a more or less
algorithmic approach, though a sophisticated one, since our brains
developed over millions of years, while computers are getting better
billions of times faster.

For me it would be too bold to claim that
computers (in particular neural networks) will never reach the power of biological brains.

>> No.5078294

>>5078274
No, it's the opposite. If you interpreted my posts that way then you have never taken either an algorithms course or a machine learning course. I believe that a computer may eventually reach the power of a brain, but it would do so by ending up with all of the same problems that a brain has.

>> No.5078330

>>5078294
Yeah, yeah, I was waiting for that, bro.
So, ad hominem, okay.

But still, let's try not to talk about me, ok?

So a computer with this magical "algorithm of everything" would be
no more efficient than an ordinary human brain, right?
Any proof?

Also, saying that the space of possible solutions is soooo broad doesn't imply
an algorithm of everything. And BTW, be so kind as to define this term, pls.

> You can't even do real multiplication on a computer from a computational standpoint, how can you expect to use computation as a foundation for everything else?

Again, bro, you say our brain does this supernatural "REAL multiplication" and
a computer doesn't. It doesn't matter how long we debate; the question is
ultimately about the nature of a computer and a brain. I don't think a brain is anything other than the same physical reality as a computer.
Even if supercomputers end up working like normal brains, and brains are quite different from each other, you could still build a very good artificial brain, something like
10 times Newton.

>> No.5078347

>>5078330

In machine learning you have a lot of different learning algorithms. It's not quite like the rest of AI, where you talk about Markov chains, A* search, clever probability-based approaches to problem solving, etc. A learning algorithm is basically polynomial regression on steroids: you feed it sample data with the correct "answers" and it tries to find correlations in the data so it can predict the "answers" for new data. There are lots of algorithms, but they're all essentially the same, including neural nets. They have many fundamental limitations, not just in terms of power but in terms of capability; normal neural nets have no memory, for example. An algorithm of everything would be a learning algorithm that overcame this (by itself or together with others).
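
If you want the flavor of it, here is a minimal sketch (my own, in Python/numpy; any real system is fancier): fit to training data with known answers, then predict the answers for new data.

import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(20)  # signal + noise

coeffs = np.polyfit(x_train, y_train, deg=3)  # "learn" a degree-3 polynomial from the samples
model = np.poly1d(coeffs)

x_new = np.array([0.25, 0.5, 0.75])
print(model(x_new))  # predicted "answers" for inputs it has never seen
# Set deg=15 and the curve chases the noise instead of the signal:
# that's the overtraining problem mentioned earlier.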

>from a computational standpoint
This is in regard to the group of compsci fags who argue that mathematics should be founded on computation (as in computable concepts) via category theory instead of set theory. They argue that only things that can be approximated really exist and via some programming languages argue that a program is effectively the same as a proof. It's nonsense considering how incredibly limited what they're doing is: basically modeling all of mathematics on an axiomatization with just addition, because of how computation works.

The brain does abstract symbolic multiplication, which is to say that it doesn't actually do multiplication (in terms of computation); it only tries to find out interesting things about it.

>> No.5078401

>>5078347

I think your poor review of modern algorithms doesn't relate to the question (at least I don't agree that every computer algorithm is a learning algorithm).
Neither does the notion of an algorithm of everything. Again, this is a question about
a machine with, say, not only computational power but also memory, and then what?
Where is the argument that not every mathematical proof can be done
by a machine?

> They argue that only things that can be approximated really exist and via some programming languages argue that a program is effectively the same as a proof.
What is your argument against that? Without emotions like:
> It's nonsense considering how incredibly limited what they're doing is

> The brain does abstract symbolic multiplication, which is to say that it doesn't actually do multiplication (in terms of computation); it only tries to find out interesting things about it.
> abstract symbolic multiplication
> it only tries to find out interesting things about it
> sounds supernatural
This last sentence makes me doubt that we're still talking /sci/-style.

>> No.5078420
File: 133 KB, 400x307, 1340052534063.png

>>5075309
>And what is the point of finding a rigorous mathematical basis for things that already work well, like GR or QFT?
WTF are you talking about? Both of these already have a very rigorous mathematical basis. Quantum field theory is already axiomatized in terms of category theory, and GR just needs variational calculus and group theory.

I'm not even going to bother with the rest of your questions; they're a bunch of philosophical and political nonsense with no answers.

>> No.5078422

>>5078401
>implying I said that every algorithm is a learning algorithm
I was just providing a definition for an algorithm of everything. Unless you think that Neural Nets are a poor comparison to the brain?

>What is your argument against that? Without emotions like:
lol, because it only deals with a very, very limited axiomatization of mathematics and tries to model the rest of mathematics on top of it. This isn't an emotion, it's the basic observation that they don't even have the set of reals to work with. Transcendentals? Nope, they don't "exist" in that foundation. Pi? Doesn't exist. It's like saying that all of transportation should be done by tricycle and that stuff like 'space travel' or 'sea travel' doesn't actually exist.

It's not supernatural, bro. It's a basic consequence of the fact that multiplication is not just sophisticated addition. It's its own fundamental operation, and trying to compute multiplication through addition is just wrong. Besides, you should know that in most cases a calculator is outright useless to a pure math major.

For a proof, try attacking a Diophantine problem. Your computation will lead you absolutely nowhere. There is no basic course of action, no starting point, no techniques. It's approachable from any and every angle, yet very few angles, if any, will bear fruit. Fermat's Last Theorem is a good example: it's easy enough to explain to a child yet difficult enough to take centuries to prove.
http://en.wikipedia.org/wiki/Fermat%27s_Last_Theorem
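
To see how little brute computation buys you here, a throwaway sketch (mine, in Python, nothing clever): you can search as large a box as you like for counterexamples to x^n + y^n = z^n and find nothing, but exhausting a finite box proves nothing about the infinitely many cases left over.

def search_fermat(limit, n):
    """Naive search for x**n + y**n == z**n with 1 <= x <= y < limit."""
    hits = []
    for x in range(1, limit):
        for y in range(x, limit):
            s = x**n + y**n
            z = round(s ** (1.0 / n))
            for cand in (z - 1, z, z + 1):  # check neighbors to dodge floating-point rounding
                if cand > 0 and cand**n == s:
                    hits.append((x, y, cand))
    return hits

print(search_fermat(200, 3))  # [] - no counterexamples in this box, which proves nothing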

>> No.5078439

>>5078420
> Finding the proper axioms for quantum field theory is still an open and difficult problem in mathematics. One of the Millennium Prize Problems—proving the existence of a mass gap in Yang-Mills theory—is linked to this issue.

Please, GET THE FUCK OUT OF HERE, you brain-damaged being.

>> No.5078461

>>5078422
That's much better and interesting.

Ok, so I'm a little bit confused then by the term "fundamental operation", you know, bro. It sounds like THIS is something supernatural (I mean, not in the religious sense or anything) in relation to computation.

And what if I say that I built a computer that has this fundamental operation? And why exactly is addition the one fundamental operation that a machine can accept?

We all know that today's computers are not able to compute everything in abstract maths, but damn, they can even calculate fuckin cohomologies, if only for
finite groups, and man, they develop so fuckin fast! That wasn't the question in the OP post.

And again, ok, about Diophantine equations. If we approach them with
our classical algorithms, like those based on Markov chains, or with a Post machine, we obviously have trouble, but it does not follow that if, say, they can't be computed (in principle, in finite time), then
their solutions necessarily lie in some transcendent field of
reality. The one thing I can't agree with is that our brain possesses some
ethereal features which place it above other beings, including computers.

>> No.5078467
File: 80 KB, 634x600, 1340055327560.jpg

>>5078439
Are you fucking 12? 13? http://en.wikipedia.org/wiki/Wightman_axioms

QFT is not the same thing as gauge theory, you silly little child. Differential and ∞-categorical cohomology provides a rigorous mathematical basis for QFT - there is a local net copresheaf of algebras which satisfy the locality axiom with respect to some geometric structure (usually Lorentzian); cobordisms are the propagators, while the boundaries of the cobordisms describe Fock space... gauge theory involves deriving a mass gap from such in a Lorentz-invariant manner valid for any gauge group.

>wikipedia knowledge of physics
LMFAO. That's some funny shit. Post more nonsense to make me laugh, FAGGOT! HURRY UP, LITTLE ENGINEER!

>> No.5078474

>>5078467
> Finding the proper axioms for quantum field theory is still an open and difficult problem in mathematics. One of the Millennium Prize Problems—proving the existence of a mass gap in Yang-Mills theory—is linked to this issue.
> Finding the proper axioms for quantum field theory
> for quantum field theory

Move along, don't bother, student.

>> No.5078481

>>5078161
Sorry if it's a newb question, but I thought that if we could find a "good" (polynomial-time) solution for one NP-complete problem, we could solve every NP problem.
Logic isn't my speciality; I just took a course years ago.

>> No.5078485
File: 94 KB, 484x400, congratulations_retarded.jpg

>>5078474
WTF? Are you fucking retarded?

QFT is already axiomatized in terms of the only fucking consistent mathematical framework: category theory. IT'S BEEN DONE SINCE THE 1950s.

Do you know anything or are you just going to copy something you read off of some outdated article with no source?

I am pretty sure you are out of your league, kid. I doubt you even know what quantum field theory is; you're just reading some garbage online that you don't understand.

>> No.5078500

>>5078485
That was nearly a heart attack...

>> No.5078515

>>5078461
What I mean by a fundamental operation is that it has to have its own set of axioms. In other words, you can't derive multiplication from addition; this is the case in the axiomatizations used for ordinary arithmetic. Peano Arithmetic, for example, provides separate axioms for addition and multiplication.
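
For concreteness, these are the standard recursion clauses (textbook material, nothing specific to this thread); note that the multiplication pair is added on top of the addition pair, not derived from it. Addition alone (Presburger arithmetic) is even decidable, and multiplication is not definable in it.

\begin{align*}
x + 0 &= x, & x \cdot 0 &= 0, \\
x + S(y) &= S(x + y), & x \cdot S(y) &= x \cdot y + x.
\end{align*}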

As for computers not being able to deal with multiplication, it's kind of a weak argument in some ways, mainly because you can do multiplication with binary numbers and so on as well. The reason I brought it up is that when you try doing multiplication on irrational numbers from a computational perspective (for example, pi*sqrt(2)), you are forced to rely algorithmically on infinite repeated addition, which means you never actually complete the multiplication, only approximate it. Even then, symbolic systems like Mathematica don't compute the multiplication; they just manipulate it the way mathematicians do. Like I said, that particular argument was a weak one, aimed mostly at people trying to found mathematics on computation.
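
As a small illustration of that last point (my own sketch, in Python with sympy rather than Mathematica): the symbolic system never "finishes" the multiplication pi*sqrt(2), it only manipulates the symbols, and any decimal you extract is a finite-precision approximation you ask for explicitly.

import sympy as sp

x = sp.pi * sp.sqrt(2)
print(x)                 # pi*sqrt(2) - stays exact and symbolic
print(sp.simplify(x**2)) # 2*pi**2 - manipulated the way a mathematician would
print(sp.N(x, 30))       # a 30-digit decimal approximation, not the "completed" product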

The thing about Diophantine problems is not that they're beyond the scope of reality or that a mathematician can grasp metaphysical objects. The space of approaches is just too broad, and you could spend an infinite amount of time on any of them. Many Diophantine problems are thousands of years old and still have no solution, with none on the horizon. Worse still, it may turn out that some of these things are unprovable.

>> No.5078564

>>5078515
Good. So let's forget that argument about multiplication; I don't like it either.

Yes, I was actually thinking about things which are "senseless" or, as you say,
unprovable in terms of logic.

But if they are provable, that means provable in finite time, and what can be done in finite time can also be done by a machine. If this is not right, then a brain is supernatural, since
in that case it can grasp something beyond physical reality.

Ultimately, every formalism is wrong in a rigorous sense, since the only way
to attach any "truth" to it is to test it in the real world by prediction, which is itself not rigorous. In this
sense, at least for me, it makes no difference, because both mathematical formalism and
computer logic are just *approximations* to the real world. I don't believe
mathematical structures live their own lives in the ether. Making good approximations
is then just a matter of *efficiency*. I don't see any fundamental
difference between a computer and a human brain in this sense.

Category theory is cool stuff in the way it generalizes knowledge and makes some
things clear and simple to understand, like OOP compared to low-level programming. And it is more palatable to a machine. I hope that in this sense
many of today's difficult problems can be resolved via category-based computation (i.e. types, so that we can distinguish, say, rationals from irrationals - see the little sketch below), even if it eventually comes down to some sort of linear algebra. Finally, regarding Diophantine equations, I would suggest that new computers
could handle them, since "too broad" or "could spend infinite time" are no arguments for me :3 But the rest sounds good, thanks for the discussion.
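
A tiny illustration of what I mean by types (my own, in plain Python, nothing category-theoretic): an exact rational type keeps the arithmetic exact, while floats silently approximate.

from fractions import Fraction

print(Fraction(1, 3) + Fraction(1, 6))                       # 1/2, exactly
print(0.1 + 0.2 == 0.3)                                      # False: floats are binary approximations
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True: exact rational arithmetic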

>> No.5078573

>>5078564

The thing is that mathematicians don't actually care about the real world and oftentimes deal with bizarre custom-made axiomatizations that have absolutely nothing to do with the real world. Why are they studied, you ask? Because by learning things about other axiomatizations we get more context for learning about the axiomatizations we're used to.

A lot of pure mathfags would probably take offense at mathematics being referred to as an approximation to the real world. In fact, it wouldn't be a stretch to say that the real world is an approximation of the mathematics.

>> No.5078583

>>5078564
Yeah, even if mathematics proves to be as far out of the reach of machines as it is out of ours, there's still value for computer science in many more practical situations.

>> No.5078596

>>5078573
Yes, pure Platonism as it is. That is what I was actually trying to say.

No problem, it is their own decision. In the end, being a materialist or a Platonist isn't a rigorous formal question, it's just a philosophical outlook.

There can't be a rigorous argument about whether maths is an approximation to reality or reality
is an approximation to maths, and what is "reality" then? It's an old speculation, like
in the Matrix movie. Personally, I don't believe it, since we have no evidence for it.

>> No.5078759

> No friends
> No GF
> No good salary
> Doesn't really know what he is doing
> Nobody knows what he is doing
> Balding, fat or extremely thin, bad smell
> Geek
> psychotic

>> No.5078862

>>5078596
Programmer here; I've been working for almost 10 years.
Question: where the fuck do I need category theory and Lisp-like languages?

>> No.5078922
File: 951 KB, 1479x2048, cutey_Emma-fake_gold.jpg

>>5078467
The problem with the constructions of Baez and co. is that they don't seem to enable you to compute anything, do they?
I'm a big fan of categories, but if nobody gets any insight out of it, is it really a physical theory?