
/sci/ - Science & Math



File: 84 KB, 400x359, math.jpg
No.6507389 [DELETED]

why do we still do math by hand?
>have computers that can do math speed of light
>why not learn how to solve real life problems with computer
>leave mathematics classes open for shit tier math majors
>middle school, choose math or computer route

>> No.6507407
File: 33 KB, 403x357, My Sides.jpg

>>6507389
>computers can do math
gr8 b8 m8 I r8 it an 8/8

>> No.6507409

>>6507389
Computers are for doing calculations.

However, laypeople, even if they aren't good at calculation, still need to understand the basic concepts of applied math. A layperson should be able to interpret data tables and graphs, understand some basics of statistics, estimate a sum or a product, etc.

>> No.6507422

>>6507389
I'm taking the bait. First, no, computers cannot "do math speed of light". Second, you seem to have misunderstood the difference between mathematics and calculation. While calculations can be performed relatively quickly by computers, computers lack the intelligence and creativity to actually do mathematics. That is why mathematicians still have jobs these days, and why computers are not solving the most difficult open problems in mathematics. While computers can perform simplistic calculations, it is clear that they are not sentient beings capable of understanding and analysis, at least in their current form.

>> No.6507436

>>6507409
>Computers are for doing calculations.

Nonsense.

http://www.simonsfoundation.org/quanta/20130222-in-computers-we-trust/

>> No.6507437

>>6507422
Please don't waste such a quality post on a thread like this

>> No.6507444

>>6507422
You just need to program the computer to start proving things.

http://www.math.upenn.edu/~wilf/AeqB.html
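You can already poke at this yourself with a CAS. A minimal sketch in Python, assuming sympy is installed (this is plain symbolic summation/simplification, not the full Wilf-Zeilberger machinery the book covers):

# Ask sympy for a closed form of sum_{k=0}^{n} C(n, k) and check it against 2**n.
from sympy import symbols, binomial, Sum, simplify

n, k = symbols('n k', integer=True, nonnegative=True)
closed_form = Sum(binomial(n, k), (k, 0, n)).doit()
print(closed_form)                    # expected: 2**n
print(simplify(closed_form - 2**n))   # expected: 0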

>> No.6507446

Anyone who knows anything about programming should know that computers are shit at math.

They are just good at doing things really really fast.

>> No.6507495

>>6507444
>A bunch of algorithms implemented in an attempt to solve a specific class of problems about hypergeometric identities is equivalent to "programming a computer to prove things"

>>6507436
lol, what is this paper? It's clearly written by and about people who don't even understand what math is.

>To many in the field, programming a machine to prove a triangle identity — or to solve problems that have yet to be cracked by hand — moves the goalposts of a beloved 3,000-year-old game
>programming a machine to prove a triangle identity
>Proof by exhaustion on infinite sets.
ahahahahahahaha

Also
>>“The soul is the software,” said Zeilberger, who writes his own code using a popular math programming tool called Maple.
>soul
>popular, math, maple
wow dude, this is just awful.

>> No.6507502
File: 664 KB, 1280x800, 03.jpg

>>6507495
>computers can't prove things!
>>oh yeah, what about this
>that's not a proof, because a computer did it

>> No.6507508

>>6507389
show me a computer that could have proved Fermat's Last Theorem

>> No.6507513

>>6507508
Show me a brain that's magic.

>> No.6507514

>>6507502
The computer didn't do it. A group of people developed a bunch of algorithms over a span of several decades that allowed them to solve a problem much faster than they could've done by hand.

If the computer itself had figured out the algorithms and proven (math term) that they worked for the problem, then it would've been the computer doing math. As it stands all it did was computations.

>> No.6507515

>>6507514
The computer did do it.
>b-but someone else wrote the algorithm
B-but someone else taught the mathematician! Who fucking cares?

>> No.6507516

>>6507495
you're retarded. stay in school, kid.

>> No.6507518

>>6507513
This isn't a good argument. No one is denying that computers could eventually reach the point where they could do math, but we're still very far from that point. Though I think machine learning algorithms could certainly prove useful in finding patterns that humans can't see.

>> No.6507520

>>6507518
>this isn't a good argument
Neither is >>6507508

>> No.6507524

>>6507515
Crunching numbers is not the same as a logical proof that crunching numbers is sufficient. On their own, one is a result in mathematics and the other is a waste of funding. Note the countless results in mathematics where only an algorithm is produced and that's sufficient (no one cares about actually finding digits of pi; as long as an algorithm for computing them exists, that's enough).
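To make the "an algorithm exists" point concrete, here's a throwaway sketch in Python (Leibniz series, deliberately slow; the name leibniz_pi is just illustrative):

# pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...); nobody needs the digits, the algorithm is the result.
def leibniz_pi(terms):
    total = 0.0
    for j in range(terms):
        total += (-1) ** j / (2 * j + 1)
    return 4 * total

print(leibniz_pi(1_000_000))   # ~3.1415916, error on the order of 1e-6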

>> No.6507530

>>6507520
I'm not that guy, but it highlights the point that we're not there yet, and until we are you can't make claims like OP's.

>> No.6507549

>>6507524
>encoding mathematical statements doesn't work because then theorem-proving is just number crunching!
cute, but thankfully your distinction is nonexistent.

>>6507530
I don't think "there" is a well-defined point. In philosophy it is a common idea that there are no real skeptics, skeptical responses are just there to keep honest people honest. For some reason, when it comes to the capabilities of computers, people feel that it is actually reasonable to hold foundationally skeptical views, or otherwise create "non-computers" of the gaps arguments.

>> No.6507578

How would a computer proceed, given the definitions it needs, to prove, I don't know, the monotone convergence theorem?

>> No.6507583

>>6507549
You have to prove that a collection of statements, if true, imply that the theorem is true.

>> No.6507586

>>6507549
A point when a computer could prove Fermat's Last Theorem on its own. In other words, a point where it's no longer necessary for humans to do mathematics because computers can.

>> No.6507587

>>6507422
brute force, or hire mathematics PhDs

http://phys.org/news/2014-02-math-proof-large-humans.html
computers have produced, and will continue to produce, proofs too large for humans to check

even so, let people interested in mathematics study mathematics. I wanted to know how computers work in high school, but instead I was forced to learn geometry, algebra 2, precalc and calc, none of which I have ever used in my programming career

>> No.6507607

>>6507587
>I wanted to know how computers work in high school, but instead I was forced to learn geometry, algebra 2, precalc and calc, none of which I have ever used in my programming career
>I don't use math in my programming career, so that means no one does.

>> No.6507624

>>6507607
I am finishing up my BS in computer science and still have no idea how a computer really works from the bottom up. I would much rather have taken discrete mathematics and logic design in high school than geometry and calculus.

I think there still need to be mathematics classes, but they could make all of calculus a 3-unit class instead of a full 15 units, summarizing the concepts and teaching how to use MATLAB to do derivatives, integrals, etc.

>> No.6507631

>>6507586
It's interesting that you chose such an example. If that's your criterion for computers, what does that say about your opinion of people prior to Wiles's proof? It doesn't seem like a particularly interesting point, to me.

>> No.6507635

>>6507624
Yeah, I can agree with that. I guess my issue as a mathfag is that I think there should in general just be a lot more math offered in high school, but it's really not practical. Discrete math would be useful, as would graph theory. Both can become quite hard beyond the basic concepts, though. Even shit like counting can become insanely hard; I wouldn't expect the average high schooler to be proficient at it, and I wouldn't even expect a high school teacher to be comfortable enough with it to grade it.

>> No.6507643

>>6507389
calculations != mathematics.

Math is a science just as physics or chemistry is (albeit not a physical one). New things are being discovered every year.

To put it simply, would you expect any CS major / PhD to make a significant advancement in the field of maths? Probably not, but they will most likely make some insane calculations nonetheless ;).

>> No.6507644

>>6507587

>>6507422
I am this guy.

With your last point, this becomes an entirely different discussion. In this case, I would argue that while a high school curriculum is beneficial and somewhat essential to learning, self-learning is arguably the most important aspect of your life. Mark Twain said:

I have never let my schooling interfere with my education.

And perhaps you should take this quote to heart. If one is truly interested in such a topic, natural curiosity tends to guide them, and they learn without much help from others. Nowadays, you can simply log onto Khan Academy, or, even back when you were in high school, go to the local library and take out books on computers, or take apart a broken computer yourself. Personally, I taught myself calculus in 8th grade, along with about 10 programming languages, while never taking a computer science course because none were offered. The bottom line is that computing courses (in high school) are not essential to your learning, and that while middle-level to high-level courses (by the high school definition, of course) can be rather useful to you, self-learning is typically the prime form of learning.

>> No.6507647

>>6507635
I thought precalc was much harder than discrete mathematics.

precalc had so many different concepts for which I had no practical use, really. Memorizing hyperbolic functions, etc. bored me to death.

My high school was in Silicon Valley, so I feel there were many students who could have handled discrete mathematics and logic design in HS

>> No.6507648

>>6507644
I had to get top grades to get into a top university, and that required me to know 93% of all the material I was taught in school. If all my classes were Pass/No Pass and I could still get into a top uni, I wouldn't care what they taught us in school.

>> No.6507649

>>6507643
creating new algorithms isn't math research?

>> No.6507730

>>6507647
Wait, are you talking about actual discrete math or the discrete math course that several (but not all) universities tend to offer undergrads as an intro-to-proofs course? If the latter, then I can totally understand why you would think discrete math is easy. If the former, then I'm surprised anyone would brush it off as easy.

Assuming the former, would you recommend a high school teach counting like in this PDF
http://ocw.mit.edu/high-school/mathematics/combinatorics-the-fine-art-of-counting/lecture-notes/MITHFH_lecturenotes_2.pdf
or would it be better if they taught the 12-fold way?
http://en.wikipedia.org/wiki/Twelvefold_way

>> No.6507735

Computers can do arithmetic, they cannot do math.

Yes, there is a widespread sentiment that we might be able to "teach" a computer the abstract thought required to independently prove something, but that day is certainly not today.

>> No.6507746

>>6507735
>Computers can do arithmetic, they cannot do math.
The sure sign of a math pleb is someone who thinks that there is a real difference here, when the lack of a difference was proven beyond any doubt over fifty years ago.

>> No.6507775

>>6507746
>lack of a difference proven
lol

>> No.6507779
File: 7 KB, 203x248, 1380573996338.jpg

>>6507746
>the lack of a difference was proven
you fucking people, and this is my first visit to /sci/ in a while

>> No.6507784

>>6507746
Ok, you can't say something that dumb without citing your source.

what proved that there is a "lack of a difference" over fifty years ago?

what is multiplication if not a series of sums?
what are integrals if not a series of multiplications (slivers of area y*dx) then sums?
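If you want that spelled out, here's a rough Python sketch of exactly that view (the function names and step count are made up for illustration):

# Multiplication as repeated addition, and an integral as a sum of slivers y*dx.
def mult_by_sums(a, b):
    total = 0
    for _ in range(b):
        total += a
    return total

def integral_by_sums(f, lo, hi, n=100_000):
    dx = (hi - lo) / n
    return sum(f(lo + i * dx) * dx for i in range(n))

print(mult_by_sums(7, 6))                       # 42
print(integral_by_sums(lambda x: x * x, 0, 1))  # ~0.33333, exact value is 1/3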

>> No.6507786

>>6507784
>>6507746

I should say, I'm not disagreeing with you... the math pleb is wrong. But just... what the fuck are you citing that "proved" this?

>> No.6507795

>>6507624
Computer science is about algorithms not computers. Computer engineers might learn digital electronics, computer architecture, semiconductor physics, etc

>> No.6507796

>>6507784
>what is multiplication if not a series of sums?
>what are integrals if not a series of multiplications (slivers of area y*dx) then sums?
wat

>> No.6507848

This is a very interesting question. A fairly new field of math is the development of computerized proofs, and the only real objection is that it's "cheating". When the last few generations of mathematicians start to die off, it will really start to take off.

>> No.6507855

>>6507389
ask your computer what 1/2 is I double dare you, you nigger.

>> No.6507864

Can a computer set axioms?

>> No.6507870

>>6507855
Depends on which type of number 1/2 is.
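Right, e.g. in Python (any language with separate integer, float, and rational types shows the same split; nothing here is specific to Python):

# The "same" 1/2, asked for as different kinds of number.
from fractions import Fraction
from decimal import Decimal

print(1 // 2)                   # 0    (integer division)
print(1 / 2)                    # 0.5  (binary floating point)
print(Fraction(1, 2))           # 1/2  (exact rational)
print(Decimal(1) / Decimal(2))  # 0.5  (decimal floating point)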

>> No.6507895

>>6507389
Because computers lack creative insight. They also don't generate new ideas.

>> No.6507899

>>6507786
Obviously for a particularly emphasized form of "proved" we'll just run into Löb's theorem, because if I can prove that "if a computer can prove it, it's true," then I can prove it's true. But clearly I can't prove a wide class of statements. But for the offhanded /sci/-remark kind of proof, Gödel's incompleteness theorems show that the language of arithmetic is technically all that is needed to prove things, so computers being able to perform arithmetic (along with the rest of the assorted background logic) is pretty much a done deal; it's just a matter of programming. It was meant in a ha-ha-only-serious manner. Perhaps you've made one or two comments like that in your life.

For the more wishy-washy side of it, you only need to note that it took us a few thousand years to program other people to do calculus, but not quite as long to have a lot of these problems done by computer. They're very quick learners, and we're programming other people to be better at teaching them at an alarming pace. Now if I ask you, could you teach Newton calculus before he invented it himself, you'd probably say yes, but what about Galileo? He was a pretty sharp guy. Or Archimedes or Euclid. (Assume no language barrier, time travel exists, etc.) I mean, there wasn't anything fundamentally different between Eudoxus and Leibniz, just some programming... right?
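(For reference, the Löb's theorem being invoked says, for a recursively axiomatized theory T extending PA with provability predicate Prov_T:

\text{if } T \vdash \mathrm{Prov}_T(\ulcorner P \urcorner) \rightarrow P, \text{ then } T \vdash P.

That is just the statement; no claim here about what it does or doesn't imply for computers.)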

>> No.6507936

>>6507895
>Because computers lack creative insight. They also don't generate new ideas.

But they do! Way back in 1996 the first novel computer-generated proof was announced.

http://www.nytimes.com/library/cyber/week/1210math.html?pagewanted=all

The field of ATP (automated theorem proving) is now full of programs that can generate novel proofs.

Mathematical and logical reasoning is what computers can do well.

>> No.6507937

>>6507389
Because for a huge amount of time each day, you will be doing math. Buying stuff? You gonna be doing some math. Driving somewhere? You probably gonna do some math for how long it's going to take you.

Most of this math is of course pretty simple math, but you gonna need a lot of it and sometimes you just gonna need a bit more.

Saying computers can do math is just stupid. Sure, MATLAB or whatever you use is probably loads faster and perhaps more correct than you will ever be, but you can't use it if you don't understand it at least partially. Even if you just use Wolfram Alpha for everything, you gonna have to be somewhat aware of the math to be able to tell it what to do and change your input slightly to get the right answer. Never mind that you will have to do some calculation to know if the answer it gives you is the one you want.

>> No.6507944

>>6507937
Throughout all of human history we learned new math on accident, because there was no one to teach us. Please don't mistake this sorry state of affairs for being better than computers, who are just now learning, merely because of the force of a long history of such pathetic fumbling, masked for psychological reasons by the word "creativity" or "awareness."

>> No.6507973

You need to understand math before you can make a computer do it for you.

>> No.6507979

>>6507973
Is that why only grandmasters make chess programs?

>> No.6508033

>>6507407
a resistor and a capacitor can do integration.
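It can, approximately: over times short compared to the RC time constant, the capacitor voltage tracks the scaled integral of the input. A rough numerical sketch in Python (component values and step size are made up; simple Euler stepping):

# RC circuit ODE: dv_out/dt = (v_in - v_out) / (R*C). For t << R*C it acts like an integrator.
R, C = 10_000.0, 1e-6      # 10 kOhm, 1 uF -> R*C = 10 ms
dt = 1e-6                  # 1 microsecond steps
v_in, v_out = 1.0, 0.0     # constant 1 V input
for _ in range(1000):      # simulate 1 ms
    v_out += (v_in - v_out) / (R * C) * dt
print(v_out)               # ~0.095; an ideal integrator (1/RC) * integral(v_in dt) gives 0.1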

>> No.6508037

>>6507899
>Obviously for a particularly emphasized form of "proved" we'll just run into loeb's theorem, because if I can prove that "if a computer can prove it, it's true," then I can prove it's true. But clearly I can't prove a wide class of statements.
That doesn't imply that the converse of your statement is true.

>Gödel's incompleteness theorems show that the language of arithmetic is technically all that is needed to prove things
What? How do Gödel's incompleteness theorems even suggest that?

Archimedes had actually worked out differential and integral calculus in his time. The integral part of it was lost until the 1990s, though, with the only copy having been overwritten with religious scripture during the Dark Ages.

>> No.6508043

>>6507936
These are machine learning algorithms anon. They pretty much take a ton of sample data and attempt to mimic the underlying patterns.

>> No.6508046

>>6508037
The theorem doesn't suggest it. It uses the exact mechanics to prove its point. This is one of those cases where you look at the finger and not where it is pointing, rare though they may be.

>>6508037
>Archimedes had actually worked out differential and integral calculus in his time.
Yeah I'd like a cite on that one.

>> No.6508048

>>6508033
Computations are not proofs anon. Plants growing in a flowerpot also do a bunch of calculus computations to figure out which way to grow.

>> No.6508049

>>6507979
He didn't say you had to be the best in the world at math, he only said you had to understand it. I'm pretty sure the people who make chess programs understand the rules and have studied how the game is played.

>> No.6508058

>>6507973
>You need to understand math before you can make a computer do it for you.

that's like saying: you need to learn how to play a piano before you can play piano.

please, just quit posting and lurk. Better to remain silent and be thought a fool than to speak and to remove all doubt.

>> No.6508060

>computers can't do math!
sure they can, like this
>well it's not theorem-proving, that's just routine calculation
but what about these theorems
>well they're not discovering it on their own, they still have to be programmed!
but what about these self-learning de—
>it just doesn't count, ok, please believe me!

>> No.6508063

>>6508037
Why would someone argue for the converse of their own statement?

>> No.6508066

>>6508046
I think you misunderstood the theorem, anon. It applies to a specific axiomatic system, but that doesn't mean said axiomatic system is sufficient to prove statements in other axiomatic systems.

>> No.6508067

>>6508058
not really

it's like saying "you need to learn music theory to play piano": although not 100% true, it makes things a hell of a lot easier and makes you better

>> No.6508072

>>6508066
It has literally nothing to do with the results of the theorem, bro.

>> No.6508079

>>6508046
It had been known for ages that he had worked out some primitive form of differential calculus. However, apparently he had written a treatise, surviving only as a palimpsest, where he detailed his actual method when thinking about how to approach problems, and in that text he described his primitive form of the integral, which he uses to compute the volumes of some solids.

The recovery project has been ongoing, and it's hard to find information on it since it isn't all readily accessible. It's taken so long because the only existing copy of the palimpsest (discovered in the 1900s) was bleached, cut in half, restitched lengthwise, and written over with religious text; on top of that, pages have been torn out and it's been burned, amongst other things. You can find some interviews here and there where the researchers describe his writings on the integral, but I don't have any bookmarked to link. Here is their webpage. Soon Google is going to make available a recovered digital version of the palimpsest.

http://archimedespalimpsest.org/digital/

>> No.6508086

>>6508063
Given a statement
if P then Q
the converse is
if Q then P

It's a common error for people to see a situation where it's true that if P then Q and assume that also means if Q then P. In other words "If a computer can prove a theorem then so can a human" does not imply that "if a human can prove a theorem then so can a computer".
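If it helps, you can brute-force that by enumerating truth values (a trivial Python sketch; the helper name is made up):

# (P -> Q) does not force (Q -> P): look for a row where the first holds and the second fails.
implies = lambda p, q: (not p) or q
counterexamples = [(p, q) for p in (False, True) for q in (False, True)
                   if implies(p, q) and not implies(q, p)]
print(counterexamples)   # [(False, True)] -> P false, Q true satisfies P->Q but not Q->P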

>> No.6508087

>>6508079
Well it was known that he was working in that direction, sure. I'd be interested to see the final results of this.

Either way, my question did not depend on exactly the people I mentioned; I only mentioned them for being memorable people of their time.

>> No.6508091

>>6508086
>>6507513

>> No.6508126

>>6508087
I was able to dig up some more information.

An explicit description of some of the techniques used in the palimpsest.
http://en.wikipedia.org/wiki/The_Method_of_Mechanical_Theorems

A more general description of the palimpsest's content.
http://en.wikipedia.org/wiki/Archimedes_Palimpsest#Mathematical_content

This is new to me as well; prior to this I'd only seen interviews. There are images up on their site, but aside from being huge and illegible, they're also in Greek.

>> No.6508138

>>6508067
Not really. I remember "doing" math in high school by punching numbers into my calculator and getting an answer without really understanding how. I might have gotten the correct answer, but that sort of approach, relying on formulas and computers, would never allow me to create something unique. By the same principle, if we relied solely on computers to do mathematics for us, we would never advance beyond our current level.
>>6508067
Although it is always tempting to make analogies to support an idea, they don't always hold up. Learning to play an instrument is fundamentally different from learning math, yet at the same time the ways you learn them are similar. While both disciplines may be learned through rigorous practice, the baseline of mathematics is to show relationships between values in an objective manner. You could just plug values into a calculator and get an answer, but you would be stuck when confronted by a unique problem. Had you learned how mathematical formulas are derived, how to structure a problem, and how to translate it into the language of mathematics, you might be able to create a way to solve a unique problem and even write a program so that a computer could do the calculations for you. An instrument, on the other hand, is rather a means to express emotion through sound. Music came first; the theory came afterwards.

>> No.6509688

>>6507389
>have computers that can do math speed of light

So velocity is now a measure of computational speed?

>> No.6509692

Calculations and math are not the same thing.

>> No.6509712

>>6509692
Implying you can't derive the truth of the cosmos just from the Peano axioms.

>> No.6509984

>>6509712
>implying that pure mathematicians care about the cosmos. It's just a special case after all.