
/sci/ - Science & Math



File: 195 KB, 502x528, matlab is better.png
No.9781695

Which algorithms does pic related use to solve systems of differential equations? Trying to build my own solver system in Python; is it anything beyond Euler algorithms?
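For reference, this is the kind of baseline Euler loop I already have (rough sketch only; the decay equation and the names are just placeholders):

import numpy as np

def euler_solve(f, y0, t0, t1, h):
    # forward Euler: y_{n+1} = y_n + h * f(t_n, y_n)
    ts = np.arange(t0, t1 + h, h)
    ys = np.zeros((len(ts),) + np.shape(y0))
    ys[0] = y0
    for n in range(len(ts) - 1):
        ys[n + 1] = ys[n] + h * f(ts[n], ys[n])
    return ts, ys

# example: dy/dt = -2y, y(0) = 1, exact solution exp(-2t)
ts, ys = euler_solve(lambda t, y: -2.0 * y, np.array([1.0]), 0.0, 2.0, 0.01)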

>> No.9781699

>>9781695

Are we just talking ODEs here? In that case you can just reduce comparatively complex ones down to systems of first order, which can be solved pretty easily (careful with inhomogeneous ones though). For PDEs it's a gigantic can of worms you don't want to open^^
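e.g. a second-order ODE y'' + c*y' + k*y = 0 becomes a 2D first-order system like this (rough sketch; the damped-oscillator constants and the scipy call are just my example):

from scipy.integrate import solve_ivp

c, k = 0.5, 4.0

def rhs(t, u):
    # u[0] = y, u[1] = y'  ->  u[0]' = u[1],  u[1]' = -c*u[1] - k*u[0]
    return [u[1], -c * u[1] - k * u[0]]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0])  # y(0) = 1, y'(0) = 0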

>> No.9781711

>>9781699
Just ODEs for the purposes of my solver

I'll open the PDE can of worms though if you have it

>> No.9781717

>>9781695
Solve them numerically or analytically?

>> No.9781722

>>9781695
https://en.wikipedia.org/wiki/Runge%E2%80%93Kutta%E2%80%93Fehlberg_method
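If you want to see an embedded adaptive pair working before writing your own, scipy's solve_ivp ships one (its "RK45" is the Dormand-Prince pair, a close cousin of Fehlberg's rather than the exact RKF coefficients). Rough sketch:

from scipy.integrate import solve_ivp

# the solver compares a 4th- and 5th-order estimate each step and
# grows/shrinks the step size to hit the requested tolerance
sol = solve_ivp(lambda t, y: -2.0 * y, (0.0, 5.0), [1.0],
                method="RK45", rtol=1e-8, atol=1e-10)
print(len(sol.t))  # unevenly spaced steps chosen by the error controller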

>> No.9781723

implement this

https://en.wikipedia.org/wiki/Risch_algorithm

>> No.9781726

>>9781717
Numerically is fine; I'm trying to create a solver that can study the behaviour of the system depending on what constants I feed it, etc.

>>9781722
Cheers

>> No.9781728

>>9781711

There is no closed, general, analytical method for solving PDEs ;) If you find one that can even just show when solutions exist, you'll get a million dollars for trivializing the Navier-Stokes problem^^

>> No.9781733

>>9781723
Thanks

>>9781728
What numerical methods exist?

>> No.9781739

>>9781733
http://4chan-science.wikia.com/wiki/Computer_Science_and_Engineering#Numerical_Partial_Differential_Equations

>> No.9781746

>>9781726
I used the Runge-Kutta method for ODEs and Crank-Nicolson for PDEs in Python. It's quite fun writing them out yourself, as opposed to using a prepackaged option - you really get to understand how they work. It's really not that difficult: if you understand the method on paper (you can even do one or two steps by hand to really get it), it is easy to write in code.
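For the PDE side, a Crank-Nicolson run on the 1D heat equation u_t = D*u_xx with zero boundaries looks roughly like this (sketch from memory; the grid sizes and D are arbitrary):

import numpy as np

D, nx, nt = 1.0, 51, 200
dx, dt = 1.0 / (nx - 1), 1e-4
r = D * dt / (2 * dx**2)

x = np.linspace(0.0, 1.0, nx)
u = np.exp(-100 * (x - 0.5) ** 2)   # initial bump
u[0] = u[-1] = 0.0                  # Dirichlet boundaries

# interior points: (I - r*L) u^{n+1} = (I + r*L) u^n,  L = tridiag(1, -2, 1)
m = nx - 2
L = np.diag(-2.0 * np.ones(m)) + np.diag(np.ones(m - 1), 1) + np.diag(np.ones(m - 1), -1)
A, B = np.eye(m) - r * L, np.eye(m) + r * L

for _ in range(nt):
    u[1:-1] = np.linalg.solve(A, B @ u[1:-1])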

>> No.9781770

>>9781733

Never tried anything beyond the already-suggested Crank-Nicolson method for them; I wasn't that interested^^

>> No.9781798

Is there any way to solve PDEs where you can't solve for y_(n+1)?

>> No.9782481

>>9781798
solve for y_n+3 then for y_n-2

>> No.9782542

>>9781695

holy shit you sound like a fucking retard

>> No.9782555

>>9781695
>an extremely reliable, easy to use and versatile CAS which can spit out analytic solutions or very decent numerical approximations in usually 4-5 seconds

>does it just use the Euler method

sure OP

>> No.9782558

Why the fuck is multigrid so slow

>> No.9783584

>>9782542
>>9782555
Well I don't fucking know, do I? Why do you think I posted this?

>> No.9783642

Have you ever picked up a numerical analysis book? Runge-Kutta methods, for starters (Euler's method is just the simplest one-stage RK), improve on Euler by taking several intermediate slope evaluations within each step. Higher-order RK methods are already way better than Euler's, which has local truncation error O(h^2) and is only first-order accurate overall.
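A single classical RK4 step is only a few lines, something like this (minimal sketch, not claiming it's what MATLAB/Mathematica actually ship):

def rk4_step(f, t, y, h):
    # four slope evaluations per step, weighted-averaged, vs Euler's single one
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)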

Then there are multi-step methods, which are essentially derived by integrating an interpolating polynomial through the slope values at however many previous steps you want to use. The simplest case is once again Euler's method: integrate the constant interpolant of f at t_n over [t_n, t_n + h] and you recover u_(n+1) = u_n + h*f(t_n, u_n). Using the line through the slopes at t_(n-1) and t_n instead gives the two-step scheme sketched below. Look up Adams-Bashforth methods; they are the common family of explicit multi-step methods obtained in this manner from higher-degree polynomial interpolations.
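Rough sketch of that two-step Adams-Bashforth scheme (bootstrapping the second point with one Euler step; the names and test equation are just placeholders):

def ab2_solve(f, y0, t0, t1, h):
    # y_{n+1} = y_n + h * (3/2 * f_n - 1/2 * f_{n-1})
    ts = [t0, t0 + h]
    ys = [y0, y0 + h * f(t0, y0)]   # bootstrap the second value with forward Euler
    while ts[-1] < t1:
        fn, fnm1 = f(ts[-1], ys[-1]), f(ts[-2], ys[-2])
        ys.append(ys[-1] + h * (1.5 * fn - 0.5 * fnm1))
        ts.append(ts[-1] + h)
    return ts, ys

# example: dy/dt = -2y, y(0) = 1
ts, ys = ab2_solve(lambda t, y: -2.0 * y, 1.0, 0.0, 2.0, 0.01)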

Then you can get into implicit methods, which are usually handled with predictor-corrector schemes. Implicit methods feature the next value u_(n+1) on both sides of the equation, which makes them impossible to solve for directly in many cases. In that case, start with an explicit method such as Adams-Bashforth, then plug the predicted u_(n+1) into an implicit Adams-Moulton formula of the same order: hence predictor (explicit method), corrector (implicit formula evaluated with the prediction). Predictor-corrector methods yield far higher precision than explicit methods alone.
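One predictor-corrector pass of that kind could look like this (just a sketch, with two-step Adams-Bashforth as predictor and the two-step Adams-Moulton formula as corrector; not whatever any particular package actually does):

def pc_step(f, ts, ys, h):
    # needs at least the last two points (ts[-2], ys[-2]) and (ts[-1], ys[-1])
    t, y = ts[-1], ys[-1]
    fn, fnm1 = f(t, y), f(ts[-2], ys[-2])
    y_pred = y + h * (1.5 * fn - 0.5 * fnm1)              # predictor: explicit AB2
    f_pred = f(t + h, y_pred)                             # evaluate at the prediction
    return y + (h / 12.0) * (5 * f_pred + 8 * fn - fnm1)  # corrector: implicit Adams-Moulton formula, with the prediction plugged in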

Then there's something called iterated deferred correction (IDEC), which I didn't really understand or remember (it had something to do with interpolating the error term), but which apparently works amazingly well.

Anyways, Euler's is terrible, it's just a starting point.

t. Brainlet who just finished math 128A at Berkeley.

>> No.9783645

>>9783642
To add to my reply, Wikipedia has a good description of a very simple method that uses Euler's method to predict and the trapezoid rule to correct, called Heun's method: https://en.m.wikipedia.org/wiki/Predictor–corrector_method
This should radically improve accuracy over Euler's method alone.
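For what it's worth, a minimal Heun step is just (sketch only, nothing package-specific):

def heun_step(f, t, y, h):
    y_pred = y + h * f(t, y)                            # predict with forward Euler
    return y + (h / 2) * (f(t, y) + f(t + h, y_pred))   # correct with the trapezoid rule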

>> No.9783659

>>9783645
Yeah I'm aware of the predictor-corrector Euler, that's what I meant in my OP.

>>9783642
And no, I've never picked up a numerical analysis book; surely that's blatantly obvious from my question.

I've been looking at various techniques, but I posted here because I'm curious to know exactly which methods Mathematica uses - I can't find that information anywhere.

>> No.9784223

Related question, is there a book with all these algorithms?

>> No.9785152

So do modern mathematicians only do software programming these days?

Do any mathematicians still make advancements the old-fashioned way, with just a pen and paper?

>> No.9786746

>>9785152

You're fucking stupid.

>> No.9786787
File: 61 KB, 490x404, shitposting.jpg

>>9786746
>(you)

>> No.9787570

>>9786746

That doesn't answer the question.

>> No.9787867

>>9784223
Scientific Computing with MATLAB and Octave; there are PDFs easily found online.
At Berkeley we used Burden, Faires, and Burden, but the professor and GSIs all thought the book was trash.