
/sci/ - Science & Math



File: 51 KB, 307x173, Physics-informed_nerural_networks.png
No.14994421

Any /sci/ anons working on Deep Learning in Fluid Mechanics? What's your opinion on pursuing a PhD in this?

>> No.14994433

>>14994421
Why would you use deep learning in fluid mechanics? That doesn't make any sense.

>> No.14994460

>>14994421
here are several pieces of advice:
1) do not let anyone, especially a research advisor, talk you into pursuing a topic just because it is currently trendy/easy to publish, like AI/deep learning - been there, done that, currently regretting it
2) AI of any kind (especially deep learning) is all about the data, especially high-quality data - even the best model will have shit performance if it does not have enough data to learn from
3) since doing the amount of experiments needed to generate sufficient data for deep learning is hardly feasible, you will almost certainly have to generate data by numerical computation (because analytical models either have very limited applicability or suck ass in most real-world scenarios) - if you are not skilled in that and/or do not have powerful enough computing hardware, don't even start; learning/doing computation on top of learning/doing AI will be the end of you like it is being the end of me right now
4) if possible, look for data that already exists; it has the benefit of you not needing to make it yourself (and that is a BIG plus)
5) have a plan B that remains feasible with the work you have done for plan A, in case plan A fails

>> No.14994463

>>14994460
fuck, messed it up - I am not OP

>> No.14994479

>>14994433
For potentially designing novel CFD solvers that can combine physics and data. Might be super useful in the aerospace and automobile industries for rapid analysis of designs.

>> No.14994492

>>14994460
Thanks anon for the advice.

I should clarify that I am interested in this topic because I see a lot of potential applications of this technology (as digital twins or novel simulation algorithms etc).

Another plus point of this "Physics Informed" Deep Learning is that we encode the physical laws as loss functions, so we can train our model with sparse data.
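
To put it concretely (my own shorthand, not notation lifted from any particular paper): the total training loss is roughly

L_total = L_data + lambda * L_physics

where L_data is the ordinary MSE against whatever sparse measurements you have, and L_physics is the mean squared PDE residual |N[u]|^2 over a set of collocation points, with N the governing PDE operator (e.g. N[u] = u_t + u*u_x - nu*u_xx for viscous Burgers). Driving L_physics to zero pushes the network toward physically consistent solutions even where there is no data.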

>> No.14994715

>>14994492
first I'm hearing of it, tell me more / give literature references
I am >>14994460

>> No.14996209

>>14994715
Sure,
https://arxiv.org/abs/1711.10561
https://arxiv.org/abs/2104.08249

>> No.14996316

>>14996209
these look like good sources, thank you for sharing them.
I am a bit worried about the trustworthiness of the 1st one, as it is only arXiv-published instead of journal-published (i.e. likely not peer-reviewed), but if it is indeed sound, it is a very informative source
you seem to have thought this out fairly well, but I would advise you to first at least try developing something yourself that would work as a proof-of-concept for what you are trying/wanting to do.
once you have that, it will be easier to convince professors to take you as a student (don't forget - a phd is a 2-way endeavor - you may want to be a phd student for a certain professor, but that professor must also want you to be their student)
As for your advisor, I suggest looking for one from either a numerical modelling or a machine learning background, as having that expertise to draw on will likely prove helpful overall.
Institutions don't matter as much as the work you do there, so don't get discouraged if you don't get into the best schools; being even at a mid-tier institution is sufficient if you don't slack in doing your work

happy hunting

>> No.14996317

>>14996209
say, perhaps you have any more related sources?

>> No.14996319

>>14994479
>Might be super useful in the aerospace and automobile industries for rapid analysis of designs.
There are plenty of rapid analysis tools that actually work, and deep learning would be poor for this purpose, since it only learns a black-box model that can't give you any insights.

>> No.14996716

>>14996316


>As for your advisor, I suggest looking for one from either a numerical modelling or a machine learning background, as having that expertise to draw on will likely prove helpful overall.
>Institutions don't matter as much as the work you do there, so don't get discouraged if you don't get into the best schools; being even at a mid-tier institution is sufficient if you don't slack in doing your work

I was just about to ask this. I had a discussion with an assistant professor at a mid-tier US uni who has numerical modeling and deep learning experience. I presented a small project I had done using PINNs for a toy CFD problem. The prof was impressed and encouraged me to apply. Also, he has good funding and some innovative ideas.

The only issue is that his lab is pretty new (currently 3 PhD students working under him and the lab was established in 2019-20).

Do you think it is worth the risk to do PhD under him?

>> No.14996723

>>14996317
I believe this literature review (https://arxiv.org/abs/2201.05624) tries to cover ongoing research on this topic.

>> No.14996733

>>14996319
I agree anon, but a lot of these fast-running models sacrifice a lot of accuracy.

Think of developing a surrogate model of a jet engine by training it on experimental and 3D numerical studies, then using it for intersystem analysis instead of 1D models.

Although, I agree on the black box argument.

>> No.14996832

>>14996716
>I was just about to ask this. I had a discussion with an assistant professor at a mid-tier US uni who has numerical modeling and deep learning experience. I presented a small project I had done using PINNs for a toy CFD problem. The prof was impressed and encouraged me to apply. Also, he has good funding and some innovative ideas
Seems like a good choice from what you described. Take my words with a grain of salt though, I am only a phd student myself (3rd year)
>The only issue is that his lab is pretty new (currently 3 PhD students working under him and the lab was established in 2019-20)
That is sufficient for a small lab; I work in a lab/research group not much larger than that. Also, if he is innovative and has good funding lined up, it could be worthwhile. Younger professionals are typically more ambitious than well-established profs because they still have to prove themselves, and are more hands-on too (but this may not necessarily be the case with everyone). In my opinion, having a mentor who actually works with you/assists/advises you is more important than having a fancy name on the paperwork, even if that name can open more doors for you later on, but that's just my opinion. The newer professional will likely be able to help you develop actual skills/know-how but will not have the name recognition, while the more established professional will leave you on your own but will have better name recognition and connections.
Also, consider the time commitments, because face-to-face communication is very valuable - would you rather have someone who can meet with you for 1-2 hours weekly or even more, or someone who can give you 30 minutes max once every 2 weeks or once a month?
Also, I cannot stress this enough - it really helps if you are on good terms with your advisor and like them as a person, that can make a big difference - do you like this guy? Because working for/with a likable person makes a big difference for your motivation

>> No.14996840

>>14996723
thank you, already bookmarked it

>> No.14996911

>>14996723
if you have gathered any other literature regarding this physics informed neural network concept, please share, it is very relevant to me

>> No.14996996

>>14994433
Well, they're both basically the same thing: very complex, convoluted derivatives.

>> No.14997910

>>14996832

>Also, I cannot stress this enough - it really helps if you are on good terms with your advisor and like them as a person, that can make a big difference - do you like this guy? Because working for/with a likable person makes a big difference for your motivation


From the conversation I had with him and his current PhD student, he seems to be a chill and nice person to work with.

Thank you for such a detailed response, anon. This has helped me clear up a lot of my doubts and questions about enrolling in a PhD.

And best of luck with your PhD!

>> No.14997916

>>14996911
I would recommend following Prof. George Karniadakis of Brown University. He and the post-docs from his lab (https://www.brown.edu/research/projects/crunch/home) have basically pioneered this deep learning strategy. Hope this helps!

>> No.14998186

>>14997910
if that is the case, and you really want to do a phd (it is not those who are the most brilliant, but those who persevere, who end up as victors in the end) and really have nothing better to do, might as well go for it
>>14997916
will follow up on this, thank you

>> No.14998316

>>14998186
>really have nothing better to do, might as well go for it

This keeps haunting me. I had superb grades and multiple internships, and bagged a CFD analyst role at a big automobile company after my Mech bachelors

On the other hand, my CS and EE buddies barely passed their courses and got into FAGMAN earning 4-5x my income.

A part of me still keeps telling me to ditch everything, do an MS in Supply Chain or Data Science, and join a tech company.

>> No.14998942

>>14994460
>or do not have powerful enough computing hardware, don't even start

Y'know, some colleges have supercomputers, or at the very least outsource the computational requirements somewhere else. Could even use Google Colab

>> No.14998953

>>14994433
Well it does
1) AI is overhyped, so you need to put AI in somewhere to get good reviews.
2) Actually solving the differential equation can be a very computationally demanding task; AI can basically provide an approximate result with much lighter computation. It's like being able to tell how a liquid would move without solving any equations - like remembering how the water curls in your grandma's garden fountain

>> No.14998996

>>14998942
that assumes the uni is rich enough to own the hardware (supercomputer/cluster/etc.) and is not a primarily experimental-research institution - some unis focus more on theory, others more on experimental approaches and have barely any theoretical people working there

>> No.14999095

>>14998316
yes, market need plays an important role in employment and salary
sadly, higher education does not necessarily mean better employment - the longer you stay in uni (in the case of studying) or in science/academia (in the case of work), the more you gain science-related skills and know-how, but not industry-related skills and know-how; that can lead to a bad market fit, or being "overeducated", especially if you are doing this education in something that is not directly industry-related.

I think that if you already have a good job you are happy with, it is not worthwhile to go back to school
If you don't like your job or want to change your field, then going into further education might be a real option
Doing a phd means you are eligible for a scientific career afterwards, but you might be less employable in industry (all the time you spend doing a phd is time you are not gaining industry job experience), except for cases where your specific skillset is needed, like the R&D department of a company that works specifically in that field.
Also, looking into an industrial phd might be another option (yes, that is a real thing - you essentially do your phd not only at the uni, but also at a company)

>> No.14999102 [DELETED] 

>>14994421
We're probably all going to be dead before you finish your degree.

>> No.15000904

bumping for currently best /sci/ content to stay alive

>> No.15001174

>>14996209
>>14996723
>>14997916
largely, these speak about solving single diff equations, not systems of them - do you have any sources that solve systems of equations, like the Maxwell equations for instance?

>> No.15001356

>>15001174
Nope, sorry. But I think if you search on Google using the string "PINN" + whatever PDEs, ODEs etc., you should get some decent results.

This research area is pretty novel, so I would say there is a chance that the topic you are looking for has not been published on yet.

>> No.15003146

bump

>> No.15003184

>>14994421
If you're pursuing a phd in deep learning in 2020+, you're literally a decade behind; your work will be meaningless.
Unless you're developing new model/platform paradigms for ML and AI, you're literally going nowhere.
ML is probably one of the most well-studied topics, on par with silicon, which is way too mature to be relevant as a research topic these days; all mediocre researchers do is throw ML at some problem and see if it sticks, when there are thousands of other ways around the problem that just require a bit of engineering knowledge and creativity.
People literally rediscover the wheel every single day by using ML on problems that are already solved using cheap signal processing / control theory tricks.

>> No.15003197

>>14998316
The honeymoon phase of CS is already in the past, don't worry.
You can tell something's wrong when people who graduate with STEM degrees become influencers and tell you with a straight face that they never need the math/physics/hardware in their job.

I do embedded design for my country's defense procurement (or whatever you call it in English) and I have to learn new physics shit every single day; I work with propulsion engineers, have participated in designing simulators, etc.

When I hear about fagman stacies racking up 120k a year + benefits for working on a non-profitable app that has barely changed since the day it was created by the CEO in his garage, and is still somehow buggy as hell on 2-year-old smartphones, I can tell the market is going to collapse in the coming years.

Massive layoffs in Silicon Valley are just the beginning.

>> No.15003517

>>15003184
Although I agree with your sentiment anon, topics like turbulence modeling in fluid dynamics have not progressed in any significant way theoretically for the past 2-3 decades, because they are almost impossible to simulate computationally except for extremely basic problems. If we could design human-interpretable DL models, we may have a chance to demystify turbulence and in turn help design more efficient machinery.

I would say that ML/DL for computer vision or NLP is oversaturated, but scientific ML, or at least physics-informed ML, is just getting started.

>> No.15003523

>>14998953
> 2) Actually solving the differential equation can be a very computationally demanding task; AI can basically provide an approximate result with much lighter computation. It's like being able to tell how a liquid would move without solving any equations - like remembering how the water curls in your grandma's garden fountain

Genius analogy anon

>> No.15003535

>>15003197
Yeah, it feels like a bubble, but I still get FOMO when I hear my tech friends' TC.

It is not that I am jealous of my buddies, but I feel I fucked up by taking up Mechanical Engineering as my major.

Idk anons, how do you guys make decisions about your education? Do you just follow whatever interests you, or do you try to learn the skills that may help you earn big?

>> No.15003877

>>15003184
any examples of signal/control theory problems that retards "solved" with ML?

>> No.15003878

>>14996733
So do ANNs. You're replacing a specialized heuristic with a worse one. One that doesn't even save computational power.

>> No.15003898

>>15003878
PINNs are ANNs + physics laws encoded as loss functions.

But in a way, you're right. This technique might not be mass-adopted in industry, but it will be an important tool, as powerful as or better than the Finite Volume Method, in fundamental research.

For example, a scientist who has minimal experience with simulations can use this DL method instead of writing sub-optimal code or, worse, buying commercial software.

>> No.15003909

>>14994433
i would guess performance efficiency?
the ai estimates or infers a way of approximating realistic dynamics with better performance

>> No.15003910

just watch water flow and use your head, duh

>> No.15003920

>>14994492
>Another plus point of this "Physics Informed" Deep Learning is that we encode the physical laws as loss functions, so we can train our model with sparse data.
The value of your "trained model" is zero if it is simply curve fitting to some experimental data, i.e., AI applied to the problem as you describe it.

Protip: the "loss function" has already been formulated as an error minimization problem by, e.g., the finite element and finite volume methods.

What do you expect mathematical optimization applied to experimental datasets to contribute to fluid mechanics? The purpose of fluid mechanics research is to derive rational models and improve on them - not to do case-specific curve-fitting.

>> No.15003929

>>15003898
>For example, a scientist, who has minimal experience in simulations, can use this DL method instead of writing sub-optimal code or worse, buying commericial software.
How will you set up a fluid mechanics problem and solve it in a rational (read: generalizable) manner using the DL method you mention?

Fluid mechanics solvers use weak forms of PDEs and retain a rational basis by doing so. How does your DL method differ from this? If the difference is that the DL method relies only on data (from experiments or from simulations based on classical FEM/FVM/FDM), it is useless, because such results cannot be generalized.

>> No.15003937

Should I do a PhD in turbulence after a master's degree in physics? Do you have suggestions for a master's thesis?

>> No.15003940

>>14994421
wtf is the point, you can get fluid-like behavior without deep memeing. I thought the whole problem with fluid sims was accuracy?

>> No.15003993

>>15003920
>>15003929
Anon, I think you should go through these papers >>14996209

The point of PINNs is that they require sparse or even no data to train. It is basically unsupervised learning.

The way it trains is that it uses a generalization of the backpropagation algorithm known as automatic differentiation (auto-diff). After calculating the outputs (which are flow parameters like velocity, pressure etc.), it uses the auto-diff algorithm to calculate their partial derivatives with respect to the inputs, which are the space-time coordinates. So, in essence, we are trying to represent the PDEs without any FVM/FEM discretization.
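
To illustrate (a minimal PyTorch sketch of my own, not code from the papers above; the 1D viscous Burgers' equation and all names here are just example choices):

import math
import torch

def burgers_residual(model, x, t, nu=0.01 / math.pi):
    # x, t: (N, 1) collocation-point tensors created with requires_grad=True
    u = model(torch.cat([x, t], dim=1))
    ones = torch.ones_like(u)
    # auto-diff gives the partial derivatives w.r.t. the space-time inputs
    u_x, u_t = torch.autograd.grad(u, (x, t), grad_outputs=ones, create_graph=True)
    u_xx = torch.autograd.grad(u_x, x, grad_outputs=torch.ones_like(u_x), create_graph=True)[0]
    # residual of u_t + u*u_x - nu*u_xx = 0; its squared mean goes into the loss
    return u_t + u * u_x - nu * u_xx

No mesh anywhere - the derivatives come straight from the computational graph.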

On the other hand, you have correctly pointed out that a trained DL model can only represent one instance of the boundary and initial conditions. But some researchers have already used techniques like transfer learning with pre-trained models. Furthermore, some are even claiming that they can create generalized models (https://arxiv.org/abs/2202.05476).

>> No.15004017

>>15003993
no training data required - how? unsupervised learning is only supposed to do data clustering as far as I know
do you have more literature on this?
also, maybe you have some good step-by-step tutorials for a beginner?

>> No.15004071

>>15004017
>no training data required - how?

Basically, you segregate the domain into 3 parts and represent each point as a tuple of space and time (x,y,z,t). The three parts are the initial conditions (x,y,z,0), the boundary conditions (0,t) and the internal points (x,y,z,t).

For the initial and boundary conditions, you basically use the supervised strategy, i.e. calculating the MSE loss between the predicted values and the ground truth.

The "unsupervised" part kicks in for the interior points, where you basically use the PDEs to "regularise" the solution.

>also, maybe you have some good step-by-step tutorials for a beginner?
There are many implementations available on GitHub, but I would recommend looking into the DeepXDE library (https://deepxde.readthedocs.io/en/latest/). It explains this stuff using rigorous math.

>> No.15004076

>>14994460
>since doing the amount of experiments needed to generate sufficient data for deep learning is hardly feasible, you will almost certainly have to generate data by numerical computation
Can't you just have a physics experiment recording data 24 hours a day?

>> No.15004104

>>15003535
>but I feel I fucked up by taking up Mechanical Engineering as my major.
Heh, at least you are not me. Took a mechatronics degree and now all ME/EE/CS fags laugh at me, feelsbadman

>> No.15004108

While this thread is up, any good resources for deep learning theory? In particular, any good books besides the Goodfellow/Bengio one?

>> No.15004141

>>15004071
thank you for the reference, seems like an informative and worthwhile read
this seems to involve time; what if I want a steady-state solution, like from the FDFD or RCWA methods for the Maxwell equations?

>> No.15004161

I'm happy I found this thread. I have been looking into PINNs a lot recently, and I struggle to see what's interesting about them.

It's a method to solve PDEs by incorporating the form of the PDE into the loss. There is not much novelty here in my opinion. The first problem, which some other anons pointed out, is that they're data-driven. As with any DL algorithm, they will not generalize well, you have no guarantees, and you need to solve the PDEs yourself first (to generate the training data) to be able to train them. I don't have much knowledge about the application side, but I've heard (from researchers in fluid dynamics) that standard numerical solvers are faster (this of course depends on what you want to do).

I can see two interesting directions, which I'm currently pursuing. The first addresses the data-driven problem: a better method - which is at the core of DL research - is building the biases you want (in this case constraints from a PDE) into the architecture. A lot has been done on incorporating group symmetries into NNs (convolutional nets, g-convolutions, graph networks) - the same can be done for physical laws (https://arxiv.org/abs/2210.01741)

The second is discovering the underlying PDE from data, i.e. symbolic regression. This is more about using NNs to discover new governing laws or invariants from data. I find the idea very appealing, but it's very engineery and uninspired at this point. You don't have many mathematical guarantees, as there will be infinitely many equations fitting your data. How to choose the correct one is an open problem.

>> No.15004176

>>14994421
I can see this working for video games, CGI, VR and AR, or anything that doesn't require a realistic model. You could make gameplay where the player is an alchemist and has to forge their own concoctions, stuff like that. It probably won't be as accurate as an actual calculated rendering.

>> No.15004373

>>15004071
I see, very interesting stuff
this is for python
realistically speaking, how fucked am I for taking the matlab approach here? from what I've read, it has the base functionality for neural networks and deep learning, but is not very popular in the deep learning community
I had 1 course where I had to use python and was terrible at it; I found it difficult to go from matlab to python, since most of my previous work was done in matlab. I have ~1 year left in my phd - is it still worthwhile to try to learn a new language (python) in order to have better opportunities to do these things, or should I stick with matlab, because the time it takes to transfer would outweigh the increase in opportunities and it would not make sense to transfer any more (a.k.a. too late, already rekt)?

serious question - what do you think?

>> No.15004951

>>15004373
I think, anon, you should try to learn python. It will be beneficial to you in the long run

>> No.15005411

>>15004951
I understand that and plan on getting into it after (hopefully) graduating, but what about right now, before that time?

>> No.15005420

https://youtu.be/IMUqCLswa3s

>> No.15005798

>>15004161
Yeah, physical laws as hard constraints within the NN architecture might be the way forward, anon. Thanks for sharing the paper.

>> No.15005800

>>15005411
If you are already done with the major parts of your PhD research, I would say you can maybe try using Matplotlib for visualizing your results. The syntax is basically equivalent to MATLAB's, so you would get a good introduction to using Python libraries.
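
For example (a toy sketch, assuming a standard install with NumPy available), the translation from MATLAB plotting is almost one-to-one:

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 200)
plt.plot(x, np.sin(x), 'r--', linewidth=2)  # same line-spec strings as MATLAB
plt.xlabel('x')
plt.ylabel('sin(x)')
plt.grid(True)
plt.show()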

>> No.15005857

>>15005800
yes, but what I am asking about is the actual coding of the neural network - the actual "meat" of the work

>> No.15006284

bump

>> No.15006748

>>15005857
it is possible to call python functionality from matlab, so maybe just use that for the deep learning modules?

>> No.15006761

>>15006748
just use PyTorch

>> No.15006811

https://arxiv.org/abs/2202.11214

any thoughts on this? a group at NVIDIA created an extremely efficient DL model which emulates a standard numerical weather prediction model. in this field, extreme efficiency gains (5 orders of magnitude, apparently) can be worth a slight accuracy reduction, because weather models need to be extremely timely and reactive to what's happening in real time. these guys did not use PINNs, which makes me think that would be the natural next step to bump up the accuracy. basically, it doesn't matter if it takes a long as fuck time to train; if it can be run operationally much faster than a traditional solver, then it's useful.

>> No.15007724

>>15006761
isn't tensorflow more readable/easier to understand though?

>> No.15008185

>>15006811
I agree, they might take a step in that direction. NVIDIA already has their own PINN toolbox https://developer.nvidia.com/modulus

>> No.15008187

>>15007724
No, I've found PyTorch easier to read.

>> No.15008189

https://youtu.be/IMUqCLswa3s

Training a Chess AI in Pytorch

>> No.15008217

>>15007724

PyTorch Lightning > PyTorch > TensorFlow Keras > TensorFlow

In terms of readability, that is. Keras and Lightning are basically easy-to-use wrappers for the underlying library. Think of a higher-level programming language layer that takes some control away from you in exchange for ease of use. I haven't used Keras in ages, so maybe it's easier now, but overall PyTorch is the user-friendly small-project option, while TensorFlow is more suited to production models and allegedly performs better at scale.

The arms race between the two packages means everything I've written will be outdated in a few months.

>> No.15008234

>>15008217
Why is Facebook's stuff always better?

React > Angular
PyTorch > TensorFlow

It's like the Google engineers have ulterior motives

and don't quote me on this

>> No.15008976

>>15004071
and what about time-independent diff equations, where the function is a steady-state solution?

>> No.15008985

>>14994421
ML is a meme. If you want to make a big name for yourself in CFD, the biggest needs right now are more efficient and flexible algorithms, especially for marrying dynamics on different timescales. I've got a buddy who literally did a whole dissertation on trying to bridge electron, ion, and observational timescales in a plasma sim to try to model experimental observations - optimizing code that needs to run quickly and smoothly across ~8-9 orders of magnitude. The same thing is true in neutral fluid dynamics - there's a ton of important phenomena concerning aerodynamics, meteorology, etc. that require being able to efficiently and accurately model behaviors that cover large time-scales.

>> No.15009673

>>15008985
>biggest needs right now are more efficient and flexible algorithms
that's what ML does though

>> No.15010132

I just found a good overview of the "state of the art" of PINNs: https://pantheon.ufrj.br/bitstream/11422/15774/1/PHSSCerqueira.pdf

>> No.15010427

>>15008985
Anon, idk how familiar you are with turbulence modeling in CFD. It was predicted that we would have enough computational resources for Large Eddy Simulation to become the industry standard by the mid-2010s, but it never happened. So now the industry is stuck with RANS models built in the 1980s-90s.

In my opinion, PINNs will help us break this stagnation

>> No.15010600

>>15009673
Providing a black box curve fit is of no use in fluid mechanics.

>> No.15010671

>>15010600
some models are more interpretable than others
besides, even if it is a black box, if it computes the same thing as a fancy numerical model, but much faster, is that so bad?

>> No.15011673

>>15004071
I see most examples on the internet using Dirichlet boundaries; what if I want something that uses both Dirichlet and periodic boundaries, like what is found here, for example: http://dx.doi.org/10.2528/PIERB14071606

>> No.15012434

bump

>> No.15012446

>>15010671
Without knowing the underlying mechanisms for calculating the results, what is your basis for declaring the results of the black-box simulation to be accurate?

It's the equivalent of fitting a 50th-order polynomial to experimental data. Sure, it fits every data point you gave it, for now, but you've lost any underlying first principles and there's a very good chance your model is going to be completely fucking useless for anything deviating even vaguely from the data you already gave it.

>> No.15013856

>>15012446
I think that is just called overfitting and there are ways to prevent it

>> No.15014288

>>15004141
>>15008976
>>15011673

If anyone could provide answers to any of these questions, I would be very grateful

>> No.15014407

>>15013856
Overfitting is one of the biggest problems with ML. Tons of models will do extremely well on training data or on modelling *very* small variations of the training data, but the moment you stray too far from it you get fucking ridiculous results.

>> No.15014841

>>15014407
well, if you put a human in a situation they are completely unfamiliar with, they won't do any better either - it is simply a general property of intelligence - you are proficient with what you are familiar with

>> No.15014848
File: 16 KB, 400x400, gigachad.jpg

>>14994463
Sure you are. You're Over Powered.
King.

https://youtu.be/Ux5cQbO_ybw

>> No.15015240

Source: my own PhD in data-driven methods for fluids. Steve Brunton, George Karniadakis, Omer San and their kind are snake oil salesmen. They make the tiniest tweak to their ANN architecture, test it on 1D Burgers' or 2D Boussinesq flow, never report the cost of training the network, and call it a machine learning revolution. They have skewed incentives to peddle this stuff because DOE bureaucrats don't know how to recognize bullshit, and they know Congress wants to hear about investments in AI. The fact of the matter is that the money is better spent on stuff like high-order DG, high-enthalpy solvers, and adaptive/automated meshing, but that isn't as sexy as machine learning in annual reports. PINNs are impossible to converge, SINDy is a solution looking for a problem, and everything else is just fancy curve fitting. I've been able to make a good career out of this stuff after my PhD, but it is honestly just a lot of hot air and marketing.

>> No.15015308

>>14994421
I don't get how neural networks can possibly help with CFD simulations. Laminar flow simulations are basically a solved problem, so the only application is in turbulent flow simulations. But the basis of the utility of neural networks is that you train them on a shitton of data and hope that your simulations are close enough to the training data that your results are reasonably accurate. But this has two issues: first, you can never provide enough data for any simulation you do to resemble a training case, and second, turbulence is a fundamentally chaotic phenomenon. A tiny change in boundary or initial conditions can dramatically change the outcome of the simulation.

>> No.15015512

>>15015240
Oh, that is kind of depressing to hear, anon. Do you think there is a chance for hybrid solvers that use both PINNs and traditional solvers? For instance, we calculate primitive flow parameters using pressure and energy using a SIMPLE or PISO algorithm and then use those variables to calculate temperature by a PINN implementation of the energy eqn.

Additionally, I wanted to ask what you are working on, and where, after your PhD? Do you still use PINNs in your industry or academia job?

My idea was that if I do a PhD in AI/DL in CFD, I would still develop the skills to work as a DL engineer in the tech industry, in case the CFD stuff fails to be adopted by industry.

>> No.15015517

>>15015512
Sorry, messed up the example. I wanted to say: continuity and momentum eqns solved using trad solvers, and energy using a PINN.

>> No.15015572

>>15015308

Well, high sensitivity to initial conditions could hypothetically be approximated by a NN.

Do I think it will happen any time soon? Nah.

>> No.15015653

>>15014848
it was so random it actually made me laugh. thanks, I needed that

>> No.15015668

>>15015572
>Well, high sensitivity to initial conditions could hypothetically be approximated by a NN.
That's the thing, you can't approximate chaotic systems with neural networks as far as I know. They're not useful for that type of application.

>> No.15015955

>>15015512
If you want to do ML research, I would say that's the wrong path to take. Places like OpenAI, Google Brain, and DeepMind are light years ahead of what you'd deal with in CFD ML, and for that work you'd need to do an ML PhD. Secondary places like Amazon, Meta, and Nvidia will gladly take you; they actually have a decent amount of physics-based ML (for easier problems like flight trajectories and structural dynamics). If you want to be just a normal ML engineer implementing tried-and-true models into a company's workflow, you don't need a PhD. Online courses are unironically good enough; you just need to know a couple of modern ML packages, some database knowledge, and data pipeline management.

>> No.15015960

>>15015512
>>15015955
Forgot to say, I work in government research now. ML has had a lot of success in that area with nuclear physics, molecular dynamics, materials, and energy grids. Flow physics for real-life engineering systems is leagues harder than those problems; ML has only made some inroads into CFD in government with people like Youngsoo Choi and George Karniadakis, and half their time is spent marketing their methods instead of doing useful research.

>> No.15016039

>>15015955
>>15015960

Thanks anon for the useful info! I have just started to learn TF and have made some toy PINN models.

Do you know any specific resources for the database knowledge and pipeline management? I have a good introductory background in DL theory

>> No.15016131

>>14994421
Completely retarded. Numeric solutions already work incredibly well.

>> No.15016140

>>15004176
Also this. It's only useful for simulations that aren't realistic.

There's literally no reason to push a neural network onto the physical problem. Numeric PDE solutions are already easy to implement, straightforward in design, beautifully elegant, and easy to interpret.

I can't think of a single reason to do it other than maybe videogames.

>> No.15016142

>>15015308
The only thing I can think of is some unique scheme utilizing auto-differentiation, but you hardly need a neural network for that.

>> No.15016446

>>15016039
As a small word of advice, ML folks are moving away from TF to PyTorch. It doesn't matter where you start - I learned with TF - but you should eventually learn PyTorch.

No idea about data management for commercial applications; I sometimes see related stuff on Coursera. My databases are just whatever simulations I run, but commercial databases (images, online orders, audio files) are constantly updating and changing, which is where data management is more important. If you stay in physics and engineering, you'll probably just learn it on the job.

>> No.15017642

bump

>> No.15018279

>>14994421
> fluid mechanics using curve fitting estimators
Seems like a nice way to introduce error into your PDEs that will be used by foreigners with potentially harmful consequences to humanity :^)

>> No.15018632

>>15003197
>when people who graduate with STEM degrees become influencers and tell you with a straight face that they never need the math/physics/hardware in their job.
That's because what they're doing isn't CS and didn't need CS in the first place. They were software developers whose degree passed an HR test, but an actual CS job at a place like Google or Microsoft requires a PhD at minimum. I mean, here's a recent result from Google research:
https://arxiv.org/abs/2101.05549
this is what CS research actually entails (well, both this and implementing these types of things at scale).

My experience with CS has always been with the hardcore theory people (who are basically mathematicians working on a gamut of problems, either pure or industrial) and the hardcore systems people (who engineer their own chips / tapeouts)

>> No.15019688
File: 40 KB, 666x583, this is intense.jpg

>>15018632

>> No.15020233

>>15019688
same impression when I opened that article, made my day

>> No.15020851

>>15018279
an adaptive fitter that saves a lot of time is better than an exact solver that needs a supercomputer to run, if the fitter has 99% accuracy

>> No.15021935

>>15020851
true

>> No.15022080

Hey /sci/, I'm also considering a PhD loosely in the area of SciML, although the supervisor is likely to nudge me more in the direction of optimisation (e.g. minimising cost functions more effectively). It's with the maths department, not CS. I'd like to eventually end up doing ML research on cool cutting-edge shit; is this a decent path to take?

>> No.15022107

>>15022080
well, a big part of ML is finding faster/less compute-heavy solvers for models, so it could be a good way to get there, but you would likely need to do some extra learning on the ML side of things

>> No.15022122

>>15022107
yeah i figure i would. some advice i've received has been basically: don't pigeonhole yourself too much with a particular ML application for a PhD, because the field moves fast and during that time google/deepmind/etc. are probably going to come out with something better. best to focus on the fundamentals

>> No.15022621

>>15022122
yes, definitely valid advice on the pigeonhole thing, fundamentals are always good

>> No.15024071

Is this the ML thread? Can any ML anons give some advice? I'm doing a PhD in a non-ML field (biology), but I think newer ML models will be critical for some of the experiments I want to run. I don't need to be an expert, but what topics would you say I should know for effective fine-tuning and implementation of existing models? Is it worth it to study the mathematical topics on their own, or do you have suggestions for layman ML resources?

>> No.15024683

>>15024071
first learn the basics of the different types of networks and their applications, then go from there. read about hyperparameters and their optimization

>> No.15025726

>>15024683
yes

>> No.15025765

>>15024683
>hyperparameters
Not familiar with the term, thanks

>> No.15025772

>>15024071

CompBio person here. While some people make specific model architectures and things like that for biology, as someone with a background in biology your best bet is the application of these algorithms to biological questions.

A piece of advice is to understand the uses of the different kinds of AI in biology. Most AI used for biomarker and polygenic risk signatures is "classical" in the sense that it uses linear methods or decision trees. These methods are much more interpretable than deep learning models and so tend to be more trusted.

These methods (and classifiers in particular) are generally also used to aid in supervised understanding of disease drivers (as in train a classifier to tell the difference between two disease states with RNA-seq and look at what the important genes are).

Clustering and dimension reduction algorithms are also important. You should understand transformation methods like PCA, UMAP and TSNE very well.
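
For the PCA part, the basic workflow is only a few lines (a toy sketch; the random matrix just stands in for a real expression matrix):

import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 2000)           # e.g. 100 cells x 2000 genes
pca = PCA(n_components=2)
coords = pca.fit_transform(X)           # 2-D embedding, one row per cell
print(pca.explained_variance_ratio_)    # variance captured by each component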

Multiple-omics integration for drug/drug target discovery is often driven by learned matrix factorisation algorithms.

Deep learning is making inroads in a couple of key areas. Variational autoencoders have shown great success as dimension reduction methods and beat traditional methods like PCA in terms of their ability to filter out noise in large data sets (like single-cell experiments).

Image analysis people who know image neural networks are an increasingly large part of the industry since spatial transcriptomics came in.

People are also trying to apply some of the work being done in causal modelling using Bayesian networks, and recent developments in graph neural networks are hyped to make this easier, but nobody has shown shit worth taking seriously yet.

>> No.15025809

>>14994460
I didn't study this topic and I knew it - AI is dogshit and messy

>> No.15026107

>>15015955
Hey anon, do you think a PhD would be required for a research position at a CAE company which develops new simulation algorithms?

For example, something like this

https://arxiv.org/abs/2210.05837

>> No.15026134

>>15026107
You do not need a PhD to work at Ansys, even in their R&D division. An MS will do you just fine, I know a couple of Masters students who went on to work for Ansys. They own or are in the process of acquiring every commercial simulation package, so you are expected to work on lots of different applications, as a heads up.

>> No.15026464

>>15025809
wise

>> No.15027035

>>15026134
true

>> No.15027266

>>15025772
Many thanks for the reply anon, I'll save your post. I'm definitely interested in drug discovery, although I'm debating whether omics integration for drug discovery or some kind of pure structure -> readout model would be more work pursuing.

>> No.15027268

>>15027266
*more worth pursuing

>> No.15028726

>>15027266

In practice, both things you're talking about are used. Omics is more often used in drug target discovery, but with technologies like T cell repertoire sequencing you can get a little bit into finding things to outright mimic for drug design.

The second step you described (model to readout) will be more useful in biomarker discovery, but also has some use in actually testing the likelihood that a potential drug formulation will actually bind to and inhibit the function of a receptor.

As far as I'm aware, a lot of computational chemists find themselves doing the latter, while a computational biologist would be of more use interpreting the biology around drug target discovery.

There are also roles for CompBio in finding the mechanism of action of potential drugs, preparing evidence for FDA approval and stuff. There's also infrastructure and pipeline building stuff that might be more accurately called "bioinformatics", but the lines are super blurry.

>> No.15029180

>>15028726
Maybe I should be a little less vague. 'Biomarker' discovery seems more down my alley, but what is your opinion on, not biomarker discovery with the second approach, but direct phenotype discovery? There are more than a few labs interested in this with things like yeast, C. elegans, and organoids, and I'm pretty excited about the prospect of drug -> phenotype, bypassing as much biology as possible.

As far as omics integration goes, I've seen some papers that use drug-genome interaction data to predict drug activity based on genomic signatures, but I wonder if, for a model (like, say, an organoid), it would be more worthwhile to catalog drug-phenotype interactions for ML-based prediction and forget about omics.

Thoughts? I know it's a little non-traditional, and stuff like this is mostly limited to academia (for now). There are very few people interested in this kind of thing in my immediate vicinity as far as I know (which makes the idea interesting, I guess, but also risky).

>> No.15030658

b.ump

>> No.15030670

>>15029180

You mean basically screen drugs against various tissues/bugs, catalogue the features of the drugs/the organoids' responses and then use that to predict the best ones?

It's a cool approach. What it lacks is systems context. As the efficacy of drugs in a biological system will be affected by feedback loops involving tissues your organoid won't be in contact with, there will always be effects and variables that decrease your drug's chances of success when moved to a real system (i.e. a human body). People are beginning to account for this with super complex interconnected cell population experiments, where they have a network of cells in dishes with controlled flows between cell types, allowing experiments to control for cell-type interactions and feedback loops beyond what you could get from a simple organoid, but that research (as you said) is early days and pretty academic.

Nobody trusts in vitro evidence alone as evidence of efficacy yet, as these systems are unproven.

>> No.15031211
File: 7 KB, 233x216, 95c.jpg

>>15030670
>You mean basically screen drugs against various tissues/ bugs, catalogue the features of the drugs/the organoids response and then use that to predict the best ones?
Exactly. Sounds to me like I'm somewhat on my own here then, which is both exciting and concerning. I'll need to ramp up the detective work and ask around with some of my local ML/bio people. Thanks for the responses anon!

>> No.15031262

>>14994421
OP DO NOT DO THIS. PINNS AND DEEPONETS ARE A FAD. THEY WILL PEAK VERY SHORTLY. GO STUDY FEM INSTEAD.

>> No.15031362

>>15026134
>They own or are in the process of acquiring every commercial simulation package
Why do you unironically spread lies on the internet? Are you a retard, malicious or simply excessively ignorant?

There are dozens of general-purpose "simulation packages" (whatever you mean by that) on the market, with a dozen more industry-specific (i.e., fluid dynamics or structural mechanics) FE softwares offering fierce competition.

>> No.15032222

>>14994433
Here is an example of a great application of it: instead of using more heuristic methods for turbulence closure in RANS modeling, use a neural network with tensor invariants enforced in your prediction.
https://www.osti.gov/pages/biblio/1333570

>> No.15032513

>>15031262
any other sciML-adjacent areas that might not be so much of a meme? e.g. suitable for a 4-year PhD, something where your work isn't going to get BTFO during that 4-year span

>> No.15033409

>>15032222
this looks like a neat concept

>> No.15034477

>>15031362
It was clearly hyperbole, settle down there, champ. As for genuine market competition: in the context that every idiot using SolidWorks will use their shitty solver, sure.

>> No.15034508

>>14996996
They're completely different. In one of them you're calculating the desired trajectory of the weights in a stochastic graph tensor as you willfully force it to descend a figurative multidimensional slope; in the other you're calculating the actual trajectory of an actual substance (which is not a graph tensor in any way) as it descends a literal 3D slope of its own accord

>> No.15035219

>>15033409
yes it does

>> No.15036764

>>15032513
probably something that is currently niche, but has future application potential

>> No.15037205

bumping for excellent content

>> No.15038934

guess this thread got played out, was good while it lasted

>> No.15039576

>>15038934
like hell it did

>> No.15040209
File: 2.27 MB, 1732x1572, repulsive curve.png

>>15019688
>>15020233
the reason you don't hear about it is that, like all STEM subjects, it's competitive and takes a lot to get to that level.
It's very trendy to talk about "OMG, I code!" because for some reason writing code is seen as a traditionally difficult task, even for a system that isn't at scale.
Fewer people talk about pic related, and many even bitch about muh too much math.
https://www.cs.cmu.edu/~kmcrane/Projects/RepulsiveCurves/index.html

>> No.15041134

>>15040209
>the reason you don't hear about it is that, like all STEM subjects, it's competitive and takes a lot to get to that level
exactly, most people simply won't understand it, thus making it unpopular

>> No.15041617

>>15041134
>>15040209
true

>> No.15042942

>>15030658
bum.p

>> No.15044861

>>14994433
Look at this for example: https://www.deepmind.com/blog/accelerating-fusion-science-through-learned-plasma-control

>> No.15045379

>>15044861
yes, I read about this - basically, the AI learned to change machine parameters to stabilize the plasma in fusion machines and can make the reactions last longer

>> No.15045949

>>14996832
Those yellow lasers are pretty cool. Also, nice mascot. Bruno is a cool name.
>meds here

>> No.15046390

>>15045949
what "yellow lasers" are you even talking about?

>> No.15048183

>>15045949
>>15046390
yeah, I never said anything related to that in my initial post

>> No.15048245
File: 40 KB, 334x502, 1536373734822.jpg

>>15048183
My remote viewing skills deteriorated exponentially after the boosters.

>> No.15048635

>>15048245
>remote viewing
what is that?

>> No.15048707
File: 29 KB, 245x249, 1669345770242476.jpg

>>15048635
Entanglement out of our biological matrix. A telephone across spacetime.

>> No.15048881

>>15048707
I feel like this is bullshit

>> No.15049949

>>15048881
so do I

>> No.15050765

welp, this thread had a good run

>> No.15052096

OP here. Should we make a SciML general?

>> No.15052117

>>14994421
>fluid mechanics
boring shit, who gets excited about water going through a pipe?

>> No.15052475

>>15052117
Water also goes through blood vessels. How do we create artificial arteries that could save our lives and improve life quality? We would need some fluid mechanics to study that.

>> No.15052572

>>15052096
I would agree to it, but I'm not sure how popular it would be - if it is too slow, it will simply get bumped off the board on day 1

>> No.15052587

>>15052096
also, if you could share more literature about PINNs, or answer any of these, I would be very grateful
>>15004141
>>15008976
>>15011673

>> No.15052864

>>15003184
>People literally rediscover the wheel every single day by using ML on problems that are already solved using cheap signal processing / control theory tricks.
While as a researcher in signal processing I absolutely empathize with this sentiment, I don't think it's fair to entirely dismiss deep learning based techniques for the field.
There are very challenging problems for which ML/DL are very powerful tools.
In fact there are scenarios where DL performs better than the alternatives; for example, for the MRI reconstruction problem, DNNs perform better and are more robust than compressed sensing.

>> No.15054169

>>15052572
I also vote to make such a general

>> No.15055155

>>15054169
ditto

>> No.15055278

>>15052587
Hello anon. Unfortunately, the questions you have posed are outside my expertise. I think you should look up researchers who have worked on problems similar to your questions and try emailing them.

This field is cutting edge, thus it is likely that your questions haven't even been explored yet.

>> No.15055441

>>15055278
>the questions you have posed are outside my expertise
big sad
>try emailing them
my advisor is very much against students asking for help, especially from people outside the research group; he would literally go ballistic

>> No.15055444

>>15055278
>it is likely that your questions haven't even been explored yet
even bigger sad, I am probably not smart enough to figure all this stuff out with ~1.5 years left in my phd

>> No.15057330

bump

>> No.15057393

>>15052587
is there ANYONE AT ALL who can answer these questions? I am pretty desperate, because I'm a noob at DL and don't know how to even start answering them myself

>> No.15058426

bumping and agreeing to sci ML general

>> No.15059755

where should "layer normalization" be placed in a fully connected network - before the dense layer, after it, or after the activation function?

>> No.15059787

>>14994421
That is exactly my current research field. I am going to publish my thesis and a more compact research paper in a couple of months.
Any of you are free to AMA.

I agree with the other anon who says that it feels as though it is mostly snake oil. But that is true of many DL results, even the ones from DeepMind or OpenAI. Once you probe the model and try to work with it, train it on your data instead of theirs and drop their seemingly arbitrary constraints, you realize that it is much less capable.

However, lots of smart people are working on this stuff, so I am not going to say with 100% certainty that there is no use or future in this direction.

PINNs are actually only one of three approaches.
There is also operator learning, but that is limited by its training and dataset requirements.
And there is the more DL-flavored approach of using convolutional or graph neural networks with the loss being just the L2 norm instead.

The last approach gets the best results right now.

>> No.15060181

>>14994421
can't believe this thread is still up. it's been a month.

>> No.15060267

>>15060181
that is for several reasons:

this board is rather slow in general
this thread is rather slow on an already relatively slow board, hence not only did it stay up for long, it is also well below the bump limit for the board
I personally bump this thread when it goes low in the catalog in hopes of someone seeing and possibly answering my questions (I am the one who asked >>15052587); also, there were some pretty good sources shared here (which I have now bookmarked) and some quality discussion in general
yes, my bumping somewhat decreases the quality of the thread, but it also keeps it alive, so that someone is able to see and respond to my questions

>> No.15060268

>>15059787
if you are competent in deep learning and/or physics-informed neural networks, could you please take a look at these >>15052587 and try answering some of them?

>> No.15060354

>>15060268

I don't see any reason why steady-state or periodic boundary conditions would be a problem. In fact, chemistry people already do steady-state predictions. Periodic boundaries would probably work even with my model, although I never tried.

There is really no magic involved. If you can create a representative dataset using a classic simulation, and you then set up your deep learning architecture cleverly and select your loss cleverly (especially in PINNs it's all about the loss), then you should be able to get your model to return an approximation of the classic simulation.

It's of course trickier than it sounds, but in principle both steady-state and periodic boundary conditions should be learnable.

>> No.15060637

>>15060354
ok, that's one issue down, but how do I do this in practice? I get the Dirichlet boundary - just set the function at those points to zero - but how would a periodic boundary be described in a neural network?
also, regarding steady state - there are no initial conditions in steady state, because there is no time, so my question is: how do I introduce the excitation into the system (excitation = incident wave)?

>> No.15060703

>>15060637
Your steady-state question is not well defined enough for me to understand exactly what you want. But in short, if you can simulate it in a classical simulation, you can get a deep neural network to replicate your simulation to some level of accuracy. How to do it in practice is literally the whole research project...
With neural networks, you always need to define your training set as having an input and a label. If you can do that, you can train. Your input can be whatever you want. So it could be your lattice (?) or whatever defines your current state, plus whatever defines your excitation. Shove all of those into input tensors and then shove your desired end state into your label tensor.
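
As a concrete (entirely hypothetical) sketch of that input/label packing in PyTorch:

import torch
from torch.utils.data import Dataset

class SimulationDataset(Dataset):
    # pairs (current state + excitation) with the simulated end state
    def __init__(self, states, excitations, targets):
        self.inputs = torch.cat([states, excitations], dim=1)
        self.targets = targets

    def __len__(self):
        return len(self.targets)

    def __getitem__(self, i):
        return self.inputs[i], self.targets[i]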

Periodic boundaries just mean that all pairs of opposite boundaries have to coincide up to some level of accuracy at every simulation step, no? That's literally it.
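
One common soft way to enforce that (my sketch, hypothetical names, assuming a 1D-in-space model taking (x, t) inputs): add a loss term that penalizes the mismatch between opposite boundaries:

import torch

def periodic_penalty(model, t_pts, x_left=0.0, x_right=1.0):
    # evaluate the network at the same times on both edges of the domain
    left = torch.cat([torch.full_like(t_pts, x_left), t_pts], dim=1)
    right = torch.cat([torch.full_like(t_pts, x_right), t_pts], dim=1)
    # u(x_left, t) and u(x_right, t) should coincide for a periodic boundary
    return ((model(left) - model(right)) ** 2).mean()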

>> No.15060786

>>15059787
curious to see your work, can you post a preprint here?

>> No.15060792

>>15060354
maybe i'm a retard, but this is what i don't get. you're generating the data with a traditional simulation anyway, so in practical terms what's to gain from a PINN if it has to be retrained for varying sets of ICs/BCs anyway? is it possible to train an operator which can simply step forward in time regardless of ICs/BCs? (i've seen examples of this but they're not physically constrained)

>> No.15060885

>>15060786
No, sorry. Even if I wanted to, I would need to get the permission of my advisor and so on. Plus, I really don't want to dox myself.

>>15060792
Your dataset should be representative of the domain you are interested in. That means, if necessary, different ICs and BCs that span the whole domain you are interested in. The neural network then learns an approximation of the physics, i.e. Navier-Stokes, free-surface interactions, etc. Then you can set up a new simulation with a specific IC and BC which did not appear in the training dataset but which is part of the domain the training dataset represents. You can then infer this simulation much faster (inference is just a chain of cheap matrix multiplications) than it would take to compute it classically.
So in PINNs and in the unconstrained deep learning approaches, you are training a physics engine which can generalize to some extent over many ICs and BCs, as long as they are within the domain represented by the training dataset.

Operator learning is also possible, but I am not an expert; the field is newer and, from what I can tell, it is even harder to get real-world simulations out of such an approach for now.
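
Schematically, the workflow is the one below - classical_step is a hypothetical stand-in for whatever real solver you use, and every other name is likewise just an illustrative assumption:

# Sample many ICs, run a classical solver, then train a surrogate that
# maps state(t) -> state(t + dt). `classical_step` is a placeholder.
import torch

def classical_step(state):
    return state.roll(1, dims=-1)   # toy stand-in: advection on a 1D grid

inputs, labels = [], []
for _ in range(1000):               # many random ICs spanning the domain
    state = torch.randn(64)
    for _ in range(20):             # a short trajectory per IC
        nxt = classical_step(state)
        inputs.append(state)
        labels.append(nxt)
        state = nxt

X, Y = torch.stack(inputs), torch.stack(labels)
surrogate = torch.nn.Sequential(
    torch.nn.Linear(64, 256), torch.nn.ReLU(), torch.nn.Linear(256, 64))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(2000):
    idx = torch.randint(0, len(X), (128,))
    loss = ((surrogate(X[idx]) - Y[idx]) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# At inference time you iterate `surrogate` on an unseen IC from the
# same domain, which is much cheaper than re-running the solver.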

>> No.15060962

>>15060885
thank you. could you explain more what you mean by IC/BC which are not in the training set, but which are "part of the domain"? (or point me to a link where i could read more)

>> No.15061464

>>15060703
>Your steady state question is not well defined enough for me to understand exactly what you want
I linked an article to a steady state simulation I want to replicate, here it is again: http://dx.doi.org/10.2528/PIERB14071606

>> No.15061895

>>15061464
essentially, it has the form of a linear problem Ax = b, where the matrix A is the problem itself and the excitation is the vector b
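
for anyone who hasn't met that form: once discretized, a steady-state problem is literally one linear solve, no time stepping. A toy sketch (the matrix here is just a 1D Laplacian, purely for illustration):

# Schematic: A encodes the geometry/material, b the excitation.
import numpy as np

n = 100
A = (np.diag(np.full(n, -2.0))
     + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))   # toy 1D Laplacian operator
b = np.zeros(n)
b[0] = -1.0                           # incident-wave excitation enters via b
x = np.linalg.solve(A, b)             # the steady-state field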

>> No.15062024

>>15059787
Hello anon. I am trying to build a PINN for a microfluidic channel. I have some difficulties with it. Can you give some suggestions?

>> No.15062877

bump

>> No.15063704

>>15061464
looks like a neat paper, I was looking for something like this, thx anon

>> No.15064355

>>15061464
do time-independent diff equations even have initial conditions?

>> No.15064433

>>14994460
You could always do cloud computing

>> No.15065065

>>15064433
how would cloud computing solve any of the challenges listed? Even if you could compute the data fast, there are still extra issues like diversity of the data, skewness, etc.

>> No.15065125 [DELETED] 

What's the toughest part of physics to simulate? I assume it's optics and anything camera-related.

>> No.15066244

>>15065065
I suppose exactly in the way you described

>> No.15067267

>>15066244
yes, more compute resources help, but raw power can only take you so far

>> No.15067468

>>14994421
>Deep Learning in Fluid Mechanics
If you are going to get a phd in theoretical physics, this is actually a pretty good choice.

There have been a lot of advances in theoretical fluid mechanics in the last 15 years that tie it to some pretty impressive sounding things in high energy physics, and having a machine learning aspect makes you employable outside of academia.

>> No.15068584

>>15062024
Can anyone who works with PINNs help me out here? As the name microfluidics implies, my geometry is in micrometers, and as a result all the boundary points that I sample from my domain are very close together. This results in my NN getting stuck in a local minimum (my inlets are fully developed flow conditions, i.e. the velocity profile at the inlet must be a parabola, but instead I get a uniform value, which is the average velocity). I have tried various activation functions and training for longer, but nothing works.

Anyone who has to deal with data like physical coordinates - how do you do it?
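
In case it helps: one standard trick (a general remedy, not a confirmed fix for this particular case - the reference scales below are made-up numbers) is to nondimensionalize everything to order 1 before training:

# Rescale micrometer-sized coordinates to O(1) inputs so the
# optimization is better conditioned. Scale values are hypothetical.
L_ref = 100e-6   # assumed channel length scale [m]
U_ref = 1e-3     # assumed inlet velocity scale [m/s]

def nondimensionalize(xy_meters, uv_mps):
    return xy_meters / L_ref, uv_mps / U_ref

# The PDE residual must then be rewritten in the same dimensionless
# variables, so the Reynolds number appears instead of raw viscosity.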

>> No.15069426

>>15067468
yes, it's almost like all matter can just be described as fluids of different viscosities

>> No.15070544

>>15069426
so weird how all matter can be described with essentially the same math

>> No.15071605

bump

>> No.15071630

>>15067468
>there have been a lot of advances in theoretical fluid mechanics
Like what?

>> No.15071644

>>15071630
Fluids with anomalous charges, a lower bound on viscosity from AdS/CFT arguments, fracton fluids, the failure of Israel-Stewart second-order hydrodynamics, etc.

>> No.15071656

>>15071644
>failure of israel
Not nuking Palestine?
The other shit sounds more like general relativity than fluid dynamics

>> No.15071660

>>15071656
>general relativity
or qft

>> No.15071676

>>15071656
>>15071660

Israel and Stewart. They are the two authors who came up with a reasonable-seeming approach to relativistic hydrodynamics in the 70s (60s?)

QFT that is weakly out of equilibrium is just hydrodynamics. The QFT just tells you about the microscopics of the fluid but at large scales all that matters are things like transport coefficients and the equation of state.

Einstein's equations in asymptotically AdS space have been shown to be dual to the relativistic version of the Navier-Stokes equations. It's a valid classical-to-classical duality called the fluid/gravity correspondence, and you don't need to believe in string theory to use it.
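
To make "only transport coefficients and an equation of state" concrete: at first order in gradients, the stress tensor of a relativistic fluid is completely fixed by a handful of coefficients (standard textbook form, mostly-plus signature):

T^{\mu\nu} = (\varepsilon + p)\, u^\mu u^\nu + p\, g^{\mu\nu} - \eta\, \sigma^{\mu\nu} - \zeta\, \theta\, \Delta^{\mu\nu}

where \Delta^{\mu\nu} = g^{\mu\nu} + u^\mu u^\nu is the projector orthogonal to the flow, \theta = \nabla_\mu u^\mu is the expansion, \sigma^{\mu\nu} is the shear tensor, and \eta, \zeta are the shear and bulk viscosities. The underlying QFT only enters through \eta, \zeta and the equation of state p(\varepsilon).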

>> No.15072692

>>15071676
That's pretty cool, thanks anon

>> No.15073123

bump

>> No.15074448

>>15071630
>>15071644
don't forget conducting fluids, i.e. magnetohydrodynamics

>> No.15075776

>>14994421
Metal forming simulation engineer here, so not classic fluid dynamics but similar (FEM and FVM; fortunately no turbulence in our processes). My opinion is that training an AI on simulated data builds in an unrealistically high expectation of the simulation's precision. Even in the current year we still work with assumptions and simplified models, since no simulation can fully capture the complexity of the real world.

I don't see the benefit even if it did give good results, because this ML approach would require tons of simulations, whereas in industrial practice a few iterations with correctly validated parameters give good enough results.

Also, I see the risk that the AI would just exploit the weak points of the models instead of giving actually good solutions (this is already a problem with traditional optimization methods).

>> No.15076076

>>14998316
>On the other hand, my CS and EE buddies barely passed their courses and got into FAGMAN earning 4-5x of my income.
>FAGMAN

Is that like the poltard version of saying FAANG? You should probably go back to your containment board.

>> No.15076734

>>15076076
newfag lol
FAANG is for gay larping

>> No.15077162

>>14996319
actually, you can just look at the neural net. The number of inputs is the number of variables, and each connection is a relatively simple formula.
You might even be able to compose the nodes and extract the formula the AI comes up with
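
for a tiny network this composition can literally be done symbolically - a toy sketch where the untrained 3-neuron net and all names are illustrative assumptions (for anything deep the resulting expression becomes unreadable fast):

# Compose the nodes of a 1-hidden-layer tanh net into a closed form.
import sympy as sp
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 3), torch.nn.Tanh(), torch.nn.Linear(3, 1))

x = sp.Symbol('x')
W1 = net[0].weight.detach().numpy(); b1 = net[0].bias.detach().numpy()
W2 = net[2].weight.detach().numpy(); b2 = net[2].bias.detach().numpy()

hidden = [sp.tanh(float(W1[i, 0]) * x + float(b1[i])) for i in range(3)]
formula = sum(float(W2[0, i]) * hidden[i] for i in range(3)) + float(b2[0])
print(formula)   # the explicit formula this net computes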

>> No.15077656

>>15077162
how? isn't a neural network essentially a black box?

>> No.15077674

>>15076076
Two of those companies trooned out. Do not deadname them chud!

>> No.15077737
File: 333 KB, 640x360, poairesearch.webm

>>14994421

>> No.15077875

>>15026134
MS is all that’s really needed for working at Ansys, I did an internship for them during my MSc.

>> No.15077879

I wanted to say great thread, thank you to the anons keeping it alive. I recently started learning CFD for work and this has provided some great links and reading.

>> No.15077933

>>15077879
https://www.youtube.com/watch?v=iKAVRgIrUOU

>> No.15078101
File: 1.20 MB, 695x621, network drawing numbers.gif

>>15077656
NTA, but no, not at all. You can see exactly what's happening.

>> No.15078201

>>15077656
Yes, the meaning behind the weights is usually unknown. There's a research area around labeling nodes, but I can't remember its name.

>> No.15078270

>>15077879
you are welcome

>> No.15078271

>>15078101
>NTA, but no, not at all
>>15078201
>Yes
duality of man
maybe you (>>15078101) can provide some sources on how to do this thing?

>> No.15078804

>>15078101
I heard something about this, but never knew much about it. Can you post some literature sources?

>> No.15079455
File: 572 KB, 927x391, number reading neural network.gif

>>15078804
>>15078271
Starting from this video (yes, I know it's technically popsci, but it's a good breakdown)
https://youtu.be/aircAruvnKk
The whole series he does is great, and the visualizations he shows are really clear.

Here's a visualization I made of the net's applied filter for each output node of a neural network that can read handwritten digits 0-9.
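
If anyone wants to reproduce something like it: for a single linear layer, each output node's weight row literally is its filter and can be displayed as an image. A sketch assuming 28x28 inputs and 10 outputs (the layer here is untrained, just to keep it self-contained):

# Show each output node's weight row as a 28x28 filter image.
import matplotlib.pyplot as plt
import torch

layer = torch.nn.Linear(28 * 28, 10)   # substitute your trained layer
fig, axes = plt.subplots(1, 10, figsize=(15, 2))
for digit, ax in enumerate(axes):
    img = layer.weight[digit].detach().reshape(28, 28).numpy()
    ax.imshow(img, cmap='RdBu')
    ax.set_title(str(digit))
    ax.axis('off')
plt.show()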

>> No.15079523

>>15079455
thanks, will check it out

>> No.15079977

>>14994421
I did a project where I tried to learn the Reynolds stress tensor featuring in RANS models as a function of local flow data, using DNS as training data. Ngl it went pretty shit
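
for anyone curious, the basic shape of such an experiment is roughly the regression below - the feature and label arrays are hypothetical placeholders, not my actual data:

# Regress Reynolds-stress components from local mean-flow features,
# with DNS supplying the labels. All arrays are dummy placeholders.
import torch

features = torch.randn(10000, 5)   # e.g. local strain/rotation invariants
stresses = torch.randn(10000, 6)   # 6 unique Reynolds-stress components

model = torch.nn.Sequential(
    torch.nn.Linear(5, 64), torch.nn.ReLU(), torch.nn.Linear(64, 6))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(3000):
    idx = torch.randint(0, 10000, (256,))
    loss = ((model(features[idx]) - stresses[idx]) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()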

>> No.15080987

I read about it but still cannot grasp the difference between PINN and operator learning, can someone explain it in simple terms?
is operator learning better than PINN?

>> No.15081500

>>15080987
anyone?

>> No.15081996

>>15080987
>>15081500
PINN = solving a (parametrized) PDE via unsupervised learning.

Operator learning = learning a mathematical operator via supervised learning.

They are used for different things. PINNs make sense over classical numerical methods if parametrized PDEs are learned. An operator-learning net (e.g., DeepONet) can be plugged into numerical codes after training.
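
The DeepONet idea in particular fits in a few lines - a bare-bones sketch where the sensor count, widths and class name are arbitrary illustrative choices:

# DeepONet-style operator net: G(u)(y) ~ branch(u) . trunk(y).
import torch

class TinyDeepONet(torch.nn.Module):
    def __init__(self, n_sensors=100, p=64):
        super().__init__()
        # branch net eats the input function sampled at fixed sensor points
        self.branch = torch.nn.Sequential(
            torch.nn.Linear(n_sensors, 128), torch.nn.Tanh(),
            torch.nn.Linear(128, p))
        # trunk net eats the query coordinate
        self.trunk = torch.nn.Sequential(
            torch.nn.Linear(1, 128), torch.nn.Tanh(),
            torch.nn.Linear(128, p))

    def forward(self, u_samples, y):
        # u_samples: (batch, n_sensors), y: (batch, 1)
        return (self.branch(u_samples) * self.trunk(y)).sum(-1, keepdim=True)

# Training pairs ((u, y), G(u)(y)) come from a classical solver, which is
# what makes this the supervised counterpart to a PINN.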

PhD in neural networks for physics

>> No.15082551

>>15081996
thank you so much, perhaps you can share some good literature sources about PINNs and operator learning?
I have already bookmarked all of the other sources from this thread and read some of them

>> No.15083380

>>15081996
sounds like some cool concepts, I would also like to learn more about them

>> No.15084008

>>15081996
what about PINNs for systems of PDEs?

>> No.15085270
File: 992 KB, 293x129, self healing neural netowrk.gif

I just came up with a new kind of neural network and the output is promising for my uses.

>> No.15085394

>>15085270
do tell more

>> No.15085713

>>15081996
sounds interesting, please tell more

>> No.15085725

>>14994421
>should i waste my life on the current thing?
remember the learn2code meme?

>> No.15086628

>>15085725
no

>> No.15087053
File: 412 KB, 789x474, try 3.gif

>>15085394
it's set up like a ladder, with two sides reflecting and connected to each other. Here I got the network (the two rows of dots) to cycle alternately through red, green and blue dots on each side.

>> No.15087551

>>15087053
and what are your planned uses?

>> No.15087628

>>15087551
something like this; it's a conventional neural network that's been trained on a set of paintings of produce.
https://ditzbitz.com/fruitgen.html

>> No.15087769

>>15087628
tried it, those things barely look like the target image

>> No.15087773 [DELETED] 

>>15087769
ok luddite, it's literally on the level of a professional artist and it will replace you in two more weeks. you're coping

>> No.15087925

>>15087769
1. you never saw the target image
2. where's yours?

>> No.15088434

>>15087773
not saying it is bad, just that it needs improvement

>> No.15088829

>>15081996
literature about PINNs and operator learning please?

>> No.15089897

obligatory bump

>> No.15090553

>>15078101
looks neat

>> No.15091237

dead thread, was good while it lasted

>> No.15091246

>>15091237
>was good while it lasted
debatable

FPBP anon here >>14994433
This is still a completely nonsensical idea

>> No.15091410

>>15091246
who cares; as long as OP can scam some grant money, good for him

>> No.15091512

>>15085725
wasn't a meme, everyone in science has to code at least a little bit now, and the required coding skills just keep growing.

>> No.15091668

>>15091410
so long as OP doesn't think getting a grant for this means it's a good idea

>> No.15091947

>>15091512
true

>> No.15092280

>>15091237
I found it extremely helpful

>> No.15092466
File: 174 KB, 700x700, download (62).png

https://youtu.be/jYpVTraOdZ4

>> No.15092524

>>15092466
cool stuff

>> No.15094070

>>15092280
as did I

>> No.15094541

>>15094070
ditto

>> No.15095005

>>15091246
so is everything I suppose

>> No.15096439

bump

>> No.15096774

seems like the thread is coming to an end, should I condense and post all the valuable links in this thread in one comment? The literature here is a gold mine

>> No.15096795

>>15096774
go for it

>> No.15097880

bump

>> No.15098050

>>15096774
would be good

>> No.15098153

https://arxiv.org/abs/1711.10561
https://arxiv.org/abs/2104.08249
https://arxiv.org/abs/2201.05624
https://www.brown.edu/research/projects/crunch/home
https://arxiv.org/abs/2202.05476
https://deepxde.readthedocs.io/en/latest/
https://arxiv.org/abs/2210.01741
https://youtu.be/IMUqCLswa3s
https://arxiv.org/abs/2202.11214
https://developer.nvidia.com/modulus
https://pantheon.ufrj.br/bitstream/11422/15774/1/PHSSCerqueira.pdf
https://youtu.be/Ux5cQbO_ybw
https://arxiv.org/abs/2101.05549
https://arxiv.org/abs/2210.05837
https://www.osti.gov/pages/biblio/1333570
https://www.cs.cmu.edu/~kmcrane/Projects/RepulsiveCurves/index.html
https://www.deepmind.com/blog/accelerating-fusion-science-through-learned-plasma-control
https://www.youtube.com/watch?v=iKAVRgIrUOU
https://youtu.be/aircAruvnKk
https://ditzbitz.com/fruitgen.html
https://youtu.be/jYpVTraOdZ4

>> No.15098158

>>15096795
>>15098153
I may have missed some, but this should be a mostly complete list

>> No.15098162

>>15098158
filtered

>> No.15098771

>>15098158
thank you

>> No.15099311

>>15098162
how am I filtered?

>> No.15099382

>>15052117
fluid mechanics also applies to gas dynamics, aerodynamics, hypersonics, propulsion, all kinds of very cool stuff (not that anyone but elite MIT grads gets to actually do it)

>>14994421
good luck training your model lol, all these ML-for-CFD approaches run into the problem of training data being staggeringly expensive and time-consuming to generate

>> No.15099397

>>15003197
>Honeymoon phase of CS is already in the past, don't worry.
I heard the same thing when I started my mech e degree, and it was wrong then, and I think this will again prove to be wrong

the economics of how software works (infinitely replicable for essentially free) mean software companies will continue to have extremely high margins compared to companies that make a physical product, so odds are good that the compensation for really good SWEs will continue to be astronomically higher than it is for really good mechanical engineers

>> No.15099498

>>15099382
>the problem of training data being staggeringly expensive and time-consuming to generate
story of my life

>> No.15099499

>>15099397
>the economics of how software works (infinitely replicable for essentially free) mean software companies will continue to have extremely high margins compared to companies that make a physical product, so odds are good that the compensation for really good SWEs will continue to be astronomically higher than it is for really good mechanical engineers
sad but true, we just cannot compare

>> No.15100274

>>15098153
you are a life saver

>> No.15101324

>>15098153
I bumped just for this

>> No.15102097

>>15099397
how is this even legal?

>> No.15102476

>>15099382
is there any way of getting data for cheap (computation-wise)?

>> No.15102498

>>14994421
Deep learning is fundamentally anti-scientific. It basically exploits the fact that we have cheap computational power to fit extremely flexible models to whatever data set you have. It is far better to study fluid mechanics proper and then see if something may benefit from an ML model. Your statement could be translated to "Is it worth pursuing a PhD in seeing if I can fit a function to things that have to do with fluid mechanics?" - obviously you can, but this will not help you understand fluid mechanics.

>> No.15102507

>>14994421
>there's a thread from november still up.
/sci/ is truly dead

>> No.15103229

>>15102507
no, I just don't let this thread die by bumping it, plus this is a niche topic