
/sci/ - Science & Math


>> No.9572364 [DELETED]
File: 67 KB, 791x388, simple_neural_network_header[1].jpg

Hey, can one of you make me a neural network in Python that has two inputs, two hidden layers with 20 nodes each, and one output, using ReLU as the activation? Also, I want to backpropagate a cost function.

Thanks /sci/
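A minimal numpy sketch matching that spec (2 inputs, two ReLU hidden layers of 20 nodes, one linear output, squared-error cost; the initialization scale, learning rate, and function names here are illustrative choices, not part of the request):

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes: 2 inputs -> 20 -> 20 -> 1 output
W1, b1 = rng.standard_normal((20, 2)) * 0.1, np.zeros((20, 1))
W2, b2 = rng.standard_normal((20, 20)) * 0.1, np.zeros((20, 1))
W3, b3 = rng.standard_normal((1, 20)) * 0.1, np.zeros((1, 1))

relu = lambda z: np.maximum(0.0, z)

def forward(x):
    """Feed a [2x1] input through the network; returns pre-activations too for backprop."""
    z1 = W1 @ x + b1; h1 = relu(z1)
    z2 = W2 @ h1 + b2; h2 = relu(z2)
    out = W3 @ h2 + b3          # linear output layer
    return z1, h1, z2, h2, out

def backprop(x, y, lr=0.01):
    """One gradient-descent step on the squared-error cost 0.5*(out - y)^2."""
    global W1, b1, W2, b2, W3, b3
    z1, h1, z2, h2, out = forward(x)
    d_out = out - y                          # dC/d_out
    d_h2 = (W3.T @ d_out) * (z2 > 0)         # ReLU gradient is the 0/1 mask
    d_h1 = (W2.T @ d_h2) * (z1 > 0)
    W3 -= lr * d_out @ h2.T; b3 -= lr * d_out
    W2 -= lr * d_h2 @ h1.T;  b2 -= lr * d_h2
    W1 -= lr * d_h1 @ x.T;   b1 -= lr * d_h1
```

Calling `backprop(x, y)` in a loop over your training pairs drives the cost down; swap the squared error for whatever cost you actually want and only `d_out` changes.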

>> No.9273516
File: 67 KB, 791x388, simple_neural_network_header[1].jpg

Neural networks and machine learning fascinate me, but I'm too brainlet to understand them.

What is a good "track" for someone who wants to possibly get into machine learning? I know basic Java, Python, and C as far as technical skills, and am currently on Calculus 2 as far as math. Book recommendations to eventually get me there?

>> No.9227147
File: 67 KB, 791x388, neural_net2.jpg

How long before we finally get rid of science and adopt something superior?

I propose Pattern Study as the replacement. Pattern Study involves creating pattern structures like Association, Causation, up to very complex compositional patterns and then matching them with the data, in a recursive way such that the pattern structure changes to fit.

Anyway just some random idea for finally getting rid of the shitstain on humanity that is "science".

>> No.9184298
File: 67 KB, 791x388, neural_net2.jpg

I need to pump that skill up boi.

>> No.8926310
File: 67 KB, 791x388, neural_net2.jpg

Ayy Lmao

>> No.7951004
File: 67 KB, 791x388, neural_net2.jpg

Protip: the job market for data scientists & ML experts is on fire. If you find these topics interesting, you should get into it now.

For undergrads this means CS, applied math, stats, etc. A PhD can help but isn't required. If you have any quantitative PhD (e.g. engineering, even chemistry), there are "bootcamps" to bring you up to speed and place you in a data science role. Some are good, especially the more selective programs you don't have to pay for.

Source: I'm hiring for two ML spots on the West Coast and can't find anyone for <$100k, and that's after offering lots of equity. At the extreme end, Google Brain compensation is more like $450k/yr plus $4M in options over four years.

Anyone have any questions about this? Will answer.

>> No.7689003
File: 67 KB, 791x388, neural_net2.jpg

>My idea was to make it able to guess a number I randomly choose.
you should probably read up on them a little before trying to make one, as this is not at all how neural networks are applied

here's a really good introduction:

here's what it looks like to feed data through the network shown in pic related using a sigmoid activation function (f)
[eqn]\texttt{f = lambda x: 1.0/(1.0 + np.exp(-x))}\\
\texttt{x = np.random.randn(3, 1)}\\
\texttt{h1 = f(np.dot(W1, x) + b1)}\\
\texttt{h2 = f(np.dot(W2, h1) + b2)}\\
\texttt{out = np.dot(W3, h2) + b3}[/eqn]

where W1 is a [4x3] matrix, W2 is [4x4], and W3 is [1x4]; b1 and b2 are [4x1] vectors, and b3 is [1x1] (it has to match W3's single output row)
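Filled out with those shapes, the snippet runs as-is (weights are randomly initialized here purely for illustration; in practice they'd come from training):

```python
import numpy as np

f = lambda x: 1.0 / (1.0 + np.exp(-x))   # sigmoid activation

# Shapes from the post: W1 [4x3], W2 [4x4], W3 [1x4]; b3 is [1x1] to match W3
W1, b1 = np.random.randn(4, 3), np.random.randn(4, 1)
W2, b2 = np.random.randn(4, 4), np.random.randn(4, 1)
W3, b3 = np.random.randn(1, 4), np.random.randn(1, 1)

x = np.random.randn(3, 1)                # one 3-dimensional input column
h1 = f(np.dot(W1, x) + b1)               # first hidden layer, [4x1]
h2 = f(np.dot(W2, h1) + b2)              # second hidden layer, [4x1]
out = np.dot(W3, h2) + b3                # scalar output, [1x1]
```

Every hidden activation lands in (0, 1) because of the sigmoid; the output is linear so it can take any real value.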

the idea is to have a bunch of labelled training data and update the values in W1-W3 and b1-b3 according to some algorithm (e.g. gradient descent with backpropagation) until you get decent results
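One way to sketch that update loop for this particular network, assuming a squared-error cost and plain gradient descent (the sigmoid's derivative is f(z)*(1-f(z)), which is just h*(1-h) in terms of the activations):

```python
import numpy as np

f = lambda z: 1.0 / (1.0 + np.exp(-z))   # sigmoid

rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal((4, 1))
W2, b2 = rng.standard_normal((4, 4)), rng.standard_normal((4, 1))
W3, b3 = rng.standard_normal((1, 4)), rng.standard_normal((1, 1))

def step(x, y, lr=0.1):
    """One backprop step on the cost C = 0.5*(out - y)^2; returns C before the update."""
    global W1, b1, W2, b2, W3, b3
    h1 = f(W1 @ x + b1)
    h2 = f(W2 @ h1 + b2)
    out = W3 @ h2 + b3
    d = out - y                              # dC/d_out (output layer is linear)
    d2 = (W3.T @ d) * h2 * (1 - h2)          # backprop through second sigmoid layer
    d1 = (W2.T @ d2) * h1 * (1 - h1)         # backprop through first sigmoid layer
    W3 -= lr * d @ h2.T;  b3 -= lr * d
    W2 -= lr * d2 @ h1.T; b2 -= lr * d2
    W1 -= lr * d1 @ x.T;  b1 -= lr * d1
    return 0.5 * float((out - y) ** 2)
```

Repeating `step(x, y)` over the training set is the whole "algorithm": the deltas are just the chain rule applied layer by layer, which is what the backprop lecture derives.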

here's a good lecture on the derivation of backprop:

that's literally all there is to it
