
/sci/ - Science & Math



File: 30 KB, 500x309, image.jpg
No.7688927

What's /sci/'s opinion on neural networks? I was thinking about making a small one just for fun; I can code in C++ and Java.
My idea was to make it able to guess a number I randomly choose.

>> No.7688936

Good luck m8, neural networks are fascinating

>> No.7688955

rather than reinventing the wheel you should probably just use an existing library in python

you can still learn gradient descent etc. while you do it

>> No.7688966

>>7688936
Thanks bud
>>7688955
What kind of library? I also don't want one where I do nothing programming-wise; I want to learn a bit while using it.

>> No.7689003
File: 67 KB, 791x388, neural_net2.jpg

>>7688927
>My idea was to make it able to guess a number I randomly choose.
you should probably read up on them a little before trying to make one, as that's not really something neural networks are applicable to

here's a really good introduction:
http://cs231n.github.io/neural-networks-1/

here's what it looks like to feed data through the network shown in pic related using a sigmoid activation function (f)
[eqn]\texttt{f = lambda x: 1.0/(1.0 + np.exp(-x))}\\
\texttt{x = np.random.randn(3, 1)}\\
\texttt{h1 = f(np.dot(W1, x) + b1)}\\
\texttt{h2 = f(np.dot(W2, h1) + b2)}\\
\texttt{out = np.dot(W3, h2) + b3}[/eqn]

where W1 is a [4x3] matrix, W2 is [4x4], W3 is [1x4],
b1 and b2 are [4x1] vectors, and b3 is [1x1]
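
to actually run that snippet you also need numpy and some starting values for the parameters; here's one way to set them up with those shapes (small random values, just for illustration - the exact init doesn't matter for a toy example):

import numpy as np

# parameters with the shapes listed above, random starting values
W1 = np.random.randn(4, 3); b1 = np.random.randn(4, 1)
W2 = np.random.randn(4, 4); b2 = np.random.randn(4, 1)
W3 = np.random.randn(1, 4); b3 = np.random.randn(1, 1)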

the idea is to have a bunch of labelled training data and update the values in W1-W3, b1-b3 according to some algorithm (e.g. backpropagation) until you get decent results
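
to make that concrete, here's a rough sketch of a single backprop/gradient-descent step on one labelled example, continuing from the definitions above and assuming a squared-error loss (the example data and learning rate are made up for illustration):

f = lambda x: 1.0/(1.0 + np.exp(-x))  # sigmoid, as above
x = np.random.randn(3, 1)             # one training input
y = np.array([[1.0]])                 # its label (target output)
lr = 0.1                              # learning rate

# forward pass, keeping the intermediate activations for backprop
h1 = f(np.dot(W1, x) + b1)
h2 = f(np.dot(W2, h1) + b2)
out = np.dot(W3, h2) + b3

# backward pass: chain rule, using sigmoid'(z) = h*(1-h)
d_out = out - y                              # from loss 0.5*(out - y)^2
dW3 = np.dot(d_out, h2.T);  db3 = d_out
d_h2 = np.dot(W3.T, d_out) * h2 * (1 - h2)
dW2 = np.dot(d_h2, h1.T);   db2 = d_h2
d_h1 = np.dot(W2.T, d_h2) * h1 * (1 - h1)
dW1 = np.dot(d_h1, x.T);    db1 = d_h1

# update: subtract the error derivatives from the parameters
W1 -= lr * dW1;  b1 -= lr * db1
W2 -= lr * dW2;  b2 -= lr * db2
W3 -= lr * dW3;  b3 -= lr * db3

loop that over lots of examples (usually in mini-batches) and the outputs gradually start matching the labels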

here's a good lecture on the derivation of backprop:
https://www.youtube.com/watch?v=nz3NYD73H6E

that's literally all there is to it

>> No.7689102

>>7688927


First up, if you don't know multivariate calculus, linear algebra, and logistic regression, you should probably learn those first; you won't understand how a NN works without them.


>My idea was to make it able to guess a number I randomly choose.

This doesn't sound doable. NNs require an input vector of data they use to determine the label. What input is it going to use to guess your random number?

>>7688966
Machine learning libraries make NNs easier to code. They do handy things like automatic differentiation, and many let you run the computation on your GPU, which is a lot faster. Caffe has a Java interface: https://github.com/fastturtle/jCaffe

I'd recommend numpy or MATLAB/Octave for making your first NN though. MATLAB syntax is convenient because a neural network is really just matrix multiplies followed by a sigmoid/softmax, and training is just subtracting error derivatives from the weights.
The MNIST data set of handwritten digits is great for beginners.
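
as a rough numpy sketch of that idea (random arrays standing in for MNIST here, so the shapes match but the "digits" are noise; this is just one matrix multiply + softmax, i.e. softmax regression - stack more of the same with a sigmoid in between and you have a full NN):

import numpy as np

# stand-in for a batch of MNIST data: 64 flattened 28x28 images + labels
X = np.random.rand(64, 784)
y = np.random.randint(0, 10, size=64)   # digit labels 0-9

W = 0.01 * np.random.randn(784, 10)     # one weight column per digit class
b = np.zeros(10)
lr = 0.5

# forward: matrix multiply, then softmax over the 10 classes
scores = np.dot(X, W) + b
scores -= scores.max(axis=1, keepdims=True)          # numerical stability
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# error derivative of the cross-entropy loss w.r.t. the scores
d_scores = probs.copy()
d_scores[np.arange(64), y] -= 1.0
d_scores /= 64

# training step: subtract the error derivatives from the weights
W -= lr * np.dot(X.T, d_scores)
b -= lr * d_scores.sum(axis=0)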

>> No.7690530

>>7689102
Any examples of simple neural network code in MATLAB? I want to play around with one where I can see the source code.

>> No.7691180

>>7690530

http://www.mathworks.com/matlabcentral/fileexchange/9262-simple-neural-network

a simple 2-layer neural net with sigmoid activation

Also check out the programming assignments for weeks 4 and 5 of Andrew Ng's Coursera course.