
/sci/ - Science & Math

>> No.7720459
File: 72 KB, 800x600, asap.jpg

>>7720322
they're really not as mysterious as they sound.

>arrange your input data into a vector (e.g. a vector of pixel intensities for an image)
>multiply the vector by a weight matrix to get the input vector for the first hidden layer
>evaluate a function (e.g. tanh(x)) on every element of that vector to get the activation vector
>multiply this by the second weight matrix to get the second hidden layer's input
>repeat for as many hidden layers as you want

congratulations, you made a feedforward neural network! (training it with backpropagation is only slightly harder.) see the sketch below.
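
here's a minimal numpy sketch of exactly those steps. the layer sizes, the tanh activation, and the random weights are all made up for illustration; a real net would learn the weights with backprop instead of drawing them at random.

import numpy as np

def forward(x, weights):
    # alternate matrix multiplies with an elementwise tanh nonlinearity
    a = x
    for W in weights:
        a = np.tanh(W @ a)  # weighted sum, then activation
    return a

# toy example: a 784-pixel "image" pushed through two hidden layers
rng = np.random.default_rng(0)
weights = [rng.normal(size=(128, 784)),  # input -> hidden 1
           rng.normal(size=(64, 128)),   # hidden 1 -> hidden 2
           rng.normal(size=(10, 64))]    # hidden 2 -> output
x = rng.normal(size=784)                 # input vector (pixel intensities)
print(forward(x, weights).shape)         # (10,)

note that "going deeper" is nothing but appending more matrices to that weights list.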

'deep learning' isn't all that arcane either. 'deep' just means the network has a lot of hidden layers.
