
/sci/ - Science & Math

>> No.10646771
File: 59 KB, 531x467, 1542845042207.png

Why do people pick the layer shapes they do when designing neural networks? For example, the convolutional NN ``VGG-16`` essentially moves from 224 -> 112 -> 56 -> 28 -> 14. How would the properties of the network differ if I instead started from 14 (with the same input size, of course) and fanned out to 224? What would happen if we did more of a U-shape, where we shrank down and then grew back out? Are there any general rules here?
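For concreteness, here is a minimal sketch in PyTorch contrasting the two patterns the question asks about. The layer widths are hypothetical, not the real VGG-16 definition; the point is just where the spatial resolution changes.

import torch
import torch.nn as nn

# VGG-style "shrink": spatial size halves at each stage (224 -> 112 -> ...),
# while the channel count typically doubles, keeping per-stage compute
# roughly balanced as the feature maps get smaller.
shrink = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                      # 224 -> 112
    nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                      # 112 -> 56
    # ... further stages continue down to 14x14 ...
)

# Hypothetical "fan-out": same 224x224 input, but it is crushed to 14x14
# immediately and then upsampled back out. The early pooling throws away
# fine spatial detail that the later, larger layers cannot recover.
fanout = nn.Sequential(
    nn.AdaptiveAvgPool2d(14),             # 224 -> 14 right away
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2),          # 14 -> 28
    nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
    # ... further upsampling stages continue out to 224x224 ...
)

x = torch.randn(1, 3, 224, 224)
# Shapes of these truncated sketches: (1, 128, 56, 56) and (1, 32, 28, 28).
print(shrink(x).shape, fanout(x).shape)

The usual rule of thumb is to halve the spatial size while doubling the channels so every stage does comparable work, and the shrink-then-grow U-shape is exactly what encoder-decoder nets like U-Net do, usually with skip connections so the decoder can recover the detail discarded on the way down.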

>> No.8277975
File: 59 KB, 531x467, 1458617238435.png

>>8277798
>caring about what /sci/ thinks
You fucked up, man. Do what makes you happy, 'cause you gotta live with that decision.

>> No.8221436
File: 59 KB, 531x467, xA8LLRu.png

"At one point, I resisted by pushing my jaw between Harris’s elbow and my throat. That didn’t help. “He can choke your whole jaw into your throat,” Ryron said. “It affects the carotid—through the jaw!” He said this with an air of Isn’t that cool?"

Reminds me of pic related. What's with nerds being martial arts experts? Being bullied as kids?


