
/sci/ - Science & Math


>> No.14506447
File: 62 KB, 680x521, 1608400670080.jpg

Do good looks = good genetics?

>> No.12554339
File: 63 KB, 680x521, 1543055097912.jpg

What were engineers called before the invention of the engine?

>> No.12437778
File: 63 KB, 680x521, 1607369639832.jpg

https://plato.stanford.edu/entries/wittgenstein-mathematics

>Since we invent mathematics in its entirety, we do not discover pre-existing mathematical objects or facts or that mathematical objects have certain properties

>If, first, we examine what we have invented, we see that we have invented formal calculi consisting of finite extensions and intensional rules. If, more importantly, we endeavour to determine why we believe that infinite mathematical extensions exist (e.g., why we believe that the actual infinite is intrinsic to mathematics), we find that we conflate mathematical intensions and mathematical extensions, erroneously thinking that there is “a dualism” of “the law and the infinite series obeying it” (PR §180). For instance, we think that because a real number “endlessly yields the places of a decimal fraction” (PR §186), it is “a totality” (WVC 81–82, note 1), when, in reality, “[a]n irrational number isn’t the extension of an infinite decimal fraction,… it’s a law” (PR §181) which “yields extensions” (PR §186). A law and a list are fundamentally different;

>Given that a mathematical extension is a symbol (‘sign’) or a finite concatenation of symbols extended in space, there is a categorical difference between mathematical intensions and (finite) mathematical extensions, from which it follows that “the mathematical infinite” resides only in recursive rules (i.e., intensions). An infinite mathematical extension (i.e., a completed, infinite mathematical extension) is a contradiction-in-terms

What does this mean? Can some big strong /sci/entist explain this in retard terms?
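Roughly: the claim is that an irrational number is a rule (an "intension") for producing as many digits as you like, not a completed infinite list of digits (an "extension"); only finite extensions ever exist, and "the infinite" lives entirely in the rule. A minimal illustration in Python (my own example, not from the SEP entry): a generator that computes the decimal digits of sqrt(2) one at a time by the long-division method. The rule itself is a finite object, what it has yielded at any moment is a finite extension, and at no point does a completed infinite decimal exist.

def sqrt2_digits():
    # A "law" (intension): yields digits of sqrt(2) on demand, digit by digit,
    # using only integer arithmetic. It never contains an infinite list.
    n, remainder, root = 2, 0, 0
    while True:
        remainder = remainder * 100 + n
        n = 0  # after the integer part, we only bring down pairs of zeros
        digit = 0
        while (root * 20 + digit + 1) * (digit + 1) <= remainder:
            digit += 1
        remainder -= (root * 20 + digit) * digit
        root = root * 10 + digit
        yield digit

gen = sqrt2_digits()
print([next(gen) for _ in range(10)])  # [1, 4, 1, 4, 2, 1, 3, 5, 6, 2]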

>> No.12163976
File: 63 KB, 680x521, 1573986038747.jpg

I've seen tons of American people on TikTok (yes, I use it occasionally when bored) say that they have >130 IQ, but it seems very unrealistic that so many people have an IQ that puts them in the top 2%. They didn't even know that 130 is high. As a European, I have no idea how IQ tests are conducted in American schools. Are the results trustworthy? Are there so many black people that the results of average people come out higher than they should be?
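For reference, on the usual Wechsler-style scaling (mean 100, standard deviation 15, which is an assumption here; some tests use SD 16), an IQ of 130 is two standard deviations above the mean, putting roughly 2.3% of the population above it. A quick check with Python's standard library:

from statistics import NormalDist

# Assumes mean 100, SD 15 scaling.
iq = NormalDist(mu=100, sigma=15)
print(f"Share above 130: {1 - iq.cdf(130):.3%}")  # ~2.275%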

>> No.10902340
File: 63 KB, 680x521, 1543575510767.jpg

from tensorflow.keras.layers import (Input, Conv2D, Conv2DTranspose,
                                     MaxPooling2D, concatenate)
from tensorflow.keras.models import Model


def build_model(input_shape):
    inputs = Input(input_shape)

    # Encoder: two 3x3 convolutions per level, then 2x2 max pooling.
    c1 = Conv2D(8, (3, 3), activation='relu', padding='same')(inputs)
    c1 = Conv2D(8, (3, 3), activation='relu', padding='same')(c1)
    p1 = MaxPooling2D((2, 2))(c1)

    c2 = Conv2D(16, (3, 3), activation='relu', padding='same')(p1)
    c2 = Conv2D(16, (3, 3), activation='relu', padding='same')(c2)
    p2 = MaxPooling2D((2, 2))(c2)

    c3 = Conv2D(32, (3, 3), activation='relu', padding='same')(p2)
    c3 = Conv2D(32, (3, 3), activation='relu', padding='same')(c3)
    p3 = MaxPooling2D((2, 2))(c3)

    # Bottleneck.
    c4 = Conv2D(64, (3, 3), activation='relu', padding='same')(p3)
    c4 = Conv2D(64, (3, 3), activation='relu', padding='same')(c4)

    # Decoder: transposed convolutions upsample, each result is concatenated
    # with the matching encoder feature map (skip connection) and convolved.
    u7 = Conv2DTranspose(32, (2, 2), strides=(2, 2), padding='same')(c4)
    u7 = concatenate([u7, c3])
    c7 = Conv2D(32, (3, 3), activation='relu', padding='same')(u7)
    c7 = Conv2D(32, (3, 3), activation='relu', padding='same')(c7)

    u8 = Conv2DTranspose(16, (2, 2), strides=(2, 2), padding='same')(c7)
    u8 = concatenate([u8, c2])
    c8 = Conv2D(16, (3, 3), activation='relu', padding='same')(u8)
    c8 = Conv2D(16, (3, 3), activation='relu', padding='same')(c8)

    u9 = Conv2DTranspose(8, (2, 2), strides=(2, 2), padding='same')(c8)
    u9 = concatenate([u9, c1])
    c9 = Conv2D(8, (3, 3), activation='relu', padding='same')(u9)
    c9 = Conv2D(8, (3, 3), activation='relu', padding='same')(c9)

    # Per-pixel output over 4 channels (sigmoid for multi-label masks).
    outputs = Conv2D(4, (1, 1), activation='sigmoid')(c9)

    model = Model(inputs=[inputs], outputs=[outputs])
    # dice_coef and bce_dice_loss are custom functions assumed to be defined
    # elsewhere in the script (see the sketch below).
    model.compile(optimizer='adam', metrics=[dice_coef], loss=bce_dice_loss)

    return model
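The compile call references dice_coef and bce_dice_loss, which the snippet never defines. A minimal sketch of the soft Dice metric and combined BCE + Dice loss it appears to assume (the exact definitions in the original script may differ):

import tensorflow.keras.backend as K
from tensorflow.keras.losses import binary_crossentropy

def dice_coef(y_true, y_pred, smooth=1.0):
    # Soft Dice: 2*|A.B| / (|A| + |B|), smoothed to avoid division by zero.
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def bce_dice_loss(y_true, y_pred):
    # Binary cross-entropy plus (1 - Dice): a common segmentation loss.
    return binary_crossentropy(y_true, y_pred) + (1.0 - dice_coef(y_true, y_pred))

With those in place, the model builds for any input whose height and width are divisible by 8 (three 2x2 poolings), for example:

model = build_model((256, 256, 3))
model.summary()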
