
/sci/ - Science & Math

>> No.11536591

>>11536511
In my opinion, it is pretty obviously wrong at this point.

Chomsky's key observation is that language learners acquire the restrictions of their language despite rarely, if ever, being told those restrictions explicitly (the "poverty of the stimulus" argument). He therefore proposed an innate universal grammar that supplies these restrictions.

However, deep learning language models are very forcefully challenging this idea. We know these models have no innate universal grammar of any kind, yet they still infer language rules from raw text and generalize within the grammar. They do precisely what Chomsky said is impossible.
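
You can see the effect in miniature with a sketch like the one below (assuming PyTorch; the toy agreement grammar, the TinyLM class, and every hyperparameter are my own illustration, not any published setup). A tiny LSTM trained only on grammatical sentences ends up preferring verbs that agree with the subject, even across a distractor noun, despite never being shown a negative example or an explicit rule:

import random
import torch
import torch.nn as nn

random.seed(0)
torch.manual_seed(0)

SG_N, PL_N = ["dog", "cat"], ["dogs", "cats"]
SG_V, PL_V = ["runs", "sleeps"], ["run", "sleep"]

def sentence():
    # Grammatical sentences only: subject-verb number agreement,
    # with an optional "near NOUN" distractor phrase in between.
    if random.random() < 0.5:
        n, v = random.choice(SG_N), random.choice(SG_V)
    else:
        n, v = random.choice(PL_N), random.choice(PL_V)
    middle = ["near", random.choice(SG_N + PL_N)] if random.random() < 0.5 else []
    return ["<s>", n] + middle + [v, "</s>"]

vocab = ["<s>", "</s>", "near"] + SG_N + PL_N + SG_V + PL_V
idx = {w: i for i, w in enumerate(vocab)}

class TinyLM(nn.Module):
    def __init__(self, v, d=32):
        super().__init__()
        self.emb = nn.Embedding(v, d)
        self.rnn = nn.LSTM(d, d, batch_first=True)
        self.out = nn.Linear(d, v)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h)

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    batch = [torch.tensor([idx[w] for w in sentence()]) for _ in range(32)]
    opt.zero_grad()
    loss = sum(loss_fn(model(s[:-1].unsqueeze(0))[0], s[1:]) for s in batch) / len(batch)
    loss.backward()
    opt.step()

def score(words):
    # Mean log-likelihood per token under the trained model.
    s = torch.tensor([idx[w] for w in words])
    return -loss_fn(model(s[:-1].unsqueeze(0))[0], s[1:]).item()

# The model never saw an ungrammatical sentence, yet it should assign
# higher likelihood to the verb that agrees with the true subject:
print(score(["<s>", "dog", "near", "cats", "runs", "</s>"]))  # grammatical
print(score(["<s>", "dog", "near", "cats", "run", "</s>"]))   # agreement error

This is a cartoon, of course; the serious versions of this test (e.g. Linzen et al.'s subject-verb agreement probes) run the same comparison on models trained on natural corpora.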

Chomsky basically got things backwards. He was puzzled by how efficiently humans learn something as complex as language, so he posited an innate universal grammar that acts as an inductive bias to accelerate learning.

I think the answer lies in the languages themselves. Human languages evolved to be easy to learn: a language that was hard to learn simply wouldn't survive. The properties of language have been shaped by a selection process that prioritizes learnability.
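
Here's a toy simulation of that selection pressure, in the spirit of iterated-learning models (Kirby and colleagues); the noun inventory, exposure budget, and regularization rule are all invented for illustration. Each child learns plural forms from a limited sample of its parent's speech, and any irregular form the child never happens to hear gets regularized, so the language drifts toward the learnable pattern over generations:

import random

random.seed(1)

NOUNS = [f"noun{i}" for i in range(20)]

def regular(noun):
    return noun + "-s"

# Start with a language full of arbitrary irregular plurals.
language = {n: n + "-" + random.choice("xyz") for n in NOUNS}

def learn(parent_language, exposures=30):
    # The child hears a limited random sample of (noun, plural) pairs.
    heard = {}
    for _ in range(exposures):
        n = random.choice(NOUNS)
        heard[n] = parent_language[n]
    # Nouns the child never heard are produced by the regular rule.
    return {n: heard.get(n, regular(n)) for n in NOUNS}

for gen in range(10):
    irregular = sum(1 for n in NOUNS if language[n] != regular(n))
    print(f"generation {gen}: {irregular} irregular plurals remain")
    language = learn(language)

No learner here has any built-in grammar; hard-to-learn (irregular) forms die out purely because transmission through limited data filters them.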
