

/sci/ - Science & Math

File: 18 KB, 491x235, cX13J.png
No.7897859

>Dear CSists

I was reviewing some formal language theory for a seminar I'm giving. I am not a CS person.

Here's the thing: this whole branch of "math" seems like trivial BS. E.g., the CFG ~ nondeterministic PDA equivalence literally seems like a theory of smashing your fingers into a keyboard. Most of the first results you see are always the pumping lemmas, which are pretty much ornate facts about counting and running out of fingers, and don't even fully characterize the classes they describe.

I ask you guys this: please prove me wrong. In the sense that, I don't feel like I even know enough of the "big results" of formal language theory to really criticize it fairly. What are the "major theorems"? What can you do with it that is more straightforward than other methods from algebra?

>Pro version: what results aren't simply about word problems, or putting a metric on proof procedures?

>> No.7897869
File: 886 KB, 1170x500, CS.png

>this whole branch of "math" seems like trivial BS

CS theory [math]is[/math] trivial. We've been saying it for years...

>> No.7897975


I am actually curious what the major results of formal language theory are.

>> No.7898232

I mean, what do you want to know? Formal language theory is basically just math with rules that can be used to represent computers.

If you have specific questions I can answer them the best I can, but it seems like you have questions regarding the Chomsky hierarchy? Most of that is just proving what resources we need in order to make a machine do some sort of mathematical task. The pumping lemma helps us describe where one of those boundaries is and understand why.
Comp sci is really different from the way we approach most other disciplines. In electronic engineering, for example, there's no point at which Coulomb's law stops being useful: if there are point charges, you can measure and predict how they will affect the environment using Coulomb's law.
In comp sci, on the other hand, concepts just disappear under the mounds of abstraction we heap on them at every level.

Does that help? I'll be watching the thread if you have other questions
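To make the "resources" point concrete, here's a minimal Python sketch (function name invented for illustration): recognizing a regular language, such as binary strings whose value is divisible by 3, needs only constant memory, because a DFA remembers nothing but its current state.

```python
# Sketch of a 3-state DFA: the state is just the remainder mod 3 of the
# bits read so far, so memory use is constant regardless of input length.
def divisible_by_3(bits: str) -> bool:
    state = 0                             # start state: remainder 0
    for b in bits:
        state = (state * 2 + int(b)) % 3  # transition function
    return state == 0                     # accepting state

print(divisible_by_3("110"))   # 6 in binary -> True
print(divisible_by_3("101"))   # 5 in binary -> False
```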

>> No.7898286

How do you like using 4chan, friend? Isn't it nice that you can touch keys and have those letters show up on a screen? Why does computer science need to be like math? Please leave theoretical computer science be, you're obviously too dumb to understand even the most basic realities of its existence as a field.

>> No.7898304

>Isn't it nice that you can touch keys and have those letters show up on a screen


>Please leave theoretical computer science be, you're obviously too dumb to understand even the most basic realities of its existence as a field.

"find CS easy -> must be retarded"

I'll never understand why CS majors think this.

>> No.7898426

I'm just wondering about what its main results are. Like, what is an "interesting" formal language theory result that's "deep" in the theory of the field?

eg, group theory has FT of fin gen abelian groups, galois theory gives explicit descriptions of solutions to equations, the jordan-holder theorem promises certain invariants about groups; topology has the jordan curve theorem, the fixed-point theorem, and stone duality provides the compactness theorem for FOL; logic has various results about categoricity, Łoś's theorem, etc. Arguably, none of these results would have been possible without the algebra, topology, and logic behind them.

You give examples from EE, and similarly in physics you have a few formulas which describe motion, speed, curvature, etc. very well.

The only "theorems" (i.e. theoretical results) I know about formal language theory, such as the pumping lemma and the automata-language correspondence, seem like ad hoc collections of facts about how many gizmos you need to translate one thing into another, or counting arguments which don't seem inherently related to deep properties of computation. Like, there's not really even that much of a "why" - the pumping lemma just gives you a brute way to test if a language is "too complicated", doesn't even classify the languages exactly, and a lot of computation problems seem to be of this flavor.

But (this is hard to phrase), what is the sort of a result that "belongs" to formal language theory or its related fields, and isn't just a fact about, say, combinatorics? I guess the automata results just feel kind of arbitrary, like, you could have chosen different constraints on your production rules and a different metric other than "how many gizmos do I add onto my automata" (which it seems like people are doing now, which sort of "proves" this point), in a way that say, the duality of rings and schemes is not, and the compactness theorem of FOL is sort of a "deep" nontrivial result.

>> No.7898445

New person here. I just want to chime in that every major subfield of math is "sufficiently expressive" in the sense that every theorem in one has a corresponding theorem in the other. There are functors which can turn any statement about group theory, topology, set theory, etc. into a corresponding theorem in one of the other fields in a natural way. There is probably better terminology for this, but each such category actually contains a "copy" of each other such category.

In the context of languages, it may well be that they admit no theorems which are not more easily proved in other categories. The whole challenge is then interpreting results in a way that proves useful in the study of computing machines.

>> No.7898454

Um. That's not true? While you can have, eg, group objects in the category of spaces (topological groups), and category theory makes analogies about e.g. galois connections between categories other than groups and fields, you cannot port highly specific information from one category to another arbitrarily. E.g. the theorem that all vector spaces have bases doesn't carry over to modules. Heck, there are even theorems about locales that don't carry over to spaces. I'd say these are all fairly expressive categories.

Maybe I am misunderstanding you?

>> No.7898469
File: 92 KB, 959x617, math + CS.png


CS isn't math; it just uses math in order to study modes of computation. It's not a subject that furthers the field of mathematics. Like mathematical physics, problems in CS can spur the development or creation of areas in math but it's an application and not a field of math (just like EM isn't math but vector calculus is).

If you want to read something more involved that goes beyond the pumping lemma in automata theory then check out "Elements of Automata Theory" by Sakarovitch.

>> No.7898475

I am saying, for example, that both the category of groups and the category of topological spaces contain proper subcategories equivalent to each other, even though the categories themselves are not equivalent.

>> No.7898476

>Too Complicated
Is sort of a very complex and interesting question in the CS field, though. Really what the pumping lemma is showing is that, given any physical device, there are limits on what types of calculations we can force it to do. Of course you can throw huge amounts of effort at something, but if it falls under the regular-language umbrella, a machine can be made to do that computation using no memory.
If the language you would like to compute can be "pumping lemma"ed out of the regular languages, then it is at least context-free, meaning it requires memory but doesn't need to change how it performs based on the input.
Above that are the recursively enumerable languages, which require even more machine complexity.

Kinda, but it makes sense why the axioms we work with are there once you get into it. As explained earlier, when a language exceeds a level of complexity, the overall design of the machine must change in a large way.

In general I get where you're coming from. Unlike other pure mathematics, it's not "let's make up a set of simple axioms and see what happens" (where, if you find a comparable rule set in the real world, all the math remains valid). This is "we have created a bunch of insanely quick but dumb machines; what can we do with them?" The application is tremendously narrow by design.
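For contrast with the constant-memory regular case, here's a minimal sketch (function name invented) of the canonical non-regular language { a^n b^n : n >= 0 }: the single counter below is exactly the unbounded memory a finite automaton lacks, which is what the pumping lemma for regular languages lets you prove.

```python
# Sketch: deciding membership in { a^n b^n } needs a counter that can grow
# without bound -- in effect a pushdown stack of identical symbols.
# No fixed number of DFA states suffices (pumping lemma), but one counter does.
def is_anbn(s: str) -> bool:
    i, count = 0, 0
    while i < len(s) and s[i] == 'a':    # count the leading a's
        count += 1
        i += 1
    while i < len(s) and s[i] == 'b':    # cancel against the b's
        count -= 1
        i += 1
    return i == len(s) and count == 0    # consumed everything, counts match

print(is_anbn("aabb"))   # True
print(is_anbn("aab"))    # False
```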

>> No.7898479

Recursion theory aka computability theory is math though. So is proof theory and model theory. You're just fucking retarded.

>> No.7898482


Recursion theory goes way beyond simple Turing machines CS majors study. You're retarded if you think you do the same exact stuff as mathematicians.

>> No.7898488

Hm. I'm still not asking my question very well. Even a lot of "application" fields can list major results. Like, HS style newtonian physics presents some fairly "deep" equations relating mass, weight, force, etc. It's still an application, but it describes an important physical result that came as a consequence of the implementation of calculus as representing physical things. Maybe you wouldn't call this a "theorem" but an important "law" or at least "significant discovery".

Part of my confusion may be that I am around professors who act like CS is math and has a lot of important discoveries as a field, and maybe I am expecting "deep" results where none exist. Of course, since it is a "branch" of math... some theorems would be nice too.

Another way to ask this is if you had to compile the important discoveries unique to the "theory of formal languages" what would they be? In the way a physicist might list equations that govern forces, a mathematician could tell you one of the theorems I listed, or a physiologist could tell you about cortical encoding of visual information?

The reference is appreciated.

>> No.7898491

When I studied recursion theory I also studied problems in second-order arithmetic. Guess that is well beyond Turing machines?

>> No.7898495

Pick up a fucking book and get off here. Start with the basics like Rice's theorem and move up. Just shut the fuck up you lazy piece of shit.

The shit you're complaining about is baby CS tier automata level

>> No.7898499

No. Category theory is more about statements that are similar in different fields: product groups are like product vector spaces are like product topologies.

>> No.7898516

If you're going to mess with me then at least learn enough to try to argue that >>7898475 is false. The assertion that I have made about equivalent subcategories is common knowledge, even if I am inept at repeating its statement.

>> No.7898517

I don't think I see what you're saying. A topological space doesn't come with group structure, and a full subcategory of groups would just give them the discrete topology. Even if such a subcategory existed, it wouldn't necessarily have properties of the containing categories. Eg groups have a zero object, but spaces don't.

It sounds like what you're saying is "concrete categories exist, and by definition you can put the structure from different categories on a set all at once," which isn't the same as saying that you can translate results from one category to another.

>> No.7898525

That's why I was asking for references you fucking asshole. Most of the things people usually point me towards only have these few basic results. I didn't say shit about CSers, I was only asking about a fragment of their field. Being in a different field doesn't make me lazy. You sound like a histrionic faggot.

>> No.7898527

>Heck, there are even theorems about locales that don't carry over to spaces.
Really? Can you cite a few?

>> No.7898549


The end of this lists just a few

>> No.7898564

....so the fuck what? Your whole post amounts to autistic dick measuring wrapped in a gripe.

Formal language theory is just some shit that's useful to make things like compilers, the usefulness of which doesn't need to be explained.

Why are you even complaining about this? Fuck. Don't you losers have anything better to compete over?

>> No.7898574

damn dude i wasn't trying to belittle or compete with anyone.

im asking for a motivating result to follow up on. like when a kid asks "why are we learning group theory" you have galois theory to point to, and i'm asking what that is for flt. i'm not around cs kids or in their culture at all, so it's foreign, I have no one to ask, and i'm not gonna flip through an 800 page book just on the hope that there's something more interesting than counting in it.

>> No.7898577

btw thanks for not being assholes and actually suggesting material/responding. are CS kids usually defensive douchebags like >>7898495

>> No.7898579


Of course not, they are unemployed math majors.

>> No.7898581

...ah, sorry if I misinterpreted. That was honestly just my go to reaction to the whole "CS is trivial" meme (which isn't even wrong but annoys me because it seems motivated by a really noxious, misplaced emotion).

>> No.7898588

>In the context of languages, it may well be that they admit no theorems which are not more easily proved in other categories. The whole challenge is then interpreting results in a way that proves useful in the study of computing machines.
This is essentially what I came here to say.

I think the main cause for OP's lack of understanding in this matter is an under-appreciation for the rigor and redundancy required to create something that does not entirely exist on paper.

>> No.7898595

>There is a notion of sublocale (or quotient frame) which corresponds to that of a topological subspace. Every locale can be represented as a sublocale of a spatial locale, but in general sublocales behave rather differently from subspaces: this is most evident in the fact that the intersection of any number of dense sublocales of a given locale is dense. It is this property (often in conjunction with the discrepancy between the two notions of product) which gives rise to most of the significant differences between the category of locales and that of spaces; as was first pointed out by Isbell [a1], these differences often have the effect of making the former a pleasanter category to work in than the latter. Three examples of this: the property of paracompactness for locales is inherited by arbitrary locale products [a1], the Lindelöf covering property for locales is equivalent to realcompactness [a5], and every localic subgroup of a localic group is closed [a6], [a7] (localic groups are to topological groups as locales are to spaces).


>> No.7898601

Sorry for being an asshole. Pick up Sipser's text. Flip through the later chapters on computational complexity & skim the theorems. Cook's theorem & Savitch's theorem are foundational. Understand those both first as a starting point.

Again, sorry for being an asshole, but the whole "CS is a meme" threads going on make me assume everyone that says that shit is baiting

>> No.7898623

OP is an applied scientist who just likes math and happens to be teaching a course which will have some CS kids in it. Sorry.

thanks. Being an applied scientist, I'm surrounded by them, and most don't know how math works, and the ones that are into computation are really dogmatic about it but can't actually point me towards any texts or results.

Hence me saying "I don't feel like I even know enough of the "big results" of formal language theory to really criticize it fairly." I'm confident in my math abilities, I just didn't want to waste a bunch of time on the references people gave me around the dept, which only have boring stuff like what I mentioned. I'll look those things up. I've skimmed Sipser and it seemed... kinda boring, but knowing which theorems are even considered important helps.

>> No.7898631

Biggest problem with CS is it's easy to skim the surface, so there's a lot of tards running around. Also people who want to make vidja games.

>> No.7898636

defensive douchebag is actually helpful douchebag at least enough to send me into a wiki and ebookz hole until i decide how i feel about this.

>> No.7898930
File: 27 KB, 642x417, language CFG hierarchies.png

Recursive function theory is only studied in CS. It is often studied at either the undergraduate or graduate level (my university does it in undergrad). Furthermore, recursive function theory is only one model of computation and typically a course covering recursive function theory also covers many other non-trivial models of computation with the intention of proving equivalence. This is why in modern times we refer to the topic as computability instead of recursive function theory.

Computability is very closely related to formal language theory.

The big theorems in computability theory are
>S_{mn} theorem.
>Kleene's recursion theorems.
>Rice's Theorem.
>A whole bunch of stuff about the Arithmetic Hierarchy
>A whole bunch of stuff about encoding arithmetic, logic, languages, and a bunch of other shit.
>A whole bunch of stuff about recursively enumerable sets, oracles, and so on.
>A whole bunch of different models of computation and theorems proving equivalence between them.

Parsing theory is also very closely related to formal languages. Here the goal is to reason about languages and algorithms that can parse said language (and at what level of time complexity). In particular one typically deals with subsets of the set of context free languages (pic related). Depending on properties that the grammar has for the language one may be able to use different types of parsers, some simpler and more efficient than others. Furthermore, one may or may not be able to prove that two parsers parse the same language and one may be able to root out problems like ambiguity in the language specification.

There are also many other related topics including language theoretic security and graph grammars.
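As an illustration of the parsing side (a minimal sketch under invented names, not any particular parser from the literature): a recursive-descent recognizer for the context-free language of balanced parentheses, with grammar S -> ( S ) S | epsilon. Because this grammar is LL(1), the recognizer runs in linear time with no backtracking.

```python
# Sketch: recursive descent for S -> ( S ) S | epsilon.
# parse_S returns the index just past one S-derivation, or -1 on failure.
def balanced(s: str) -> bool:
    def parse_S(i: int) -> int:
        if i < len(s) and s[i] == '(':
            j = parse_S(i + 1)                       # inner S
            if j < 0 or j >= len(s) or s[j] != ')':  # must see a closing ')'
                return -1
            return parse_S(j + 1)                    # trailing S
        return i                                     # epsilon production
    return parse_S(0) == len(s)

print(balanced("(())()"))   # True
print(balanced("(()"))      # False
```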

>> No.7898946
File: 172 KB, 1280x850, Languages Chomsky hierarchy.jpg

Me again. Just wanted to point out that the reason you see all of that "smashing your fingers into a keyboard" stuff, as you nonsensically put it, is because those theorems prove useful in distinguishing subsets of languages and telling you where a language lives.

The reason for doing this is because where a language lives tells you whether or not its membership problem is
>decidable (i.e. there exists a computable function that, when given a word, will determine in finite time whether the word is or is not in the language)
>recognizable (i.e. there exists a partially computable function that, when given a word, will determine in finite time if the word is in the language but may fail to terminate if the word is not in the language)
and it will tell you what type of function can be used to do it, how to produce such a function, the time complexity of the function, and so on.

This may not sound very useful to you since you probably don't care about languages but take note that languages really just encode information. One could talk about the language consisting of all binary strings that encode jpeg images of your mom in a gang bang. Then one could ask whether there exists a function that when given a binary string encoding for a jpeg, will decide if the picture contains your mom in a gang bang. Whether or not this is true depends on the structure of the language describing such binary strings.
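A small Python sketch of the decidable/recognizable split (the language and names here are invented for illustration): a language is recognizable iff some program enumerates it, and the membership test "run the enumerator and wait" halts on members but, in general, runs forever on non-members. The early exit below only works because this particular enumeration is increasing; for a general recursively enumerable set no such exit exists, which is exactly the gap between recognizable and decidable.

```python
from itertools import count

def enumerate_members():
    # Enumerator for the (trivially decidable) language of even numbers;
    # stands in for an arbitrary r.e. enumeration.
    for n in count(0):
        yield 2 * n

def recognize(target: int) -> bool:
    for n in enumerate_members():
        if n == target:
            return True       # halts on members
        if n > target:        # possible only because this enumeration is
            return False      # increasing; in general we'd loop forever

print(recognize(6))   # True
print(recognize(7))   # False
```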

>> No.7898956

>parse said language
what does "to parse" mean?

>> No.7898995
File: 937 KB, 629x839, 1451955326987.png

Everybody in this thread talking about theoretical CS expecting it to be like math. It's not math. It's literally just the theory behind the implementation of CS technology. If you think it's "trivial", throw your computer in the fucking garbage.

>> No.7899010

>the theory behind the implementation of CS technology. If you think it's "trivial", throw your computer in the fucking garbage

When will you CS losers get it into your pea-brains that CS is not EE and is not responsible for computers?

>> No.7899020

We still have a class on recursion theory in the philosophy department; it's listed under 'Advanced Logic.' In it we proved equivalent notions of computability: Kleene's recursive functions, Turing machines, and representability in a formal system (Robinson's Q). We also looked at undecidable problems and proofs of Gödel I and II. All the research is in the CS department though.

Logic is pretty much dead at my university ;_;.

>> No.7899024

>hi I never took a operating systems class

>> No.7899026

>hurrr EEs are responsible for computers not CS!!
Dban your fucking hard drive then, since you don't need any of that software that was written by a likely CS major.

>> No.7899031

OS is required for all ECEs. It's an elective for CS majors. The ones making your computers run aren't the CS manchildren.

>> No.7899034
File: 299 KB, 501x656, 1456331869280.jpg

>grasping at straws

Any bangladesh monkey can write software

>> No.7899036

Sour grapes are sour.

>> No.7899038

Yeah but not everybody can write good software.

>mfw physics majors in charge of not coding buffer overflows into every piece of code they touch

>> No.7899039
File: 151 KB, 496x326, math face.png

>The janitorial scientist thinks the CEO is jealous of him

You can't make this shit up

>> No.7899040

>autistic math major sperging out about a slightly-different major than his own thinks he has even an ounce of the people skills to ever be a CEO
top kek, thanks for the pic tho.

>> No.7899042

>for(unsigned i=0; i<size; i++)
>for(auto iter = v.begin(); iter != v.end(); iter++)

so hard, so wow

>> No.7899043
File: 175 KB, 511x477, 's_Syndrome.png

>CS majors can't understand basic analogies and read them literally

Autism speaks!

>> No.7899045

yes anon, you pretend to be a CEO if that's what helps you cope

>> No.7899048

That's because you looked up trivial shit from intro courses, proved in the 50s.

If you're into finite state automata, look up the Schützenberger and Büchi theorems connecting monoids, logic, and automata over finite and infinite words. In the end automata just become an encoding for MSO over sequences and most theory is done in terms of logic, e.g. quantifier rank as a measure of complexity. And even that result is old. You can look into data words, languages over infinite alphabets where basically everything interesting is undecidable. They're used for XML modeling. Probabilistic automata are equivalent to hidden Markov models.

Also >>7898930

>> No.7899068
File: 451 KB, 500x211, 1376155955376.gif


>> No.7899128

>physics major thinks looping over an array is the only way to cause a buffer overflow
why am I not surprised

>> No.7899134


Physics majors code in C++ and never have to worry about raw C arrays. std::string & std::vector 4 life.

>> No.7899193

>>Physics majors
they do not code at all.
coding is for plebs.

>> No.7899218
File: 76 KB, 500x388, 1449693114923.jpg


They're sick of the spherical cow joke and are stepping up their game.

>> No.7899306
File: 33 KB, 297x400, image.jpg


Computability theory is very much alive and exists in the mathematics department at my undergraduate institution. The research done in the math department is different from what is taught in the computer science department. This research looks into which set existence axioms are needed to prove theorems, which allows you to categorize the logical strength of theorems and differentiate between different proofs of theorems. All undergraduate mathematics can be done with transfinite inductive arguments.

But from what I've seen, not many math departments outside of a core few have many mathematical logicians. Since computability theory is generally first encountered in an automata class, I can see where the misconceptions come from. Based on what I've seen, my math classes went into more mathematical meat in recursion theory than my automata classes did, and connected it to other areas of mathematics like proof theory, combinatorics, real analysis, and Ramsey theory.

>> No.7899479

>But from what I've seen not many math departments outside of a core few have many mathematical logicians.
I really wonder why. They have one where I go, but I think even most math students ignore them; no class I took had over 8 students, while the algebra and statistics classes were full as hell. And research-wise they're top-notch in their field.

>> No.7899496
File: 8 KB, 446x195, abstract syntax tree.png

Parsing is one of the steps of compilation. Formally a compiler is a function/program that takes input in one language and outputs in another language.

I'm going to skip many steps and details. The first step is typically lexing. A lexer for a language is a function that takes a string as input and
>checks that the string is in the language
>converts the string into tokens representing the atoms of your target language (perhaps capturing some amount of semantics, like having a separate token for a subtraction vs negative sign).
A parser for a language is a function that takes a sequence/list of tokens and outputs an abstract syntax tree.
After this the compiler will take the abstract syntax tree and convert it to the new language.

An abstract syntax tree is an abstract representation of your input string (see picture for a simple example) that attempts to capture semantics. It is not to be confused with a parse tree. The abstract syntax tree is typically written in the form
>(node1 (node2 node4) node3 (node5 node6 (node7 node8)))
where the basic structure is
>(root leg1 leg2 ... legk)
where each leg can also be a root (and thus a subtree).

Lisp is an unusual language because it is actually written in its own abstract syntax tree (hence all the parentheses). The time complexity of the parser is affected by the properties of the language. Also, being able to prove that the language produces a unique abstract syntax tree (i.e. that given a sentence there exists only one way to parse that sentence with regard to semantics) is also affected by said properties. In the past this has caused problems with ambiguity.

My explanation may not be very good and others may have more to add but that's the general idea. Pic related, an abstract syntax tree and corresponding input string.
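Here's a hedged end-to-end miniature of the lex -> parse pipeline just described, for assignments like "x := a + b". The token and node shapes are invented for illustration; real compilers also track source positions, precedence, and much more.

```python
import re

# Lexer: split the input into tokens (identifiers, ':=', '+'),
# raising on anything unrecognized.
TOKEN_RE = re.compile(r"\s*(:=|\+|[A-Za-z]\w*)")

def lex(src: str):
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at position {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

# Parser for the grammar  stmt := IDENT ':=' expr ;  expr := IDENT ('+' IDENT)*
# Output is a nested-tuple abstract syntax tree.
def parse(tokens):
    name, assign, *rest = tokens
    if assign != ":=":
        raise SyntaxError("expected ':='")
    expr = rest[0]
    for operand in rest[2::2]:
        expr = ("+", expr, operand)   # left-associative tree
    return (":=", name, expr)

print(parse(lex("x := a + b")))   # (':=', 'x', ('+', 'a', 'b'))
```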

>> No.7899500

That's pretty lucky I think. Our university does teach some basic computability in the philosophy department under a logic course, but they do everything with Turing machines and at a level much less rigorous than the computer science students. The goal is to apply computability to logic and prove Gödel's incompleteness theorems among others.

That sounds like a fairly different take on computability. It almost sounds like the material in our proof theory course (comp sci department). We do have a mathematical foundations course in the math department (set theory) but it's very easy (the students aren't introduced to forcing or even the ZFC axioms until the last week of class). I've heard we've had a proof theory course taught in the philosophy department in the past but I do not know what is covered in it.

Our computer science department has two courses on computability. The first one is just the basic Sipser intro course that does everything with automata and turing machines. The second one does recursion theory and a bunch of other stuff (it's the one I described earlier). Since the students have already been introduced to computability and since all the shitty students opt not to take it then the level of speed, rigor, and difficulty is far higher than the first course.

>> No.7899504

While I agree that C++ is a nice language, sometimes you just don't have the choice and have to use C.
And even in C++ you often use C libraries, which will force you to handle C arrays and strings.
Physics majors probably don't have to worry about that though, since they only code trivial shit with numpy.

>> No.7899520

>But from what I've seen, not many math departments outside of a core few have many mathematical logicians. Since computability theory is generally first encountered in an automata class, I can see where the misconceptions come from. Based on what I've seen, my math classes went into more mathematical meat in recursion theory than my automata classes did, and connected it to other areas of mathematics like proof theory, combinatorics, real analysis, and Ramsey theory.

Me again >>7899500

Our university has very little in the way of mathematical logicians (within the math department) as well. Almost all of it is done entirely within the computer science department. The type theory, category theory, homotopy type theory courses are all taught in the computer science department. We've also got several comp sci researchers working in mathematical logic related research and there's a weekly mathematical logic seminar that's mostly attended by computer science students and professors (mainly dealing with the intersection between logic, category theory, computability, and mathematics).

The general impression I get from the math students at my university is that they simply don't care about that level of rigor. I hope this is just the math department at my university and not the same everywhere else as the general attitude seems to be the engineer mindset to learning (they're typically only ever interested in some piece of formal logic, proof theory, etc.. if it helps them prove a result in analysis, algebra, or topology). On the other hand the computer science department at my university is the exact opposite and are all over the mathematical logic stuff.

>> No.7900503

how are the elements in such a tree ordered? is there a precedence relation built in?

>> No.7900522

>cs majors believe bound checking is deep and requires intensive study to pull off

>> No.7900557

>thinks baby exposure to Rudin is hard

>> No.7900935

>just because I think something is easy, I am not ignorant on that subject

There are plenty of dumbasses that think they "get" quantum physics because of schrodinger's cat.

>> No.7902289

The children of a node are its arguments. So for instance, the + with the a and the b under it corresponds to the expression a + b. That is the second argument to a :=, while the first is x, which together correspond to the assignment x := a + b.
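To show the children-as-arguments reading in code (a toy sketch; the node shape and environment dict are invented), evaluating that tree bottom-up assigns 5 to x when a = 2 and b = 3:

```python
# Sketch: evaluate a nested-tuple AST where each node's children are its
# arguments. Leaves are variable names looked up in an environment dict.
def eval_node(node, env):
    if isinstance(node, str):
        return env[node]            # leaf: variable reference
    op, left, right = node
    if op == "+":
        return eval_node(left, env) + eval_node(right, env)
    if op == ":=":
        env[left] = eval_node(right, env)   # left child is the target name
        return env[left]
    raise ValueError(f"unknown node {op!r}")

env = {"a": 2, "b": 3}
eval_node((":=", "x", ("+", "a", "b")), env)
print(env["x"])   # 5
```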

>> No.7902323

>seem like ad hoc collections of facts about how many gizmo's you need to translate one thing into another, or counting arguments which don't seem inherently related to deep properties of computation

this is more or less how I feel about complexity theory. It's almost like everyone in that field is so focused on attacking problems that they forgot about theory-building. No wonder P vs NP seems hopeless

>> No.7902957

That silly picture again. They have INTRODUCTION in the title; of course they are going to be trivial.

And Concrete Mathematics is hard! I've never been able to finish a "research" problem.

>> No.7903107


that is a crazy fucking image with 0% opacity background

n1ce png

>> No.7904664
File: 62 KB, 642x617, 1456801818868.png


try this
