
/sci/ - Science & Math



File: 7 KB, 300x272, matrix-neo.jpg
No.1634978

I'm a total newfag to computer science and I want to learn the basics of programming.

What language should I learn first?

Pic related: it's Neo (Thomas Anderson) from The Matrix.

>> No.1634980
File: 13 KB, 521x241, red-pill-or-blue-pill.jpg

>>1634978

>> No.1634981

C

>> No.1634983

C++

Just do it. C++ For Dummies will teach you the basics; classes will go into deeper detail.

>> No.1634987

Whoa

>> No.1634988

I'd say C++. I learned Python then C++, and C++ is very annoying to learn when you keep thinking "this is so easy in Python".

>> No.1634991

In b4 whoa

>> No.1634997

Every god-king level programmer I've ever met started with Hypercard.

>> No.1635003
File: 10 KB, 200x242, John_carmack-739326.jpg

Binary

>> No.1635010

>>1635003
Not a language.

OP, whatever you do, don't pick anything with the word "Visual" in it that lets you design things without coding.

>> No.1635014

>>1634997
>Hypercard
whoreslaughing.jpg

>> No.1635017

Learn Java

Every single company that has anything to do with computers is willing to hire programmers at ease with Java.

Learn Java, ignore the haters, enjoy your job and decent money

>> No.1635020

What is Lua, exactly? I keep hearing about it in relation to the Sony PSP and Garry's Mod.

>> No.1635037

Not OP, but in the same position as him.

Why not C sharp? How does C sharp differ from ++?

And isn't Java inefficient?

>> No.1635044

Java and C++ are the two big ones. After that it depends on whatever the fuck you want to do. People who go to school for computer science don't have much trouble learning new languages but I don't know what it's like for a hobbyist.

>> No.1635047

>>1635037
lmfao

C# isn't multiplatform, and it's for GUI stuff.

Learn C to get the low-level coding skills. Then move on to C++, Python, or whatever the fuck you want.

(I hate visual basic, .net, and all of the other single-platform languages.)

>> No.1635048

>>1635037 C sharp
Lulz. It's written C#. It's essentially C++ pretending to be Java. The main difference between Java/C# and C++ is that the Java runtime (the garbage collector, not the compiler) does most of the work of handling dynamic memory for you. It doesn't do as good a job as a competent programmer (programs are slower, use more memory), and the world is full of dimwitted programmers who are completely dependent on it and destroy everything when they've got to use a different language.
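If you want the difference in five lines, here's a throwaway sketch (made-up snippet, obviously):

#include <string>

int main() {
    // C++: you own dynamic memory explicitly.
    std::string* s = new std::string("hello");
    // ... use *s ...
    delete s;  // forget this and the memory leaks until the process dies

    // In Java/C#, the runtime's garbage collector reclaims the object once
    // nothing references it anymore; there's no delete to forget (or botch).
}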

>> No.1635050

>>1635037
>isn't Java inefficient?
It does a lot of things for you, so you don't have as much control in Java as you do in other languages. That means you lose a little efficiency but honestly you're not even gonna notice the difference.

>> No.1635055

>>>/g/

>> No.1635059

C# is a great beginner language IMO. It lets you get straight in without worrying about pointers, memory management, etc. Then move on to C++ and C.

>> No.1635060

>>1635055
/sci/ - Science & Math

Programming loosely falls under both, I believe.

>> No.1635082
File: 14 KB, 250x315, begc++-778785.jpg

>> No.1635139

>>1635020
It's an interpreted scripting language used commonly in gaming. It's incredibly fast for an interpreted language.

>> No.1635168

I learned Objective-C first. It took me about a month to get the basics down. However, it doesn't look like it is very marketable.

From what I've read on websites, Java and C++ are the popular ones. Of the two, Java pays significantly better.

>> No.1635319

Google "The C Programming Language", the first pdf on the list is THE guide to the C language.

However, learning a language and learning computer science are not at all the same thing. If you're truly an absolute newbie to CS, you should look into some high-level descriptions of data structures and algorithms. Sure, you could jump right into a language and figure out how to make some horrible, inefficient shit (see: indie game devs and /prog/ superheroes) - but you'd do much better to actually learn some computer science first.

Honestly, unless you're some sort of autistic bookworm, your best bet is attending a university; the next best thing is probably finding a university with insecure CS course pages and following along with this fall's data structures and algorithms courses as best you can.

>> No.1635353

>>1635168
Java only pays better within the very lowest tier of programming work. That doesn't necessarily mean Java isn't marketable; you'll still make more money with it than working at Best Buy, but it pays more like IT work rather than an actual career in science.

C and C++ are really the languages to learn if you're serious about computer SCIENCE, Java is only fine for some high level IT stuff and really low-tier indie development - look up some tech-based companies (Oshkosh Truck comes to mind), or even some of the big-name video game studios, and you'll see a large demand for C++.

>> No.1635360

JAVA/C++ are engineer languages. Learn LISP/HASKELL/SCHEME and don't be a faggot.

>> No.1635364

>>1635360
Engineers prefer C over C++, faggot. They find C++ too bloated.

>> No.1635365

what is the difference between c and c++?

>> No.1635370

>>1635365
C is less bloated, but stiff. C++ is more bloated obviously, but more fluid and flexible, which is why most software development teams prefer to use it.

>> No.1635371

>>1635365
OOP

>> No.1635373

>>1635364

C++ introduces some new concepts like objects and classes. This is called object-oriented programming (OOP). It eases the programming of big projects. C is a purely imperative language.

>> No.1635374

>>1635365
C++ extends C with object orientation, templates, the STL, and a few other features. Calling them "features" makes these things seem kind of small, but they open up a whole new world.
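Toy before/after (names made up) so you can see what "object orientation" actually buys you:

#include <cstdio>

// C style: data and behavior live apart.
struct PointC { double x, y; };
void point_print(const PointC* p) { std::printf("(%g, %g)\n", p->x, p->y); }

// C++ style: data and behavior bundled into a class.
class Point {
public:
    Point(double x, double y) : x_(x), y_(y) {}
    void print() const { std::printf("(%g, %g)\n", x_, y_); }
private:
    double x_, y_;
};

int main() {
    PointC pc{1.0, 2.0};
    point_print(&pc);   // function takes the struct
    Point p(1.0, 2.0);
    p.print();          // object carries its own behavior
}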

>> No.1635396

Touch C++ and you're in for 6 months of endless frustration. If you can stand it, you're going to become a solid programmer, but as a first language I recommend Java or C#.

>> No.1635402

Congratulations,
You were force fed the red pill.

>> No.1635409

Python and Perl (start). Java & C++ (harder).

>> No.1635413

>>1635396
How is C/C++ frustrating? It was the first language I learned, and I don't quite understand what's so fucking hard or infuriating about it. It's a very straightforward and logical language; I don't see how anyone would have difficulty writing and reading it. Assembly, maybe, but not the C family.

>> No.1635414

>What language should I learn first?

English

/thread
>>1634978

>> No.1635430

>>1635413
>I don't quite understand what's so fucking hard or infuriating about it.

Machine-dependent type sizes
Pointer arithmetic
Dangling references and pointers
Unsatisfying debugger
Very awkward and error-prone mechanism of code reuse (#include)
Undefined behavior when initializing static objects

All things nobody who wants to "learn the basics of programming" should have to care about.
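To make one of those concrete, here's a made-up five-liner with a dangling pointer; C/C++ compiles it happily and then anything can happen:

#include <cstdio>

int* dangling() {
    int local = 42;
    return &local;            // address of a stack variable that's about to die
}

int main() {
    int* p = dangling();
    std::printf("%d\n", *p);  // undefined behavior: 42, garbage, or a crash
}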

>> No.1635437

I don't think C++ is the best choice for someone who's interested in computer science. I'm a games programmer, so I use and love C++, assembly, and a few others, but I know that if you're doing compsci you won't be engineering complex software but programming mathematical functions and such. There are languages for that kind of thing. Do some research before just blindly learning C++.

>> No.1635443

Learn C if you want to understand "how the computer does it."
Learn C#/Java if you want to make big applications.
Learn scripting languages (Python, Perl) if you want to mess around but don't want to do any serious programming.

>>1635047
>is for gui
That mindset is like 5 years old.
.NET and C# have capabilities greater than any other language, be it Java, C++, or any scripting language.

>> No.1635476

LISP, use this book http://mitpress.mit.edu/sicp

>> No.1635502

C is outdated, just let it die and let's all learn Python, okay?

>> No.1635505

>>1635443
I work for one of the largest companies in Australia, and I use Python to implement server-side security. It's one of the greatest languages of the 6 I know, so don't underestimate it.

>> No.1635555

First, forget any language that makes you dependent on the compilers and tools of specific companies, so Visual Basic, C#, and Cocoa are out.
Java should be out in principle, but wait.

You're starting out, so the retarded syntax of Perl makes it a no-no. It's a good language, but not for starters. For the same reason, I would say no to Java, Lisp, and Scheme for a start.

Lua is good, but not so common. Better to choose something you can easily find help for.
FORTRAN is for people requiring exceedingly fast stuff, but it's old and not good for a start.

This leaves us with the other popular languages: C, C++, and Python.

If you don't mind working for a good while entering text and getting text results, and want a useful and solid introduction, go for C or C++. They're fast, there's a large user community, and there are good libraries, but the curve can be more or less steep for a beginner.

If you want to learn the basics with neat results soon, go for Python.
It's really a scripting, interpreted language, so it's slower, but the syntax is very simple and consistent, and it has a ton of tools ready to use.
You can still do exercises building stuff with classes, or go without them. It's amazingly flexible, and it's my suggestion for a start.

>> No.1635573
File: 40 KB, 560x432, hahahaohwow.jpg

>>1635443
> (Python, Perl) if you want to mess around but don't want to do any serious programming.
> but don't want to do any serious programming.

You just went retarded. Perl is widely used in servers, system administration, hacking, numerical work, via PDL (the Perl Data Language) in science, etc.
And well, the same goes for Python.
Going with C#, .NET is a bad choice as it makes you immediately dependent on Microsoft.

>> No.1635908

>>1635573
Being dependent on Microsoft isn't necessarily bad... but it is if you're actually going into CS for the science aspect.

For making games, C# is pretty sweet, although most studios still seem to prefer C++.

Also: perl and python are IT languages, not science ones. There's nothing wrong with a career in IT, but keep it out of /sci/

>> No.1636077

>>1635908
> perl and python are IT languages
Clearly you don't know shit about this.
Examples of Perl use: via PDL for astronomical research and scientific image analysis, etc.
Python is gaining great popularity in the physics and astronomy community. Several departments are migrating to wider use of Python. Even CASA, an NRAO tool for radio astronomy data analysis, now uses Python as the main interface and glue to standard C/C++ libraries.

>> No.1636093

>>1635908
> Being dependent on Microsoft isn't necessarily bad

It's only good if you plan to stick with MS for a long time, disregard security issues, forget multiplatform programming (Mono sucks and is incomplete, by the way), and are willing to reach a point at which you have to start paying a lot to do anything profitable.

>> No.1636097

Ruby. Powerful, easy to learn, fun to use. Language of the future.

>> No.1636113

>>1635414

0.0/10

>> No.1636117

Screw programming languages, you can learn any one of them in a couple of weeks.
Now, algorithms, that's fucked up. They take a lot of time to learn, but once you grasp them, you can apply them in any language.
Anyway, go with Pascal or C/C++, and get a good algorithm book.
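Case in point: binary search is the same idea in every language, only the spelling changes. Throwaway C++ version:

#include <vector>

// Returns the index of key in the sorted vector v, or -1 if absent.
// Port this to Pascal or C and nothing essential changes.
int binary_search(const std::vector<int>& v, int key) {
    int lo = 0, hi = (int)v.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of lo + hi
        if (v[mid] == key) return mid;
        if (v[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}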

>> No.1636140

>>1636097
No.

$ cat crud.rb
print "Hello\n" while false
begin
   print "Goodbye\n"
end while false
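# but begin/end while tests *after* the body, so Goodbye prints once (do-while semantics)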
$ ruby crud.rb
Goodbye
$


http://www.webdesignforums.net/other_languages_195/7_reasons_ruby_rails_sucks_28505.html
http://dj-bri-t.net/2009/09/ruby-documentation-sucks/
http://reprog.wordpress.com/2010/03/17/why-ruby-is-not-my-favourite-programming-language/

>> No.1636145

>>1636117
> Pascal

It's not the 90s anymore...

>> No.1636157

As a beginner, pointers in C++ are the most confusing and frustrating things ever.

>> No.1636160

>>1636140

Haters gonna hate. But time is against them.

>> No.1636167

>>1636157
If you can't understand pointers, you can't understand computers.

>> No.1636175

>>1636167

You don't need to understand computers. It's the computer that needs to adapt to you.

>> No.1636179
File: 11 KB, 695x567, umad-truck.png [View same] [iqdb] [saucenao] [google]
1636179

>>1636160
lolno

>> No.1636187

>>1636175
0/10
That's only valid for user-oriented, simple tasks. Not programming.

>> No.1636188

>>1636187
Meh, not even that. Any user should have a basic knowledge.

>> No.1636194

>>1636175
Riiiiight. Because electronic devices automagically conform to our wishes without human intervention.

>> No.1636196

>>1636175
Maybe you should try hitting it with a stick until it figures out you're the dominant one and adapts to you.

>> No.1636198

>>1636179
>java fanboy spotted

I bet you fap to cuckold porn.

>> No.1636205

>>1636187
>>1636194
>>1636196

Programming is about programming. It's not that hard, or is it? Stop living in the past.

>> No.1636213

>>1636205
And what programming is, at its root, is telling the computer what to do with its registers, CPU, ALU, and memory. Even if you're using a high-level language, if you're serious, you need to know what's going on at that level.

And if you're a serious programmer, you have to be able to program the machine at that level. YOU are the one who has to adapt the machine to the higher tiers.

>> No.1636232

>>1636198
lolno
I use Python, C, C++ and Fortran. Mainly Python.

>> No.1636240

>>1636205
People who don't understand computers are doomed to be beaten again and again by pitfalls related to computer architecture, and to write inefficient programs. It's much more important than you think.
See this book for examples:
http://www.amazon.com/Writing-Scientific-Software-Guide-Style/dp/0521858968

It's also on Google Books if you want a preview.
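The standard example of that kind of pitfall (my own toy version; actual timings vary by machine): C/C++ arrays are row-major, so the inner loop should walk the rightmost index. Both functions below compute the same sum, but the second one jumps N*8 bytes every step and thrashes the cache, often running several times slower.

#include <cstdio>
#include <cstddef>

const std::size_t N = 2048;
static double a[N][N];          // 32 MB, zero-initialized

double sum_row_major() {        // cache-friendly: walks memory in order
    double s = 0;
    for (std::size_t i = 0; i < N; ++i)
        for (std::size_t j = 0; j < N; ++j)
            s += a[i][j];
    return s;
}

double sum_column_major() {     // cache-hostile: big stride every step
    double s = 0;
    for (std::size_t j = 0; j < N; ++j)
        for (std::size_t i = 0; i < N; ++i)
            s += a[i][j];
    return s;
}

int main() {
    std::printf("%f %f\n", sum_row_major(), sum_column_major());
}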

>> No.1636290

>>1636240

You got a file for that book?

>> No.1636299

>>1636290
Nope, I bought it. But I think you can find it as a file.

>> No.1636311

>>1636232
>python fanboy spotted

The previous statement still applies btw.

>> No.1636327

What exactly is computer science like in a uni?

>> No.1636346

At my uni they taught us the very beginner-friendly basics with JSP as part of setting up a web page, then they tossed us right into C++, which I for one loved, but I know there were a lot of people struggling with it. Soon we're going hard into C for microprocessor programming.

>> No.1636349

>>1636311
umad
If I was a fanboy I wouldn't use other languages. That I use mainly Python is merely because it's fast and good for most of the data crunching I do.
If I criticize Ruby is because of it's own problems.

It's certainly a bad idea to suggest someone starting with a language full of inconsistencies, badly documented, and poor availability of quality libraries for different tasks.
On the other hand, there are several excellent libraries for C++ and Python, for example, for an amazingly wide range of needs.
Example: http://pypi.python.org/pypi?:action=browse

>> No.1636358

>>1636346
What uni was this?

>> No.1636383

>>1636349
>If I criticize Ruby, it's because I'm too scared to learn it and realise it's better than my pet language.

ftfy

>> No.1636392

>>1636383
>If I criticize Ruby, it's because I'm too smart to waste my time with it (having tried it), and realized it's one of the worst languages out there.

refix'd
Also, Ruby fanboy detected, no counterarguments detected.

>> No.1636396

>>1636392
He's waiting for Rails to generate counterarguments for him. He'll get it working real soon now.

>> No.1636449

>>1636392
>>If I criticize Ruby, it's because I'm too obtuse to invest my time in it (having tried it and given it up as soon as I encountered a problem I couldn't figure out), and realized it's one of the best languages out there, and I'm doing all I can to give it a bad rep before it gains enough momentum and overthrows my pet language.

rerefixed that for ya btw

>counterarguments
>posting links to bad trolls counts as argument
>mfw
>:D

>> No.1636464

>>1636392
=
>>1636396
laughingbitches.flac

>> No.1636503
File: 24 KB, 131x150, 1280669563533.png

>>1636232
>>1635555
>FORTRAN

>> No.1636516
File: 270 KB, 640x640, troll-debate.png

>>1636449
Actually, the arguments presented in those pages are sound, valid, and testable.

>> No.1637009

>>1635414
>English

So true. More than half your time in a software engineering job is spent writing memos, email, specifications, requirements, documentation, and code comments, and reviewing the same from others.

Do you believe that my getting promoted faster has anything to do with programming knowledge in this place? Do you think that's code you're writing now?

>> No.1637022

Learn basic
Basic, fuck yeah

>> No.1637024

How is the Matrix any different from Tron? After watching the second movie especially I felt cheated.

>> No.1637245

how often do new programming languages come out? when will c++ and java be outdated?

>> No.1637269

If you want to be a good programmer, start by understanding how things work instead of just jumping in to use them. Learn algorithms, and start with Pascal and Assembly. Frustrating? Yeah, but it'll give you solid ground.

>> No.1637278

java :D

>> No.1637288

>>1637245
That's actually a pretty complicated question. New languages come out often enough, but the most popular ones take forever to go out of date. Furthermore, when a language goes out of date, it's often replaced with a language that builds directly off of it (B -> C -> C++, Java, C#), so although a language may become obsolete, you don't really have to worry about relearning a completely new language every few years: you could probably get by in some fields learning a new language only every few decades.

However, that's all pretty irrelevant, considering that the theory behind the simple syntax of a language takes significantly longer to learn; once you learn one language, you'll never have to spend more than a couple of weeks (max) to become familiar with a new one, but mastering the underlying concepts of computation, algorithms, etc. will take anywhere from several years to an entire lifetime.

Compare it to natural language: sure, learning Spanish is a pain in the ass, but compare it to learning to speak English from scratch - if you had to do it with an adult brain, it'd be nearly impossible.

>> No.1637301

Java and C++

>> No.1637310

>>1637269
Learning old shit first doesn't make you a better programmer. Good programmers are familiar with languages that are popular. The industry wants people who can read other people's code, not just their own.

If you want to learn a language for conceptual reasons, you'll learn more from machine code than assembly.

>> No.1637342

>>1636157
What's so fucking hard about pointers?
If you know what an address is, and what indirect vs direct means, you know everything there is to know about pointers.
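Seriously, the whole concept fits in a few lines (toy example):

#include <cstdio>

int main() {
    int x = 7;
    int* p = &x;                 // p stores the address of x
    std::printf("%d\n", *p);     // dereference: prints 7 (indirect access)
    *p = 9;                      // write through the pointer
    std::printf("%d\n", x);      // prints 9: x itself changed (direct access)
}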

>> No.1637352

>>1637310
>you'll learn more from machine code than assembly.
No.
Assembly and machine code are 1:1 translations; using machine code instead of assembly gives you absolutely nothing, it just makes it harder to remember what 0010101010 means instead of just saying mov eax, 1.
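For the record (32-bit x86; the exact bytes are from memory, so check them), the assembler maps the mnemonic straight onto its encoding, so the two really are the same instruction in different clothes:

#include <cstdio>

int main() {
    // "mov eax, 1" assembles to opcode B8 followed by the 32-bit
    // immediate in little-endian order. Same instruction, two spellings.
    unsigned char mov_eax_1[] = { 0xB8, 0x01, 0x00, 0x00, 0x00 };
    for (unsigned char b : mov_eax_1)
        std::printf("%02X ", b);
    std::printf(" ; mov eax, 1\n");
}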

>> No.1637381

>>1637352
Using machine code helps you to understand how the computer receives and processes information in the most basic sense, while using assembly language obfuscates that. The computer doesn't speak or think in mnemonics.

>> No.1637396

>>1637381
>Using machine code helps you to understand how the computer receives and processes information in the most basic sense
No, you don't learn any of that at all by coding in machine code.
>The computer doesn't speak or think in mnemonics.
It doesn't speak or think in base 16/base 2 either. It "thinks" by moving electrons in a controlled fashion through different media (inb4 quantum level).

>> No.1637414
File: 41 KB, 400x400, sawwhatyoudidthere.jpg

>>1637381
>mnemonics

>> No.1637416

If you can grasp the underlying system of symbolic thought and deal with the abstractions that programming demands, learning a new set of syntax is trivial.

>> No.1637419

>>1637310

The industry wants disposable programmers who can code easy things for cheap. If you want to be a GOOD programmer, then yes, you need to learn how a computer works first, how low-level languages operate, and a bunch about algorithms, which you can then write in whichever language you like. Learning a language is a matter of a couple weeks; any code monkey can do that. If that's what OP had in mind, then good.

>> No.1637433
File: 7 KB, 298x275, internetpunch.png

>>1637396

You think Assembly works through mnemonics?

>> No.1637435

>>1637396
An electron is not a unit of information in a computer. A bit is. An electron means nothing to a computer.

>No, you don't learn any of that at all by coding in machine code.
Yes, you do. You are forced to learn the opcodes, and to recognize the same memory space for both instructions and data. You don't have access to macros or definitions and declarations.

>> No.1637440
File: 21 KB, 589x375, 1282328721766.png

>>1637435

What the fuck do you think a "bit" is?

>> No.1637444

>>1637440
Do you seriously think that a bit is an electron? How many years are you into comp sci exactly?

>> No.1637452

>>1637444

Not "an electron". I don't know where you got that from. A difference in voltage, however, is a bit.

>> No.1637458

>>1637435
Machine code is to a programming language what grunts are to a spoken language.

>> No.1637466

>>1637452
Then we agree, you tard. The computer thinks in bits and not electrons.

>> No.1637471

>>1637435
>An electron is not a unit of information in a computer.
Yes, it is.
>A bit is. An electron means nothing to a computer.
Computers don't know about bits; they don't know about anything, because they are not sentient. All they "know about" is electrons going through transistors (and other components).

>Yes, you do. You are forced to learn the opcodes, and to recognize the same memory space for both instructions and data.
Same as when learning mnemonics.
But you learn nothing about how the fetch/decode/execute cycle works in the processor, or how memory is accessed, or how the ALU works, etc.

>> No.1637477

>>1637471
>Yes, it is.
No, it isn't. Do you even know what a flip flop is, freshman?

>> No.1637485

>>1637458
No, machine code is like literally writing out 10000... (100 zeroes), and assembly is saying 1 googol.

>> No.1637493

>>1637477
I know what it is.
A computer has no idea what a flip-flop is because it's not a sentient being who can know things.

>> No.1637495

OP, don't worry about a specific language so much as getting the practice in. Go to the library, find a programming book that's easy to read and looks intuitive, and go with whatever that one uses.

Stroustrup's book on C++ contains information about why the language was designed the way it was. It's great for learning, because you get a feel for the reasoning behind things. C++ is object-oriented, which will make it easier to learn languages like Java later on (whereas learning C is almost useless for that).

LISP is a different style of programming entirely. I wouldn't recommend it yet. Same with Prolog. (Prolog is a really awesome language, though. Not enough people use it, and this makes me sad.)

Java is nice because it's easy to get beautiful user interfaces, whereas with C++ you're going to be stuck with the command line for a while. (They're similar in many ways, but Java has automatic garbage collection, which is both a blessing and a curse, and a simpler scheme for addressing, which makes learning it easier.)

BASIC is very easy to learn (and then you can go with Visual Basic if you want nice interfaces and object-oriented programming, though I don't recommend VB). It's probably the simplest language to start with, but you won't be able to do very much and will want to graduate to something better later, like Python.

Python is simple and powerful, but you might get some bad habits that are hard to break if you decide to later learn Java or C++.

>> No.1637499

>>1637493
What's your point? It still stands that an electron is not a unit of information and cannot be interpreted by a computer as anything meaningful.

>> No.1637500

C
You will eventually have to learn C, but not as your first language. It's important since it's the lowest common denominator between different processors and teaches you how to write code close to the hardware. It's also the language Unix was written in, and knowledge of Unix is indispensable if you want to learn about good operating system design. It forces you to do a lot of stuff yourself, however, like memory management, so it's not a good choice for a total newbie. You need to learn how to solve problems in code first, not worry about fixing that memory leak and other details. Those come later.

Java
Java represents the perfect middle ground between C and interpreted languages like Python and Ruby. It's fast, but you don't have to worry about memory management. It's statically typed, which weeds out a lot of common errors at compile time, but it's dynamically bound and (almost) entirely object-oriented. There's a shit-ton of open-source Java packages available, which is a huge plus. It's a good first language, and indeed it's taught at many universities as the first language for new students. It's a little baroque in my opinion, however, and a little more complicated than it needs to be for a beginner. If you ask me, it's the perfect second language.

LISP/Scheme
You should learn one of these languages because they will open up a whole different way of thinking about programming and solving problems. But for that you need to already know how to program. So unless you're a freshman at MIT, not a good choice as your first programming language.

>> No.1637502

>>1637500

Haskell/Prolog
Same as above, but not as important. Definitely not beginner's languages.

Perl
Great if you want to solve tedious or repetitive jobs with three lines of code. But since you can have a cat walk on a keyboard and produce valid Perl code, I would recommend not learning this as your first language.

Ruby/Python
Both of these are great first languages. They are object-oriented, elegant, easy, powerful, open-source, and cross-platform. Most importantly, they let you think about solving problems without thinking too much about implementation details, which is what you need to learn as a beginning programmer. I personally prefer Ruby, since I consider it a little more elegant and I find the whole indentation thing in Python a bit awkward, but it really doesn't matter which one you choose. Both will teach you exactly what you need to know.

C++
You will have to learn this, even if it's just so you know why you shouldn't write programs in it. Not a good first language. In fact, not a good language for most anything.

COBOL/BASIC/Pascal/Delphi
Choose one of these if you like being made fun of by your peers.

>> No.1637507

>>1637466

Goddamn code monkeys. A bit is not a physical thing. A bit could be a difference in voltage, a lamp, a perforation through cardboard. In the computers we use now, bits are voltages, and electricity is nothing but moving electrons. So yes, essentially, information is processed in a computer (as we know them now) in the form of electrons.

And I still maintain that the first thing any programmer should do is learn about algorithms and computer architecture. And maths, lots and lots of it.

>> No.1637513

>>1637499
That you can't feed a computer "01010101" any more than you can feed it "inc eax".
Machine code is not any more low-level than assembly; it just uses numbers instead of "words".

>> No.1637519

>>1637507
>A bit is not a physical thing.
That's my whole fucking point, asshole.

>> No.1637521

>>1637452
>>1637466
>>1637471
>>1637485
>>1637477
>>1637493
>>1637499

Unhelpful neckbeards are unhelpful.

>> No.1637522
File: 32 KB, 500x371, 1279670939045.jpg

this post is gay

>> No.1637550

>>1637513
"01010101" is a series of symbols which each represent one bit in a computer. "inc eax" is not. "01010101" is a more fundamental expression.

>Machine code is not anymore low level than assembly
Most assemblers do contain features that are not 1:1 with machine code and have to be interpreted as a series of machine code instructions.

>> No.1637562

x86

>> No.1637574

> computer science
Haskell

>> No.1637596

>>1637519

A computer doesn't "think", it processes. A modern processing unit works when you feed it an electric current in a certain way. That's why you can't power a computer through rainbows and imagination. So, if we're going to say a computer "thinks" in something, that's electricity, as that's what makes the circuits work.

>> No.1637605

>>1637550
>"01010101" is a series of symbols which each represent one bit in a computer.
No, it's a CPU instruction; it represents some operation to perform. It can involve using bits/memory, but doesn't have to.
>"inc eax" is not. "01010101" is a more fundamental expression.
No, they are both equivalent, one is just using different symbols.

>Most assemblers do contain features that are not 1:1 with machine code and have to be interpreted as a series of machine code instructions.
Yeah because they are optimizing the code, but that doesn't change the fact that "inc eax" will always have one and only one binary representation, regardless of context.
Also, machine code isn't even the lowest level of software btw, microcode is.

>> No.1637606

>>1637596
Not yet they don't...

>> No.1637607

visual basic was my first language
then java, c/c++, php


visual basic was a good first language i thought

>> No.1637648

>>1637596
No one's saying that computers don't run on electricity. But electrons are not units of information. Not even a whole bunch of them together, not all the time. There are very specific conditions that determine whether a circuit interprets a group of electrons as HIGH or LOW, which we represent as, oh look at that, one bit.

>>1637605
>it can involve using bits/memory, but doesn't have to be.
No. All machine codes are represented as a series of bits. Any unit of information that can be ON or OFF is one bit.

>they are both equivalent, one is just using different symbols.
They represent the same instruction, but the statements are not equivalent. One is a series of bits, each a unit of information on its own, and one is shorthand.

>Yeah because they are optimizing the code
Ok, but then you do agree that machine code is lower level than assembly language.

>> No.1637677

Any opinions on this book list (add, remove, reorder)?

http://www.goodreads.com/list/show/2205.Essential_Books_of_Computer_Science

>> No.1637698

>>1637648
>No. All machine codes are represented as a series of bits. Any unit of information that can be ON or OFF is one bit.
Yes, but not every symbol in an instruction opcode is a direct representation of a bit. That's why writing "inc eax" in base 2 is nothing special: both are just representations, one using circles and lines (0, 1), the other using some more symbols to represent the exact same thing.

>They represent the same instruction, but the statements are not equivalent. One is a series of bits, each a unit of information on its own, and one is shorthand.
The statements are equivalent.
100 = One followed by two zeros. 100 would be the "shorthand" mnemonic in this case.

>Ok, but then you do agree that machine code is lower level than assembly language.
No, there are optimizing assemBLERS that optimize assemBLY; the assembly code itself is at the same level as machine code.

>> No.1637707
File: 31 KB, 300x440, 000rpsqa.jpg

This whole thread

>> No.1637882

Assuming that you're going to self-teach...

1. Python/"How to Think Like a Computer Scientist"
http://openbookproject.net/thinkcs/python/english2e/

2. C/"The C Programming Language"
http://www.google.com/url?sa=t&source=web&cd=6&ved=0CDsQFjAF&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.126.4437%26rep%3Drep1%26type%3Dpdf&ei=VpZxTOj9BYP98AbnsoHfCw&usg=AFQjCNE0mRrYFIweLQKJL8rfP0u_H6hBpA

3. LISP/"Structure and Interpretation of Computer Programs"
http://mitpress.mit.edu/sicp/full-text/book/book.html

4. At this point, assuming you were methodical in your approach to 1, 2 and 3? Whatever the hell you want.

>> No.1637924

>>1637882

>1. Python/"How to Think Like a Computer Scientist"
>http://openbookproject.net/thinkcs/python/english2e/

Not OP, but I'm currently following this book (on chapter 17). It's pretty good and easy to follow, and Python is awesome (indents rock).