
/sci/ - Science & Math



File: 170 KB, 550x510, C++ACCIDENT.jpg
No.2023527

C++ is boring. I have much more fun programming in Java. Discuss why/why not C++ is fun.

>> No.2023534

Bullshit. You can mix in assembly code with Java, making it worthless.

>> No.2023538

ITT: The dinosaurs of programming and 2600 '90.

>> No.2023582

>>2023534
Yeah, but whenever I program in C++ I spend more time trying to figure out how to program than actually solving the problem...

>> No.2023590

>>2023582
If you knew C++ a little better you might like it more; then again, you might just be too used to Java.

Also learn an interpreted language

>> No.2023602

>>2023582
What is a more effective algorithm to sort 1000 random integers? Merge Sort or Quick Sort?

>> No.2024276

Merge sort is O(n log n) vs. quick sort at O(n^2), so I guess merge sort is better

>> No.2024287

>>2023602
since they are ints wouldn't Radix be the best sort?
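A minimal sketch of what that would look like, assuming unsigned 32-bit ints and an 8-bit digit (all names here are just illustrative):

#include <cstddef>
#include <cstdint>
#include <vector>

// LSD radix sort: one stable counting-sort pass per 8-bit digit, four passes total.
void radix_sort(std::vector<std::uint32_t>& a) {
    std::vector<std::uint32_t> tmp(a.size());
    for (int shift = 0; shift < 32; shift += 8) {
        std::size_t count[257] = {};
        for (std::uint32_t x : a)                     // histogram of the current digit
            ++count[((x >> shift) & 0xFF) + 1];
        for (int d = 0; d < 256; ++d)                 // prefix sums give the start offsets
            count[d + 1] += count[d];
        for (std::uint32_t x : a)                     // stable scatter into tmp
            tmp[count[(x >> shift) & 0xFF]++] = x;
        a.swap(tmp);
    }
}

For 1000 random ints any of these finishes instantly; std::sort is the sane default either way.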

>> No.2024293

>>2024276
Quick sort is O(n log n) as well, actually.

>> No.2024310

>>2024287
Radix wasn't an option. It was just a question to show that OP has no idea about real programming and that he is nothing more than a java copy paste monkey.

>> No.2024331

C++ is a man's language and java is for children.

>> No.2024335

C++ is halfassed pseudo-OO faggotshit. Real men program in pure C.

>> No.2024339

C++ and Java are almost exactly the same syntax-wise; it's just that Java is object-oriented, making it over 9000 times better than any form of C.

>> No.2024346

>>2024335
Real men program in Fortran II.
http://www.pbm.com/~lindahl/real.programmers.html

>> No.2024347

>>2024335
No one forces you to use OOP in C++. It is there if you want to use it, but you can program in a pure procedural style if you want.
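A trivial sketch of what that looks like in practice (nothing here is specific to any real codebase):

#include <cstdio>

// Plain data and free functions only; no classes in sight, yet this is C++.
static double average(const double* xs, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += xs[i];
    return n > 0 ? sum / n : 0.0;
}

int main() {
    const double xs[] = {1.0, 2.0, 3.0};
    std::printf("average = %f\n", average(xs, 3));
}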

>> No.2024353

>>2024293
Average case. It's O(n^2) worst case, but is commonly used because it has a good constant.
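To make the worst case concrete: a naive quicksort that always picks the first element as the pivot degrades to O(n^2) on input that is already sorted, because every partition splits off just one element. A rough sketch (the pivot choice is the whole point):

#include <algorithm>
#include <vector>

// Naive quicksort on [lo, hi): O(n log n) on average, O(n^2) on sorted input
// because the first-element pivot makes every split 1 vs n-1.
void quicksort(std::vector<int>& a, int lo, int hi) {
    if (hi - lo < 2) return;
    const int pivot = a[lo];
    int mid = lo;
    for (int i = lo + 1; i < hi; ++i)
        if (a[i] < pivot)
            std::swap(a[++mid], a[i]);
    std::swap(a[lo], a[mid]);        // pivot lands in its final position
    quicksort(a, lo, mid);
    quicksort(a, mid + 1, hi);
}

Picking a random pivot (or median-of-three) makes the quadratic case vanishingly unlikely, which is why quicksort still wins in practice.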

>> No.2024359

>>2023527
>C++ is boring

Not relevant save for hobbyists, and even then perhaps not.

>>2023582
>I spend more time trying to figure out how to program

Then maybe you should learn how to program before you start commenting on programming languages.

>> No.2024367

>>2024339
>Java classes

I sure do love me some fifty recursive class instantiations to print one string.

>> No.2024377

If you use iostream you deserve to be hunted down like a dog and shot.

>> No.2024378

>>2024367
I.Think.This.Is.Funny

>> No.2024389

>>2024377
There is nothing wrong with any of the methods in the standard library, iostream included ( lol pun ). That is why they are in the fucking standard library. They work and they work well.

>> No.2024402

>>2024389
I'm agreeing with this.
Not to mention that using the standard library makes the code easier for other programmers to work on later... Meh.

>> No.2024408

>>2024389
>in the standard library because they work and they work well
>implying the str* & get* functions work well

>> No.2024423

As a person who actually makes a significant amount of money programming, I can say with experience and authority that if you use iostream you are a useless fucking faggot and I would fire you from my team immediately if I saw that shit.

>> No.2024433

>>2024408
str*? Isn't that a C function?
C++ has stringstreams,
although I will admit those still kind of suck

>> No.2024443

>>2024423
Why? Don't you want your code to be maintainable? Readable by others? Portable?

>> No.2024448

>>2024408
Never had a problem with the string implementation in the standard library. You are probably just basing your opinion on some extremely rare and obscure buffer-overflow exploit that abused those functions in 1998 because the programmer was too lazy to do error checking. That is the excuse I get every time someone tells me they won't use the C standard library.

>> No.2024455

>>2024423
Can you give a reason why iostream is so bad?

>> No.2024468

>>2024423
>As a person who lies

I see.

>> No.2024502

>>2024455
Of course he can't. He wants you to think he recreates an entire stream and stdout system in asm every time he needs to program anything.

>> No.2024549

>>2024455
cout << x;

What the fuck is << doing there. If I'm not shifting cout left by x bits, I shouldn't see that shit.

>> No.2024563

>>2024549
>I'm a giant faggot please rape my face

Assume the position.

>> No.2024579

>>2024549
You're shifting x into the end of std::cout, and std::cout can be seen to move "left" to "make room" for x.
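To be pedantic about it, no shifting happens at all; the << in that line is an overloaded function that returns the stream again, which is what lets the calls chain. A minimal illustration:

#include <iostream>

int main() {
    int x = 42;
    std::cout << x << '\n';             // the usual spelling
    std::cout.operator<<(x) << '\n';    // the same call, written out explicitly
    // operator<< returns std::ostream&, so each << feeds the next one.
}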

>> No.2024589 [DELETED] 

C#>C++
Have fun doing crappy console applications, while I make GUI applications that run just as fast

>> No.2024591

>>2024563

Are you honestly defending iostream's horrible abuse of operator overloading? Like, I'm not saying it's bad in general; overloading is nice when you need to, say, add things with + or subtract them with -, but when you define << as insertion or some shit when varargs works just as well, with the added benefit of it looking like a goddamned function in the end, you're doing something wrong.

>> No.2024594

>>2024589
>C#
>Just as fast
I do hope you're trolling. C# uses interpreted bytecode; there's no way it could run just as fast as C++.

>> No.2024603
File: 1.07 MB, 400x540, 1288698865389.gif

>>2024589
>C#
>just as fast
>mfw

>> No.2024605

>>2024579
Actually, you're printing something to the screen. That's what you want to do. When you're programming, you don't say "oh hey this looks like a good time to shift a message into the cout stream," you say "I want to print some shit to the terminal."

printf("some shit"); is intuitive.
cout << "some shit"; is not.

>> No.2024613

>>2024589
But why bother writing a GUI when I can write a console application if it's just for my own needs? And if I need a GUI, C++ works fine for that too.

>> No.2024616

I still use printf, does that make me a huge faggot?

>> No.2024617

>>2024605
cout stands for console output, but you probably knew that
You're shifting data into the end of the console output. It's the exact same thing as printing a message to the screen. I don't see how this is hard to understand.

>> No.2024619

>>2024591
It is much easier to type out '<<' 4000 times throughout a program. Not to mention there is hardly a C++ programmer on the planet that has a hard time understanding the std:: overloaded operators. It is like saying we should never use references because & can either resolve a memory address or denote a reference or imply AND if doubled.

>> No.2024623

SUBROUTINE trollCthread(likewriting2xcode, doingmath, useC, useFORTRAN)
LOGICAL, INTENT(IN)  :: likewriting2xcode, doingmath
LOGICAL, INTENT(OUT) :: useC, useFORTRAN

useC = .FALSE.
useFORTRAN = .FALSE.
IF ( likewriting2xcode ) THEN
useC = .TRUE.
END IF
IF ( doingmath ) THEN
useFORTRAN = .TRUE.
END IF
END SUBROUTINE trollCthread

>> No.2024624

>>2024591

You're not quite sure what streams are for, are you?

>> No.2024644

Real men code in Lisp.

Thread Closed.

>> No.2024645
File: 100 KB, 800x1100, 1251693163534.jpg

c++>>java

Helps that I didn't have my brain mutilated by java/basic before learning c++.

>> No.2024652

>>2024624
He obviously thinks stream = what you see on the screen.

>> No.2024656

>>2024617

Functions look like this();.
Functions *do things*.
x << 4 is something that can be evaluated. It doesn't do anything on its own. Making << do something on its own is inconsistent.

>>2024619
The difference here is the dual meaning of & is defined by the language from the get-go. << as an insertion operator is defined by a library. I don't believe it's a good thing to allow code to change the structure of a language.

Actually, maybe I do hate overloading in general.

>> No.2024678

>>2024605
Overloaded << is a standard construct that everyone knows, and as such it can usually be used for other streams, too. Like, say you want to print to a file, or send that data to a custom printing object, or send something via a network class. Oh snap, now printf doesn't work. But if those other objects are streams.... I guess you could use fwrite, but fuck, that's ugly and whoever is going to maintain your code is going to hate you.

Also, formatted output ftl.
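A minimal sketch of that point (the Point type and file name are made up for the example): one operator<< written against std::ostream serves the terminal, a file, and an in-memory buffer unchanged.

#include <fstream>
#include <iostream>
#include <sstream>

struct Point { double x, y; };          // made-up example type

// One overload against the generic ostream interface...
std::ostream& operator<<(std::ostream& os, const Point& p) {
    return os << '(' << p.x << ", " << p.y << ')';
}

int main() {
    const Point p{1.0, 2.0};
    std::cout << p << '\n';             // ...works for the terminal,
    std::ofstream file("points.txt");
    file << p << '\n';                  // ...for a file,
    std::ostringstream buffer;
    buffer << p;                        // ...and for an in-memory string.
}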

>> No.2024683
File: 183 KB, 566x690, 1280786169585.jpg

If you're going to program with training wheels all your life in Java, then why not use BASIC/MATLAB, a.k.a. a tricycle? It's not like you're trying to use your brain anymore...

>> No.2024687

>>2024656
You do realize you can use the standard '.' format for iostream right? That is the entire point of operator overloading. You can either use it if it is defined or not use it.

>> No.2024688

>>2024605
>intuitive:
>printf("OP is a %s and has sucked %d dicks", sFaggot, nMillion);
>not intuitive:
>cout << "OP is a " << sFaggot << " and has sucked " << nMillion << " dicks";

whathefuckamireading.tga

>>2024616

Naw, printf is fine. Some people just gotta be giant faggots and create elitism where it doesn't exist.

>> No.2024692

>>2024624
What I'm saying is a "stream" is a stupid analogy for anything that your code actually does. When you're writing to a file you're not putting something into a ~stream~, you're writing data to sectors on a disk. When you're printing something to a screen you're not putting something into a ~stream~, you're printing text to a display. fwrite() and printf() get this across nicely. Abstracting everything to a stream is ugly and unintuitive. A terminal is not a file is not a hardware device. They are all separate things, with separate ways to best deal with them.

>> No.2024698

>>2024692

So you don't understand what streams are for.

>> No.2024703

>>2024698
Please enlighten me, then. What is a stream for?

>> No.2024711

>>2024703
A stream is nothing more than a mechanism for copying bits. Whether you copy them to the video card's frame buffer or to a print buffer doesn't matter. A stream is the copying of bits.

>> No.2024715

>>2024703

Already been covered, which you missed because you don't know what you're talking about and had no idea the posts were pertinent. Scroll up.

>> No.2024718

>>2024688
because << is for bit shifts

it's the same in math: is (n) a matrix or is (n) a combination?

>> No.2024743

>>2024718

If you can't wrap your head around operator overloading, you're never going to do anything interesting.

>> No.2024746

>>2024711
Well hell, if you're just copying bits, can't you just use memcpy() for everything then? Oh wait, you can't, because you're not just copying bits, you're doing other things too.

If the operation is truly as simple as that, then memcpy(), or even = and arrays really, would suffice. If the operation is different for different devices, then it shouldn't have the same usage for every device. There's no place for streams in any of this.

>> No.2024753

>>2024703
streams are for buffering input and output, allowing you to read or write information as required without having to explicitly deal with the buffer.
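A small illustration of what the buffer buys you: thousands of little writes get coalesced into a few big ones, and you only pay for a flush when you ask for one (std::endl flushes, '\n' does not).

#include <iostream>

int main() {
    for (int i = 0; i < 100000; ++i)
        std::cout << i << '\n';     // buffered: handed to the OS in large chunks
    // Using std::endl here instead of '\n' would flush on every iteration,
    // turning the loop into roughly 100000 separate writes.
    std::cout << std::flush;        // one explicit flush at the end
}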

>> No.2024756

>>2024692
No buddy. io/ios/iostream are the exact same underlying mechanism as fprintf or printf. The only difference is that io/ios/iostream is object oriented. It's obvious why someone recreated an object-oriented stream mechanism for C++: to make it much easier to use streams with the expanding range of input and output devices.

>> No.2024777

>>2024746
>If the operation is different for different devices, then it shouldn't have the same usage for every device.

Or you could use streams and become device-independent.

My browser's using streams right now to send you this post.

>> No.2024778

>>2024746
You can't use memcpy because memcpy can't access memory addresses owned by the video card and other hardware devices. stream mechanisms like iostream integrate with the device drivers in order to be able to copy to memory addresses assigned to the video card. Windows/Linux specifically make sure that your program can't write to kernel level memory addresses. It is part of creating a robust operating system. Fuck man you obviously have no idea what you are talking about and are probably the OP of the thread who knows only how to copy and paste Java Code.

>> No.2024788

>>2024549
>I never code in other language

>> No.2024790

>>2024753
The PS/2 controller on your keyboard buffers the input in its own internal queue. The keyboard driver probably buffers that shit again, just for kicks. You're saying it needs to be buffered a third time? What the hell for? And why does output need to be buffered? It just goes straight to the screen. As for files, where does the buffering need to happen? You tell the OS to write some block of data starting at address 0x1234abcd to the disk, and it'll do it and return control to the program when it's done. Where is a buffer even needed?

>> No.2024803

>>2024790
>lost data? fuck it, I didn't need it anyway.

>> No.2024808

>>2024778
>>2024777

Ugh. This trash is why I couldn't stand to program for a living. When I write code, I want to tell the electronic parts of my computer what to do, not the other way around.

Tl;dr I am a butthurt C/assembly programmer who will never work on anything larger-scale than an AVR because of an innate hatred for abstraction.

>> No.2024815

>>2024808

Abstraction is a necessary evil as complexity rises.
But I get where you're coming from.

>> No.2024824

>>2024808
I also have an innate hatred of abstraction, but eh. It's necessary.

>> No.2024846

>>2024808
Modern optimizers will in all likelihood turn that abstracted code into something that is more efficient than what you could have come up with in ASM anyway.

>> No.2024856

>>2024808
Windows had to have a way to keep amateur programmers from breaking their OS with access violations. It all stems from the days of DOS where people would write fucked up programs that corrupted memory somewhere and would crash the entire system. Then people would bitch and think it was a flaw in the OS itself. Still happens to some extent today when people think the "BSoD" is a problem with Windows itself when in fact it is almost always caused by bad hardware or drivers.

>> No.2024863

>>2024824
>>2024815

I guess this is where the "will never program for a living" part comes in. I don't think I could ever admit that complexity is necessary. I'd rather that this were a BBS we were talking on right now, with an interface written in a couple hundred lines of 6502 assembly, than it be a forum written in thousands of lines of php or whatever, on top of millions of lines of C++ for the browser, on top of millions upon millions of lines of code for the operating system software stack that enables all of this to run at all. It's too terrible of a reality for me to accept. Once a computer becomes too complex for one man to hold in his head at once, then I no longer want to deal with it.

>> No.2024868

>>2024589
this
>>2024331
>implying you can't make GUI applications in C++

>> No.2024880
File: 67 KB, 640x480, vomiting bikeman.jpg

>java

>> No.2024882
File: 12 KB, 720x400, bbs fgts.png

>>2024863

You now realize that Fidonet will never be the same again.

>> No.2024888

>>2024882
y u do dis ;__;

>> No.2024889

>>2024863
That level of complexity allows us to do more with less effort though.

>> No.2024891
File: 541 KB, 1450x1100, Rlogo.png

REAL statisticians use R.

>> No.2024894

Just curious, what type of programming backgrounds do you guys have? Did you go to school for it, what degrees, etc.?

Personally, I'm only in high school (senior) and I've been programming seriously for 3-4 years now. Calling it a hobby would be an understatement. I've mostly worked with: C++, ActionScript 3.0 (flash), PHP, (My)SQL, JavaScript, etc.. Huge web development fag here, but I also love creating desktop applications using C++.

I plan to major in computer science (not sure which area I want to go into specifically, I figure I'll determine that once I try out different fields in college).

>> No.2024895

>>2024863
So you are saying we should teach everyone derivatives before we teach them how to use a formula like the quadratic equation, rather than letting them get straight to work solving problems and learn how the equation was derived later? If video game companies started writing all their game code only in ASM, we would have teams of 100 programmers struggling to make Nintendo 64-equivalent games in 4 years.

>> No.2024899
File: 13 KB, 803x515, bbs lord forestfight.png

>>2024888

:3

>> No.2024905

>>2024808
How can you hat abstraction? You're probably a horrible programmer.

>> No.2024910
File: 9 KB, 278x207, tw2002.jpg

>>2024899

>> No.2024911

>>2024894
I am a 3rd year Computer Science student.

>> No.2024916

>>2024889
I'd rather have a simple design that required more effort to use, than a complex design that let me be more productive. Easier to fix, errors are deterministic ("garbage in, garbage out", instead of "garbage whenever the software feels like it"), and the software can eventually be "done," instead of stuck in an endless beta phase.

In other words, if Ford still made model T's, I'd be driving one.

>> No.2024918

>>2024911
How do you like it? What field are you specializing (if any)? Did you have any serious CS experience before college?

>> No.2024933

>>2024894

Realtime systems, which is mostly C and a lot of proprietary libraries.

Now I mostly just tell some indians what to do since everyone was laid off years ago except a handful of us to direct the hadjis.

>I plan to major in computer science

I'd reconsider if I were in your shoes today.

>> No.2024946

>>2024905
How can you hat the letter "e"? You're probably a horrible grammar.

>>2024895
Not sure what you're getting at here. You could probably skip the quadratic formula entirely, really. Derivatives are actually useful.

>>2024894
Learned C and x86 and z80 assembly in highschool, loved it, majored in comp sci for a year in college, hated C++ and Java and pretty much all programming invented since the 80s, switched to electrical engineering, enjoying it so far.

>> No.2024947
File: 14 KB, 720x400, axelfcp.png

So I herd 4chan lieks CP.

>> No.2024957

>>2024918
I like it. My favorite application of what I have learned so far is definitely reverse engineering. I love going through another collection of code and figuring out how it works. I am pretty sure I will do my senior project on advanced buffer overflow exploit techniques. I would much rather be sitting in a room trying to figure out how to break a piece of software ( especially expensive corporate software thought to be impenetrable ) compared to the typical office space TPS report style programming that infests the field.

As far as previous programming experience? Well, my family's first computer ever was an Apple II. I think I was 6 at the time we got it, and I was so amateur that I thought I had to complete the spreadsheet tutorials to unlock the GUI I had seen on other computers. By age 9 I was programming a 66 MHz Packard Bell with 4 MB of RAM. At around 11 I was making my own very simple games in ASM using Turbo ASM. By the time I was in high school my computer classes consisted of me showing my teacher tricks and how I would go about breaking into certain areas of the school networks (he was cool as fuck) while the other kids were doing their typing exercises.

>> No.2024966
File: 69 KB, 1000x585, ColecoVision.jpg

>>2024946
>Zilog Z80

<3

>> No.2024991

>>2024815
>Abstraction is a necessary evil
>Abstraction is evil
>wtfamireading.jpg

>> No.2025015

>>2024991
Abstraction hides details. The details are what's actually real. People like to work with real things.

>> No.2025022
File: 4 KB, 126x126, deal w it.jpg

>>2024991

>> No.2025040

>>2025015
Why are you using a web browser then? Shouldn't you be deciphering raw bits as they are coming across a wire instead?

>> No.2025048

>>2025040
Compromise. I like forums, but I hate abstraction. There's no option for me to satisfy both conditions, so I'll pick the less important one and ignore it.

>> No.2025052

I get great pleasure out of the fact that I could print out this discussion and show it to Stroustrup tomorrow if I wanted to. His office is down this hall.

>> No.2025067

>>2025048
You must like language too, because you seem to convey a sense of it on said forums. Language is an abstraction over the wave frequencies of vocal oscillation. When someone asks you what a chair is, do you write out an equation detailing every subatomic interaction each particle in the chair has to exhibit to be considered a chair, or do you just say "chair"?

>> No.2025083

>>2025052

He isn't dead yet?

>> No.2025088

>>2025067
Again, compromise. Or really, I just pick an arbitrary level of abstraction as a base and say that that's "not really abstraction." Usually that base is around the level at which less abstraction would be impractical; for example, getting rid of language.

On your end, though, the extreme would be abstracting every action a computer could take to one operator which operates on one data type. You choose an upper limit instead, and say "below this level is okay".

We're just both making arbitrary distinctions.

>> No.2025096

>>2024933
>I'd reconsider if I were in your shoes today.

why

>> No.2025112

>>2025088
A great artist's abstraction distinguishes a portrait from a stick figure. Less isn't always better.

>> No.2025117

>>2025112
I was under the impression that the stick figure would be an abstraction of the portrait, seeing as the portrait shows all the details while the stick figure shows the general idea.

>> No.2025122

>>2025083
The inventor of FORTRAN only just died. Most of the pioneers of computer science are still alive. It's a young field.

>> No.2025123

Is it possible to learn Java or C++ with very little programming experience?

Suppose I have a strong sense of mathematics, which is better to start? How hard would it be?

>> No.2025130

Even assembly language is an abstraction.

>> No.2025136

>>2025123
You can learn C++ and Java even if you can't add 1 + 1 in your head, as long as you are able to learn syntax.

>> No.2025137

>>2025117
The stick figure is composed of the primitive objects of any image, which are points and lines. A portrait is further composed of complex geometry and shading.

>> No.2025159

>>2025137
A portrait isn't made to look like geometry and shading, though; it's made to look like a real object. Hence, less abstraction from the initial, real thing.

>> No.2025204

>>2025159
Real things are made of points and lines at their most fundamental levels. Our brain uses a visual hierarchy that resolves the images as lines propagate upward into shapes and shapes propagate upward into objects.

>> No.2025221

>>2025096

No jerbs.

>> No.2025234

>>2025221
Really? I thought the field was growing really fast with good job outlooks

>> No.2025239

ITT: old faggots who have no idea about the powers of OOP.

glad i'm a newly minted programming major cuz the world is going to suck your you retards and your 'procedural rulzz' bullshit.

>WELCOME TO THE 21ST CENTURY MOTHERFUCKERS

>we only need C for embedded micros that don't have the juice for real languages

deal_with_it.jpg

>> No.2025261

>>2025234

It is. Just not in NA, Aus or EU.
If you're in Malaysia or something then by all means go for it.

>> No.2025287

C++ is boring, we all know it. Java is where productivity lies.

>> No.2025299

>>2024743
How did you come to that conclusion? Multiple redefinition of symbols leads to ambiguity, which is a major no-no for compilers and any complex software system in general.

That's why people use ENTERPRISE bullshite like interfaces, because if everyone just did their own shit, nobody would have any clue if the function they just called is the correct one or happens to be defined in some random unrelated unit that happened to get linked in. Bugs, bugs everywhere.

Syntactic sugar is supposed to make the code more readable; that's why it's called that, because it's all sweetness for the programmer's eyes and no substance.

C++'s idea of syntactic sugar is more like injecting fat deposits directly into your veins to clog them up. The code becomes less readable if you're given the ability to randomly redefine symbols. You can define operator+ to have the function of operator- and no one would be the wiser, especially when it conflicts with pre-defined symbols.

Let's say a bit shift and a print:
cout << shiftthis << bythisamount << endl;

Anyone taking a glance would have no idea what the middle << actually does. Is it the ostream operator<<? Is it shiftthis.operator<<? Is it a bit shift?

Ambiguous.

If anything operator overloading lets me get away with bullshit like these:
while(i --> 10) cout << "value: " << i << endl;

I can define operator-- and operator> in such a way to make the above sentence legal and valid C++. Does it make the code readable? Fuck no, not unless you know the context of -->.

Pick up a book on compilers, and understand why ambiguity is bad.
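For what it's worth, no overloading is needed to make the i --> 10 line legal for plain ints: there is no '-->' token, so the lexer reads it as postfix decrement followed by greater-than. A minimal check:

#include <iostream>

int main() {
    int i = 15;
    while (i --> 10)                               // parsed as: while ((i--) > 10)
        std::cout << "value: " << i << '\n';       // prints 14, 13, 12, 11, 10
}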

>> No.2025315

>Multiple redefinition of symbols leads to ambiguity, that is a major no-no of compilers and any complex software system in general.

The compilers and other programmers understand it just fine. The problem is with you.

>> No.2025319

>>2025299
I almost forgot the one additional huge pitfall: if you overload an operator, you CANNOT change its precedence.

So you also need to check the C++ operator precedence table to ensure your code doesn't get screwed up by the compiler.

>> No.2025338
File: 76 KB, 300x360, 50s guy.jpg

>>2025319
I'm not sure if you've noticed, but millions of applications went ahead anyway without issue despite your protestations.

>> No.2025341

>>2025315
The problem is some programmers such as yourself who read learn C++ in 21 days think you're the authority on such matters.

I've already given cases in which it becomes ambiguous, can't say the same about you running your mouth.

>> No.2025353

>>2025338
Show me examples where there exists heavy use of overloading. Go on.

No proof, don't talk. This is supposed to be a scientific board; show proof by contradiction.

>> No.2025363

>>2025338
What about the millions of applications that are chock-full of bugs and unhandled exceptions? Are you trying to say they never crash?

>> No.2025381

>>2025341
>yourself who read learn C++ in 21 days

Oh, that's adorable.

But why stop there?
How can you defend using << as a symbol when < already means less-than? You needlessly complicate parsing because now the symbol is ambiguous.

>> No.2025409

>>2025353

Seeing as you've demonstrated familiarity with the scientific method and the burden of proof, you should know you are in no position to make such demands.

By all means back up the self-aggrandizing claims that started all of this and then we'll see where it goes from there.

>were luddinit

Indeed.

>> No.2025413

>>2025381
>Oh, that's adorable.
No argument? Clever.

< and < is ambiguous; < and << isn't.

< and <<< isn't ambiguous, and neither is < and <=<<.

Maybe you should learn how a tokeniser works. Overloading is context-sensitive: the tokeniser sees << in both instances, and the parser has to perform semantic analysis to check the context. Is this << an overloaded operator or a pre-defined one? It then needs to look at the arguments and see whether the first one defines an operator<<. That is giving the parser needless work.

>> No.2025426
File: 38 KB, 523x478, what the fuck am i skull.jpg

>>2025413
>ad hominem
>argument

>> No.2025428

>>2025409
I've given two cases in which operator overloading leads to ambiguity. Both demonstrate that overloading does not increase code readability, which was the whole purpose of syntactic sugar.

You have neither shown proof nor provided counterexamples; instead you beg the question.

>> No.2025433

>>2025428

If those are ambiguous to you then I can see why you have so much trouble.

>> No.2025439
File: 18 KB, 500x595, Your-Head...My-Point[1].jpg

>>2025413

>> No.2025448

>>2025433
So you're presenting personal preference as truth?

Okay.

cout << i << value << endl;

What does the above do, without compiling it, given that you know ostream overloads << and that << also exists as the bit-shift operator? Which is called first?

>> No.2025458

People will say "can't have operator overloading – it leads to weird code that doesn't make sense", which would be like banning electric guitars because someone might use them to make weird music.

>> No.2025459

So all of you just use single sentences, greentext, troll picture and no counterexample to prove and dismiss things?

No wonder there are people like this >>2024778:
>stream mechanisms like iostream integrate with the device drivers
>iostream
>device driver

They aren't even in the same memory space.

>> No.2025473

>>2025459
You realize where you are, right?

>> No.2025490

>>2025353

Every mathematics suite ever.
Or are you just going to move the goal post every time you get slapped with something you overlooked?

>> No.2025503

>>2025015
>Abstraction hides details.
Any good programmer knows that the details of implementation are unimportant compared to the overall correctness and verifiability of the program.

> The details are what's actually real. People like to work with real things.
Sounds familiar... http://conservapedia.com/ConservaMath_Medal


>>2025299
> Let's say a bit shift and a print
If you write code like this, then you are a retard. Anyway, how often do you do a bitshift and then send that directly to a stream? And even if you have to, you can't be bothered to have an intermediate variable for the sake of clarity? It's not like it won't be optimized out at compile time. (Or you might even put in parentheses?)

In practice, how often do you interpret the overloaded stream << operator as something else? If you answered anything other than "never," then you should not be allowed to use a computer.

>> No.2025511

>>2025490
>Every mathematics suite ever.
>No crashes.
That's a mighty big proposition you're making; care to back it up with evidence?

As for the rest:
It's the same thing as with goto: overloading is useful, but in general it should be kept to a minimum to avoid ambiguity.

Goto is useful as well, so why not use it at a high level? Why not create complex control-flow constructs? According to you, readability can fly out the window.

>> No.2025536

>>2025503
>If you write code like this, then you are a retard.
Great argument there.

>Anyway, how often do you do a bitshift and then send that directly to a stream?
Why does how often even matter? So spaghetti code shouldn't be avoided because somehow it doesn't come up often? So exceptions shouldn't be handled because you're not expecting them to be raised often?

>In practice, how often do you interpret the overloaded stream << operator as something else?
You tell me: in my example, what does each << do? If you know the mechanics behind how operators are implemented, you would know precisely which << is called for every << token. Without compiling it, go.

>> No.2025542

>>2025511

I use goto's to turn my code into a maze. Because mazes are fun!

>> No.2025545

>>2025511
>No crashes.

What's this? Non-sequitur, strawman, association with causation or moving the goal post? Probably a little of each.

>> No.2025553

>>2025503

Every operation has >implied parentheses anyway. _everything_ is context-sensitive in some manner so I don't get the splitting of hairs.

>>2025511

If you're trying to dismiss things like matlab and autocad you can just go home now.

>> No.2025621

>>2025536
> Why does how often even matter
Because you're complaining about something that never comes up in practice. It's like arguing about the Oxford comma. Okay, you can contrive sentences in which it's important, but you have to work really hard to do it. It's the exception, not the rule, and those constructions look unnatural anyway. If you write good code, it's never a problem.

> cout << i << value << endl;
The operator is clearly left-to-right associative, and overloaded operators have the same precedence as regular ones. So the evaluation is not that complicated, and it does exactly what you should expect it to.
> (cout << i) << value << endl;
prints i, returns cout
> (cout << value) << endl;
prints value, returns cout
> cout << endl;
prints end of line, flushes buffer
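Spelled out as the member-function calls the compiler actually makes (sticking to ints to keep it simple), the chain is nothing more exotic than nested calls evaluated left to right:

#include <iostream>

int main() {
    int i = 1, value = 2;
    std::cout << i << value << '\n';
    std::cout.operator<<(i).operator<<(value);   // the same chain, each call returning std::cout
    std::cout << '\n';
}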

>> No.2025666

>>2023527
I.am.a.java(fag());
java.fag() = new faggot;
I.like.dicks(my.assholes());

C is the shit, not every fucking problem fits OO's little bullshit cookie cutter class system. Java is for faggots who like slow programs who have their shitty JVM do everything for them and don't understand how it works, and worst, can't fix it or mess with it if they want to. C++ is kinda leet, but MS's version is faggot with all their shit, C# is like whatever, kinda like VB but not as faggot, but it's still MS only, (although idk there's a way to write C# for linux or something similar IIRC) but eh whatever.

C remains the most popular and useful language for everything low level and crucial. Kernels, embedded systems, drivers, speed critical programs, you name it, you can write it in C. ANSI C is portable and not bullshit bloated, and sure it doesn't have you jewish fancy classes and shit, but fuck it, the argument rages on.

Also, every platform has a c compiler, every decent platform anyways, not proprietary bullshit closed stuff. Hell, even TI's calculators have a C compiler. If you want to write code that will work everywhere (fuck you java fags) and on everything, C is where it is at. the GCC is the best and most optimized compiler out there.

Also, what other language can you write cool obfuscated code in?

>> No.2025684

>>2025666
I love C as much as the next person, but OO focused languages like Java and C++ are a necessary evil.

>> No.2025818

Hate any flavour of C (C++, C#). But I like Objective-C. Love the loose typing.

>> No.2025828

>>2023582
Well if that's the case, you're just too stupid to use the C family.

>> No.2025842

>>2023527
Basically boils down to this. Python is best as the most expressive scripting language and has an interpreter. However, it is a bit slow, so often C++ (or assembly) is used to optimise crucial pieces of code which are then called by Python.

Java is neither as fast as C++ nor as flexible as Python. It does have the advantage of being cross-platform, but it is ugly and its APIs are horrible.

In sum, use Java if cross-platform accessibility is important to you. Otherwise learn Python, and possibly C++ if you do CPU-intensive programming.

>> No.2025857

Java is the worst programming language in the world. I'm a scientist, not some sort of web-apps developer.

>> No.2025877

>>2025666
>idk there's a way to write C# for linux or something similar IIRC
There is. C and C++ depending on the situation are still master race though.

>> No.2026126

>>2025842
>python
>a bit slow

It's pretty fast to me. It's easy as pie to write code in C and plug it into Python, making it even faster on computationally demanding problems.

Java... the "cross-platform" advantage is no better than any other modern language's. Don't choose it because it's cross-platform; 99% of targets are Linux/Unix/Windows, which pretty much every major language supports.

>> No.2026542

D is the best language, easily. But Walter Bright acts like a total fuckwad when it comes to deploying it.

>> No.2026643

Anybody who says you won't find a job after graduating with a degree in CS is lying to you. They most likely dropped out of school because calc 2 was hard or switched to psychology.

>> No.2026656

>>2024591
I mean, we can talk about alternative syntax, but you just defended var-args, an absolute bane and devil of good code. Type safety good. Var-args bullshit bad.
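Concretely, the varargs problem is that the format string and the arguments are only connected by convention, while the stream overloads are resolved per type by the compiler. A small sketch:

#include <cstdio>
#include <iostream>
#include <string>

int main() {
    std::string name = "world";

    // Varargs: the %s promises a C string. Passing the std::string itself is
    // undefined behaviour; the compiler may warn, or may not.
    // std::printf("hello %s\n", name);
    std::printf("hello %s\n", name.c_str());   // you have to remember the conversion

    // Overload resolution picks the std::string overload; a type mismatch
    // simply fails to compile instead of blowing up at runtime.
    std::cout << "hello " << name << '\n';
}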

>> No.2026668

>>2024790
You've obviously never heard of screen tearing. Please come back after actually having written a GUI.

>> No.2026688

>>2025842
>It does have the advantage of being cross-platform, but it is ugly and its APIs are horrible.
The reason that Java is so popular is largely because of its amazingly large, portable, and useful libraries. You sir are wrong.

>> No.2026701

>>2024894
C++ mostly, with some C and some Java. I make 90k/hr in the industry, straight out of college, from U of M. It's good times.

>> No.2026705

>>2026701
90k/year, sorry.

>> No.2026719

>>2024895
You can get to the quadratic formula without using derivatives

>> No.2026729

>>2025052
/jealous
Say hi to him for me.

>> No.2026734

>>2025088
No no. You decide that the only acceptable level of abstraction is a one to one translation to assembly. I think that's stupid and quite inefficient. I also think that you don't really hold that view even though you profess it.

>> No.2026738

>>2026688
I think you'll find that portable is another word for cross-platform, and your assessment of the Java libraries as large and useful is inaccurate. They are useful if you are a codemonkey for whom performance is not an issue, and they are large if you get a kick out of learning obfuscated APIs. HTML parsing in Java... cool story bro. Love having the interpreter reinstantiate a variable every time I pass it around. Makes coding around the inherent flaws so enjoyable.
>>2026126
>the "cross-platform" advantage is no better than any other modern language's. Don't choose it because it's cross-platform; 99% of targets are Linux/Unix/Windows, which pretty much every major language supports.

You must be joking. Java comes installed on basically everything; you cannot say the same for Python, and obviously not for compiled languages. Accessibility IS the main reason you use Java. Why do you think it's used so much on the web O_o

>> No.2026740

hurr durr abstraction -- assembly is for pussies, I only write in microcode.

>> No.2026744

>>2026738
Technically, C is the language that is available everywhere. Java might be better than, say, Python, but C reigns supreme in this regard. gcc is ported to everything (and consequently we get C++ damn near everywhere too).

>> No.2026746

>>2026738
Ok sir. You and I have a different opinion on the facts. I still posit that the Java libraries are amazingly vast and useful for those who don't have severe performance requirements but do have coding deadlines.

>> No.2026760

>>2026744
I think you're confused. First of all, if you are prepared to build everything locally every time you want to use it, then the point of portability is moot. I mean you still have to download the C compiler and linker, just as you could download Python or anything else. What I (and everyone else) refer to when I say portability is the idea of 'write once, execute anywhere'.

>>2026746
It sounds like you are very proficient with Java and its APIs, which is great. I do, however, think you are guilty of retrospectively calling the language wonderful after having invested the time to learn it. Out of curiosity, how many other languages do you program in seriously?

>> No.2026773

>>2026760
Same dude here. I do mostly C++ and Java, more of the C++ honestly, though my shop is mostly Java.

There's portable in terms of "compile once, execute anywhere", and "write the code once, execute anywhere". In both senses, the Java libraries are portable. In the second (and more useful sense), standard conforming C programs are portable.

>> No.2026793

>>2026773
Same dude.

I have to go to bed now. Sorry. Night sir.

>> No.2026796

>>2026793
Goodnight sweet prince

>> No.2026809

>>2026738
>Why do you think it's used so much on the web
Hype. Once it catches on (builds up a lot of fanbase), it'll never go away.

Same thing happened to Ruby/PHP. People keep using them because everyone else is using them.

Mini-rant: that reminds me of one thing. Why the fuck everybody keeps using Perl for bioinformatics is beyond my comprehension.

>> No.2026817

>>2026809
Java is used so much because it is great at what it is used for. If you have a group of developers -- some of whom will inevitably be mediocre -- all working on the same codebase, I'd rather that codebase be in Java than anything else, as it forces developers to use a disciplined common OO methodology.

>> No.2027000
File: 220 KB, 640x422, Two_women_operating_ENIAC.gif

fuck languages, you pussies, I program by twirling knobs and unplugging cables

>> No.2027022

>>2024346
Real men use a wire to short processor pins and write exe files directly into memory this way.

>> No.2027023

>>2027022
Real men just work everything out with a pencil and paper.

>> No.2027028

stay away from .NET it's gay

>> No.2027036

1) lisp family
2) ML group
3) assembly
4) Python
...
99) C
...
9999999999) C++

You can consider this in terms of "best syntax," or you could consider it in terms of "order to learn languages", either way it works.

>> No.2027044

>>2027036
>lisp before ML
parentheses fetish detected

>> No.2027047

>>2027044
macros, otherwise it'd definitely be the other way

>> No.2027078

>>2027036
3) Haskell

fixed that for ya !

>> No.2027095

>>2027078
Maybe it isn't appropriate, but I consider Haskell part of the ML group

>> No.2027641

c++ is definitely noob, i hate programming in that shit

>> No.2027871
File: 100 KB, 600x955, forth cover.jpg

>C++ vs. Java syntax arguments everywhere

laughing_forth_programmers.jpg

>> No.2028059

>>2025299
Never program in LISP then, + can fucking mean - if you want to.

Operator overloading is a good thing.

>> No.2028067

>>2027871
I found a Forth book in an old storage room near our robotics lab called "Going Forth"
I think it was from '83 or something like that

>> No.2028094

>>2028059
Shadowing a name in a lexically scoped language is a very, very far cry from operator overloading.

>> No.2028111

Java is strange. Also a bit slow to program in, for me at least.

C++ syntax, on the other hand, is pure logic. Again, for me at least.

>> No.2028130

holy shit this thread is still here

all of you are a bunch of no good college failures destined to be code monkeys anyway

>> No.2028225

I like java...

>> No.2028244

C++ and Java are for pussies. Machine code ftw

>> No.2028948

What are robots programmed in?

>> No.2029635

>>2028948

Usually Intercal, I think.

>> No.2029643

>>2028244
machine code is for faggots
real programmers input binary instructions directly into the RAM using very tiny magnets

>> No.2029664
File: 96 KB, 640x485, 1268350315743.jpg

>>2029643
lold