Is this an antiquated tool, relevant today only for intellectual amusement?

who cares about the minuscule performance gains when modern hardware is nearly inexhaustible?



>who cares about the minuscule performance gains when modern hardware is nearly inexhaustible?
People who write non-toy programs. Modern hardware is completely insufficient for many non-trivial tasks, so the performance gain of C over some scripting language really matters.

>modern hardware is nearly inexhaustible
Then why can't I play WoW on my laptop?

>who cares about the minuscule performance gains when modern hardware is nearly inexhaustible?
You still need low-level control of said hardware, which is where C excels. Think of it as a portable, human-readable assembly.

But C is less and less an accurate representation of hardware.
Just think of caches, which are enormously important.

Enormously important, but also not exposed to the developer. You can't manipulate the cache directly in any language. Why are you mentioning it in every thread?

>But C is less and less of an accurate representation of hardware.
It's not supposed to be a representation of hardware, it is a high-level language after all.

>Just think of Caches
You can optimise your C code for cache locality, what the fuck are you on about? C programmers do this all the time.
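To make that concrete, here is a minimal sketch (sum_row_major and sum_col_major are made-up names): C stores 2-D arrays row-major, so traversing in row order uses every byte of each fetched cache line, while column order lands on a different line on almost every access.

```c
#include <stddef.h>

#define ROWS 1024
#define COLS 1024

/* Row-major traversal: consecutive accesses hit consecutive addresses,
 * so each cache line is fully used before it is evicted. */
long sum_row_major(int m[ROWS][COLS])
{
    long sum = 0;
    for (size_t i = 0; i < ROWS; ++i)
        for (size_t j = 0; j < COLS; ++j)
            sum += m[i][j];
    return sum;
}

/* Column-major traversal of the same data: every access jumps
 * COLS * sizeof(int) bytes, landing on a different cache line. */
long sum_col_major(int m[ROWS][COLS])
{
    long sum = 0;
    for (size_t j = 0; j < COLS; ++j)
        for (size_t i = 0; i < ROWS; ++i)
            sum += m[i][j];
    return sum;
}
```

Both compute the same sum; once the array no longer fits in cache, the row-major version is typically several times faster on ordinary hardware.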

>Why are you mentioning it every thread.
First time I ever did.

But the point is that C is not "how a computer really works".

>It's not supposed to be a representation of hardware, it is a high-level language after all.
My point.

>You can optimise your C code for cache locality, what the fuck are you on about? C programmers do this all the time.
Yes, but it isn't modelled in C and that is probably perfectly fine.

>My point.
Although, you could probably use architecture-specific intrinsics to prefetch cache lines or flush the cache.
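A sketch of what that looks like (assumes GCC/Clang on x86-64; __builtin_prefetch is a compiler extension, _mm_clflush and _mm_mfence are SSE2 intrinsics; sum_with_prefetch and evict_line are made-up names):

```c
#include <stddef.h>
#include <emmintrin.h> /* _mm_clflush, _mm_mfence (SSE2) */

/* Sum an array while hinting the hardware to start fetching a few
 * cache lines ahead. The prefetch is only a hint: the code stays
 * correct (just possibly slower) if the CPU ignores it. */
long sum_with_prefetch(const int *p, size_t n)
{
    long sum = 0;
    for (size_t i = 0; i < n; ++i) {
        if (i + 16 < n)
            __builtin_prefetch(&p[i + 16], 0 /* read */, 3 /* keep in cache */);
        sum += p[i];
    }
    return sum;
}

/* Explicitly evict the cache line containing addr, e.g. before some
 * device reads that memory behind the CPU's back. */
void evict_line(const void *addr)
{
    _mm_clflush(addr);
    _mm_mfence(); /* order the flush against later memory accesses */
}
```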

>Yes, but it isn't modelled in C and that is probably perfectly fine.
Yes. But I don't think that was OP's point.

I'm not arguing that C is the best thing since sliced bread; it clearly has its shortcomings. But C is very simple and allows implementation-defined behaviour, so the way it compiles down to machine code is very predictable and understandable, which in turn makes hardware-specific optimisations relatively trivial compared to languages with more complex runtimes (such as interpreted languages).

I wasn't really arguing with OP, but I completely agree C is a good abstraction of the hardware which allows fast implementations.

>who cares about the minuscule performance gains when modern hardware is nearly inexhaustible?

That's why people don't program in C unless they're doing embedded, OS, or other really performance-critical work (e.g. games).


We've long since reached the point where the compiler outperforms any human optimization. Maybe you can construct a scenario where micro-optimization is really necessary, but generally, if you don't do really stupid things, the compiler will take care of the rest.


C is a lot of fun. You will learn a great deal about programming simply by playing with different data structures (e.g. a doubly linked list), signal handling, system calls (e.g. fork), POSIX threads, and writing your own UDP sockets; and of course there are plenty of libraries for special tasks. Also, makefiles are great.
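For example, the doubly linked list mentioned above fits in a few lines (a minimal sketch with only push-front and free; node, push_front, and list_free are made-up names):

```c
#include <stdlib.h>

struct node {
    int value;
    struct node *prev, *next;
};

/* Insert a new node at the front; returns the new head,
 * or NULL if allocation fails. */
struct node *push_front(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    if (!n)
        return NULL;
    n->value = value;
    n->prev = NULL;
    n->next = head;
    if (head)
        head->prev = n;
    return n;
}

/* Walk forward from the head, freeing every node. */
void list_free(struct node *head)
{
    while (head) {
        struct node *next = head->next;
        free(head);
        head = next;
    }
}
```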


But unfortunately it has so many quirks and functions that are considered "deprecated" by many coding standards. Even K&R's own code wouldn't pass most guidelines today.

Basically, anyone should look into assembly before looking at C, because (like this guy wrote: ) it's essentially one big macro system for ASM.

#include <stdio.h>
int main(void)
{
    int i = 1;
    int arr[3] = {11, 22, 33};
    /* arr[i] is defined as *(arr + i), and addition commutes, so i[arr] == arr[i] */
    printf("OMG what's that: %d\n", i[arr]); /* prints 22 */
}

>like this guy wrote
FYI, I am the same guy who wrote both the post you're responding to and the post you're referring to.

>That's why people don't program in C unless they're doing embedded, OS, or other really performance-critical work (e.g. games).
And device drivers.
Also, games are not the best example of performance-critical systems; you could easily mention things like ffmpeg.

>modern hardware is nearly inexhaustible
Webshitters should be burned alive.

People who write boomer code for hardware from 10 years ago, as if they're really stuck on 256 MB of RAM for some reason, even though you could buy brand-new modern hardware for the entire company for the price of one developer.

C code is the most readable when written correctly.
>antiquated
like wine? the last standard was 2018, so I guess not quite
>minuscule performance
they add up, so they're never minuscule in the end

>FYI, I am the same guy who wrote both the post you're responding to and the post you're referring to

OK, but I didn't disagree with you anyway, so...

Also yeah, games are more of a C++ thing I guess, but my point was that C development is not that common today and is often tied to the hardware industry.


He does have a point though.
Dev time is way more expensive than runtime, at least from a management point of view.

>but my point was that C development is not that common today and is often tied to the hardware industry.
Yeah, although in my experience C is actually used a lot in large-scale FOSS projects. My working theory is that it's more common to find contributors who know C than C++, and it's probably easier to enforce very strict coding guidelines when the language itself doesn't allow a lot of different things.

But I guess in general, it's not so often used anymore other than to maintain legacy software or to do hardware related stuff.

Anything using SIMD is still mostly hand-written in assembly; C simply doesn't have language constructs that translate efficiently into it.

>Anything using SIMD is still mostly hand-written in assembly,
>C simply doesn't have language constructs that translate efficiently into it.
Wrong, it's mostly using compiler intrinsics.

#include <stddef.h>
#include <stdint.h>
#include <tmmintrin.h> /* SSE intrinsics; _mm_shuffle_epi8 needs SSSE3 */

/* Rotate a 16-byte block right by one byte with a single shuffle. */
static void rshift_fast(uint8_t* p)
{
    __m128i in = _mm_load_si128((__m128i*) p); /* p must be 16-byte aligned */
    /* Output byte k takes input byte shift[k], so byte 15 wraps to position 0. */
    __m128i shift = _mm_setr_epi8(15, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14);
    __m128i out = _mm_shuffle_epi8(in, shift);
    _mm_store_si128((__m128i*) p, out);
}

/* The same rotation, one byte at a time. */
static void rshift_slow(uint8_t* p)
{
    size_t i;
    uint8_t last = p[15];

    for (i = 15; i > 0; --i)
    {
        p[i] = p[i - 1];
    }
    p[0] = last;
}

That's when you use libraries written in C, C++, or Fortran.

All things being equal, if your platform supports it, C++ is almost always the better choice. It provides a lot more functionality while maintaining top-notch performance.
The only real reasons to use C right now are its simpler, de facto standard ABI and the fact that C compilers are more widespread and target more architectures.
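That ABI point is why a plain C header is still the lingua franca for cross-language interop. A minimal sketch (mylib_add is a hypothetical function): the extern "C" guard keeps the symbol unmangled when the same header is consumed from C++.

```c
/* Hypothetical library interface: plain C, but guarded so that a C++
 * translation unit sees the same unmangled symbol name. */
#ifdef __cplusplus
extern "C" {
#endif

int mylib_add(int a, int b);

#ifdef __cplusplus
}
#endif

/* Implementation (normally in its own .c file). */
int mylib_add(int a, int b)
{
    return a + b;
}
```

Rust's extern "C" blocks, Python's ctypes, and most other FFIs target exactly this kind of interface.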

rekt

The Rust language itself offers nothing new and is poorly implemented. So what's left besides marketing wank and a community full of shills who harass everyone?

That was the main argument for Ruby and Python for years; one is dead and the other is used as a scripting language...
The truth is that at scale every performance gain counts, and C can give you that control. Enjoy having to pay $12 a month to host a site while I can do it for $1.

>next generation proprietary memory allocation techniques.
STRINGS, VECTORS, STL, BOOST
all this shit is slow as f-ck
INSTEAD OF USING NAMESPACES THEY CREATE MUH OBJECTS
then use 'em to create memory leaks everywhere

You do web dev in C?
That makes no fucking sense buddy.

If you want web performance you use Node, if you want super web performance you use Go or Elixir.

The whole purpose of bulky, heavy high level languages is that you can focus 100% on the product, get it out quick and have the least friction while doing it. You need users for your "servers" to matter in the first place, and unless you're hitting a point where you have a user hitting your DB every 50ms, you can host it on some shitty $5 DO droplet forever.

People (even in JS) are fucking stupid and waste their weeks away reinventing the wheel to save 0.05ms when the main bottleneck will be their DB anyway. It's fucking dumb.

better than c++

>That makes no fucking sense buddy.
en.wikipedia.org/wiki/FastCGI

Nothing can beat a long-running FastCGI process.

N
NO

>who cares about the minuscule performance gains
They're not minuscule. C can be an order of magnitude faster on common algorithms. When you optimize a solution to take advantage of C it can be 2-3 orders of magnitude faster.

"hurr durr who cares computers are like so fast now!" is why we have web pages that consume 25-30% of a multi-billion instruction per second CPU just to display a fucking ad. Not to mention text editors that use 13% just to blink a cursor.

We've blown all of our hardware speed gains. I realized that the other day while playing with a late-2000s notebook I never sold. By every hardware measure my current notebook is faster, yet common apps run faster on the old one.

You mean except... you know... actually putting a product out at a reasonable pace?

>modern hardware is nearly inexhaustible?
Yeah, that's why your phone now needs 8 GB of RAM and a laptop-grade CPU just to browse Instagram.

You know that if you have to invest even one extra hour per month of dev time on your web app, this hypothetical financial advantage is void, right?

>hurr durr i can't kode without muh high levels language
There were productivity gains through the early 2000s. Since then we've lost productivity because of:
* Meme language shit.
* Over 9,000 constantly changing frameworks that no one masters.
* Muh trendy design paradigm of the day which changes nothing but usually introduces another layer of performance degrading abstraction.
* Severe performance issues with our tools. Visual Studio is probably the worst offender with half day installs/updates and shit slow compiling on large projects. (Slow down edit/compile/debug and slow down the entire project.) There's also usually 20 ways to do something in VS but nobody can agree which is best and choosing path X may mean you can't reuse code in project Y.

Honestly, the proliferation of languages and frameworks and the incredibly fragmented world they've produced sucks more out of productivity than writing in assembler. If you could fork our timeline around the late 1970s and simulate a world where there was only C BUT everyone worked in C and everyone studied the appropriate design patterns for C and all libraries were usable because they were all in C you would have a world where projects were completed in half the time. Using fucking C.

We talk about productivity all the time and then do shit which absolutely wrecks reusability of both code and knowledge, and therefore productivity.

Which doesn't matter unless you write small programs; the more trivial your program, the less you care about the interpreter's overhead. But for complex programs where you need the performance, it's simply not viable.

>We've long since reached the point where the compiler outperforms any human optimization. Maybe you can construct a scenario where micro-optimization is really necessary, but generally, if you don't do really stupid things, the compiler will take care of the rest.
Why do you think I would disagree with this?

Ask yourself: which software projects started this decade have chosen C? It's a dinosaur language, only used for maintaining legacy codebases.

hurf durf everyun shuld use java!1!
old = bad
c = old
c = bad
me smart! java minecrafT!


>modern hardware is nearly inexhaustible
>it literally takes 30 (thirty) seconds to open the start menu in Windows 10 [spoiler]not even joking, literally measured it with a stopwatch[/spoiler]

kill yourself op

We should gas all the San Fran "developers", round up everyone involved in creating XP into a work camp and force them to make a 64bit windows XP at gunpoint.

'CAUSE C PROGRAMMERS ARE TOO RICH AND HAPPY TO DO ANYTHING


>people say C is bad (which is true)
>faggot automatically assumes those people want to replace it with Java
>and then calls others brainlets

No, the best fork would be one where C never existed in the first place.
We could have Lisp machines and secure software, but thanks to C, we don't.

That's 99% of Cniles. Intellectual dishonesty and strawmen at their finest.

C is good if you want to program a glorified PDP-11 simulated on your actual hardware.

>next generation proprietary memory allocation techniques.

because you use gnu/linux with a minimal tiling wm