C is too verbos-

>C is too verbos-
>C is too low-leve-
>C is too unsaf-
C is clean and readable. C standard library functions offer high-level interfaces for pretty much anything, from argument parsing to string-integer conversion. Dynamic arrays are literally one call to malloc(). C is safe provided you check return values and do adequate bounds checking. For example, my sieve here doesn't check if input is negative, but it doesn't matter, because a negative simply wraps around to a huge value in an unsigned type - it's a feature in this case.
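That wraparound isn't luck, it's the C conversion rule: a negative value converted to an N-bit unsigned type is reduced modulo 2**N. A quick sketch of the rule in Python (the helper name to_unsigned is mine, purely for illustration):

```python
def to_unsigned(value, bits):
    # C defines conversion of any integer to an N-bit unsigned type as
    # reduction modulo 2**N; Python's % on a negative int matches that.
    return value % (2 ** bits)
```

So to_unsigned(-1, 32) gives 4294967295, which is why a negative LIMIT just becomes an enormous sieve bound rather than undefined behavior.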

Comparatively, Python is excessively verbose due to gimmick abstractions, and requires convoluted try-except exception handling to use safely. Python needs just as many library calls to accomplish the equivalent task. Setting up a dynamic char array takes as much work as in C. Python conditionals are so slow that you're forced to introduce another logic branch, because computing and checking j every iteration adds too much overhead. This ends up requiring more code than the C sieve - you use an interpreted language to reduce boilerplate, not introduce it.
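To make the try-except point concrete, this is roughly what the safe equivalent of the C sscanf("%zu") check looks like in Python (a sketch; parse_limit and the message are my own naming):

```python
import sys

def parse_limit(arg):
    # Mirrors the C sscanf("%zu") + usage() path: anything that isn't a
    # non-negative integer means bailing out with a usage message.
    try:
        limit = int(arg)
    except ValueError:
        sys.exit("usage: sieve LIMIT")
    if limit < 0:
        sys.exit("usage: sieve LIMIT")
    return limit
```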

The Python interpreter relies heavily on the underlying system, and as such you can see that 8ms of script runtime was spent waiting on the kernel, not even executing the code.
It almost takes longer just to start the python interpreter than it does to compile the C code and compute and output the first 2 million primes:
$ time python3 -c ''

real 0m0.018s
user 0m0.017s
sys 0m0.000s
$ time cc sieve3.c && ./a.out 2000000 >/dev/null

real 0m0.035s
user 0m0.026s
sys 0m0.014s

Attached: Screenshot_2018-07-31_12-42-16.png (1180x874, 111K)

Other urls found in this thread:

thegeekstuff.com/2013/06/buffer-overflow/
en.wikipedia.org/wiki/Wirth's_law
support.hdfgroup.org/HDF5/doc/HL/RM_H5TB.html
twitter.com/SFWRedditImages

Jesus. C is so powerful. The only problem is that the standard library lacks containers.

C is nice and all but if I'm making advanced software that does a lot, then I will need the features C++ provides.

yeah it's fine for doing small toy programs like this, but if you need to do serious work python is a lot more suitable

>all that whitespace/tabbing in the C program

fucking hell.

Python wasn't made to compete in the same domain as C and the fact it wasn't doesn't excuse C from being as terrible as it is. Either your screenshot is contrived to support your post or you're just genuinely bad at Python.

>it's safe as long as you do all this stuff manually!

This.

Attached: 1526661665270.jpg (475x363, 40K)

Name one "serious" Python project, then.
It's only 6 characters of whitespace at the longest.
>Python wasn't made to compete in the same domain as C
Python literally relies on C for its standard library performance. Python integrated ctypes into the standard library because Python devs needed to start supplementing their codebase with C modules for performance needs.
>you're just genuinely bad at Python
That sieve is optimized for Python, though. It even masks to odd numbers to halve the array size, which actually improves performance with Python range iterators, because range(1, n, 2) is slower than range(1, n//2, 1) despite the same number of iterations.
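For reference, the odd-mask trick reads roughly like this (a sketch, not the exact script from the screenshot; odd_sieve_sum is my name for it) - index i stands for the odd number 2*i + 1, so the table is half the size and even numbers never enter it:

```python
def odd_sieve_sum(limit):
    # Sum of primes strictly below limit, sieving odd numbers only.
    if limit < 3:
        return 0
    size = limit // 2            # number of odd candidates below limit
    comp = bytearray(size)       # comp[i] == 1 -> 2*i + 1 is composite
    comp[0] = 1                  # 1 is not prime
    total = 2                    # 2 is the only even prime
    for i in range(1, size):
        if comp[i]:
            continue
        p = 2 * i + 1
        total += p
        # first useful multiple is p*p, which sits at index (p*p - 1) // 2
        for j in range((p * p - 1) // 2, size, p):
            comp[j] = 1
    return total
```

Summing below 2000000 gives 142913828922, the same total as the awk pipeline later in the thread.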

C and python are both fine, neither is worth complaining about.

wtf is yield
wtf is //

Attached: 1532208541406.jpg (760x430, 160K)

>this gun is safe as long as you don't point it at your own body!

And?

If you lean forwards on a motorbike, you'll crash. Isn't that crazy? Don't do that.

A couple of days ago Jow Forums converted me from C++ to C, started working on a game engine and I have to say I love it, the code is just so beautiful. Definitely going to learn it in depth.
I'm still going to study Python in parallel; it might be slower and maybe uglier, but it's still in demand on the job market, and knowing more is better than less. Also it's quite useful for simple scripts integrated into my WM.

>write python like it's C
>complain that it's ugly
That is some amigara fault shit.

Here's normal ugly python
sieve = lambda limit: reduce( (lambda table, i: (table.difference_update(range(i**2,limit,i)) or table) if (i in table) else table), range(2,limit), set(range(2,limit)))
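Unrolled into a named function (my naming), the same trick reads like this - note that on Python 3, reduce itself has to be imported from functools, so the one-liner as posted needs that import too:

```python
def sieve_readable(limit):
    # Same algorithm as the reduce one-liner: start from the full candidate
    # set and discard the multiples of every i that is still in the table.
    table = set(range(2, limit))
    for i in range(2, limit):
        if i in table:
            table.difference_update(range(i ** 2, limit, i))
    return table
```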

it's not python's fault you suck at programming and writing readable elegant code

Oh boy, another amateur programmer who thinks it's easy to write safe code in C. Tell us the secret that so many other professionals have failed to grasp

>write python like it's C
But I wrote C like it's Python. I went in expecting it to be excessively verbose, but found I could copy it almost verbatim while reducing LoC. It's ugly because Python is slow, and the cleanest form is rarely the most efficient.

>Python literally relies on C for its standard library performance.

This is agreeing with what you quoted. Python is a scripting language and wasn't made for performance intensive work. It'd be bizarre if this was the first time you've heard this. Python integrates ctypes for FFI like most interpreted languages, not because it's trying to replicate C's performance characteristics. The Pythonic way of getting C's performance is still to import C modules.
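To illustrate the ctypes point: on Linux, CDLL(None) hands you the symbols already loaded into the running process, libc included, so C functions are callable straight from the interpreter (a Linux-specific sketch):

```python
import ctypes

# On Linux, CDLL(None) exposes the current process's symbols, which
# include libc; declare strlen's signature and call it directly.
libc = ctypes.CDLL(None)
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]
length = libc.strlen(b"hello")
```

That's FFI for gluing to existing C code, not a claim that the interpreter itself runs at C speed.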

Also, optimized or not, nothing is stopping you from dropping the separate generator function, whitespace lines, and comments, and replicating your rightward nesting in Python. The left-hand example is sloppily written to make Python look wordy.

yield turns a function into a generator, which keeps its state between calls and hands back the yielded value each time it's resumed
// floors to an integer, / always returns a float
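A minimal demo of both, since the question keeps coming up (toy names, obviously):

```python
def countdown(n):
    # A generator: execution freezes at each yield and picks up again on
    # the next next() call, so the local n persists between calls.
    while n > 0:
        yield n
        n -= 1

gen = countdown(3)
first = next(gen)    # 3
second = next(gen)   # 2

floor_div = 7 // 2   # 3: floor division, stays an int for int operands
true_div = 7 / 2     # 3.5: true division, always a float
neg_div = -7 // 2    # -4: // floors, it doesn't truncate toward zero
```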

be careful and don't let managers interfere
remember that 90% of what you're using now is c. go ask linus how they do it

#include <stdio.h>
#include <stdlib.h>
#include <err.h>

void usage(char *err)
{
if (err != NULL)
fprintf(stderr, "Error: %s\n", err);
fprintf(stderr, "usage: sieve LIMIT\n");
exit(1);
}

int main(int argc, char *argv[])
{
char *a;
size_t limit, i, j;
size_t sum = 0;

if (argc != 2)
usage("no limit given");
if (sscanf(argv[1], "%zu", &limit) != 1)
usage("limit must be positive integer");

if ((a = calloc(limit, sizeof(char))) == NULL)
err(1, "calloc");
a[0] = 1; a[1] = 1;

for (i = 0; i < limit; i++) {
if (a[i] == 1)
continue;
sum += i;
for (j = i*i; j < limit; j += i)
a[j] = 1;
}
printf("%zu\n", sum);
}

I just hate unnecessary nesting and sort error-handling elses so much

>code without brackets
it's almost like you want readability and correctness to depend on something as fickle, insignificant and third-class-citizen as text formatting

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>

int main(int argc, char** argv) {
long limit = strtol(argv[1], NULL, 10);
long bounds = sqrt(limit);
char* sieve = malloc((limit + 1) * sizeof(char));
memset(sieve, 1, limit + 1);

sieve[0] = 0;
sieve[1] = 0;

for(int i = 2; i <= bounds; i++)
if(sieve[i])
for(long j = (long)i * i; j <= limit; j += i)
sieve[j] = 0;

long sum = 0;
for(long i = 2; i < limit; i++)
if(sieve[i])
sum += i;
printf("%ld\n", sum);
free(sieve);
return 0;
}

Attached: 1531885770910.jpg (598x900, 75K)

*competent
Not checking for error codes is a bad coding practice.

Don't need to check for errors because my code is perfect.

I mean, that's all nice and shiny, but now do something like a linear interpolation of a 3d-field with an additional time-dependence dimension. And read that data from an HDF5 file, please.

Not sure what you learn from such a highly trivial little exercise as yours; might as well do a 'hello world' in C and Python and learn nothing.

Yes, Python isn't being developed and benchmarked alongside C like, say, Zig is, but all languages indirectly compete with each other by offering something the others don't.
>nothing is stopping you
It is, because then I'd have to output to stdout, which is far slower than summing internally.
>generator function
Drop core Python features and make it even more useless?
>whitespace
Mandated by PEP8.

But let's see:
$ time python3 sieve.py 2000000 | awk '{ sum += $1} END {printf "%4.f\n", sum}'
142913828922

real 0m0.323s
user 0m0.336s
sys 0m0.011s
$ time python3 sieve.py 2000000 >/dev/null

real 0m0.335s
user 0m0.335s
sys 0m0.000s
Runtime increases by 44%, and code is no longer usable internally.

Attached: Screenshot_2018-07-31_15-14-54.png (590x617, 44K)

This is why no one likes NEET programmers. Obnoxious as shit.

**

Attached: 516R4ZoMqBL._SX402_BO1,204,203,200_.jpg (404x500, 35K)

>assigning 0 and 1 to false instead of just starting the loop from 2
>testing even numbers instead of incrementing by steps of 2 from 3
>removing the mask that halves the required array size (and removes even numbers), thus requiring twice as much memory (try sieving 2147483648 (2GiB))
>sizeof char over sizeof *arr (requires two LoC to change if type is changed, requires more maintenance)
And off by one to boot:
$ ./your-sieve 5
5
$ ./your-sieve 6
10
Embarrassing

A language's feature set directly determines how concisely and elegantly you can express your solution.

Attached: 1529142241230.png (271x369, 96K)

>Name one "serious" Python project, then

Attached: lqb.jpg (311x313, 8K)

C is literally the greatest language ever created
It has exactly what you need to solve any problem, no more and no less

Attached: Ai_Enma.png (1920x1080, 2.77M)

Why did she piss herself?

They both have their uses. Try doing something like web scraping in C.

libcurl is written in C.

>Name one "serious" Python project, then
the automated tests I made for my company that use a neural network and OCR to find and click buttons on the GUI served by the DUT

Actually I'm quite curious about that CLI tool. Never seen one that colorizes the output based on the syntax. Kinda looks like more or cat, what's its name?

>if you ride a bicycle in the middle of a highway, you'll get hit by a car

Blender? Half of the GNU libraries?

Install any modern Linux distribution and type python -V

>programming in python like a retarded nigger monkey
>woooooow it sucks

(OP)
Oh yeah. If you want speed. Compile with cython or use pypy

That's a text editor (vim).

Attached: 1431181213056.jpg (1846x1212, 244K)

It’s obviously possible, but I don’t see why I would use C and write 200 lines while relying on multiple external libraries, when I can accomplish the same thing with 10 or 20 lines of Python using its built-in libraries.

>writing a prime finding algorithm to create a unix-style utility program in Python
Hey op, which is more concise and readable, your version or mine?
from sympy import sieve
Amazing.
Use the right tool for the job, fucking dipshit. Sorry most of us are solving problems that are at a greater level of abstraction than an 'Intro to Data Structures and Algorithms 100' course lmao.

GNU's preferred scripting language is Guile, doubt they're using very much Python.

I thought it would have been perl

>C is safe provided
>provided
aaaand you provided unfettered access for the whole world to read your memory.

Attached: 1303-and-itsgone.jpg (960x540, 75K)

What exactly are you trying to say here?

thegeekstuff.com/2013/06/buffer-overflow/

The year is 2018 and you're still having buffer overflow issues?

>// floors to an integer, / returns a float
Huh. TIL.

>hurr durr look how much better C is on my toy example

t. never worked as a professional dev
the only goal of software development is to have code execute as fast as possible

Attached: 1523614125988.png (251x234, 142K)

>the linux kernel is a toy

>char *x
>sizeof(*x)
topkek

Only with C.

Attached: reece.jpg (600x494, 29K)

>Using the performance on the Sieve of Eratosthenes (a toy example) as justification for why one language is better than another.

Attached: 1500416458611.jpg (207x233, 5K)

>a thread in which OP gets BTFO
%timeit sum(sieve.primerange(0, 2000000))
35.8 ms ± 312 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)
fucking kek. and that's on my shit computer.

Attached: 65041350432143.jpg (1920x816, 45K)

Everyone knows that modern scripting languages such as python are shit in terms of performance. The point is that modern computers are so fast that the hit in performance doesn't matter, and the benefits of the ease of use of modern languages far outweigh the minor speed increase that you can get with languages like C.

>C standard library functions offer high-level interfaces for pretty much anything, from argument parsing to string integer conversion
False. Get back to me when C adds standardized cross-platform filesystem and system time utilities.

>The point is that modern computers are so fast that the hit in performance doesn't matter

That mindset is the reason why modern software is so bloated and slow. Many programmers don't care about optimization and performance, expecting fast hardware to take care of everything. It's a nonstop arms-race between lazy programmers and hardware innovation.

en.wikipedia.org/wiki/Wirth's_law

std::filesystem
std::chrono
idiot

I said C, not C++. Fucking retard.

Here's a challenge for you to do in C if you think it's just as expressive as Python.
Find the top 10 posts with the most (You)s given a Jow Forums thread number. No one has ever been able to do this in C.

no u

If you use C, you'll run into memory leaks and double free. Isn't that crazy? Don't do that.

>C++
u wat m8?

fuck off that's so easy
libcurl and a json or xml lib

I agree with you that Python (or even better, Ruby) would be the faster way to accomplish this. However, it's really not as much an issue with C as much as it is with the C community. There is no reason why C cannot have good utilities for parsing the responses from html requests. It's not a limitation of the language. It's just that the C community doesn't care.

That being said, there is nothing stopping someone from writing a C program using libcurl and some random parsing library to accomplish such a task.

If it's so easy then show us how to do it.

I've worked in both government and enterprise commercial software support services.

Python is used pretty extensively for its robust MongoDB driver support as well as the Flask framework. There are quite a few serious projects that rely on this technology alone.

I suspect you never actually *worked* in a serious project, and that this is just pure projection. Believe it or not, in the real world, businesses don't care so much about efficiency. There is a cost-benefit analysis. Python's duck typing and dynamic paradigms make it instrumental because services provided by it are cheap to make. Having everything implemented in C, especially the higher-level stuff, is quite counterproductive. C is used for performance. Performance isn't really an issue for larger projects since much of the stuff is usually designed to scale horizontally anyway.

I literally just told you that a high level language like python or ruby would be better. I'm not going to waste my time writing code just to prove a point to some idiot who can't into C

somehow i wonder how people wrote compilers, operating systems, user interfaces, latex... for decades. Oh right, C.

Because there was nothing better. People have moved on. The Linux kernel is a legacy project. New parts of LLVM are written in C++.

>I agree with you that Python (or even better, Ruby) would be the faster way to accomplish this. However, it's really not as much an issue with assembly as much as it is with the assembly community. There is no reason why assembly cannot have good utilities for parsing the responses from html requests. It's not a limitation of the language. It's just that the assembly community doesn't care.
>That being said, there is nothing stopping someone from writing an assembly program using C and some random parsing library to accomplish such a task.

C is a perfect example of the pervasive white male patriarchy's hold over simpler programming languages like Python or Ruby on Rails.

Its use actively undermines anyone who isn't some privileged rich kid who got to go to a nice university.
Let's make the difference together.
Let's start here first and make a change.

trips of unbridled decimation

Attached: HWYO3136.gif (200x267, 858K)

Are you retarded? Just because it's possible to do something doesn't mean that I am going to or that I am obligated to. If I wanted to be coding right now, I would be working on my C++ n-dimensional rendering engine and not browsing Jow Forums talking to a retard.

They do it by having an enormous pool of the most quality autistic developers submitting code for free, and Linus himself as the supreme dictator of what passes muster. Everything is reviewed by multiple people with a combined hundreds of years of experience, and even then sometimes flaws slip through. Finally, the kernel is tested like a motherfucker by thousands of people. It's fuzzed to death, it's valgrinded. It's hit with a huge quantity of regression tests. It's inspected by governments and Fortune 100 companies.
Each line of code costs an incredible amount of time and effort. Likely in the tens of thousands of dollars worth.

This is the price of high performance with high maintainability, and it's only justifiable because Linux runs on billions of devices. You, well you're going to fuck up, write a cute little buffer overrun or use after free vulnerability into your network facing API, and get pwned, if anyone gives a fuck that is. You'll do it again, and again. History doesn't lie. C is dangerous like a nuclear reactor.

Their point remains. No one has done this with C. That is still the case.

Cool story. I don't see that as being a valid point against C. There are an infinite number of useless programs that do not exist in C.

Did you read this post?

The point is OP is an idiot. He created a situation where it would be moronic to use Python, then used Python anyway and claimed it is a point against Python.

So others create a situation where it would be moronic to use C as a counterexample. Pretty simple.

No one even halfway reasonable is saying C is useless or Python is useless. But the OP and anyone siding with the OP is legitimately retarded.

They're both pretty unreadable, your formatting and whitespace habits are cancer.

>linear interpolation of a 3d-field with an additional time-dependence dimension
float *lerp(float *start, float *end, float *res, unsigned length, float percent)
{
for(unsigned i = 0; i < length; ++i)
res[i] = start[i] + percent * (end[i] - start[i]);
return res;
}

>And read that data from an HDF5 file
support.hdfgroup.org/HDF5/doc/HL/RM_H5TB.html
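For what it's worth, the time-dependence part of that challenge is just one more lerp between two stored time slices; a Python sketch over plain nested lists (names mine; actually reading the file would typically go through a library like h5py, omitted here):

```python
def lerp_time(field_t0, field_t1, t):
    # Blend two 3D scalar fields sampled at consecutive time steps;
    # t is the fraction of the way from the first slice to the second.
    return [[[a + t * (b - a)
              for a, b in zip(row0, row1)]
             for row0, row1 in zip(plane0, plane1)]
            for plane0, plane1 in zip(field_t0, field_t1)]
```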

>libcurl and a json or xml lib
Even that's too much.
Literally just download the html source and find the line with the most >>'s in it.

C shills address this

what font?

>Write shitty code
>Get buffer overflows
>"It's C's fault, not my fault. I was totally not being retarded by going outside the bounds of allocated memory"

since the 1970's they invented something called zero cost abstractions

hours pass and still we wait

C has libraries as well, dipshit

>Name one "serious" Python project, then.
Reddit

>implying all serious projects are open source.

Sounds like you have never worked on a serious project before.

$ time sleep 1 && sleep 2

real 0m1.013s
user 0m0.001s
sys 0m0.000s
$


That's right, boys, C is so good that sleep, a program written in C, takes only one second to sleep for 3 seconds.

retard ctard

C is truly amazing

>wtf why are you taking the training wheels of my bike mom, I could fall down!