The transistor

>the transistor
>cellular telephones
>lasers
>solar cells
>communication satellites
>unix operating system
>c programming language

You are now aware the entire modern world was essentially created by them

Attached: 1018316866.jpg (609x343, 29K)

transistors are a botnet.

What dumb logic. The English discovered most of the principles of physics and mathematics, yet they didn't make shit.

>the transistor
Vacuum tubes are superior
>Cellular telephones
Landlines are good enough
>Lasers
Oh wow cat toys
>Solar cells
How bout a gallon of diesel fuel?
>Communication satellites
Russians did it first
>UNIX operating systems
Windows is better
>C programming language
Just use java or python.

Attached: 1519313117857.png (728x682, 721K)

post baits

Attached: 1533883128662.jpg (680x510, 14K)

>>c programming language
>modern
OH NONONONONONONONONONONO

*NOKIA™ Bell Labs

t. python "coder"

t. forgets the null terminator and then talks down to people not programming in C (actually saw this on Jow Forums)

>0x07E3
>not using C
What are you, a fucking web developer?

See

A null terminator on a uint16_t?

lmao fuck off skiddy

t. forgets the null terminator for strings and then thinks it's an integer

Do tell: if I do what you say and add the ASCII terminator to 0x07E3 so that it's now 0x07E300, what the fuck do you think that should be interpreted as?

Fuck off back to your JS and PHP

>he still thinks I was talking about the integer 0x07E3 and not null termination for strings instead
Holy shit you are obtuse.
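
For anyone actually lost in this slapfight, a toy sketch (mine, not anyone's code from the thread) of the two things being conflated: an integer like 0x07E3 needs no terminator; only a char array holding text does.

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    /* An integer constant: 0x07E3 is just the value 2019, no terminator involved. */
    uint16_t year = 0x07E3;

    /* Text: "07E3" is four characters plus a trailing '\0', so 5 bytes minimum. */
    char year_str[5];
    snprintf(year_str, sizeof year_str, "%04X", (unsigned)year);

    printf("integer: %u  string: \"%s\" (strlen %zu, buffer %zu)\n",
           (unsigned)year, year_str, strlen(year_str), sizeof year_str);
    return 0;
}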

I look forward to exploiting some shitty code you write in the future

>>the transistor
>Vacuum tubes are superior
no
>>Cellular telephones
>Landlines are good enough
no
>>Lasers
>Oh wow cat toys
no
>>Solar cells
>How bout a gallon of diesel fuel?
yeah, but no
>>Communication satellites
>Russians did it first
okay, I'll give you this
>>UNIX operating systems
>Windows is better
no
>>C programming language
>Just use java or python.
no

I rate this 1.5/6

PARC, MIT and Bell Labs are the backbone of a lot of today's computing

>he thinks I wrote the post with the "string" in it
lmao jesus christ, nice attempt at sounding intelligent but you're still a fucking loser

>he thinks I was referring specifically to his post and not him in general
You're the type of person to forget null termination on strings in general, it had nothing to do with the content of your post.

You're still making yourself look like a dipshit lmaooooo

t. forgets the null terminator

t. can only cum to japanese pedophile comics

t. segfaults every time he compiles but at least he's not writing python amirite

Um, sorry sweetie, but most of that was thanks to the Muslims, whose inventions include algebra and the use of Arabic numerals.

>he thinks writing perfect code in one go is even remotely realistic
you're showing your ass again

go back to your python scripts lmaoooooo

Attached: image.jpg (667x488, 46K)

>tacitly admits to writing programs that segfault when he compiles
AHAHAHAHAHAHAHAHAHAHAHAHA *breathes in*

>being afraid of segfaults
You know, just admitting you've never done any real coding is easier

>he thinks segfaults are acceptable at all
sweety... your mental model of your so called "superior" language is broken...

>he thinks segfaults appearing during coding matter
>he doesn't think diagnosing and fixing segfaults is piss easy
The fact that you think a segfault is the be-all end-all of C coding mistakes... just... go back to Python
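
Since nobody in this exchange posts any actual code, here's a minimal sketch of the kind of segfault being argued over and how you'd pin it down; the file name and the gcc/gdb invocation are my assumptions, not anything from the thread.

#include <stdio.h>

int main(void) {
    char *p = NULL;   /* classic bug: pointer never given valid storage */
    *p = 'x';         /* writing through NULL is undefined behavior; on a typical
                         system it dies with SIGSEGV at run time, not at compile time */
    printf("%c\n", *p);
    return 0;
}

/* Typical diagnosis, assuming gcc and gdb:
 *   gcc -g -O0 crash.c -o crash
 *   gdb -ex run -ex backtrace ./crash
 * gdb stops on the faulting line, which is the sense in which
 * "diagnosing and fixing segfaults is piss easy". */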

Yeah but what have they done for me lately?

>he programs by trial and error
oh nononononono

>he thinks he programs
oh nononononono

What about Xerox PARC?

>laser printers
>ethernet
>graphical user interface
>OOP
>WYSIWYG

They killed computer science development for 50 years by creating C and Unix; not sure why Jow Forums and Suckless fags wank over them.

>he's still SEETHING over writing shitty vulnerable c code and thinking that makes him better than a python programmer who produces secure code
lol

I know this is bait but I'll bite:

People who think this shit in real life are insufferable faggots who have zero concept of how much abstraction there already is between them programming in "real" languages and bare metal. The amount of abstraction achieved by going from logic gates to C is fucking astronomical, and you could probably never manage to write a C compiler from bare metal in your entire life if it weren't for hundreds of years of mathematicians working all this shit out for you. Comparatively, the jump from C to Python is minuscule in the grand scheme of things. Sure, it might abstract away a few data structures and make things nicer to use, but that's nothing compared to carefully organizing bits of sand in such a way that they perform actual computation.

Computer science and abstraction are joined at the hip. Losers who can't understand that the continuous abstraction of mathematics is the very essence of computing are the most obnoxious fucking faggots in the world, and the ones who can't understand that it's actually harder to think abstractly are the funniest of them all. Sure, you can do the equivalent of hand-operating a modern abacus. You're not accomplishing anything by reinventing the wheel for the thousandth time; you're just doing it because someone needs some retarded shit to be fast, but you're not furthering the field of computer science. You're not developing new fucking algorithms in C, you're implementing something that someone far smarter than you has already thought of, in a slightly different flavor, just so you can get a paycheck.

woah dude

You know Python was written in C right?

yeah, by good C programmers. Not you.

You know C is written in C++, right?

Then we can agree the problem is not C but that most programmers suck, which is why more "programmers" are capable of using Python.

Bad programmers can use Python and be productive and not blow things up

Jow Forumstards who think they are the exception to the rule are much more likely to do serious damage than these people. That's why I find it funny when the C """elitists""" on this board talk down to people using other languages but forget null terminators on strings, like I've seen (kek)

Shut the fuck up, numale faggot. You think you are very modern and intelligent, don't you?

If things were as you say then no one would program in C anymore. The reality is that most other modern languages are slow as fuck and allow very little flexibility for the actual complex systems that require efficiency.

It isn't about reinventing the wheel; there are thousands of libraries available and code can be reused just like in any other language. But C lets you tune low-level aspects of a system in ways you couldn't in Python/Java/C# or whatever fucking high-level language, which has its place, but not in actual systems programming.

Hell, even ASM has its place to this day.

That retarded comparison with the binary system just made your post even more idiotic; you are just as bad as the Jow Forums autistic neet faggots who defend C to death when they actually haven't done anything of significance in their lives.
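
This is roughly what that post means by tuning low-level aspects. A made-up example, not anything from the thread: the struct and its fields are hypothetical, the point is that you decide the data layout and lifetime yourself, with no runtime in the way.

#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

/* A (made-up) packet header: field widths and order are chosen by the
   programmer, not by a runtime; any padding follows the platform ABI. */
struct packet_header {
    uint8_t  version;
    uint8_t  flags;
    uint16_t length;
    uint32_t sequence;
};

int main(void) {
    /* Stack allocation, no garbage collector, layout known at compile time. */
    struct packet_header h = { .version = 1, .flags = 0, .length = 64, .sequence = 7 };

    printf("sizeof header: %zu bytes\n", sizeof h);
    printf("offset of sequence: %zu\n", offsetof(struct packet_header, sequence));
    return 0;
}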

you forgot
>C++

assmad cnile faggot who can't learn C++ detected

The mooselimbs just copied Indian numerals, and algebra was practiced by the Greeks centuries before moohamad, even though the Arabic name stuck.

>Bad programmers can use Python and be productive and not blow things up
This is not a good thing. Bad programmers should not be allowed to produce anything; this is why things are so bad.

Just because there are bad C programmers doesn't justify the above.

>muh OOP
kys

>me too stupid for templates, hurrdurr
cnile seething

good programmers were all at one time bad programmers

>quoted twice
are you sure you know how to use a computer?
that's why you go to school and learn; if you are not ready to be a professional, don't be one
Also, if you don't test and revise your code you are a fucking retard. There is no excuse for a language to let you get away with bad practices.

lol another loser who thinks they know how to program

>thinking I even use strings in my code

Attached: 1549351819176.gif (480x292, 1.99M)

>there is no excuse for a language to let you get away with bad practices.
There is no way to formally define "good practices" before the language itself has been defined.

Attached: 1375247478042.jpg (580x464, 46K)

wtf are you goyim arguing about again?

you do know that C++ was written in C++, right?

I work in a central office for Ma Bell; rummaging through old abandoned offices and cabinets is fucking awesome. The Bell System was truly a force of innovation and brilliance, which mostly had to do with the quality of people at all levels at the time. Now we are surrounded by globalist scum who want to pay slave wages and outsource as much as possible.

They cheated and just reverse-engineered flying saucer tech. Truly Chink-tier.

C and UNIX are the worst scourge in the history of computing. Just imagine using a system made for 70's minicomputers... oh wait, that's what we are using right now

Unless they called Xerox SPARC, I aint tryin to hear that shit.

Attached: jet engine fan tester v2.jpg (220x220, 19K)

>Oh wow cat toys
Savage

Why am I retraining myself in Ada? Because since 1979 I
have been trying to write reliable code in C. (Definition:
reliable code never gives wrong answers without an explicit
apology.) Trying and failing. I have been frustrated to
the screaming point by trying to write code that could
survive (some) run-time errors in other people's code linked
with it. I'd look wistfully at BSD's three-argument signal
handlers, which at least offered the possibility of providing
hardware-specific recovery code in #ifdefs, but grit my
teeth and struggle on having to write code that would work
in System V as well.

There are times when I feel that clocks are running faster
but the calendar is running backwards. My first serious
programming was done in Burroughs B6700 Extended Algol. I
got used to the idea that if the hardware can't give you the
right answer, it complains, and your ON OVERFLOW statement
has a chance to do something else. That saved my bacon more
than once.

When I met C, it was obviously pathetic compared with the
_real_ languages I'd used, but heck, it ran on a 16-bit
machine, and it was better than 'as'. When the VAX came
out, I was very pleased: "the interrupt on integer overflow
bit is _just_ what I want". Then I was very disappointed:
"the wretched C system _has_ a signal for integer overflow
but makes sure it never happens even when it ought to".
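
For anyone who hasn't read what's being quoted: the complaint is that ISO C treats signed integer overflow as undefined behavior and never raises anything, unlike the Algol/Ada trap-on-overflow model. A rough sketch of checking it by hand, assuming GCC or Clang for the __builtin_add_overflow extension:

#include <stdio.h>
#include <limits.h>

int main(void) {
    int a = INT_MAX, b = 1, sum;

    /* Writing a + b directly here would be undefined behavior in ISO C:
       no signal, no trap; the "interrupt on integer overflow" bit the
       quote mentions is simply never used. */

    /* GCC/Clang builtin: returns nonzero if the mathematically correct
       result does not fit, the closest everyday analogue of ON OVERFLOW. */
    if (__builtin_add_overflow(a, b, &sum)) {
        fprintf(stderr, "overflow: %d + %d does not fit in an int\n", a, b);
        return 1;
    }
    printf("%d\n", sum);
    return 0;
}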

>WYSIWYG
That kind of ruined computers, in my opinion. It doesn't work as intended, and it mostly confines the user to factory-preset choices.
But Ethernet and laser printers are definitely necessary today.

Well... C is to ASM as Perl is to Java.

Attached: 1548880076535.png (419x398, 191K)

Who wrote this? I hope I am not using any software they ever touched.

Oh, sweet user... trust me, you are.

My reply to a discussion on another list concerning shell
output redirection:

>> What's going on is that you're overflowing the *shell's*
>> output buffer. When you do 'command > file' you're asking
>> the shell to capture all the output from the command and when
>> it's done, stuff that into the file. As you've noticed,
>> there's not much of a buffer there and when you've filled
>> it up it goes kablooie.

...producing no error message or useful diagnostic, of
course. Such wasteful civilities are obviously far beyond
the ``small is beautiful'' design philosophy of which Weenix
Unies are so fond. And you weren't actually expecting the
shell to be designed to buffer its output correctly without
segfaulting, were you? As a user, keeping system programs
from wetting their pants is, after all, YOUR responsibility.

Subject: why Unix sucks

Some Andrew weenie, writing of Unix buffer-length bugs, says:
> The big ones are grep(1) and sort(1). Their "silent
> truncation" have introduced the most heinous of subtle bugs
> in shell script database programs. Bugs that don't show up
> until the system has been working perfectly for a long time,
> and when they do show up, their only clue might be that some
> inverted index doesn't have as many matches as were expected.


Unix encourages, by egregious example, the most
irresponsible programming style imaginable. No error
checking. No error messages. No conscience. If a student
here turned in code like that, I'd flunk his ass.

Unix software comes as close to real software as Teenage
Mutant Ninja Turtles comes to the classic Three Musketeers:
a childish, vulgar, totally unsatisfying imitation.
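
To make the "no error checking" charge concrete, a toy contrast, not code from any actual Unix tool (the output file name is made up): actually look at what fputs() and fclose() return instead of assuming the output landed.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <errno.h>

int main(void) {
    FILE *f = fopen("out.txt", "w");
    if (!f) {
        fprintf(stderr, "fopen: %s\n", strerror(errno));
        return EXIT_FAILURE;
    }

    if (fputs("some output\n", f) == EOF) {
        fprintf(stderr, "write failed: %s\n", strerror(errno));
        fclose(f);
        return EXIT_FAILURE;
    }

    /* The buffered data isn't really out until fclose() succeeds; ignoring
       this is exactly the kind of silent truncation being complained about. */
    if (fclose(f) == EOF) {
        fprintf(stderr, "close failed: %s\n", strerror(errno));
        return EXIT_FAILURE;
    }
    return EXIT_SUCCESS;
}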

>now

Uh, sweetie.

Oh, it's that fat turd Stallman. That guy didn't program shit. He even stole the emacs source from a magazine.