Thank god all the morons are in web development

Thank god all the morons are in web development.

Attached: garbage.png (264x54, 4K)

Other urls found in this thread:

stackoverflow.com/questions/81656/where-do-i-find-the-current-c-or-c-standard-documents
open-std.org/jtc1/sc22/wg14/www/docs/n1548.pdf
libgen.io/search.php?req=iec 9899 2012

>put in garbage
>garbage comes out
who wud of thunk

It was all garbage to begin with.

Yes, we know loosely typed interpreted langs are shit. Everyone agrees

int i = *(int*) 123456789;

Segmentation fault (core dumped)

Thank god all the morons are in C.

>int i = *(int*) 123456789;
Yes, your illegal line of C shows how stupid C programmers are. You can do better than that.

>illegal
it compiles

You're right. Illegal is the wrong word. Braindead code that invokes undefined behavior. Fails like this:

int n = 123456789;
int i = *(int*)n;

$ gcc -Wall -Werror test.c
test.c: In function ‘main’:
test.c:7:11: error: cast to pointer from integer of different size [-Werror=int-to-pointer-cast]
int i = *(int*)n;

Is it just me or is node JS absolute fucking garbage that never should have been tried as server side code?

Works fine on my machine as long as it's on the same line. Kill yourself.

That's a good point.

>twitter, uber, ebay and netflix engineers vs anonymous Jow Forums specialist
who would win????

Attached: ss-2018-04-27-05-51-54.png (776x157, 47K)

Works fine, it's just that you enabled -Werror.

you're bypassing the typechecker in order to force undefined behavior. you're being retarded on purpose and blaming a language for it.

if you have ever seen driver code it looks a lot like what you posted, anyway. so the code itself isn't retarded out of context, there could be a perfectly valid reason for writing that.

you failed twice in one post, great work user.

> Companies that have websites use Javascript.
Doesn't make web dev any less stupid. At least Microsoft is trying to inject some sanity into the ecosystem with TypeScript.

Yes what a very strange and non-standard thing to do.

I'm not "bypassing" anything. Casting is a standard operation defined within the specifications of the language.

why exactly is truncating pointers not covered by -Wall? it seems like a serious mistake that gcc should always try to catch, especially because it could easily create a heisenbug that's hard to track down.

and it's caught as long as you compile with error flags. you're compiling without warnings to bypass the typechecker.

tim@timspc:/home/tim$ gcc -Wall test.c
test.c: In function ‘main’:
test.c:7:11: warning: cast to pointer from integer of different size [-Wint-to-pointer-cast]
int i = *(int*)n;
^

Fuck are you on about?

int main() {
    int i = *(int*) 12345678;
    return i;
}


Compiles fine with -Wall and -Werror.

It's because it's an int literal. And it's allowed because it's useful in settings where exact memory addresses can actually matter, e.g. microcontrollers.

because memory mapped IO exists
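something like this shows up all over embedded code; the address below is made up, a real one comes out of the chip's datasheet:

#include <stdint.h>

/* Hypothetical memory-mapped GPIO output register at a fixed address.
   The integer-literal-to-pointer cast is exactly the construct being argued about. */
#define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

void led_on(void) {
    GPIO_OUT |= 1u << 5;   /* set pin 5 by writing the register directly */
}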

Here's another one for you.
#include <stdio.h>

int main() {
    char str[] = "abc";
    str[3] = 'd';
    puts(str);
    return 0;
}

What are you even trying to say? If you explicitly try to access invalid memory, you get an invalid memory error?

What are you trying to say? If you try to do retarded things in Javascript you get retarded output?

out of bounds?

I will concede that null-terminated strings are a broken concept and a huge wart on C, along with the mass of cryptically named 'str-' functions in libc.

Yes, because instead of erroring, JavaScript does something stupid that will cause more serious problems as the program continues running.

Imagine if you have a butler and you tell the butler to divide by zero. The C butler will tell you "I can't do that". The JS butler will say "Okay" and go shit in your bed and do a floral arrangement in the pile of shit.

>ITT: hipsters who want the web to be just pure GET and POST

Guess what this does. No cheating.
#include <stdio.h>

int main() {
    double i = 1 / 0.0;
    printf("%d", (int)i);
    return 0;
}

Literally impossible to know. Floating point division by zero is undefined behavior. Ideally you would get a compiler warning.

>Floating point division by zero is undefined behavior.
Ctards everyone!

Oh please do elaborate.

Floating point division by zero is NOT undefined behavior. Ctards literally do not even know the basics of programming and think they are qualified to criticize other languages.

> The result of the / operator is the quotient from the division of the first operand by the second ...
> ... if the value of the second operand is zero, the behavior is undefined.
From the C11 standard.

>9gag

The C standard is not some random document you can find online for free. You have to pay money to get it. You're being exposed pretty hard.
stackoverflow.com/questions/81656/where-do-i-find-the-current-c-or-c-standard-documents

Oh fuck off. I'm not writing a compiler, the draft standard is fucking sufficient:
> open-std.org/jtc1/sc22/wg14/www/docs/n1548.pdf
Clause 5 in section 6.5.5, page 92.

Bite me.

a better string is dead simple to create though
struct bstr {
    int size;
    char data[];    /* flexible array member */
};
then
struct bstr *s = malloc(sizeof(struct bstr) + sizeof(char) * string_length);
s->size = string_length;
or whatever. just realloc whenever you need to change the string. you can use enums and create polymorphism with struct pointers and casts this way, too. it's useful if you're writing for robots or anything embedded like that.
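the enum/cast polymorphism bit looks roughly like this (made-up names, untested sketch):

#include <stdio.h>

enum kind { KIND_INT, KIND_STR };

struct node  { enum kind kind; };                 /* common header every variant starts with */
struct inode { enum kind kind; int value; };
struct snode { enum kind kind; const char *s; };

static void print_node(struct node *n) {
    switch (n->kind) {
    case KIND_INT: printf("%d\n", ((struct inode *)n)->value); break;
    case KIND_STR: printf("%s\n", ((struct snode *)n)->s);     break;
    }
}

any struct that starts with the same tag field can be passed around as a struct node* and dispatched on the tag; it's the same trick CPython uses for PyObject.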

Or you can use C++ like anyone not stuck in the 80s.

Does the 'pedantic' flag catch it at least? Not able to check myself because I shut down the server running my VPN before leaving the house and forgot to turn it back on, so I'm not able to SSH into any of my machines right now to test it.

Yes, but many components of the C standard library require null-terminated strings. If you want your library/utility/whatever to interoperate with any code that you don't own, then your hand-rolled string wrapper better make sure the data is null terminated at all times. This is why even C++'s std::string is null terminated.
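So in practice a sane wrapper carries a length AND keeps the terminator. Something like this (made-up names, rough sketch):

#include <stdlib.h>
#include <string.h>

struct lstr {
    size_t len;
    char  *data;    /* always len bytes of content plus a trailing '\0' */
};

static int lstr_init(struct lstr *s, const char *src, size_t len) {
    s->data = malloc(len + 1);          /* +1 for the terminator libc expects */
    if (!s->data) return -1;
    memcpy(s->data, src, len);
    s->data[len] = '\0';
    s->len = len;
    return 0;
}

Length checks stay O(1) and s->data can still be handed straight to printf/strcmp/whatever.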

It's a pain in the ass and it's bug-prone.

Except now your string is limited to int in size.

Could easily be a size_t

Hello webdev who thinks he's hot shit because he just read about IEEE 754. The C standard does not specify compliance with IEEE 754.
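For what it's worth, on implementations that do adopt IEEE 754 (Annex F), the division itself is well defined and just produces infinity. Rough demo, assuming such an implementation:

#include <math.h>
#include <stdio.h>

int main(void) {
    double d = 1 / 0.0;                 /* +inf under Annex F, no trap */
    printf("%f %d\n", d, isinf(d));     /* prints "inf 1" on such systems */
    /* the (int) cast in the earlier snippet is the part you can't rely on either way:
       6.3.1.4 makes converting a value outside int's range undefined */
    return 0;
}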

C++ compilers aren't always available for embedded systems and even when they are they tend to produce garbage code. when you're working with very limited memory the overhead of C++ can become unworkable as well.

it seems to catch it when it's not an integer literal but it doesn't mind when it is. i would expect to have to write
int i = *(int*)7928375L
but it doesn't seem to require the 'L' suffix. i don't have the standard lying around but i'm willing to bet this is a compiler-specific thing.

Clang has the same behavior as GCC, don't know about MSVC.

So now all of your strings take an extra 7 bytes of memory. Good job.

Speak for yourself.

>Typescript
>Sanity
>>>All generics including arrays are bivariant
HAHAHAHAHA

We can talk about different solutions to the string problem all day, but null-terminated is almost always the worst one.

One option is to do something like an LEB128 encoding of the length, followed by the data. This minimizes the memory usage of storing the length for all but absurdly long strings but makes accessing the size slower than just using a plain size_t. It's definitely faster than a strlen(), and has no additional memory usage for small strings if we're throwing out the null terminator.
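A rough sketch of that length prefix (hypothetical helper names, untested):

#include <stddef.h>
#include <stdint.h>

/* Unsigned LEB128: 7 bits of the length per byte, high bit means "more bytes follow".
   Lengths under 128 cost a single byte, same as the null terminator they replace. */
static size_t leb128_encode(size_t len, uint8_t *out) {
    size_t n = 0;
    do {
        uint8_t byte = len & 0x7F;
        len >>= 7;
        if (len) byte |= 0x80;          /* continuation bit */
        out[n++] = byte;
    } while (len);
    return n;                           /* bytes written */
}

static size_t leb128_decode(const uint8_t *in, size_t *len) {
    size_t n = 0, shift = 0, value = 0;
    uint8_t byte;
    do {
        byte = in[n++];
        value |= (size_t)(byte & 0x7F) << shift;
        shift += 7;
    } while (byte & 0x80);
    *len = value;
    return n;                           /* bytes consumed */
}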

Or you could just use a fucking size_t.

>TypeScript

gag

Attached: 1521729635795.png (215x207, 79K)

just use a uint16_t and if your string is longer than 65,535 characters you're probably doing something very stupid.

You seem to be forgetting that at the time, a computer had ~4 KB of memory and was slower than your modern e-toaster's CPU.

God doesn't prevent you from punching a slab full of live nails, yet you're not expected to do it.

I'm just talking about string implementations in general now. Not saying I have a good solution for the time period, but null terminators have definitely proven to be more trouble than they're worth.

A TypeError should come up, going "YOU'RE GIVING ME GARBAGE". It shouldn't try to 'just figure it out'.

>the language needs to babysit me and second-guess what I meant

> I'm studying computer science and think it's rare that I write bugs.

>guy who invented node moved to golang
really makes you tink

>the idea of adding and taking away ints and lists and dictionaries and strings should be something the language both knows how to handle and handles without warning

Lol no generics.

This bait is weak, dude. Every web developer already hates themselves and knows that what they do is fucking stupid, hence why so many are college dropouts wishing to be the next ZUCC.
It doesn't change the fact that the ones who aren't stupid are doing better than everybody else in our field. The internet will take over everything and every single kind of development will be web-focused, from machine learning to software development.

> every single kind of development will be web focused
Slow down there buster. Non-web environments aren't going anywhere.

>every single kind of development will be web focused
>people will be programming microcontrollers in node.js
>air traffic control will use flask to orchestrate air traffic
>operating systems will somehow run on the web running on the OS like some kind of psychotic toy lisp that bootstraps itself
>everything will be magic and CPUs will no longer run on bits and IRQ will be replaced with reactive asynchronous file I/O
>medical equipment will lag out in the middle of operations but it's all the telecom companies' fault because they didn't move my retarded amount of data fast enough
>the internet is going to drive my self-crashing death-trap
yes, i'm sure this is all going to happen

every time I use JS I want to die.
Even just as a scripting language it makes at best a little bit of sense.
Using it in something like After Effects, I can handle (although Adobe makes a lot of mistakes with its extended functionality).

haha, but building an entire backend out of that shit? just fucking shoot me.
There's a very good reason why you don't make languages this simple: it just makes everything else more complicated, requiring even more overhead to end up with something that works at a decent level of efficiency.

There are a number of great languages out there for different purposes.
And the things people use JavaScript for aren't its proper use case.

I don't think so. I've never felt, while using JS, that the language was too simple to work with comfortably.

What exactly are you having issues with?

There are some good reasons to criticize Javascript. The type casting is not one of them. It's just syntactic fluff, completely irrelevant to the core of the language. If you're trying to add the number 1 and an array together in the first place, your code probably has bigger problems than the fact that the interpreter lets you try to do it. I've written JS for years and have literally never encountered a bug caused by any of this stuff.

Cont. That said, if you're writing something that might kill people if it malfunctions, I don't think you should use Javascript. Just because I've never seen this kind of bug doesn't mean it doesn't happen.

>The C standard is not some random document you can find online for free. You have to pay money to get it. You're being exposed pretty hard.
But user, others have already paid for it, so you don't have to: libgen.io/search.php?req=iec 9899 2012
Pic related, page 92 (page 110 of the PDF).

Attached: file.png (991x108, 22K)

You need a reality check. You are not as smart as you think.