/dpt/ - Daily Programming Thread

Old thread: What are you working on, Jow Forums?

Attached: 1559521785493.jpg (839x821, 229K)

cute akarin

daisuki~

.succ() for Ruby

Why are there no functional languages in the top 10 most used languages?

>C
>Java
>JavaScript
>Python
>C#
>PHP
>C++
>Swift
>Ruby
>TypeScript

Where is Lisp, Scheme, Haskell, Clojure etc.?

>Where is Lisp, Scheme, Haskell, Clojure etc.?
in the garbage

>which is almost completely worthless, I guarantee you that you are doing worse than the opengl driver.
I suppose, but I do understand why people with higher performance requirements would care. I've only really done 2D with OpenGL, but I have done a lot of low-level Linux graphics stuff (for a Wayland compositor), which does also provide its own level of control.

>state is needed for real programs
Basically all OpenGL state is global, which makes it a huge pain in the ass to manage. It's even worse for library code (which I primarily write), because I have to keep binding and unbinding shit to make sure we aren't leaking or accidentally relying on OpenGL state.
I just want to pass all of the state I need into the functions directly.

Attached: 1559771073981.jpg (800x1000, 187K)

>send-notify
Is there something that looks a bit better than this?

>I just want to pass all of the state I need into the functions directly.
You can write wrapper code that does that yourself

Functional isn’t brainlet friendly

I'm using GLES2, and I AM the one writing the wrapper. That doesn't make it any less stateful.

You mean it's not pragmatic in the world and only works if you're a turbosperg who works alone.

please do not lewd akarin

reminder that telling someone to use Java for IO is a rude way of saying kill you're self.

You write a wrapper around the global state to turn it into local state dumbass

opengl state is global because it's stored in vram

I'm currently writing a command line app in Java and it's painful.

No it's not. The OpenGL state machine does not exist on the GPU.

What does the compiler want from me?
What the fuck?

Attached: wtf.jpg (633x849, 155K)

A state machine of some sort probably exists on the GPU, and it's less abstract to use global variables, I assume.

>What does the compiler want from me?
An =.

What are you using META_NAME for on line 360? It's not even part of an expression.

it's probably another typo in the file doing it

Oh kek. I am a retard. I just looked at the define

>Order of evaluation for most of the binary operators is left undefined to give the compiler opportunities for optimization.
>This strategy presents a trade-off between efficient code generation and potential pitfalls in the use of the language by the programmer.
>Do you consider that an acceptable trade-off? Why or why not?

>>Order of evaluation for most of the binary operators is left undefined to give the compiler opportunities for optimization.
take C behind the shed and shoot it please

Obviously how it works is highly device-specific, but OpenGL is purely a software thing, and just provides a standard interface. The driver will take whatever the OpenGL state is and create some sort of command buffer for the GPU.
Vulkan does the same thing, but just doesn't have this big unwieldy state machine sitting in the way, and allows the driver to be a lot "thinner" and closer to how it works under the hood.

Attached: 1557279904231.jpg (900x882, 217K)

Who are you quoting?

Stanley B. Lippman

Thanks, I'll check it out. The name 'methods' was just an example, I'd actually use this for game programming. With a fuckload of members in a class I'd like to group them in namespaces by their purpose, for example members relating to graphics in one table and audio in another. Maybe I won't do it like that but if the source code is going to be thousands of lines, I'll have to find some way to keep things organized.

How do you make programming fun

Attached: Sp46GSE.jpg (445x488, 47K)

you don't

Well, the end result isn't really thinner; you're just offloading that complexity from the graphics API onto yourself.
I'm assuming there's a blend mode flag on the GPU, so having a copy of that flag in RAM is a useful thing to have; everything is global state when it comes down to it.

It's been a while since I used Lua, but I made an OOP system in it too. I'm trying to remember how it works; it has something to do with the objects' metamethods and how the `:` method call syntax works.

sepples. 1% boring work 99% misc. implementation and language problems.

Sorry for the reddit reaction image, but I couldn't find an anime girl to accurately portray my face when I read

news.ycombinator.com/item?id=20107451

>What no one really wants to admit here on YC is that the userbase here is something like the top .01% of (intelligence / literacy / analytical skills).

Attached: file.png (365x272, 120K)

>I'm assuming there's a blend mode flag on the GPU
I doubt that exists. That would just be a part of the command buffer.
When you call `glDrawArrays`, that's not sending a command to the GPU; it's just filling some command buffer, which eventually gets flushed later. You can't have global flags for shit like that.

when did rage comic become a reddit thing?

Am I a brainlet for struggling with my C data structure class, or is a large part of the reason why I'm struggling because of how nasty C can be?

/dpt/ is the top 0.01% of intelligence, people here just act dumb to shoo the reddit ''''intellectuals'''' away

The command buffer is how the CPU asynchronously communicates with the GPU.
If you call SetBlend, it adds a command to the buffer, and when the GPU reads it, it sets the blend mode flag in VRAM (unless GPUs are now so general they no longer have any internal concept of a blend mode, but I doubt it).

to be fair you have to be extremely intelligent to edit programming books into images of anime girls

2008

>C
>class

Like 10 years ago. All of the "Jow Forums memes" eventually become "reddit memes" which those stupid fucks won't stop posting over and over again, never able to come up with anything original.
Frogposters and Wojak posters are the prime example of it at the moment.

>to shoo the reddit ''''intellectuals'''' away
I post cute anime girls for that very reason.

C is an excellent language for learning data structures. The entire point is that you need to implement it yourself to learn, and C makes you do just that.

Attached: 1551584821194.png (960x1340, 2.72M)

My real point is that calling things like glEnable(GL_BLEND); isn't going to have any immediate change to the GPU.

Attached: 1541532591349.png (821x3109, 559K)

I know
and my point is it's useful to have a RAM copy of that state so you can avoid redundant state changes, because the GPU still has global state

Js is functional

frogposting was always spiritually reddit

>we're taking away immediate mode, the scene graph and built in math functions
>you have to use 2.1 to use shaders.
>don't worry, someone will make a high level library to replace that functionality
>no one ever does
>good news! we're taking away all abstractions, you get full control, but you have to be a graphics programming alumni and write 700 lines for one triangle.
>you have to use vulkan and dx12 for raytracing
>don't worry, someone will write a high level library to replace that functionality
>trust us

Attached: appleadpctrustme2[1].png (370x209, 46K)

Yes, pretty much. It was a stupid image that wasn't even funny or endearing in any way to begin with.
It should have fizzled after only a matter of days, but here we fucking are, years later.

Attached: 1558655869718.jpg (1654x1654, 181K)

the clown is honorary though

frogposting is spiritually Jow Forums, Jow Forums just has a disgusting and pathetic spirit

frogposting keeps evolving, that's why it stays relevant, unlike reddit autistically sticking to a "formula" until the end of time.

Attached: 46a[1].jpg (3000x1600, 767K)

Kill yourself.

Attached: 1559032613005.jpg (815x611, 86K)

You're probably just rarted.

Mate, they are kids just having a laugh. If you're posting that to make fun of them, you're a fucking loser.

I've been playing around with this concept in my head of a programming language, where you define virtual machines and their instructions, like you might define/use functions in a functional language. A program in this language would consist of defining a series of VMs, defining instructions for these VMs, and then sending data through these VMs to be processed. I'm not 100% sure what this all would look like, but the idea feels interesting. Are there any languages like this?

He's making fun of you, you fucking loser

I wasn't the person he was replying to. I am simply pointing out that even having that picture saved on your computer makes you a loser.

C already does this.

You just need to get good. It's like you don't want to write efficient software.
Even then, for gaymes at least, it doesn't seem like most of them even use graphics APIs directly. Most of them go through engines, which do all of the heavy lifting.

Attached: 1543000339886.jpg (996x1000, 866K)

no

he could have posted an image of any of the le epic Jow Forums memes facebook pages

What about Ruby? I don't use it, but Scala's Martin Odersky includes it when he lists functional languages.

peddling an engine as an alternative to an API is the most jewish shit I've seen

I'm not saying you should use an engine, I'm just saying that people do.

There's a big difference between Ruby and Python's functionalism, compared to Lisp and Haskell's functionalism.

Can someone help me translate this into C:

#include <iostream>
#include <cstring>
#include <cstdlib>
using namespace std;

char *strtok_new(char *string, char const *delimiter){
    static char *source = NULL;
    char *p, *riturn = 0;
    if(string != NULL) source = string;
    if(source == NULL) return NULL;

    if((p = strpbrk(source, delimiter)) != NULL) {
        *p = 0;
        riturn = source;
        source = ++p;
    }
    return riturn;
}

int main(){
    char string[] = "one,,three,";
    char delimiter[] = ",";
    char *p = strtok_new(string, delimiter);

    while(p){
        if(*p) cout << p << endl;
        else cout << "No data." << endl;
        p = strtok_new(NULL, delimiter);
    }
    system("pause");
}

but that's literally C only using iostream

Sending my final project to Uni today and it's barely 60% of the code, kek. Hopefully it will be enough to graduate, then I can actually start to study.

Attached: 1560175018409.jpg (640x1006, 146K)

I hope you fail and fall into a suicidal depression, you frogposting scum.

Professors don't read code. My code for both my final year project and masters dissertation didn't even compile.

>tfw code for master's is like 100 lines in python most of which I just copied somewhere else

mine compiled but i'm still NEET
tfw fell for the STEM meme

They often don't even read the entire written thesis. Friends put a little weak glue between a few of their pages (enough that it would stick, but so little that any slight pull would separate them) and checked whether the pages had been pulled apart at some point after they got it back.
Got a straight A, all pages were still together.

Sorry if you got offended fren, I was just trying to convey how few fucks I give about this project at this point.
I have to do a 20 minute presentation on the project itself, but it's true, they don't give a fuck either. Besides, 90% of what I learned wasn't from uni. I just want my degree and to never step in that place ever again, I'm so fucking tired.

Attached: 1543683659298.jpg (665x1024, 52K)

>What are you working on, Jow Forums?
documentation for my true random number generator.

Attached: doc.png (983x731, 82K)

Another similar test is to misspell words farcically, like writing "the following cuck shows that" instead of "the following code shows that", and nobody ever points it out.

it just is

I mean, as said, isn't this the only difference
while(p){
if(*p) printf("%s\n", p);
else printf("No data.\n");
p = strtok_new(NULL, delimiter);
} ?
And god damn stop using system("pause");
just do getchar();

Mine wanted a demonstration, because if the code doesn't run, that has a negative impact on the marks for the thesis.

If it isn't, you either
1. don't really like programming
2. need to work on a project that's less shit

Also keeping in mind that sometimes it's just not going to be fun, but the payoff at the end is worth it. Still if you're just dreading it every day, see above

wish we had 20 minutes, we just get 15 minutes 'officially' going by the procedure guide, but then they tell you to cut it short and be done in 10 minutes once it starts

It's very easy to fake a demo.

incorporating lambda functions doesn't make a language functional, moron

I stopped liking programming once I got a job
maybe I should just become a coach driver, fulltime

why can't I be 15 again and have all those years back I could have spent mastering programming

Attached: 1525652647494.gif (500x500, 205K)

>alright user can you give us a demonstration?
>yea sure prof, let me just fire up the network learning and we'll get some results in about 2 weeks of running, the laptop is kinda slow

Attached: 4d4.png (640x640, 36K)

Maybe you just don't like your job

because you would have done exactly the same thing if you could go back.

Imagine having to deal with memory and bits in a programming language. Ugh.

yes, I hate my job
it also made me think that there's no possible way a programming job can be enjoyable, so I'm sticking to that until I get to see otherwise

Be honest, is learning machine learning and googling every math term that comes up enough, or should I rape myself by learning all the math first as recommended in quora.com/How-do-I-learn-mathematics-for-machine-learning ?

Mine wasn't. I got aerial images and had to generate a 3D model of a city out of them that you could navigate through as if it were a 3D game.

No fucking way if I could go back with what I know now

my best friend was making gmod mods when he was 14

why couldn't I have done the same

and make sure to use
#include <stdio.h>
#include <string.h>
instead of iostream

I feel the opposite. Wish I could go back and have spent my years doing something more fun

Depends on how much you already know, or if you just need to brush up on things as they come along.

Just start learning and see how far you can get with the math you have. Better to just start doing than to sit around and think about it, or burn out studying math and lose interest entirely