/dpt/ - Daily Programming Thread

Old thread: What are you working on, Jow Forums?

Attached: 1558097984137.jpg (1000x1399, 681K)

Other urls found in this thread:

qz.com/1624252/pythons-creator-thinks-it-has-a-diversity-problem/
khronos.org/registry/EGL/extensions/KHR/EGL_KHR_swap_buffers_with_damage.txt
khronos.org/registry/EGL/extensions/EXT/EGL_EXT_buffer_age.txt
khronos.org/registry/EGL/extensions/KHR/EGL_KHR_partial_update.txt
youtube.com/watch?v=ZR3Jirqk6W8
insights.stackoverflow.com/survey/2019#developer-profile-_-undergraduate-major

First for huge anime titties

What's your excuse for not using C for all of your programs?
What are you, stupid?

C++ is the most powerful programming language

I prefer performance

*retarded
FTFY

It's so powerful no one really understands how it works

I do

C consistently is the fastest in benchmarks, you dumb fuck.

First for Python.

The "is" goes before "consistently", ESL-kun.

>11 minutes later
Yup, sounds like python.

You think you do, but you don't.

Python slow as always

Imagine being so retarded you can't even understand C++ and assume no one else does either.

Absolutely based. You converted me

did anyone from here get a good programming job without any formal education? i graduated with a bachelor's in biotech, realized i don't like working in the wet lab, and have been self-studying python.

Attached: 1454798717471.jpg (294x415, 37K)

>programming job
>good

More like poothon

pros/cons of this paradigm?

Cringe
Explain?

if __race__ == "White":
    refuse_mentoring()

Pros: Easy.
Cons: Inefficient as fuck.

Normally things like that don't piss me off, but after reading the article, it really did >:(

It's simple, just like how a video game renders things.

while running:
    user_input = poll()       # gather this frame's input/events
    update(data, user_input)  # advance the game state
    render(data)              # redraw the whole frame
And that just keeps running at 30/60/120/144 Hz or whatever your monitor refresh rate is.

What article?

easy and elegant as fug, and retains its elegance regardless of language
but it's inefficient

Video games have an "excuse" here, a usually the entire contents of their frame changes each time, making most optimisations you do here (e.g. damage tracking) pointless.
However, if you're just some shitty Gooey, you're seriously wasting a lot of resources (and therefore battery) by constantly redrawing shit.

>a usually
as usually*

>implying that you are required to keep redrawing shit all the time

You can just not redraw things that don't have to be redrawn.

Sure, you don't have to draw every vsync or whatever, but there's still a lot of wastage.
Do you redraw the entire window just because you have a little animated loading spinner?
Also, you can be dealing with some seriously big framebuffers with high-DPI these days. That's a lot of content to draw, which is wasting a lot of memory bandwidth and power.

And how are you tracking that with immediate mode? By actually tracking which parts of the frame have changed, you've effectively implemented a retained mode renderer anyway.
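
For reference, the bookkeeping being argued about is only a few lines. A minimal sketch in C, where the rect type and the rect_union()/redraw() helpers are made up for illustration:

struct rect { int x, y, w, h; };

struct rect rect_union(struct rect a, struct rect b); /* hypothetical helper */
void redraw(struct rect r);                           /* hypothetical: repaint one region */

static struct rect damage;   /* union of everything invalidated so far */
static int has_damage;

/* Widgets call this instead of triggering an immediate repaint. */
void invalidate(struct rect r)
{
    damage = has_damage ? rect_union(damage, r) : r;
    has_damage = 1;
}

/* Called once per frame (or per event batch). */
void maybe_redraw(void)
{
    if (!has_damage)
        return;              /* nothing changed: draw nothing at all */
    redraw(damage);          /* repaint only the damaged region */
    has_damage = 0;
}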

I can’t decide whether to do my final project in Sepples, Rust or Haskell. Leaning toward the latter to avoid success.

Attached: 8954DDD4-C9FF-4E64-B975-C9B0AA32CCB4.png (666x1200, 3.06M)

qz.com/1624252/pythons-creator-thinks-it-has-a-diversity-problem/

This rat bastard deserves to get his ass whooped

What gui uses dirty rectangles these days?
They are all hardware accelerated and double or triple buffered, so they clear & redraw everything every frame just like a video game.

Every real GUI framework still does this. I'm actually an author of a Wayland compositor, so I see the damage that clients actually send in.
Dumb fucks like you are why battery life is still so incredibly shitty.

The white man fears the rustacean

Attached: B2F2BF0F-A455-4AA8-A45E-ED3E8A407E8E.jpg (1188x716, 133K)

I hate python! I hate js!

any hardware worth caring about doesn't have batteries, so it's fine

In other news: people still use Stackoverflow's "number of questions asked" as an indicator of which programming languages are popular.
Fucking morons.

What language should I do for programming problems? It feels like python is too high level and always has some two-line solution using built-in functions or standard modules, but it doesn't feel like I'm really understanding how to work with data structures and algs with it.

I barely know sepples but is it worth using it for programming probs just so I can really understand the concepts on a lower level?

>Every real GUI framework still does this
No, it would be flicker hell.
And in the end probably less efficient.
Sending a few quads to the gpu is cheap compared to dirty tracking on the cpu.

C++ has a lot of abstractions, learn C

he's just saying he's personally going to focus on mentoring minority groups in the coding community. Jow Forums so easily triggered

How about not using the built in functions then?

Ya but sepples is a lot of work to write correctly

I think that's overkill. I'm mainly coming at this from a preparing-for-code-interviews perspective. I just need to understand the structures and algorithms, not have python magic it all away.

But then I feel like I'm just not using my language to its full capacity...

I've decided I'm just going to do each problem in a lower level lang first and then in every language I want to learn. Should be relatively trivial to do the same solution in a different syntax. That way I can learn both the python magic and the hard way.

>No, it would be flicker hell.
What the fuck are you talking about? Everything is double buffered these days.
>And in the end probably less efficient.
>Sending a few quads to the gpu is cheap compared to dirty tracking on the cpu.
It's certainly way more efficient. Implementing damage tracking and frame reuse was by far the largest optimisation we ever made to our Wayland compositor. CPU usage dropped from like 6% to < 0.5% at idle (most of what remained was inside the GPU driver). I never measured the GPU usage separately, but it'd certainly be a lot lower too.
By not using the GPU, you're allowing it to stay asleep longer, saving more power. Not to mention the power cost of the memory bandwidth spent copying a bunch of shit around.

Use D then. At least you won't find tons of morons telling you to use Boost to solve the problems for you.

>damage tracking and frame reuse
Got any links to this topic?

I'd recommend a high level language to focus on the algorithms and not on minutiae.
Use Scheme or Haskell for an FP approach, or something like C# for a more imperative/hybrid one.

The rendering loop can still be event driven, so in a typical gui program it would sit idle most of the time.
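
Roughly what that looks like in C with poll(2), as a sketch; display_fd and the handle_events()/needs_redraw()/render() helpers are placeholders for whatever toolkit or display protocol you're actually on:

#include <poll.h>

void handle_events(void);   /* hypothetical: read + dispatch pending events */
int  needs_redraw(void);    /* hypothetical: did any of that dirty the window? */
void render(void);          /* hypothetical: repaint (ideally just the damage) */

void run(int display_fd)
{
    struct pollfd pfd = { .fd = display_fd, .events = POLLIN };
    for (;;) {
        /* Block until the display server sends us something.
         * No input, no timers, no damage => no wakeups and no redraws. */
        poll(&pfd, 1, -1);
        handle_events();
        if (needs_redraw())
            render();
    }
}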

based and AC power pilled

Yeah I was thinking C# is probably a good midpoint

Can you fuck off and take your reddit-speak somewhere else

Reminds me of kitty, a GPU-accelerated terminal. It causes my GPU power usage to go up by 20-25 W as soon as it starts, and keeps causing little spikes even when it's doing nothing but blinking the cursor.

based and Jow Forums pilled

I don't know of any articles/blogs written about it (they probably do exist), but if you were to use OpenGL/EGL, these are the extensions you might use to implement it:
khronos.org/registry/EGL/extensions/KHR/EGL_KHR_swap_buffers_with_damage.txt
khronos.org/registry/EGL/extensions/EXT/EGL_EXT_buffer_age.txt
khronos.org/registry/EGL/extensions/KHR/EGL_KHR_partial_update.txt

Some of the descriptions in there are pretty useful.
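
Roughly how they fit together in a frame loop, as a sketch: it assumes the extensions were advertised, the KHR entry point was loaded with eglGetProcAddress(), and draw_region() is a stand-in for your actual renderer.

#include <EGL/egl.h>
#include <EGL/eglext.h>

void draw_region(EGLint x, EGLint y, EGLint w, EGLint h); /* hypothetical renderer hook */

void present(EGLDisplay dpy, EGLSurface surf,
             PFNEGLSWAPBUFFERSWITHDAMAGEKHRPROC swap_with_damage,
             EGLint damage[4]) /* x, y, w, h of what changed this frame */
{
    EGLint age = 0;
    /* How stale is the buffer we're about to render into?
     * 0 means its contents are undefined, so repaint everything. */
    eglQuerySurface(dpy, surf, EGL_BUFFER_AGE_EXT, &age);

    if (age == 0) {
        EGLint w, h;
        eglQuerySurface(dpy, surf, EGL_WIDTH, &w);
        eglQuerySurface(dpy, surf, EGL_HEIGHT, &h);
        draw_region(0, 0, w, h);
        eglSwapBuffers(dpy, surf);
        return;
    }

    /* Buffer age >= 1: only the damaged part needs repainting.
     * (A real implementation also has to add the damage of the previous
     * age - 1 frames, which this particular buffer never received.) */
    draw_region(damage[0], damage[1], damage[2], damage[3]);
    swap_with_damage(dpy, surf, damage, 1);
}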

relaxing with based Bryan videos
youtube.com/watch?v=ZR3Jirqk6W8

bryan blessed

Except he didn't say that. He dismissively said that whites should not even try.

As much as I don't really like Gnome stuff, their terminal is a glorious example of doing it correctly.
They handle damage correctly for each individual character.

I've been writing a terminal emulator myself with freetype and opengl. My code is flexible enough for me to address each character individually. How can I update just one character, though? For example, if I hit backspace, the cursor and the preceding character both change but nothing else is invalidated. I also support control characters for doing ncurses-type things, and that also changes only a small part of the screen.

I want to keep using opengl but I also want it to be power efficient

Also, I've tried reading the VTE source code, but GNU software is such a mess that I couldn't do it; it was pure torture. I had to read st instead. The only GNU project I can figure out is GNU make; whoever maintains that is a good person.

The answer depends on what you're using to create/manage your OpenGL context.

In practice, minutiae end up being a huge factor. It can be harder to analyze algorithms correctly in high-level languages because you may be unable to tell what is really happening. Are you modifying an object or copying it? Are these operations creating temporary garbage? When can you move or elide copies? Are you overlooking how the garbage collection works and underestimating the space complexity?
These are totally practical questions to ask, because ignoring them can easily turn a nominally linear operation into something quadratic or worse.
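
The same trap exists even in C when a primitive hides linear work. A made-up illustration (assumes dst is big enough for the joined result):

#include <string.h>

/* Looks linear, but strcat() rescans dst from the start on every call,
 * so this loop is O(n^2) in the total output length. */
void join_slow(char *dst, const char **pieces, size_t n)
{
    dst[0] = '\0';
    for (size_t i = 0; i < n; i++)
        strcat(dst, pieces[i]);
}

/* Remembering where the string currently ends makes it O(n). */
void join_fast(char *dst, const char **pieces, size_t n)
{
    char *end = dst;
    for (size_t i = 0; i < n; i++) {
        size_t len = strlen(pieces[i]);
        memcpy(end, pieces[i], len);
        end += len;
    }
    *end = '\0';
}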

I made the mistake of never properly learning about the build process while learning programming, so while programming concepts themselves are not a problem, I've no idea how shit gets compiled, linked and whatever else.
What do

Attached: 1556549765720.png (600x450, 222K)

that depends majorly on the programming language you're using

oh right
C++, guess C as well along with it

go fuck with llvm as a library for a bit

Read up on compilation units, why header files exist, the difference between the compiler and linker, and static and dynamic libraries. That should be good enough.

C source -> Preprocessor -> Compiler -> Assembler -> Linker -> Executable

I'm using libretro, actually. It sets up the GL ES context for me. I just use it.

C and C++ have roughly the same compilation model.

Each .c/.cpp file is known as a translation unit or a compilation unit. The compiler takes each one of these and turns it into an .o file, which is object code. Then all the object files are linked into a single executable or dynamic library.
Makefiles are basically just scripts that invoke each tool in the build process, in the right order, whenever its dependencies have changed.
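
A minimal illustration of that model; the file names are made up, and the commands in the comment are roughly what a Makefile would end up running:

/* util.h - declaration shared between translation units */
int add(int a, int b);

/* util.c - one translation unit, compiled to util.o */
#include "util.h"
int add(int a, int b) { return a + b; }

/* main.c - another translation unit, compiled to main.o */
#include <stdio.h>
#include "util.h"
int main(void) { printf("%d\n", add(2, 3)); return 0; }

/*
 * cc -c util.c -o util.o     # preprocess + compile one translation unit
 * cc -c main.c -o main.o     # ...and the other
 * cc main.o util.o -o demo   # link the object files into an executable
 */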

You forgot the part where Dimitri changed the stdlibs and replaced your main at link time

Wrong. If anything, language minutiae obscure the design and analysis of algorithms. The main concerns are correctness and how time/space requirements grow with input size.
It doesn't matter whether you're copying or modifying if your algorithm is exponential or does the wrong thing.

.c file -> C preprocessor -> translation unit -> C compiler -> object file -> linker -> executable

That's always a possible attack vector. You can use LD_LIBRARY_PATH, or replace stdlib functions inside statically linked executables. IDA even comes with recognizers for the machine code of common stdlib functions.
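
The LD_PRELOAD flavour of the same idea fits in a dozen lines. A sketch for glibc/Linux, where ./victim is whatever dynamically linked program you want to interpose:

/* interpose.c
 * Build: cc -shared -fPIC interpose.c -o interpose.so -ldl
 * Run:   LD_PRELOAD=./interpose.so ./victim
 */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stdio.h>

int puts(const char *s)
{
    /* Find the "real" puts further down the link order. */
    int (*real_puts)(const char *) = (int (*)(const char *))dlsym(RTLD_NEXT, "puts");
    fprintf(stderr, "[interposed] ");
    return real_puts(s);
}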

I don't know about libretro's API, then. I'm mainly familiar with EGL, which is pretty likely the thing libretro would be using internally.
It's actually pretty rare for those sorts of "opengl wrapper" libraries to expose the EGL bits you need for frame reuse to work properly.

would you use OOP for an assembler?

No

...to get mentorship from him, specifically. Because he's focusing on underrepresented groups. It's not like there's a dearth of resources for white men to get into programming. Most programmers are white men.

ok cool, thanks

Fuck python and fuck Guido "fuck white people" Rossum

those aren't white male exclusive resources

And they don't need to be, because obviously white men aren't in need of targeted mentorship as a group, as they are the majority in the industry

>let's not target our target demographic
These people have never studied marketing.

got me there

>hello I am white and male and I would like a mentor
>yeah but you don't NEED a mentor because look at all these other white male programmers

nigga programming be hard
you finna gotta read books an shiet
computaz are raycis

just go find someone else who will mentor you then.

Obviously if you're in a professional environment, nobody at your workplace should be denying you help because of your identity. But at the end of the day Rossum can do whatever he wants with his time; it doesn't really matter what you or I think about it.

Why would white men even need to? They have working brains, they don't need handholding from fucking guido.

>I have decided not to help white men
how will these sorts ever convince the unconverted that they aren't doing this because white and male self-hatred is fashionable?
I feel bad for the poor minorities receiving "help" from Guido. That's like choosing Python as a first language - you'll NEVER be a good programmer without serious rehabilitation.

or not real self-hatred, but a sort of fake one projected as a social display (not that it doesn't often cause more damage than genuine hatred of whites/males would)

insights.stackoverflow.com/survey/2019#developer-profile-_-undergraduate-major

There may be dumb employers unwilling to hire you, but smart employers hire anyone who is capable.

I mean, what do you expect of that guy?

Attached: rossum_sicp.png (604x417, 70K)

I want to develop a game in C++, Roguelike-style + RPG elements. What should I use? libtcod?
inb4 Python

C++

why are you answering your own questions?

Because for me it's quite difficult. I wanted to code my own small engine for an RPG, but it would take me a year, whereas a skilled developer could get it done in a week, I'm sure.
Maybe there are better approaches I'm not aware of.