/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

Previous thread:

Attached: dlangide.png (956x768, 208K)

Attached: 1533189188905.png (1294x3510, 866K)

Is there a way to bypass the "\x" in Python and use it as a literal "\x"?
I need to insert stuff into a database, and the lookup is based on a "\x" prefix which I can't figure out how to bypass.
"\\x" doesn't help, as it goes in as 2 backslashes and not only 1.

Shove it in me daddy

>if if if
rip

>It sounds to me like your renderer really only draws meshes
Yes.
Should it draw more? I figured since GPUs only draw triangles anyway, a mesh is all the renderer should be able to draw.

So batch is essentially two dynamic arrays: a dynamic array of vertices and a dynamic array of indexes.
And batch has methods like
drawTriangle(ShapeTriangle)
drawRect(ShapeRect)
drawCircle(ShapeCircle)
The Shape types are just structs.
And the methods basically extend the dynamic arrays of the batch.

Models are more complicated. They have texture(s) attached to them. Models could be sprites, at their simplest, or something much more complicated, like entire terrains.

Don't know Python, but normally it's "\\x"

What's a good language to start off game development with? I'm familiar with Python, Java and C++, but I'm assuming the latter two are the best in terms of library support

Might not be a suitable question for this thread, but I'll ask anyway. I'm going back to uni in two months and I need to have written a project specification for my final year's project. I want to do generated music using some form of AI, but at this point the most complicated things I've done are a knapsack password cracker using a GA and a river-crossing problem using an A* algorithm. I've watched some videos on music generation and a lot of it seems to be based on RNNs; however, I'm thinking this will be unsuitable for my project as my supervisor is an old-school AI guy, not a data scientist. He told me to look into cellular automata and more GAs. What would be the best way to incorporate these into a music-generation program? Is there anything in particular I should research?
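One concrete way to combine them: use a one-dimensional cellular automaton as the note generator and let a GA evolve the rule number or the pitch mapping against some fitness function. A toy sketch in Python (rule, scale and sizes are all arbitrary picks of mine, not anything from the literature):

RULE = 30                      # elementary CA rule; a GA could evolve this
WIDTH = 16                     # cells per row
SCALE = [60, 62, 64, 67, 69]   # C major pentatonic, as MIDI note numbers

def step(cells, rule=RULE):
    # next row: each cell looks at (left, self, right), wrapping at the edges
    n = len(cells)
    return [(rule >> (cells[i - 1] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * WIDTH
cells[WIDTH // 2] = 1          # single live cell as the seed
for _ in range(8):             # 8 rows -> 8 beats
    notes = [SCALE[i % len(SCALE)] for i, c in enumerate(cells) if c]
    print(notes)               # feed these into any MIDI writer
    cells = step(cells)

Other keywords that fit an old-school supervisor: L-systems, Markov chains, and fitness functions for GA-composed melodies.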

Ok, I am a stupid fuckwit who did not read your question completely. Sorry.

How do I set different indentation styles for curly braces and parentheses in Emacs?
My Emacs 25 on Debian produces this style of indentation:
void foo(int a,
int b)
{
struct Bar bar = {
a,
b
};
}

But my Emacs 26 on Arch with identical dotfiles produces this indentation:
void foo(int a,
int b)
{
struct Bar bar = {
a,
b
};
}

How do I configure Emacs 26 so it uses the Emacs 25 indentation style?
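cc-mode drives this with per-construct offsets, so pinning the style and offsets explicitly in your dotfiles should make both versions agree. A hedged sketch -- the style and offset values here are guesses to adapt, not a diagnosis of what changed between 25 and 26:

(setq c-default-style "k&r"
      c-basic-offset 8)
;; lines inside a brace-list initializer like { a, b }
(c-set-offset 'brace-list-intro '+)
;; continued argument lists line up under the first argument
(c-set-offset 'arglist-cont-nonempty 'c-lineup-arglist)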

What's the average salary for a Haskell programmer?
Surely it must be 300k+

0

dunno, len("\\x") is 2 in Python

>Should it draw more?
Thinking only in meshes and polygons is very narrow-minded. Yes, it has persisted to this day as the most popular way of rendering for various reasons:
>relatively cheap memory allows for large numbers of simple objects to be stored
>massive parallelization makes execution of many simple operations on a large number of objects faster than doing complex operations on a small number of objects
>culling allows reduction of the number of small objects to render
You can't have perfect curves this way though; you're always stuck with approximations. The amount of preprocessing and memory also increases a lot: a mesh of a "sphere" will consist of a LOT of vertices. It has to be constructed before it can be rendered, and that's usually done automatically: you create a sphere in your modelling software, extrapolate a limited number of vertices from its mathematical description, collect these into a batch, bake a mesh and then render that. The actual mathematical description of a sphere consists of just one parameter (its radius) -- a lot less memory!
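A minimal sketch of that extrapolation step in Python (sampling counts are arbitrary):

import math

def uv_sphere_vertices(radius, n_lat=16, n_lon=32):
    # sample the one-parameter description (the radius) into mesh vertices
    verts = []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat        # polar angle, 0..pi
        for j in range(n_lon):
            phi = 2 * math.pi * j / n_lon  # azimuth, 0..2pi
            verts.append((radius * math.sin(theta) * math.cos(phi),
                          radius * math.sin(theta) * math.sin(phi),
                          radius * math.cos(theta)))
    return verts

print(len(uv_sphere_vertices(1.0)))  # 544 vertices from one float

One float goes in, hundreds of vertices come out -- and that's before computing any index or neighbor data.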
>GPUs only draw triangles anyway
I don't know about that; I wouldn't be surprised to see them have the ability to directly render certain simple objects. Writing software renderers is more fun, since you're less limited by what operations your hardware is capable of. Projecting a sphere onto the camera plane would be trivial, too.
>And batch has methods like
>drawTriangle(ShapeTriangle)
Why would batch have those? A batch is a collection of identifiable points described by vertices. A batch doesn't draw anything, the renderer does. Either you're thinking in a completely different way, or your methods have misleading names. What does e.g. drawTriangle(ShapeTriangle) do? I'm guessing it creates a batch whose lists have a length of 3, with the vertices marking the corners of the triangle. Right?

I'm writing a simple 2D, turn-based strategy game, and I'm thinking about an efficient way to implement crowd-control effects. I thought of doing something like this:
#define STUN 0b001
#define ROOT 0b010
#define BLIND 0b100
and so on, and checking each bit with bit masks to find out if a player is, for example, stunned. However, I can't find a similar way to make the effects last for more than one turn. Any ideas?
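One simple extension: keep a small per-effect counter of remaining turns instead of (or alongside) the bitfield. A hedged sketch in C; all the names are hypothetical:

#include <stdint.h>

enum { EFF_STUN, EFF_ROOT, EFF_BLIND, EFF_COUNT };

typedef struct {
    uint8_t effect_turns[EFF_COUNT]; /* remaining turns; 0 means inactive */
} Player;

/* (re)apply an effect, keeping the longer of the old and new durations */
void apply_effect(Player *p, int effect, uint8_t turns)
{
    if (turns > p->effect_turns[effect])
        p->effect_turns[effect] = turns;
}

int has_effect(const Player *p, int effect)
{
    return p->effect_turns[effect] > 0;
}

/* call once per player at the start of each turn */
void tick_effects(Player *p)
{
    for (int i = 0; i < EFF_COUNT; i++)
        if (p->effect_turns[i] > 0)
            p->effect_turns[i]--;
}

The O(1) "is he stunned" check survives, and you can still build the packed bitmask on demand by ORing (1 << i) for every nonzero counter.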

Jesus fuck it was ... something and now it's working but I used

SOMESHIT = "\\xBLABLABLA"
client.run("""psql -U DBNAME -c "select ID from TABLENAME where THING=\'{}\'" """.format(SOMESHIT))

so escaping with \\x works, but I used ' quotes after the -c option and it terminated when I escaped the value later in the expression with \'
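Quoting headaches like that usually disappear entirely if you use a parameterized query instead of shelling out to psql; a hedged sketch with psycopg2 (connection details made up):

import psycopg2

SOMESHIT = "\\xBLABLABLA"

conn = psycopg2.connect(dbname="DBNAME", user="DBNAME")
with conn, conn.cursor() as cur:
    # the driver escapes the value itself -- backslashes, quotes and all
    cur.execute("SELECT id FROM tablename WHERE thing = %s", (SOMESHIT,))
    row = cur.fetchone()
print(row[0] if row else None)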

Maybe try a raw string

If you don't need ANY escape sequences in the string, you can use raw strings by putting an r in front of the opening quote
>>> print(r"\x923")
\x923

>Yes, it has persisted to this day as the most popular way of rendering
I think it's the only hardware-accelerated method of rendering, and hardware-accelerated rendering will always be faster than any software renderer.
>>GPUs only draw triangles anyway
>I don't know about that
I'm pretty sure of that.
>You can't have perfect curves this way though
What do you mean by "perfect curves"? Pixels are rectangular; you can never have perfect curves, you can only get better approximations.
And as far as approximation is concerned, just add more triangles. If they're small and the shader is simple, a high-end GPU can process over a billion (1000000000) of them per second.
>a mesh of a "sphere" will consist of a LOT of vertices
Only a couple hundred of them; over a thousand if it's in really high definition.
>has to be constructed before it can be rendered
Only done once, then the mesh is sent to GPU memory where it rests.
>The actual mathematical description of a sphere consists of just one parameter (its radius) -- a lot less memory!
Yeah, but GPUs can't render them, only triangles.
>draw
Yes, I was mistaken. The methods are called batchTriangle, batchRectangle and so on.

Can I somehow retrieve the actual value from a database query?
I can't do anything with the row, but I can use the value, so is there an SQL command to do that?

Instead of SELECT * FROM my_table
use SELECT attr1, attr2 FROM my_table

Yes, I thought so, but I'm still getting a row.
SELECT var FROM my_table WHERE x=y
I get
var
---
actual value
(1 row)
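If that output is from psql itself, the tuples-only and unaligned flags strip the header and the "(1 row)" footer, leaving just the bare value; a sketch reusing the query above:

psql -U DBNAME -t -A -c "SELECT var FROM my_table WHERE x = y"

If you're reading the result through a driver instead, the row is just a tuple; index its first element.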

what lang is this?

Attached: 1509448367258.jpg (1415x789, 91K)

What's the best way to learn JavaScript basics from the POV of someone who already knows better programming languages? I don't need to know everything, just some normal stuff about the language.

I'm trying to run a program from a bash script, detach it from the calling bash shell, and let the same shell exit without killing the child process in question.
So, taking inspiration from I don't remember which application launcher, I did something like this:
setsid myappl &

but the myappl child process gets killed with the shell unless I do something like:
setsid myappl & sleep 0.01

(or in general anything else that spawns another child process).
In this case, when the shell is killed, the child is "promoted" and adopted by PID 1.

What is happening here?

>inb4 fuck bash, already dropped bash, it's just curiosity now.
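For posterity, the usual belt-and-braces detach recipe also cuts the standard file descriptors loose and tells the shell to forget the job; a hedged sketch (myappl as in the post above):

setsid myappl </dev/null >/dev/null 2>&1 &
disown

The redirections stop the child from holding the terminal open, and disown removes it from the shell's job table so the shell won't HUP it on exit.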

looks like lua

I want to make a browser multiplayer game; dubs decide if I use Golang or Nim, arguments are welcome

you are never going to make it

Haskell

node.js

truth

Attached: shiggy.jpg (1280x720, 318K)

I'm gonna make it all right, tho it probably won't have players.

Serious responses only, this is a serious thread

Attached: phases of software development.jpg (600x632, 27K)

>I'm gonna make it all right,
If you were you would have already done it by now.

I'm finishing my previous shitty project and designing the game mechanics, so no

>previous shitty project
What would that be?

>it's the only hardware-accelerated method of rendering
But it doesn't necessarily have to be -- you could, in theory, make GPUs that operate on basic 3D shapes in addition to polygons. There is no law that says GPUs can only ever work with triangles and that's it. In fact, I'm pretty sure it's perfectly possible to implement this even on current GPUs, with the right drivers.
>hardware-accelerated rendering will always be faster than any software renderer
The complexity of the objects you're feeding into your renderer does not change the fact that you can parallelize lots of computations. There's just a point where it becomes less efficient (because your scene then consists of only a very small number of relatively complex objects, and thus there is only a small number of calculations to do at the same time), as long as you ignore memory limits (which are universal -- no CPU or GPU can handle arbitrary amounts of data at the same time).
>I'm pretty sure of that.
GPUs are also used for calculations that have nothing to do with polygons. Graphics cards are nothing more than a number of CPUs with reduced instruction sets and memory sizes, working in parallel. You can program them the same way. Bitcoin mining was done on them for a while, for example.
>What do you mean by "perfect curves"? Pixels are rectangular; you can never have perfect curves, you can only get better approximations.
You can get arbitrary* precision with actual mathematical descriptions, whereas with a mesh model increased precision is very expensive, as the number of neighbor relationships it needs to store grows quickly with the number of vertices. Calculating those neighbor relationships in the first place gets EXTREMELY computationally expensive. Say you're modelling a pool table in 3D and want to be able to zoom in extremely far: mesh models will have visible edges and overlap (or "collide" without visibly touching).

* subject to floating point precision, of course

Crawler that gathers statistics about some specific sites

websocket server
- libwebsockets (C)
- implemented in C++
- Protobuf
- python API

API returns data from circuit packs with custom coded FPGAs

x86 intel assembly is best for this task

I'm sure it is, but I'm choosing between nim and go

Attached: image-773266.jpg (506x662, 82K)

Instead of reinventing the wheel, Nim should have learned from Rust's error handling, with a proper optional/result type and match ergonomics.

have a good day

what's the best tool for handling WebSockets with Python+Flask?

I have only written basic vertex and pixel (fragment) shaders. I'm not familiar with geometry, tessellation, primitive and compute shaders, so there may be more efficient ways of rendering spheres/curves/other poorly-triangulated objects with the APIs we have, and there's Vulkan, which I haven't even tried. But I'm sticking to OpenGL ES 2.0 for now, since it's the common denominator and the smallest subset of OpenGL, and that's what it can render: triangles, lines and points.

Anyway, for now I am okay with my
Model -> Batch -> Renderer -> OpenGL pipeline.

I also may consider crystal

Are you planning on connecting to a websocket with Python, or writing the server in Python?
Python has a websocket library:
> from websocket import create_connection
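A minimal client sketch with that library (the websocket-client package on PyPI; the URL is a placeholder):

from websocket import create_connection

ws = create_connection("ws://localhost:9000/")  # placeholder endpoint
ws.send("hello")
print(ws.recv())
ws.close()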

>just add more triangles
Adding an additional vertex between each pair of neighboring vertices (halving each edge) of a sphere will quadruple* the number of vertices. To bake the mesh, you have to iterate over all N vertices in your batch and check all other N-1 vertices for the closest 3 neighbors, which is on the order of N^2 checks done naively. That's SLOW as fuck.
If you have objects interacting, your physics engine is working with mathematical 3D objects anyway. Why not use them? If only for the programming exercise.
>Only done once, then the mesh is sent to GPU memory where it rests.
GPU memory is limited. A very, VERY high number of vertices will overrun it. Mathematical base objects (cylinders, spheres, simple 3D polygons) are much smaller.
>Yeah, but GPUs can't render them, only triangles.
Why would they not be able to do it? Maybe the API you're using just doesn't support that. That's not a limitation of the GPU, but of the driver. A pure software problem. Again, GPUs have been used for lots of other types of calculations.

So you're just using high-level APIs without thinking about the underlying data structures and how they are handled at low level. Gotcha. That's fine, but it makes it really difficult to understand what you're actually doing, as your explanations are lacking.

* I did not check that number

There's nothing anybody can learn from Rust. It's a complete dead end in design.

writing the server
I tried gunicorn, but it's no good

Enjoy your 1970's exception try catch block, retard boomer

>learn from mistakes guise!
>optional/result type
>moves the problem to bazillion other locations

what

this will get you started
from gevent import monkey; monkey.patch_all()
from ws4py.server.geventserver import WSGIServer
from ws4py.server.wsgiutils import WebSocketWSGIApplication
from ws4py.websocket import WebSocket


class MyWebSocket(WebSocket):
    def opened(self):
        print("Socket opened")
        self.send("Socket opened")

    def received_message(self, message):
        print(message)

    def closed(self, code, reason=None):
        print("Socket closed %s %s" % (code, reason))


# listen address/port are arbitrary
server = WSGIServer(('0.0.0.0', 9000),
                    WebSocketWSGIApplication(handler_cls=MyWebSocket))
server.serve_forever()

i'll try it out, thanks

>So you're just using high-level APIs
What are the lower-level APIs? Are you suggesting Vulkan? Or should I read a book on how GPUs work, and then how OpenGL abstracts that?
>without thinking about the underlying data structures and how they are handled at low level. Gotcha. That's fine
It's not. I would like to learn more about GPUs. I think that would make me a better graphics programmer.

stackoverflow.com/questions/10247721/on-writing-a-linux-shell-script-to-safely-detach-programs-from-a-terminal
>A suitable delay before bash exit is necessary for the success of the script.
>I have not looked at the source code, but I guess a possible explanation is that bash may exit before setsid finishes configuring the execution environment of the program to run, if an appropriate delay is not applied.

>What are the lower-level APIs?
Those you write yourself.
>Or should I read a book on how GPUs work
Yes, do that.

To be fair, if you're just interested in making a working game engine and then want to move on to different things, just ignore this. But if you actually want to learn something worthwhile that may help you in other projects, go for it and learn it. Not everyone can write device drivers or low-level APIs, so that's a good niche if you're looking for a job.

Godspeed, user.

Learning Haskell via a Udemy course.
Haskell so far looks like a shitty Lisp.
Need to learn it for a minor I'm taking next year; I'll create my own language or something during it.

C#. Learn Unity and UrhoSharp.

>Protobuf

Attached: haram.jpg (525x380, 27K)

>To be fair, if you're just interested in making a working game engine and then want to move on to different things, just ignore this.
I want to learn how they make such realistic images in real time.
I wouldn't even know where to start if anybody asked me.
I feel retarded.

Attached: unreal.jpg (1280x720, 167K)

Why are exceptions stigmatized as expensive when they're basically a cross-function goto?
Isn't it true that
int foo(int a)
{
    try {
        int bar[100];
        Baz baz;
        do_something();
        if (a == 12) {
            throw Shit{};
        }
        do_another_thing();
        return 0;
    }
    catch (Shit &shit) {
        cleanup_code();
        return 1;
    }
}

and
int foo(int a)
{
    {
        int bar[100];
        Baz baz;
        do_something();
        if (a == 12) {
            goto shit;
        }
        do_another_thing();
        return 0;
    }
shit:
    cleanup_code();
    return 1;
}


will do almost identical things, i.e. both deallocate bar and baz and call the cleanup code in case a == 12?

Jesus Christ did I forget to put on my glasses?

What do you mean by that?

It's blurry as fuck, so imperfections aren't so noticeable. It's low-resolution and compressed as fuck. Did you take the screenshot from a YouTube video?
But yes, it looks quite nice compared to graphics a couple of decades ago. A lot of it is artistic shit like composition, lighting etc.
There's a high level of detail in the modelling: either extreme autism went into modelling it, or some randomized algorithm added complexity to a simple model. I wouldn't be surprised if there are plugins for modern 3D modelling software that do this.
The engine is ready-made as well, so whoever made that didn't even have to implement that part.
Honestly, scenes like that don't really show off anything impressive. Interior scenes with partial occlusion, transparent / translucent objects, multiple colored light sources and reflections are way more telling, and even then, pre-rendered scenes often use "cheats" to get around the computationally expensive operations. Read about ray tracing some time, it's very intuitive and was used in early 3D graphics but has become too expensive for modern levels of detail when baked lightmaps do a good enough job for interactive scenes (games).

missed that one somehow, thanks.

hi Jow Forums
I'm an absolute coding novice, so I was looking for some basic advice on where to get started.
For teaching, I'd like to build a programme that can simulate medical monitoring -- something like pic related, but it doesn't (yet) need to show ECG traces etc., just display values that can be changed as the medical simulation develops (e.g. blood pressure changes as treatment is given).
I'm happy to put work into learning how to do this; I'd just benefit from a push in the right direction.
I thought, since I'd run it from a Windows laptop (my current setup), I could build this in Visual Studio.
Is this a good idea, or should I be using something else? Any absolute-beginner materials you'd recommend?
Sorry to be so useless

Attached: 770078933_823.jpg (700x525, 71K)

Look at some fuzzy logic shieet

I read The Pragmatic Programmer and now I'm better than 99% of /dpt/... I'm lmaoing at you sad sacks

Do the java courses at mooc.fi. Have some patience, even if what you're describing is not too difficult, you'll probably need to spend a few months learning programming before you can take on the challenge without getting really frustrated.

How do I program a gf?

Attached: feelmouse.jpg (400x419, 53K)

There's also Code Complete, The Clean Coder, Thinking Forth, Agile Principles Patterns and Practices, GoF, Mythical Man-month...
Get reading, boy.

I have to read all those to beat the remaining 1%? Talk about diminishing returns...

I have a girlfriend. She's pretty nice; I love her. But sometimes I dream of a life where I have no friends, no social life, no girlfriend, and all I do is sit inside, miserable, and code all day. It wouldn't be as fun, but it would be productive.
Like those people who are in a wheelchair and don't have a choice but to sit inside all day and do cool shit.

I'm having a bit of a crisis today, lads. I feel like I'm a bad programmer. Like, what if I'm just bad at logical reasoning which inhibits me from being a good programmer or something? I'm getting top marks in classes, I have a few certificates under my belt, but for some reason I feel like a total fraud.

How do you even know you're a good programmer?

What are the necessary steps to be able to run x86 asm programs on my T420 with GNU/Linux? I write it in a text editor of my choice, then... what? Do I need a compiler, assembler, whatever, that creates the executables?

[code]from irl import gf[/code]

When you're employed and they don't fire you for incompetence.

ayy bb want to see my Python?

en.wikipedia.org/wiki/Impostor_syndrome
Common among programmers.

Don't worry bro, I think that as long as you stay humble and try to learn new things, you either are or are on your way to becoming a really good programmer. The shit programmers are the ones who think that they know stuff.

well, how did you do during the Advent of Code? If you didn't get at least a hundred points, you're definitely shit. If you got less than a thousand, you're probably shit. If you won the whole thing then I guess you're ok.

>When you're employed and they don't fire you for incompetence.

Well, I'm not even being hired so I guess I got my answer. Thanks.

lol, noob

just read the file and imagine it doing the instructions LOL!
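More seriously: yes, you need an assembler and a linker, both packaged by every distro. A minimal sketch assuming nasm, 64-bit code, and a source file prog.asm that defines _start:

nasm -f elf64 prog.asm -o prog.o
ld prog.o -o prog
./prog

GNU as works too if you prefer AT&T syntax, and gcc can do the linking if you name your entry point main and link against libc.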

>mooc.fi.
thanks, i'll take a look
So, for a beginner with intentions of making simple apps, do you think Java is the best language to try first?

yeah, Java is probably your best option for that kind of app and is a stellar beginner choice in general

Write an interpreter.

Threadly reminder that your favorite language is shit and you should feel bad for using it.

Seriously, what the fuck is wrong with you?

but I already knew that PHP is shit, I just can't quit it because it's so comfy

>ywn be a trap programmer
why do people become programmers otherwise?

Is it really? Unless you are using Android Studio, I don't know of any IDE that will help you with the fuckery that is developing UIs in Java.

I think Java UI development is mostly unpleasant for experienced developers. Bear in mind that for a beginner, a lot of the powerful tools in IDEs are just more things to learn.

Personally I'd build this as a webpage with D3, but I can't recommend that as that's basically 3 whole languages to learn.

Hmm, that is true. I wanted to recommend Android Studio but remembered how much googling you'd need to correctly set up their goddamn sample projects. If the guy is fine with working from the terminal, Python should do the trick?

Speaking of which: which language is the easiest to develop UIs for?

With that said, I agree, but I think his UI is so simple that it shouldn't be a large problem no matter how you do it. A short YouTube video should be enough desu.

Hello bois

I have a (C) question: is there a quick and easy way to populate a byte with n sequential 1s using the properties of bitwise operations? So far, the only way I have of doing this is:

unsigned char mask = 0;
unsigned char mask_place = 0;
unsigned char mask_size = 5;
while (mask_place < mask_size) {
    mask = (mask << 1) | 1;
    mask_place++;
}

Attached: 36729890_440247283157399_756729798255968256_n.jpg (1080x1350, 126K)

C# is great for UI development in terms of both memory usage and ease of development, but thanks to the GC and VM crap it is not suitable for large, graphics-intensive programs.

is there anything for Python 3 that helps out when you're writing a p2p desktop app?

x = 1

Thanks, but shouldn't it be
x = 1

nah
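The actual answer, in case anyone wants it: shift, then subtract. A sketch assuming n holds the number of ones and is at most 8:

unsigned char mask = (unsigned char)((1u << n) - 1); /* n low bits set */

Doing the shift in unsigned int (1u) keeps n == 8 from overflowing the byte before the subtraction happens.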