/dpt/ - Daily Programming Thread

Anime edition.

Previous thread: What are you working on Jow Forums?

Attached: teach_me_cpp_senpai.jpg (1440x810, 155K)

Other urls found in this thread:

gcc.gnu.org/onlinedocs/gcc/Vector-Extensions.html
pastebin.com/SR1g4b7q
twitter.com/NSFWRedditVideo

ur face

sound fx with sepples

Is it possible to get an AUMID of an application that's not been installed yet?

standard SIMD WHEN?

Attached: 4891234123.jpg (359x251, 13K)

this, also I want standard ways to disable denormal floats
fucking denormies

>that cat is in all likelihood dead now.

Attached: Christ_in_the_Wilderness_-_Ivan_Kramskoy_-_Google_Cultural_Institute_detail.jpg (604x588, 291K)

GNUC YOU FOOL
gcc.gnu.org/onlinedocs/gcc/Vector-Extensions.html

>non-standard
>g*u
>communism
please don't talk to me

clang also supports the vector extensions, so in every relevant way it's standard C.

>clang and gcc are the only c compilers

only relevant ones.
If you need another C compiler, it's usually gimped in some way and has platform-specific extensions for specialized needs.

>standard [insert any modern* (last 20 years) cpu feature]
>in C
kek no

Trying to wrap my head around pointers. I still don't understand what they do or why I should use them.

just use a build guard and generic fallback implementation

can you imagine being so stupid you willingly use C lol

Read a book.

They simply refer to other variables. It's not necessary to overcomplicate it or overthink it.

Redpill me on why you faggots program for free.

>working for other people for 50k/y when anyone can do a start up and get bought for millions
wwwwwwwww

No one is going to pay you for your retarded first time programmer code unless you are female

Then invest that money in crypto and make trillions.

I know people who almost can't write code and still have jobs. Where I live it's mostly about the degree.

>crypto
Jow Forums pls

Attached: [Coalgirls]_Yuru_Yuri_05_(1280x720_Blu-Ray_FLAC)_[62FD25E0].mkv_snapshot_12.25_[2018.07.11_09.59.52] (1280x720, 1.68M)

Thread 1

void *b(void *data) {
    while (1) {
        printf("test");
    }
    return NULL;
}

pthread_t t;
pthread_create(&t, NULL, b, NULL); // creates thread 2

Later in thread 1, end of program

return 0;

Does b keep printing or does it stop when return 0 is called on thread 1?

A code monkey is usually envious of a person with an academic degree. A code monkey claims to be able to "write code" (whatever that means) better than a professional with a degree, yet their code is completely worthless, because it is not built using theoretical frameworks which are taught to someone with a degree. A person with a degree is able to write highly optimized, concise, and demonstrably efficient code, whereas a code monkey usually writes nonsense code which routinely crashes, leads to inefficient use of processor and memory resources, and is overall worthless in a global, competitive market.
This is why nobody will hire you if you don't have a degree. It isn't because you're some misunderstood genius who learned to code on their own mitigating years of academic study. It is because your code is worthless.
Still think you're some misunderstood genius, and not a mere buffoon? Start a firm, and see how well that goes for you. Chances are, you will make a fool of yourself within the first days of trying to create anything of remote value.

>A person with a degree is able to write highly optimized,
then why do 9/10 CS shitters fail to program basic things?

Is there a source for this claim?

this thread.

hhmmmmmph
a = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100]
for b in a:
    #if not b % 3 == 0 or b % 5 == 0 or b % 3 == 0 and b % 5 == 0:
    #print b
    if b % 3 == 0:
        print "fizz"
    if b % 5 == 0:
        print "buzz"
    if b % 3 == 0 and b % 5 == 0:
        print "fizzbuzz"

Attached: 1508775640028.jpg (655x527, 59K)

No, it's a Jow Forums meme because most people on this board are either still in college or haven't started yet.

Starting your own business does not require the same skills as being a decent programmer.

I'm just a genius who lurks the web. I have quite a few nicknames and you can call me by any of them: "Hawking", "Einstien", or, sometimes, "Musk".

I collect ancient Japanese swords as a hobby, but don't worry I'm not a weeaboo, I actually respect the folded steel from the land of the rising sun. I'm also a self-taught mathematician and funnily enough I discovered almost all of proofs for the higher level stuff I never went to school for myself.

I wouldn't be surprised if you hear of my academic exploits in the future. I currently don't have a girlfriend, but it's because that's of my own choice. I'd rather have a deep and intriguing conversation about parallels of economic depression in our lifetime to those of Ayn Rand's "Atlas Shrugged" as opposed to stupid Instagram pictures.

As for programming, I picked it up at the age of four, as I wanted a way to generate the hardest Sudokus possible since anything else I found was far too easy. From there it just took off and now I'm sitting here with my own custom OS.

Programming languages I know:

C+

dll

binary (what I wrote my OS in)

html

are you kidding?

b will keep printing until the process exits. Returning from main calls exit(), which terminates every thread, so b stops when thread 1 hits return 0. If you want b to keep running, have main call pthread_exit() instead of returning.

I get paid to program in C#.

I program both for free and for money.

Working on an app in Ruby and developing status codes for it.
Is this a decent way of doing it?

Attached: Screen Shot 2018-08-06 at 7.34.59 AM.png (1024x600, 97K)

don't use status codes.

Lua vs Julia vs Lisp vs Elixir vs Go vs Kotlin
Who wins?

You don't need status codes unless you are making a webserver or something. Just output status messages to let the user know about progress and any errors that occur.

Well, there are several levels of status I want the user to know about. For example, being given a directory that doesn't exist isn't as bad an error as the configuration file not being set up properly.

I'm also planning on making it a Curses app, so I'm thinking of using different-colored messages for the different codes.

>What are you working on Jow Forums?
I've been ripping the fuck out of my DVD collection with MakeMKV and then transcoding to save space with Handbrake. So mostly I've been writing routines to manage this process, rename the files to match what Plex expects, move them to my NAS, etc. Irritatingly tedious work. My next step is going to be some kind of saved state in case the process stalls (e.g. power failure) during a run, as right now tracking the number of episodes and seasons depends on the process running start to finish. That's fine for a 12-episode animoo, but for normal /tv/ shows a good transcode can take quite a few days.

Of course by the time I'm satisfied with the code I will not have anything left to rip.

inb4 buyfag

I've just started learning C. I already know C#, and the change from a high-level language like C# to low-level C is hard, but not as hard as expected because I already know C-style syntax.

Please give me ideas of things to do in C. I need to practice but I don't have any ideas.

vastly different languages.
i'd pick elixir over Go for backends though.
Kotlin i'd never have to use.
Julia over lua and python.
and lisp doesn't really win anything so it can be discarded.

>we now live in a time where C is considered low-level
As for ideas I don't know. I only use C when I have to leverage libraries written in C, and even then that's usually only as a wrapper to make a shared library so I can load it via some better language's FFI.

C is just one step over assembly man

I use Racket for just about everything, mostly because I don't work on time-sensitive computation and I'm comfortable with it. If someone asked me when to use a lisp the answer is always "whenever you wish you could program your compiler." For most people this is almost never because most languages people learn don't even offer it in any way so no one understands how helpful it can be. But that's the answer.

Well if this program is going to be complex, then sure, go ahead and make a status code system. Look up common status codes used by other software to ensure some uniformity.

based

Interviews in Python3, Java or Go?

What languages do you consider low level, save for assembler?

Not him, but any programming language is by definition high-level.

>jmp_buf is an array type
why does C have to have so much stupid legacy bollocks?

What is your definition of a programming language?
Don't you feel that if we stick to that definition, we lose tools to talk about different languages? It means we'd call both C and Python "high level" languages, and we couldn't separate them as easily anymore.

I'm all for being more specific, but then we should get rid of the "high/low level" terms altogether and just stick to "gc / not gc" etc.

>Don't you feel like if we stick to that definition, we lose tools to talk about different languages
Of course not.

>Because that means we say that C and Python are both "high level" languages, and so we can't separate as easily anymore.
Of course you can, what a ridiculous premise you are asserting.

It was at that moment, that I realized it was bait.

It was just a remark on how times change. I do consider C low level. Also C++.

I don't have a use case needing elixir/julia, the others are Java clones, and lisp is a worse lua. So lua.

Numerical Recipes C or C++?

You're not giving me any arguments here my friend.

>Of course you can, what a ridiculous premise you are asserting.
Let me step you through this.
>Today
>Python is a high level language
>C is a low level language
>From this we infer certain differences between the languages, such as python having gc

>Your definition
>C is a high level language
>Python is a high level language
>From this we can not infer anything regarding the differences between python and c

I see, I understand completely that C was considered high level in the past.

How bad is it to write ~1000 lines in 1 file for a feature, THEN after it works you try to reduce any overlap (break out into smaller functions), and separate to other files for source/logical organization.

Is this a bad habit? Should I be able to build to the final design from nothing, instead of writing a big chunk and refining it later?

I feel a bit embarrassed looking at the clean commit history of my peers while they are in progress.

Attached: 1533270195756.jpg (1080x1079, 416K)

what should I be using for precise benchmarks in C? I've heard it's possible to use googletest from C, but is their benchmark framework also usable? How do I make it work? Any good alternatives?

You should always be ready to refactor and restructure. Nothing wrong with it

Elixir > Lua > Kotlin > Julia > Go > Lisp

There's nothing really wrong with it, however as you get more experience you'll probably predict the structure of things earlier.
It's better to do what you're doing than prematurely optimize imo

Can you give an example where macros really helped you? Just a high-level overview, no need for actual code.

I prefer your method of development because easily 90% of my code is one-off shit. Code that I've released has easily gone through about twenty rewrites over time as I learned more about how I use it, how others use it, etc.

Are you writing for an organization or is this a personal question? I found literate programming to help a lot for organizing code better. Spending even ten seconds to describe what code is supposed to do before I write the code made a lot more of the overall organization gel in my head.

ymmv

Yes but that sounds extreme. Why don't you try to organize it at least a bit from the start?

>You're not giving me any arguments here my friend.
A non sequitur (as in arguing two things that don't follow) doesn't warrant an argument, friend.

>From this we infer certain differences between the languages, such as python having gc
I hope this is bait, because if it's not, then you're being even more ridiculous than I thought. There are many "high-level" languages that don't have a garbage collector and rely on reference counting alone.

Also, I never use the high/low level distinction to differentiate between languages; how fucking stupid is that. If you absolutely need to classify Python or C, you'd say that C is a systems programming language more suited to manipulating bits and memory addresses directly, and that Python hides the gory details away behind abstractions. But they're both obviously high-level languages, because you can write a C program for one processor and it will run on another, whereas low-level means specific to the architecture.

>A non sequitur (as in arguing two things that don't follow)
Show me the non sequitur please

>if we say that A is a type X and that B is also a type X then it is harder/impossible to differentiate between A and B
C and Python are both programming languages.

>"oh no!! it is impossible to differentiate between them now that you said they are the same thing!!!!"

Thanks all for the advice.

>Are you writing for an organization or is this a personal question?
I am contributing to an open source project.

>Spending even ten seconds to describe what code is supposed to do before I write the code made a lot more of the overall organization gel in my head.
That sounds like a good method. I've taken to pacing a bit and thinking things through and it seems to have helped a little bit, but I need to get the balance of thought and action down. I might be thinking too much sometimes on things that change anyway.

Sometimes I can, but other times I don't know what the solution is going to / supposed to look like. Sometimes I just tackle problems step by step, implementing what is needed, as needed, then test it, and repeat.
After I have more experience with the problem, I can take what I know and basically re-implement it with a more sane design. But that's not clear from the beginning in all cases. It must mean I need more experience.

>It's better to do what you're doing than prematurely optimize imo
This seems to be the general consensus and I'm inclined to agree since a lot of things I write tend to get deleted. I started a habbit of keeping all my deleted buffers in a file to count the lines I'm not committing and it turned out to be somewhat large for moderate sized tasks. No sense in refactoring too early.

Now I'm sure you're baiting.
I didn't say you couldn't differentiate between them whatsoever, I even specified how you could without even talking about high/low level.

What I did say is that today high/low level is used to describe languages, and if you call all languages high level languages you lose those words.

>whereas low-level means that it is specific for the architecture
I think that this is a fine definition, though you're the first one I've heard use it. If that's what you want to use it for, fine by me.

I think I'm done responding now.

>I even specified how you could without even talking about high/low level.
They have vastly different characteristics, you dolt.

>What I did say is that today high/low level is used to describe languages
But they aren't, because all languages are high-level and only C and Fortran would even remotely be described as "low-level".

>I think that this is a fine definition, though you're the first one I've heard use it.
This is the definition from K&R, so I'm clearly not the first one.

I made this little Fortran 95 module for implementing variable length text strings.
module var_string
    implicit none

    ! Data type for variable length text string
    type varstring_t
        character (len=1), dimension (:), allocatable :: sText
    end type

contains

    !
    ! Convert given text into a variable length string.
    ! Memory is allocated for the string.
    !
    subroutine VarStringCreate(sText, varString)
        implicit none
        character (len=*), intent(in) :: sText
        type (varstring_t), intent(inout) :: varString

        integer :: iNumChars
        integer :: i

        iNumChars = len_trim(sText)
        print *, 'Creating variable string of length: ', iNumChars

        ! Memory allocation
        allocate(varString%sText(1:iNumChars))

        ! Copying chars
        do i = 1, iNumChars
            varString%sText(i:i) = sText(i:i)
        end do

    end subroutine

    !
    ! Remove variable string.
    ! Memory is freed.
    !
    subroutine VarStringDelete(varString)
        implicit none
        type (varstring_t), intent(inout) :: varString

        integer :: iNumChars

        if (allocated(varString%sText)) then
            iNumChars = size(varString%sText)
            print *, 'Removing variable string of length: ', iNumChars
            deallocate(varString%sText)
        end if

    end subroutine

end module

>C has abstract types and lets you define custom compound data types
>has a well-defined runtime that is guaranteed to be the same on any architecture
>allows you to write code that is human-centric and express intention for humans to understand
>not a high-level language

95% of the time I use macros whenever I want to abstract over procedures but can't accept eager evaluation of procedure arguments.

Example: the macro I wrote that I use the most is "andlet*". The "let" form binds the result of an expression to a name. "let" is just syntactic sugar over anonymous procedures, and procedure arguments are evaluated eagerly. It's quite common to want to bind a bunch of things in sequence but only proceed if the previous bindings were successful. If you wrote that by hand, you'd have a tedious sequence of "let" and "and/if" forms. The macro automates this and lets you write the code in the form that makes sense.

It takes literally zero time to read and understand the "andlet*" form I wrote but if I had to write it as a complicated sequence of bindings and conditional branches it would be horribly obscure and easy to fuck up.

Doesn't work and is fucking ugly nigger, learn to use range.

I know it's shit, but I'm having an issue wherein the program will only accept up to 2 lines to be removed from the list.
e.g, ./a.out rm 0 1 2 ; only lines 1 and 2 will be removed from the file.
paste: pastebin.com/SR1g4b7q

Nuts, I opened a new thread when I should have just posted here. Hopefully no one minds the double post

Hey Jow Forums, I'm a beginner programmer. I know the bare basics (data types, loops, arrays, functions) and I've taken first semester classes on c# and Javascript.

I'm interested in C. I'm not knowledgeable enough to know when to use what, but the simplistic nature, strict composition, and manual memory management have me interested. Even if I don't use it long term, I'm wondering if taking the time to learn it well would do me good in the long run for sheer fundamentals.

Would it be worth picking up this book and running through it? Does anyone here use C as their primary language?

Attached: The_C_Programming_Language,_First_Edition_Cover_(2).svg.png (1920x2698, 145K)

...

Does anyone know of a piece of software or website that allows me to make good looking user stories? Right now im using markdown but it looks ugly

Sorry, in that case I meant to say that only lines 0 and 1 would be removed, and 2 would be left

C is not a good language to learn programming fundamentals because it is difficult for a novice to abstract principles of programming from principles of C programming. A lot of good practices in C are there to combat how bad of a language C is.

But it's without a doubt the most widely supported language in existence so any time spent with it is well-spent.

>the simplistic nature
C is not a simple language by most meanings of the word "simple." Scheme is a simple language, you should be able to learn core R4RS scheme in an afternoon. But simple is relative and relative to most non-lisps C has a relatively simple syntax which is nice.

[ebuild N ] dev-lang/rust-bin-1.28.0 USE="-cargo -doc -rustfmt"
[ebuild N ] dev-util/cargo-0.29.0 [0.21.0] USE="-debug -doc -libressl"
[ebuild U ] www-client/firefox-61.0-r1 [52.9.0]

oh, I guess it's time to change browser.

autism

>Firefox is already slow to build
>throw Rust in on top of that
Just fuck me right up

I have a four hour on-site interview for a software job this Wednesday. The interview is divided up into four one hour long one-on-one sessions. I already completed a take-home assignment and a phone interview.

What should I expect and how do I prepare? This is my first software job interview coming from a mechanical engineering degree.

Attached: 1533061907290.jpg (1463x1485, 507K)

this better be a good job because 4 hour interview sounds like a bad joke

>c
It's 2018, not 1983.

C has had its time and place, but now it's pretty much dead outside a few very niche fields and legacy applications.

Trying to master C is just asking for a world of pain and agony for nothing gained. Mastering, or attempting to master, C won't make you a better programmer; it will make you concentrate on things that aren't relevant in almost all modern programming languages and development work.

There are features in newer languages that are either not present in C or require a large amount of screwing around to get in a similar manner.

If it wasn't for Jow Forums glorifying C most people would never think to learn it or have a reason to. So, stop trying to win the approval of a handful of retarded fanboys who probably don't even work and look at reality.

tfw
>code compiles with gcc and tcc
>valgrind clean

Attached: books.jpg (450x735, 218K)

What's considered normal? It's for a backend SDE position at a Seattle startup.

sounds like pasta

TIOBE index, bitch. Check it and kys, because you don't know shit.

>There are features in newer languages that are either not present in C or require a large amount of screwing around to get in a similar manner.
This is a remarkably vague statement. What features in particular are you talking about?

>4 hour interview + phone interview + homework to work at a startup

what de fug

Attached: 1525383476445.jpg (285x322, 43K)

>tiobe
Software quality. A meaningless buzzword.

Lambdas, cross-platform execution, web-auth libraries and web frameworks are the first things that come to mind.

not him but
>i name them
>you call them bloat or proclaim your 500 line define is just as good, or simply "not needed"

Day 2 of learning Python

Attached: CostOfMeal.png (1914x976, 103K)

Crystal