Programmer red flag

I will start:
He writes O(n) procedures.

Attached: Young_theodore_kaczynski.jpg (640x453, 222K)

>he calls himself a programmer and not a developer/software engineer
>he codes in *language I don't like but exists no credible reason why it's a bad language*

>Is a fanboy for legacy or meme languages.
>Talks about high-level concepts like strong typing or monads but doesn't have a clue about things people actually use, like OOP principles such as inheritance and polymorphism.
>Claims to be proficient in 4 or more languages. (Bonus points for languages outside of a certain common tech stack)
>Arxiv papers on high level academic topics bookmarked without an academic background or current academic studies.
>Anime wallpapers
>Laptop stickers
>Desktop riced to the point of it being impractical

What if O(n) is provably the lower bound for the procedure? Kill yourself.

>he/she/xir refuses to start working until they've had their coffee

Attached: il_570xN.642586078_sj3k.jpg (570x542, 49K)

They know Python.

He's not a straight white male.

>xir

Attached: 1483373673050.png (454x444, 523K)

They're not a virgin.

>hates black coffee/espresso

>implies writing O(n) procedures is bad regardless of context
red flag right there

I knew this would be triggering; just pick the best algorithm and everything should be fine. en.wikipedia.org/wiki/Time_complexity

Feels good not being a virgin

Your understanding of the analysis of algorithms is cargo cult tier.

>writes code to get the job done, not to get the job done *properly*, e.g. does not care about clean code or performant code
>enterprise developers
>uses emoji in code or docs (read: millennial developers)
>bad at english but uses machine translated identifier names
>prefers a single coding style (or none at all), independent of the language and culture thereof (e.g. writes code in the same coding style in JavaScript as in C#)
>ungrounded fanboyism

Oh and I forgot
>unironically hates microsoft or apple or open source software, because their peers do, not because of any personal experience

Every problem is bog(n) if you unlock the secrets of the universe

Attached: bogs.jpg (480x360, 13K)

>is addicted to caffeine like a common street urchin degenerate drug addict

major red flag

They're not Ukrainian.

>he calls himself a programmer and not a developer/software engineer
Hello, Mr. McKenzie.

They use the mouse more than the keyboard.

>strong typing
lol

>unironically hates ... open source software
Outside of a few famous eccentrics, is this a thing?

Laptop stickers? Who am I, a child?

Indeed.

>>Talks about high level concepts like strong typing or monads but doesn't have a clue about things people actually use like OOP principles like inheritance and polymorphism.
What else is there to talk about then?

Phew, dodged that one by using two mice but three keyboards.

>>uses emoji in code or docs (read: millennial developers)

Dear god, people actually do this?

Attached: Pablo Picasso in his studio, 1965..jpg (412x351, 20K)

>He writes all his functions in O(0) time

Attached: 1505819441982.jpg (750x738, 52K)

multimethods, inverted vtable calling order, and mutable vs immutable types

Anonymous enumerations in function signatures.

I refuse to believe anyone uses emoji in code or serious documentation. Docs for your own framework, sure. But a serious thing that businesses use to make money? No way.

That hit way too close.

>sad_frog.gif


Nasty.

Sometimes clean code and performance can't be combined. In that case I'd still prefer the one that performs better.

How de fuk do you write code to find a specific element in an unsorted unindexed list without O(n)

Attached: 1482838792434.jpg (1920x1280, 341K)

By not working with unsorted unindexed data in the first place. Do you have a specific real-world example?

If that's a sorting algorithm, that's actually great.
I bet you learned big O just yesterday and just want to look smart.

O(n) is far more desirable than something with exponential complexity

But those are all unironically good signs.
"OOP principles" are a meme unlike strong typing and the like.

t. red-flag-worthy programmer

>>He writes all his functions in O(0) time
how does that even work with scalable workloads

It doesn't.

what the fuck is O(n)

Red flag programmer detected.

Attached: 1379388446876.gif (493x342, 393K)

>shitposting about O(n)
>not knowing what it is

>But those are all unironically good signs.
flagged and kek'd

>cares about memes like running time
I mean, it's important, but caring about algorithms is just reddit tier circlejerk.

It means your algorithm loops n times, where n is, say, the length of an array; so if you had 9 million items you'd loop through 9 million items.
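A minimal sketch of that loop-n-times idea, assuming a hypothetical find_index helper over a vector (not anything from the thread):

```cpp
#include <cstddef>
#include <vector>

// Linear search: the worst case touches every element once, so O(n).
int find_index(const std::vector<int>& items, int target) {
    for (std::size_t i = 0; i < items.size(); ++i) {
        if (items[i] == target) return static_cast<int>(i);
    }
    return -1;  // not found after n comparisons
}
```

With 9 million items, the worst case is 9 million comparisons; double the input and the worst case doubles.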

bit too specific - "run time scales linearly as input size increases" is a bit more general I think

i lol'd and then checked

I like explaining things in super basic terms because it makes sense the very second you read it, even someone who isn't a programmer can understand that.

>TFW no stackoverflow for brainlets where every "How to do X" is written in super plain English

eh fair enough, I see where you're coming from. Might be best not to lock into the array example, though it is easy to understand.

what are the alternatives?

It depends on the problem.

literally any equation in terms of n

In general, O(equation) means that the run time of your function/method/whatever increases according to that equation

so O(n[]) would count as alternative?

>not knowing what it is

[]? What are you using that notation for?

>compiles C code with a C++ compiler
>believes that software development methodologies like agile etc. are not a subset of self-help literature
>has sexual relations with women

array of n

that's not an equation then?
You know, equations
n, n^2, n!, etc

For example, if you have two nested for loops each iterating from 0->n, you'd have a complexity of O(n*n), i.e. O(n^2). When n grows linearly, the required computation grows quadratically. While 10 units of data would require 100 computations, 1000 units would require 1,000,000 computations, etc. So the amount of processing power needed starts getting out of hand as your data set grows in size.
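You can check the nested-loop count directly; nested_ops is a hypothetical helper that just tallies how often the inner body runs:

```cpp
#include <cstddef>

// Two nested 0..n loops execute the inner body n*n times: O(n^2).
std::size_t nested_ops(std::size_t n) {
    std::size_t ops = 0;
    for (std::size_t i = 0; i < n; ++i) {
        for (std::size_t j = 0; j < n; ++j) {
            ++ops;  // "do stuff" would run here
        }
    }
    return ops;
}
```

nested_ops(10) is 100 and nested_ops(1000) is 1,000,000, matching the numbers above.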

I use emoji at times in commit messages just to piss people off.

Who are they again? Can I get a quick rundown?

You'd be surprised how many people think that this is the uniform to be some 'leet coder'.

They buy into all of this reddit/hollywood hacker culture bullshit with mechanical keyboards, meme Linux distros like Arch, multi-monitor setups, shows like Silicon Valley or Red Dwarf, and so on.

For these people the subculture and appearing to be a part of that subculture is more valuable than having technical skills or even being able to contribute as a developer to whatever.

Attached: quick-rundown.gif (380x380, 1.18M)

Okay, this is epic

>Eee wantz do geet ah jerb

By not being a faggot nazi

Having images of Hitler on your machine doesn't make you a nazi any more than having pictures of girls on your machine makes you attractive.

>>Arxiv papers on high level academic topics bookmarked without an (((academic background))) or current (((academic studies))).

>oop principles
listen here niggers, i'm starting oop next month and Jow Forums told me it was fucking shit for ages and now this guy's saying it's important. i thought it was a meme. who was in the wrong here?

I only write O(n!)

i'm more surprised that one can hate apple/microsoft and FOSS at the same time

DON'T FUCK WITH MY ANIMAYS NIGGA

Jow Forums is mostly unemployed. They don't understand why OOP is important for projects larger than fizzbuzz. Speaking as a professional software engineer, the information posted here isn't very valuable outside of hobbyist computing and autism.

IMO the biggest advantage of OOP is the ability to separate the interface from the implementation through encapsulation and polymorphism, which is incredibly important in larger systems for a number of reasons.

Don't talk ill of mechanical keyboards, dickhead.

Attached: 1503183213283.jpg (1130x508, 31K)

Christ, yes. OOP is a lifesaver wherever you've got a fuckload of stuff that is supposed to behave in a similar way... except for some details.

I've inherited a project where there were four separate variants of an analysis to run. The way the chucklefuck who wrote it handled that was ELSE/IFs. A *lot* of else/ifs, all handled by a small handful of God functions. And, obviously, zero tests of any kind.

Not only did that shit fall apart as soon as clients requested changes, it fell apart if you looked at the source too hard, or whenever the stars aligned.

The very first thing I did was rewrite whatever could be salvaged into proper OOP and what do you know, suddenly it became possible to make a single change without breaking everything, some of the time. My only regret is that I didn't just torch that pile of garbage altogether and start fresh from the get-go.
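A minimal sketch of that kind of refactor, with made-up names (Analysis, VariantA, VariantB are mine, not from the project above): each analysis variant becomes a class behind one interface instead of an else/if branch inside a god function.

```cpp
#include <string>

// Shared interface: callers only see this, never the variant details.
struct Analysis {
    virtual ~Analysis() = default;
    virtual std::string run() const = 0;
};

// Each former else/if branch becomes its own class.
struct VariantA : Analysis {
    std::string run() const override { return "A"; }
};

struct VariantB : Analysis {
    std::string run() const override { return "B"; }
};

// Dispatch goes through the interface; adding a variant touches no caller.
std::string run_analysis(const Analysis& a) { return a.run(); }
```

The point of the design: a new variant is a new class, not yet another branch threaded through every god function.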

>D-do i fit in yet guise?

>he is a try-hard sjw

int twodimensional[N][N];
int onedimensional[N * N];

Let's say we have an algorithm that makes only one pass over every element. The two-dimensional version is done like this:

for (int i = 0; i < N; i++) {
    for (int j = 0; j < N; j++) {
        // do stuff
    }
}

This would be O(n^2), whereas accessing the one-dimensional array is O(n), but both algorithms would take the same amount of time? How does that make sense?

Freshmen who think they're hot shit after finding out about time complexity

said "or", not "and".

The one-dimensional array's input is clearly of length n^2, you brainlet.

Attached: Emojicode.jpg (537x765, 81K)

kek, you seem like a gentleman. Come back when you can write simple assembler and I'll humour you with your ((((high level concepts)))) my fine chap.

>Laptop stickers
>Child
Self conscious like some white trash or nigglet.

>mac
>electron based editor

that's just n

Attached: DbocrgMVMAEIQ16.png (800x1028, 237K)
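To see why the two array traversals take the same time, count the work: both touch all N*N elements exactly once, so measured against the total element count both are linear. A sketch with hypothetical visit_2d/visit_1d counters:

```cpp
#include <cstddef>

// Nested loops over an N x N grid: visits every element once.
std::size_t visit_2d(std::size_t N) {
    std::size_t visited = 0;
    for (std::size_t i = 0; i < N; ++i)
        for (std::size_t j = 0; j < N; ++j)
            ++visited;
    return visited;
}

// Single loop over a flat array of N*N elements: same total work.
std::size_t visit_1d(std::size_t N) {
    std::size_t visited = 0;
    for (std::size_t k = 0; k < N * N; ++k)
        ++visited;
    return visited;
}
```

The confusion is about what n names: in terms of N (the side length) both are O(N^2); in terms of m = N^2 (the element count) both are O(m).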

if you use Swift you can use emojis as variable names
O(n!)

>he went to college for CS, Math, or any type of engineering
>he went to college

The red flag is that his name is Pajeet.
O(n) is the least of your worries.

>Pajeet
>mfw your office is hiring new talent from east

Attached: 1531354558195.gif (487x416, 3.75M)

How do they always fuck comics up?

Attached: chrome_2018-07-16_14-16-34.png (1372x1191, 446K)

>self-taught and completely oblivious to templates
>has no idea how O-notation works and its ramifications for the final product
>says that he is proficient in more than 2 languages
>jumped from company to company every 1.5 years
>brings dog to office

Scratch the last one, dogs are great

you ever heard of union-find with path compression and rebalancing, nigga?
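A minimal sketch of union-find with path compression (the rebalancing/union-by-rank part is omitted here), using a hypothetical DisjointSet struct: after elements are grouped, "are these two in the same set?" is near-O(1) amortized rather than a linear scan.

```cpp
#include <numeric>
#include <vector>

// Disjoint-set (union-find) with path compression only.
struct DisjointSet {
    std::vector<int> parent;

    explicit DisjointSet(int n) : parent(n) {
        std::iota(parent.begin(), parent.end(), 0);  // each element starts as its own root
    }

    int find(int x) {
        if (parent[x] != x)
            parent[x] = find(parent[x]);  // path compression: point straight at the root
        return parent[x];
    }

    void unite(int a, int b) { parent[find(a)] = find(b); }
};
```

Adding union by rank on top of this is what gives the classic near-constant amortized bound.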

>not coding everything in O(n!) and waiting for the singularity to expose NP as P-easy, so everything in modern Computer Science becomes O(n) inevitably

damn this post gave me cringe

Attached: 1323028063892.jpg (600x600, 36K)