Will learning higher math realistically help me become better at programming in any way...

Will learning higher math realistically help me become better at programming in any way? I've been working as a programmer for a few years without any difficulties and have only ever used some basics of algorithmic complexity and binary arithmetic a few times, and even then it wasn't necessary to solve the problem. I never learned more complex math topics like abstract algebra or category theory; the most advanced uni class I had was multivariate analysis. Is relearning it from scratch (I think I've forgotten it all at this point, even the basics from high school), but going further this time, worth it?

Attached: 1.png (542x497, 53K)

Like I'd tell a filthy peep

Not unless you do graphics programming or whatever.
Protip:
Don't waste your time learning useless stuff.

Depends on what kind of programming you're doing. There are plenty of kinds of programming that use a little bit of math, some that use a lot, and plenty that use none.

>Don't waste your time learning useless stuff.
what do you recommend instead to progress as a programmer at an advanced level?

Yes, you can relate a lot of things from mathematical analysis to programming

Pick a programming project that is difficult but doable for you and do it, learning all the things it takes to accomplish it. Slowly try to move out of the "toy script" projects and into applications that actually have uses.

>I never learned more complex math topics, like abstract algebra or category theory
en.wikipedia.org/wiki/Linguistic_relativity

Unfortunately, you literally wouldn't /truly/ know, or possibly be convinced of, how useful it is until you learn it user. The "I haven't needed it so far!" excuse is a meme. Humanity didn't "need" the number 0 for millennia either, but that obviously didn't mean it wasn't useful.

pic related is probably the smoothest introduction that shows how it's useful.

Attached: 1501577140333.png (1530x1980, 214K)

but I already worked on huge commercial projects, I'm not at the toy scripts level. I'm just wondering how to progress now and I have a feeling that forgetting math is holding me back, even though I never needed it.

>I have a feeling that forgetting math is holding me back
forgetting math is literally holding the entire industry back

It's not that higher math will help you in every situation as a programmer in the form of math itself. Above all, it changes the way you think. It's a more abstract way of thinking, which makes your life easier, and not only in programming. Whether you really need higher math for mathematical reasons depends on your job.

Learning higher math can be a real pain. I would not recommend it if you don't need it.

Practice the basics. 20-30 minutes every day.

seriously, what is wrong with the code in the picture?

second this

Fucking programming.

Only check C if A and B are equal. There's no need to check every combination.

What modern language doesn't short circuit && operators?

Not what he is saying. He is suggesting not executing the innermost loop at all if a != b. There is no reason to.

The code itself isn't wrong. The problem is exactly what OP is concerned with: surely there's a faster way to do it (and there is) besides fully iterating over each array as is. Math and algorithms point to better efficiency.

O(groupA*groupB*groupC). You can do it in O(groupA*ln(groupA)+groupB*ln(groupB)+groupC*ln(groupC)).

#include <algorithm>
#include <vector>

bool disjoint(std::vector<int> groupA, std::vector<int> groupB, std::vector<int> groupC) {
    std::sort(groupA.begin(), groupA.end());
    std::sort(groupB.begin(), groupB.end());
    std::sort(groupC.begin(), groupC.end());

    auto A = groupA.begin();
    auto B = groupB.begin();
    auto C = groupC.begin();

    while (A != groupA.end() && B != groupB.end() && C != groupC.end()) {
        if (*A == *B && *B == *C) return false;

        // (rest reconstructed, the post was cut off): advance whichever iterator
        // currently points at the smallest value
        if (*A <= *B && *A <= *C) ++A;
        else if (*B <= *A && *B <= *C) ++B;
        else ++C;
    }
    return true;
}

Or you could do it in O(n) by iterating over each group individually and storing the result of each value in a hashtable, then on the final group iteration check if the other groups all added a value to the table there. This would simplify the code, especially if we wanted to support an arbitrary number of groups, and is overall faster than all the proposed solutions.
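For an arbitrary number of groups, a rough untested sketch of that idea (the function name and details are mine, not from any post above): keep one hashtable that counts, for each value, how many consecutive groups it has appeared in so far.

use std::collections::HashMap;

// Sketch: returns true if no value appears in every one of the given groups.
fn disjoint_n(groups: &[&[i32]]) -> bool {
    // for each value, the number of consecutive leading groups it has appeared in
    let mut seen_in: HashMap<i32, usize> = HashMap::new();
    for (i, group) in groups.iter().enumerate() {
        for &v in group.iter() {
            let count = seen_in.entry(v).or_insert(0);
            // only advance the count if v was present in all previous groups
            if *count == i {
                *count = i + 1;
                if *count == groups.len() {
                    return false; // v appears in every group
                }
            }
        }
    }
    true
}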

>seriously, what is wrong with the code in the picture?
LMAO! Lemme guess, you "program" java, huh codelet?

Programming in Assembly for a project or two will make you a better programmer, you'll see where all the flow control structures come from and learn how data structures are represented at the lowest level.

It's not about being able to find answers, it's about finding them faster than "bruteforcing". Sometimes the speed doesn't matter, sometimes it's crucial. It's about what the program is supposed to do.

Other than that, a complementary field of study is helpful in programs that use it directly. Simulating for a cure for cancer? Better study medicine/biochemistry.

Numerical analysis is a branch of applied math that's actually applicable to CS. It pretty much builds off of Taylor series, polynomial interpolation, and a few basic calculus theorems (IVT, MVT etc). Abstract algebra/category theory is interesting and all but probably too much for Jow Forums brainlets who stopped at calc in college.
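To give a flavor of that in code, a tiny made-up example (mine, not from anyone in the thread): approximating e^x with a truncated Taylor series.

// Partial sum 1 + x + x^2/2! + ... + x^n/n!, built term by term.
fn exp_taylor(x: f64, terms: u32) -> f64 {
    let mut sum = 1.0;
    let mut term = 1.0;
    for n in 1..=terms {
        term *= x / n as f64; // turns x^(n-1)/(n-1)! into x^n/n!
        sum += term;
    }
    sum
}

fn main() {
    // ~2.718281828459045 vs the real constant
    println!("{} vs {}", exp_taylor(1.0, 20), std::f64::consts::E);
}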

>Will learning higher math realistically help me become better at programming in any way?

See
Apparently yes

Only con is the memory footprint
> Like anyone cares in 2018

welcome to the leetcode memorization club

“If he [Thomas Edison] had a needle to find in a haystack, he would not stop to reason where it was most likely to be, but would proceed at once with the feverish diligence of a bee, to examine straw after straw until he found the object of his search. … Just a little theory and calculation would have saved him ninety percent of his labor.”

― Nikola Tesla

bump

You know the first condition that isn't met will already stop the remaining ones from being checked?

They are 95% separate
The fact that half of a normal computer science degree's classes are about programming is criminal

use std::collections::HashMap;

fn disjoint1(group_a: &[i32], group_b: &[i32], group_c: &[i32]) -> bool {
    // mark every value that appears in group_a
    let mut table: HashMap<i32, i32> = HashMap::with_capacity(group_a.len());
    for &a in group_a {
        table.insert(a, 0);
    }
    // upgrade the mark for values that also appear in group_b
    for &b in group_b {
        if let Some(cnt) = table.get_mut(&b) {
            *cnt = 1;
        }
    }
    // a value from group_c with an upgraded mark appears in all three groups
    for &c in group_c {
        if let Some(&1) = table.get(&c) {
            return false;
        }
    }
    true
}

There are other uses for advanced mathematics besides graphics programming. I took a machine learning course on Coursera, and it made use of gradients and directional derivatives for solving even the most basic regression problems.
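For a rough idea of what that looks like, a toy untested sketch (my own, not the course's code): fitting y = w*x + b by gradient descent on the squared error.

fn fit_line(xs: &[f64], ys: &[f64]) -> (f64, f64) {
    let (mut w, mut b) = (0.0, 0.0);
    let n = xs.len() as f64;
    let lr = 0.01; // learning rate
    for _ in 0..10_000 {
        // gradient of the mean squared error with respect to w and b
        let (mut dw, mut db) = (0.0, 0.0);
        for (&x, &y) in xs.iter().zip(ys) {
            let err = w * x + b - y;
            dw += 2.0 * err * x / n;
            db += 2.0 * err / n;
        }
        // step downhill along the gradient
        w -= lr * dw;
        b -= lr * db;
    }
    (w, b)
}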

You only need to check C if A and B are equal. If they're not, then the condition A == B == C obviously isn't true, so short-circuit evaluation isn't a problem here.

It's not that bad. Its running time is O(n^3), which is still in the polynomial range. What you really want to avoid is algorithms that run in exponential time.

You do know that function calls are computationally expensive, and that program might actually take longer to run than the one in the OP?

lol

Math can help, but there are a lot of shitty programmers in C.S. departments who are basically math professors.

Basically, the logic for both is related: programming can make you better at math, and advanced math can make you better at programming, or at least at understanding certain concepts.

All that said, you can kind of shortcut your way by reading books on data structures and algorithms (Grokking Algorithms) or reading the Imposter's Syndrome. Learning some cryptography and how it works doesn't hurt either. On the math side there are plenty of Calculus for CS, etc. training courses to help. A logic philosophy course doesn't hurt either.

>>Imposter's Syndrome

I meant to say Imposter's Handbook

Anything above O(n) is pretty shit desu.

ORLY? Quicksort runs in O(n log(n)).

This isn't reddit, you can reply to multiple posts.

ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2015/proofs/tp1-1/

My first instinct is to use a hashtable and add elements to it until I get a collision. That would be the most efficient but take the most memory, assuming we start the hashtable big enough for all three arrays.

If memory is an issue we could sort each array, compare the first elements, and advance the index of the lowest element.

Personally, I would loop through just the first two and check for equality, and if equality is found, then and only then do the innermost loop. That would reduce the time complexity to Ω(n^2), with that lower bound being very likely since a==b is the less likely outcome.

I'm not that good at designing efficient algorithms though. I have a bad habit of abandoning an algorithm design and moving on to something else once I have something that works.
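What I described above is roughly this (untested sketch, names made up):

fn disjoint_early_exit(group_a: &[i32], group_b: &[i32], group_c: &[i32]) -> bool {
    for &a in group_a {
        for &b in group_b {
            // only scan the third group for candidate pairs
            if a == b {
                for &c in group_c {
                    if a == c {
                        return false; // value common to all three groups
                    }
                }
            }
        }
    }
    true
}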

You only need a hashtable as large as the shortest array (possibly smaller if the array contains duplicates).

you can tell Jow Forums is full of dropouts when they're saying no

Sort all 3 arrays and iterate once
O(4n log(n))

Imo, learning advanced maths will help you more with reasoning than with just mathematical knowledge. If you master the logical basics and can write rigorous proofs, you'll be stepping your programming game up

I would burn the hay. easier to sift thru ash than hay

I like to say "When you need to find a needle in a haystack, use a magnet."

Not sure if I came up with that myself, or saw it somewhere long ago and forgot about it, but it pretty well sums up the philosophy of coming up with a clever solution to a problem rather than trying to brute-force everything.

>4n

Attached: ....jpg (250x250, 17K)

At the very least, yes, discrete maths will help. Download a textbook and do the needful, I think it really will help you sir

Probably just a typo, seeing as everything else is spot-on.

Opinions?

Attached: 112243.jpg (305x400, 78K)

If we draw a line between programmer and computer scientist/software engineer, then no. If not, then yes.

If each array has 1000 elements, you'll already take up a whole second.
Running this function on every pair is much faster than running on all three
And that would still be way too slow.
Since the arrays are integers, you can just sort them and then move along them checking if any two values are the same, which would reduce the complexity from O(n^2) to O(n log n), and log n is ~32 for 4000000000, so it's much faster than n^2
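The sort-and-walk check for one pair would look roughly like this (my own untested sketch, not from the thread):

fn pair_disjoint(a: &[i32], b: &[i32]) -> bool {
    let mut a = a.to_vec();
    let mut b = b.to_vec();
    a.sort_unstable();
    b.sort_unstable();
    let (mut i, mut j) = (0, 0);
    while i < a.len() && j < b.len() {
        match a[i].cmp(&b[j]) {
            std::cmp::Ordering::Equal => return false, // shared value found
            std::cmp::Ordering::Less => i += 1,        // advance the smaller side
            std::cmp::Ordering::Greater => j += 1,
        }
    }
    true
}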

>O(n^3)
lmao 10000 elements will cause that retarded algorithm to take minutes to finish

For (a)
  For (b)
    If a == b
      For (c)
        Check a == c

This can be less than O(n) though

More or less this.

if what you want to get into is very cs heavy, then math is probably a good complement. if you want to start writing webpage widgets then i wouldn't worry.

radix sort is linear
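A rough untested sketch of LSD radix sort on u32 keys (my own code, not from the thread), one counting pass per 8-bit digit:

fn radix_sort(v: &mut Vec<u32>) {
    let mut buf = vec![0u32; v.len()];
    for shift in [0u32, 8, 16, 24] {
        // count how many keys have each byte value at this digit position
        let mut counts = [0usize; 256];
        for &x in v.iter() {
            counts[((x >> shift) & 0xff) as usize] += 1;
        }
        // prefix sums give each bucket's starting position
        let mut pos = [0usize; 256];
        for i in 1..256 {
            pos[i] = pos[i - 1] + counts[i - 1];
        }
        // stable scatter into the buffer, then swap buffers
        for &x in v.iter() {
            let d = ((x >> shift) & 0xff) as usize;
            buf[pos[d]] = x;
            pos[d] += 1;
        }
        std::mem::swap(v, &mut buf);
    }
}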

Attached: oppression.jpg (750x721, 219K)