Is learning to program increasingly becoming a waste of time? Major software is starting to rely on machine learning, deep learning and NNs to accomplish tasks. Someone who knows these is infinitely more valuable than a typical programmer.
Should I drop everything and 100% focus on these fields?
>Major software is starting to rely on machine learning, deep learning and NN to accomplish tasks.
lmao ok name them
David White
Google (image recognition, Google Home, etc.), social media (directed advertising), everything medical related (radiology, brain scans), everything military related (automated drones, weapons).
All those have products powered by AI. Programming how something works line by line is going to be caveman shit in the next decade.
Julian Adams
>on machine learning, deep learning and NN Which blinker fluid should I use? Is SKF muffler bearing compatible with Euro5 emissions? Should I replace piston return springs in my honda?
Anthony Barnes
>Which blinker fluid should I use? Is SKF muffler bearing compatible with Euro5 emissions? Should I replace piston return springs in my honda? What point are you trying to make? All those questions can be easily answered by AI provided it was given enough data to train on.
Brayden Davis
Shhh don't tell Jow Forums, they get angry when you mention that a field that requires rudimentary mathematical skills is going to be the future of the industry.
AI makes menial programming more efficient and faster. Code still needs to be written by humans, and ideas need creative sparks that only humans can provide. We're nowhere near AI being able to "out-think" us yet.
Dylan Robinson
>a field that requires rudimentary mathematical skills What the fuck are you smoking?
>ideas need creative sparks that only humans can provide Yes, creative ideas are still needed. The heavy lifting, however, is accomplished by AI. Instead of having 50+ programmers working on a product, you only need 1 or 2.
>AI AI is a meme. You can't call an Excel spreadsheet with a lot of data and functions that calculates correlations an AI. AI is much more complicated than that.
Benjamin Gutierrez
you didn't name any specific "major software"
Aaron Ramirez
College math is rudimentary.
Anthony Butler
>multibillion dollar client comes in
>hello yes I want a software that does this and this
>here you go, have this software that relies on machine learning, deep learning and NN to accomplish tasks
>three months later
>multibillion dollar client: okay I want to add feature X, remove feature Y, and apparently the Z module has some bugs, what can you do about it?
>ugh... here's another neural network...
>I PAID YOU A BILLION DOLLARS, WHY CAN'T YOU FIX IT
>w-w-we only know how to machine learning, we never maintained software, what do you mean fix bugs and add features? that's not how deep learning works, how about another software for another billion?
>WHAT THE FUCK YOU FAGS ARE USELESS
>50+ programmers I think you're grossly overestimating the current ability of contemporary AI.
Noah Ortiz
OP, I think you misunderstand what programming is. Guess who designed and implemented those AIs? Some mathematicians and computer scientists leading an army of pajeet coders.
Luke Perez
That doesn't make sense. Considering you posted an anime reaction image, I'm just going to assume you've no idea what you're saying.
Alexander Sanchez
>What the fuck are you smoking? user, I
Nicholas Gray
Even defining just the requirements of a program to be consistent requires someone who has the mind of a programmer; otherwise it will become an inconsistent and illogical definition. So even if there were some magic "definition to program" thing, it would basically just be a more advanced and neat compiler. In essence, the amount of code needing to be written may become less, but there will always be a need for people who can solve these kinds of problems. And this is without all the other factors which will stop this from happening.
The only way "programmers" will become irrelevant is if machines become capable of true understanding. If this happens there is no "need" for a single human on this planet anymore.
Michael Taylor
Yes, you should stop learning right now OP, you've got us all beat and can comfortably sit back
Gabriel James
This. If you can't grasp undergrad math very quickly, at least on a surface level, you're a brainlet.
>College math is rudimentary. LOL. 1 + 1 = 2 is rudimentary. Not him, but what are you smoking and where can I get some?
Caleb Morris
The thing that machine learning does mostly isn't stuff programmers used to do, but stuff that no programmer could do before.
And it's still worth learning some programming. You need it for ML, it just isn't the only skill needed for the latter.
Jonathan Sanchez
that user didn't say 'undergrad math'.
they said 'college math' as a whole, as if paid research groups who work at colleges doing groundbreaking theoretical math don't exist
Nolan Cox
None of these things (or rather the components that are being done by ML) were "manually" programmed. ML enabled novel features and usages.
You seriously think people know how to manually program image recognition or directed advertising or brain scan analysis? Even for the non-programming plebs whose responsibilities were closer to those features it was impossible; the best they could do was put out unreliable surveys or have monkeys make guesses.
Brody Anderson
> undergrad math very quickly at least on a surface level you're a brainlet So, theoretical mathematics should be standard? Stfu, faggot.
>implying people are not coding the NNs Unless the world hits the singularity (and maybe even then), programming will always be in demand. Someone always has to create the starting point and maintain the codebase after.
Tyler Williams
>literally calc 1 and basic ML theory that uses biology 101 language as an analogy
The absolute state of Jow Forums...
Hunter Morales
>a summation, function application, addition and multiplication
This isn't even college math. You learn this in high school.
John Fisher
user, literally all that's going on in that picture is addition and multiplication. We're not asking you to classify the differential structures of higher-dimensional spheres.
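For the lurkers: the "addition and multiplication" in that picture is just a neuron's forward pass. A minimal sketch of the idea, with toy weights, inputs and bias that are made up here (not taken from the image):

```python
import math

def neuron(inputs, weights, bias):
    # weighted sum of inputs plus bias, squashed through a sigmoid
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

# toy values: two inputs, two weights, one bias
out = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(out)  # a single activation between 0 and 1
```

That really is the whole picture: multiply, add, apply a function.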
Robert Williams
That's because that image is showing you a basic example of how it functions. Of course it's meant to be simplified.
If you want to do more advanced techniques and tweak the NN, you need to know your shit. Actually, you need to know more than just your shit, you need to innovate.
Lucas Reyes
t. Brainlet Calculus students mad they cannot figure out what a derivative is.
Liam Williams
The math you posted is rudimentary linear algebra that is taught in the first semester of college in most engineering programs, and in high school in places that have good education systems.
Literally any engineer in a non-CS field will laugh at you if you struggle with basic stuff like this.
The statistics part of ML can be a bit harder, but the linear algebra bit is usually very trivial.
Ian Bailey
>mom, I randomly tweaked the sigmoid function again and saw if anything changed!
James Hall
well, yeah. It's not that hard to understand in a rudimentary way. You take classes to get into the details and rigor that you should be applying when working in that field.
Lincoln Kelly
>literally calc 1 and basic ML theory that uses biology 101 language as an analogy What did he mean by this?
Logan Rodriguez
> all that's going on in that picture is addition and multiplication ..All of math can be trivialized into addition and multiplication. Holy fuck, fuck off.
Nathaniel Barnes
Not him, but that picture is of a brain cell. Lol. I figure it's immaterial.
Juan Scott
>> ..All of math can be trivialized into addition and multiplication. Holy fuck, fuck off. Oh you sweet summer child...
Eli Hill
Fucking cunt, I meant
Jordan Lopez
>hurr why is a basic example so basic? ml is so easy!
Most concepts are easy enough to grasp at a rudimentary level that even a self-taught scp thumping Jow Forumstard can manage it.
Try deriving the equations for PCA or SVMs from first principles. Or as a basic exercise, do a backprop by hand. Can anyone explain what the fundamental learning problem is?
>tweak the NN, you need to know your shit. No you don't. A 3-month Coursera course can teach you the fundamental math behind it and how to design simple systems. If you want to design a complex ML system from scratch then sure, you need to be better at math, but that's because you're dealing with the equivalent of architecting a gothic cathedral, and you need to be able to prove that shit so you're not wasting time running off into the woods with your undies on your head. Though if you're a hobbyist you totally can do that and get something working, because you don't have corporate breathing down your neck, asking why they're paying you hundreds of grand and wondering if they can replace you.
Anyway, you posted what amounts to a brick and went "whoa look at this!"
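Since "do a backprop by hand" got thrown out as the basic exercise, here's what that looks like on the smallest possible net: one input, one weight, one sigmoid neuron, squared error. All numbers are made up for the sketch:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# forward pass: y_hat = sigmoid(w * x), loss = (y_hat - y)^2
x, y, w = 1.0, 0.0, 0.5
y_hat = sigmoid(w * x)
loss = (y_hat - y) ** 2

# backward pass, chain rule by hand:
#   dL/dw = dL/dy_hat * dy_hat/dz * dz/dw
dL_dyhat = 2 * (y_hat - y)
dyhat_dz = y_hat * (1 - y_hat)  # derivative of the sigmoid
dz_dw = x
grad = dL_dyhat * dyhat_dz * dz_dw

w -= 0.1 * grad  # one gradient-descent step
new_loss = (sigmoid(w * x) - y) ** 2  # smaller than before the step
```

Trivial at this size; the point of the exercise is that real nets are this, composed thousands of times.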
Eli Reed
AI will eventually take EVERY job, including programmers. I would bet that will start being a serious issue for the next generation though, so I wouldn't worry much right now. Also, I guess programming complex systems is one of the last things AI will be able to beat humans at; there are tons of easier tasks that will be automated first.
>The math you posted is rudimentary linear algebra Okay, fair enough. I guess I didn't have enough context.
Michael Ortiz
You don't need that to use ML though, you just need to know a bit of linear algebra.
Anthony Diaz
>understanding is derivation from first principles I suppose none of us understand calculus either, because we just looked at a set of proofs by Riemann and Leibniz instead of deriving it ourselves. Your image adds matrices, transposition, and inverses. Where's the magic sauce we aren't supposed to get?
John Young
is it the Summa which is scaring you so much?
Logan Lewis
I too took Linear Algebra 101 my friend
Gavin Phillips
Which part of linear algebra do you need to know? All the basic algebra for a NN is already handled by most software packages. When something goes wrong, that's a problem with your math and not your code; how are you supposed to interpret that?
>en.wikipedia.org/wiki/Taylor_series You'd have to introduce the concept of limits of sequences; otherwise you're just giving me an approximation. I'm not sure how you're going to define those without at least some set theory.
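The approximation point is easy to see numerically: any finite partial sum of a Taylor series is only an approximation, and the limit is exactly where the analysis machinery comes in. A toy check using e^x at x = 1 (nothing from the linked article beyond the series itself):

```python
import math

def exp_taylor(x, n_terms):
    # partial sum of the Taylor series for e^x: sum of x^k / k!
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

# any finite truncation is off; more terms get you closer to the limit
rough = exp_taylor(1.0, 3)    # 1 + 1 + 1/2 = 2.5
better = exp_taylor(1.0, 10)  # very close to math.e
```

The value of e itself only falls out in the limit of infinitely many terms, which is the sequence-limit concept being argued about.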
Cooper Baker
>a lot of data and functions that calculates correlation That's exactly how the human brain works though.
Luke Carter
>hurr durr AI is supur smort and wunt need hoomans!!! >Proograming is for dum dums!!! >my mommy says i'm smurt yes user, it is a waste of time. You should sit in your moms basement and wait for the communist AI overlords to take over and create a utopia.
Bentley Williams
Curious, why are you so butt blasted by the ever increasing demand in these fields? Does it scare you?
Carson King
This pretty much. AI and Machine learning are useful for certain tasks, but always require some level of human supervision and are not capable of everything.
None of these things were written line by line by AI. You should take a class on ML/AI. ML is nothing at all like what you think it is.
A team of programmers wrote the ML code, then learning algorithms adjust weights, etc... There is nothing "line by line" happening in any of those examples. And behind the ML code is the framework that was used, which is also maintained by a large number of programmers. Behind those guys are even more programmers.
Caleb Kelly
No idea, but I'd assume if I worked in AI I would know. I never claimed to be an ML expert user, just that the math required is simple.
And it is.
Cameron Lewis
Careful user, the poster image digit matches the post number, may be a memegician.
Isaiah Rivera
No, not really. I'm actually dipping my toes in the field right now and hope to work with Boston Dynamics or similar in the future, so I can brag and because that will get me massive street cred.
I just find all this AI, ML, and NN memery to be incredibly fucking annoying, because absolute brainlets who don't know shit keep telling me, an actual fucking researcher, how "AI will totally replace all our jobs". They are the Dunning-Kruger effect incarnate; they just watched some TED talks and listened to famous retards and now think AI is anywhere near skynet or any of that bullshit. AI shitters who spew the meme of "le skynet XDDDD" are the snake-oil salesmen of our time; those retards should be lined up and shot to purify the gene pool imo. AI is magic to these morons; they don't understand that it's a glorified calculator at this point.
Logan Phillips
My company has a machine learning department and they run some program that figures out when we're going to quit the company, and it affects how we're offered raises and promotions.
Caleb Collins
>the human brain is rudimentary Please explain how we manage to see both a shape and a colour despite both data items never combining within the brain, or being in the same region.
Easton Hall
>in other news, some snake-oil salesmen created a RNG that says bullshit and cuts wages
Aiden Cox
Those were all created by programmers you retard. Machine learning algorithms don't magically create other pieces of software and other machine learning algorithms by themselves, what the hell are you trying to say?
Blake Sullivan
That's actually pretty funny.
Jason Thompson
An AI which can write arbitrary computer programs is the last program that humans will need to write.
At that point, it won't matter if you focus on machine learning or any other discipline. We'll have truly reached the "singularity" and everything that can be automated will be automated in short order, unless we decide otherwise.
Until then, we'll still need people to write software. AI that can write software is perhaps more difficult than any other task in machine learning - in fact, it is a fundamental result of computer science that there exists no algorithm which can take an arbitrary set of requirements and produce a program which meets them. As any CS major would know, there are uncountably many sets of strings, but only countably many Turing machines - you can't create a program which accepts an arbitrary (and for the non-CS readers, possibly infinite) set of strings and rejects all others, for every possible set of strings.
You can try, though, and human minds are the best tool we've yet discovered to do it. In my opinion, that threshold will mark the end of the "golden age of AI" and the beginning of "the singularity" - we'll have moved from AIs enabling humans to be more productive, amplifying our power and letting us focus on larger choices, to AIs making us irrelevant in the grand scheme of things.
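For the curious, the counting argument in the post above is the standard Cantor argument; a sketch (nothing here is specific to Turing machines beyond their having finite descriptions):

```latex
% Every Turing machine has a finite description over a finite alphabet,
% so the set of machines can be enumerated:
\[
  |\{\, M : M \text{ is a Turing machine} \,\}| = \aleph_0
\]
% A language is an arbitrary subset of $\Sigma^*$, so by Cantor's theorem
% the set of languages is strictly larger:
\[
  |\mathcal{P}(\Sigma^*)| = 2^{\aleph_0} > \aleph_0
\]
% Hence almost every language has no machine that decides
% (or even recognizes) it.
```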
Carter Foster
>MACHINE LEARNING WILL PUT SOFTWARE ENGINEERS OUT OF JOBS! If all it took was one recursive NN algorithm to put you out of a job, you didn't deserve one to begin with.
Henry Morales
Other user has it right, that sounds like snake oil to me. I'd bet they haven't run it for a year to verify the accuracy. People don't even know when they're going to quit a good portion of the time, and they have a fuck of a lot more information.
Ethan Bell
@echo off
echo [insert code here] > [file.ext]
HOLY SHIT SINGULARITY!!!!
Ryan Williams
Uber has already killed a homeless person. Tesla's autopilot is killing drivers at an alarming rate. The hype train is about to round a corner, derail on "AI", and kill the industry for another 30 years.
Your best bet is to focus on projects that seek to replicate the brain in software, something like Numenta. Current "AI" is a dead end.