A.I

>he isn't studying A.I./Machine Learning/Deep-learning
>his job will be among the first to be taken by robots
>even if he escapes to the """"arts"""" bots will do it better than him too

Face it Jow Forums, there is only one job that's safe from being taken by automation. And by the time it's also taken, we will either be dead or living like gods

A.I. chads assemble

Attached: 1529254507533.gif (403x500, 142K)

This field is an overblown fad that will die off once the economy crashes. A.I. can only do very specific tasks when fed billions of data samples; it's not even close to biological intelligence in general problem solving and abstract thinking, which is what it would take to replace more than a fraction of current occupations, let alone programming jobs. Nu-male redditors like you will say general A.I. is right around the corner and that Moore's law is evidence of this. But faster computation cycles have nothing to do with emulating the insane complexity of the brain, and Moore's law is estimated to end within a couple of decades anyway.

Moore's law is already ending, is it not? All the major chip producers are forced to prioritize lower power consumption over increasing performance

Either now or very soon. You can only fit so many transistors on a chip before they approach atomic size. And energy dissipation may already be a problem.

>he's "studying" AI/ML/DL
>his "job" will be to feed the robots data
>he still has the better/worse dichotomy for arts

Attached: eh.jpg (165x213, 14K)

>This field is an overblown fad that will die off once the economy crashes.

The field isn't anywhere near overblown, and as A.I. permeates more of the economy it will only expand and demand more professionals. Nice fear mongering though.

>A.I. can only do very specific tasks when fed billions of data samples

And only by doing that it is already capable of displacing hundreds of millions of jobs

>it's not even close to biological intelligence in the ability of general problem solving and abstract thinking

And the first plane wasn't anywhere near capable of extraterrestrial travel. Nevertheless, the burgers got to the moon 60 years after its invention

>Nu-male redditors like you will say general A.I. is right around the corner and that Moore's law is evidence of this.

Since you are so insecure you need to resort to name-calling and putting words in my mouth, I can only assume you flip burgers for a living and are shitting your pants at the robots that can already do it faster, cheaper, and better than you.

>But faster computation cycles has nothing to do with emulating the insane complexity of the brain, and Moore's law is estimated to end within a couple decades anyway.

Again, retard, Moore's law isn't necessary for A.I. to advance. The current rate is already enough for most experts in the field to predict the emergence of A.G.I. within a few decades, and by now it's a matter of improving algorithms and data far more than mere hardware.

>he isn't studying AI/ML/DL
>he won't have a job

Attached: 1491339070908.gif (340x340, 134K)

I've been on Jow Forums since it was a slow board. Back when /prog/ was a thing there were people who said AI would always be stuck in meme-tier capabilities

Equating a fancy application of gradient descent to AI should be a punishable offense.
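In fairness, the offending "fancy application of gradient descent" is small enough to fit in a post. A minimal sketch, fitting a one-parameter regression y = w*x by minimizing squared error (the data and learning rate here are made up):

```python
# Gradient descent on mean squared error for a single slope parameter w.
def fit_slope(xs, ys, lr=0.01, steps=1000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # derivative of mean((w*x - y)^2) with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # true slope is 2
w = fit_slope(xs, ys)      # converges to ~2.0
```

Deep learning is the same loop with millions of parameters instead of one.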

>at work
>guy wants to do supervised learning but only has 50 training examples
Also, any tips to "level up" my ML skills?
Looked at a questionnaire for a research group and the questions were hard.

Attached: 1514483951148.jpg (700x700, 110K)

Any developer job is automation-proof. By the time AI can actually write code we will already be at the singularity.
Also deep learning is not nearly enough to qualify as AI. It's a fancy brute-force regression tool, nothing more. Just another tool in the toolbox.

ML chad here whaddup

>Why is dimensionality reduction in a streaming setting challenging? Describe three issues and how you might solve them (no more than one paragraph for each).

Attached: ako.jpg (665x574, 29K)
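For what it's worth, one textbook-style answer to that question is to keep running statistics instead of the data itself. A rough numpy sketch of streaming PCA via an incrementally updated mean and covariance (the class name, dimensions, and batch sizes are all invented for illustration):

```python
import numpy as np

class StreamingPCA:
    """Top-k principal directions from data seen one batch at a time."""
    def __init__(self, dim, k):
        self.n = 0
        self.mean = np.zeros(dim)
        self.cov = np.zeros((dim, dim))  # Welford-style accumulator
        self.k = k

    def partial_fit(self, batch):
        for x in batch:
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.cov += np.outer(delta, x - self.mean)

    def components(self):
        # eigendecompose the running covariance estimate on demand
        vals, vecs = np.linalg.eigh(self.cov / max(self.n - 1, 1))
        return vecs[:, np.argsort(vals)[::-1][:self.k]]  # top-k eigenvectors

rng = np.random.default_rng(0)
spca = StreamingPCA(dim=8, k=2)
for _ in range(10):                        # pretend batches arrive over time
    spca.partial_fit(rng.normal(size=(50, 8)))
W = spca.components()                      # (8, 2) projection matrix
```

The three classic issues this dodges or runs into: you can't revisit old samples (hence running statistics), the distribution can drift (a decay factor on the accumulator helps), and memory must stay O(dim²) regardless of how many samples arrive.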

dumb Satania poster

>Since you are so insecure you need to resort to name calling
>Again, retard,
Please no bully.
I remember fondly the time when I was flipping burgers, for I have a positive view of life and its opportunities.
All you have is a monomania which makes you dismiss other possibilities.

>50 training samples
Give up. ML is about producing massive inferences from tiny datasets, not nonexistent datasets.
That's why innovation in ML only comes from places where datasets are easy to generate or procure.

look into one-shot learning techniques. the inference ability isn't too great though
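To make that suggestion concrete: the degenerate version of one-shot learning is nearest-neighbor over one stored example per class. Real methods (siamese or prototypical networks) learn an embedding first; this sketch skips that and compares raw feature vectors, and every name and number in it is made up:

```python
import math

def nearest_example(support, query):
    """support: list of (feature_vector, label); returns the label of the
    stored example closest to query in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(support, key=lambda pair: dist(pair[0], query))[1]

support = [([0.0, 0.0], "cat"), ([5.0, 5.0], "dog")]  # one example per class
label = nearest_example(support, [0.5, -0.2])          # -> "cat"
```

With only 50 examples, everything hinges on how good the feature vectors are, which is the part this sketch leaves out.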

I have this vision of the future where there are only one or two big corporations that do literally all "programming". You'll have this single WYSIWYG interface with a hyper-sophisticated backend to design any application imaginable more efficiently than any hand-coded program.
Being unable to compete, programmers not working for a mega-corp will be forced into irrelevance.

Designing an application is as much an art as it is a science, I don't think you can teach an AI model to derive some optimal data model for a given use case or collection of use cases.
At least, something like that is in the far future.

>AI model to

Where have I heard this before

Give it a shot, I dare you.
I am 100% confident that there are zero neural network architectures even remotely suited to the task.
Guess how I know.

As companies embrace buzzwords, a shortage of blockchain cryptocurrency connoisseurs opens up. Only the finest theoretical code artisans with a background in machine learning (20 years of experience minimum) and artificial general intelligence (5+ years of experience) can shed light on the future of quantum computing as we know it. The rest of us simply can't hope to compete with the influx of Stanford graduates feeding all the big data to their insatiable models, tensor by tensor. "Nobody knows how these models really work, but they do, and it's time to embrace them," said Boris Yue, 20, self-appointed "AI Expert" and "Code Samurai". But Yue wasn't worried about so much potential competition. While the job outlook for those with computer skills is generally good, Yue is in an even more rarefied category: he is studying artificial intelligence, working on technology that teaches machines to learn and think in ways that mimic human cognition. You know, just like when you read a list of 50000000 pictures + labels and learn to categorize them through excruciating trial-and-error processes that sometimes end in an electrified prod to the back and sometimes don't. Just like human cognition, and Yue is working on the vanguard of that.

Attached: aus.png (364x313, 7K)

"Art" is just the exploitation of certain patterns to instigate a feeling in humans, be it good, bad or neutral.

If there are patterns in it, then in principle a neural network can learn it. Such a NN may not exist yet, but I'm 100% sure it will one day exist.

Guess how I know

Data models aren't a result of patterns, they're a result of reasoning.

what is that even supposed to mean

>machine learning is AI
Truly no worse meme

>You'll have this single WYSIWYG interface with a hyper sophisticated backend to design any application imaginable

Meanwhile in the real world we can't even get infinite scroll working properly, and practically every OS and program in widespread use is barely working hacked-together garbage. Forget a WYSIWYG programming interface, Word can't even get a WYSIWYG document interface to work without layout glitches.

We're a long way from your dream world bud, and it doesn't look like we're really getting any closer over time either. We just keep adding more hacks on top of the pile.

unless you have a PhD in one of those fields and work as a researcher at a university, you will be jobless

exactly this.

anyone who disagrees is a brainlet.

Deep Learning Is Not AI.

wtf is this reddit garbage

didn't even bother reading, sage

>And only by doing that is capable of displacing hundreds of millions of jobs already
We gotta keep bringing in all these illegals, visa workers, and immigrants though!

Boomer here. You zoomers need to learn your history.

en.wikipedia.org/wiki/AI_winter

Hey guys, what's going on in this thread?

Attached: 1537149565743.jpg (565x505, 47K)

i want to nakadashi satania

You fool, Deep Obama will destroy us all

Attached: Deepfake_Fin.png.jpg (582x388, 29K)

Is it a good idea to use recent news articles to affect the weights on a stock market predictor? I'm making one for a class and thought it'd be neat to incorporate.
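It can work for a class project, but sentiment extraction is the hard part. The crudest possible version is a word-list score appended as an extra feature, like the sketch below. Everything in it (the word lists, the feature layout) is invented for illustration, not a real trading signal:

```python
# Toy sentiment scoring: count positive-word hits minus negative-word hits.
POSITIVE = {"beat", "surge", "record", "growth"}
NEGATIVE = {"miss", "plunge", "lawsuit", "layoffs"}

def sentiment_score(article: str) -> int:
    words = article.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def with_news_feature(price_features, articles):
    """Append the day's summed article sentiment to its price feature vector."""
    return price_features + [sum(sentiment_score(a) for a in articles)]

features = with_news_feature([101.2, 0.03], ["Earnings beat estimates, shares surge"])
```

Serious versions replace the word lists with a learned text model, but for a class the principle is the same: the news becomes one more input column the predictor can weight.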

I'm 35 and make $120k as a fullstack dev. I feel like I should switch to ML/AI, but maybe it will take too long to learn, and I wouldn't have the credentials to get hired anyway. I hate my job, but maybe I should just grind out whatever I can so I can save and invest as much as possible while I have the $120k job... ugh. :/

Where do you live?

texas

It will solve certain problems better than humans, but will likely just augment their jobs instead of replacing them. HFT took over stock trading, but it's just automating the traders' existing strategies, which they still have to program.

>he thinks AIs won't self-replicate, making AI developers useless
>he doesn't realize that everything AIs do is through interfaces I design
>he doesn't realize his entire concept of a computer is through the lens of the things I create

You need 10,000 samples to get anything and 100,000 samples to get anything worthwhile. Most ML projects have over 1,000,000 samples.

I focused on compiler, os, and other systems courses in university.
I don't regret my decision at all, but where should I start if I want to get into machine learning?

Specifically, what kinds of math do I need to brush up on and how do I start learning?

fast.ai has a pretty cool intro to deep learning series that seems to be accessible, you should check it out. It's aimed at people with programming experience but not ML experience. You might want to review some linear algebra and calculus, but assuming you've been out of school for a while, you'll probably find it to be a more enjoyable experience if you try just picking up what you need along the way

Linear algebra and advanced applied statistics. There are ML books on the MIT OCW course lists.
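As a taste of why linear algebra comes first: a dense neural-network layer is just a matrix multiply plus a bias, followed by a nonlinearity. A numpy sketch (all shapes here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(4, 3))   # 4 samples, 3 input features
W = rng.normal(size=(3, 2))   # weight matrix: 3 inputs -> 2 outputs
b = np.zeros(2)               # bias vector

out = np.maximum(X @ W + b, 0.0)   # ReLU(dense(X)); shape (4, 2)
```

Everything from here (backprop, convolutions, attention) is variations on composing and differentiating expressions like that one, which is why the matrix-calculus chapters pay off quickly.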

Thanks guys, I will check out your recommendations

> i am deadbeat idiot only capable of jobs easily replaceable by rural poor people that can barely speak the language

Is it time to spot the entitled amerilard child already?

>not studying ML because its literally the only field of science where you can publish research about cute anime girls
see arxiv-sanity.com/search?q=anime

You can just upload to arxiv, my dude.
But yes, there are conference papers on anime.