AI General - /AIG/

General discussion about Machine Learning, Deep Learning and AI.

The Deep Learning textbook:
deeplearningbook.org/

Deepmind Reinforcement Learning & Deep Learning introduction playlist:
youtube.com/playlist?list=PLqYmG7hTraZDNJre23vqCGIVpfZ_K2RZs

Arxiv Sanity Preserver:
arxiv-sanity.com/

Deep Learning frameworks comparison:
en.wikipedia.org/wiki/Comparison_of_deep_learning_software

Attached: TF2.png (1190x676, 130K)

has ai improved your life?

Attached: 1494380697404.gif (580x536, 19K)

Soon about to graduate (master's) from compsci, and the closer graduation gets, the less idea I have of how anything in AI actually works and why; every model accuracy validation method seems like something pulled out of ass. And if that applies to me, I bet the overwhelming majority of AI businesses have no idea either and just roll with it, selling hogwash. Or maybe I have just suddenly become a blackpilled, mentally exhausted brainlet.

Attached: 1533440050677.png (632x720, 300K)

yeah. i don't have to download images to their respective folders. eyy ayy does that for me.

>the brain barely consumes 12W
>state of the art AI needs a fuck ton of power and many graphics cards, doesnt even produce intelligent behavior


Explain yourselves right now

youtube.com/watch?v=7ELRZrjCFd0

12 watts is the equivalent of about 10,000 (small) calories per hour. you're wrong kid

>that ethernet cable

I assume there's a whole server running in the background

It's actually 20W, but still

There is not.

en.wikipedia.org/wiki/TrueNorth

Yeah, but the human body as a whole takes ~75W.

Pretty cool user, but an artificial 'neuron' barely has any of the capabilities of a real one; we now know biological neurons are pretty much entire circuits, not just the transistors of the brain.

That's the point: we are not even close to the scale needed for AGI

What's there to explain. We suck at it. We're doing our best to get better.

if you don't understand
then you haven't dived into the math.
get away from shit like tensorflow, theano, etc... until you understand how everything works and can write your own stuff from scratch.
those things were created to let mindlets "jump in" to AI without any understanding of what is going on.
i blame python and google; an entire age of retards that need to be spoonfed everything and handled with kid gloves. what a waste.

>those things were created to let mindlets "jump in" to AI without any understanding of what is going on.
Not entirely true. Tensorflow is used both at Deepmind and Openai.

I know the point of the video is the AI accelerator chip, but is the recognition supposed to be impressive?
After image-net being beaten by networks without fully connected layers, I thought the next step would be adversarial images.

Can anyone provide a good roadmap to deep learning?
Which books? Which vids?

can I jump on this devilish train even if I have almost zero programming skills (never been interested in coding) and only high school calculus/algebra knowledge?

I'd mostly just like to understand how the shit that will wreck our lives actually works

Combinatorics, discrete math and computational algebra.

This. Although the "real" objective with Google's TensorFlow and Facebook's PyTorch might actually be to become the monopoly AI framework, I'm quite glad they open-sourced them.
Imagine doing all the computation graph and backpropagation by hand, or using some half-baked GNU knockoff.
Although I know it used to be produced by universities rather than industry, I remember having a look at Caffe back then and couldn't understand a thing.
Thinking about it, it scared me away from majoring in machine learning. If I'd had brainlet-friendly tools available back then, I would have jumped straight into the ML memefest.

There are always proofs available for every algorithm that has been academically published. It's just applied statistics, so basic probability and linear algebra are needed.

I'm a guy from a mathematical background, I had a supervised learning module during my MSc, it was literally just statistics and linear algebra. Whatever I could write to replicate the algorithms would be less optimized than the commonly available libraries. Basically, even though I know the math, I am not technically capable of writing something as efficient as the collective of people working on the python libs.

What? You just need probability, linear algebra and basic calculus. What you need from combinatorics will be included in any half decent probability course.

How do I make me my own holo waifu so I stop being so lonely and horny ffffffuuuck

Attached: 5in8rz7t6wc01.png (799x422, 314K)

bump

To keep the thread interesting: what's the most complex thing an AI you've worked on can do?

Also, how much can one person do in AI? Or is it a waste of time.

>I'm a guy from a mathematical background, I had a supervised learning module during my MSc, it was literally just statistics and linear algebra.
congratulations. you understand it.

>Whatever I could write to replicate the algorithms would be less optimized than the commonly available libraries. Basically, even though I know the math, I am not technically capable of writing something as efficient as the collective of people working on the python libs.
but, you could write something even if it performed poorly.
the python+tensorflow generation couldn't even do that. they will never be able to contribute further because they don't understand even the most simple underlying ideas.
for example, why was everyone using sigmoid neurons for so long? the answer is this:
pdfs.semanticscholar.org/05ce/b32839c26c8d2cb38d5529cf7720a68c3fab.pdf
however, they have no idea. they have no idea what formal methods are, let alone how to do them.
congratulations on your background in maths; now, go contribute something or forever be forgotten.

>brute-forcing general

Currently working on texture generation using convolutional neural networks
Pretty cool stuff, bants how computers are probs gonna take artists' jobs too one day

Is there any difference between TensorFlow for Python and the one for JS?

Attached: 1538404423534.jpg (600x600, 81K)

Nope, both are worse than Caffe

Fuck off, Samuel

That's interesting, I only thought people used sigmoid functions as an analogue to the Fermi-Dirac distribution (occupation 0 or 1), which is nature's own way of classifying particle states. You could have chosen polynomials or trig functions as well.

People confuse calories and kilocalories; the average human needs about 2,000 kcal per day.
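The unit confusion is easy to settle with two conversions (1 kcal = 4184 J, 1 W = 1 J/s). A quick sketch:

```python
# Quick arithmetic check on the watts-vs-calories posts above.
JOULES_PER_KCAL = 4184.0

def watts_to_kcal_per_hour(watts):
    """Convert a steady power draw in watts to kilocalories per hour."""
    return watts * 3600.0 / JOULES_PER_KCAL

def kcal_per_day_to_watts(kcal):
    """Convert a daily energy intake in kcal to average power in watts."""
    return kcal * JOULES_PER_KCAL / 86400.0

print(watts_to_kcal_per_hour(12))   # ~10.3 kcal/h, i.e. ~10,000 small calories/h
print(kcal_per_day_to_watts(2000))  # ~96.9 W averaged over the day
```

So a 12 W brain really does burn roughly 10,000 small calories per hour, and 2,000 kcal/day works out to just under 100 W for the whole body.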

bump

I'm trying to create my own neural network in C++. I've got a perceptron working and I can test it out with a linearly separable dataset. I don't really know how to start with the multi-layer perceptron part. I think the problem is that I haven't looked at derivatives and other math for a while and I'm not sure how to implement it in C++. Anybody know any resources for either the math or MLPs?
Also, I'm doing text summarization with deep learning as my final year project. Just started and want to build a good foundation.
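Since the question is how to get from a perceptron to an MLP: below is a minimal NumPy sketch of a two-layer network trained with backprop on XOR (the classic non-linearly-separable dataset), just to show the derivative bookkeeping. The hidden size and learning rate are arbitrary choices, and the same structure ports straight to C++ once the math is clear.

```python
import numpy as np

# Minimal MLP + backprop on XOR, which a single perceptron cannot learn
# because it is not linearly separable. Sigmoid units, squared error.

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
lr = 0.5

for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: chain rule, using sigmoid'(z) = s(z) * (1 - s(z))
    d_out = (out - y) * out * (1 - out)    # gradient at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)     # gradient at hidden pre-activation
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(out.ravel().round(2))
```

The outputs should head toward [0, 1, 1, 0]; everything framework-specific (autodiff, optimizers) is just this chain-rule pattern generalized.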

Last year I trained a model to classify images taken by a robot. It was a classification problem and depending on the image the robot went forward left or right.

math, math, math.
learn the math.
it's not particularly hard math, just learn it.
in my opinion, the math is much more important than the implementation. technology will eventually catch up to the ability to implement the theory. look at the back propagation algorithm for example; the hardware at the time it was developed was such that it was impractical to utilize.
learn the math.
>Last year I trained a model to classify images taken by a robot. It was a classification problem and depending on the image the robot went forward left or right.
>I don't really know how to start with the multi layer perceptron part. I think the problem is that i haven't looked at derivatives and other math for a while and I'm not sure how to implement it in C++.
this is exactly what i'm talking about.
you were able to train a NN, but you don't understand what's going on under the hood even though it's not complicated.

>t. mathfag gaylord

Bullshit, learn the hardware so you know what can actually be done and what's efficient. Mathematicians inventing crap models that look 'elegant' are dogshit. Reality is not elegant.

Okay. It seems like the Deep Learning book is a good start for the math. We had another DL project last year and I think the teachers aren't really going about teaching it correctly. They accepted "brute-force" as the reason why we chose the models we did.
I must say the frameworks are a good way to get your feet wet, but if you understand the theory I imagine you can get to a better model quicker, or have different hypotheses as to why something works or doesn't.
I didn't know this general existed. I hope to see it more active.

Is that one user that was working on an AI to decensor Japanese porn still on Jow Forums? How's that project coming along?

you'll need to learn a scripting language like Python if you want to make some sort of AI, but understanding the general concepts and different types of AI is for babbies

>jump into ML and AI
>do some coursera courses
>have fun
>do lots of ML and AI
>realize it's all just linalg, statistics and lies
>most daily work is data engineering and button counting
>become disillusioned
I had a job offer and an opportunity to get into the field and I fucking walked away from it. Did I perform career sudoku?

You did the right thing, just sit down and watch the AI winter with all the pajeets crashing down.

>using the latest in AI technology to make anime real

they're almost there
make.girls.moe/#/

Attached: 1544164001633.png (749x632, 159K)

maybe not

Attached: 1524154236552.png (256x256, 110K)

>the absolute state of linear algebra

>AI winter
Tell that to this dog.

Protip: he's not real.

Attached: dog.png (516x519, 305K)

Pic related for you faggot

Attached: 6636463236263.png (491x335, 341K)

can i get into AI dev as a dumb undergraduate?

>mix pictures of dogs
>Wow it made a dog skynet incoming XD

You are like babies with too much computational power in your hands

>Its shit now so it will be shit in the future

Just shut the fuck up

Yes! Technology will stop progressing completely in the next few years, friend :)

Learn Python learn TensorFlow and let your imagination go wild

>progress

We've had this shit since the '90s on paper. If anything, it's hardware progress, but Moore's Law no longer holds. AI may improve, but history will remember deep learning as a dead-end false start.

Mandatory reading for me.

hackernoon.com/learning-ai-if-you-suck-at-math-8bdfb4b79037

you sound like the idiots who claimed the internet and personal computers were a fad

Is there such a thing as waifu2x but for videos? Is anybody working on something like it? Is anybody even interested?

Share code, pls.

use waifu2x on each frame. I think people tried this already
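Per-frame upscaling really is just a loop over decoded frames. The toy sketch below uses nearest-neighbour `np.repeat` as a stand-in for the actual model (waifu2x or whatever), since the point is the pipeline shape, not the upscaler itself:

```python
import numpy as np

# Per-frame upscaling sketch. upscale_frame is a placeholder for a
# learned model; a real pipeline would also decode and re-encode the
# video, which is exactly where codec-aware approaches would differ.

def upscale_frame(frame, scale=2):
    """Nearest-neighbour upscale as a stand-in for waifu2x-style models."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

frames = [np.zeros((4, 4, 3), dtype=np.uint8) for _ in range(3)]
out = [upscale_frame(f) for f in frames]
print(out[0].shape)  # (8, 8, 3)
```

The catch, as pointed out below, is that this treats the video as independent transcoded images and ignores temporal coherence and codec artifacts entirely.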

Yeah, but that's clearly not the same. You know what I mean: a neural net adapting to the codecs and standards used in video technology, not transcoded images from the video.

It is only used for advertising and spying on people.

Attached: sad.png (253x252, 145K)

>proofs
>in ai
yeah no.

would you mind translating this for me?

Attached: 1539457084067.png (840x819, 280K)

by all means
>blah blah blah likelihood blah blah blah kullback leibler blah blah blah gradient descent
statistics is not science and it absolutely is not mathematics

How long until AI winter 2.0, when everybody realizes that all the tech right now is only used because computers got good enough to run ideas from papers from the late '80s? Can somebody redpill me on unsupervised learning? Because that seems like the only place where novel shit is actually happening.

John Shawe-Taylor was my professor, and I can definitely recommend all his books on supervised learning, they are a good reference.

It's math. Optimization proofs are proofs nonetheless.

Any really entry level AI resources?
I'm mostly interested from a conceptual side not necessarily an implementation standpoint.

Attached: 1517379698977.jpg (410x646, 24K)

this randomly generated dataset is pleasing to the eyes.

Attached: 1516325301136.png (256x256, 103K)

Applied mathematicians and physicists are the ones developing the approximated models as well, don't be delusional.

I think it's important for everyone to also study statistical learning theory in general. What is a hypothesis, how to define an error, how to compute the error between data output and hypothesis output, etc. Basically, truly understanding what you are doing in the abstract.

I had a similar path, but I did a more general MSc with just one big module on Supervised Learning. Glad I didn't go all the way in, linear algebra is boring as fuck. I only discovered manifold methods this year, but that would have been something that I would have found much more interesting.

>How long until AI winter 2.0
I thought the money was already drying up?

Statistics is a lot less intricate and polished than real maths, and a lot less shocking and fantastic than theoretical physics.

But it IS useful.

Is deep learning the future of procedural generation?

Attached: 1530515583620.png (256x256, 95K)

people used sigmoids because they have nice derivatives. But sigmoids are between 0 and 1. So they used tanh instead, which ranges from -1 to 1 and also has a nice derivative.
Now the hottest activation in the hood is max(0,x)
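For reference, the activations discussed in this thread side by side, with the derivatives that made sigmoid and tanh attractive in the first place (ELU included; this is just a sketch):

```python
import numpy as np

# Activation functions and their derivatives. Sigmoid and tanh were
# popular because the derivative is cheap to express via the function's
# own output; ReLU is literally max(0, x); ELU is a smooth ReLU variant.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)              # sigma' = sigma * (1 - sigma)

def d_tanh(x):
    return 1.0 - np.tanh(x) ** 2      # tanh' = 1 - tanh^2

def relu(x):
    return np.maximum(0.0, x)

def d_relu(x):
    return (x > 0).astype(float)      # undefined at exactly 0; pick 0

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(sigmoid(0.0), np.tanh(0.0), relu(-2.0), relu(2.0))
```

Note that sigmoid saturates in (0, 1) and tanh in (-1, 1), so both kill gradients for large |x|; ReLU's derivative is simply 1 on the positive side, which is a large part of why it trains deep stacks so well.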

Wow, it has the capabilities of a flash anime character creator!

>flash script
>pre-baked images layered together

vs

>1000's of lines of code
>billions of transistors slaving away
>neural nets fighting eachother to discern reality

this is the future

Attached: 1541388354905.png (256x256, 104K)

>Now the hottest activation in the hood is max(0,x)
You know, when I first heard about how this "relu" activation was so great, I expected it to be a lot cooler than it is. It sounds so fancy, rectified blah blah blah. I was disappointed.

artoriuz.github.io/mpv_upscaling.html

Absolutely amazing. Thanks, nice to know that it's coming to existence. I can't imagine what else the future holds.

Yeah, ReLU is too fancy a name for what it really is. At least it is closer to the "real" activation of neurons in our brains, or so I've been told.
You can still use the ELU (a sort of smooth ReLU) so you get a function that is differentiable everywhere. On the downside, it is more expensive to evaluate.

waifu2x is still literally the best, just not fast enough for RT

Bump

Any point to doing that though? ReLU seems to work fine. And in practice it's differentiable almost everywhere - the only place where it isn't has measure zero, so why care?
I still don't have a great intuition for why ReLU works so well though. It's just a linear function that can shut off - you should only be able to create piecewise linear functions with it, but I've seen some pretty damn smooth-looking curves from ReLU networks. Stacking layers must do something cool but I don't quite understand why desu
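That intuition is easy to check directly: a one-hidden-layer ReLU net with fixed kink locations is just a linear model in the features max(0, x - k_i), and a plain least-squares fit of the output weights already hugs a smooth target. A toy sketch (knot count and the sin target are arbitrary choices):

```python
import numpy as np

# A sum of shifted ReLUs is piecewise linear, but with enough kinks it
# tracks a smooth curve closely. Fixing the kink locations turns the fit
# into ordinary least squares over the ReLU features.

x = np.linspace(-np.pi, np.pi, 200)
target = np.sin(x)

knots = np.linspace(-np.pi, np.pi, 20)              # kink locations
features = np.maximum(0.0, x[:, None] - knots)      # shape (200, 20)
features = np.hstack([features, np.ones((len(x), 1))])  # bias column

w, *_ = np.linalg.lstsq(features, target, rcond=None)
approx = features @ w

print(np.abs(approx - target).max())  # small despite being piecewise linear
```

Stacking layers goes further: kinks of later layers land at data-dependent positions, so depth multiplies the number of linear pieces instead of adding them, which is one common explanation for those smooth-looking curves.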

I don't like tom cruise. tv, play mission impossible but star danny devito in the lead role
It's going to be very tiresome

and where do i get some resources to understand such concepts?

>tfw forced to buy NVIDIA GPU because they have better Machine Learning/AI support.

Well, we will see botnet-worthy AI co-processors before we realize it.

>In the future, people abandoned thinking about how to solve problems in favor of throwing data at machines and praying the problem gets solved

What a shitty future

are there good deep learning online courses with certificates that are free or cheap? not necessarily introductory ones, can be intermediate level onwards

I’m trying to get into Computational Linguistics from a non programming background, am learning Python at the moment, and I wanted to learn some natural language processing.

Any of you do this shit? I will look into TensorFlow

Attached: 5E54CFEE-5773-4339-A987-98B12740329F.png (450x222, 27K)

>google using machine learning to remove "toxicity" from chat

How to define toxicity?

how do you create a pseudo-perfect-information game out of an imperfect-information domain?

You can create smooth curves by superimposing piece-wise smooth curves that "overlap", no?

When I first saw neural nets (actually, logistic reg) I assumed they were using arctan.

I don't know what ReLU is, but you could have (numerically) catastrophic behaviour if your element of measure zero is approached asymptotically, no?

Stanford's classic Coursera course is still the one I always refer absolute beginners to.

Andrew Ng's Coursera courses
Keras is more popular in language processing afaik