Want to learn deep learning

>want to learn deep learning
>watch deep learning lectures online and take notes
>you need powerful Nvidia GPUs to do the homeworks since everything is built on CUDA
>no money to buy $1000+ GPUs
>course forum says to pay cloud hosting companies to borrow GPU servers by the hour to do homework on
>the cheapest GPU server is like $0.90/hr
>someone did the math and the homeworks cost at least $100, and that's if you never get distracted or stuck
>if you forget to shut down the server, it keeps billing and you could be charged as much as $650/mo
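The billing math above checks out. A minimal sketch, assuming the $0.90/hr rate quoted in the thread:

```python
# Rough cloud-GPU cost math from the thread above.
HOURLY_RATE = 0.90  # USD/hr, cheapest GPU server mentioned

# If you forget to shut the server down, it bills around the clock.
monthly_if_left_running = HOURLY_RATE * 24 * 30

# How many hours of actual homework $100 buys at that rate.
hours_for_100_dollars = 100 / HOURLY_RATE

print(round(monthly_if_left_running))  # 648 -- close to the $650/mo figure
print(round(hours_for_100_dollars))    # 111 hours before hitting $100
```

So the $650/mo horror story is just 24/7 billing at the cheapest tier, and the $100 floor assumes roughly 110 focused hours.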

tl;dr Why are deep meme capable GPUs so goddamn expensive?


It's mindblowing how you know what deep learning is but still wonder why it's currently expensive.

Maybe you shouldn't be on this board.


Maybe you could use an older second-hand card, you idiot?

>$100 class fees
>thinking that's expensive
lmao

>70% of the current gen price
>25% of the current gen neural net training speed

one of my friends is currently working on optimising neural networks and manages to get everything done on a shit-tier 6-year-old nvidia card. you're either a dumbfuck or haven't paid enough attention to how things work

6 year old Tesla is not the same thing as 6 year old GeForce

it's a geforce, try harder

I built a $5000 PC to do deep learning. Have been playing vidya since. No big project since then. I was better off using cloud. Don't build a PC.

the 2060 has as much deep learning performance as a 1080 Ti, and only costs $300 on ebay, or you can scoop one up for $334 new right now.

Like it's really gay that everyone optimizes DL code for CUDA and Tensor Cores even though AMD cards are capable of the same amount of flops as their nvidia counterparts. But the 2060 is hardly a $1000 gpu and is very capable.

What cloud provider(s) were you using?

Take the free online Stanford course on deep learning if you can (I think it was CS231n).
It's very difficult, but you learn a lot and you don't need a GPU.
Also, you don't necessarily need the newest GPUs to do deep learning if you utilize techniques like transfer learning.
For example, I am using a GTX 1070 to learn how to detect the keypoints of a hammer in a picture.
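The transfer-learning trick mentioned above is what keeps older cards viable: freeze a pretrained backbone and only train a small new head. A minimal PyTorch sketch of the freezing pattern — the tiny `Sequential` backbone here is a stand-in, not a real pretrained network (in practice you'd load e.g. a torchvision ResNet):

```python
import torch
import torch.nn as nn

# Stand-in "pretrained backbone" just to illustrate the pattern.
backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))

# Freeze every backbone parameter so backprop never touches it.
for p in backbone.parameters():
    p.requires_grad = False

# New task-specific head, trainable by default.
# e.g. 2 keypoints x (x, y) coordinates = 4 outputs.
head = nn.Linear(32, 4)
model = nn.Sequential(backbone, head)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(trainable, total)  # only the head's 132 params train, out of 10468
```

Since gradients are only computed for the head, both the memory footprint and the compute per step shrink dramatically, which is why a GTX 1070 handles it fine.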

I saved some money for a while working on some wagecuck jobs while I was at uni and managed to get a second hand TITAN X to do all my CUDA stuff.

I use a shitty GTX 750Ti and it works just fine. You can get a used one for like 50-60 bucks.

use CPU

jetson nano?

>proprietary garbage for his compute
sucks to be you
>not developing open AI

None of the major deep learning frameworks run on Radeon. The only actively maintained framework that natively supports Radeon/OpenCL is Apache Singa, and it's a specialist framework for niche applications (healthcare mostly).

there are ROCm ports of tensorflow and pytorch at least. They work almost great.
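Worth noting: the ROCm build of PyTorch reuses the `torch.cuda` namespace, so the same device probe works on both NVIDIA and AMD GPUs without code changes. A minimal sketch (falls back to CPU when no GPU is visible):

```python
import torch

# On a ROCm build, torch.cuda.is_available() reports the AMD GPU;
# on a CUDA build it reports the NVIDIA GPU. Same code path either way.
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.ones(2, 2, device=device)
print(device, x.sum().item())  # sums to 4.0 regardless of backend
```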

>almost
How so?

navi is supposed to have dedicated hardware for deep learning, maybe you should wait a bit