Columbia Med gave me a budget of $1500 to build a deep learning rig for research (MD PhD program). I do not game on the PC, and I have not built a PC since I was 12 years old. However, I am excited to build this. Can you tell me what you think?

>RTX 2080 (not Ti; just as expensive as the 1080 Ti)
>threadripper 1950X
>ASRock - X399 Taichi ATX TR4 Motherboard

the rest isn't really important. i'll keep memory low at 32GB.

Thoughts!? This is an investment that should pay off over the years. The most important part here is the GPU. First time not using Intel, though, but I went off the benchmarks on cpubenchmark.net

See if you can get your hands on a Vega card instead of a locked-down meme RTX.

Sounds reasonable

> RTX 2080
> deep learning rig
I don't really know, but aren't consumer cards gimped compared to Quadros et al?
Also evaluate how much CPU performance you need; there are rumors of 16-core CPUs for AM4, so 1st-gen TR may become obsolete. Or not, if you're aiming at getting Rome, but then again, why get the most expensive soon-to-be-mediocre-among-HEDT CPU?

>RTX 2080 (not Ti; just as expensive as the 1080 Ti)
Anon... look at CUDA cores and memory speed.

He can't. TensorFlow needs an Nvidia GPU (CUDA).
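Quick sanity check if you do build it (a TF 1.x-era sketch, nothing official) to see whether TensorFlow actually picked up a CUDA device:

# List the devices TensorFlow can see; if no /device:GPU:0 shows up,
# you're silently falling back to the CPU.
from tensorflow.python.client import device_lib

for d in device_lib.list_local_devices():
    print(d.name, d.device_type)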

$1500 isn't enough for any research worth shit. You'll get 2008-level results. If you really want to know what can be done with current-ish state-of-the-art machine learning, you'll need at the very least a couple of Quadro GP100s NVLinked together.
Maybe ask around if you can get time on an existing GPU cluster wherever you're getting your PhD.
You wouldn't ask somebody who's studying medical imaging to put together an MRI machine with 2000 bucks; well, same thing here. Training research-level models is expensive.
If you absolutely have to make do with $1.5k, skip the Threadripper; the CPU is not too important for ML. And see if you can get two used Quadros and NVLink them together for more speed and extra memory for your models.
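To put rough numbers on the memory point, here's a back-of-envelope sketch (pure Python; the parameter count is just an example, swap in your own):

# Back-of-envelope VRAM estimate: weights + Adam state only (illustrative numbers).
params = 60e6            # e.g. a ~60M-parameter CNN (assumed example)
bytes_per_param = 4      # FP32
weights_gb = params * bytes_per_param / 1024**3
adam_gb = 2 * weights_gb # Adam keeps two extra FP32 buffers per weight
print('weights: %.2f GB, Adam state: %.2f GB' % (weights_gb, adam_gb))
# Gradients and activations for realistic batch sizes come on top of this.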

If you don't mind me asking, what sort of software will you be using to train your NNs?

>MD PhD program
Doctor Doctor Anonymous?

Just buy a MacBook bro, it's that easy haha

This thread reeks of NEET LARPing but oh well...

>deep learning rig for research
You need a Quadro for that shit, brainlet. GTX is purposely gimped in the drivers for compute tasks. Blame Nvidia's market segmentation.

this

Why did you post this?

...

It's not an issue of gimped drivers; it's just that consumer cards don't have enough memory for ML models.
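You can see how much VRAM a given card actually gives you with nvidia-smi; something like this (flags as on any reasonably recent driver, adjust if yours differs):

# Query name and total/free VRAM via nvidia-smi from Python.
import subprocess

out = subprocess.check_output(
    ['nvidia-smi', '--query-gpu=name,memory.total,memory.free',
     '--format=csv,noheader'])
print(out.decode().strip())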

Haha. I just got an $8,000 budget for my deep learning rig. Ameripoors beaten out by a Dutch anon.

Mine is for cancer research.

Found the gamer

you need a Quadro P5000 or better, not the gimped RTX 2080

kek.
This. Get the shiniest MacBook, go back to your supervisor, and claim that the Apple CEO said the MacBook Pro is for professionals (duh, it's called Pro).

I'd be amazed if you can do that with 1500 dollaridoos.
$830 for the RTX 2080
$600 for the TR 1950X
32 GB of memory is also gonna cost you at least $200
With that alone you're already over budget.
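In numbers (2018-ish street prices, give or take):

# Rough budget check with illustrative street prices.
budget = 1500
parts = {'RTX 2080': 830, 'TR 1950X': 600, '32GB DDR4': 200}
total = sum(parts.values())
print('total: $%d, over budget by: $%d' % (total, total - budget))
# -> total: $1630, over budget by: $130, before the board, PSU, storage, and case.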

>user changes genders 13 times during the conversation

There's a fork called tensorflow-opencl, so you can use pretty much anything that does OpenCL now.

>tensorflow-opencl
Yep, just to get 8 times worse performance.
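If you want your own number instead of a rumor, a crude micro-benchmark (TF 1.x-style sketch, sizes arbitrary) is just timing a big matmul on whichever backend you built:

# Time a large matmul for an order-of-magnitude throughput comparison.
import time
import tensorflow as tf

a = tf.random_normal([4096, 4096])
b = tf.random_normal([4096, 4096])
c = tf.matmul(a, b)

with tf.Session() as sess:
    sess.run(c)  # warm-up
    start = time.time()
    for _ in range(10):
        sess.run(c)
    print('avg matmul: %.4f s' % ((time.time() - start) / 10))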

pcpartpicker.com/list/ycKQNQ
best you can get with your budget.

>You need Quadro for that shit
Explain further
I thought the GTX/RTX meme cards are good for some computer vision tasks.
