Server options

Newbie here, wanting to self-host a website, email, file storage/backups, and also run general heavy computations (machine learning and such).
My current options are a Dell R210 II or an HP DL585 Gen7, both for the same price of 500 dollaroos.

Which option should I go with?

Attached: FAE506AC-4665-43BD-9CD3-3E2C94C38CDB.png (900x676, 384K)

Other urls found in this thread:

analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/
zdnet.com/article/how-the-gpu-became-the-heart-of-ai-and-machine-learning/
youtube.com/watch?v=W2F8Wa65_B4
youtube.com/watch?v=vzojwG7OB7c
cnx-software.com/2018/07/13/rockpro64-dual-sata-nas-enclosure/
youtu.be/oXmqlDJTL5o
github.com/basicmi/AI-Chip/blob/master/README.md

lol

>self-host a website
Is it just static content? You can literally do that for free on GitHub Pages or any number of other places. Even hosting a site through an S3 bucket with a fucking CDN will cost you pennies a month.
>emails
Seriously, it’s not worth the effort and your ISP probably blocks port 25 anyway. Get a Fastmail subscription for like $45 per year.
>file storage/backups
How much storage are we talking here? It doesn’t take much horsepower to run a NAS and you could save your cash for hard drives and just get some low end mini-ITX board and case.
>also general heavy computations (machine learning and such).
A lot of these cheap server boards suck for that. First because the processors are nothing special, and second because a lot of them won't boot with a PCIe card that draws more than the maximum power a PCIe slot can provide (in other words, a decent GPU, which you'll want for machine learning).

The thing with these big rack mounted boxes is they look cool in the “hey I’ve got a server” sense but unless you REALLY need serious power or a fuckload of disks they suck for home use. The power draw is insane and they usually sound like a fucking jet engine. Most people would be better served getting a modest mini-ITX board, a NUC, whatever.

I'm going to get a PowerEdge T340 for a business I have as a client. They want to share files between Mexico, Colombia, Brazil and Austria. It has RAID support and we'll add cloud storage for backup. The thing is, you shouldn't get a real server yet; you're better off with a virtual one or paid hosting. Get a normal PC, maybe mini-ITX to save space while still having good functionality. You don't need a server.

I run a website hosted on my home network. I have it passed through a firewall and it's easy for development. ISPs really don't give a shit if you punch a hole in port 80 as long as the traffic isn't huge.

Very fair points, thanks. Assuming I still wanted to run my own stuff as opposed to buying a service (just for hobbyist sake), what lower power/less overkill things would you recommend? Could something as simple as an older desktop run all of this stuff without a hitch?

Firstly, any computer can be used as a web server with the right hardware and software. Hell, people have run web servers using Contiki on a Commodore 64 with a NIC cartridge. People have even run DOS-based web servers over a 56k modem with SIOUX.

Depending on what you want to do, you can go with a Raspberry Pi to host static or even somewhat dynamic websites (as long as they're lean). Stronger computers (including laptops and desktops from the mid-2000s) work well enough as NAS machines. Personally, I once ran a shell server with LAMP on an Acer Aspire One netbook with only 1 GB of RAM and a 160 GB HDD, with Xubuntu running its full DE.
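
If you just want to see how little it takes, here's a minimal sketch using nothing but Python's standard library, which is roughly all a Pi needs for a static site (the "public" directory and port 8080 are placeholders I made up, not anything from this thread):

from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve files out of ./public (placeholder path) on port 8080.
# Put a port-forward or reverse proxy in front of it if you want it on port 80.
handler = partial(SimpleHTTPRequestHandler, directory="public")
HTTPServer(("0.0.0.0", 8080), handler).serve_forever()

In real use you'd stick nginx or similar in front of it, but the point stands: static hosting needs almost nothing.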

Whenever I see somebody who "needs" something for "machine learning", it's always obvious they don't know fuck all.

Eh, I’ve just done some data science courses here and there and was interested in having the capability to dick around with ML if I felt like pursuing it. Admittedly I’ve not done anything involving massive datasets, so the hardware needed is foreign to me.

Get a desktop and a decent GPU. You'll be doing machine learning on the GPU, not the CPU. Servers make sense if you're practicing for work, not so much if you're just doing this as a hobby.
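
To make the GPU-vs-CPU point concrete, here's a rough sketch, assuming PyTorch with CUDA support happens to be installed (the thread never names a framework, so treat this purely as illustration):

import time
import torch

def bench(device: str, n: int = 4096) -> float:
    # Time one large matrix multiply on the given device.
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.time()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the asynchronous GPU kernel to finish
    return time.time() - start

print(f"CPU: {bench('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {bench('cuda'):.3f}s")

The gap you'll see on a big matrix multiply is exactly why people train on GPUs.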

>You'll be doing machine learning on the GPU, not the CPU
wtf am I reading?

analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/

>analyticsvidhya.com/blog/2017/05/gpus-necessary-for-deep-learning/
>vidhya

If you think GPUs are only useful for gaymen, you'd best get back to /v/

>Taking video game enthusiast blogs as actual information
>Tells others to go to /v/
I kek'd

>literally no background in AI or data analytics
>insists he knows what he's talking about
Jow Forums has turned into /v/....
zdnet.com/article/how-the-gpu-became-the-heart-of-ai-and-machine-learning/

>Taking zdnet and other blogs as valid sources
You're just as retarded as /lgbt/ and /mlp/ now.

Plebeian

See pic

Attached: 382.jpg (403x394, 34K)

Attached: 1549496726514.png (500x501, 90K)

>Still responding
He took my fishing pole :^|

>hey guys I want to do X
>don't do X.
>use a literal botnet instead
Every fucking time
Unless you have a separate room or closet, I don't suggest anything smaller than 2U because of noise.
If you want to run a few VMs, I recommend the Dell R710 since it's 2U and has more expansion slots.
Keep in mind that if you're adding GPUs you might have to mod a way to get PCIe power.
I haven't used a lot of HP gear, but if you plan on doing ML I'm assuming you plan on adding a few GPUs, so more slots is probably better.
Depending on where you live, eBay is a great place to find cheap old server shit.
There's no need for the latest and greatest, like the other anon said.

Raspberry Pi.

youtube.com/watch?v=W2F8Wa65_B4

youtube.com/watch?v=vzojwG7OB7c

Attached: Screen Shot 2019-07-01 at 10.25.16 PM.png (843x674, 563K)

Rosewill or whoever has one of those coin-mining 4U chassis made for a bunch of GPUs, for less than a hundred. Retrofit a Xeon with a board from eBay for maybe $300+, then spend thousands on GPUs. Sounds shit tbh.

Even an SBC covers most of your needs, plus low power consumption, low maintenance and very little noise.
You can make a NAS: cnx-software.com/2018/07/13/rockpro64-dual-sata-nas-enclosure/
And you can even put a decent GPU for machine learning on these things today: youtu.be/oXmqlDJTL5o
An alternative to this would be a mini-ITX board or a used laptop motherboard, or whatever, without spending more than $150.

>rent cheap server with port 80 open
>redirect every request that hits the cheap server to a locally running web server that has a port >1024 open

What's the flaw in this approach?

>newbie

use a cloud provider's free tier to spin up your own web server and such
as an aside, you'll learn Azure/AWS or whatever provider you use
if you want to roll your own and are just experimenting, you don't need server-grade hardware to make an internal network with your own little DNS, DHCP, web and email servers
if you want to dabble in machine learning, there is a single-board computer called the Jetson that is $100, has great software support from Nvidia, and lets you experiment with ML
don't get a server until you have a plan to completely fill it up, like which services and VMs you are going to run
remember that hardware is driven by software, and if you don't obey this you will spend money and have shit lying around
nearly every server is too loud unless you are single

Attached: jetson.png (1341x653, 188K)

>GPU for machine learning
Any actual proof of this beyond individuals' blogs and YouTube vlogs?

I don't know exactly, since it isn't my field.
That said, that question is too ambiguous to answer properly; there is a heap of dedicated hardware and frameworks for machine learning.
Even Rockchip has released an iteration of the RK3399 (the one that powers the Pine64 NAS), the RK3399Pro, focused on machine learning (enabling next-generation SBCs for the task).

Here are more AI chips:
github.com/basicmi/AI-Chip/blob/master/README.md

So, it depends largely on what you want to do and how optimized the algorithms you'd use are.
If you can run GTA 5 on an SBC with a graphics card attached, it should be possible to do most machine-learning-related stuff.

You have to go back.

Just get a Syno bro

Attached: 3DCA329C-7F95-4C0E-B016-29F65B44879F.png (1030x210, 136K)

It's very obvious you don't know what you're doing, since you want to do ML on a CPU, so my advice is: don't.

Never self-host unless you really need it, and if you really needed it you wouldn't be asking this question.

Most of the popular machine learning frameworks, such as TensorFlow, can be CUDA-accelerated.
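
As a quick sanity check, here's a small sketch, assuming TensorFlow 2.x (which nobody in the thread actually pins down), that shows whether the CUDA-accelerated path is in play:

import tensorflow as tf

# An empty list here means TensorFlow only sees the CPU.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Ops inside this scope are placed on the first GPU.
    with tf.device("/GPU:0"):
        x = tf.random.normal((2048, 2048))
        y = tf.linalg.matmul(x, x)
        print(y.shape)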

There is nothing wrong with using a reverse proxy; just make sure that you don't lose HTTPS encryption after the cheap server and you're good.
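
If anyone wants to see the shape of it, here's a toy sketch of the cheap-server side, purely illustrative (the UPSTREAM address is a placeholder for the tunnel back to the home box; a real setup would use nginx or an SSH tunnel and handle headers and TLS properly):

from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen

UPSTREAM = "http://127.0.0.1:8080"  # placeholder: home web server reached through a tunnel

class Proxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fetch the same path from the upstream box and relay the response body.
        with urlopen(UPSTREAM + self.path) as resp:
            body = resp.read()
            self.send_response(resp.status)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

# Binding port 80 needs root (or a capability); this runs on the rented box.
HTTPServer(("0.0.0.0", 80), Proxy).serve_forever()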

CUDA basically.

you know you're wrong and you're just trolling at this point.

>it’s not worth the effort
don't listen to this fag

>machine learning
oh do I have the chassis for you

Attached: r910-inside-web-do_8.jpg (940x587, 85K)

Does anyone have a cost-effective solution for attaching a bunch of drives to my main server? I have 45 3.5" hard drives.

I currently have them in three cheap $120 4U cases that each hold 15, and run straight SAS cables to my main server's SAS cards. It's messy as fuck and takes up so much space. Is there something that can deliver SATA III (6.0 Gb/s) speeds with a proper backplane? I'm seeing some expensive solutions that hold 24-48ish drives in a 4U spot.

Attached: rsvl4500.jpg (640x480, 27K)

>>it’s not worth the effort
>don't listen to this fag
don't listen to this neet

(1) Get a Linode instance for the web server and (2) buy a high-core-count computational machine for home (for your deepfake/deepnude computations).

>self-host a website

fucking why

Attached: 1534537363164.jpg (113x125, 2K)