Why there is no ARM desktop hardware available?

Attached: 91zSu44+34L._SX355_.jpg (355x228, 22K)

Because ARM a shit.

In a couple of years, everything from Apple will use some sort of ARM CPU.

Because it doesn't allow for bloated and poorly written code. Use something like RISC-OS and you'll be amazed how quick it is on such limited hardware.

The hardware is perfectly fine, it's that software has been written like trash for about 20 years that's the problem.

Because it's designed for lower power devices so it makes little sense in a desktop form factor.
Yeah but can you even call the iMacs "desktops"? I wouldn't be surprised if they never came out with a new Mac Pro, and I doubt they're ever going back to the old tower form factor.

>so it makes little sense in a desktop form factor.
It's many times more powerful than computers that once were desktops.

because all relevant desktop software is x86 or amd64

Hopefully all computers will do the same, and with Microsoft's Qualcomm partnership for those laptops, we might be getting closer to that future.
No more CISC bloat. No more Intel Management Engine or other such botnet

Can't they just recompile it?

Even the software is not a problem.

A 180 MHz Pentium Pro runs the latest Office with no problems at all.

It's the super bloated web that makes slow processors unusable for everyday use.

MacchiatoBin

*why is there no...

So?
What matters is current lithography. 14nm x86 beats 14nm ARM in high-performance tasks because it can sustain higher loads.
Even if ARM is more efficient, it's the difference between a 15W chip and a 95W chip.

At the consumer level, nothing beats x86 at generic computational tasks.

>No more CISC bloat

Attached: 1485130830315.jpg (645x968, 55K)

There's no demand to recompile proprietary applications for ARM. To create that demand, you'd need people using the platform, and for people to use it, they need software for it.

>t. intel internet defense force

Luckily there are 10 nm ARM chips available.

And there are 12nm x86 chips, with 7nm coming next year.

>Microsoft's Qualcomm partnership for those laptops
Too bad smartphone chippies are so slow compared to what AMD and Intel can do.

>smartphone chippies
CPU Clock Speed: Up to 2.45 GHz
CPU Cores: 8 x Kryo 280 CPU
Looks pretty solid.

Intel Atoms are quad-core and run at up to 2.4GHz.
Still much slower than a dual-core Kaby Lake i3.

>no demand to recompile proprietary applications for ARM
*blocks your path*

Attached: tDDKY4Z4azfoAYL4fMzigX.jpg (3111x1750, 2.75M)

Existing ARM cores are designed for power consumption in the single-watt range. They're too slow for the desktop, their low power consumption is near-irrelevant there, and since desktop software typically has shit multithreading, stuffing dozens of cores into one chip isn't gonna help either.

>They're too slow for desktop
But they're on par with what was a normal desktop not all that long ago. Again, the problem is bloated and poorly made software, not the hardware.

>Existing ARM cores
*blocks your path*

Attached: Cavium-ThunderX-2-in-Microsoft-OCP-Project-Olympus-Server.jpg (2416x1265, 518K)

>locked down botnet garbage
no thanks

Will this be another Windows RT?

As usual, the problem is Web (n+1)

>Will this be another Windows RT?
Perhaps, if you're asking whether or not it will be a failure. No, if you're asking whether or not it will be as locked down.

>As usual, the problem is Web
It's the most blatant offender, but there's also a massive increase in the general system draw, despite no real added features since the 90s (in terms of the OS and UI itself).

aarch64 support is still shit

There are ARM HTPCs. They mostly run Android. A few run Linux.

Windows software can't really into ARM, so reaching the remaining Windows users with that is not going to be practical. Plus it's a shrinking, perhaps even slowly dying, market anyhow.

Did you even read my post?
>since desktop software typically has shit multithreading, stuffing dozens cores into one chip isn't gonna help
1st gen Cavium had 48 cores in a ~100W package; this is great for server tasks but not for most desktop ones.
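A rough Amdahl's-law sketch of why that is (the 50% parallel fraction is just an illustrative guess, not a measured number):
[code]
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the workload and n is the number of cores.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical desktop workload that is only 50% parallelizable:
# 48 slow cores barely double the throughput of a single core.
print(f"48 cores, p=0.5: {speedup(0.5, 48):.2f}x")  # ~1.96x
print(f" 4 cores, p=0.5: {speedup(0.5, 4):.2f}x")   # ~1.60x
[/code]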

>$1000+ for a slow as shit ARM machine
no thanks. This is even worse than when most laptops moved to ULV CPUs.

fpbp

Didn't Apple's ARM chip outperform kaby lake i5s?
Also, these are octa-core, not quad-core.

I have already blocked your path

Attached: cavium-original-vulcan-versus-haswell-xeon.jpg (950x469, 113K)

>"projected"
>"will exceed"
>comparing to Intel architectures from fucking 2012/2013

You gotta try harder kiddo. 1st gen Caviums promised a lot too, but turned out pretty mediocre at performance per watt even in perfectly multithreaded tasks.

>slow as shit
nope! It's not gonna be usable for CAD or gaymen, but it should be perfectly reasonable for most other things. Especially if you can install GNU/Linux on them (perfectly possible given they'll likely use Qualcomm's own WiFi chipsets, which have great support), as you'd get access to a whole repo of software.

Vaporware.

Attached: 5HETCN[1].png (552x689, 104K)

>Didn't Apple's ARM chip outperform kaby lake i5s?
Doubt it.

It seems to still be alive one way or another, but I wouldn't expect any breakthroughs.
cavium.com/news/gigabyte-announces-thunderxstation-industry-s-first-armv8-workstation-based-on-cavium-s-thunderx2-processor

In geekbench only, not in anything relevant.

Pretty sure there were dozens of threads being spammed late last year about it

*blocks your path*

Attached: Cavium.png (641x212, 28K)

Why would I pay that kind of money for something that performs like a Celeron?

I use a Raspberry Pi + 7" HDMI monitor for flashing BIOSes. Nothing else.

Because it doesn't (see above). Also, no IME.

These supposedly have been around since 2017 but I can't find a single datasheet, review or user impression.

Attached: 7e3.jpg (600x750, 31K)

soon

Attached: Capture.png (1170x923, 682K)

>merge iPad, Mac apps

They're already turning MacBook Pros into facebook machines; has Apple given up on professionals? They might not be a huge market, but the "Apple is good for design/prepress/music/video" meme was a good argument for sales to normies.

>Why there is no ARM desktop hardware available?
Nvidia Jetson TX2 dev board
nvidia.com/en-us/autonomous-machines/embedded-systems-dev-kits-modules/

Attached: jetson.png (713x430, 37K)

Dev board =/= desktop hardware. And it's essentially a smartphone CPU; the real meat is the large number of GPU cores.

because desktop Windows for ARM isn't out, and there's pretty much no desktop market outside of Windows boxes

Define "desktop hardware"

A CPU that doesn't run in the single-digit watts and that has decent single-threaded performance.

This looks like something for self-navigating robots or something like that. At first blush I assumed it's meant for consoles or tablets, and that's possible. But support for 6 cameras? That's interesting.

OwO what's this?
socionext.com/en/products/assp/SynQuacer/Edge/

Attached: 139851381934981.jpg (370x460, 65K)

chip1stop.com/web/USA/en/dispDetail.do?partId=SOCI-0000001&cid=SOCIEB
Anon! It even has an x16 PCIe slot! And DIMMs!

Hardware that is designed to run tasks traditionally performed on desktop computers.

That's a server, as clearly stated in the description. You can jury-rig a workstation from that, but it's not gonna be cost-efficient.

>a 5-watt CPU in a mATX tower

Attached: laughd1.jpg (162x138, 6K)

>see
Geekbench isn't real life.

It'd usually include being modular, having ports like USB3, SATA, and PCIe, and, most importantly, outperforming laptops.

>24 cores of ARM® Cortex-A53
That's a low-performance, low-power core from 2012. The same one used in the RPi 3.

>Waah! Your benchmarks don’t matter because I say so!
(You)

Attached: 2B839C9C-281B-4312-963A-27249E72D9C6.png (653x726, 84K)

>T-those benchmarks aren't faked, I-i promise
(You)

Attached: 1505068432917.jpg (813x1402, 161K)

>arm
>snsv
>touchscreen
The ideal facebook machine lol

Attached: snsv.png (582x394, 58K)

>I-It beat Intel, so it m-must be fake!!
(You)

Attached: 6C8857C9-8AF7-4D3F-9352-9F9B88775C2C.png (672x794, 422K)

>g-goybench is trustworthy, I-I mean it!!!
(You)

Attached: 1508383880802.jpg (675x827, 95K)

wonder what pricing will be like on the arm stuff

i don't see a point in running windows on it but a lightweight linux setup could be comfy as fug

It's geekbench. Meaningless numbers.

If it weren't for Apple's walled garden, anyone could prove you wrong: all one would have to do is run ffmpeg to convert some video to AVC or HEVC or whatever, and use that as a real-workload benchmark.
I'm dead sure an iPhone X wouldn't beat an i5 or any Ryzen (even the low-end ones) at that.
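A minimal sketch of that, assuming ffmpeg with libx264/libx265 is on the PATH and some sample.mp4 is lying around (both are placeholders, pick your own clip); run the same script on each machine and compare the wall-clock seconds:
[code]
# Time a fixed software transcode and discard the output (null muxer),
# so the numbers mostly reflect CPU encoding work rather than disk speed.
import subprocess
import time

def time_transcode(codec: str, src: str = "sample.mp4") -> float:
    start = time.monotonic()
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-f", "null", "-"],
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return time.monotonic() - start

if __name__ == "__main__":
    for codec in ("libx264", "libx265"):  # AVC and HEVC software encoders
        print(f"{codec}: {time_transcode(codec):.1f}s")
[/code]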

honestly, this
I can use my fucking raspi to do damn near everything else my actual main machine does... except browse the web comfortably (like, I can do it, but it's on par with the kind of experience I had back in 1999, where I'd open up something else in the background while the browser was busy choking on the page).

MS released a Windows 10 for ARM image that has x86 emulation built in for older programs.
It only runs 32-bit x86 software and native ARM stuff, but they're expected to add 64-bit support (especially since the thing only supports 64-bit ARM CPUs anyway)

possibly
but RT really only failed because people didn't fucking want Metro apps, they just wanted their ordinary desktop software
this latest attempt seems to be MS learning from that failure (namely, people only care about Windows because their software runs on it and for no other reason)

it's a long story but you really don't want them. they need some botnet blob binaries to even start running. plus there's a great fucking mess across all the different ARM SoC implementations around.

hope for RISC-V