AAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHH

wccftech.com/amazon-graviton-cpus-to-displace-intel-xeon-in-ec2-cloud/

HHHHHHHHHHHHAAAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHHAHAHAHHAHAHAHHAHAHAHAHAHAHAH*inhales painfully*BRWAAAHAHAHAHAHAHAHHAHAHAHAHAHGHAGHAGHAGHAGHAGHAGHAGHGAHA

Attached: 1535180411693.png (1920x1150, 129K)


Pretty cool in theory, but unless there's strong evidence that they're actually faster (doubtful), the offerings don't seem very competitive.

It also sucks that they're providing no architectural details at all. Who'd want to invest in a platform there's no information about? Pretty douchey desu.

>unless there's heavy evidence that they're actually faster (doubtful), the offerings don't seem very competitive
That's not the point.
Amazon already cucking Inturd by going Zen 2 for main servers, now they're going to ditch Zionist turds in the EC2 cloud systems. It's fucking over. Inturd is completely kicked out of Amazon's server space.

Point being, if they're not significantly cheaper, nor significantly faster, what would be the reason for their customers to leave x86 for, relatively speaking, unknown territory?

Meltdown, Spectre, other security holes, Inturd generally being shit housefire for gorrilion buckzoids when you can get cheap as fuck Zen and ARM that's efficient AND performs amazing.

Attached: 1535526563072.png (2000x1543, 58K)

>Amazon throwing Intel away
>Applel on the verge of throwing Intel away
>AMD back in action

is it finally happening?

Attached: 1541374984016.gif (250x300, 785K)

>Applel on the verge of throwing Intel away
Already did in mobile/notebook, lel.

Because Amazon will be basically giving away free compute resources for you to use it.

What do the security holes even matter if they're using an ARM core that's even slower than Intel with the fixes? Given that they're providing literally zero details, we don't know whether that's the case.

I feel bad for those companies who actually bought a shitload of Skylake Xeons in advance, only to get cucked by Meltdown, Spectre, L1TF, etc.

It's a new ARM, not those shitass old mobile ones.

What if security is more important to you because you're already running FORTRAN code from 1987 and 1.1GHz is already 10,000 times faster than your current hardware?

Sure doesn't look like it.

Attached: ss.png (842x915, 101K)

How would you know? Again, there are literally no details apart from the name.
Because with the fixes applied, they are basically secure: slower than stock, but secure. If "slower than default" is still "faster than this ARM offering", there's no benefit in either security or performance.
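If you want to see what you're actually exposed to, the kernel reports its mitigation status under /sys/devices/system/cpu/vulnerabilities/ on Linux (roughly 4.15 and newer). A minimal sketch that dumps it; which entries exist (meltdown, spectre_v1, spectre_v2, l1tf, ...) depends on your kernel version:

/* Dump the kernel's reported CPU vulnerability/mitigation status. */
#include <stdio.h>
#include <dirent.h>

int main(void) {
    const char *dir = "/sys/devices/system/cpu/vulnerabilities";
    DIR *d = opendir(dir);
    if (!d) { perror(dir); return 1; }
    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (e->d_name[0] == '.') continue;          /* skip . and .. */
        char path[512], line[256];
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        FILE *f = fopen(path, "r");
        if (!f) continue;
        if (fgets(line, sizeof line, f))
            printf("%-12s: %s", e->d_name, line);   /* entries end with \n */
        fclose(f);
    }
    closedir(d);
    return 0;
}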

* Higher is better.

>Longer is better

Attached: 1541224029266.png (1900x900, 19K)

>How would you know?
ARM has been putting out new core designs recently, aimed at Snapdragon 8150-class systems and better.

Ha! Intel wins again!

>amazon graviton cpus
can i play hearthstone with that?

MORE CORES
MORE RINGBUS
BIGGER IS BETTER

Attached: Intel-Broadwell-EP-Ring-Bus-LCC.jpg (579x864, 156K)

EVEN MORE RINGBUS

Attached: v4_24coreshcc.png (1499x847, 116K)

Meltdown and Spectre affect all architectures ever made. They have nothing to do with Intel.
It's been like two years, how do people not know this?

>ARM
I'll wait until a bit more development on the software side of things has been made. Right now, x86 is the only sane option for compute.

intel thinks they can kill zen2 with an overpriced 10 core bingbus processor lmao

poorjeets will be out a job soon.
maybe amazon will hire you.

>Point being, if they're not significantly cheaper, nor significantly faster, what would be the reason for their customers to leave x86 for, relatively speaking, unknown territory?

They'll be cheaper to run, which is all Amazon cares about: higher margins per node.
Cost to the customer stays the same, and on 90% of workloads the customer won't notice any performance difference.
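Back-of-the-envelope version of that, with completely made-up numbers just to show the mechanism (none of these figures are from the article):

/* Hypothetical margin math: why "cheaper to run" matters to Amazon
 * even when the customer's hourly price never changes. */
#include <stdio.h>

int main(void) {
    double price_per_hour = 0.10;  /* what the customer pays (unchanged) */
    double xeon_cost_hour = 0.06;  /* made up: amortized hardware + power */
    double arm_cost_hour  = 0.04;  /* made up: cheaper silicon, lower TDP */

    double xeon_margin = price_per_hour - xeon_cost_hour;
    double arm_margin  = price_per_hour - arm_cost_hour;

    printf("Xeon margin: $%.3f/h (%.0f%%)\n", xeon_margin, 100 * xeon_margin / price_per_hour);
    printf("ARM margin:  $%.3f/h (%.0f%%)\n", arm_margin, 100 * arm_margin / price_per_hour);
    return 0;
}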

Wait, so they're switching to EPYC for AWS and now their own ARM for EC2?

Did they just purge Intel out of their entire server business?

kek

archive.is/v0R1z

NONONONONONO
INTEL 10NM SUPERPOWER IN 2020

Attached: 1527629778452.jpg (679x758, 54K)

Sure, shlomo, sure.

Attached: 1543201660279.png (631x310, 14K)

...

>Meltdown
>Affects all
>AMD
Full retardation right here, folks.

>Did they just purge Intel out of their entire server business?
Yes, but not from the "entire" server business yet, just their own.

>2020
>already delayed to 2021

techpowerup.com/250020/intel-candidly-discusses-troubles-at-credit-suisse-22nd-annual-tmt-conference

Attached: 1539152405891.gif (200x150, 2.75M)

Lol, and just like that, Intel is dead.
Based ARM and AMD Zen chiplets killed them in under two years.
Eat shit and die, Intel. The only market they even matter a fuck in now is PC gaming, which is also in decline.

>the only market they even matter a fuck in now is pc gaming
2700X matches the 8700K (±2%, which is margin-of-error territory) in gaymen and utterly OBLITERATES it in serious productivity/multitasking.

Attached: MEGATASKING.png (300x300, 60K)

I have a 1080 and a 2700X, basically a perfect match. Anything over a 1080 Ti, like a 2080 Ti, gets bottlenecked by a 2700X at any res under 4K.
Oh well, 3700X next year, and 7nm DDR5 4700X after that.
Gonna be a bloodbath in 202x, yo.

>tfw gonna get Ryzen 3000 or 4000 and 2nd or 3rd gen RTX unless AMD swoops in with something great

Attached: 1491930149065.jpg (852x912, 140K)

>2700X basically a perfect match; anything over a 1080 Ti like a 2080 Ti gets bottlenecked by a 2700X at any res under 4K
No it doesn't, fuckshit. 2700X is more than enough to fully sustain even two 2080 Ti in SLI at 8K, I've tested it.

Well, Vega 20 was the first 7nm GPU to market, so next year there's big Navi, and then Arcturus in 202x.
I'm happy with my 1080 for now; it's plenty for 1080p.

>1080
>for 1080p
Full retardation is at hand, ladies and gentlemen.

It's not, though. The latest benches I saw from GN and Unboxed basically showed the i9 housefire pull ahead by 25% at 1080p and 1440p, but only with a single 2080 Ti.
It struggles at 1440p, and 4K is off the table.
I'm at 2560x1080 resolution; I'm waiting for the gtx2560 7nm refresh.

>the latest benches I saw from GN and Unboxed
Hidden and saged.

>struggles at 1440p
>1080
U INSTALLED LATEST GIMPWORSE DRIVERS YET, EH, GOYIM?

Attached: GIMPWORSE VS FINEWINE.png (736x736, 1.55M)

OC'd, it craps on a 56.
>bfv
Nope, I hated BF1 and BFV was so bad I didn't buy it. Fuck that game.

Sure, kid, sshhhuuuurrree:

Attached: 674858568.jpg (1265x1416, 655K)

Oh yeah, post the only game besides Doom and Wolf that actually uses Vega, hahaha, k mate

Holy shit you're in full buyer's remorse denial of reality damage control, lol. Pathetic and sad.

Attached: 1539299904198.gif (320x384, 2.23M)

I owned a 56 as well; it was so slow I went back to my 1080.

Vulnerabilities GOOD
Housefires GOOD
Orange logo BAD

Before you throw shill stones, maybe you should take a look at who the worst offenders in this utterly wretched thread are.

Stay mad and dumb, brainless marketing victim milking cow.

>Orange
Wow, your monitor is utter shit it seems.

But again: why would their customers choose these instances over x86 ones?

>higher is better

So Vega being a hot flop for gaming outside of literally 5 titles makes me dumb and brainless?
I'd buy AMD again, but I'm not touching 14nm Vega.

Did you read the fucking article? They're ARM chips, and Amazon is going to offer their usage at up to a 45% discount compared to Intel.

Why is Jow Forums fucking illiterate?
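If that 45% number holds, the break-even math is trivial: at 0.55x the x86 price, the ARM instance only needs to deliver more than 55% of the x86 throughput to win on perf-per-dollar. Quick sketch; only the discount comes from the article, and the relative-performance figure is a placeholder since, as noted above, no benchmarks exist yet:

/* Perf-per-dollar break-even under a 45% discount. The discount is from
 * the article; arm_rel_perf is a made-up placeholder to fill in once
 * real benchmarks exist. */
#include <stdio.h>

int main(void) {
    double discount      = 0.45;           /* up to 45% off vs Intel instances */
    double arm_rel_price = 1.0 - discount; /* 0.55x the x86 price */
    double arm_rel_perf  = 0.70;           /* placeholder: 70% of x86 throughput */

    printf("ARM perf/$ relative to x86: %.2fx\n", arm_rel_perf / arm_rel_price);
    printf("Break-even relative perf:   %.2f\n", arm_rel_price);
    return 0;
}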

Yes, yes it is.

Attached: 1543195198682.png (745x281, 18K)

>thread about enterprise server usage of processors
>15-year-olds go bananas and have to argue about video games because they haven't used a computer for anything serious in their entire lives
off my fucking board REEEEEEEEEEEEEE

>performs better than 1080 and 1080 Ti after latest driver updates
>flop for gaming
You're an absolutely brain-damaged cretin. A total fucking idiot.

More like Vomit Lake, LMAO

Attached: 1543202335624.gif (226x195, 1.94M)

>performs better than a 1080ti
Post benchmarks or fuck right off. This is the biggest fucking lie in this dumb-fuck thread; that would put it at RTX 2080 Ti tier perf.
FUCK OFF RAJA CRAPACURRY INTEL SHILLSHITTER
Who the fuck cares what Amazon does, apart from setting trends? On the upside they're ripping market share away from incel, but they're just going to become the new monopoly along with ARM/AMD.
Fuck Jeff Bezos and his weird bald head. I ain't no corporate cocksucker.

>Wow, your monitor is utter shit it seems.
Are you trying to tell me the AWS logo is not orange?

Attached: aws.png (720x720, 51K)

Please needfuly delet sir

Attached: 1539796017419.png (691x750, 33K)

>Post benchmarks
He literally did, twice, you brain-damaged cocksucking imbecile. 56 performs better than 1080 after latest driver updates, so 64 now performs better than 1080 Ti considering the massive gimping that took place with latest noVideo drivers. If you can't count 2+2, that's not our problem.

In one game? Nobody plays Dirt 4.
BFV is the same; it's just a glorified RTX tech demo, the underlying game is shit, and nobody bought it.
Also, nobody has a Founders Edition 1080; most are aftermarket.

>1080ti slower than Vega
Either post proof or fuck off.
God the Intel raja shills are waaaaaay off point

Attached: RAJ.png (595x654, 312K)

>user posts pic with red bars and red logo of [H]ardOCP site
>responds with "orange logo bad"
>gets called out on incorrect colors
>backpedals and changes goalposts to make it seem like he's not retarded, but others are

>SKL/14nm++
It is quite strange how Intel has pushed no architectural upgrades whatsoever since Skylake. I could perhaps understand it for one generation if they'd already designed their new architectural features for the new node and wanted to wait until the node worked instead of spending time on backporting them, but with Comet Lake it will be like four generations after the original Skylake. Are they actually totally out of architectural innovation?

Attached: INTURDED.png (732x312, 1.75M)

Not even him, but obviously no one would call the HardOCP logo orange, whereas the whole thread is about Amazon.

Nice non-argumentation and sperging, kid. Stay mad and reality-denying.

sir i cant take the abuse sir plese

I'm still waiting for those 1080 Ti < Vega 64 benchmarks

Attached: 37071366_1797826666963569_1927026172151988224_n.jpg (419x412, 36K)

>It is quite strange how Intel has pushed no architectural upgrades whatsoever since Sandy

>Are they actually totally out of
youtube.com/watch?v=s1Ww2vNAjN0

>he's an Aussie
That explains almost everything, alright. World news reaches your barren shithole way slower and later than the rest of us, I guess.

I am still waiting for those benchmarks showing Vega is faster than a 1080 Ti.

Post the benchmarks. While you're at it maybe you'd write some drivers for ryzen apoos, because the customers are pretty mad, rajesh.

>It is quite strange how Intel has pushed no architectural upgrades whatsoever since
Nehalem

>He can't do simple math
Stay mad, brainlet.

AMD math used by special boys like you doesn't work in the real world. Just curious, are you that retard who thought he got RTX working on a Vega 64?

>owners are mad
u fockin wot cunt? my 2400G hums along nicely, getting top tier performance for iGPU, stop being a fuckwit Lao

>14nm+++++++++++++++++++++++++++++++

mobile ryzen apus

Haswell and Skylake were changes, at least. However great you want to argue they were or weren't, they were at least something. Since Skylake, *literally* nothing at all has happened.

That being said, Haswell in particular was actually a fairly solid upgrade over Sandy Bridge. It brought fairly major things, like 3-input µops, reg-reg move elimination, AVX2, and a significant widening of issue width, and there's no shortage of benchmarks showing a significant IPC uplift (10%+) from it. Skylake is perhaps more mediocre.
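For the record, the "3-input µops" bit is FMA: Haswell's fused multiply-add takes three register inputs in a single instruction, and AVX2 extended 256-bit vectors to integer ops. A minimal intrinsics sketch, assuming GCC/Clang on x86 (compile with -mavx2 -mfma):

/* One 256-bit fused multiply-add: eight float a*b+c results from a
 * single 3-input instruction, the kind of µop Haswell introduced. */
#include <stdio.h>
#include <immintrin.h>

int main(void) {
    __m256 a = _mm256_set1_ps(2.0f);
    __m256 b = _mm256_set1_ps(3.0f);
    __m256 c = _mm256_set1_ps(1.0f);

    __m256 r = _mm256_fmadd_ps(a, b, c);   /* r = a*b + c, 8 lanes at once */

    float out[8];
    _mm256_storeu_ps(out, r);
    printf("%f\n", out[0]);                /* prints 7.000000 */
    return 0;
}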

NOOOOOOOO

Attached: 1515405625810.png (675x827, 35K)

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

>40 shitkikes unironically voted shittel

netbookcheck has been posting great results, you're fucking reaching now Chow

Everything released after Sandy is a rehash/rebadge of Sandy.

When will this be available? I want to lower my costs ASAP. I don't care if I have to rebuild multiple machines.

QUADS OF TRUTH
INTEL SHITTERS BTFOERVER

>Ryan Shrout, JayZ, Tom Pabst, François Piednoël...

But it's literally not. I just explained some of the differences between Sandy Bridge and Haswell, for instance. Everything released after Skylake, on the other hand, actually is just a rebadge of Skylake.

>it's literally not
Coping and denying the harsh truth of reality hard, I see.

intel only for high frame rate games.
if you play at 60fps,
it doesn't matter.

If you read my original post, you'd see that I'm not arguing that the changes since Sandy Bridge were great and revolutionary. I'm just saying that they exist and have concrete results (however small or great you deem them). That's in contrast to Intel's record since Skylake, in which literally nothing at all has happened. Hence the question: why has Intel done literally nothing at all in that time?

INTEL FAGS BTFO