Is it over Intel/Nvidia bros?

>32 core 64 Thread Threadrippers
>3GHz base, 3.4GHz boost
>32 core Threadrippers are 250W (compared to Intlel's 28 core housefire)

>7nm Epyc CPUs
>new Vega 56 "Nano" card
>new Vega GPU at 7nm with 32GB of HBM2
>1.35x Performance increase
>2x more energy efficient

Meanwhile Nvidia's CEO doesn't want to speak about new card releases, and Intel is trying to make its move in the nuclear power industry.

is this it bros?

Attached: leaked picture Intlel HQ.jpg (1072x650, 142K)

Other urls found in this thread:

en.wikipedia.org/wiki/Comparison_of_deep_learning_software
en.wikipedia.org/wiki/Dennard_scaling
twitter.com/SFWRedditImages

Attached: t-threadripper 2 will be DOA r-right.png (501x462, 77K)

Nvidia still dominates scientific computing/deep learning

>32 core 64 Thread Threadrippers
Which will be beaten by the lower-core, higher-clocked Intel equivalent.
>1.35x Performance increase
Which will be beaten by the new Nvidia cards coming out in a few months. The CEO was joking when he said new cards were "a long time" away.

IF what we heard about ROCm is true and AMD managed to create another HIP tool that directly translates CUDA to its own API without modifications,

then it's over for Nvidia; they simply won't be able to keep up with the superscalar design at any level.
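For reference, a CUDA-to-HIP port really is mostly a rename job. Here's a minimal vector-add sketch in HIP (assumes a ROCm install with hipcc; this is just the textbook example, nothing from this thread):

// Minimal HIP vector add. HIP's API deliberately mirrors CUDA's
// (hipMalloc ~ cudaMalloc, hipMemcpy ~ cudaMemcpy, etc.), which is
// what makes near-automatic translation of CUDA code plausible.
#include <hip/hip_runtime.h>
#include <cstdio>

__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    hipMalloc((void**)&da, bytes);
    hipMalloc((void**)&db, bytes);
    hipMalloc((void**)&dc, bytes);
    hipMemcpy(da, ha, bytes, hipMemcpyHostToDevice);
    hipMemcpy(db, hb, bytes, hipMemcpyHostToDevice);

    // Same launch semantics as CUDA's <<<grid, block>>> syntax.
    hipLaunchKernelGGL(vec_add, dim3((n + 255) / 256), dim3(256), 0, 0, da, db, dc, n);

    hipMemcpy(hc, dc, bytes, hipMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    hipFree(da); hipFree(db); hipFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
}

Aside from the launch macro, swap the header for CUDA's and the hip prefixes for cuda and this is the CUDA version; AMD's hipify tooling automates exactly that kind of rewrite.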

where's your source

>streetshitter thinks nvidia has competition

i bet you think vega was a success

Attached: b044444ac6.jpg (490x554, 66K)

nice filename :)

>i bet you think vega was a success

Attached: market share.png (1464x555, 82K)

It's happening
Hopefully my 1080 G1 lasts till 2020; by then MCM 7nm will be mature enough and I can comfortably upgrade to 4K 250Hz.

That's a really dumb take. RTG remains terrible and the Vega nano is still on shitty 14nm by gloflo. Until amd gets real yields on a non-garbage fab don't expect them to remotely compete with Nvidia. CPU is a completely different story, and they're poised to lose big time again when Intel pulls bullshit lock-in agreements in the hpc space to prevent the superior amd offering getting traction. It's all happened before, it'll all happen again.

Nvidia is safe; I doubt AMD will be capable of salvaging the garbage that is Vega. Intel, though, is in full PANIC mode.

What source do you need if only one manufacturer makes CUDA GPUs?

Nvidia still dominates overfunded institutions who are stupid enough to invest 100k+ into a kludgy gaming platform rebadged as a research tool.

>What source do you need if only one manufacturer supports a niche, proprietary API

What did he mean by this?

Please read again what we are talking about.

>Intel will calmly execute their master plan to defeat Ryzen
>rushes an overclocked Xeon pulling 1500W on stage to try and grab a headline
>gets BTFO the next day by Kikeripper
>immediately demands the overclocked Xeons back from Asus and Gigabyte
Truly the image of a calm, collected company with a long-term plan for dominance.

Attached: sorry.gif (978x478, 427K)

>It's another AMDrone thread from /pol/ack crossboarders who think their favorite multi-billion-dollar company isn't jew tier due to mental gymnastics episode.
Protip: they are in it for profit, so you should all stop giving them free marketing; at least ask for something in return, you dumb NEETs.

You wrote precisely why Nvidia still dominates the scientific field. Why act so obtuse when you yourself know exactly what is happening?

But we are getting something in return. Awesome CPUs at a fair price. And your tears. That's worth a lot, user.

>3GHz base, 3.4GHz boost
engineering sample

That's a good one, marketeer-kun.
Upboted
Don't forget to buy the advertiser rights on 4chan, send an email to hiroshimoot.

>Ackshually
But I know a good upgrade when I see one. Come August, Threadripper is mine.

Attached: Selection_022.png (702x317, 37K)

Please explain what exactly is happening.

>Nvidia still dominates the scientific field.

oh, I guess that's why Intel put FPGAs in their newest niche product.

I really have no idea what you're trying to say. Not everyone thinks the same way you do.

Wait, they demanded them back? Source? I haven't seen this.
T. Intel to Zen cuck

Thanks user

Scientists use CUDA because they can't be bothered to learn anything else (these are the people that still use Fortran, ffs). Only one company makes CUDA GPUs. CUDA is proprietary. Therefore Nvidia dominates the scientific field.
en.wikipedia.org/wiki/Comparison_of_deep_learning_software

Give us Vega40 7nm 2.2ghz AMD

Intel still rules the CPU market in third world countries. Few people in those countries know about AMD.
So even if AMD makes the super duper microcircuit, Intel still has better marketing in poor countries. So... Intel is still relevant.
>t. Sopa de Primata

Well you got me there. You have sources and everything.

Attached: Darwin4athiests[1].jpg (1565x913, 294K)

>Which will be beaten by the lower-core, higher-clocked Intel equivalent.
Only on liquid nitrogen LUL

Attached: 43awr5p06d211.jpg (1536x869, 74K)

There's nothing wrong with FORTRAN, brainlet

wow

I like this timeline

Attached: dRINhkW.jpg (711x457, 51K)

>placed on a trolley cart
Truly Cascade Lake.

Attached: 1528257832783.png (823x520, 436K)

Nice stock cooler there, Intel. Nothing like those shitty ones that AMD packages with their CPUs.

This is what AMD demoed with.

Attached: threadripper_cooler_vladsavov_computex18.jpg (1200x800, 110K)

Attached: Everything's fine.png (653x726, 84K)

>32 core 64 Thread Threadrippers
>half of the dies do not have direct access to RAM

Someone should add one extra layer to this

Attached: 1526745411872.png (882x758, 468K)

You want to game on a 32 core machine?

So intel doesn't have legit PCIe because half of the lanes are not direct?

2/10

How big a deal is this for anything that isn't memory intensive?

Nvidia has proven they do think about AMD when they liked Epyc, fag

Get Epyc if you are concerned.

I don't know if I should be concerned. Nothing I do is specifically memory intensive, it just likes moar coars.

Worst case scenario you can just limit tasks to the 16 cores with direct memory access.
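A minimal sketch of that workaround on Linux, using plain sched_setaffinity (big assumption: that logical CPUs 0-15 are the ones on the memory-attached dies; check your actual topology with lstopo or /proc/cpuinfo before hardcoding anything):

// Pin the calling process to the first 16 logical CPUs (Linux-only).
#include <sched.h>
#include <cstdio>

int main() {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int cpu = 0; cpu < 16; ++cpu)  // assumption: CPUs 0-15 = dies with direct RAM access
        CPU_SET(cpu, &set);
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {  // pid 0 = this process
        perror("sched_setaffinity");
        return 1;
    }
    // ... launch the memory-sensitive workload from here ...
    return 0;
}

Same effect as launching through taskset, just baked into the program.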

The cards aren't the problem, the software is, user.
AMD doesn't have anything to compete with CUDA and GameWorks.

I'll put it like this. If you make money with your computer and you're worried TR might not be enough, then go for Epyc. I mean, if we're talking money here, what difference does an extra grand for the CPU make? In my case, I need the cores but the memory bandwidth isn't that important, and even though I make about 7 grand a week on my computer, I'm going 32 core TR in August. If I needed the memory bandwidth, though, I wouldn't hesitate to just go Epyc. I'd go Epyc now if the single-process speed were competitive with TR.

Like I said, as far as I know it shouldn't be an issue. I'm just not that educated on the impact this is going to have on latency.

And I'm holding out until 7nm for an upgrade. I'd love to see a 48c or even 64c 3950x.

>7% increase over a year
>still 65% nVidia

>Which will be beaten by the lower-core, higher-clocked Intel equivalent.

1) The stock clock is listed in the photos as 2.7GHz, which is lower than AMD's.
2) They needed a phase-change cooler to overclock it to 5GHz, which isn't that impressive. Threadripper could do the same with enough cooling.

How incredibly mismanaged do you have to be to fuck up as badly as intel did?
It just doesn't end.

Attached: 60cd5ad54b906bc9aa7d6c8ed6d71226b1eaf828c8b683b5194f0f42eac1ec6a.png (765x586, 240K)

>all that shit and Poozen still can't game and has shit single core perf

IAEA classified it as a potential threat and they had to remove it?

>32 cores for MUH GAYMEN
Kys

>New Vega GPU at 7nm with 32GB of HBM2

HOREY SHEEIT
Are these official claims!??

PLEASE BE REALLL

SEETHING

Attached: 1525554090644.jpg (1079x784, 161K)

>Intel is trying to make its move in the nuclear power industry.

Kek'd

Intel demoed with an industrial cooler massive enough to have its output measured in horsepower rather than watts.

Yes, and it's already being sampled to partners. It's a business part, and it's still an old Vega shrink.
>only games are latency-sensitive
It's the first CPU to have dies without direct memory access. Remember what the GTX 970 was, BTW.
I really do not know. One could probably simulate that on current Threadripper by forcing a die to use only the other die's memory.
Non-overclockable, slower RAM support, different motherboards.

I wish that AMD still made a two die TR though.

What was the point of this?

AT LEAST
>Gets molested by original Ryzen
WE
>Gets penetrated by Threadripper
STILL HAVE
>Gets slam dunked by Meltdown and Spectre
OUR
>Gets dunked on EVEN HARDER by Spectre
SINGLE CORE
>Gets brutalized by Threadripper 2
PERFORMANCE!

Yes, but they are for workstation and server use. Consumer 7nm announcements later.

Something I've always wanted to know: why doesn't the speed of processors increase with new models? A 15 year old processor already ran at 3GHz.

As I understand it, that is how many instructions it handles, right? So why doesn't it improve?

That's the 1700W cooler Intel used for the demo.

They weren't.

Attached: Screenshot_20180606_163636.png (881x188, 24K)

35% of the market is a healthy share, especially for a company whose resources are stretched much thinner than their competition's.

But that wasn't the subject anyway - it was whether Vega was a success, and AMD's market share jumped 6% in the quarter following Vega's launch.

Attached: Amada.jpg (964x960, 107K)

Yes but why? Just to show what their CPUs can do at extreme conditions?

>another trap mascot
kys

I see, so they do increase?

More recently we've been coming up against limits that we can't quite break, at least not until quantum computers become viable.
So the second best course of action is to spread out the workload and do as much as we can at the same time.
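A minimal sketch of what "spread out the workload" looks like in code - one big array summed by however many cores you have, instead of one core doing it all (sizes are illustrative):

#include <thread>
#include <vector>
#include <numeric>
#include <cstdio>

int main() {
    const size_t n = 1 << 24;
    std::vector<int> data(n, 1);

    unsigned nthreads = std::thread::hardware_concurrency();
    if (nthreads == 0) nthreads = 4;  // fallback if the runtime can't tell us

    std::vector<long long> partial(nthreads, 0);
    std::vector<std::thread> workers;
    const size_t chunk = n / nthreads;
    for (unsigned t = 0; t < nthreads; ++t) {
        size_t lo = t * chunk;
        size_t hi = (t + 1 == nthreads) ? n : lo + chunk;
        // Each thread sums its own slice; no shared writes, no locks needed.
        workers.emplace_back([&partial, &data, t, lo, hi] {
            partial[t] = std::accumulate(data.begin() + lo, data.begin() + hi, 0LL);
        });
    }
    for (auto& w : workers) w.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    printf("sum = %lld across %u threads\n", total, nthreads);
}

Each core still runs at the same old 3-4GHz, but with 32 of them the wall-clock time for this kind of job drops accordingly.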

Because otherwise this housefire will throttle to 10% speed.

Thanks user.

>jumped 6%
That's what I call a success!

>Need a dedicated 15A circuit for the CPU alone and another 15A circuit to power the cooler and accessories with little headroom left over

You can't make this shit up

Attached: RipperZ.png (1166x925, 996K)

HAHAHA WTFF

THE STATE OF INTEL
THEY NEED A FUCKING MINIFRIDGE FOR THEIR HOUSEFIRES AHHAAAA

Quantum computers are an entirely different beast; they can't even do "normal" computations the way a normal computer does - not an upgrade over transistor-based technology.

We could, however, still go back to vacuum tubes - lithographically print them in silicon and copper like normal microtransistors. They can be made smaller and can operate at much greater frequencies.

COPE

Attached: Amada!!!.jpg (1616x2048, 277K)

Also, because the gate has nothing in it (literally a vacuum), it doesn't heat up; the only heat dissipation would be from current going through the copper traces - several orders of magnitude less heat output than semiconductor-based CPUs.

None of the dies have direct access to ~all~ the RAM in Threadripper 1. It's not that big of a deal.

>As I understand it, that is how many instructions it handles, right? So why doesn't it improve?
A CPU's frequency is the clock rate of its internals, and it caps how often instructions can be issued. Modern CPUs execute more instructions per cycle (IPC) at the same frequency, so they do more work per clock.
Memory latency reduces instruction throughput.

Classic scalar RISC pipelines are the designs that execute one instruction per cycle, and they didn't run at very high frequencies either.
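Rough worked numbers for that (the IPC figures are illustrative, not measurements):

throughput ≈ instructions per cycle (IPC) × frequency
early-2000s 3GHz chip: ~1 IPC × 3GHz ≈ 3 billion instructions/s
modern 3GHz core: ~4 IPC × 3GHz ≈ 12 billion instructions/s

Same number on the box, several times the actual work done per second.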

It's not that big a deal because the OS kernel maps virtual memory onto physical memory accordingly, among other reasons.

>he needs a 1200W PSU for his cooler
>he needs active cooling for his fucking cooler because it itself is a fucking housefire
I wonder if the two PSUs need their own water cooling

HAAAAAAAAAAAAAAA

1700W+*

If Vega 64 and 56 are anything to go by, AMD is dead when it comes to GPUs, especially high end GPUs

No one was supposed to notice that. We only know about it because AnandTech managed to get a short hands-on with the system. It literally is just an overclocked $9k server chip.

Honestly it's not quite as bizarre as people think; the first 1GHz PCs had fridges built into the case. But obviously this setup is not practical at all.

>Yes but why? Just to show what their CPUs can do at extreme conditions?

Because Intel has nothing; this was an attention grab hoping to steal thunder from AMD.

Because Dennard scaling is over. Power dissipation goes through the roof over 5GHz, even on Intel's 14nm.

en.wikipedia.org/wiki/Dennard_scaling

Attached: 1525082832211.png (645x773, 87K)

Nice picture, Jordanian Peterson cock sucker

Attached: 1528208645545.png (800x450, 971K)

AMD is moving Vega to 7nm, which cuts power draw significantly while increasing performance by 35%.
Even if everything else stayed exactly the same and it's just a 35% performance increase, that puts it in 1080 Ti territory (Vega 64 roughly trades blows with the GTX 1080, and 1.35x that lands around a 1080 Ti).

If this becomes a $300 part, it's very hard to argue with the performance; if it stays a $400 one, it's Nvidia's pricing that will determine how good a deal AMD is.

Dynamic power ≈ α (activity factor) × gate capacitance × voltage² × frequency
voltage ∝ frequency

Raising the frequency forces the voltage up too, so power grows roughly with the cube of the frequency.
Adding cores just linearly increases the number of transistors and their total capacitance, so power grows linearly with core count.
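To put rough numbers on that (just plugging into the formula above, with voltage ∝ frequency):

double the frequency: f → 2f forces V → 2V, so power ≈ α·C·(2V)²·(2f) = 8× the original
double the cores at fixed f and V: capacitance ≈ 2C, so power ≈ 2× the original

Which is exactly why everyone adds cores instead of chasing clocks past 5GHz.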

Attached: Muh Power density.png (768x539, 41K)

The reason they snatched the CPUs back from Asus and Gigabyte is that the partners let the cat out of the bag about the chiller and the overclock. Intel was trying to present it as running at stock and without exotic cooling. I guess they forgot to tell their partners that they weren't supposed to take the system apart and show it to everybody.

Attached: homer.png (335x455, 196K)

They won't on "10nm" compared to AMD's 7nm.

It did, but only to the dumb normalfags (and that buttmad French shill ex-engineer on Twitter) who won't even buy this; anyone with a speck of silicon knowledge would notice the red flag as soon as they saw the clocks.

Pure publicity stunt.
I fucking wonder how half these 'tech sites' didn't notice until a day later...