Nvidia still dominates scientific computing/deep learning
Aiden Murphy
>32-core 64-thread Threadrippers
Which will be beaten by the lower-core, higher-clocked Intel equivalent.
>35% performance increase
Which will be beaten by the new Nvidia cards coming out in a few months. The CEO was making a joke when he said "in a long time."
Landon Ward
If what we heard about ROCm is true and AMD managed to create another HIP tool that directly translates CUDA to its own stack without modifications,
then it's over for Nvidia; they simply won't be able to keep up with the superscalar design at any level
It's happening. Hopefully my 1080 G1 lasts till 2020; by then MCM 7nm will be mature enough and I can comfortably upgrade to 4K 250Hz
Jaxson White
That's a really dumb take. RTG remains terrible and the Vega Nano is still on shitty 14nm from GloFo. Until AMD gets real yields on a non-garbage fab, don't expect them to remotely compete with Nvidia. CPUs are a completely different story, and there they're poised to lose big time again when Intel pulls bullshit lock-in agreements in the HPC space to prevent the superior AMD offering from getting traction. It's all happened before; it'll all happen again.
Carson Allen
Nvidia is safe; I doubt AMD will be capable of salvaging the garbage that is Vega. Intel, though, is in full PANIC mode.
Brody Flores
What source do you need if only one manufacturer makes CUDA GPUs.
Levi Gray
Nvidia still dominates overfunded institutions who are stupid enough to invest $100k+ into a kludgy gaming platform rebadged as a research tool.
Hudson Anderson
>What source do you need if only one manufacturer supports a niche, proprietary API
What did he mean by this?
Nathaniel Brooks
Please read again what we are talking about.
Evan Brooks
>Intel will calmly execute their master plan to defeat Ryzen
>rushes an overclocked Xeon pulling 1500W on stage to try and grab a headline
>gets BTFO the next day by Kikeripper
>immediately demands the overclocked Xeons back from Asus and Gigabyte
Truly the image of a calm, collected company with a long-term plan for dominance.
>It's another AMDrone thread from Jow Forumsack crossboarders who think their favorite multi-billion-dollar company isn't jew tier due to mental gymnastics episode
Protip: they are in it for profit, so you all should stop giving them free marketing; at least ask for something in return, you dumb NEETs
Kayden Carter
You wrote precisely why Nvidia still dominates the scientific field. Why act so obtuse when you yourself know exactly what is happening?
Gavin Reed
But we are getting something in return. Awesome CPUs at a fair price. And your tears. That's worth a lot, user.
John Hughes
>3GHz base, 3.4GHz boost engineering sample
Jacob Rivera
That's a good one, marketer-kun. Upvoted. Don't forget to buy the advertiser rights on Jow Forums; send an email to hiroshimoot.
Hunter Lee
>Ackshually
But I know a good upgrade when I see one. Come August, Threadripper is mine.
Oh, I guess that's why Intel put FPGAs in their newest niche product.
I really have no idea what you're trying to say. Not everyone thinks the same way you do.
Jack Carter
Wait, they demanded it back? Source? I haven't seen this. t. Intel-to-Zen cuck
Nathan Adams
Thanks user
Brody Lee
Scientists use CUDA because they can't be bothered to learn anything else (these are the people that still use Fortran, ffs). Only one company makes CUDA GPUs. CUDA is proprietary. Therefore Nvidia dominates the scientific field. en.wikipedia.org/wiki/Comparison_of_deep_learning_software
Aiden Davis
Give us Vega 40 at 7nm and 2.2GHz, AMD
Camden Cruz
Intel still rules the CPU market in third-world countries. Few people in these countries know about AMD. So even if AMD makes the super duper microcircuit, Intel still has better marketing for poor countries. So... Intel is still relevant. >t. Sopa de Primata
Adam Edwards
Well you got me there. You have sources and everything.
So intel doesn't have legit PCIe because half of the lanes are not direct?
2/10
Justin Adams
How big a deal is this for anything that isn't memory intensive?
David Morgan
Nvidia has proven they do think about AMD when they liked Epyc, fag
Nathan Cruz
Get Epyc if you're concerned.
Hunter Barnes
I don't know if I should be concerned. Nothing I do is specifically memory intensive, it just likes moar coars.
Worst case scenario, you can just limit tasks to the 16 cores with direct memory access.
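For what it's worth, pinning a process to a subset of cores is a one-liner on Linux. A minimal sketch with Python's `os.sched_setaffinity`; the "first 16 logical CPUs" stand in for the dies with direct memory access, which is an assumption about how the OS numbers them:

```python
import os

# Restrict the current process to the first 16 logical CPUs
# (or fewer, if this machine doesn't have that many).
available = sorted(os.sched_getaffinity(0))
local_cpus = set(available[:16])
os.sched_setaffinity(0, local_cpus)

# The scheduler will now only run this process on those CPUs.
print(sorted(os.sched_getaffinity(0)))
```

The same thing can be done from a shell with `taskset` or, NUMA-aware, with `numactl`.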
Hunter Brown
The cards aren't the problem, the software is, user. AMD doesn't have anything to compete with CUDA and GameWorks.
Joshua Jones
I'll put it like this. If you make money with your computer and you're worried TR might not be enough, then go for Epyc. I mean, if we're talking money here, then what difference does an extra grand for the CPU make? In my case, I need the cores but the memory bandwidth isn't that important, and even though I make about 7 grand a week on my computer, I'm going 32-core TR in August. If I needed the memory bandwidth though, I wouldn't hesitate to just go Epyc. I'd go Epyc now if the single-process speed was competitive with TR.
Colton Martin
Like I said, as far as I know it shouldn't be an issue. I'm just not that educated on the impact this is going to have to latency.
And I'm holding out until 7nm for an upgrade. I'd love to see a 48c or even 64c 3950x.
Cameron Jones
>7% increase over a year >still 65% nVidia
Alexander Anderson
>Which will be beaten by the lower core higher clocked intel equivalent.
1) The stock clock is listed in the photos as 2.7GHz, which is lower than AMD's. 2) They needed a phase-change cooler to overclock it to 5GHz, which isn't that impressive; Threadripper can do the same with enough cooling too.
Oliver Roberts
How incredibly mismanaged do you have to be to fuck up as badly as intel did? It just doesn't end.
>Intel is trying to make its move in the nuclear power industry.
Kek'd
Andrew Perez
Intel's chip was demoed with an industrial cooler massive enough to have its output measured in horsepower rather than watts.
Jaxon Torres
Yes, and it's already being sampled to partners. It's a business part, and it's still an old Vega shrink.
>only games are latency-sensitive
It's the first CPU to have dies without direct memory access. Remember what the GTX 970 was, BTW. I really do not know; one could probably simulate it on a current Threadripper by forcing a die to use only the other die's memory. Non-overclockable, slower RAM support, different motherboards.
I wish that AMD still made a two die TR though.
Tyler Hernandez
What was the point of this?
Dylan Clark
AT LEAST
>Gets molested by original Ryzen
WE
>Gets penetrated by Threadripper
STILL HAVE
>Gets slam dunked by Meltdown and Spectre
OUR
>Gets dunked on EVEN HARDER by Spectre
SINGLE CORE
>Gets brutalized by Threadripper 2
PERFORMANCE!
Nathan Davis
Yes, but they are for workstation and server use. Consumer 7nm announcements later.
Jonathan Price
Something I've always wanted to know: why doesn't the clock speed of processors increase with new models? A 20-year-old processor is still 3GHz.
As I understand it, that's how many instructions it handles, right? So why doesn't it improve?
Yes but why? Just to show what their CPUs can do at extreme conditions?
Elijah Flores
>another trap mascot kys
Camden Howard
I see, so they do increase?
William Howard
More recently we've been coming up against limits that we can't quite break, at least not until quantum computers become viable. So the second best course of action is to spread out the workload and do as much as we can at the same time.
Gavin Morgan
Because otherwise this housefire will throttle to 10% speed.
Oliver Hughes
Thanks user.
Noah Hall
>jumped 6% That's what I call a success!
Jordan Thompson
>Need a dedicated 15A circuit for the CPU alone and another 15A circuit to power the cooler and accessories with little headroom left over
THE STATE OF INTEL THEY NEED A FUCKING MINIFRIDGE FOR THEIR HOUSEFIRES AHHAAAA
Anthony Allen
Quantum computers are an entirely different beast; they can't even do "normal" computations the way normal computers do, so they're not an upgrade over transistor-based technology.
We could, however, go back to vacuum tubes: lithographically print them in silicon and copper like normal microtransistors. They can be made smaller and can operate at much higher frequencies.
Also, because the gate has nothing in it (literally vacuum), it doesn't heat up; the only heat dissipation would be from current going through the copper traces. That would be several orders of magnitude less heat output than semiconductor-based CPUs.
Anthony Fisher
None of the dies have direct access to *all* the RAM in Threadripper 1 either. It's not that big of a deal.
Leo Phillips
>As I understand it, that is how many instructions it handles, right? So why doesn't it improve?
The frequency of a CPU is the clock rate of its internals, and it caps how often instructions can be issued. Modern CPUs execute more instructions per cycle (IPC) at the same frequency, though memory latency reduces the instruction throughput actually achieved.
A simple scalar RISC pipeline executes about one instruction per cycle, and those designs don't reach very high frequencies either.
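A back-of-the-envelope illustration of why clock speed alone undersells a modern core; the IPC figures below are illustrative assumptions, not measurements of any specific CPU:

```python
def instructions_per_second(freq_ghz, ipc):
    # Sustained throughput = clock frequency x instructions per cycle.
    return freq_ghz * 1e9 * ipc

old_core = instructions_per_second(3.0, 1.0)  # older core retiring ~1 IPC
new_core = instructions_per_second(3.0, 4.0)  # modern wide core, same clock

# Same 3GHz on the box, four times the work per second.
print(new_core / old_core)
```

This is why "a 20-year-old processor is still 3GHz" doesn't mean performance stood still.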
Christopher Gray
It's not that big a deal, because the OS kernel maps virtual memory onto physical memory accordingly, among other reasons.
Juan Moore
>he needs a 1200W PSU for his cooler
>he needs active cooling for his fucking cooler because it itself is a fucking housefire
I wonder if the two PSUs need their own water cooling
HAAAAAAAAAAAAAAA
Matthew Morgan
1700w+*
Liam Murphy
If Vega 64 and 56 are anything to go by, AMD is dead when it comes to GPUs, especially high end GPUs
Colton Ramirez
No one was supposed to have noticed that. We only know about it because AnandTech managed to get a short hands-on with the system. It literally is just an overclocked $9k server chip.
Aiden Sanchez
Honestly it's not quite as bizarre as people think; the first 1GHz PCs had fridges built into the case. But obviously this setup is not practical at all.
Brody Anderson
>Yes but why? Just to show what their CPUs can do at extreme conditions?
Because Intel has nothing; this was an attention grab hoping to steal thunder from AMD.
Henry Campbell
Because Dennard scaling is over. Power dissipation goes through the roof over 5GHz, even on Intel's 14nm.
AMD is moving Vega to 7nm, which cuts power draw significantly while raising performance by ~35%. Even if everything else stayed exactly the same and it's just a 35% performance increase, that puts it in 1080 Ti territory.
If this becomes a $300 part, it's very hard to argue with the performance; if it stays a $400 one, it's Nvidia's pricing that will determine how good a deal AMD is.
Easton Young
Power = activity factor × gate capacitance × voltage² × frequency, and voltage ∝ frequency.
Increasing frequency increases voltage too, so power grows roughly with the cube of frequency. Increasing cores just linearly increases the number of transistors and their total capacitance.
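Numerically, with the usual dynamic-power model P = α·C·V²·f (the capacitance and voltage values below are made up purely for illustration):

```python
def dynamic_power(cap_farads, volts, freq_hz, activity=1.0):
    # Dynamic switching power: P = activity * C * V^2 * f
    return activity * cap_farads * volts ** 2 * freq_hz

base = dynamic_power(1e-9, 1.00, 4.0e9)   # 4GHz at 1.00V
oc = dynamic_power(1e-9, 1.25, 5.0e9)     # 25% more clock, ~25% more V

# 1.25x the frequency costs roughly 1.25^3 ~= 1.95x the power.
print(oc / base)
```

Adding a second core at the same clock, by contrast, only doubles C, so power doubles; that asymmetry is the whole case for "moar coars" over "moar GHz".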
The reason they snatched the CPUs back from Asus and Gigabyte is because they let the cat out of the bag about the chiller and the overclock. Intel were trying to present it as running at stock and without exotic cooling. I guess they forgot to tell their partners that they weren't supposed to take the system apart and show it to everybody.
It did, but only to the dumb normalfags(and that buttmad French shill ex-engineer on Twitter) who won't even buy this, anyone with a speck of silicon knowledge would notice the red flag as soon as they saw the clocks.
Pure publicity stunt. I fucking wonder how half these 'tech sites' didn't notice until a day later.