Intel will be entering the dedicated GPU market in 2020

What are your expectations, Jow Forums? Will Intel destroy Nvidia and AMD in terms of performance? Will their GPU department be dead in a few years?

Attached: 2019-03-21-image-6.jpg (4032x2268, 478K)

Other urls found in this thread:

youtube.com/watch?v=5FDN6Y6vCQg
pcgamesn.com/amd/radeon-rx-580-review-benchmarks

Couldn't give a shit really but the more competition the better for everyone.

I'm intrigued. They're coming from a lower power background. If they're competitive in performance per watt, then they could build a 300W monster

If it's manufactured on a good process and launches before next gen Nvidia it will be performance per watt leader for a while. Intel can afford to buy market share too so the pricing could be good if they go with that strategy. Drivers will probably be immature though.

That's the part I like as well. Just imagine the price wars. I wish Nvidia would make desktop CPUs as well.

intel are massive jews
don't have the best history with drivers
and they botched larrabee
I'm not getting my hopes up

user I like the way you think. Nvidia pumping out x86 chips would probably be good for our wallets.

Always great to have more competition, but since it's Intel I expect them to be way more expensive than the competition, no matter their performance.

I couldn't care less even if microsoft or jewgle started making their own gpus

never ever happening

>hopes
We go back to sane GPU prices.
>expectations
Nothing.

Considering that Larrabee was scrapped, they'll probably go with a more conventional design. But given the current ray tracing craze they could try to squeeze more SIMD units into Atom cores again. Wouldn't mind a GPU with fewer cores but beefier branch predictors and bigger caches.

It's just going to be an updated and upscaled version of their IGP plus ray tracing hardware. Anything bold and new would be a recipe for disaster when the software doesn't benefit from it at launch.

If Intel CPU and AMD GPU are the dumbest combination, would the reverse make AMD CPU and Intel GPU the smartest combination?

Attached: 1544296389497.jpg (959x527, 58K)

Attached: 8o.png (130x126, 21K)

I'm guessing they will initially mostly be more specialized GPUs. I doubt they will be competing with AMD and Nvidia in consumer GPUs much at first.

Perfectly summarized. Absolutely this.

finally now we can delid our gpus

>Will Intel destroy Nvidia and AMD in terms of performance?
no
it will be cheap china knockoff tier
intel has no experience in gpus

also this

>What are your expectations, Jow Forums?
it will have a heat density per mm² higher than a spaceship re-entering the atmosphere

Intel aren't even remotely competitive with AMD in terms of perf/watt for their integrated solutions. It's not really relevant though, as it's not like they're scaling up their current dog shit graphics architecture for discrete cards. Not that I'm expecting much from it either way.

I don't expect much in terms of gaming.
They might be great for computing.

Will happen once desktops turn to ARM though.

Look at the power of an AMD integrated GPU. Compare that to the power of an AMD dedicated GPU. Now look at the power of an Intel integrated GPU and extrapolate.
Things don't look very hopeful for Intel.

>Will Intel destroy Nvidia and AMD in terms of performance? Will their GPU department be dead in a few years?
Or will they cancel the project just as they're getting ready to put the first one into a box, and resurrect it a couple months later, minus display output, as "Xeon Phi Next" or some shit?

considering most of their experience will come from poo-in-loo, expect the usual.

so the R300-2? they got the money to pull off the miracle here, and half of the ATI staff

Unless they support CUDA they are dead on arrival. The only realistically positive thing coming out of this might be some even lower prices for the low and mid segment but I don't give a shit about that. I finally want an alternative to nvidia's fucking jewish pricing of their Quadros and high end GTX/RTX cards.
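
For anyone who hasn't touched it, this is roughly what the lock-in looks like in practice: a bare-bones CUDA kernel, purely illustrative (every name and size below is made up for the example), that only runs on Nvidia hardware unless someone ports it:

```cuda
// Minimal SAXPY in CUDA: y = a*x + y. Illustrative sketch only.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the sketch short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // enough 256-thread blocks to cover n
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", y[0]);  // expect 5.0

    cudaFree(x);
    cudaFree(y);
    return 0;
}
```
Point being: whatever Intel ships either has to run code like this somehow or convince everyone to rewrite it, which is the whole problem.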

Ok calm your tits. Literally no one but the shills expect that. I would be happy enough with lower priced GPUs, but honestly even that is doubtful.

They won't even bother with consumer chips and go straight for the CAD / Quadro market segment. They're gambling on rajit being able to churn out certified drivers though, so it's all gonna crash and burn.

I think AMD will die in the GPU market.

Every OEM mysteriously dropping Nvidia and AMD GPUs from their pre-builds and laptops.

God that better not happen. I will be pissed if the switch is made to ARM and not RISC-V.

>primary supplier of GPUs for google, sony and microsoft
>die

Pci-e housefire

>They're gambling on rajit being able to churn out certified drivers though

Attached: 1527816283535.jpg (1280x720, 137K)

How do i delid a gpu?

Why not. AMD sucks in the GPU market.

I HAVE A SINGLE EXPECTATION: INTEGER SCALING.
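
For anyone wondering what that actually means: integer scaling is just nearest-neighbour upscaling by a whole-number factor, so every source pixel becomes an exact k×k block and low-res content stays sharp instead of getting smeared by bilinear filtering. A toy sketch of the idea (illustrative only, not how any vendor's driver actually implements it):

```cuda
// Toy integer-scaling kernel: each source pixel becomes an exact k x k
// block of identical output pixels (nearest neighbour, no filtering).
// Purely illustrative; real scaling lives in the display pipeline/driver.
#include <cuda_runtime.h>

__global__ void integer_scale(const uchar4 *src, uchar4 *dst,
                              int srcW, int srcH, int k) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;  // output pixel x
    int y = blockIdx.y * blockDim.y + threadIdx.y;  // output pixel y
    int dstW = srcW * k;
    int dstH = srcH * k;
    if (x >= dstW || y >= dstH) return;
    // Integer division maps each output pixel back onto exactly one source pixel.
    dst[y * dstW + x] = src[(y / k) * srcW + (x / k)];
}
```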

I'll never get my wish of an ultra low TDP gpu just for extra monitors and basic hardware accel in the current year tho.

>I'll never get my wish of an ultra low TDP gpu just for extra monitors and basic hardware accel in the current year tho.
God I fucking wish. But as you said that'll happen sometime after the heat death of the universe.

Intel has the money to employ an army of driver coders that can rival Nvidia's in a way AMD never will; they've just never bothered to put in the effort before. I'm actually optimistic.

Who the fuck needs CUDA when you can just use vulkan?

If Intel wishes for a competing standard, they might actually succeed. They are Intel, some companies simply listen to them.

I don't think they will be able to compete with AMD or Nvidia; they will probably be the worst of the three for a little while at least

>vulkan

Attached: 1555526319768.png (1085x1217, 1.66M)

Nvidia just needs to make CPUs now

>AMD CPU with Intel GPU

Attached: 1560644154772.jpg (258x544, 32K)

So will actual gaming GPUs arrive in 2020, or will it just be processors for blockchain stuff?

Are they going to try and fab the GPUs themselves on their already capacity-constrained factory lines or contract out to Samsung/TSMC? Intel is a prideful child and control freak when it comes to process nodes.

Get with the times, grandma.
No sane person actually wants to use that proprietary garbage that only works on nvidia when a good alternative is available.
>b-but muh libraries
>>>/webdevgeneral/

>CUDA
>webdev
also
>muh proprietary
Just fuck off already.

Attached: 1555521326228.jpg (585x398, 18K)

Nvidia needs to produce a CPU now. Then I can pair an Nvidia CPU with an Intel GPU.

Attached: gs5mwtyfxy121.jpg (1932x2576, 496K)

Nothing is confirmed at all, and all of the noise made about Intel coming into consumer GPUs is entirely speculation. They are definitely making accelerators for datacenter. Consumer desktop dGPUs are entirely hypothetical.

If they do it though, it would be pretty awesome, assuming it competes.

Can't be worse than the current state of AMD's GPU division.

Attached: Raja-Koduri-Fabrica-de-Samsung.jpg (900x1200, 258K)

They do make CPUs. Denver had a dynamic decoder that translated ARM instructions to its native VLIW format, somewhat like Transmeta's designs. Nvidia never discussed the details of Carmel but it may still function like that. This design would be easier to adapt to another ISA if they hadn't added a fallback hardware ARM decoder to improve performance.
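
Rough idea of how that kind of design works, as a host-side toy (plain C++, compiles as-is in a .cu file; every type and name is invented for illustration, since the real Denver internals were never published): decode a guest instruction once, cache the translated native bundle, and replay it on later hits.

```cuda
// Translate-once, cache-and-replay sketch of a Denver/Transmeta-style core.
// Host-only code; all names here are made up for the example.
#include <cstdint>
#include <cstdio>
#include <unordered_map>

struct GuestInsn    { uint32_t raw; };           // e.g. one ARM instruction
struct NativeBundle { uint64_t microOps[4]; };   // e.g. one wide VLIW bundle

// Stand-in for the real cracking/scheduling/fusing done by the firmware.
NativeBundle translate(GuestInsn g) {
    return NativeBundle{{g.raw, 0, 0, 0}};
}

// Translation cache keyed by guest PC: hot code is translated once and
// replayed afterwards, which is where the win (and the warm-up cost) lives.
std::unordered_map<uint64_t, NativeBundle> tcache;

NativeBundle fetch(uint64_t pc, GuestInsn g) {
    auto it = tcache.find(pc);
    if (it != tcache.end()) return it->second;   // hit: reuse cached bundle
    NativeBundle b = translate(g);               // miss: translate and cache
    tcache[pc] = b;
    return b;
}

int main() {
    GuestInsn insn{0xE3A00001};                  // arbitrary example encoding
    fetch(0x1000, insn);                         // miss: translated and cached
    fetch(0x1000, insn);                         // hit: served from the cache
    printf("cached bundles: %zu\n", tcache.size());
    return 0;
}
```
The fallback hardware ARM decoder the anon mentions would sit beside that path and execute cold code directly, which is exactly what ties the design to ARM.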

im expecting under 1660 ti performance for their top end model. and this is being optimistic

Oh, it can... Just compare how Intel's integrated graphics perform compared to AMD's.

Name 3 (three) reasons why you would want to use CUDA in 2019.
You cannot.

I'm expecting well less than that. Real question is why are they pouring money into this while watching their CPU business implode?

Intel has one massive advantage though: Their R&D budget.
AMD only recently started turning major profits again thanks to Zen, while Intel has been raking in cash non-stop for the better part of a decade now.

"dIvErSiFiCaTiOn" it's a last ditch effort by an unironically dying company lacking any and all competent management and leadership. They are on life support and they know it. It's desperation and nothing more.

they might as well do something else while waiting out the 10nm failure. it's not like they can magically make 10nm work.

No it isn't hypothetical, they already confirmed as much when they announced they had hired Raja. They are making three product lines: one for gaming, one for workstation, one for blockchain.

Their CPU market isn't imploding. They are hitting the limit of Moore's law and their attempt to get out of it with stacked CPUs is 3 years behind schedule. They're just running into design issues, ones they can't solve later and have to solve now.

Why do you think their new CPUs are just the old CPUs running at insane voltages? Their engineers are working on the new chip.

They could buy trade secrets off AMD's manufacturers.

This isn't Intel's first attempt and it will probably flop again.

I'd like a GPU that is cooler than my NAVI housefire

Attached: 1556999175342.jpg (653x726, 138K)

Video editing/rendering.
3D.
Machine learning.
Checkmate tranny.

this. GPUs have largely stagnated and AMD can't keep up. NVIDIA has every dev by the dick by forcing performance through GPU drivers. The entire market is awful.

>machine learning
lol
>3D
double lol

Just admit that you are an incompetent codemonkey who can only use someone else's libraries.

>tranny
nice projection btw

AMD can keep up. Look at the 1060 vs the 580. The 580 is better, but most benchmarking websites don't show that because their reviews are all two years old, from when devs were intentionally optimizing games for nvidia. Once the money stops rolling in you see the Radeon actually pull ahead because nobody is creating artificial bias.

>lol
>double lol
Truly arguing skills second only to Socrates or Plato. With arguments like that you'll have the entire board convinced before dinner and the entire industry by tomorrow evening.

amd has been doing fine not because of zen, but because they power all the consoles + the new ones are all gonna be amd again

not him but almost all of autodesk and adobe packages support cuda.

Maybe third time will be the charm? They've tried and failed twice already.

raja and keller went to samsung, so they could only have gotten nvidia secrets

>AMD can keep up. Look at the 1060 vs the 580. The 580 is better, but most benchmarking websites don't show that because their reviews are all two years old, from when devs were intentionally optimizing games for nvidia. Once the money stops rolling in you see the Radeon actually pull ahead because nobody is creating artificial bias.
I just watched a video making this argument:
youtube.com/watch?v=5FDN6Y6vCQg
Going forward, AMD will pull ahead of Nvidia because next-gen consoles will all be AMD-hardware based, so console ports on PC will favor AMD GPUs. I don't buy it: the same argument was made for forward-looking rendering tech/hardware like Mantle and HBM, and the console wins on the PS4/XB1 haven't helped AMD before, but supposedly "THIS TIME IS GONNA BE DIFFERENT YOU'LL SEE."

Both previous times they didn't have perfectly adequate IGPs. Gen11 is surpassing Vega 11.

>half of ATI staff
Raja mostly took marketing guys with him, not engineers.

Pretty ironic considering that you did the exact same thing twice with your pics.
Well, it's good that you admit defeat.
Nvidia bribing the devs. Nothing special there.
But I'm talking about programming with CUDA directly.

Honestly "this time it's different" is the tl;dr for every fucking AMD release. Alternatively "good for budget builds shit for everything else".
I wish there was competition at the high end, especially the Quadro segment, but AMD has nothing, absolutely fucking nothing beyond budget GPUs.
At least ZEN seems to be catching up so I can ditch my Swiss cheese CPU without sacrificing performance.

CUDA is legit. Adobe's Premiere works like ass without it. Especially when doing 4K edits.

>Pretty ironic considering that you did the exact same thing twice with your pics.
>Well, it's good that you admit defeat.
I'm not even him, but sure, that truly is another very valid argument: another ad hominem, right up there with "lol", "double lol" and ignoring what the other guy said.

p-please guys a little more optimism.... im tired of funding nvidia for 15 years........

well you should have skipped fermi

Except I didn't say it's gonna be different. I said the 580 was, out of the gate, stronger than the 1060, but nvidia paid devs to make it look otherwise.

When you buy nvidia you are literally paying for a temporary performance boost owing to market attention that will evaporate after a year.

>I'm not even him
Sure, sure.
>ignoring things the other guy said
You said fucking nothing, you fucking nigger.
But it's fine when you do it, right? Fuck you.
Want a proper conversation? Then don't turn it into shit yourself first.

With AMD's GPUs there is a very fine line between "optimism" and deluding yourself.
I genuinely wish there were viable alternatives to Quadros but alas we're in for another massive disappointment™.

More ad hominems, name calling and generally "no u". Fantastic arguments there dude. I can't imagine why people have no fucking interest in talking with you. Truly a modern day mystery why no one engages you and your genius arguments.

You started it, so fuck you.

Sure "I" did, there is only one person engaging your bullshit, because there can't possible be more than one person in the entire world disagreeing with your bullshit. Get over yourself and get fucking help for that paranoia of yours.

Black people can keep up. Look at the Blacks Vs Whites. The Blacks are better, but most IQ websites don't show that because they're all racist and old when scientists were intentionally making IQ only for whites. Once whites stop working you see the blacks actually pull ahead because nobody is creating artificial bias.

Do you actually think you've convinced anybody here of anything?

>Jow Forums shit
pcgamesn.com/amd/radeon-rx-580-review-benchmarks

HURP
580 vs 1060 in modern games.

GT 1030?

30W power draw, runs relatively hot for what it is, and has only one HDMI port plus either one DVI-D or one VGA port.

HDMI and DVI-D are all you need, converters are cheap and you're talking about low end.

You think I care about educating autists who cannot even engage in conversation properly?