Let's take a moment to laugh at all the idiots who bought Nvidia or AMD cards instead of waiting for the true king to arrive

Attached: 1566890721975.jpg (1600x960, 678K)


Flops give an approximation, but they are far from 1:1 company to company. For all we know Intel has better per-flop performance than Nvidia does, or Intel posts the best numbers on its first gen and gets fuck all until drivers catch up.

That's not how GPUs work.

AMD has 4096 cores in Pooga but it's still slower than Nvidia.
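Rough back-of-the-envelope math on why raw flops don't settle it: peak FP32 throughput is just shader count × boost clock × 2 (one fused multiply-add per cycle), and by that metric Vega 64 beats a GTX 1080 on paper while roughly tying it in games. A minimal sketch in Python (clock figures are approximate, not spec-sheet exact):

```python
def peak_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical FP32 peak in TFLOPS: cores * clock (GHz) * 2 FLOPs per cycle (FMA)."""
    return shader_cores * boost_clock_ghz * 2 / 1000.0

# Approximate figures; real game performance also hinges on drivers,
# scheduling and memory bandwidth, which is the whole point.
print(peak_tflops(4096, 1.55))  # Vega 64:  ~12.7 TFLOPS on paper
print(peak_tflops(2560, 1.73))  # GTX 1080: ~8.9 TFLOPS, yet comparable in games
```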

Intel at least has the money to hire a competent horde of programmers for their driver team.

Software support has always been AMD's Achilles' heel, even when their hardware was good. You'd think the opposite would be true given that developers have to optimize their games for AMD hardware by default, since every console except the Switch runs on an AMD APU.

Intel... they have fuck all for money and are $80 billion in debt. With 3rd gen Ryzen out, I don't think Intel has the wiggle room it used to. Before, they could just lean on Xeon sales, but now they can't rely on them to make up margins.

I'm up for it as long as it has good OpenGL and Wayland support, etc.

It's going to be, what, 2021 before they're finally able to escape 14nm++++++++++ purgatory on the CPU side? So why not try to open up another market segment to diversify into in the meantime?

They've already sunk the costs of major hires like Raja Koduri & Tom Peterson; now they just need an office full of comparatively cheaper warm bodies to do game and driver optimization. They're already showing promising signs on the driver front, having announced integer scaling support months ago, which Nvidia only just got around to announcing a few days back. I think Intel understands the sink-or-swim situation they will be in at launch, unlike the pet/skunkworks project that Larrabee seemingly ended up being.

Larrabee was apparently really good vs other GPUs at the time, but Intel saw a higher-margin use for it and axed it at the final hour.

What Intel sees here is a potentially self-sustaining market with an inroad into GPU compute to compete with Nvidia and AMD. Once Intel is off 14nm, they will be on par process-wise with everyone else; no more magic "we are ahead on the process node" bullshit.

Nvidia has sat on their ass for so long that, yeah, I understand Intel could catch up. Intel could be a force that slots in between AMD and Nvidia, but this will trigger a price war that Nvidia and Intel can't play ball with.

On a side note about integer scaling: what does that mean in the real world? Would it be a GPU effect that has to be enabled in games, or is it something I can force? I don't like fullscreening games, but I have a few that max out at 720p or 1080p windowed; it would be nice to play them at larger resolutions windowed.

If Intel doesn't get their drivers in shape then they'll never be king.

>Larrabee was apparently really good vs other GPUs at the time, but Intel saw a higher-margin use for it and axed it at the final hour.
What could have been...

>What Intel sees here is a potentially self-sustaining market with an inroad into GPU compute to compete with Nvidia and AMD. Once Intel is off 14nm, they will be on par process-wise with everyone else; no more magic "we are ahead on the process node" bullshit.
That's true, which is why I'm hoping Intel will put more effort into IPC improvements going forward rather than relying on a historical node advantage as a crutch. It's the inverse of Nvidia's Turing architecture battling and mostly beating AMD's RDNA-based products despite being on an inferior process.

>Nvidia has sat on their ass for so long that, yeah, I understand Intel could catch up. Intel could be a force that slots in between AMD and Nvidia, but this will trigger a price war that Nvidia and Intel can't play ball with.
Let's not kid ourselves, the GPU market has been in desperate need of an old-fashioned price war ever since Maxwell launched.

>On a side note about integer scaling: what does that mean in the real world? Would it be a GPU effect that has to be enabled in games, or is it something I can force? I don't like fullscreening games, but I have a few that max out at 720p or 1080p windowed; it would be nice to play them at larger resolutions windowed.
It's forcible even when there is no in-game support, which is why it made the news.
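Conceptually it's just whole-number nearest-neighbour upscaling: each source pixel becomes an exact N×N block of identical pixels, so a 720p frame maps onto 1440p with no interpolation blur. A minimal sketch of the idea in Python (assuming NumPy; the function name is made up for illustration):

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour, whole-number upscaling: repeat every pixel
    `factor` times along both axes; colour channels are left untouched."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 1280x720 RGB frame doubled to 2560x1440.
frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)
frame_1440p = integer_upscale(frame_720p, 2)
print(frame_1440p.shape)  # (1440, 2560, 3)
```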

I hope they get 256-color modes
and native character modes.

Quite funny that Koduri uses the same tactics for Intel that he used for Vega.

Are you fucking retarded?

Nvidia has been releasing new GPU architecture after new GPU architecture

Turing supports features that most other GPU vendors don't even offer, like mesh shaders, real-time ray tracing, and variable rate shading tier 2.

Nvidia has been innovating constantly

devblogs.nvidia.com/introduction-turing-mesh-shaders/

nvidia.com/en-us/geforce/news/nvidia-adaptive-shading-a-deep-dive/

80 billion? What the fuck are you smoking, tard? They have under $30 billion in debt and over $10 billion in cash on hand. They're in a better place than AMD.

So is this the reason Intel CPUs have stopped progressing since 2015?

They threw all their weight behind Xe?

They also have the NSA and its (((allies))) to keep them afloat no matter what.

First GPU with a backdoor and hardware-level telemetry?

They stopped progressing because (((Intel))) couldn't manage to steal any more government secrets to use.

There are two reasons Intel stalled. One is that they had no competition after Sandy Bridge so they steadily increased the profit margin on all their products. The other is that 10nm fell through so they had nothing substantial ready after Skylake. That's why we've gotten so many Skylake refreshes.

Yeah, don't buy hardware, ever! Just die waiting.
Upgrade every few years based on price/performance; you can't go wrong. Anyone who gives a fuck about brand is literally an idiot.

>implying Intel is going to be cheapskates like AyyyMD
>implying Intel has real 10nm for this thing
I want to hear what 3rd-party manufacturers have to say about these; they've been in AyyyMD's and Nvidia's pockets for eons.

The enemy of my enemy is my friend, so AyyyMD and Nvidia will do their best to keep Intel out of the GPU market.

>Chiller not included

Snake oil always looks good until you use it.

Hope there's a massive price war so I can buy a fast cheap Nvidia card again.

You don't really believe that Intel is anywhere near Nvidia in terms of performance on their first try, do you?

I want Nvidia because CUDA.

>Xe
>naming your GPU after an awkward alternative pronoun

Does this take the security patches into account?

You have been visited by Vixen Dyatlov.

This thread is currently reading 26 replies (not great, not terrible).

Attached: Laura.jpg (1275x637, 107K)