Muh ai hair

wccftech.com/nvidia-gpus-ai-rendering-power-hairworks-aaa-games/

Attached: Screenshot_20180708-092933.jpg (1080x1193, 486K)

You mad AMDloser poorshit?

Just like their tessellation shenanigans, it won't look any better, but it will cripple their GPUs less than AMD's.

nvidia closing in hard on the furry market segment

>wccftech.com/nvidia-gpus-ai-rendering-power-hairworks-aaa-games/

No allegiance here: what kind of forecast does this news bring for AMD and Nvidia?

>hairworks
work on that ray tracing.
fucking chink.

At first I thought this would be a tool for developers to take 2D images and make hair models that would be rendered normally in-game.
As in, it just generates the hair for implementation.

It is all marketing nonsense. Nvidia is trying to sell the non-gayming aspects of its Volta architecture to the gayming crowd.

No, it's cock-blocking AMD, and Intel to a lesser degree. The usual anti-competitive tactics. Nvidia are worse than Intel by a mile.

>AI powered hairworks
I don't want gimmicks like that. I want more pixels and more frames per second

>technological advancement is bad because chink company #1 who developed it is not my favorite chink company #2 or #3

Can't we get good open drivers for Linux instead?

No you will get AI powered hair simulation and YOU WILL LIKE IT GOD FUCKING DAMNIT!! Now bend over and pay up you cuck

>creates a technology that will cripple our entire lineup except the high end
>Pay companies to put it in their games
>In the end, no one uses it
>Framerate is still what most people target while balancing graphics options
>It will only be used by 10 people playing old games, with a 2xxx-generation card that can handle it like nothing

It's PhysX all over again

What is the successor of PhysX?

Attached: 1518044601283.png (1026x746, 58K)

There is no direct successor; it's dead proprietary tech. I bet that HairWorks is a spin-off/rebranding.

Nvidia is trying so hard to be the king of proprietary tech that dies and then gets rebranded. G-Sync is following the same path, but worse, because there is an upfront payment of $500 for the "module" that goes in the monitor. Soon it will die because of the wider adoption of FreeSync, and in 5-8 years they will rebrand it, cook up some other snake oil to put in, maybe without the module.

More like next-gen AI to BOOST my ETH hash rate

As long as Nvidia can keep that 75%+ market share, G-Sync will exist.
G-Sync is the superior option as it uses hardware, not shit software, to solve the problem.
Kinda like Nvidia using software async compute vs. AMD's hardware scheduler.

I agree, a lot of people will pay for the premium option that is G-Sync, which will probably repay the dev + research time. The model they are proposing is good from an economic standpoint.

But I have to argue that the FreeSync model has a better chance of wider adoption, not because it is better, but because there is almost zero adoption cost for the consumer (the only one would be the GPU price) and for the monitor brand, and maybe with software mods and shit consumers could make non-compatible monitors compatible (this is only a supposition, there is no info on such a mod), probably with very little performance setback.

The price to pay with FreeSync is having to use a power-hog AMD GPU that's always 3rd best.

If AI started to make games, feed it some GTA 5, Witcher 3, and some Illusion porn games, then I would be impressed.

Oh, it can somewhat make the game look slightly better? Literally don't even give a fuck.

Attached: 1531005764655.jpg (640x854, 222K)

>AI
>machine learning
Well, retards think that even simple regression should be considered "machine learning".
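For reference, here is roughly what that "simple regression" amounts to, as a minimal NumPy sketch with made-up data (not anything from Nvidia, purely illustrative):

# Ordinary least squares on fake data; numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)              # fake feature
y = 3.0 * x + 2.0 + rng.normal(0, 1, 100)     # fake target with noise

# Fit y ~ w*x + b with a closed-form least-squares solve.
A = np.column_stack([x, np.ones_like(x)])
(w, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"fitted: y ~ {w:.2f}*x + {b:.2f}")     # recovers roughly 3*x + 2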

Power hog? Well, depends. Vega, yes; the RX line, not so much. And only if you pay a lot of shekels will you get access to G-Sync (even more if you wanna have the best HairWorks Jewery experience), and at that point you probably don't care about power consumption. Your argument has a good base, up to a certain point.

>AI is basically linear function crunching
>new nvidia can compute linear functions
Wow, what an innovation. Maybe soon we'll have CPUs emulating Turing machines.
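To be fair, the greentext is not far off on the math. A rough sketch (NumPy, arbitrary shapes and values, not Nvidia's actual code) of what one such layer computes:

# One dense neural-net layer: a matrix multiply plus a bias (the "linear
# function crunching"), with a nonlinearity on top. Shapes/values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 64))         # one input vector
W = rng.normal(size=(64, 32))        # layer weights
b = np.zeros(32)                     # layer bias

linear = x @ W + b                   # the linear part
activated = np.maximum(linear, 0.0)  # ReLU, the only non-linear bit
print(activated.shape)               # (1, 32)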