THANK YOU BASED NVIDIA

videocardz.com/77895/the-new-features-of-nvidia-turing-architecture

THANK YOU BASED NVIDIA

>new NVDEC decoder with HEVC YUV444 10/12b HDR, H.264 8K and VP9 10/12b HDR support.

Attached: NVIDIA-TURING-TU102.jpg (1979x1150, 456K)
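For anyone wondering what that list actually means in practice: the way you ask the driver whether a given codec/chroma/bit-depth combo is decodable in hardware is cuvidGetDecoderCaps from the Video Codec SDK. Below is a minimal sketch of that query, assuming the nvcuvid header and CUDA driver API are installed and you link with -lcuda -lnvcuvid; nothing Turing-specific, it just probes whatever GPU 0 reports.

/* Minimal sketch: probe NVDEC for the formats in the OP's list.
 * Assumes the NVIDIA Video Codec SDK header (nvcuvid.h) and the CUDA
 * driver API are available; build with: gcc probe.c -lcuda -lnvcuvid */
#include <stdio.h>
#include <string.h>
#include <cuda.h>
#include <nvcuvid.h>

static void query(cudaVideoCodec codec, cudaVideoChromaFormat chroma, unsigned bit_depth)
{
    CUVIDDECODECAPS caps;
    memset(&caps, 0, sizeof(caps));
    caps.eCodecType      = codec;
    caps.eChromaFormat   = chroma;
    caps.nBitDepthMinus8 = bit_depth - 8;

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS)
        printf("codec %d, %u-bit, chroma %d: %s (max %ux%u)\n",
               (int)codec, bit_depth, (int)chroma,
               caps.bIsSupported ? "supported" : "not supported",
               caps.nMaxWidth, caps.nMaxHeight);
}

int main(void)
{
    CUdevice dev;
    CUcontext ctx;

    /* cuvidGetDecoderCaps needs a current CUDA context */
    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    query(cudaVideoCodec_HEVC, cudaVideoChromaFormat_444, 12); /* HEVC 4:4:4 12-bit */
    query(cudaVideoCodec_H264, cudaVideoChromaFormat_420, 8);  /* H.264; the 8K claim shows up in nMaxWidth/nMaxHeight */
    query(cudaVideoCodec_VP9,  cudaVideoChromaFormat_420, 12); /* VP9 12-bit */

    cuCtxDestroy(ctx);
    return 0;
}

ffmpeg's cuvid/nvdec decoders run this same capability check internally, so whatever this reports is roughly what your player will actually get.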


i don't fucking care about this gay shit, i just want Fortnite to run at 6000000 fps

hnnggg based

yes,
thank you for charging 600 bucks for an entry-level 2070, which should've been called a 2060 by all accounts.

Attached: 084ecd9824.png (948x317, 460K)

the 2060 became a 2070 while the new 2070 also got a price bump of just 221 bucks.
I wonder if there's a trend there?
probably nothing to see.
We should all just give thanks to leatherjacketman and keep playing it the way it's meant to be played.

>leatherjacketman
He certainly is playing his customers the way he wants them to be played.

FUCK YOU AMD, THIS IS ALL YOUR FAULT

>no mention of raytracing
I still haven't seen any explanation whatsoever, even high-level, of how raytracing is actually implemented in hardware.

wait guise, the angle here is that we're basically getting three 1060s' worth of graphical performance for the price of another 1060 stacked on top of a 1070. So the xx70 tier gets a stack downgrade while we effectively pay for an impossible xx60+xx70 SLI at roughly three times the price of a 1060? It's simply that the more you buy, the more you spend, so you spend more money to buy more xx60s; but they're so damn good now that we should call them a 2070, and that justifies the premium for what is really an entirely new low-end entry-level chip.

Attached: a69035538e.png (1686x404, 95K)

600 dollars for a x06 class chip.
which sold out on preorders before any reviews, to boot.
tell that to your chibi self from the mid-2000s and see what he thinks of it.

>no AV1 support
It's already outdated.

AV1 only just finalized this year while Turing chips would have taped out months ago

>1070 = GP104-200-A1 ($379)
>2070 = TU106 ($599)


/thread

Why do you fucking care about the decode capabilities of a GPU unless it's for mobile shit.

repeat after me guise:
>the higher you push the price on halo flagships, the more you can downgrade your mid-range stack while also artificially inflating prices across the entire SKU stack.

Attached: nvi12_a.jpg (840x485, 75K)

wowee, still no reason to upgrade my 280x

where's the Hi10p support nvidia

Attached: 1532115659571.jpg (640x640, 36K)

which is so fucking accurate it hurts.
the 2080ti is 1.2k, titan prices.
the 2070 has a 106 gpu, for xx80 prices.
the 2080 has ti prices, and is now cut to the level of what a xx70 had always been.

Frankly I don't believe they'll even bother with a Turing 2060; they'll probably just bump a Pascal 104 into a refreshed 2060, some "GP204" or whatever. Considering the rumors about their inventory overflow, it kinda makes sense.

Only in Mali-V76.
:^)

In the same locked cabinet under leatherjacketman's table, together with HDMI 2.1, to keep the next wave of TVs from cannibalizing the new G-Sync modules that will accompany their soon-to-be-released BFGD displays.

Stop kvetching and KEEP BUYING NVIDIA

Enjoying that Kekler purchase?

>decoder
woww nobody gives a shit

It's not; they throw some sort of blurring filter over a half-assed ray-traced image, and a dedicated part of the chip, the "tensor" cores, works that filter out in real time. It doesn't actually do RT properly.
youtu.be/EFLimT4ik_8?t=2026
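For what it's worth, the work the "RT core" is claimed to take off the shaders is the per-ray grunt work: walking a bounding volume hierarchy and intersecting triangles, over and over for every ray, before any shading or denoising happens. Here's a rough illustration of those two tests, nothing Turing-specific, just the textbook slab test and Möller-Trumbore written out in plain C:

/* Sketch of the two operations an RT core supposedly does in fixed function:
 * ray vs. axis-aligned box (a BVH node) and ray vs. triangle.
 * Plain C illustration only; shading and denoising are separate steps. */
#include <stdio.h>
#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } vec3;

static vec3  vsub(vec3 a, vec3 b) { return (vec3){ a.x - b.x, a.y - b.y, a.z - b.z }; }
static vec3  vcross(vec3 a, vec3 b) { return (vec3){ a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float vdot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Slab test: does the ray hit a BVH node's bounding box? inv_dir is 1/direction per axis. */
static bool ray_aabb(vec3 o, vec3 inv_dir, vec3 bmin, vec3 bmax)
{
    float t1 = (bmin.x - o.x) * inv_dir.x, t2 = (bmax.x - o.x) * inv_dir.x;
    float tmin = fminf(t1, t2), tmax = fmaxf(t1, t2);
    t1 = (bmin.y - o.y) * inv_dir.y; t2 = (bmax.y - o.y) * inv_dir.y;
    tmin = fmaxf(tmin, fminf(t1, t2)); tmax = fminf(tmax, fmaxf(t1, t2));
    t1 = (bmin.z - o.z) * inv_dir.z; t2 = (bmax.z - o.z) * inv_dir.z;
    tmin = fmaxf(tmin, fminf(t1, t2)); tmax = fminf(tmax, fmaxf(t1, t2));
    return tmax >= fmaxf(tmin, 0.0f);
}

/* Moller-Trumbore: does the ray hit this triangle, and at what distance t? */
static bool ray_triangle(vec3 o, vec3 d, vec3 v0, vec3 v1, vec3 v2, float *t)
{
    vec3 e1 = vsub(v1, v0), e2 = vsub(v2, v0);
    vec3 p = vcross(d, e2);
    float det = vdot(e1, p);
    if (fabsf(det) < 1e-8f) return false;           /* ray parallel to the triangle */
    float inv = 1.0f / det;
    vec3 s = vsub(o, v0);
    float u = vdot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    vec3 q = vcross(s, e1);
    float v = vdot(d, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    *t = vdot(e2, q) * inv;
    return *t > 0.0f;                                /* hit must be in front of the origin */
}

int main(void)
{
    vec3 o = { 0.2f, 0.2f, -1.0f }, d = { 0.1f, 0.1f, 1.0f };
    vec3 inv_dir = { 1.0f / d.x, 1.0f / d.y, 1.0f / d.z };
    vec3 bmin = { 0, 0, 0 }, bmax = { 1, 1, 1 };
    vec3 v0 = { 0, 0, 0.5f }, v1 = { 1, 0, 0.5f }, v2 = { 0, 1, 0.5f };
    float t = 0.0f;

    printf("box hit: %d\n", ray_aabb(o, inv_dir, bmin, bmax));
    printf("tri hit: %d (t = %f)\n", ray_triangle(o, d, v0, v1, v2, &t), t);
    return 0;
}

A real traversal loop uses the box test to decide which BVH children to visit and the triangle test at the leaves, millions of rays per frame; Nvidia's pitch is that Turing moves that loop into fixed-function hardware (hence the "gigarays" number), while the tensor-core denoiser is what papers over the handful of rays per pixel you can actually afford.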

I wonder how bad things will get when we remember that the based HD 5870s were being outsold by that POS GTX 480.

Attached: Add-in-Board-GPU-Market-Share-2002-to-Q2-2018.png (1486x555, 84K)

on-demand-gtc.gputechconf.com/gtcnew/on-demand-gtc.php?searchByKeyword=&searchItems=&sessionTopic=&sessionEvent=11&sessionYear=2018&sessionFormat=&submit=&select=

Turing is between 6 and 8 times faster at ray tracing, according to the Arnold GPU render talk.

Do not reply to the AMDrone. You will only hurt yourself.

dig up from the archives that thread with the Ray Tracing discussion right after Turing was announced.
Apart from the denoiser and DLSS, Vega apparently won't be that far behind even in the new gigaray metric Jensen Huang pulled straight out of his ass. I hope we get to see some apples-to-apples comparisons without driver lockout -- I'm on the fence on that one, since on one hand it's a hybrid DX12+DXR thing, on the other it could very well be restricted by the developers themselves to detect and run only on Nvidia hardware. That 6-to-8x figure was most likely exaggerated even compared against Pascal itself, let alone Vega, which is quite capable at compute tasks.

>t. the guy whose shilling thread failed miserably.

Jensen didn't say anything about a "rayworks", and given Microsoft's involvement with DirectX, I think ray tracing will be vendor neutral.

graphics.cg.uni-saarland.de/papers/woop-2006-rt-asic-design.pdf

Nvidia bought an RT hardware company spun out of the University of Utah back in 2008.

tensor cores and RT cores are a waste of space

Attached: TURING-SM.jpg (1200x1640, 151K)

The infographic isn't to scale.

>it's not
In that case, what does the "RT core" part refer to?

THANK YOU, BASED NVIDIA! RTG ON SUICIDE WATCH WITH THEIR SHIT HYBRID DECODE.

>>new NVDEC decoder with HEVC YUV444 10/12b HDR, H.264 8K and VP9 10/12b HDR support.

congratz nvidia you finally reached amd

Yes, these cards are seriously overpriced. I guess people are somewhat used to it since the previous generation was also overpriced and GPU prices have been generally high for quite a while. NVidia is making money, but this does hurt the PC gaming market. Why the hell would you buy a PC if you're just interested in playing games? A PlayStation will cost you less than a high-end GPU, and you need the rest of the components on top of that.

Yes. It really is. Look how Intel magically decided that 4 cores weren't enough after all once AMD put Ryzen on the market. There are no other GPU players (though Intel is supposedly planning something), so it's up to AMD to put something out there that could force NVidia to lower prices. AMD's got nothing, especially right now: what would you rather do, buy a 1080 or pay $120 more for a Vega 56? Buy a Vega 64 or pay $50 more for a 1080 Ti? NVidia's large stock of last-generation chips has made prices of those come down a bit, while AMD's products are still overly expensive. And even though NVidia's stuff is cheaper now, it's still vastly overpriced compared to what a GPU should cost. This ain't going to change unless AMD somehow manages to put out something that matches the 1080 Ti at the price of a 1070.

Hardware decoding really does matter for mid/low-end GPUs and APUs. If you're buying an RTX 2080 then you're probably going to put it in a system that's capable of CPU-decoding everything. But here's a nice use case where it matters: I use an old Athlon 5150 for my HTPC. It's that really cheap, simple SoC AMD made some years back: a quad-core 2 GHz APU with two SATA ports and somewhat limited connectivity overall, but it's fine for 1080p video. That CPU can't handle 1080p HEVC and it sure as hell ain't capable of 4K. So I put an RX 560 in it (it happened to be on sale at RX 550 prices) and now that box can play 4K HEVC just fine. Nobody would pair an Athlon 5150 with an RTX 2080, but it can make a difference at the low end.

I can get a PowerColor Red Dragon Vega 56 for 399. It's over.

>399
399 what? You can clearly get one for 399 pieces of silver but it sure ain't worth anywhere near that much

is that like when I bought my 6800GT and none of this ever worked?

Buying a GTX 480 over a 5870. Man, some people are fucking stupid.

The 5870 was like one of AMD's best executed architectures, and the GTX 480 was one of Nvidia's worst.

Won't the RT cores make these super attractive for professionals and movie studios? I would assume this would greatly decrease render times for 3D graphics.

And you can get a 1080 for 460 cuck dollars which is better value.

Vega was a flop just like bulldozer and anyone can come to that conclusion themselves just by looking at the benchmarks, let alone the price.

AMD have a chance to make vega great again with 7nm, but they're focusing on the professional market for that and not consumers.

So basically bow down to leatherjacketman and pray he does not increase prices further.

>THANK YOU BASED NVIDIA
WE'VE HAD THESE THREADS TEN THOUSAND TIMES AND YOU IMBECILES STILL RAPE THE ENGLISH LANGUAGE BY OMISSION OF THE VOCATIVE CASE COMMA.
CORRECT: "THANK YOU, BASED NVIDIA."

LOL NO

The GTX 480 is faster and still gets drivers; 391.35 WHQL gives Fermi OpenGL 4.6 and WDDM 2.3 support.

anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy

AYYMD gives you the middle finger if you bought HD5xxx/6xxx

>AYYMD gives you the middle finger if you bought HD5xxx/6xxx
The Radeon HD 6800 series was released on October 23, 2010.
But that is 5 years, which is legacy territory in tech. What are you on?

So it can natively do 4:4:4 and 12-bit output?

Don't you still need those 16 bit 3DLUT monitors?