Meme-free zone

Honestly? I don't see the value in buying into RTX right now. And I'm actually kinda bummed about it.
>games coming out 1-2 years from now won't use raytracing in any meaningful way; instead it'll just be slapped on and exaggerated
>Nvidia (deservedly) will keep this closed source; IF AMD decides to try raytracing too it'll be with an open standard (FreeSync, anyone?), but Nvidia won't allow their cards to use it, rendering it useless
>developers will INEVITABLY choose only one of the standards, killing AMD in the process
>Nvidia gets a monopoly and AMD is once again only bought by poorfags, Nvidia cards get even more expensive
>literally only one upcoming game uses it in a meaningful way (Metro, for real, realtime GI)
>$1299 (+tip)

Attached: NVIDIA-GeForce-RTX-2070-Feature-2.jpg (1603x803, 247K)

Other urls found in this thread:

blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/
en.wikipedia.org/wiki/Algorithm

cool blog

>t.buttblasted nvidiot shill

Basically just like half-precision in AMD hardware.
The majority of games are made for consoles anyway, and consoles don't have any ray-tracing hardware. So yeah, probably DOA.

gsync doesn't work at all for amd cards
and next gen consoles are right around the corner, probably being announced sometime next year. they might have partial raytracing like we're seeing with the new cards

>tfw 1060 6gb
>tfw 1080p 75hz screen
>tfw game on medium to low GFX settings
>tfw completely content with my GPU.

>970
>1440p144 monitor
just fuck my shit up

Keep an eye out for a 1080 or w/e; you might get a brainlet dumping his card for the new version

Console garbage is not relevant

Ray tracing will be widely adopted and there's nothing you can do about it

never buy the 1st gen of any tech.
will be buying a used 10 series card since they're so cheap, debating between a 1060 and a 1070.

If I can't do anything about it and it will be widely adopted, doesn't that mean it will be good? I don't have anything against better technology.

Well, this is Nvidia, so it's probably closed source. AMD will probably follow Nvidia with Navi and open-source their ray-tracing version.

incidentally, killzone 4 on console already uses ray-traced reflections. Nvidia is literally selling soft shadows at a ridiculous price.

Why is ray tracing such a big deal?

Since all consoles use AMD GPUs won't the open standard prevail?

so nvidia can justify raping its customers in the ass with shattered glass.

Man I don't know what to think about all of this. It seems weird that they didn't brag about the gaming performance of these cards. Usually at these events we get a bunch of graphs boasting about the performance gains over the old cards and their competition. This time it was a bunch of impressive looking charts with *raytracing performance at the bottom. Also fuck the prices.


I'm not sure if I should hold out for a 2070 or just buy a used 1080ti for $500 right now.

it's literally a cheap cash grab before 7nm hits, just like some anons said
they didn't even mention performance gainz over the previous gen, well aside from Raytrace per Secondz

Radeon has raytracing, it's written in OpenCL. The GPUs are just not powerful enough for realtime raytracing.
Though leatherman will fuck both amd and nvidia customers again with gimpworks because they locked their (((hardware acceleration))) to RTX cards only.

>Radeon has raytracing, it's written in open cl.
not memetracing with hybrid mode
and of course it'll be locked to nvidia cards

AMD has better sync and better hair rendering. They also promote modern graphics APIs and a competitive free physics engine. They were first to release 3D sound (also free), though novideo's proprietary counterpart flopped too. None of those technologies prevailed because novideo has larger mindshare and they simply boycotted free standards, and since they have the larger market, developers don't want to spend resources on supporting something novideo boycotts. And consoles didn't help either, because when games are ported developers focus on nvidia and won't even use something already implemented in consoles if nvidia doesn't support it.

game developers (aside from a few whom nvidia pays) will not make ray tracing mandatory for many years; most gaymers are poorfags, and it does not make sense to make your game only playable on a gpu most people won't have.
Also note that previous and accurate leaks have stated that the lower-end cards, which the majority of people buy, are still on the GTX line, i.e. no ray tracing.
not even the RTX 20xx series will be able to run ray tracing at a reasonable frame rate anyway

Attached: dontgetjewed.jpg (600x600, 29K)

Why is ray tracing so demanding?

Battlefield 5 will have raytracing before the cards are even released. OP BTFO

To jack up card prices, goy.

it has to do pic related for every beam of light, for every object in the scene
it's a lot of vector operations

Attached: Ray_Tracing_Illustration_First_Bounce.png (960x1078, 233K)
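To make the "lots of vector operations" point concrete, here's roughly what one ray has to do against one sphere. A toy sketch only, nothing to do with how the RT cores actually work (real scenes use triangles and BVHs; this is just the textbook quadratic):

#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for the nearest t > 0.
// This dot-product-heavy math runs for every ray against every candidate
// object (or BVH node) it might hit, which is where the cost comes from.
bool hitSphere(Vec3 o, Vec3 d, Vec3 c, double r, double &t) {
    Vec3 oc = sub(o, c);
    double a = dot(d, d);
    double b = 2.0 * dot(oc, d);
    double k = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * a * k;
    if (disc < 0.0) return false;           // ray misses the sphere entirely
    t = (-b - std::sqrt(disc)) / (2.0 * a); // nearer of the two roots
    return t > 0.0;                         // only hits in front of the origin count
}

int main() {
    double t = 0.0;
    // Camera at the origin, ray straight down -z, unit sphere 5 units away.
    bool hit = hitSphere({0, 0, 0}, {0, 0, -1}, {0, 0, -5}, 1.0, t);
    std::printf("hit=%d t=%.2f\n", hit, t);  // expect hit=1 t=4.00
}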

AMD will be using the DirectX Raytracing API.
Why do you think Microsoft and Nvidia went this route?
Because Nvidia went balls deep into this technology and dedicated a lot of silicon die to it.
If it fails then that's Huang's head on a platter.

It's Microsoft's job to ensure that raytracing is accessible to both AMD and Nvidia. AMD doesn't have much choice but to follow suit.

Why shouldn't I just buy 2x 1080TI's for the price of a single 2080TI? What were they thinking?

think of rasterization as taking an image, performing an affine transform on it, and then figuring out whether it should be in front of, behind, or in between another image.

think of raytracing as throwing a few hundred rocks at a pixel and seeing how many other things you can hit with that rock
then multiply that for however many pixels you have.

I hope the difference is obvious now.
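Putting that analogy into numbers, a hand-wavy cost sketch (the scene figures are completely made up, just to show why the loop structure is so much heavier for ray tracing):

#include <cstdio>

// Hand-wavy cost comparison: rasterization touches each triangle roughly once,
// ray tracing fires several rays per pixel and each ray has to search the scene.
int main() {
    const long pixels    = 1920L * 1080L;  // 1080p
    const long triangles = 1000000L;       // rough scene size (made up)
    const long raysPerPx = 4;              // primary + a few shadow/bounce rays (made up)
    const long bvhSteps  = 30;             // nodes/triangles tested per ray (made up)

    long rasterWork = triangles            // project/transform each triangle once...
                    + pixels;              // ...then shade each covered pixel once
    long rayWork    = pixels * raysPerPx * bvhSteps;  // every ray walks the scene

    std::printf("rasterization ~%ld units, ray tracing ~%ld units (~%.0fx)\n",
                rasterWork, rayWork, (double)rayWork / rasterWork);
}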

So basically it's like simulating actual photons and how they physically interact with things?

it will be same as with PhysX

>yo negro, game werks fine but buy new nvidia card to have those flying coats muh nigga

They're thinking that you may want to make use of hardware accelerated AI inference or hardware accelerated raytracing in the near future, something that doesn't exist on the previous generation.

This is a much bigger difference than Nvidia pushing crud like GSync or moar RAM or double-precision float computation; this is literally several orders of magnitude more performance for specific use cases, one of which is super fucking hot and not going away anytime soon.

>AMD has better sync
Stopped reading here. Their sync is dogshit.

Can someone explain to me why the AIB partners are charging FE prices for their cards? I haven't bought a new card at launch in a long time, but back in my day you could actually get cards day 1 for MSRP.

Simulating photons in reverse.
Instead of shooting photons from a light source, we are taking the narrow subset of photons which reach our camera and back-tracing them to a light source.

This light source can be an actual light, another surface, another surface reflected onto another surface, a transparent surface (e.g. leaves or fog), or even nothing (atmosphere/background)
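Same idea as a few lines of toy code: start from a camera ray and chase it backwards until it ends at a light, the background, or the bounce budget runs out. Everything here (intersect, emitted, bounce) is a made-up placeholder, not any real renderer or API:

#include <cstdio>

// Toy "scene" using object IDs instead of geometry: 0 = nothing (sky),
// 1 = a light source, 2 = a reflective surface. All placeholders.
int    intersect(int ray)  { return ray % 3; }                  // what does this ray hit?
double emitted(int object) { return object == 1 ? 1.0 : 0.0; }  // only lights emit
int    bounce(int ray)     { return ray + 1; }                  // follow the reflected ray

// Backward tracing: a ray leaves the camera and we keep following it
// until it reaches something that actually emits light (or we give up).
double trace(int ray, int depth) {
    if (depth == 0) return 0.0;                  // bounce budget exhausted
    int object = intersect(ray);
    if (object == 0) return 0.05;                // hit nothing: dim background
    if (object == 1) return emitted(object);     // traced back to a light source
    return 0.8 * trace(bounce(ray), depth - 1);  // hit a surface: keep tracing backwards
}

int main() {
    for (int px = 0; px < 5; ++px)
        std::printf("pixel %d brightness %.2f\n", px, trace(px, 4));
}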

it's far away from that

ALL THE LEAKS ARE WRONG ITS ONLY 3000 PAYMENTS
SO WE INVERSE RAY TRACING WE ARE TRACING RAYS THROUGH THE EYE THERES NO REASON TO TRACE THE LIGHT IN TRIANGLES WHEN YOURE REFLECTIONS
AND THATS BECAUSE OF OUR GIGAJIGGARAYS PER SECOND
BUT THEN OF THE TEN BOXES OF TEN BOXES I CAN IGNORE THE ONE BOX WITH THE OTHER BOX IS ANOTHER TEN SMALL BOXES OF MY TEN TEN EIGHTY TEA EYE BOXES OF TEN BOXES WITH BOXES OF TRIANGLES AND BOUNDING BOXES SO WE KNOW WHAT BOUNDING BOXES SO WHICH ONE OF THE TRIANGLES IS IT
ITS VERY QUICK MATHS
WHICH IS REALLY REALLY HARD PARALLELISM WHERE WE CREATE INCOHERENT THINGS AT THE SPEED OF LIGHT
GOD THIS IS HEAVY DEEP LEARNING

Attached: 1534795878209.png (1199x773, 559K)

What monitor? I have a VG278Q and I'm looking for a good 1440p upgrade.

>better sync
kek, no.

>ray tracing
I know someone who did realtime ray tracing at more than 60 fps at 4k on an R9 Nano card. He wrote it in GCN asm and it worked very well.

Doing it in a way that actually looks good is difficult because things like light diffusion are not efficient at all if you only do ray tracing.

for those prices they can fuck themselves. they are doing a favor to miners that have to unload.

calculating raytracing on its own and doing it alongside a hundred other things needed to render shit are 2 different things

There's a fucking sticky

oh no condense your shit we need more room on this board for garbage generals like my fucking desktop thread
go fuck yourself

Why the fuck does this not have HDMI 2.1?

i'm disappointed with their pricing.

No doubt it's going to be a better GPU, but 2x better than what a 1080 is now selling for?
For people playing at 1080p the older gen might be the best value

Literally every AAA title in the early 10s was being held back by the 7th gen's outdated hardware. You really think it won't make a difference that 8th and likely 9th gen is all AMD-based?

why do you need hdmi 2.1 when your market will be scientists

same, was a dumb decision cause mine crashes on certain games.

the rumors were that 2080 would be out in spring so I was planning on that, so I've been limping along till now

Will we see a new Crysis-tier game to show off the power of this card?
Seems like nobody really tries to push the edge of the envelope on PC these days.
I hope people who buy it get something fun to show off using it.

>Crysis-tier game to show off the power of this card?
No. Ray tracing is a meme. For all the power it needs to calculate shit, the result is barely noticeable. DirectX 9 era games are good enough and DirectX 10 was the peak. From the 10s onwards graphics went downhill.

t. almost boomer

ok

Attached: hmmmm.jpg (741x568, 73K)

People told me DX12 would be the second coming of Christ, but all it seems to do is make my computer run hotter for the same quality of graphics.

A lot of what Crysis is known for is doing things that other games still don't do, even today over 10 years later.
Graphics are only a part of it.

Those devs will really have their work cut out for them if they want to match Crysis in terms of interactivity and features.

Fancy area lights and shadows are something we've had for a long time now, since CryEngine 3; the children that are the target audience of most games nowadays won't even notice.

better off seeing how the 2070 shapes up then going from there. if it gets anywhere near 1080 numbers then it sounds like it'll be worth nabbing
>or at the very least, it'll push pascal prices down

Take a look at the API yourself. It's a clusterfuck, and DirectX 11 was somewhat better even though the performance differences are questionable. What is worse is that they got rid of DirectInput and introduced XInput: Pajeet Edition, and I guess DirectSound is also obsolete. What was good, though: in DX11 you needed DX10 elements to render text (DWrite); DX12 improved that.

What about Star Citizen? I read an article that said they were designing the game targeting the specs of a top-end computer in 2025.

>Nvidia (deservedly) will leave this to be closed source, IF AMD decides to try and do raytracing too it'll be with an open standard (FreeSync, anyone?) but Nvidia won't allow their cards to use it rendering it useless
>developers will INEVITABLY choose only one of the standards, killing AMD in the process
>Nvidia gets a monopoly and AMD is once again only bought by poorfags, Nvidia cards get even more expensive
These do not apply.
The libraries and SDKs used will be DXR (DirectX Raytracing), Vulkan and whatever other API. Nvidia is only providing hardware acceleration for those and helping to further optimize them for their platform, but by nature they can't simply lock AMD out of it.
The algorithm and specialized techniques are agnostic and hybridized by default.
blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/
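For what it's worth, the vendor-agnostic part shows up right at the API surface: you just ask D3D12 for the device's raytracing tier and it doesn't care whose silicon answers. A rough sketch of that check, assuming a Windows 10 SDK new enough for DXR and linking against d3d12.lib (untested here; details per the DXR announcement above):

#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Ask D3D12 whether the default adapter supports DXR. The check is the same
// on any vendor; whether it's backed by RT cores or a compute fallback is the
// driver's problem, not the application's.
int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        // D3D12_RAYTRACING_TIER_NOT_SUPPORTED == 0; anything higher means DXR is exposed.
        std::printf("raytracing tier: %d\n", (int)opts5.RaytracingTier);
    }
    return 0;
}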

>Why is ray tracing so demanding?

Shadows are hard, you need a super GPU to make them blurry.

Will Vulkan introduce its own raytracing API? Is it already there, or will they maybe add it later, or not at all?

SC is a scam

Star Citizen is definitely going in the history books no matter what potato or absolute crown jewel they release

that's the physX way.
I hope that Nvidia doesn't fuck RT up like they did with physX.

Did any games really use PhysX for gameplay? Seems like it was always just for making big collections of small particles that bounced around.

For some minor eye candy

Novidia pascal can't async
>dx12 deded, dx 11 is enough for everything anyways
Novidia can RT
>drx will be the 2nd coming, Rt all the way choochoo muthafucka

can you spot the pattern?

>2025
baiting me softly

there were many actually. mirror's edge 1 was one of them. iirc about half the triple-A titles had it back then.

Because they can

Attached: f95cee97a9860dbfd790a6b3e9cd8abf636a00907843beaddb3cdab801e12c62.png (750x749, 369K)

It was hyped around the early 10s; games like Mafia 2 had it, for example, and other games from that era. Then it just died.

Maybe if you hadn't written it like an illiterate I would get your point.

about games: is it just me, or does dice always release a gameworks title for odd-numbered entries and a gaming evolved one for even?

Yep, sucks ass. I'm dabbing out of the yearly upgrade bullshit.
A Zen 2700X and a GTX 1080 should be enough for a very long time if I'm already getting 60fps at 3-4K and 100-200fps at 1080p.
Just want some good screens to come out.
These new RTX cards are practically useless in the current games everyone actually plays (3-4+ years old), and there are no real benchmarks yet.

And nvidia follows AMD's open source ray tracing by threatening devs to make their games run like shit on nVidia hardware unless they exclusively use nVidia's raytracing implementation.

I'm torn between going for a "cheap" 1000 series around black friday, or waiting for the 2070 to drop below £500

Though I'd rather wait and see real-world benchmarking before pulling any triggers.

take your time

That's not how this works.
Nvidia's advantage here is from hardware and drivers. The API itself is vendor agnostic.

>1050 on 1440p
Shit card. Can't even maintain 60fps on lowest setting.

Microsoft owns the API, not Nvidia. AMD has to implement it anyways if they want to claim to support DirectX.

could they add some feature that retroactively adds ray tracing to old games? I'd like to play some doom 3 with raytracing.

Attached: mpv-shot0063.jpg (917x693, 86K)

the most obvious raytracing benefits come squarely from area lights; you'd have to redesign the game's levels to include area lights, and I imagine most engines don't have support for them anyway.

You could realistically do raytraced Ambient Occlusion and global illumination. Mods for the usual set of Bethesda games are probably what you're going to see.
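Raytraced AO really is about the simplest retrofit imaginable: from each visible point, fire a handful of rays over the hemisphere and darken by the fraction that hit nearby geometry. A toy sketch of just that loop; occludedWithin() is a fake stand-in for a real ray/scene intersection test:

#include <cstdio>
#include <random>

// Fake scene query: pretend rays aimed low (grazing the floor) get blocked
// within the search radius. A real implementation would trace against the scene.
bool occludedWithin(double elevation, double radius) {
    return elevation < 0.3 && radius > 0.5;
}

// Toy ambient-occlusion estimate for one shading point: shoot N random rays
// over the hemisphere and count how many hit something close by.
double ambientOcclusion(int samples, double radius, std::mt19937 &rng) {
    std::uniform_real_distribution<double> elevation(0.0, 1.0);
    int blocked = 0;
    for (int i = 0; i < samples; ++i)
        if (occludedWithin(elevation(rng), radius))
            ++blocked;
    return 1.0 - (double)blocked / samples;  // 1 = fully open, 0 = fully occluded
}

int main() {
    std::mt19937 rng(42);
    std::printf("AO at this point: %.2f\n", ambientOcclusion(64, 1.0, rng));
}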

well doom 3's engine is open source right? Someone could add them in.

Ray tracing is an algorithm.
en.wikipedia.org/wiki/Algorithm

Nvidia can't use Intellectual Property or copyright to block AMD from performing mathematical calculations in their GPUs.

I suppose there's also ~proper~ reflections, but I imagine that you'd also have to do a lot of asset hacking to get it done.

that's exactly why nvidia went open source on material libraries.

>1060 6GB
>1440p144

fingers crossed 1070 Ti - 1080 Ti prices plummet hard and fast if enough good goys start snapping up overpriced RTX cards because OMG 6X FASTER AT MEMERAYS

but if they come with a very minimal performance bump over Pascal cards for this retarded price premium, I think word will get out pretty quick and they will have to start dropping prices on Turing.

1080s are selling for $420 rn

think it's going to drop much more? idk, especially with how expensive the new ones are

Seeing 1080 Tis for $620-650

Which card is best for 1440p144hz? I'm fine getting either Pascal or Turing.

I was looking forward to upgrading a 5 year old pc running a 970...but these prices are making it awkward. I have a 144hz 1440p monitor so I need to push higher frames to get that benefit. Would a used 1080ti maybe be the best middle ground? Or is that another half measure since 7nm cards may come out late next year?

I'm in the same situation lol, probably just gonna get a 1080 cause $420 is cheap and the new cards were kinda disappointing.

my 970 crashes some games on 1440p

They will probably keep dropping steadily as they have been, maybe a little slower since Turing launch prices are so high, but it all depends on benchmarks; once Turing is actually available, everyone will know what they stand to gain by upgrading, raytracing memes aside.

what do you guys think a realistic bottom price is for a 1080 and ti?

I'm tempted to pull the trigger cause my 970 ain't getting it done really, and i was waiting for 2080 but fugg those prices

The 2080 is only supposed to be marginally more powerful than a 1080ti right? I'd rather buy a new 1080 Ti than a 2080 if true.

hard to say until benchmarks come out but I wouldn't doubt that the 2080 is only marginally better

No one knows.
And leatherjacketman not providing any sort of relatable benchmark, in any known title, comparing relative performance between the generations didn't help either.

>I'm actually kinda bummed about it.
If I were a gamer I'd feel the same way, but I'm actually feeling super hyped. The 2080 ti has just as many tensor cores as the titan v and faster vram than the 1080 ti. The RTX cards will be excellent entry-level machine learning cards. My 1080 is terribly anemic, so I'm planning to upgrade to dual 2080 TIs.

Attached: 1512841953333.jpg (641x534, 80K)

>buying at launch
This is the best way to buy, unless you absolutely need bleeding-edge hardware because your career depends on it:
>buy xx80 series one year after release
>skip next generation or even the next two generations
>repeat step 1

well, if past benchmarks are any indication of what kind of jump in performance we're in for...

Expect the 2080 to be about 20% faster than a 1080TI and 40-50% faster than a 1080
You can likely bump the numbers to 40% and 60-70% respectively for the 2080TI

that would be my best guess

>not providing any sort relatable benchmark, in any known title comparing relative performance between the generations

This is what's making me think it's not a huge boost at all

How risky is buying used GPUs? Is it just down to "if it looks too good to be true it is", or do you have to really be looking at only the most well reviewed sellers?

>2018

>not owning a GPU with Ray Tracing

Enjoy your stuttering desktop animations

Nobody knows because nvidia told us jack shit in their two hour long marketing campaign.

But 3584 (1080 Ti) vs 4352 (2080 Ti) CUDA cores, albeit at a lower clockspeed, 1480 MHz vs 1350 MHz. Based on that and with a pretty significant increase in memory bandwidth because of GDDR6, I think it's going to be the same kind of 20%-30% performance increase that we usually see when there isn't a gigantic node shrink like from Maxwell to Pascal.
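Back-of-envelope on those base clocks, counting raw shader throughput only and ignoring architecture changes and the GDDR6 bandwidth bump: 4352 × 1350 ≈ 5.88M vs 3584 × 1480 ≈ 5.30M, so roughly +11% on paper, which is why 20%-30% once everything else is factored in doesn't sound crazy.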

All of this can be yours for 100% more money. Fucking nvidia.