Nobody needs ray tracing r-right?

nobody needs ray tracing r-right?
only scientists and data analysts need that

Attached: 1526823375320.jpg (552x661, 71K)

Other urls found in this thread:

youtube.com/watch?v=tjf-1BxpR9c
zhihu.com/question/290167656/answer/470311731
gpuopen.com/gaming-product/radeon-rays/
raytracey.blogspot.com/2018/07/nvidia-gearing-up-to-unleash-real-time.html
pbs.twimg.com/media/Dkq4XsTVsAAPJRN.jpg
youtube.com/watch?v=Vdnwrt3Xdak
youtube.com/watch?v=jY28N0kv7Pk&t=1143

Got a useless Vega 56 sitting on the shelf as an ornamental decoration lol
Gonna skip the GTX/RTX bullshit tho, 12nm 300W cards are a joke. Just wait for 7nm next year and 5nm in early 202x
Apparently MCM GPUs are gonna be a thing, as fuck-hueg dies can only get so big before Nvidia and AMD go the Ryzen CCX route

>amd
*NEW* Beach/street to shit on

If I was Nvidia I'd be working in the background to develop a new crypto that relies on the raytracing portion of the GPU and meme it into being the new Ethereum.

Raytracing? Professionals could use it.
Gimpworks? No thank you.

nvidia is in the position that they can dictate what people think they are going to need
if they want raytracing to be the future then they will make it the future
it could all be a gambit to make rtg eat shit for a couple more years, and by the time amd has finally caught up, raytracing will actually be a viable and logical way forward

One year after Vega's release, primitive shaders still don't work. amdfags told me that we'd get a 40% performance increase with new drivers last fall.

T-that's right! N-nobody needs realtime global illumination or raytraced AO.

Attached: 1496131353191.jpg (364x404, 26K)

Raytracing was always the holy grail and the future, retard. Holy shit, where do you people even come from? RTRT is what they've been gunning for for decades.

My Vega64 is good for another 2 years at least, r-right guys?

Attached: 1534577800034.png (623x808, 418K)

Dude, not a single AMD fanboy will tell you that AMD is better on the GPU front. We know Nvidia has the better hardware. It's a fact.

but we are hoping AMD brings the heat, so Nvidia can lower its prices.

But I am curious though: will AMD be able to do raytracing like this, or is this proprietary tech that only Nvidia has?

Posted from my 1080ti

nowhere in my post did I claim otherwise

>Will AMD be able to do raytracing like this

No. Nvidia has spent the last 10 years working on this in tandem with Microsoft and AMD just realized that "oh shit they weren't fucking around and they're actually going to sell it"

Expect $800 cards to be the new norm.

>nobody needs ray tracing r-right?
In gaming?
Not for the next 3-5 years.
>but we are hoping AMD brings the heat, so Nvidia can lower its prices.
You are the problem.

How Nvidia's ray tracer works:
Async compute, shaders that issue integer and float ops at the same time, special hardware for triangle intersection and scene hierarchy (BVH) traversal, plus a denoiser based on deep learning acceleration using the matrix multiplication cores.

PowerVR used triangle intersection and scene hierarchy traversal hardware in a prototype.

I don't know if Nvidia holds patents on these hardware designs.
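
To put that in concrete terms: the triangle intersection the fixed-function unit handles is the same math you'd otherwise burn shader instructions on. Here's a rough CPU-side sketch of one ray-triangle test (Möller-Trumbore) — the textbook version, purely illustrative, not Nvidia's actual circuit; the struct names and epsilon are made up:

[code]
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Möller-Trumbore ray-triangle test: returns the hit distance t, or nothing on a miss.
std::optional<float> intersect(Vec3 orig, Vec3 dir, const std::array<Vec3, 3>& tri) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(tri[1], tri[0]);
    Vec3 e2 = sub(tri[2], tri[0]);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;    // ray parallel to the triangle plane
    float invDet = 1.0f / det;
    Vec3 tv = sub(orig, tri[0]);
    float u = dot(tv, p) * invDet;
    if (u < 0.0f || u > 1.0f) return std::nullopt;     // outside the triangle (barycentric u)
    Vec3 q = cross(tv, e1);
    float v = dot(dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt; // outside the triangle (barycentric v)
    float t = dot(e2, q) * invDet;
    if (t <= kEps) return std::nullopt;                // intersection behind the ray origin
    return t;                                          // distance along the ray to the hit
}

int main() {
    std::array<Vec3, 3> tri = {{ {0, 0, 5}, {2, 0, 5}, {0, 2, 5} }};
    auto hit = intersect({0.5f, 0.5f, 0.0f}, {0.0f, 0.0f, 1.0f}, tri);  // ray straight down +z
    if (hit) std::printf("hit at t = %g\n", *hit);     // prints t = 5
    else     std::printf("miss\n");
    return 0;
}
[/code]

Doing that for every ray against every candidate triangle in shader ALUs is what eats the instruction budget; the pitch is that a fixed-function unit next to the SM does it instead.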

>You are the problem.

How exactly?

look at amd vs intel. Amd releases a good competitor, it rustles intels jimmies and intel actually brings better cpus out.

that's what i'm curious about. will amd be able to replicate this through their own hardware, or are they completely locked out?

Vega was obsolete when the 1000 series came out. Plus it's not using industry standard Nvidia technologies.

>it rustles intels jimmies

Yeah, they're rustled as fuck...

Attached: 1534415988415.png (933x104, 25K)

>How exactly?
You expect someone to waste money engineering something and then taping out 3-4 dies so you can buy from the competition for cheaper.
>look at amd vs intel. Amd releases a good competitor, it rustles intels jimmies and intel actually brings better cpus out.
Ryzen is a happy accident of Zeppelin (the server die) being okay for client stuff.
Things like that don't work in GPU land, HPC dies are all oversized.

yeah no...

6700k
7700k
8700k
9700k

z170
z270
z370
z390

all within a 2 year span...

But nah dude, go ahead and post that steam survey that has been proven to be incorrect.

>industry standard Nvidia technologies.
Industry standard patented technologies that they are forbidden to implement, and nvidia's near monopoly allowed them to deploy them as standards.

that's not what i said.

I said that if AMD brings competition, Nvidia has to put their products in the correct price brackets again.

their 104 GPUs are really their mid-range cards.

right now you buy their mid-range cards at top-of-the-line prices. Nothing wrong with wanting cheaper cards.

It would need a very heavy rebuild of GCN, or a new arch.
If AMD doesn't already have raytrace hardware in development, it may need 3 to 5 years.

If novidia markets it, it's real.
You faggots don't realize that there's no h/w capable of ray tracing.
Also there are no gaymes that support ray tracing.
You are going to get a byproduct for a mere $1k.
$1k just for another 14+++ iteration.

>I said that if AMD brings competition, Nvidia has to put their products in the correct price brackets again.
So you or the others can buy NV GPUs for cheaper.
You're the problem.
>Nothing wrong with wanting cheaper cards.
Everything is wrong with that.
Engineering is not free.

>will amd be able to replicate this through their own hardware, or are they completely locked out?
RT core is an ASIC attached to every SM.
It's silly easy, they need their own RT core first, though.

All amd h/w matures very nicely because it relies on brute force and not on s/w gimmicks.
You'll be keeping your Vega longer than a faggot keeps his 1080, because the 1080 will start getting gimped once drivers and game libraries stop being built around it.
We've seen this shit on every iteration this decade.

Nvidia is charging way more than it used to back when they had competition from AMD.

>Actually defends Nvidia's insane prices

Are you for real? a GPU costing 3k that should cost 1k is retarded.

Hey, Nvidia doesn't have a monopoly; that's like saying Android has a monopoly on Android phones. There are many manufacturers creating Nvidia products, and that market is plenty fragmented, it's simply a diverse market of Nvidia-derived products.

And consumers are voting with their wallets and choosing the award-winning Nvidia architecture over the irrelevant, obsolete and poorly optimised AMD architecture.

>It's silly easy, they need their own RT core first, though.

Ok so they do need this. I thought 10 gigarays/sec was just some Marketing BS.

>Nvidia is charging way more than it used to back when they had competition from AMD.
And?
People still buy NV, like they always did.
Why would AMD bother wasting money on new dGPU lineup?
>Are you for real? a GPU costing 3k that should cost 1k is retarded.
Engineering is expensive.
And people buy it anyway, so nothing's wrong with it.

>And consumers are voting with their wallets and choosing the award-winning Nvidia architecture over the irrelevant, obsolete and poorly optimised AMD architecture.
They also voted with their wallet during GT200, Fermi and Kepler.
Does uArch superiority matter?

none of the new shit can even output analog, so who cares.

Novidia has zero h/w to exploit DX12 and Vulkan features; that's why amd literally gave many features to DX12 and Vulkan from their implementation of Mantle... They even shared the documentation.
Amd has a better relationship with MS and Linux than novidia has with itself.
Even their OEMs and AIBs hate novidia for the "FE" editions, the 9/10 gbps versions of the cards and so on. Even EVGA had to sell rebranded chink shit to survive.

>raytrace hardware
There's no such thing.
This is a flop, just like the "tensor" cores.
All this shit relies on plain 16-bit precision ALUs.
You need an enormous amount of number-crunching power, which has been amd's bread and butter for over 2 decades now.
Gcn is built around brute force, from their encoders to everything else.
That's why amd has one GPU for compute, gaming and GPGPU.

>but we are hoping AMD brings the heat

Oh they've got that covered for sure

>There's no such thing.
There is, you moron.
Each RT core is an ASIC attached to an SM.

>Being anti competition
Spotted the /leftypol/

did wccftech amdrones migrate to Jow Forums?

>Spotted the /leftypol/
I'm just against AMD wasting R&D funds for no reason.
They should focus on making better CPUs.

r/amd is in damage control

>r/AMD is in massive meltdown again

TOP KEK

The vast majority of GPUs are sold in pre-built computers; you aren't saying that nvidia has forged back-room deals to lock AMD out of the pre-built market, are you?

>it's a /v/tards try to understand technology based on marketing material thread

Attached: all night.webm (720x720, 1.39M)

>Does uArch superiority matter?
No.
All that matters is LEDs, colours, shilling to game studios and screwing your customers.
Novidiots have the Apple mentality: it's never the product's fault when a driver fries the GPU; it's not the product's fault when the 780 doesn't support dc5.0 and is DOA on new titles; it's not the product's fault when the VRAM is 0.5 GB short; it's not the product's fault when it doesn't support DX12 or Vulkan and actually loses performance when it runs on a faster API instead of the one from 2010; it's not the product's fault when you keep getting the same features since 2010 on the GPU yet pay even more for every iteration; it's not the product's fault when the GPU becomes slower once a new iteration is out; and it's not the product's fault that you have to buy a shitload of proprietary products to get the same experience someone else gets by purchasing only products that implement the standard.

youtube.com/watch?v=tjf-1BxpR9c

EXCLUSIVELY ON NVIDIA TURING GPUS

The RT core essentially adds a dedicated pipeline (ASIC) to the SM to calculate ray-triangle intersections. It can access the BVH and configures some L0 buffers to reduce the latency of BVH and triangle data accesses. The request is made by the SM: the instruction is issued, the result is returned to the SM's local registers, and the instruction can be interleaved and run concurrently with other arithmetic or memory I/O instructions. Because it is ASIC-specific circuit logic, performance/mm2 can be increased by an order of magnitude compared to using shader code for the intersection calculation. Although I have left NV, I was involved in the design of the Turing architecture; I was responsible for variable rate shading. I am excited to see the release now.

zhihu.com/question/290167656/answer/470311731
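
Rough software picture of the "access the BVH" part of that quote: a stack-based traversal that tests ray-vs-box at inner nodes and hands leaves off to a triangle test. This is the generic textbook technique, not Turing's actual logic — the node layout, names and the stubbed triangle test are all made up for illustration:

[code]
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <limits>
#include <utility>
#include <vector>

struct Ray  { float orig[3], dir[3]; };            // dir assumed non-zero on every axis
struct AABB { float lo[3], hi[3]; };

// One BVH node: either an inner node with two children or a leaf holding triangles.
struct BvhNode {
    AABB    bounds{};
    int32_t left = -1, right = -1;                 // child indices; -1 marks a leaf
    int32_t firstTri = 0, triCount = 0;            // leaf payload
    bool isLeaf() const { return left < 0; }
};

// Slab test: does the ray enter the box before the closest hit found so far?
bool hitsBox(const Ray& r, const AABB& b, float tMax) {
    float tNear = 0.0f, tFar = tMax;
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / r.dir[a];
        float t0 = (b.lo[a] - r.orig[a]) * inv;
        float t1 = (b.hi[a] - r.orig[a]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar, t1);
        if (tNear > tFar) return false;
    }
    return true;
}

// Per-triangle test: plug in something like the Möller-Trumbore sketch earlier in the thread.
// Stubbed out so this compiles on its own; negative means "miss".
float intersectTriangle(const Ray&, int /*triIndex*/) { return -1.0f; }

// Walks the BVH and returns the index of the closest hit triangle, or -1 on a miss.
int traverse(const std::vector<BvhNode>& nodes, const Ray& ray) {
    float bestT   = std::numeric_limits<float>::max();
    int   bestTri = -1;
    std::vector<int> stack{0};                     // start at the root node
    while (!stack.empty()) {
        const BvhNode& n = nodes[stack.back()];
        stack.pop_back();
        if (!hitsBox(ray, n.bounds, bestT)) continue;   // prune: box missed or farther than best hit
        if (n.isLeaf()) {
            for (int i = 0; i < n.triCount; ++i) {
                float t = intersectTriangle(ray, n.firstTri + i);
                if (t > 0.0f && t < bestT) { bestT = t; bestTri = n.firstTri + i; }
            }
        } else {
            stack.push_back(n.left);               // visit both children (no ordering heuristic)
            stack.push_back(n.right);
        }
    }
    return bestTri;
}

int main() {
    // A one-node "BVH": a single leaf whose box spans the unit cube, holding one stubbed triangle.
    std::vector<BvhNode> nodes(1);
    nodes[0].bounds   = { {0, 0, 0}, {1, 1, 1} };
    nodes[0].triCount = 1;
    Ray r{ {0.5f, 0.5f, -1.0f}, {0.0f, 0.0f, 1.0f} };
    std::printf("closest hit: %d\n", traverse(nodes, r));  // -1 here, since the triangle test is stubbed
    return 0;
}
[/code]

The claim in the quote is that this whole loop (box tests, stack bookkeeping, triangle tests) moves off the SM into the RT unit: the SM issues one trace request and gets the closest hit back in a register while its other warps keep doing normal ALU and memory work.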

Yes, the GoyWorks implementation of ray tracing will certainly be artificially locked to Nvidia's new cards.

gpuopen.com/gaming-product/radeon-rays/

show me a game that can do real-time ray tracing using nvidia...

careful, don't use Unity games, since amd already has its ray tracing tech up and running there

>maximum damage control
Where are the primitive shaders, pajeet? Also amd fried GPUs with a driver update too.

raytracey.blogspot.com/2018/07/nvidia-gearing-up-to-unleash-real-time.html

pbs.twimg.com/media/Dkq4XsTVsAAPJRN.jpg

Interesting tell us more

kekd. Should have worded myself better

>RT core
Ahahaha, the marketing parrots.
It's plain ALUs, idiot.
Don't repeat to me what the marketing idiot says on stage.
You are supposed to believe that it's a "new core" and (roflmao) an ASIC attached to something, but it's not.
Tensor cores are marketing bullshit for 16-bit precision ALUs...
And even Google built an ASIC like that which could outperform a $20k Tesla GPU... at a cost of less than $1k.
Please spare me your PowerPoint presentation faggotry.
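
For what it's worth, the advertised tensor-core primitive isn't a lone FP16 ALU: it's a small fused matrix multiply-accumulate, roughly D = A×B + C on 4×4 tiles (FP16 inputs, FP32 accumulation) issued as one warp-level instruction. A plain-C++ sketch of that one operation so it's clear what's being sold — the tile size constant and names here are just illustrative:

[code]
#include <cstdio>

constexpr int N = 4;   // illustrative tile size; the hardware works on small fixed tiles

// One fused tile operation: D = A * B + C.
// On the real hardware A and B would be FP16 with FP32 accumulation; plain float here.
void mma_tile(const float A[N][N], const float B[N][N],
              const float C[N][N], float D[N][N]) {
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j) {
            float acc = C[i][j];                   // start from the accumulator input
            for (int k = 0; k < N; ++k)
                acc += A[i][k] * B[k][j];          // multiply-accumulate
            D[i][j] = acc;
        }
}

int main() {
    float A[N][N] = {}, B[N][N] = {}, C[N][N] = {}, D[N][N];
    for (int i = 0; i < N; ++i) { A[i][i] = 2.0f; B[i][i] = 3.0f; C[i][i] = 1.0f; }
    mma_tile(A, B, C, D);
    std::printf("D[0][0] = %g\n", D[0][0]);        // 2*3 + 1 = 7
    return 0;
}
[/code]

Call it "just ALUs" if you want; the difference from a generic FP16 unit is that the whole tile FMA is a single instruction, which is also the kind of operation Google's TPU builds its systolic array around.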

>The vast majority of GPUs are sold in pre-built computers
The opposite.
Most dGPUs are discrete board sales.
And something for laptops.

These false-flagging threads don't help your cause, nvidia bro

...

you sound like a marketing parrot who can only speak in buzzwords
i bet you don't know how any of those things actually work

>Please spare me your PowerPoint presentation faggotry.
It's cute, given that amdrones have posted and continue to post PowerPoint presentations as some kind of white papers.

AYYMD has no Real Time Ray Tracing that can do 10 gigarays a second (10 billion rays a second)

AYYMD, being years behind with no vision or foresight, is now scrambling hard to copy Nvidia's Real Time Ray Tracing innovation for their post-Poovi GPU

How about we wait before we suck off nvidia's or amd's dick? It's new tech; of course it will take time.

But if it can do half of what the new dance video does in real time, then it will become a useful tech for us consumers

Name five games that use RT.

why would they use raytracing without the appropriate hardware support

So nVidia is selling me a bunch of useless hardware that MIGHT get used in the future?
Incredible.

pretty much
apparently they're for developers or something

>apparently they're for developers or something
But these are not devkits, these are real, very expensive cards.
$800 for a fucking 2080.

>So nVidia is selling me a bunch of useless hardware that MIGHT get used in the future?
Is this your first graphics card launch?

>But these are not devkits, these are real, very expensive cards.
what the fuck are you talking about?

Neither Maxwell nor Pascal tried to sell me some magical useless hardware.
2070, 2080 and 2080ti have useless raytracing hardware.

Are you this underage? Do you think that games used programmable shaders when GF3 was released?

hardware needs to exist first for software to take advantage of it

>dedicated pipeline (ASIC)
>ASIC-specific circuit logic
terms that we hardware designers never use... how many ASICs does this new GPU have, or is it one (1) ASIC?
aside from the mumbo-jumbo terminology,
what you are saying is that in order to add a new instruction to each core, you have to add elements to the pipeline.
ofc you are going to add logic to the control circuit and ofc you are going to add something to the pipeline.
All I am saying is that that "something" is plain fucking 16-bit precision ALUs.
iirc that's the term Google first used for ALUs that have one fucking configuration.
There's nothing special about ray tracing, you can already do it on existing GPUs; you can fucking generate Verilog code via MATLAB and fucking place it on an FPGA.
MATLAB mostly revolves around OpenCL cores, but nonetheless you can get the fucking same result, you could even offload your "ray tracing" bullshit to your "dedicated" "ray tracing" card.
I know that all this ray tracing stuff is bullshit.
How?
youtube.com/watch?v=Vdnwrt3Xdak
You can't integrate the current generation's $150k worth of horsepower into a GPU that's coming 6 months later for under $1k.
I am waiting for the white papers to prove me wrong... which I doubt...

By the time developers release the very first game with nvidia's ray tracing, the 2080 will already be obsolete

see Fucking underage, I swear.

why is it always this /v/ console war tier shit
can't you just talk about the new thing coming out and not compare it to something else

GF3 didn't cost $1000.
Not on consoles = no one would actively touch it for a while.
I don't think it's even on mainstream Turing cards like 2050 or 2060.

how do you expect new hardware to come into mainstream use if nobody takes the first step exactly?

No one would bother to implement something used by three halo cards.

they would if it gave an increase in quality

>did wccftech amdrones migrate to Jow Forums?
>>r/AMD is in massive meltdown again
imagine if we did have an amd vs novidia thread...
here we are just exposing the marketing lies from novidia revolving around this "real time" "ray tracing" fud, and novidiots are seeing amdrones and pajeets from wccftech everywhere.
back to reality faggots, this is novidia vs current technology.
>novidia foosion reactor
b..b..but it's impossible to have sun-like conditions under control on earth
>don't you dare say the opposite amdrone, back to r/amd or to pajeetfftech. shut up, they added a special asic to the bibeline.

Jesus fuck, you're both underage AND retarded. I've read the same comments when DX8 and GF3 were released.

Programmable shaders didn't explode until DX9 and PS/VS 2.0.
Stop with this meme.

That was a translation (Chinese to English) of a post by an ex-Nvidia engineer, now at a Chinese self-driving car startup.

Here's a benchmark by SEED at EA:
3 to 6 times faster at ray tracing than Volta.

raytracey.blogspot.com/2018/07/nvidia-gearing-up-to-unleash-real-time.html
The blog has a lot of material about ray tracing and some articles about PowerVR's ray tracing hardware.

technology takes time to be adopted
what are you even arguing?
nobody should try because it won't catch on immediately? how retarded are you?

No one fucking touches first-gen shit, that's the point.
RTX is 98% marketing gimmick.

but the first gen shit needs to exist for the second gen shit to exist, so again, what is your point

Yet someone had to take the first step. Also "DX9" hardware was still slow and most effects were still SM1.x and fixed pipeline. It wasn't until DX10 cards that you were able to use DX9 without performance issues.

>first step exactly
and by first step you mean marketing fud... because other than that there's no magic in suddenly making real-time ray tracing possible when a few months ago it cost more than a house to have it.
>very first game with nvidia's ray tracing
the libraries for ray tracing exist; unlike the gaymen faggots in this thread, there are people who have been working on ray tracing for decades. there will be no single gpu in the next 10 years that can handle real-time ray tracing.
complex scenes in games with tens of shadows and light sources are fucking expensive in terms of operations to render.

In competitive online games, quality is lowered to increase FPS and refresh rate, and to make it easier to spot the other team, as shit like foliage doesn't hide them well on lower settings.
I think ray tracing is really only useful for single-player games.
>t. nvidia 970X

Are you fucking retarded?

Nvidia showed that same demo running on a SINGLE Turing card

youtube.com/watch?v=jY28N0kv7Pk&t=1143

>other than that there's no magic in suddenly making real-time ray tracing possible
isnt that what the new raytracing hardware is supposed to be?

Not a gimmick. Someone needs to gamble; if they fail, too bad. If not, and they introduced something that will be used for years, then nice.

>new demo, now with more wooden screws

Keep on SEETHING as you have nothing :^)

(You)

Attached: amdloodie.jpg (633x758, 162K)

Not when said first-gen shit bloats the die and makes prices do a moon mission.

Just like amd dx12 and vulkan, right?

funny thing is, it looks like dx11 is the last one and no game dev touches dx12, they're all switching to ue4

>complex scenes in games with tens of shadows and light sources are fucking expensive in terms of operations to render.
Hence why they're pushing hybrid rendering.
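
Quick sketch of what "hybrid rendering" means in practice: rasterize primary visibility as usual and spend rays only on the effects raster can't fake well (here, one reflection lookup for shiny pixels). Everything below is made up for illustration — real engines do this per-sample in shaders against a real BVH, not on the CPU:

[code]
#include <cstdio>
#include <vector>

// Per-pixel data the rasterizer already produces (a "G-buffer").
struct GBufferTexel {
    float baseColor[3];
    float roughness;       // low roughness = mirror-like, worth a traced reflection
};

// Stand-in for a ray-traced reflection lookup; in a real renderer this would be
// a BVH traversal like the sketch earlier in the thread. Purely illustrative.
void traceReflection(const GBufferTexel& /*px*/, float out[3]) {
    out[0] = out[1] = out[2] = 0.5f;               // pretend the reflected environment is flat grey
}

// Hybrid pass: rasterized shading everywhere, rays only where they pay off.
void shade(const std::vector<GBufferTexel>& gbuffer, std::vector<float>& image) {
    image.resize(gbuffer.size() * 3);
    for (size_t i = 0; i < gbuffer.size(); ++i) {
        const GBufferTexel& px = gbuffer[i];
        float color[3] = { px.baseColor[0], px.baseColor[1], px.baseColor[2] };
        if (px.roughness < 0.2f) {                 // only shiny pixels spawn a ray
            float refl[3];
            traceReflection(px, refl);
            for (int c = 0; c < 3; ++c)
                color[c] = 0.5f * color[c] + 0.5f * refl[c];   // crude blend
        }
        for (int c = 0; c < 3; ++c) image[i * 3 + c] = color[c];
    }
}

int main() {
    std::vector<GBufferTexel> gbuffer = {
        { {1.0f, 0.0f, 0.0f}, 0.9f },   // rough pixel: raster only
        { {0.0f, 0.0f, 1.0f}, 0.1f },   // shiny pixel: gets a traced reflection
    };
    std::vector<float> image;
    shade(gbuffer, image);
    std::printf("pixel 1 red channel: %g\n", image[3]);   // blended with the traced reflection
    return 0;
}
[/code]

That's why the ray budget stays sane: most pixels never spawn a ray at all.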

This is the mindset of an eternal failure. I bet you don't try anything new in life and are afraid of failing.

>Just like amd dx12 and vulkan, right?
These don't eat die area for breakfast; they're just a bunch of APIs.
>funny thing is, it looks like dx11 is the last one and no game dev touches dx12, they're all switching to ue4
They're switching from API to engine?
What?
Did you forget your pills, ESL-kun?