Ray tracing can make Minecraft, one of the ugliest 3D games in existence, look like this

>ray tracing can make Minecraft, one of the ugliest 3D games in existence, look like this
>ray tracing is only being pushed by the second biggest Jew company on the planet, Nvidia, meaning it is destined to fail and we'll never actually get ubiquitous, proper ray tracing
fucking why

Attached: 1566496765444.png (1680x1045, 2.46M)

Other urls found in this thread:

en.wikipedia.org/wiki/AMD_TrueAudio
github.com/GPUOpen-LibrariesAndSDKs/TAN
youtu.be/2xCF-JJdhNY
youtu.be/n68s9I28AvY
youtube.com/watch?v=DWK_iYBl8cA
vocaroo.com/i/s1t1OCk1XHCW

ray tracing is already used for offline rendering, and it will be used for real-time rendering as soon as hardware is powerful enough, regardless of what company does it

Texture packs and shaders can already do that.

Shader packs (built into OptiFine for Java edition) do not use ray tracing and even worked back on my GTX 650. They also have much more stylistic variety. They are what you really want.

Attached: Sonic-ethers-unbelievable-shaders.jpg (1920x1080, 365K)

I don't get it, it's still fucking Minecraft, it looks goofy as hell

>one of the ugliest 3D games in existence
just because it consists of 16x16 pixel textures doesn't mean it's ugly, pixel art is a thing

>>ray tracing is only being pushed by the second biggest Jew company on the planet, Nvidia, meaning it is destined to fail and we'll never actually get ubiquitous, proper ray tracing
Their implementation is all DXR. As usual AMD is the one dragging things down, just like when they refused to license CUDA and physx for literally pennies per GPU

>built into optifine for Java edition
I thought selecting "internal" with Optifine was still just the default shader?

The value add of ray tracing is cutting down on the amount of shader development you have to do to make things look photorealistic. Your indoor shader isn't going to work well outdoors, or at sunset, etc. Plus you have to go out and fake your god rays and whatnot. All a bunch of fiddly things which cost devs time.

Ray tracing makes it so that whatever you put in, it comes out looking believable by default. One lighting model to rule them all.
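To put some numbers on that, here's a minimal sketch of the one-rule idea in plain Python (hypothetical two-sphere scene and light position, not any engine's actual API): for every shaded point you trace a shadow ray toward the light, and the point is lit only if nothing blocks it. Indoors, outdoors, or at sunset, the exact same test produces the shadows and occlusion you'd otherwise have to fake per scene.

import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, center, radius):
    # Distance along the ray to the sphere, or None on a miss.
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# Hypothetical scene: a big sphere we shade, a small sphere that casts a shadow on it.
spheres = [((0.0, 0.0, -3.0), 1.0), ((2.5, 2.8, -1.1), 0.6)]
light = (5.0, 5.0, 0.0)

def shade(point, normal):
    # The single rule: trace a shadow ray; the point is lit only if nothing blocks it.
    to_light = norm(sub(light, point))
    blocked = any(hit_sphere(point, to_light, c, r) for c, r in spheres)
    return 0.0 if blocked else max(dot(normal, to_light), 0.0)

origin, direction = (0.0, 0.0, 0.0), norm((0.0, 0.3, -1.0))
t = hit_sphere(origin, direction, *spheres[0])
if t is not None:
    p = tuple(o + t * d for o, d in zip(origin, direction))
    n = norm(sub(p, spheres[0][0]))
    print("diffuse intensity:", shade(p, n))  # 0.0 here: the small sphere shadows the hit point

The same loop extends to reflections and god rays by tracing more rays, instead of writing more shader special cases.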

>AMD's bad for not licensing PhysX
>Nvidia's perfectly fine for disabling the use of Nvidia GPUs as PhysX coprocessors for AMD GPUs

Are you mentally challenged?
You fucking act like ray tracing is some gaymen fad that is about to fade away.
In reality it has literally always been the endgame of 3D graphics if you fucking take a look outside of your computer game bubble.

If it does not catch on in games just yet, it eventually will when software and hardware finally get there.

Attached: 1519420624450.png (378x374, 165K)

the question is though, how much fatter are graphics cards gonna get because of it?

No, remodeling, re-texturing, and re-writing the rendering algorithms made minecraft look like that. Raytracing just made the shadows a little more precise.

Never thought about it like that, that's a good point.

Easy access ray tracing when?

Mostly none, 3-5% of die space for BVH hardware accelerators, says an ex-Nvidia engineer

Stop shilling for Nvidia. PhysX and CUDA are proprietary tech; supporting them means you're just asking to be fucked over.

So is you x86 you dumb faggot, yet you don't see AMD fanboys decrying AMD for keeping it locked up

Both next-gen consoles and AMDs upcoming cards will do raytracing, and their implementation is rumoured to be a lot better than nvidia's.

Pic related is the worst possible example of this, texture pack makes it look like somebody made a minecraft unity clone.
Ray tracing is unironically how vanilla MC was meant to be played.

>amd open source raytracing coming next year
>cyberpunk2077 rtx looks real pretty, if it is a good game who knows what happens with rtx
>nvidia will be fine either way they got loads a money
>gamers dont care about screen space reflections
>artists can already do raytraced renders real fast with opencl and opengl
I don't care for Nvidia. The tech is cool; eventually I'll get an AMD card with ray tracing for half price used, same as always, no rush.

Do you have a picture of ray traced vanilla minecraft? I'm curious what that looks like.

It looks like trash. Every game Notch made looks like trash because he isn't an artist and was too stupid to hire one.

....you know that microsoft bought it and there's now a professional designer who changed 90% of the textures right?

Yeah, professionally hired to make sure it still looks as close to the ugly pixel barf everyone is familiar with as possible while fixing up the color balance a little.

Microsoft didn't purchase it to improve it. They purchased it so they could have movie rights.

taste: 0

Mate, are you mentally disabled? AMD has to share its 64-bit license with Intel just as Intel has to share its 32-bit license with AMD. They are mutually dependent on each other's tech. People don't complain about AMD's software because a lot of their shit is open source, compared to Novideo's fully proprietary shit.

Attached: IMG_2155.jpg (640x578, 32K)

What texture pack is this?

>it looks like trash
low-tier bait, go play fortnite pleb

>AMD has to share its 64-bit license with intel as intel has to share its 32-bit license with AMD
Most 32-bit x86 patents expired over a decade ago you retard. AMD and intel's licensing deal is about maintaining their duopoly so they have the market locked in. None of it stops AMD from open sourcing x86 patents under a copyleft so Intel can't just lock everything down.

>software because alot of their shit is open-source
Virtually none except for their linux drivers and broken dx11 visual effects samples. Meanwhile PhysX in its entirety is open source, including the CPU physics engine. And they contribute tons of work to open source scientific computing and ML tools and libraries. AMD barely does jack shit in comparison

I wonder if nvidia learned anything from its market-isolation failures of physx, gsync, and like a dozen other things.
And yet things that AMD contributed to quickly become standards and nvidia ends up adopting them.
Hmm.

Attached: [HorribleSubs] Joshikousei no Mudazukai - 06 [1080p]_00:18:51_22.jpg (1920x1080, 272K)

>AMD barely does jack shit in comparison
Except for Vulkan, FreeSync (in reality DisplayPort Adaptive-Sync), DX12, that hardware-accelerated 'audio raytracing' that went nowhere, and so on

>that hardware accelerated 'audio raytracing' that went no-where
the what now?

That sounds fucking amazing. Audio is another area of games that is woefully underserved.

AMD were working on audio simulation where they'd simulate sound bouncing around a room based on surfaces and room structure. Can't remember what it was called, but the last I heard of it was in 2014
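To give a feel for what "simulating sound bouncing around a room" means, here's a rough sketch of the image-source method in plain Python (box-shaped room and made-up positions, illustrative only, not AMD's actual algorithm): mirror the sound source across each wall, and each mirrored copy gives the travel time of one first-order reflection, which is what you'd feed into per-reflection delays and filters.

import math

SPEED_OF_SOUND = 343.0  # metres per second

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def first_order_reflections(source, listener, room):
    # Mirror the source across each wall of an axis-aligned box room; each mirrored
    # "image source" gives the path length (and hence delay) of one wall reflection.
    images = []
    for axis, size in enumerate(room):
        near = list(source); near[axis] = -source[axis]             # wall at coordinate 0
        far = list(source); far[axis] = 2.0 * size - source[axis]   # opposite wall
        images += [tuple(near), tuple(far)]
    direct = distance(source, listener) / SPEED_OF_SOUND
    echoes = sorted(distance(img, listener) / SPEED_OF_SOUND for img in images)
    return direct, echoes

direct, echoes = first_order_reflections((1.0, 1.5, 1.0), (4.0, 2.0, 1.5), (6.0, 4.0, 3.0))
print(f"direct path: {direct * 1000:.1f} ms")
print("first reflections (ms):", [round(t * 1000, 1) for t in echoes])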

Are you 10 years old?

Minecraft runs like ass despite looking like ass

Enjoy your 10 fps with meme tracing on

en.wikipedia.org/wiki/AMD_TrueAudio

>completely sporadic updates
>almost no mention of anything since 2013

fucking why

WHY

Why the fuck don't people care about audio? Is everyone except me fucking deaf?

Attached: 1532455369314.jpg (287x293, 16K)

>Vulkan
Which had to be completely rewritten by an Nvidia-chaired consortium, because instead of making Mantle open like they originally lied they would, they made it so hardware-specific that MS shat out DX12 almost a year before Vulkan 1.0 was finished.

>freesync
Free as in no licensing cost to manufacturers. Not only is it just piggybacking off VESA's Adaptive-Sync standard (which was an existing eDP protocol), but VESA is a closed consortium that you have to be a paying industry business to be a member of to even have access to their standards. It's "open" in that members can propose changes. And FreeSync is specific to AMD hardware, also 100% proprietary.

>DX12
Claims of it being a mantle clone were dismissed by Microsoft. The xbox one games were developed for years before mantle was a thing so it's pretty fucking self evident that Mantle was copied from consoles

>that hardware accelerated 'audio raytracing' that went no-where
Which was originally a closed-source solution in partnership with a proprietary software company and ran on their black-box "TrueAudio" DSP. They open sourced it because it was a failure used by like one game. Meanwhile PhysX is found in both of the most widely used engines today, UE4 and Unity, with both the CPU and GPU accelerated components open sourced despite both still being used in commercial titles. AMD throws out rotten meat for the dumb animals when they fail; Nvidia actually contributes successful, fully functional software that they continue to support even now. You can look at their fucking PhysX commits: since they open sourced it they released an entire major version (v4) and make commits several times a month. Meanwhile AMD tossed Mantle to Khronos after they had like two games support it, despite making it with EA DICE's full support. And TrueAudio?

github.com/GPUOpen-LibrariesAndSDKs/TAN

LOL what fucking support? AMD's open source love is all bullshit. Intel contributes the most to FOSS by a mile yet where's the love?

You're retarded, OP

Ray tracing can be used on any GPU; it's just that Nvidia made some ray-marching stuff and started marketing it as RTX, so people get confused and think Nvidia has some magic for making something that normally takes minutes to hours run in real time
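For what it's worth, ray marching and classic ray tracing aren't quite the same thing: ray tracing solves exact ray/geometry intersections analytically, while ray marching steps along the ray using a signed distance field until it gets close enough to a surface. A tiny sketch of the marching loop in plain Python (made-up scene function, nothing vendor-specific):

import math

def scene_sdf(p):
    # Signed distance to a unit sphere centred at (0, 0, -3); made-up scene.
    x, y, z = p
    return math.sqrt(x * x + y * y + (z + 3.0) ** 2) - 1.0

def ray_march(origin, direction, max_steps=128, eps=1e-4, max_dist=100.0):
    # Step along the ray by the distance the SDF guarantees is empty of geometry.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:
            return t        # close enough to the surface: count it as a hit
        t += d
        if t > max_dist:
            break
    return None             # ray never got near a surface

print(ray_march((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))  # ~2.0: hits the front of the sphere
print(ray_march((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))   # None: ray goes off into space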

>If its not epic AAA raytraced meme graphics it looks like shit
go back to /v/

Wrong again, faggot. Feel free to keep trying. Maybe you'll figure out what kinds of games I like one day.

Attached: minecraft-rtx-dxr-ray-tracing-ogimage.jpg (1200x627, 413K)

This

The reason they picked Minecraft for their shitty RTX meme is that they know it's the only game that doesn't show significant frame drops when used.

This is the truth. Ray tracing is used in almost every blockbuster movie these days. At its peak, movies will probably not even require sets or actors and look indistinguishable from reality.

user is saying shaders are built-in to optifine. Not referring to the internal optifine shaders. I think at least.

>movies will probably not even require sets or actors
Looking forward to the revival of silent movies.

the version with ray-tracing is going to be bedrock not java, so performance shouldn't really be an issue with any card that can support rtx.

>implying all movies won't be voiced by jordan peterson's neural network

>Make vastly different assets
>Guys, it's the raytracing, we swear!

youtu.be/2xCF-JJdhNY
meh.

Why is 4chan so butthurt about ray tracing..?
It was meant to happen at some point and everyone kind of knew it. Hardware was the only limitation. And it is still quite limiting (that's why RTX is only used to add a bit more realism to the 3D fakery and not to raytrace the whole thing).
Just admit that you're only butthurt because AMD has no hardware acceleration for it right now.

we are butthurt about prices. I'd be all over 2070S if it was $400

gayest post ive seen all day

>doing benchmark in windowed mode
is it brain damage?

>that one comment saying "omg so smooth"
>28fps avg
kek

>You fucking act like ray tracing is some gaymen fad that is about to fade away.
Who knows? It doesn't really add much to the experience, it's insignificant, and no one is thrilled buying $1000 graphics cards for some improved shaders.
youtu.be/n68s9I28AvY
>This is the truth. Ray tracing is used in almost every blockbuster movie these days.
Yeah... As we all know those movies are crème de la crème...
>At its peak, movies will probably not even require sets or actors and look indistinguishable from reality.
And those same movies also exclude actors and are often badly received even by their intended audience which is teenagers.

That's about the worst fucking job I've seen at showing off the graphics.
Farting around on the surface doing jack shit, going away from light sources when they finally get their ass underground, and having the incompetence to record the rest of their screen in windowed mode with fucked-up volume while poorly, halfheartedly trying to act like they're in survival mode despite being in creative mode, looking like a retard who hasn't played more than 5 minutes of the game.

if by "already" you mean for the last 15 years

>Minecraft
>ugly
life must be tough with braindamage this severe

>Is everyone except me fucking deaf?
yes.

True, though you won't get decent performance on this gen of GPUs; your best bet is either the G/RTX 3080 or, even better, the G/RTX 4080 series, or whatever it's gonna be called by then.

> snorts scornfully then goes back to mario

Next time, try to turn off the HD texture pack with bump mapping before taking a screenshot of that game. Oh, what's that, you can't, because it's auto force-enabled when the proprietary ray tracing option is enabled? How about you enable the texture pack for the non-raytraced version? Oh, you can't do that either? That really makes me wonder how the marketing teams decided this was a good advertising move, but then again, one literally released a fake graphics card with wood screws in it and the other is in full damage control by now. Good stuff.

okay Nvidia shills, name everything I miss out on by picking a 5700 XT over an RTX 2070 Super
go

>Ray tracing demo with a shaderpack and PAID texture pack
lmao

>java game
>>>r/india

-good drivers
-good hardware
-compatibility
-no memory bottlenecks
-good engineering
-Nvidia ecosystem
-ray tracing
-future proofing
-high quality warranty service
-advanced driver control panel
-OC
-no dent
-superb thermals
-etc.

a chan post is not enough to list all the advantages Nvidia offers

I mean, who really gives a fuck when you can afford the best GPUs money can buy at this point?

The day you realize AMD has been astroturfing much of the enthusiast forums for PC hardware on the internet is the day you stop losing brain cells every time you get on 4chan

You do know that Nvidia not and owns physx right user?

English, motherfucker. Do you speak it?

That was sort of a crucial point about my post

- Anti-consumer practices
- Cards age like milk
- Will pay publishers to unoptimize games for both their competition's and their own old cards just so their new shit looks better
- Closed-off ecosystem
- Telemetry built into their drivers data mining ur asses
- Intentional false advertisement of products to mislead consumers into thinking products are better than they actually are, e.g. the 970 only having 3.5GB of actually usable RAM, GT 1030 DDR4
- Inflation of prices
- Will pressure the smaller card manufacturers into following their own shitty practices
- The perfect product for NPCs

Did I do it right?

Attached: m9tjkrsmmae31.png (627x637, 341K)

Currently have a Vega 64, had a 970 before that

>good drivers
In my experience, the driver experience on AMD has been great, with consistent performance upgrades
>good hardware
Vega is really nice, it can actually overclock well and offers a lot of tuning options without having to do shunt modding
>compatibility
Have had zero compatibility issues except for stuff like Minecraft ray tracing mods, which just recently had AMD support added
>no memory bottlenecks
No memory bottleneck to speak of here either. As opposed to my previous 3.5GB meme card.
>good engineering
Very esoteric, the PCB on my Vega 64 is overbuilt and top notch, and the card doesn't look like it was designed by a 12 year old, so I'll consider that a win for AMD
>Nvidia ecosystem
This one is correct, all the good goys buy Nvidia, so developers are quicker to support Nvidia, but it's something I'm willing to give up in order to not fund Nvidia anymore
>ray tracing
Absolute meme unless you have a 2080 Ti, any card below that gets unacceptable framerates in any game, plus there's few enough RTX games that you can count them on one hand
>future proofing
Yeah, my 3.5GB card was super future proof, stuttering and memory bottlenecks are the ultimate future proofing plus drivers that gimp your performance
>high quality warranty service
That's partner related, not AMD/Nvidia related, the warranty support on brands like XFX, Sapphire and Powercolor is as good as Nvidia partners
>advanced driver control panel
Is this a joke? The windows 98 looking ass clunky and laggy Nvidia Control Panel CAN'T HOLD A CANDLE to the Radeon Settings menu
>OC
Unless you buy stuff like Kingpin cards, you're not going to get much overclocking done on Nvidia, with the locked voltages and only having a very limited power slider to play with
>no dent
Don't buy a reference design
>superb thermals
High end Nvidia cards are housefires, and need triple and larger slot designs to cope with the housefire chips.
-etc.

Nice FUD

Attached: 1459600941939.jpg (960x960, 57K)

It might look cool but you can get close to the same experience with some decent shaders and a high quality texture pack.

Yeah, I've had a similar experience with Nvidia and AMD cards.

I went from a GTX 970 > R9 390 (wanted to try FreeSync) > R9 Fury (found it for a good deal and sold both other cards). I also own a laptop with a GTX 1050 which has been absolute garbage with its drivers and just functioning in general. AMD drivers can have problems, but Nvidia is just as bad.

I also have to say AMD's software is just so much nicer to use, though there are some functional differences between them: the Nvidia panel makes it easier to OC a monitor, while AMD has access to overclock settings for the video card.

Also, my GTX 970 (an FTW+ model from EVGA), 390 Nitro, and Fury Tri-X have all been equally terrible at OC'ing, though I was able to unlock some shaders on the Fury, which was actually really fun to do. I will say that while the 970 was nice, both of the AMD cards were really well designed in cooling and the PCB, each being absolutely gorgeous cards.

Yes, and it's not to say AMD is perfect either, if you have a cheap Freesync monitor you're likely to have flickering issues.
And just like with Nvidia, doing a clean driver install each year is a good idea. But overall, I've found the whole AMD BAD DRIVERS NVIDIA GOOD to be completely fabricated BS.
Even though I got rid of the 970 and have the V64 right now, my laptop (2017 Dell XPS 15) has a 1050 in it, and honestly, the control panel is even laggier than it was on my desktop computer, plus the GeForce Experience bloatware. Absolutely imbecilic.
At least with AMD I can turn off telemetry, and it doesn't come enabled by default either.
You can OC your monitor with the Radeon Settings menu since a while ago, but I still prefer to use CRU because it's easier.
Also, does Nvidia have something like AMD Link? I love the app and use it all the time.
Look at Nvidia, this week they literally straight up 1:1 copied the latest AMD driver features. How can drones say that AMD is always behind and inferior, when at this point Nvidia is literally copying their features.

>Yeah, my 3.5GB card was super future proof, stuttering and memory bottlenecks are the ultimate future proofing plus drivers that gimp your performance
The 970 was aging quite normally actually. By the time 3.5GB wasn't enough for high settings, the whole card was kind of outdated anyway and was only able to pull something like medium settings.
Never really understood the point of 8GB of VRAM on a 5 y.o. card. Having a shit GPU and lots of VRAM won't make you future proof. It's always about balance.

what a special little nigger, what do you play, quake? faggot.

does ray tracing improve terminal font rendering? then why should I care?
Don't tell me you actually play games..?

Even still, as a matter of principle, Nvidia lied to everyone, false advertising, deceptive product. Even if you didn't need 4GB at the time, being given 3.5GB of working ram and 0.5GB of unusable ram is a fucking scam. And I don't know how anyone who bought that card decided to keep buying Nvidia after being lied to their faces like that.

>You do know that Nvidia not and owns physx right user?
What did he mean by this?

Just your usual macaco/pajeet shill struggling to speak english.

Wish /g/ was not riddled with techlets who turn it into an echo chamber of tech knowledge that's two hours into a YouTube channel

Attached: MH710_01__86808.1563544824.1280.1280.jpg (1280x854, 77K)

>Nvidia NAND owns
>???
>true

i really have a question..
what makes Minecraft such a success that you can meet everyone from 6-year-old girls to 98-year-old ready-to-die-edition humans?

Neural Network dude:
youtube.com/watch?v=DWK_iYBl8cA

vocaroo.com/i/s1t1OCk1XHCW

Attached: 1566465329832.jpg (600x600, 42K)

> refused to license CUDA and physx for literally pennies per GPU

Proof that nvidia is doing small potatoes royalties on these things?

TrueAudio Next or something like that.
Read the Navi whitepaper. It is explained in there, although it's not exclusive to Navi.

Every time I see/hear that game now I get flashbacks from 6 years ago. That game was my only legit vidya addiction, I played that fucking autistic block game for thousands of hours over the course of 18 months or so.

Then last year I relapsed for 3 months, when I logged another 800 hours of gameplay.

>tfw I quit cold turkey by giving all my shit away
What an awful day that was.

SEUS ray tracing shaders do actually use ray tracing.
The ones you've posted are not shaders built into OptiFine, they are SEUS shaders. It's literally in your filename. But these ones are not ray traced.

>Freesync is specific to AMD hardware
So how does nvidia support it now?

Free as in freedom.

The simple answer is they don't, they just modified G-Sync to use Adaptive-Sync like FreeSync does. It's like two different trains that use the same tracks, so to speak

Intel does support it too. And current xbox consoles got it as a system update.

>So how does nvidia support it now?
Nvidia isn't using Freesync. It's using DisplayPort Adaptive-Sync.
If you're connected with HDMI, an Nvidia card won't be able to use FreeSync (since FreeSync over HDMI is a proprietary extension from AMD)

"adaptative" sync is freesync. It got added to displayport and hdmi after AMD contributed it.

No no, FreeSync is AMD-specific VRR that exists in their drivers and GPUs. It sends commands to the monitor's controller using the Adaptive-Sync protocol. Adaptive-Sync was originally a protocol used in panel self-refresh to tell the panel when to stop refreshing or when to refresh again. HDMI also supports a protocol for VRR, which FreeSync 2 uses, which is why even though Nvidia's G-Sync Compatible works over DP's Adaptive-Sync, it doesn't work over HDMI yet.