Oh No No No No Noes

Oh No No No No Noes.

Attached: r7_dlss.png (768x534, 210K)


now we know why jacket man was mad

NOOOOOOOOOOOOOOOOOOOOOOOOOOOO BUY NVIDIA GOYIM

Forgot link.
pcbuildersclub.com/en/2019/01/the-amd-radeon-vii-supports-dlss-via-directx-12/

Stop kvetching and buy nvidia dumb goy, rtx is the real deal.

doesn't change the fact that it's a housefire pile of shit

I'm actually selling my 1080Ti to buy this shit.
I know it's a sidegrade, but I hate NVidia for not providing an upgrade at the same price-point.

>NOO IT CAN'T BE HAPPENING IT'S A LIEEE

Attached: 1537862629975.png (900x600, 492K)

WAIT FOR NAVI

ME PROUD AMDRONE

ME FINALLY HAVE ACCESS TO NVIDIA TECH

ME NEVER HAD SOUR GRAPES

Attached: 1497642804318.png (802x799, 49K)

but why

AMD should just integrate their own OpenCL-based """DLSS""" into the driver and allow an option to force it in games without explicit support.

DLSS is literally a fucking upscaler. Well, a good upscaler, but it really is as simple as upscaling an image.

Advanced video players already employ machine learned upscalers, AMD just needs to add it in the driver.
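For what it's worth, "upscaler" here means exactly this kind of operation. A minimal sketch in plain NumPy (hypothetical function name, not any driver's actual code) of fixed-weight bilinear upscaling; a learned upscaler keeps the same input/output contract but swaps the fixed interpolation weights for a trained network:

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Upscale an HxWxC float image by `scale` with bilinear interpolation.

    This is the 'dumb' baseline DLSS is being compared to: every output
    pixel is a fixed weighted blend of its 4 nearest source pixels.
    """
    h, w = img.shape[:2]
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # Source-space coordinate for each destination pixel
    ys = np.linspace(0, h - 1, nh)
    xs = np.linspace(0, w - 1, nw)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)  # clamp at the bottom edge
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)  # clamp at the right edge
    wy = (ys - y0)[:, None, None]   # vertical blend weights
    wx = (xs - x0)[None, :, None]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

A driver-side "force it on" option would just run something like this (or the learned equivalent) on the rendered frame before scanout.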

How long before we get PhysX as well?

My little brother was gonna get a 2070.
So I'm selling him my 1080Ti for a bit less moneyz, while leaving the green ship.

Wow, you really showed nvidia who's boss.

Seeing as this and the Vega 56 are the only cards worth buying on the whole fucking market, I'll be getting one day one even though I have a G-Sync monitor.
I would much rather get a 2080 Ti, but I refuse to buy something that doesn't support actual overclocking. The day nvidia sells me a full gx100 die without overclocking restrictions for under $1500 will be the day I switch back.

>[smiles in smug]

Attached: lisa-su-smug.png (240x286, 123K)

When Navi? I can't wait anymore, my R9 270 is pissing me off, I need a new vga. Might end up buying an RX 580.

Next year

Ahhh. Just when you think these motherfuckers slowed down

So what? DLSS is not something you or anyone wants, and it's only marketed as a "good thing" to cover up the horrible performance penalties attached to real-time raytracing.

The simple truth is that DLSS is just a marketing propaganda term for UPSCALING. Supposedly "machine-learning assisted" upscaling, but that doesn't change the fact that it is, in fact, a technique to somewhat hide that the game is being rendered at a low resolution and upscaled, because the GPU can't handle rendering at the proper resolution.

Ladies and gentlemen, this is in your face, these psychos are selling high-end products that can't even handle 60fps at 1080p and they are covering this simple fact up with upscaling and pretending it's a good thing by using a marketing buzz term.

Attached: sulittleli_30589782_522809504781926_4062236246729031680_n.jpg (1080x1240, 115K)

meh
I'm not upgrading from 1080p until there are decently priced 4K 144Hz OLED or MicroLED monitors

Attached: your ancestors are not pleased.png (156x147, 23K)

So it can upscale? That's all Ayymd got? lmao

Interesting.
Depending on the benchmarks and if there will be AIB options available, I'll probably sell my Vega 64 and go up to the Radeon VII.

4k is pretty much useless unless you play on a 42 inch screen
a 27 inch 1440p screen is going to be more than enough

Well, that's about what I can do to steal them sales.
Everyone's winning, so isn't it great?

why? physx is dead anyways

4K is pretty accessible right now.
I'm actually surprised.
Still 60-75Hz monitors, but 300-400$ budget.

I used to do a 2x27" 1440p+1x1080p setup, now I've got a 3x27" 4k setup. The difference between 1080p and 1440p isn't that much in terms of desktop space. The difference between 1080p/1440p and 4k is significant. It really is a huge improvement even at 27". For watching movies at a distance.. it's not that noticeable if it's a 1440p or 4k IPS. For desktop use close to the screen.. it really does matter.

My advice: If you're on 1080p and you're upgrading: Go directly to 4k. 1440p would be a mistake and you won't use it very long before wanting to go 4k. And there's not much reason to go 1440p anyway, there really isn't much of a price difference and in some cases there isn't any.

Attached: sulittleli_14272086_330437360634644_1948786920_n.jpg (1080x1080, 69K)

While i'm torn on the eternal debate of 1440p high refresh rate vs 4k 60fps I will be upgrading this year and there is an 80% chance the radeon 7 is the card I will buy so any further features it supports is a bonus.

4k supersampled to 1440p still looks pretty good, when it comes to games that'll run at that resolution fast.

I run 2560x1600 supersampled to 1920x1200 myself.
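The supersampling the posts above describe is the reverse direction: render high, then average down to the display resolution so each screen pixel is the mean of several rendered samples. A sketch assuming an integer scale factor for simplicity (the 2560x1600 to 1920x1200 case is a fractional 4:3 ratio and would need a resampling filter instead):

```python
import numpy as np

def box_downsample(img, factor):
    """Downscale an HxWxC image by integer `factor` via box filtering.

    Each output pixel is the average of a factor x factor block of
    input pixels, which is what makes supersampled output look clean.
    """
    h, w, c = img.shape
    assert h % factor == 0 and w % factor == 0
    blocks = img.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))
```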

What about 4K for DPI scaling for improved text and UI rendering?

Attached: ipad_screen_comp_text-4f66a5c-intro.jpg (640x480, 274K)

The issue is good 4K monitors - particularly FreeSync ones - are hard to come by. The best 4K monitors are G-Sync ones, and in the scenario of using AMD that's a lot of added cost for functionality you are not going to use. There are a lot of very good 1440p screens with a vast featureset.


Downsampling kind of misses the point unless you have enormous amounts of free performance.

there used to be a 4K IPS FreeSync monitor from Asus, the MG24UQ. It was only 24 inches, but now there are no 4K IPS FreeSync monitors; wait till new monitors arrive in Q2

>apologize.jpg
Why does anyone want DLSS or an equivalent anyway? It looks like shit and is just a meme so normalfags can say their consolebabby dogfucker games run in "4k" (upscaled from 50% render)

what is dlssssssssssssssssss?

What are the expensive tensor cores in my 2080 for then?

actually based and redpilled

leather jacket tax

Turning potato into slightly less potato and pretending it's really high quality.

Increasing TDP and cost.

fancy abbreviation for meme upscaling.

Not games

>encouraging your little brother to use Nvidia's spyware

Okay weirdos, why is no one mentioning the fact that 1800p runs at the same framerate and looks identical to "4K" DL"SS"?
Don't be drones and don't repeat Nvidia marketing brochures.
Tech could have potential but currently it's a nothingburger.

Attached: IMG_20190121_200624.jpg (1928x1080, 377K)

i dunno, try them out on linux, they may have professional use

(2/2)

Attached: Screenshot_2019-01-21-20-07-55-570_com.vanced.android.youtube.jpg (1878x386, 59K)

There is a 300 burger bucks power color Vega 56 on ebay right now. Sold by newegg.

>DX12
INTO THE TRASH IT GOES

Wouldn't it impact performance without dedicated hardware for it? Making it both uglier and slower?

There will be no AIB cards; it's a stopgap card that's meant to get cannibalized by Navi. The reason these are going to sell is that they are binned MI50 cards they need to get rid of. Demand will depend on the performance: if it reaches higher than the 2080, then the card will sell much more

Attached: e1b26749-7ccd-4d81-b0d3-2f318179acbb.jpg (750x730, 34K)

Actually may be this year, navi codenames were found in a MacOS driver.

>TAA
A straight-up blur filter isn't doing your point any favours user.

> A straight-up blur filter
That's what DLSS is. Smoothing filter for upscaling.

> meant to get cannibalized by navi
True, but not even Adored's super-hype leaks put Navi in the same performance class as the VII. Depending on the binning of the VII, the gap between it and the 2080 Ti might be narrowed with some hand tuning.

Oh no no no, it's the same good and totally-not-dead thing as HairWorks and RTX

Attached: 1547225749178.png (593x568, 219K)

Lisa indirectly said the Navi cards are scheduled for this summer at Computex, so wait for June. They will follow the usual launch pattern, going on sale 2 weeks after announcement.
Navi was delayed because there was a flaw in production finalization and it needed to be sent back to TSMC, but the team here said it performs much better than they expected it to. Testing is underway for those cards right now

Yeah and why the fuck would you use TAA at one of those retard-o resolutions like 1800p other than to make the DLSS screenshot look better than it really is by comparison.

Radeon VII is still going to be best AMD card this year, so 2019 Navi will be more like midrange.

>yfw amd doesn't need any specialised coars to do any of the special features

Attached: 1496913733239.png (1000x1000, 387K)

how can I tell it's the same if the picture you provided isn't 4K resolution? :thinking:

580 is $120 used.
just buy.

The chiplet process and die shrink made things cheaper, but Navi will be a tier under the VII, by about 8%. We have been prioritizing efficiency over power. The price of the high-end Navi midrange card will actually be $325.

Hairworks worked bretty well on AMD due to DirectCompute usage. Also MS trademarked DirectPhysics one year ago, renaming Havok.

Like Gay Tracing. They can technically do it, it's just probably going to be slow as fuck.

Not that it runs very well on nvidia's cards either but hey ho.

where's the native 4k?

it's also jpeg compressed
fake news :)

Hey now, I enjoy paying $350+ so that I can play modern games on medium with slightly different lighting at 40 fps.

There is a point when you cannot Just Wait™ - luckily for me I'm waiting on some good monitors to drop so I can hold out until Computex, but once Navi is released I will go and buy either top-end Navi or Radeon VII (lel if there is any stock left). I don't care too much about cost as I've had money set aside for whatever card I buy for the last 3 years, and I am not going to buy Nvidia.

it's also a 1080p screenshot.

In that article they said that 4K DLSS looks and performs just like 1800p TAA, so instead of using DLSS you can just use 1800p TAA; you won't see any difference for better or worse in either performance or image quality.

You mean that awesome barely noticeable lighting difference that you definitely won't actually notice during a gaming session? It's most certainly worth the extra mortgage on my home!

THEY CAN'T

based and teamredpilled

We still have Raytracing bros. Wait for Anthem.

I unironically noticed that sweet 85mm lens before the girl's funbags

Now I want it to be trickled down to lowly 290. For maximum nvidiot butthurt.

The problem is one day games will get really demanding and you'll be paying through the nose to try to hit high fps at 4k. Better to stick to 1440p if you want to save big bucks on hardware.

Nvidia be misleading again and again...
(1/3)

Attached: IMG_20190121_205941.jpg (1080x1686, 394K)

>WHY AM I PAYING FOR IT NOOOO

Attached: 3457345734.jpg (320x319, 18K)

you can play above 60 fps at 4k with current GPUs by simply turning down a few settings

(2/3) Full 2160p screenshot.

Attached: F-13.jpg (3840x2160, 455K)

(3/3) Full 2160p screenshot.

Attached: F-12.jpg (3840x2160, 521K)

Here 1800p looks slightly better
And here DLSS looks slightly better

Still, I don't get why anybody would upscale 1800p or 1440p to 4K. If anything, upscaling 1080p would make more sense.
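The pixel arithmetic backs this up, assuming "1800p" means 3200x1800 at 16:9: 1800p is already about 69% of 4K's pixel count, so upscaling it saves little work, while 1080p is only a quarter of it:

```python
def pixel_ratio(src, dst):
    """Ratio of destination to source pixel counts for (width, height) tuples."""
    return (dst[0] * dst[1]) / (src[0] * src[1])

# 1800p -> 2160p: only 1.44x as many pixels to invent
r_1800 = pixel_ratio((3200, 1800), (3840, 2160))
# 1080p -> 2160p: 4x as many, so upscaling from 1080p saves far more render cost
r_1080 = pixel_ratio((1920, 1080), (3840, 2160))
```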

NO NO NO NO THIS CAN'T BE HAPPENING NVIDIABROS

Attached: 1487818729490.jpg (796x805, 119K)

DLSS is blurry. TAA is grainy.
~1720p SMAA would probably look better and perform the same.

Well, it's meant for 4K displays with high pixel density, so you shouldn't notice the difference. Though honestly I doubt anyone would notice the difference between 1440p TAA and DLSS on a 4K display anyway, so why even use the more expensive and less performant DLSS, I wonder.

>Still I don't get why would anybody upscale 1800p to 4k.
I don't know. Why would anyone run "4K" DL"SS" in its current implementation? Both look and perform the same.

It works

Well, I don't see a point in resolutions above 1440p for PC gaming at all. 1440p is perfect for 25" displays, and getting a bigger display for a PC is just dumb. Even if you play on a TV, you would sit farther back until you can see the entire screen without moving your head, and it will look the same.

>jpg for quality comparison

Attached: h8yy12er.gif (600x338, 1.02M)

Bigger displays are more immersive in my experience. I used a 40" 4K monitor for gaming and am now on a 27" 1440p and kind of miss the bigger one. That said I did have to move my head on the big one.

In our pursuit of absolute perfect visuals, we emptied our wallets for leather jacket man. May we burn in hell for our sins.

Attached: 689456.jpg (938x729, 21K)

how can the leather jacket man straight up lie and get away with it

leftover from compute cards

They're for entertainment, user. They're for entertainment.

Attached: 0385 - id1ZbZ1.png (526x437, 101K)

>NVIDIA TECH
nice troll almost got mad

Attached: 1510530830552.png (616x596, 61K)

>yfw AMD fags don't realize NVidia will just pay to developers to lock DLSS to be an NVidia-Exclusive feature that's gimped on AMD cards like goyworks

Attached: 1534735573172.png (1418x2231, 1.84M)

I wouldn't use it anyway, I'd rather use native 4K or upsample from lower resolution if I needed more FPS.

Will it be cooler, more efficient than the 1080ti and provide equally good or better performance at a competitive price point? I seriously doubt that. Radeon hasn't been relevant in the high end GPU department since 2011 or so. I loved their old cards, and the old XFX green/red cards like the 5770. I'd love to try them out again if they get back on track.
Currently using a 1070ti that I got on some flash sale on Amazon.

Hey buddy, this thread is about AMD and DLSS. Try and keep up, ok?