Redpill me on this

Attached: file.png (626x429, 379K)

Other urls found in this thread:

lmgtfy.com/?q=nvidia now supports freesync
youtube.com/watch?v=54kPNv4s-5M

It's a GPU.

It's hot, slow and loud. Buy Nvidia.

you can undervolt to fix that though

wait until you can buy them for ~$400 when the price inevitably drops like every piece of shit AMD product out there

A rushed product, released to grab the title of first 7nm consumer GPU.

>16GB VRAM
>Very fast HBM2 (better than GDDR at the moment)
>No ray-tracing cuckery
>benchmarks similarly to the 2080
>you can actually afford a monitor with freesync

It's probably a fine GPU, but they should have made an 8 GB version to get the price down. It's too much money. In fact, when the VII came out I bought myself a Vega 64 (Sapphire Nitro, 3 fans too) since it's almost as fast and a lot cheaper. I'm not going to need 16 GB of VRAM anyway. Of course, if I didn't have to pay for it I'd take one right away.

AMD drivers are still shit despite people saying otherwise.

>he uses windows

Aren't they even worse on linux?

Doesn't Nvidia support Freesync now?

This or rtx2080 XC? Gonna eGPU it to my x1c

>No ray-tracing cuckery
sounds like a bad thing since it's been popular lately and most devs seem to be interested in adding support for it... I hope this is temporary like with the PhysX meme.
>you can actually afford a monitor with freesync
latest Nvidia GPUs support both G-sync and Freesync iirc.

Don't get me wrong I'm probably gonna buy it in a few months anyways.

You can buy desk ornaments for much less money than that.

frankly my dude, get the rtx 2080, and that's coming from a vega 64 owner
this 7nm vega gpu won't get any magic drivers to boost its performance by much, so just go for what is known to perform well

>The sky is purple in my world, I don't care what anyone else tells me

>It's hot
Absolutely not
>slow
By that definition its direct competitor, the RTX 2080, is slow
>and loud
Not any louder than similarly priced 2080s

>Doesn't Nvidia support Freesync now?
Please someone confirm this. I don't want to waste 700 euro on a 2080 only for it to not fucking work with my monitor.

lmgtfy.com/?q=nvidia now supports freesync

Support is very hit and miss.
With the first Freesync-supporting driver my 1080 worked perfectly with my AOC AGON AG322QCX. Then they released a new driver that causes a black screen for a second or two every so often when Freesync is turned on, and I don't think they've fixed that one since.

don't post that vega 56 beat 2080 ti in World War Z. that doesn't count.

Same price and performance as a 2080 without all of the NVIDIA goodies. Had it been $200 cheaper, they could have slaughtered NVIDIA. Fucking AMD.

I wanted to tell you that it makes more sense in Hungary: the last time I checked, the cheapest 2080 was 260,000 HUF, which is around 866€, and the Radeon VII was 230,000, or around 766€. But now the cheapest 2080 is 210,000 (700€), or 220,000 for the cheapest one with a decent cooler.

looks cool

>you can actually afford a monitor with freesync

VII paired with an Acer Nitro XV273K is smoooooooth.

Attached: smooth criminal.png (1614x1630, 345K)

the only viable card for editing 8K video

utterly useless for anything else

>Nvidia goodies
Like what?
Does Nvidia have proper colors, FluidMotion tech, proper oc/uv tuning controls, actual working drivers, a superior recording/streaming integration or monitoring software?

>Not any louder than similarly priced 2080s
bullshit

imagine a company releasing such a flaky product that it forced the consumers to fix it themselves.

Paper launch, because marketing.
As a consumer, ignore.

Undervolting is playing the silicon lottery, just like overclocking.

Why try to stick a margarita umbrella in a pile of shit? The 2080 is straight up better in too many ways for the Radeon VII to compensate.

Intel buyers seem to love delidding their processors.
Undervolting is a one click task.

>unironically owning a radeon vii
cringe. I hope it was a preorder but I'm not sure if that's much better.

I don't get it, is it good or bad? Please help.

It's a good performance card but an inefficient one that should have been priced better

It's bad. Read the reviews online about it if you're unsure, and it's pretty clear it's not competitive with the 2080.
There are specific applications that it can perform well in, and gaming generally isn't one of them.

Price makes it unattractive.

The only acceptable overpriced hardware for 4k

wait for navi

checked
it's fine, but board partner versions of the 2080 have quieter coolers than the Radeon VII's, and both cost about the same

>AMD GPU
>Unattractive price
Oh the irony

No, where the fuck did you hear that?

When will AMD finally finish ROCm neural net compute support? I want to try training neural nets on this bad boy; 16 GB of HBM would be a godsend. But since AMD's equivalent of CUDA is shit, no major framework can use it except as buggy experimental releases.

Just look at their Ryzen 1400 (basically any first-gen Ryzen 5 that isn't the 1600) or 2400 if you want more AMD products with bad prices.

This is why being a fanboy is dumb. You should always check individual products instead of saying this company is good or bad.

unlike the green jew, amd supports linux and decent drivers are available both open and proprietary

It's not bad, it's just using special, expensive memory which makes the price tag too big. Maybe in the future it can be found at a discount, and then it's going to be an interesting product.

You CAN game with that card and it is going to perform well. If you don't have to consider the price tag then you should not be regretting it later. At the moment Nvidia's GPU might be a better purchase since it's using cheaper memory which then shows in the price. The way people are shilling both these options makes it sound like one of them gets 1 FPS and the other gets 999 FPS, but the difference is probably barely noticeable with the naked eye. Just buy what you want.

Nope. I looked at reviews, saw performance numbers, remembered I don't play world of tanks (twice) or fortnite and bought it a week after it was released.

Probably someone trying to install the proprietary drivers on a distro with an older kernel.

the only affordable GPU with 4 HBM2 chips
fuck GDDR

>bunch of useless 'features' some of which are just outright lies
Come on now, mate. I hate NVIDIA and actually go out of my way not to buy their cards but there's still no reason to even look at anything over a Vega56 from their side (as well as there's no reason to look at anything below a Vega56 from NVIDIA's side).

None of them are "bad" cards so just buy whatever is cheaper or, if you plan on keeping it for a long time, I'm sure the VII will last longer due to the massive memory.

Slower than a stock 2080 and an OC'd 1080 Ti.
Uses more power.
No hardware RT support, but can brute-force it in software compute.
Terrible cooler.
Bad BIOS, bad fan curve, bad binning.
Only plus is that the OC suite, Wattman, is probably fixed.
The card apparently gets 2GHz+ under water with an undervolt, but you're still nowhere near a stock 2080 Ti, which is only 150usd more expensive and 30% faster, just with less VRAM.
I'd rather have my 2080 Ti; it sips power even OC'd, and AMD's turbo algorithms are dogshit.

better than the rtx 2080 and 1080 ti
uses roughly the same power and is a little louder

Coming from an AMD fan just get a used 1080 ti for $500 or wait for Navi in July

I say this as an ex V56/390X and AMD fan
U wish
This

Yeah, remember those nvidia 1080tis that you needed to take apart and apply the fucking thermal pads by yourself? Lol

$700 RTX 2080s are blower or bottom-of-the-barrel MSI Gaming X cards. There are quieter 2080s, but they are $800 cards.

Nah mate, fuck off. I've got both Nvidia and AMD systems. The Nvidia drivers are cancerous bloat botnet, and their "exclusive features" are garbage like gimpworks (which I can run on my AMD system anyway with a tessellation level tweak).

i'd take 16gbs of vram over rtx any day

What for? There's no vidya that benefits from it yet, and RTX is actually useful right now in professional software, just not in games.
I don't think consumers will see much out of RT.
I tried the demos and the games that use it so far; they ran like poo and didn't look all that much better than raster.
Reflections, real GI shadows and fully ray-traced game engines are a decade away minimum.
I was lucky to see 60fps at 1440p, so I just turn it all off and go from 30-50 to 100+.
Plus ray tracing eats memory, so the only cards that would benefit from 16GB of HBM2 can't even do it in hardware and are inferior to Turing in every way.
I say this as an AMD fan with a 2700X.

I just don't understand why there wasn't a cheaper version with less memory.

games already need more than 8GB of VRAM for 4k gaming
RTX is only supported in 3 games; 2 of them are bad and the other is locked behind a chink spyware launcher.
Do you think the average RTX consumer cares about editing ray-traced stuff? The Vega cards, including the Radeon VII, can do ray tracing and they have the software to do so. GCN cards have both compute and gaming horsepower; Nvidia cards only had horsepower up until the Turing series, where they decided to do the same thing AMD has been doing (and constantly improving) for 8 years. Try to do something compute-intensive on a Pascal card and then try the same thing on a Polaris or Vega card; you will notice the difference right away.

There is: it's called a Vega 64 (basically).

because it would be inferior to the RTX 2080
even if it managed to lower the prices, it wouldn't have done much since Nvidia has the upper hand with nu-RTX

Nah, the Vega 64 is 30% worse than the Radeon VII.
An overclocked Vega 64 LC might come closer to the Radeon VII.

fpbp

the 2019 navis will be slower though.

no they wont

Another Massive Disappointment

I don't, matter of fact.

While there are minor core config differences, all a Radeon VII is, is a hot-clocked Vega 64 with double the memory bandwidth. If you run a V64 at 1750-1800MHz you will get as much performance as a VII... unless you need more memory bandwidth.

That's the whole point of why the VII has its performance lead: it's not clocked that much higher than a V64 liquid cooled model AND has fewer cores, but Vega fucking demands memory bandwidth and the VII delivers.
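The "double the memory bandwidth" claim falls out of simple arithmetic. A quick sketch using the commonly quoted public spec numbers (bus widths and effective data rates assumed from spec sheets, not measured):

```python
# Peak theoretical bandwidth = bus width (bits) / 8 * effective data rate (GT/s).

def hbm2_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

vega_64 = hbm2_bandwidth_gbs(2048, 1.89)    # 2 HBM2 stacks, ~1.89 GT/s effective
radeon_vii = hbm2_bandwidth_gbs(4096, 2.0)  # 4 HBM2 stacks, 2.0 GT/s effective

print(round(vega_64))     # ~484 GB/s
print(round(radeon_vii))  # 1024 GB/s, i.e. ~1 TB/s
```

Doubling the stacks doubles the bus width, which is where the VII's bandwidth lead comes from even at a similar memory clock.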

Redpills cause brain damage.

Attached: 1515957429766.gif (500x245, 1.24M)

Because it's just a neutered Instinct card. It was never supposed to see the light of day. But AMD expected Turing to be more powerful than it was, so when they saw the 2080, they took the MI50 and tried to match it. It's a minimal-effort play, meant as a cash grab for AMD fanboys and people who don't know any better.

Your point? 5GB more VRAM doesn't make up for the fact that Vega 20 is extremely underwhelming all round.
I've yet to see RT running on Vega besides that Crytek demo.
Maybe Stadia will have it?
I am not defending the poor state of RTX in games, but I literally don't use it, so why should I give a fuck.
14 TFLOPS in the console version, so around Vega 56 to Radeon VII level; who knows for sure though, the specs are up in the air.
If this card came out 2 years ago with 8GB of VRAM it would be impressive, but it's halfway to 2020 and it's slower than a card 2 years its senior from a previous gen.

This.
I actually liked the Vega 56 but I sold it.
Went back to a 1080, then a 2080ti.
Nothing else can handle 4k unless the game's mad optimised in Vulkan or DX12, which only a handful are.

The 2080 and VII can handle 4k just fine unless you insist on running 8xmsaa. Also: 4k without VRR is retarded.

>It's okay when nvidia does it!

vega 56 performance? if the ps5 is gonna have 14tflops of power then it will use a navi card that has the power between an rtx 2080 and an rtx 2080 ti
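The TFLOPS numbers being thrown around come from one formula: peak FP32 = shaders × 2 ops per clock (fused multiply-add) × clock. A sketch with the commonly quoted boost clocks (assumed from spec sheets, not measured in games):

```python
def fp32_tflops(shader_count, clock_mhz):
    """Peak FP32 throughput in TFLOPS: 2 FLOPs (FMA) per shader per clock."""
    return shader_count * 2 * clock_mhz / 1e6

print(round(fp32_tflops(4096, 1536), 1))  # Vega 64 at boost: ~12.6 TFLOPS
print(round(fp32_tflops(3840, 1750), 1))  # Radeon VII at boost: ~13.4 TFLOPS
```

So a "14 TFLOPS" console part really does land around Radeon VII territory on paper; whether that paper number translates into frames is a separate question.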

VRR?
Also no, the 1080 Ti and 2080 struggle to hit a 60fps minimum in well-made titles.
Also the 2080 only has 8GB of VRAM.

So it has 1080 Ti OC perf 2 years after the 1080 Ti came out? That will barely be midrange going into 202x, with consoles and gaming PCs pushing 4k 100fps+.
4k 144Hz is already here.

5% slower than the 1080ti

Attached: 1549848840413.png (1920x1080, 546K)

7% slower than the 2080

Attached: 1549849950733.png (1920x1080, 494K)

That argument is so fucking retarded, dude.
The RTX 2080 has 1080 Ti performance at 1080 Ti price, 2 years later and with less VRAM.

So the worse value card having worse performance for the same/more cost is retarded argument?
See
4k is completely off the table lol

It's really neither. It can compete with the 2080, but it's very inefficient. You need to undervolt it in order to get better temps and performance, but its 16GB of HBM2 is a real bang-for-the-buck deal.

Oh yeah, it's basically just a lesser-binned Radeon Instinct MI50. That's it. Nothing special. I'd just get a used 1080ti with a warranty for $150 less, or wait for Navi for a better price/performance combo.

Attached: 1547316494289.jpg (1200x798, 72K)

Dead/unsold MI25s rebranded to the Radeon VII without some pro features, because AMD doesn't have a fast gayming card.

Attached: 1069.jpg (1279x619, 86K)

Variable refresh rate aka freesync and gsync.

Gsync is garbage.
Freesync is based and works perfectly on nv cards
Also I have 3440x1440 100Hz Freesync, so it's not pointless.
In most games my gpu barely goes above 50-90% usage because of 100fps cap lol

Powerhouse GPGPU that can game on side but simply cannot compete against its competitors in gaming stuff.

>world of tanks twice

it's a rebranded Instinct pro GPU; it's not a good choice for gaming

Attached: 04e72e65f35ad1d3546afe3431e1cc96d698772dbc9eb381446a450f60591a7d.gif (197x197, 874K)

not even a cash grab, but more like a halo high-end product needed to keep the AMD brand in the spotlight; the margins in the pro/enterprise market are much higher

Greens did literally the same with the TITAN and ti cards. But that was okay because leather jacket man's dick is more delicious than Lisa's.

The 1080ti is the rebrand of which pro card?
The Titan Xp is the rebrand of which pro card?
The 2080ti is the rebrand of which pro card?
The Titan RTX is the rebrand of which pro card?

Attached: GP100Chip.jpg (1252x1234, 359K)

I so love that anything even one day in the past is completely gone and forgotten, so nvidia literally can never do anything wrong, at least in the eyes of the mindless drones and unpaid shills. They totally didn't bribe and threaten the devs of 3DMark to skew the benchmarks depending on which brand you use, so theirs comes out better. Totally didn't happen.

>I hope this is temporary like with the PhysX meme.
Nope. Nvidia's current implementation, maybe, but after seeing it implemented in Minecraft, I'd be lying if I said ray tracing isn't the future. This is literally the next leap in graphics technology we've been waiting for.

youtube.com/watch?v=54kPNv4s-5M

Attached: Minecraft-con-SEUS.jpg (1240x698, 317K)

the blue pill is the red pill for this hunk of trash

for now it's an absolute marketing meme, just like PCSS realistic traced shadows, which everyone has already forgotten about
after all, it's all down to engines and game implementation, and we won't see it until mainstream GPUs (think 1050/1060-tier budget cards) can run it at playable framerates

Attached: tom-clancys-the-division-shadow-quality-001-nvidia-pcss[1].png (1920x1080, 3.32M)

>Undervolting is a one click task.
Which makes it even stranger that it isn't done from the factory.

They overvolt it to get more performance out of it, brainlet. AMD cards are very efficient up to a certain point, beyond which they simply can't handle it anymore. By default all cards are set way beyond that point to be competitive in gaming performance. You can still choose lower temps, though, by sacrificing that 4% of fps.
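The tradeoff here can be ballparked with the usual dynamic-power rule of thumb, P ∝ V² × f. The voltages and clocks below are purely illustrative, not actual Radeon VII operating points:

```python
def relative_power(volts, clock, ref_volts, ref_clock):
    """Dynamic power relative to a reference operating point, using P ~ V^2 * f."""
    return (volts / ref_volts) ** 2 * (clock / ref_clock)

# e.g. undervolting from a hypothetical ~1.10 V stock to ~0.95 V
# while giving up ~3% of clock speed:
p = relative_power(0.95, 0.97, 1.10, 1.00)
print(f"{p:.2f}")  # ~0.72: roughly a quarter less power for a few percent of fps
```

Because voltage enters squared and clock only linearly, shaving voltage buys far more power savings than the small clock (and fps) loss costs, which is why factory overvolting looks so inefficient.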