Oh NONONONONONONO

Oh NONONONONONONO

*breathes in*

HAHAHAHAHAAHAHAHA

Attached: Screenshot_20190114_182044.jpg (1080x431, 64K)

A non-gayming card doesn't support a feature primarily useful for gayming. Colour me surprised.

Why would people use HDMI over DisplayPort? That's like still using Mini USB over USB Type-C.

>Using HDMI on your desktop

>Why would people use HDMI over Displayport?
Because AMD FreeSync is so great that it causes random monitor blackouts during gaming for a lot of users when running FreeSync over DP

only if your monitor has a low freesync range

Wtf, I'm 100% an AMD fan, I can't fucking believe this is happening, fucking fucks, I hate my fucking life now. What the hell am I going to do

>40-144Hz is a low Freesync range

Who knows user

;)

Attached: hdmi-photo-1000x500.jpg (1000x500, 26K)

>Non-gaming card
>No FP64 support
Pick one.
No, actually, pick none.

Ayyyyyymd

What are you doing

Attached: Screenshot_20190114_183012.jpg (802x681, 128K)

HDMI 2.1 (48 Gbps) has more bandwidth than DP 1.4 (32.4 Gbps)

>Gaming card
>HBM2
No u

>hdmi for anything without audio

Because no high-end 4K monitors currently use anything more than DisplayPort 1.2,
meaning you don't have the bandwidth for a full 4K HDR 10-bit 4:4:4 signal.

That requires DP 1.3 or 1.4, or HDMI 2.1.

So far you can basically count the monitors with DP 1.3/1.4 on one hand, and they're expensive, all $800 and up.

Attached: 2019-01-14 13_35_21.png (786x632, 122K)
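
If anyone wants to sanity-check that, here's a rough back-of-the-envelope Python sketch. The link numbers are approximate effective payload rates after encoding overhead, and the timing assumes stock CTA-861 4K blanking (4400x2250 totals), so treat it as ballpark, not gospel:

[code]
# Rough check: does uncompressed 4K60 10-bit 4:4:4 fit in common link budgets?
H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60    # CTA-861 4K60 totals (594 MHz pixel clock)
BITS_PER_PIXEL = 3 * 10                       # RGB / 4:4:4 at 10 bits per channel

required_gbps = H_TOTAL * V_TOTAL * REFRESH * BITS_PER_PIXEL / 1e9

links_gbps = {                                # approx. effective payload bandwidth
    "DP 1.2 (HBR2)":     17.28,
    "DP 1.3/1.4 (HBR3)": 25.92,
    "HDMI 2.0":          14.40,
    "HDMI 2.1 (FRL)":    42.67,
}

print(f"4K60 10-bit 4:4:4 needs ~{required_gbps:.2f} Gbps of payload")
for name, budget in links_gbps.items():
    verdict = "fits" if budget >= required_gbps else "does NOT fit"
    print(f"{name:<18} {budget:5.2f} Gbps -> {verdict}")
[/code]

Running it shows roughly 17.8 Gbps needed, which only DP 1.3/1.4 and HDMI 2.1 clear.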

>In Hardware
What the fuck does that even mean? What, is it gonna support it in software?

Wow and the latest dp is 64gbps you fag

Hey, with that 16 GB of 1 TB/s memory they can put some killer neural networks in your games, so you can watch your ass get destroyed in 4K by a higher being.

How do you know? The next-gen DisplayPort spec hasn't been finalized yet, so there's no next-gen DisplayPort tech yet.

Sure, it will likely land somewhere around 60 Gbps, but nothing is set in stone until the spec gets published and finalized. And since it hasn't even been finalized yet, you can be damn sure no products currently exist that use it.

Hell, DP 1.3 and DP 1.4 still aren't in almost any existing products outside of the GPU side.

Look for DP 1.3 and DP 1.4 monitors and you'll find they basically don't exist.

Awesome dude, was hoping for deepfake integration in TF2 if I'm honest but I'll take that

AYYMD IS FINISHED & BANKRUPT

AYYMDPOORFAGS CONFIRMED ON SUICIDE WATCH

I like these because there's still no mention of SR-IOV.
Please, please let this be the card.

Attached: Reaction - Wenge smiling through the horizon.jpg (1280x720, 78K)

Works in my machine™

It's supported by the MI60 and MI50, which are what the Radeon VII is based on.

But I have a hunch they'll keep SR-IOV support to the workstation class cards that have workstation class pricing.

Please no. It just looks like repurposed MI50s to calm shit down after the laughable RTX series. They already crippled the FP64 performance, let us have at least this.
The most disappointing thing about Vega was that it did not include SR-IOV. I don't give a shit if it's the second-best-performing card, I just want a decent card that lets me run VMs with GPU passthrough off a single card.
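
If they do leave SR-IOV on, a quick way to check from Linux is to poke the sriov_totalvfs attribute the kernel exposes in sysfs. Rough sketch below; the PCI address is a placeholder for whatever lspci shows for your card, and it obviously assumes the driver actually advertises the capability:

[code]
# Minimal SR-IOV capability check via Linux sysfs.
# The PCI address below is a placeholder, substitute your GPU's address from lspci.
from pathlib import Path

gpu = Path("/sys/bus/pci/devices/0000:0b:00.0")
totalvfs = gpu / "sriov_totalvfs"     # only exists if the device advertises SR-IOV

if totalvfs.exists():
    print(f"SR-IOV exposed: up to {totalvfs.read_text().strip()} virtual functions")
    # actually enabling VFs (as root) is a write to the sibling sriov_numvfs file
else:
    print("No SR-IOV capability exposed for this device/driver")
[/code]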

Has HDMI 2.1 certification even been released yet? I don't see any 2.1 products anywhere. Hopefully these cards can at least be updated to support adaptive sync on HDMI 2.1 televisions.

arstechnica.com/gadgets/2019/01/lg-announces-its-2019-oled-tv-lineup-plus-an-8k-monstrosity/

well you could be in luck

>SR-IOV is not on the disabled list. But regardless, we'll be sure to confirm it for the review.
twitter.com/RyanSmithAT/status/1084174168835289088

All TVs this year from Samsung and LG

pls
pls

Attached: High impact sexual violence...jpg (400x489, 56K)

Can AMD even do deepfakes?

Probably doesn't, but it's cool that you think that, because it goes to show that *sync is a meme.

LG's newest line has 2.1

> 4k HDR 10-bit 4:4:4 60Hz
So you be sayin DP 1.2 doesn't support these parameters? How about I prove you wrong?

Are there any out yet?

And I know that, at minimum, their new 5120x2160 5K ultrawide only has DP 1.4 and Thunderbolt 3 with HDMI 2.0, not 2.1.

Wow, fuck this shit. Navi had better have HDMI 2.1 plus full fixed-function hardware decoding for VP9, AV1 and HEVC at the highest profiles.
Sick of their hybrid bullshit decoding.

Attached: please do something.png (425x520, 229K)

You can go ahead and try.

DP 1.2 tops out at 17.28 Gbit/s of effective bandwidth.

4K HDR 10-bit 4:4:4 at 60 Hz is 22.28 Gbps on the wire (the 594 MHz CTA-861 pixel clock × 30 bpp, plus 8b/10b encoding overhead).

We can't even get that on HDMI 2.0b

>I can't run N64 games at 4K 144Hz at 10bit color

Nobody cares.

You can have 8-bit 4K 60 Hz 4:4:4,
or you can have 10-bit 4K 30 Hz 4:4:4,
or you can have 10-bit 4K 60 Hz 4:2:2.

You can't have 10-bit, 60 Hz, and 4:4:4 all at once, though.

Attached: 2019-01-14 14_49_07.png (307x252, 9K)
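
That pick-two-of-three tradeoff falls straight out of the bits-per-pixel math. Quick sketch, again assuming CTA-861 4K totals (4400x2250) and HDMI 2.0's roughly 14.4 Gbps effective budget, so the numbers are approximate:

[code]
# Which 4K modes fit in HDMI 2.0's effective budget?
HDMI20_EFFECTIVE_GBPS = 14.4          # 18 Gbps TMDS minus 8b/10b overhead, roughly

def bits_per_pixel(bpc, chroma):
    # 4:4:4 = 3 samples/pixel, 4:2:2 = 2, 4:2:0 = 1.5 (averaged over a 2x2 block)
    return bpc * {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]

def required_gbps(refresh, bpc, chroma, h_total=4400, v_total=2250):
    return h_total * v_total * refresh * bits_per_pixel(bpc, chroma) / 1e9

modes = [
    (60,  8, "4:4:4"),
    (30, 10, "4:4:4"),
    (60, 10, "4:2:2"),
    (60, 10, "4:4:4"),   # the one everyone actually wants
]

for refresh, bpc, chroma in modes:
    need = required_gbps(refresh, bpc, chroma)
    ok = "fits" if need <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K{refresh} {bpc}-bit {chroma}: ~{need:.2f} Gbps -> {ok}")
[/code]

The first three modes squeak in; 10-bit 4:4:4 at 60 Hz needs ~17.8 Gbps and doesn't.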

Looking at the repo, it doesn't appear to be the case; it's either CPU-only or CUDA-accelerated, which sucks big time.

They were launched at CES along with a few others, but most were not full-fat 2.1 like LG's. LG will support VRR, auto low latency mode, eARC, quick media switching, quick frame transport, and 4K at 120 Hz and 60 Hz full chroma.

Attached: file.png (568x509, 35K)

LAUNCHED or announced?

If launched, post some links to buy

Oh no no no no no...

Attached: Screenshot (20).png (308x325, 32K)

Announced, sorry. But it won't be long before you can buy one.

You can't get it on HDMI 2.0, that I agree with.
> file too large
4channel should step up its game.
imgur.com/8qTR8qK

Yeah, a GPU that came out in mid 2018 compared to a GPU coming out in 2019 when TVs and monitors are currently releasing with HDMI 2.1

Great comparison.


AMD had the chance to not fuck it up, but look what we get.

Every monitor I own and every monitor at my company has DisplayPort inputs. Are you retarded?

I specifically said DP 1.3 and DP 1.4 didn't I?

Are you a retard or do you just want more (you)'s?

I like AMD, but this is pretty shit I must admit.

Are you running in HDR?

That's the
>RTX 2060
specs.

Do you think a GPU that takes years to develop can just have a new feature tacked on at the last minute?

>Every AYYYYMD fanboy is going to swarm this thread and say "WEEE DIIIDD NOTT WANNNT ITT ANNYYWAYYY!!!!!"

Maybe a bit of column A and a bit of column B. Can you answer my question now?

All this despair and I'm just chilling here with a 1440p 144 Hz FreeSync monitor. I'm going to grab the Radeon VII and hopefully get some shekels for the Vega 64 after selling it.

Attached: it can't be helped - Haruhi.png (664x602, 315K)

No HDR enabled.

The DP 1.2 spec literally doesn't support HDR.

Attached: 2019-01-14 15_00_37.png (838x453, 34K)

Oh, I don't have HDR, my bad. But HDMI 2.0 doesn't support 10-bit without HDR, so I'm still pro-DP.
Sorry, I wasn't paying attention to the fact that you were specifically discussing HDR rather than wide gamut.

What question?

None of the monitors you're talking about support DP 1.3 or DP 1.4

HDMI 2.0 can support 10-bit on its own.
10-bit is a requirement before you can even have HDR.

Only at 4K 50 Hz with 4:4:4.

4K 60 Hz 10-bit on HDMI 2.0 requires 4:2:2, which is fine for a TV but not acceptable for a PC monitor.

who uses double precision for deep learning?

non-free trash
DP supports audio

Aha, so that was probably why I couldn't get 10-bit to work out of the box. Thanks.
I heard that HDR can be faked. Like, HDR is a description of nits across the screen, so standard-range TVs can take that info and compress the HDR values down to their 8-bit range. Entry-level LGs do that.

Maybe don't buy shitty screens? I used a Samsung and an LG screen and neither had a single issue. The LG even had quite a narrow range. My friend is using one of those more typical ASUS gayming screens and that works fine too. The only issue I have ever read about was some early AOC models becoming completely white-screened, but that was only present in early production models.

Why would they be retarded and sell a product that would cannibalize their "true" pro-grade cards?

>No HDMI 2.1
>No FP64
>Probably no SR-IOV
>No raytracing
>No DLSS
>300 watts
>No custom models planned
>Same price as an RTX 2080 (non-FE)
Reasons to buy this heap of junk? None. And this is coming from someone running a stock Vega 56 on Vega 64 firmware.

I understand the limitations, I was just clarifying that 2.0 can support 10-bit on its own. HDR doesn't have to be enabled to get 10-bit.

>Shitty screen
>$400 screen, no reviews mentioned anything about any issue
>Worked fine with DP on my GTX 970
>Switch to AyyMD and activate Freesync
>Random black screens
>Switch to HDMI
>No black screens
Really makes you think

>Vega 64
Of course you wouldn't upgrade, I wouldn't at that point.
I have Polaris and am seriously considering it. I was willing to pay Vega FE prices if it had SR-IOV; $700 is a steal in comparison if the feature is enabled.

On your points:
>>No HDMI 2.1
I don't give a fuck about HDMI 2.1, I don't have any 4K screens and won't for several more years. It still sucks that it doesn't have it though.
>>No FP64
at least it has good compute, very good for the guys wanting compute for less than $1k
>>Probably no SR-IOV
hopefully has it, but I'm sure they're going to disable it too
>>No raytracing
raytracing, lol. I sure love losing framerate for no reason. too early to adopt, none of the current cards will amount to anything in that regard.
>>No DLSS
lol
>>300 watts
unless you live in a third world shithole, that's negligible. If heat is your concern, then that's a different story
>>No custom models planned
That's a shame.
>>Same price as an RTX 2080 (non-FE)
You mean the same or lower price than custom 2080s, yes? There are plenty of air-cooled 2080s at the $800 mark on Newegg. I'd say they're both the same price.

lack of FP64 support is a bigger issue for professional use than a card like the RTX Titan not using HBM.

>the new proprietary port isn't supported
>user laughs
this new Jow Forums is sad

Attached: 1484067692219.jpg (444x460, 112K)

It's just not very inspiring, even if I were thinking of upgrading from an RX 480, for instance.

There's more than one HDR spec, but they all have requirements for contrast, max brightness, and color gamut. Most monitors don't go past 100% sRGB. Some of the pro monitors get close to 100% DCI-P3, which is ideally what you should have for HDR, and only a handful hit 100% Adobe RGB.

Well, duh. It's just a nerfed MI50; it was never a gaymen card. Wait for Navi if you want that.
I haven't noticed many problems running 1440p on older games, or on newer ones with lower quality settings, but a VII would be much better in that regard.

>no fp64

being halved means no support? lol

You cared enough to reply :*

Attached: 1543846629189.jpg (545x526, 233K)

But NVIDIA but NVIDIA but NVIDIA

Nice cope

>So far you can basically count the number of monitors with DP1.3/1.4 on one hand. And they're expensive, all at least $800+

Why is this, user? DP 1.3 is more than four years old at this point, yet it's near impossible to find any products that support it, despite it having huge benefits over 1.2. What's the hold up?

it's the 2060
COPE

lmao, 2060 is still based on a GPU that came out in 2018.

The Radeon VII is the first consumer 7nm GPU.

Radeon VII is based on the mi60 that came out in 2018
COPE

Maybe it's just faulty? If an advertised feature doesn't work properly, why not simply return it or insist that the manufacturer fix it? That's what consumer protection laws are for. As I said, a few of the early screens suffered from issues that could be fixed by sending them in for free, but that isn't exactly AyyyMD's fault...

The HDMI licensing cartel.

>despite it having huge benefits over 1.2
It doesn't, except for the few displays that require it. There are almost no 4K 120 Hz screens that would need DP 1.3, probably because there's no content for it: no video, and games aren't really playable at that rate.

FreeSync works without issues over HDMI, so clearly the monitor isn't faulty.
Google that black screen issue; it's been there since FreeSync came out and AyyMD never solved it.

>that came out in 2018
the MI60 and MI50 still aren't out.

proof it

how about you proof your spelling

Well, the MI60 was supposed to ship sometime in Q4.

The MI50 was supposed to be Q1 2019.

So far I've seen no evidence that either has started shipping. No one will sell you one anyway.

of course not, it's not a retail card
but if you contacted AMD you could've had that card for at least a month

>could've had that card for at least a month
Proof?

There was a single question in my post. Are you that bad at reading that you can't identify a question mark?

Then the answer is no, shit-for-brains, but you've clearly demonstrated that you yourself are at least moderately retarded.

some already have them in racks

Attached: AMD_Radeon_Instinct_MI60.jpg (1920x1080, 506K)

lol, that's the stock image AMD issued for marketing.

Show me proof of customers actually getting cards.

Not just a datacenter that was beta-testing hardware, or internal AMD testing.

fine, you got me
but I'm pretty sure I read somewhere that some partners already have them

And you keep falling for it. Thanks for the yous.

1.3 can't do 4k60

YOU started lying when you said the words "my company"
You stupid dog cunt.