/pcbg/ - PC Building General

>Assemble a part list
pcpartpicker.com/
>How to assemble a PC
youtu.be/hGiAfMoYEjI?t=92

Want help?
>State budget & CURRENCY
>Post at least some attempt at a parts list
>List your uses, e.g. Gaming, Video Editing, VM Work
>For monitors, include purpose (e.g., photoediting, gaming) and graphics card pairing (if applicable)

CPUs
>Athlon 200GE - HTPC, web browsing, bare minimum gaming (can be OC'd on mobos with the right BIOS)
>R3 2200G - Minimum 30-60fps gaming. The 2400G/3400G may be worth it to make a CPU upgrade less likely when adding a dGPU later
>R5 2600 - 60fps+ gaming CPU with great value
>R5 3600 - Great gaming CPU
>R7 3700X - Overkill gaming CPU
>Wait for Threadripper gen3 - Extreme overkill gaming with its larger cache
>R7 1700 - Budget production
>R9 3900X - Professional tasks

RAM
>Do NOT use a single DIMM. 2 sticks for a typical dual channel CPU
>CPUs benefit from fast RAM; 3200CL16 or Micron E-die ("AES" in P/N) recommended
>AMD B & X chipsets and Intel Z chipsets support XMP

GPUs
1080p
>RX 570/580 8GB - Can be found on sale/used for cheap. Look for 570s which are >1240MHz boost
>GTX 1660 Ti / Vega 56 - higher fps / more demanding games; only worth it on sale, as the normal price is inflated
>RX 5700 - higher FPS
1440p
>RX 5700 - standard, 70-100FPS+ gaming
>RX 5700XT - higher FPS
2160p (4K)
>RX 5700XT - budget option. Upscale with RIS
>2080Ti - best for 4K, but poor value

>RX 570/580 stock is becoming limited as RX 5600 launch approaches

General
>Yes, adaptive sync (g/free-sync) is important for gaming
>HDDs are defunct except for servers, NAS, and sub-$350 builds; SSDs are cheap now
>NVMe isn't better than SATA SSD for gaming
>QLC is for storage, NOT for main drive
>Don't use Speccy
>Beware sites which rank CPUs by arbitrary, obfuscated scores (eg userbenchmark, passmark, cpuboss), and comparisons which only use averages and not 1% minimums nor framegraphs
>AM4 VRMs, Monitors & SSD Guide under "more"

more rentry.co/pcbg-more

Attached: comfy.jpg (4240x2832, 2.78M)

Nice shill OP.
Let's hope nobody actually falls for this shitposting.

ATTENTION: Graphics card prices are excessive by historical standards; therefore, consumers should delay or completely forgo any midrange to high-end graphics card purchases. The gouging has two root causes: lack of market competition and shortage/"new normal" pricing during the mining heyday.

Attached: pgpfbeDOuV1v05rsfo1_1280.gif (1152x1152, 2.55M)

5700xt owner here
The upscaling option is shit; it looks off to the point you might as well risk it with HDMI

Hey boys
I have an ASUS Prime B350M-A motherboard that I got with my first gen Ryzen and I bought a 3600 thinking I would sort the memory out later.

I'm looking for 3000+ memory that will work with my board and I'm shit out of luck with the QVL they provide on the site (the ones that say 3200+ are $400+ or nonexistent).

dlcdnets.asus.com/pub/ASUS/mb/SocketAM4/PRIME_X370-PRO/Memory-QVL.pdf

Is this list all-inclusive, and is there anything I can do? Brainlet here.

Good job, user

Attached: 1364810661637.gif (150x148, 996K)

Bear in mind that the memory validation for this board was done in 2017 and probably never updated since. Newer memory probably works fine but they're not going to waste money going back and updating QVL on old products.

>Graphics card prices are excessive by historical standards
Stop lying. Just because you say this shit with no sources or facts to back it up, doesn't make it true. If you really *~believe~* it to be true, you're delusional.

1080 launched at $599 US, but it was a COMPLETE LIE just like the RTX launch MSRP, and you could not find cards for less than $700-$800.
1080Ti then launched at $699, again a lie and you couldn't find them for less than $800-$900, in 2017.

The price/performance of Navi is relatively good. 5700XT is only 5-10% worse than the 1080ti on average, for $400.
That's a lot better than anything else has ever been for that performance. 1080s and 1070Tis dropped to around that price, but are worse cards. It is a complete LIE to claim that 1080s price-dropped to $400 was a better deal than a 5700XT is today.

Is there a list of non-shingled HDDs around here?

>2070s evga ex ultra expected delivery on wednesday
>hurricane dorian expected impact on wednesday
I hate this industry. I back-ordered this shit on Aug 20th

RTX 2060 launched at $349. Meanwhile, the 1060 launched at $249.
>inb4 the successor is the 1660 Ti retard

>Midrange card in 2014: $250
>Midrange card in 2019: $400
Yeah it must just be because of that 9% inflation rate we've been having.
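
For anyone who wants to check the sarcasm, here's a trivial sketch using only the numbers from the post above; the 9% cumulative inflation figure is the poster's own, not an official CPI value:

```python
# Numbers taken from the post above; ~9% cumulative 2014->2019 inflation is the
# poster's figure, not an official CPI calculation.
price_2014 = 250
cumulative_inflation = 0.09
inflation_adjusted = price_2014 * (1 + cumulative_inflation)
actual_2019 = 400

print(f"inflation-adjusted 2014 midrange price: ${inflation_adjusted:.0f}")      # ~$273
print(f"increase beyond inflation: {actual_2019 / inflation_adjusted - 1:.0%}")  # ~47%
```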

I honestly don't know why people even bother trying with those. Why get a monitor at a resolution you can't use to begin with?

There were rumors that the 2060 was supposed to be priced at $250 like the 1060

AMD deserves to BTFO those greedy fucks. And they would have gotten away with it too if they had just priced their shit better and kept decent stock

>all this bullshit
>just to shill Navi

>1080Ti then launched at $699, again a lie and you couldn't find them for less than $800-$900, in 2017.
Your """"belief""" aka made up shit that those cards were never available at MSRP is just wrong. Do you really think I can't just use pcpartpicker to expose your lies? He's an example of a 1080Ti selling for $712 in Oct of 2017

>The price/performance of Navi is relatively good.
Literal marketing, small Navi should be 250USD at most considering previous trends. It's a Polaris replacement, in die size, in number of SPs, and also in the fact that it's only previous X80 performance, aka a $200 part like the RX 480 was, which was just below GTX 980 performance

>It is a complete LIE to claim that 1080s price-dropped to $400 was a better deal than a 5700XT is today.
More made up shit, because no one is saying this. The facts remain:

High end
>780Ti - 699USD
>980Ti - 649USD
>1080Ti - 699USD
>2080Ti -1200USD

Mid range
>980 ~ 1060 6GB - 250USD
>1080 ~ 2060 - 350USD

It's not difficult to deduce that there's massive price gouging going on from both manufacturers right now.

Attached: Screenshot_2019-08-31 Zotac GeForce GTX 1080 Ti 11 GB Mini Video Card(1).png (1322x778, 137K)
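
Laying the jumps out explicitly, using only the MSRPs listed in that post (treat them as the poster's figures, not independently verified launch prices):

```python
# MSRPs (USD) as listed in the post above.
def jump(old, new):
    """Percentage increase from one generation's MSRP to the next."""
    return (new - old) / old

print(f"high end: 1080 Ti ($699) -> 2080 Ti ($1200): +{jump(699, 1200):.0%}")    # ~+72%
print(f"midrange: 1060-tier ($250) -> 2060-tier ($350): +{jump(250, 350):.0%}")  # +40%
```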

>tfw my uncle brought home and gave me an old HP, probably used in a company/office
this is my first time seeing a Slot 1 CPU
it seems the CD drive doesn't work, or at least doesn't seem to get power/be recognized by the mobo. How can I find the source of this issue? I don't think I have another IDE drive anywhere in my house, and since something like this can't boot from USB and I don't have any useful diskettes, I'm stuck here
hopefully I can get this up and working as an old gayming machine

Didn't say Nvidia pricing wasn't retarded.
Yeah, the 2060 was supposed to be $250. Though I WOULD argue the 1660Ti is the proper 1060 successor, yes, just like how the 1070 was the 980 successor, but the 1660Ti should have been $230 and the 1660 like $190 or so.
The 5700/XT is priced well since the initial pre-launch price drop. Before that, I wasn't recommending people even wait to see how they are.

Yes, 2060 pricing was bad. Less VRAM, yet it cost just as much as the slightly faster 1080 did.
But the 5700 is fine. It's FASTER than the 1080 and 2060, and has an appropriate amount of VRAM for that price. It's also even better perf/watt than the already very efficient 1080 was.

Got a super used PC from a friend for cheap. Graphics card is a GTX 650. Played some Skyrim and the frame rate is pretty bad.

Would an EVGA 1060 Mini 6GB Super Clocked for $175 be worth it? Found one on Craigslist but I'm a little iffy about buying used electronics. I just don't want a shitty framerate but I'm not sure how to upgrade.

>just like how the 1070 was the 980 successor,
It wasn't, a 1070 is on average better than a 980Ti, just like a 970 was as good as a 780Ti

>Yeah, the 2060 was supposed to be $250.
>But the 5700 is fine.
Wow, so a 2060 should be 250USD including gaytracing, but a blower card that's 6% faster on average is great at an MSRP of 350USD and even more expensive with a proper cooler

This is your thought process if you're a shit-for-brains brandwarrior

Attached: 1440p condensed.png (985x636, 188K)

Post your specs using speccy

And no, a 1060 $175 is way too expensive.

New series of cards are supposed to be faster and a better price/perf than what they supersede.
Navi does that. The 2060 and 2070 didn't. But what you keep fucking missing is that the 1080 and 1070 didn't either. The prices you keep citing for the 1080 and 1070 were FAKE. The FE launched at around $100 higher than the supposed MSRP, despite being a shitty blower. Partner cards were also at those prices. Vega also had a fake MSRP, with that price only being for preorders and the real MSRP being $100 higher. They only reached MSRP after months, and took years to really get to reasonable prices.

No, get an RX 570. You can find them for around $130.

Anyone here have a Be Quiet case?
Reviews seem mixed on whether the case metal is bad or not

Been influenced by too many minimalism memes lately and I've been considering switching from my current build (i5 2500K, 750 Ti, multiple HDDs) to a small Ryzen APU build (probably with an ITX case). The performance would probably be worse, but I'll be saving a bit of space and won't waste that much time on games and shit

Has anyone done similar, and if so, how'd it turn out?

>FAKE MSRP
The only thing fake is your memory. Only a few posts up I completely disproved your "memory" aka made up bullshit about the cards never selling for MSRP.

>new series of cards are supposed to be faster and better price/perf of what they supercede.
By that logic, Turing must be a great arch. But I guess you have to admit that if you're going to shill an AMD card at almost identical price/perf

Both sides are trying to price gouge. AMD is just being a little less greedy

If they had priced the 5700 for $300 and the 5700XT for $350 they would have forced NVIDIA to drop their prices to what their cards were actually worth

>AMD is just a little less greedy
Technically true, but part of the Nvidia price is the far better drivers that actually function on older titles and non-AAA shit and don't cause random issues for tons of people.
I'd always suggest the 2060S over the 5700 and the 2070S over the 5700XT, simply because when you buy one of those you can actually be certain that it'll work properly with everything and not randomly fuck with you.
Radeon is chinkshit.

>AMD is just being a little less greedy
You have to understand that they aren't. They're basically price-matching Nvidia. The 5700 should be $250 with an aftermarket cooler, especially considering how basic it is without any attempt at ray tracing. Not to mention the other poster's point about inferiority in some actual PC games like Anno 1800 or that recent game from the creators of Myst.

>forced NVIDIA to drop their prices to what their cards were actually worth
As previously stated, the 2060 is only worth $250 based on previous trends, maybe a little more for memetracing. So your estimate of $300 or more is still too high, probably because you saw the prices during the mining shortage and you think that's normal.

Anyone have any info on the Gigabyte 5700XT AIBs?

out of stock

???????? Your post confirms it: $699 MSRP, yet you couldn't find a card at $699. That one was $810 at the time, and the lowest it ever hit was $710, which it WAS NOT at right at launch.

Navi can't be priced as low as some retards like this one spammer think.
It costs over $200 to manufacture the 5700 between AMD and the board partners and shipping costs. They can't sell it at $250. Those margins would be too small for AMD, the board partner, and retailer to all make money.
Navi's MSRP is reasonable. There is still a bit of skin left for sales in the future, while still being a decent price for those who buy now.
Yeah see, this is the same complete mouthbreathing retard who thinks the 5700XT should have been $250 because "muh die size" even though it costs over $200 to manufacture all-in.
The 1060 was 4.4 Billion transistors and 200mm^2. Sure the TSMC process is very dense, but it costs twice as much per mm^2, so that 255mm^2 chip is like the cost of a 500mm^2 chip. And it's 10.7 billion transistors, you fucking retard.
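
For what it's worth, the per-mm² argument is easy to sketch out. This is a rough estimate only: the wafer prices and yields below are illustrative assumptions (not real TSMC quotes), and the die sizes are the commonly reported ~251mm² for Navi 10 and ~200mm² for the 1060's GP106:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard rough gross-die approximation for a round wafer."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, yield_rate):
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Illustrative assumptions only: a 7nm wafer costing roughly twice a 16nm wafer.
navi10 = cost_per_good_die(251, wafer_cost_usd=10_000, yield_rate=0.7)
gp106  = cost_per_good_die(200, wafer_cost_usd=5_000,  yield_rate=0.8)
print(f"~${navi10:.0f} vs ~${gp106:.0f} per good die "
      f"(GDDR6, board, cooler, shipping, and margins come on top)")
```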

why would you even consider those? Gigabyte is generally trash.

>mining explosion
>private shops gouge prices over msrp because of demand
>mining ded
>hurr prices should still be over msrp because that's what they were before

>I'd always suggest the 2060S over the 5700 and the 2070S over the 5700XT, simply because when you buy one of those you can actually be certain that it'll work
I see you're still trying to peddle your baseless FUD Jensen!
>it just works

Don't expect prices to improve any time soon
especially with cards selling out within a few days of restock

A lot of it requires leg work.
I had to wake up first thing in the morning to nick my card

>$699 MSRP
Wow, at $712 in 2017, it was $13 more, or exactly 1.8% over MSRP for an aftermarket cooler. You're embarrassing yourself. You literally said:
>1080Ti then launched at $699, again a lie and you couldn't find them for less than $800-$900, in 2017.
and I posted proof that that's not the case

>It costs over $200 to manufacture the 5700 between AMD and the board partners and shipping costs.
Proof? Nah you can't provide proof, because that's obviously just more of your delusional, wishful thinking. Pic related, a worst case scenario for the cost of a big Navi chip.

>mouthbreathing retard who thinks the 5700XT should have been $250
You're god damned illiterate scum, making up more bullshit. I have only ever said small Navi should have been $250.

Just give up, you're the only one who's stupid enough to believe the things you post

Attached: 5700XT cost.png (1920x1080, 348K)

Any recommendations for m.2 heatsinks?

Cryorig Frostbit

Will Sapphire be releasing a Nitro version of the RX5700? Or is that line dead?

Of course they will, no idea when though.

I may purchase a 2200G or 2400G as the 3200G and 3400G seem to be a waste of money. I suppose they include a Wraith Stealth CPU cooler but it seems inadequate. I'm not sure what to replace this CPU cooler with.

Attached: 1567314607560.png (741x568, 45K)

2600

1080 prices were above the FAKE MSRP at launch, before mining existed. You're retarded.
Stop citing MSRP when Nvidia gave fake MSRPs with both Pascal and Turing, you fucking retard.

>$71 per die
>implying the cost of a graphics card is only the GPU die itself
>not counting R&D costs
>not counting GDDR6 costs
>not counting board partner's other costs
you're clearly trolling at this point. You can't be this stupid. People have been correcting you on your bullshit lies for WEEKS yet you keep spouting the same lies.

They are releasing a 5700 XT Nitro this month.

2400G and arctic freezer 34 if you want complete overkill. gammax 400 if you want something cheap.

What's a good aftermarket cooler for a 3700x?

Attached: Hopegun.gif (640x360, 3.43M)

I literally bought my EVGA 1080ti in August 2017 for just under $700US from newegg, it was only a couple months later that prices shot up

what's your budget? If you want overkill, nh-d15 or dark rock pro 4

$50-80 USD. I was looking at either of those, but they look fucking huge so they intimidate me a bit.

You're fucking awful at bullshitting. I got my 1080 at MSRP on launch. Prices rocketed afterward.

You're god damned delusional. You can't just keep yelling FAKE msrp when you've been proven without a doubt to be the faker yourself. All cards were close to MSRP after the initial buying rush subsided, as is the case with any launch. A 1080Ti selling at $712 with an aftermarket cooler is most definitely considered to be at MSRP.

>implying the cost of a graphics card is only the GPU die itself
Not at all. There are lots of other costs going into a graphics card, which is why historical pricing is important. Polaris is analogous to Navi for many reasons, including die size and number of SPs, and Polaris had retail pricing of around 200USD. I'm not saying manufacturing and R&D costs shouldn't be included, because they should: small Polaris had an MSRP of 170USD, and I'm saying small Navi should have an MSRP of around 250USD, a whopping 50% increase.

>I literally bought my EVGA 1080ti in August 2017 for just under $700US from newegg
Be prepared for the inevitable liar, faker, shill spam he'll vomit at you.

>pic from reddit
FUCK OFF

>170USD
179*, so it would be a 40% increase

if your case can fit them, they're all you'll ever need for any cpu cooling. Bigger is better because it means more mass for heat to dissipate into. The DRP4 is reasonably easy to install from experience, and it's silent all the time - you'll only ever hear your case fans or gpu fans

Will it/should it fit into a Meshify C then? That's the case i'm looking at. I imagine it would but I might as well ask.

yeah that'll fit fine. Most cases will list max CPU cooler height in their specs, the DRP4 and D15 are both 160-165mm ish, the meshify C supports up to 170mm
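
Since cooler clearance comes up every thread, a trivial check using the approximate heights from the posts above (spec-sheet numbers vary by a couple of mm, so treat these as ballpark):

```python
# Approximate heights in mm, per the posts above; always confirm against the spec sheets.
coolers = {"Dark Rock Pro 4": 163, "Noctua NH-D15": 165}
case_max_cooler_height = 170   # Fractal Meshify C, per the post above

for name, height in coolers.items():
    verdict = "fits" if height <= case_max_cooler_height else "does NOT fit"
    print(f"{name}: {verdict}, {case_max_cooler_height - height} mm to spare")
```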

The 2400G is more than enough. The 2xxx series of APUs have Ryzen gen 1 based cores with 8 Vega units for the 2200G, and 11 Vega units for the 2400G.

The 3200G and 3400G have the same number of Vega units with the same performance, but their CPU cores are based on the 12nm 2000-series Ryzen. CPU performance will increase a little but not video performance. You're better off with the 2400G and a decent motherboard/cooler.

What’s the most cost efficient (

Attached: B38DF171-F5CF-4380-8F8B-D7C112B11222.jpg (600x421, 55K)

Stock does the job

Why used? Get something like a Ryzen 2700 cheap brand new along with a stout B450 motherboard.

How much should I pay for a used RX 580 8GB?

1700 + b350/b450 motherboard

no more than $100. You can get a brand new RX590 for $180 these days

arctic freezer 34 dual fan, fuma, mugen 5, le grande macho.

His own graph shows the cheapest card you could find wasn't $599 at launch. lmao.
Go show me a $599 card at launch.

Would any of those be better than a DRP4 or should I splurge for the DRP?

>his own graphs shows the cheapest card you could find wasn't $599 on launch
Oh yeah? Where did anyone post a 1080 price chart? Oh that's right, no one did, brainlet. I did however post a price chart that showed a 1080Ti at 712USD, basically at the MSRP of 699USD.

SAPPHIRE NITRO+ for $100
Cop or not?

80-100 for a dual fan model is fine

they'll average 1-3 degrees worse and 2-3 dB louder for about half the price.

Cheapest 5700 (Reference) 349€ @ Amazon
Cheapest 2060S 419€ (Ventus) @ Amazon
Cheapest 2070S 549€ (Reference) @ nVidia-shop

Not sure how many eurobucks I should shovel at the problem of "I just want to play my games at 1080p/144Hz without worrying about shit". I'm so close to just going fuck it, taking that blower 5700 and seeing how it goes.

see if you can get a 2060 non super for about 320

I may as well just get the 349€ 5700 for that

2060S is about 10% more powerful than a 5700, so normalized the 2060S should be at 385, disregarding the cooler. You're paying 35 extra for an aftermarket cooler and ray tracing. Personally I'd say it's worth it if you want to try out memetracing

Personally I'd say anything higher than a standard 2060 is overpowered for 1080p

Attached: Screenshot_2019-08-30 Control тест GPU CPU Action FPS TPS Тест GPU.png (720x455, 26K)
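
The normalization being argued about is just price divided by relative performance. A quick sketch with the Amazon prices quoted above and both of the contested performance deltas (10% per this post, ~5% per the replies):

```python
# Prices are the EUR listings quoted earlier in the thread; the performance
# deltas are the two figures anons are arguing over, not my own benchmark data.
rx5700_price, rtx2060s_price = 349, 419

for perf_delta in (0.10, 0.05):   # 2060S faster than the 5700 by this fraction
    normalized = rx5700_price * (1 + perf_delta)
    premium = rtx2060s_price - normalized
    print(f"at +{perf_delta:.0%}: perf-normalized 2060S price ~{normalized:.0f}€, "
          f"so ~{premium:.0f}€ premium for the cooler and ray tracing")
```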

meshify c or nzxt h500?

provided it wasn't a mining card sure.

This is a meme. Most mining cards were treated very nicely, even undervolted.

>2060S is about 10% more powerful than a 5700
>in these selected titles
Reality is it's about 5% faster on average

>very nicely, even undervolted.
not the ones I've seen. Only ever buy mining cards when they're from a friend/someone you trust. I've seen my fair share of ex-mining cards die premature deaths due to RAM module failure. It's a hit and miss thing.

2060S is only 5.1% better at a 20% cost increase over the blower 5700. Get the 5700.

Figure an "user" to post fake Russian benchmarks to be a retard.

I mined on a bunch of 1070s. Treated them nicely and didn't put them under more than 160W. No one who I sold them to was any the wiser.

>Treated them nicely and didn't put them under more than 160W.
yea well not everyone is as considerate as you. I've seen cards that were blatantly overvolted/memory clocked like mad because the owner didn't know what custom mining BIOS's were.

This. I've got absolutely no problem with buying mining cards as long as there's a saving, and as long as I can either bench/stress test them in person or am covered with a consumer protection if buying online.
Most of the FUD spewed about mining cards is from people who've never had anything to do with them but just cling to the popular opinion of "mining is bad mm'k"

Yeah, but it's a used card. Any gamer could have done the same thing

>fake
Let's just say you have no credibility with this word

Claiming things are FAKE: Debunkings:

His credibility is pretty much on par with Russian benchmark user tbqh

Russian benchmark user (me) has impeccable credibility, silly user. Perhaps you'd like to try to find a flaw?

>you should avoid buying chinkshit Radeon because Vega doesn't play some old games and Navi may not either
Pretty much on par with the AMDrone claiming
>you should avoid buying GeForce cards because you can't use dual monitors with different refresh rates
Where exactly did AMD touch you user?

>tomshardware.com/news/amd-ryzen-3000-cpu-sales-revenue,40287.html
>AMD Overtakes Nvidia in Overall GPU Shipments for the First Time in Five Years

Is there a magic RAM latency to aim for on AMD, specifically a 3700X?

3200 with 16-18-18-38 vs 3600 16-19-19-39

I've got a feeling I would notice no difference between either of them, but I'm really trying to squeeze out performance and make a well-tuned machine this go around. I know the CPU is overkill for many, but I actually do things that need the power. Starting at 32GB, I may actually end up at 64, so I want to choose well so it doesn't get mixed and matched later if the need arises.
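
If it helps, the usual back-of-envelope comparison is first-word latency: CAS cycles divided by the real memory clock (half the DDR transfer rate). A quick sketch for the two kits above:

```python
# First-word latency in ns = CAS latency / real clock (MHz) * 1000,
# where the real clock is half the DDR transfer rate.
def first_word_latency_ns(transfer_rate_mts, cas_latency):
    real_clock_mhz = transfer_rate_mts / 2
    return cas_latency / real_clock_mhz * 1000

for name, rate, cl in [("DDR4-3200 CL16", 3200, 16), ("DDR4-3600 CL16", 3600, 16)]:
    print(f"{name}: {first_word_latency_ns(rate, cl):.2f} ns")
# -> 10.00 ns vs 8.89 ns: same CAS, so the 3600 kit wins on both latency and bandwidth.
```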

Sometimes when my PC goes to sleep, after I wake it up, my monitor doesn't recognize the signal. It says it's connected to an HDMI 2.0, but no signal is being transmitted.
Also, when I'm actively using my PC, I hear random Windows disconnect and reconnect sounds. Not sure if the ports on my case aren't plugged in properly, or if the USB 3.0 plug that makes my monitor a hub is the problem.
It's been 4 days since I built this machine, I want to iron out all the kinks within 30 days.

Attached: SmartSelect_20190901-005832_Gallery.jpg (720x477, 226K)

Firstly, it depends on the price difference. If there's minimal difference (presuming you're looking at the G.Skill Trident Z Neo kit) in price, definitely the latter.

>>you should avoid buying GeForce cards because you can't use dual monitors with different refresh rates
Is this true? I'm using a 75hz and 60hz screen together right now and have no issues with them.

It's unequivocally false. I'm currently using a 1080ti powering both a 1440p 144hz screen and an old 1080p 60hz screen.
Apparently it was once an issue with allegedly broken Nvidia drivers (there is forum discussion on this subject around), but I can confirm it's no longer the case.

Yea G.Skill, had good luck with them in the past in ye ol' days of DDR2. Both sets are "for AMD", but I'm sure that is a load of shit marketing. Price is rather different, with the latter being about 40 dollars more.

You're confusing me with someone else; I don't involve myself in discussions about AMD driver issues with DX9 games. Old games have enough trouble running on Win10, unfortunately. However you're sadly mistaken if you think that there's any hope that a 5700 can run those games if a 580 can't.

>Where exactly did AMD touch you user?
Radeon is trying to rape my wallet, which is why I'm pissed off. I've had an OG Athlon, an FX 6350, R9 280X, RX 470, and now an R5 2600. All of those were good purchases except the FX 6350, but I blame myself equally for that purchase because I fell for the marketing. Pretty much every company has shitty products, and I bought one. "its FAKE" user is deserving of scorn by all in /pcbg/ for the obvious lies he throws around and then doubles down on as documented in

Change your USB sleep settings in your power plan. Pic related. By default, your USB devices are allowed to suspend/go to sleep when not in use. Same with your monitor not getting a signal: the PC goes to sleep and your USB devices are entirely disabled, and due to a Windows issue they don't wake up properly, so moving the mouse or hitting keys does nothing because they're technically turned off. So the PC senses no input, which means the GPU doesn't send a wake-up signal to the monitor.

Typical Windows garbage

Attached: AdvancedPowerPlanSettings.png (402x437, 19K)
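
If you'd rather script it than click through the power plan dialog in the pic, something like this should work; note the two GUIDs are the commonly documented ones for the USB-settings subgroup and "USB selective suspend setting", so confirm them with `powercfg /query` on your own machine first:

```python
import subprocess

# Commonly documented GUIDs for the USB settings subgroup and the
# "USB selective suspend setting" -- verify with `powercfg /query` before relying on them.
USB_SETTINGS_SUBGROUP = "2a737441-1930-4402-8d77-b2bebba308a3"
USB_SELECTIVE_SUSPEND = "48e6b7a6-50f5-4782-a5d4-53bb8f07e226"

# 0 = disabled, set for both AC and battery on the currently active power scheme.
for index_cmd in ("/setacvalueindex", "/setdcvalueindex"):
    subprocess.run(["powercfg", index_cmd, "SCHEME_CURRENT",
                    USB_SETTINGS_SUBGROUP, USB_SELECTIVE_SUSPEND, "0"], check=True)
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```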

yes, try setting the higher refresh monitor as your main, then playing fullscreen games on it while having video play on the secondary. You'll either get the game locked to 60hz, or the video will stutter and drop frames

wrong, I have a 1080ti with a 1440p 165hz and 4k 60hz setup and if I set the 1440p as my main, I get framedrops on twitch if I have the stream on my secondary, and if I play fullscreen games on the 1440p, most videos will stutter on the 4k secondary

What is the best monitor, for under $500, with freesync, 120hz+, 1440p. Is HDR10 the standard or is HDR400? HDR1000? I don't really get it. Are these sub-standards of HDR? Is there a noticeable difference? Please help me bros.

I'm not personally familiar with the issue but I've heard that it was specific to windows 10, particularly when playing in borderless fullscreen, and when using browsers like chrome which use hardware acceleration. I have no idea if anything has changed since.

Like other user said, gamers treat their cards just as bad if not worse. It's really no different.

You just replied to Russian fake benchmark user.
Posting fake shit then attempting to gaslight people into believing they're real and that they're crazy for not believing the obviously fake shit is his go-to.

Yes, it's true. It must be a flaw in the architecture since Nvidia just refuses to fix it after many years and THOUSANDS of complaints.
Some people say they don't see a problem, while others with the same set up and same monitors do, so maybe you won't notice it but many do.

Triple buffering also likely alleviates it, so you might not see it in borderless windowed but you get severe input lag then. Not sure.

The new ASUS "ELMB-sync" one, or the new LG NanoIPS.
HDR400 or HDR600. HDR1000 is excessive on monitors that are so close to your face and is made for TVs despite what one retarded shill will spam about. He's probably the same retard who spams the fake russian benchmarks and fake launch prices.

HDR doesn't mean shit on monitors because only the most expensive ones actually have local dimming to create the HDR effect. The LG 27GL850 came out recently and is basically the best freesync monitor you can get now tftcentral.co.uk/reviews/lg_27gl850.htm

I personally own a 165Hz 1440p G-Sync panel and had a non-adaptive-sync 60Hz 1080p panel as my secondary. I got huge frame drops when having a video playing on the lower monitor while trying to game on the main one. It could totally have been a driver issue, but I'm not sure. I only use a single monitor now. I have an RTX 2080Ti.

There are no good monitors out right now. HDR is basically an OLED feature, considering FALD is such garbage. You're forced to choose between TN, VA, and IPS in the monitor landscape, which each have their crippling flaws. The NX-EDG27S v2 caught my eye because it has adaptive overdrive, but it's IPS so it has shit contrast