PC gaming is such a Jewish trick

PC gaming is such a Jewish trick.

I've owned 144Hz monitors (with 2D LightBoost), a 290X back in 2013 (the most powerful card at the time), and now I have a BCLK-OC'd 6600 with an RX 480. So everything I'm saying is from my experience.

Anything over 90Hz is a meme, audiophile level of snake oil. "Ultra" graphics settings are made to artificially sell you 1080Tis when a 1060 can play at nearly the same visual settings for a fourth of the price. More than 4 cores are still not needed, because no game fucking benefits from it. HiDPI is a fucking meme and Windows can't deal with it properly. I'd rather have a good quality 1080p display any day. HDR just doesn't fucking work, ever.

By buying that $3000 build you look like the biggest dumbass. Next year a $1000 build will shit on yours.

Attached: mspaint_cool.jpg (1280x1280, 428K)


OP actually thinks he just discovered something. Are you telling me premium products don't have good bang for the buck? That's crazy. Next thing you're going to claim a 100k car is not twice as good as a 50k one.

>anything over 90hz is snake oil
Agreed.
>Ultra graphics are made to sell you way more GPU than you need
Agreed. Except at 4k things tend to start to shit the bed.
>HiDPI is a fucking meme
Yeah, but a 37'' or larger 1080p panel will look like crap compared to the same size panel at 4k. So high res has a place, but the display has to be large as well.

>vidya

>PC gaming
>gaming
just play nethack and call it 'playing nethack'
anything else is gay

Lol @ gaming. I'm still buying threadripper 2 for muh MPI.

Nethack is objectively one of the worst roguelikes I've ever played. The sheer amount of bullshit that will kill you at random is frustrating. I feel like winning is 90% luck. Maybe I'm just bad, or is this considered 'fun'?

it's like dwarf fortress in that it's not a game you're supposed to win. you're just supposed to enjoy playing it. being an old wizard who falls down the stairs and breaks his neck on the final level of the dungeon is funny. it really doesn't matter that you "lost" because you got a good laugh out of it.

>tfw too smart for gaming
>Posts on imageboard using 10 year old $400 laptop instead
>Actually have a laugh and be entertained
>Meanwhile, "fun" gamers have the 1000 game stare, a dead inside zombie look

lol, this is almost too true to be funny. ever since they began hiring psychologists at the aaa studios games have become soul-destroying. something about working to achieve something only to have that achievement _immediately_ undermined by the next available level of reward seems to turn gaymers into unstable headcases. nothing about modern games is fun, they're designed by psychologists to be really addictive. the game industry is the new tobacco industry, lol.

1080p is fucking garbage. 1920x1200 or higher.

>anything over 90hz is snake oil
for some it's enough; for others even 144Hz is not enough -> pro players

gaming is a jewish trick

Attached: 13edadaa850956d1.jpg (300x250, 16K)

you "upgraded" from a 290x to a 580?

seems kinda stupid imo

>but a 37'' or larger 1080p panel will look like crap compared to the same size panel at 4k
I'm not sure if you know what DPI means.

>pro players

Progaymer.
Seriously I want this "esport / gaymer" scene / fashion to end, its so stupid.
Many of us play games, silently, we aren't showing our status, it's like a LBGT parade where they show their penis.

Attached: 1514833857064.jpg (680x510, 53K)

The human eye can be trained to perceive well over 240fps, but that's an extreme only fighter pilots in VR would need.
As for your 90Hz argument, it depends what your brain is used to.
I've been using 60Hz since I was born, so anything lower or higher feels weird.
I'd happily swap to 120 or 240, but I'm waiting for panels to improve colour and black levels without sacrificing picture quality at 4k, so I guess that might come with DP 1.5

>stop mining crypto and making money so i can play my games!!!
the absolute state of pc gaming

And of course, there's the whole industry behind the gamer scene, with all the gear pimped up to look "cool" and Transformer-like.
I'm so fucking angry.

Attached: 1529339797571.jpg (640x628, 97K)

LOL, games have been designed by psych grads and SJWs since the 80s arcade boom.
Most games don't depress me because I don't invest shit in that RNG gambling loot crap.

Attached: 1528817957637.png (675x827, 35K)

Nobody in this thread has ever played any competitive FPS. Reducing the input lag from 17 ms to about 7 ms is incredibly useful, and the rest depends on your situation (proximity to the datacenter, bandwidth, the response time of... well, your brain). 144Hz also looks like fucking butter. Play 144Hz with G-Sync or FreeSync and then see if it's snake oil.
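
The 17 ms vs 7 ms figures line up with simple frame-time arithmetic: at a given refresh rate, the display's share of latency is on the order of one frame time. A minimal sketch (frame time only, ignoring the rest of the input chain):

```python
# Frame time = 1000 ms / refresh rate: roughly the display's share of input lag.
def frame_time_ms(hz):
    """Milliseconds per refresh at a given rate."""
    return 1000.0 / hz

for hz in (60, 90, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.1f} ms per frame")
# 60 Hz is ~16.7 ms per frame; 144 Hz is ~6.9 ms, matching the post's numbers.
```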

This fucking shill probably still uses the stock cooler on his GPU.

Yeah, the GTX 1080 Ti is seriously hindered by the stock cooler. If you want to see a seriously powerful card, put a third-party one on and watch it never thermal throttle. No way a GTX 1060 is getting anywhere near that level of performance.

Also, just because your fucked eyesight and dead response time can't take advantage of better display features doesn't mean other people can't. It's like a failed object-permanence test for your dumb ass: if you can't see it or notice a difference, it must be snake oil and not exist.

Wrong. Ideally you're going to want double 60Hz, or you're getting inconsistency.

Well it's snake oil at least for him

>says the miner who also complains about 'meme machine learning' taking 'his' supply.

But R9 290X is faster than RX 480 and even RX 580? wtf are you doing with your money nigger

Which doesn't make it snake oil at all.

Is 240Hz worth it? If I'm gonna get a 2.5k used gaming screen, I'm going all out.

Haven't been many good games released since 2013 desu senpai
anything over 90hz is indeed a meme, but "pro gamers" (read: people who need to get an actual fucking job) will defend it.
Ultra graphics often include the highest texture and shader settings which look great, but also shit like antialiasing and anisotropic filtering which yield a slight visual improvement at a huge performance cut.
Vsync, FreeSync and G-Sync are often unnecessary, but vsync is on by default in every damn game; people need to buy good monitors that don't screentear (A SHITTON of monitors actually just don't screentear at all, only shit monitors do)
High DPI is pretty much meme tier for a lot of things, but windows not handling it properly isn't one of them. HDR is often poorly implemented, and more than 4 cores is literally great for everything but gaming, and is totally worth it.

As far as spending 3000 on a build goes, it's really fucking dumb, and you're 10 years late in your realization of it.

Kek, OP is probably /v/ spillover

>Competitive gaming
Why would I even want to?
Gaming is for fun, if you become that serious about it then quit and get a job, you leech.

This.
I could have bought a 1070ti for that but I got a 1080 cheap instead
Siege is a rough diamond
Insurgency 2 darude sandstorm is just about to drop
Simcade shooters are much more fun than dead shit like Arma and flash-in-the-pan indie shit like PUBG
Oh and arcade racing games are making a comeback.
Only FPS are suffering; the other genres are fine

>people need to buy good monitors that don't screentear (A SHITTON of monitors actually just don't screentear at all, only shit monitors do)
What do you mean by this? Every monitor tears if the buffer swap is not synced to the vblank.

Temporal AA is the best of both worlds: it has a tiny render-time budget and looks good when used right, like in UE4 and idTech titles.

>Only fps are suffering
Have you even seen the state of RPGs?
Either way I said there weren't many good games released, I didn't say that no good games have been released.

I mean that shit just doesn't happen on quite a few monitors.

Isn't he talking about G-Sync/FreeSync and/or Nvidia Fast Sync?

Attached: 1528272746339.png (286x315, 81K)

DPI stands for dots per inch, fagstick. 1080p at 37'' will look like garbage, but a 37'' 2160p panel will look much better.

>Anything over 90Hz is a meme

Monitors generally come in either 60Hz or 144Hz, so if you want the benefit of 90Hz you have to get a 144Hz monitor. I wish it weren't this way, but it is.

>literally has no idea what he's talking about and is just spewing words words words.

RPGs are a lost cause.
TES is vapourware on some old shit game engine from the 90s.

>people need to buy good monitors that don't screentear (A SHITTON of monitors actually just don't screentear at all, only shit monitors do
Are you fucking retarded? Screen tearing is simply when two images are drawn simultaneously during a screen refresh. It has nothing to do with the screen itself unless you're using Gsync or Freesync.
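
That mechanism is easy to model: the panel scans out top to bottom once per refresh, so a swap that lands partway through the scan splits the screen between two frames. A toy sketch (the numbers are illustrative, not any real monitor's timing):

```python
# If a buffer swap happens mid-refresh, the scan-out shows the old frame
# above the tear line and the new frame below it.
def tear_line(swap_time_ms, refresh_hz=60, screen_height=1080):
    """Scanline where a tear would appear for an unsynced swap."""
    period = 1000.0 / refresh_hz
    phase = (swap_time_ms % period) / period  # fraction of the refresh elapsed
    return int(phase * screen_height)

print(tear_line(0))     # swap exactly on the vblank: no visible tear
print(tear_line(12.3))  # swap 12.3 ms into a 60 Hz refresh: tear near the bottom
```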

And thus, my point has been proven.

>No actual counterargument

You can easily run a 60Hz monitor at 80Hz.

Muh 10ms when the human reaction time is ~300ms.

I sold the old build for college tuition

...

Rpgs are shit just go play something else?

Attached: 1528896410479.gif (615x413, 1.96M)

Mhmm, checked it and it seems you're right. Must be because I always set sensible fps limits in my config, since pushing more fps than the screen's refresh rate yields nothing but increased power usage.

Nah man, I'm still playing Morrowind and Baldur's Gate and the likes, RPGs are great.

Yeah, no shit; those are the last good RPGs made.
Deus Ex: SJW Divided and Witchermark 3: Downgrade Edition are just action shooters with RPG elements.
Kingdom Come was neat, but I only got halfway in before the bugs ruined the game.

lmao, I just bought a Dell XPS 13 and I'm doing development shit on it. I've already forgotten about gaming shit, because I know the same would happen if I bought a top-tier rig.

What are you actually arguing against?
Do you dislike RPGs as a genre or just the modern ones?
Also
>Deus ex sjw divided
I kek'd

OP doesn't even know how vsync works, and I had an effective understanding of it when I was 10 years old, yet he's an "enthusiast" with thousands invested in his hobby.

Sometimes I don't get this planet.

Every game is on some old shit engine.
UE4 is literally just UE1 with several generations of new hat.
idTech 6 is literally the Quake engine.
Source 2 is literally the Quake engine (forked at QuakeWorld).
The IW engine from COD is literally the Quake engine (forked at idTech 3).
CryEngine was built in 1999 (19 years ago).
Dunia is literally CryEngine.
RAGE was built in 2004; the first game on it released in 2006 (Rockstar's Table Tennis).
MAX-FX was built in 1997; the first product released was 'Final Reality' (a benchmark), and the first game was Max Payne.
3DMark still uses a modified MAX-FX engine.

Almost any engine you can list was originally built 10+ years ago (in some cases close to 25 years ago) and added to over time.

About the only recent 100% ground-up engine is the FOX Engine. (And it showed.)

2077 looks like more of the same soft RPG elements, if you know what I mean.
If they're gonna do an RPG, make it like GTA: SA.
The difference is those engines are actually good, unlike Todd Howard's shitty Gamebryo larping simulation.

human reaction time is about 215 ms, on average.

Changes in display tech and latency show that the online tests now average it at 260ms

humanbenchmark.com/tests/reactiontime/statistics

Reaction time to audio is faster, also.
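
Worth noting that an online test measures the whole chain, not just the brain: input-device lag and display lag are baked into the number, which is part of why the averages shifted with display tech. A back-of-the-envelope sketch (the lag figures are illustrative assumptions, not measurements):

```python
# measured = human reaction + input-device lag + display lag, so the
# "human" part is the measurement minus the hardware overhead.
def human_reaction_ms(measured_ms, input_lag_ms, display_lag_ms):
    return measured_ms - input_lag_ms - display_lag_ms

print(human_reaction_ms(260, 10, 30))  # 260 ms online average -> ~220 ms human
```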

>About the only recent 100% ground up engine is FOX engine. (and it showed)

Yeah even shitty ps3 hardware was able to handle it magically. Other engines are basically hardware hogging shits made to increase profits of hardware companies.

Especially Unity, that fucking thing requires like 1GiB of RAM for a simple pong game. Every motherfucker uses it proudly.

Most people in general have no idea what vsync does or how screen tearing works. If you ask someone if they use vsync they'll probably say "muh input lag," even if they have no idea what the difference between double buffering and triple buffering is. Honestly it doesn't do any favors that Vsync is technically a relic of CRTs and LCDs sync to your GPU completely differently. It's embarrassing that Linux still calls it "sync to vblank," like who the hell is using Linux with a 480i TV?
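
The classic double-buffered vsync behavior is easy to sketch: the swap waits for the next vblank, so a frame that takes even slightly longer than one refresh is held for two. A toy model (assumes a fixed render time per frame):

```python
import math

# With double-buffered vsync, each frame is displayed for a whole number
# of refresh intervals, so fps snaps to refresh / 1, / 2, / 3, ...
def vsync_fps(render_ms, refresh_hz=60):
    period = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / period)  # vblanks waited per frame
    return 1000.0 / (intervals * period)

print(vsync_fps(10))  # renders faster than 16.7 ms -> a steady 60 fps
print(vsync_fps(20))  # misses one vblank -> drops straight to 30 fps
```

This cliff is the "input lag" people blame on vsync; triple buffering avoids the stall by letting the GPU keep rendering into a third buffer.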

Gears of War 4 runs and looks amazing; it's all about how the engine's used and who the developers are.
Death Stranding's engine looks like FOX 2, off-brand Kojima magic... Hopefully it makes its way to PC and PS5; I wanna see that fucker at 4k 120fps+.

>About the only recent 100% ground up engine is FOX engine
Decima
Snowdrop
Ignite
Luminous
Panta Rhei
Quantic Dream's in-house engine
Northlight engine

There are many more examples of completely new engines made in the last 5 years; it's just that they're private and have only been used for 1 or 2 games.

>Death standings engine looks like fox2 off brand kojima magic
It's Decima, the same one used in Killzone: Shadow Fall, Until Dawn and Horizon: Zero Dawn.

Law of diminishing returns, faggots!

And lower input lag, you fucking braindead idiot.
Jesus Christ Jow Forums is such a shithole, full of illiterate fucks that read and regurgitate the same bs over and over again while thinking they're smart.

>Hurrrrrrrr duurrrrrrrrr anything above 90hz is a meme!!!!!1!!111
blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/

End yourselves.
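
The Blur Busters argument boils down to sample-and-hold persistence: each frame stays on screen for the full refresh interval, so a moving object smears across the distance it travels in that time. A minimal sketch of that relationship:

```python
# On a sample-and-hold display, perceived blur is roughly the distance an
# object moves during one refresh: pixels/second divided by Hz.
def motion_blur_px(speed_px_s, refresh_hz):
    return speed_px_s / refresh_hz

print(motion_blur_px(960, 60))   # 960 px/s at 60 Hz  -> 16 px of smear
print(motion_blur_px(960, 240))  # same motion at 240 Hz -> 4 px of smear
```

Hence the "1000Hz" headline: blur keeps shrinking with refresh rate until the smear drops below a pixel.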

I want that doodad
Where2cop?

sugar pill autist boy

How does one find a 90Hz monitor? I can only find 75 then all the way up to 144. I'd probably play games at 1080 or 1440p@90

a gpu from 10 years ago still gets an ok 30fps on most titles

fucking loling @ /v/

>More than 4 cores are still not needed
I agree with everything but this. Higher core counts provide good performance boosts even if those cores are individually slower. For example: I'm running two Xeon X5650s, each at 2.67GHz, for a total of 12 cores. This is 8-year-old technology, but it keeps up with current games because they are well suited to parallel processing.
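
The "many slower cores" trade-off is just Amdahl's law: the speedup from extra cores is capped by whatever fraction of the work can't be parallelized. A quick sketch (the 80% parallel figure is an assumption for illustration, not a measured game workload):

```python
# Amdahl's law: with fraction p of the work parallelizable across n cores,
# overall speedup = 1 / ((1 - p) + p / n).
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

print(round(speedup(0.8, 12), 2))  # 80% parallel on 12 cores -> 3.75x
print(round(speedup(0.8, 4), 2))   # same workload on 4 cores -> 2.5x
```

So 12 slow cores beat 4 fast ones only when p is high, which is the poster's point about games suited to parallel processing.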

Found it since it looked cool.
It's a pin.
studiocult.co/products/ms-paint-enamel-pin

Jow Forums is more fun than just about anything in life.

>More than 4 cores are still not needed, cause no game fucking benefits from it.
can tell you've never run multi-clients for profit, nor botted a game for profit.
you're a casual.

world class players play on some of the shittiest set ups you've ever seen. It's mid-tier shitters who demand 144hz screens.

this one has fully drunk the kool-aid

80% of the actually good games you can play on a PC run at 320x240 @ 60Hz

>playing video games in the current year
>j-just wait until vr comes down in price, everyone will be using it
>m-m-my reaction time is like a ninja, i need a faster refresh to win
>its not brand whoring, my plastic toy is just better than your plastic toy
what happens to somebody in their lifetime that causes this level of general deficiency?

>Anything over 90Hz is a meme, audiophile level of snake oil.
Now you're just saying shit, because there is a noticeable difference even going from 120Hz to 144Hz, especially in competitive FPS. Other than that, I totally agree: gaming is mostly a pastime for manchildren, and the triple-A games that require the latest hardware to run at ultra settings are fucking garbage, so there's no point in ever upgrading.

Imagine being proud of the fact that you play games as an "adult".

imagine having an NSA/NASA-tier computer at your beck and call, and only knowing how to spend money with it, not make money with it.

Yeah, the new Nintendo games hit this hard. Breath of the Wild has super short dungeons and retard-easy shrines to grind orbs. In the Mario game there are hundreds of stars that you get every 2 minutes. It all feels like mobile gaming with its shitty, constant rewards.

...

Most DOS games are meant to run at 70Hz.

Indeed indeed.
Of course, they rarely went over 30 due to the ISA bus, but it was 70 divided.
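
The arithmetic behind "70 divided": VGA's standard mode refreshes at roughly 70 Hz, and a game that can't finish a frame per refresh updates on every Nth vertical retrace instead, so effective rates fall on 70/N. A small sketch:

```python
# A DOS game syncing to every Nth vertical retrace of a ~70 Hz VGA mode
# ends up at 70/N effective frames per second.
def vga_frame_rates(base_hz=70.0, max_divisor=4):
    return [base_hz / n for n in range(1, max_divisor + 1)]

print(vga_frame_rates())  # divisors 1-4 -> 70, 35, ~23.3, 17.5 fps
```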

>Anything over 90Hz is a meme, audiophile level of snake oil.
And that's how I know you don't have a 144 Hz monitor.

My brother got a 144Hz monitor, and even though there's a difference, I would never pay extra for that shit. Gamers tend to exaggerate shit like that. It looks a bit smoother, that's it. But I would never pick 144Hz over 4k.

144hz is mostly for online games

120Hz should be the default, 60Hz panels should be phased out or relegated to low end 1366x768 models.

>AAA games become less popular
>everyone playing league, blizzard games, Dota etc which have little hardware requirements
>anyone can run these games at 60fps on crap hardware
>hardware manufacturers need a new way to make vidya retards buy into high performance gpus
>let's bring out 144hz monitors so they need at least 144fps
>vidyafags fall for it

And the best thing is that they sell 3 year old GPUs at twice the MSRP

Because competing is fun user.

Lmao this. If by some magic AMD or Intel released GPUs that are 300% faster than what we got at the moment they would start shilling 488hz monitors to gamers.

They already have a rabbit in the hat for this one, called VR.
"Look! Now it's TWO 4k 144Hz screens to feed!"
And I can go even further beyond with "multiplayer lewd games", where you have FOUR screens to feed.

>going above 800x600 @ 30hz
Stop embarrassing yourselves

Attached: stereoscopic_porn.jpg (1173x723, 67K)

>muh refresh rate on an LCD
You realize they are all refreshing at different rates?
Only way to get totally smooth high refresh rates is on a CRT.
85hz CRT > 120hz LCD
110hz CRT > 144hz LCD

>120 hz CRT
>Voodoo 2 SLI
>640x480
Buttery smooth turok

CRT?
That shit was noticeably flickering at sub-75Hz.
My cheap-ass TFT at 60Hz is infinitely better.

>blizzard games
>which have little hardware requirements
My HD520 gives me sub-30 fps in overwatch with

>>>>>>>>>>>>>>>>>>>>

the bigger joke are gayming monitors. they cost upwards to 1.5k these days. meanwhile you can get a high end flagship tv that has all the features these (((gaming))) monitors have AND MORE.

youtu.be/MCDmRJvtgbY

It actually has 1ms of input delay, perfect blacks, 120Hz, LightBoost, true 10-bit 4:4:4 HDR, Atmos 7.1, and soon FreeSync 2.0. I don't even have to mention that they're 3x the size. Only a complete braindead drone would buy one of these shit Asus monitors. Oh, did I mention they're coming out with a gaming TV around Christmas? Make sure to shell out 3k for that trash.

Attached: Untitled-1.png (2100x1500, 468K)

Well, as you go higher up in terms of quality, you get diminishing returns. Not just for monitors, but most equipment.

If the difference between a product at the 10th percentile and the 90th percentile is a couple hundred dollars, the difference in price between one at the 90th percentile and the 99th percentile would be more than a thousand bucks.

>gaming monitor
>gaming mouse
>gaming mousepad
>gaming keyboard
>gaming GPU
>gaming printer
>gaming chair
>gaming desk
>gaming power supply
>gaming cables

Attached: Q0ol-Ylx.jpg (250x250, 12K)

Developers don't give a fuck about PC gaming. Most graphics effects above console settings are a meme

As a person with a Q6600, a Q9300, an i7 2700K at 5GHz, and a 4690K at 4.5GHz, I now know you're shit at computers.

The Q6600 was good in its time, but the difference is insane when you compare it to Sandy Bridge. DDR2 and a Q6600 just can't keep up with 2400MHz DDR3 and an i7 2700K.

>Anything over 90Hz is above what I can see.
What a shame.

>"Ultra" graphics settings are made to artificially sell you 1080Tis when a 1060 can play at nearly the same visual settings for a fourth of the price.
Not really, but it's true you can lower requirements with some settings most won't notice, or at least won't dumb down the graphics too much; just don't expect a 1060 to play as well at UHD resolution as a 1080 (non-Ti) or even a 1070 would.

>More than 4 cores are still not needed, cause no game fucking benefits from it.
This is Jow Forums, not /v/, and I'd rather play a game that hogs 4 cores and have 4 extra cores for background stuff that doesn't slow down the game.

>HiDPI is a fucking meme
You're just blind or using bad scaling (125%, 150%, 175% and other odd stupid scaling sizes). Always use integer doubling (200%, 400%, 800%) if you want non-retard scaling.

>HDR just doesn't fucking work, ever.
Don't bother. HDR is a stupid gimmick, and the HDR abbreviation is already used for shaders. They should've just called it what it is: oversaturated shit.
Having 10-bit (everything above that right now is just dithering) doesn't make things more colorful; it just makes the jumps from one color to another less visible (color banding).
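
The banding point is just counting steps: bit depth sets how many values each channel can take between the same black and white endpoints, so more bits means finer gradients, not a wider gamut. A one-liner to make it concrete:

```python
# Values per color channel at a given bit depth: 2**bits steps between
# the same endpoints, so higher depth reduces visible banding.
def levels(bits):
    return 2 ** bits

print(levels(8), levels(10))  # 256 vs 1024 steps per channel
```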

>By buying that $3000 you look like the biggest dumbass. Next year a $1000 build will shit on yours.
Agreed.
Luckily, I've used my $1500 water cooling setup for 6 years; it lowered the temps on my graphics cards to below half of what they were on air, while also lowering CPU temps, all with the fans running at a mere 700-1150RPM and the pump at its lowest (silent) setting.

Attached: firefox_2018-06-19_16-33-38.png (7680x4200, 1.43M)