CPU and GPU Utilization run between 60-80%

>CPU and GPU Utilization run between 60-80%
>getting around 45fps

Why does this happen? The game isn't maxing out either my CPU or GPU, so what else could be bottlenecking it?

Attached: Desktop Screenshot 2019.07.14 - 23.06.05.55.png (1824x1433, 3.89M)

Your mum

Installing Gentoo should fix it.

Games don't, never did, and never will use more than 4 cores (the laggy abomination that is DX12 or Vulkan doesn't count)

If all you do is play games and you buy anything more than an i3, you are literally cucking yourself

Your CPU is maxed.

Turn off hyperthreading and watch you sit at 100% at around 42fps.

>playing a shitty /tv/ pseudogame that is more about lolgraphics and style over substance that needs fucking overpriced, worthless goddamn hardware
Go play a real game.

Attached: Asteroids.png (640x480, 4K)

This, your threads are at 60-80%, not cores.
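For what it's worth, the per-"CPU" graphs in Task Manager show logical processors, and that's all standard C++ can see too. A minimal sketch of the point (toy example; counting physical cores would need an OS-specific call such as GetLogicalProcessorInformation on Windows, which is left out here):

```cpp
#include <iostream>
#include <thread>

int main() {
    // hardware_concurrency() reports *logical* processors only.
    // On a hyperthreaded 4-core chip like the i7-3770 this prints 8,
    // so "60-80% of those graphs" is not the same as 60-80% of cores.
    unsigned logical = std::thread::hardware_concurrency();
    std::cout << "Logical processors (threads): " << logical << "\n";
    return 0;
}
```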

Shut the fuck up boomer tryhard.

I see someone hasn't played any games from 2019

How are you gonna lie like that when a simple googled benchmark can easily prove you wrong and make you look like a fool?
>shintel drone
Nevermind

Yes you're right, my bad. In 2019 we have amazing tools that we can use to utilize more than 4 cores, such as DX12, which is indispensable in order to truly enjoy a modern high FPS multicore gaming experience.

The internet is absolutely littered with people complaining about motion sickness and headaches from the input lag caused by DX12 and Vulkan, but giving yourself brain damage and permanently fucking up your hand-eye coordination is a small price to pay.

>boohoo he's right but i don't want to admit it
Grow up, child. You won't ever be considered a real adult until you learn to admit your massive flaws.

I bet you also think people can't tell the difference between 60 and 144hz monitors.

That guy's right, you haven't played any games in 2019 and you have no idea what you're talking about.

It's funny because 144Hz is the ONLY fucking monitor technology improvement there is. The whole 4k and adaptive sync bullshit? Worthless fucking garbage for autistic idiots.
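Just to put numbers on the 60 vs 144 Hz argument: frame time is simply 1000 ms divided by the refresh rate. A quick sketch, assuming nothing beyond the two refresh rates themselves:

```cpp
#include <cstdio>

int main() {
    const double rates[] = {60.0, 144.0};
    for (double hz : rates) {
        // time per frame = 1000 ms / refresh rate
        std::printf("%6.1f Hz -> %5.2f ms per frame\n", hz, 1000.0 / hz);
    }
    // ~16.67 ms vs ~6.94 ms: roughly 9.7 ms less time between frames at 144 Hz.
    return 0;
}
```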

>boomer retard thinks his shitty simplistic arcade games have substance
Lmao, go die in a hole you old pretentious fart

Attached: enjoy your brain damage bro.png (879x904, 112K)

DX12 and Vulkan do not cause input lag, the program design does. Vulkan is strictly more performant than OpenGL, and DX12 is basically the same as Vulkan. Where did you get this idea that input lag is caused by these standards?

>adaptive sync isn't real
Poorfag cope

Every single DX12, Vulkan and Mantle game I've played has that same problem.

But again, enjoy your brain damage, I don't care.

Doom with Vulkan has no extra input lag and is very well optimized, much better than your shitty dx11 would've been. You're either baiting or retarded.

More substance than your fucking shit moviegames, you little child. Gaming began dying in the mid-90s when garbage like MGS and Mario 64 cared more about presentation than ACTUAL FUCKING GAMEPLAY. And now it's all garbage.

Then why are there literal hordes of people complaining about input lag, headaches, nausea, etc.?

Why so scared of Vulkan, you are brain damaged already, it can't do anything to you.

All right bro, have fun with your brain damage.

What games? And I'm sorry, you getting a headache is not evidence. Show me an input lag test with one of the new APIs vs the old.

And these APIs don't introduce anything that leads to more input lag. Vulkan is literally more streamlined and has numerous features that should reduce latency, mainly having immutable pipeline state, which OpenGL does not have; maybe DX11 did, I'm not sure.
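Rough toy sketch of the state-model difference being described, in plain C++ rather than real Vulkan or OpenGL calls (the struct and function names are made up for illustration): OpenGL-style mutable global state that can change right before a draw, versus a Vulkan-style pipeline object whose state is baked once at creation and validated up front instead of at draw time.

```cpp
// Toy illustration only -- not actual Vulkan or OpenGL API calls.
#include <iostream>

// "OpenGL-style": global state the driver must re-check on every draw.
struct MutableState {
    bool depth_test = false;
    bool blending   = false;
};

void gl_style_draw(MutableState& s) {
    s.depth_test = true;   // state can change right before the draw,
    s.blending   = true;   // so validation has to happen at draw time
    std::cout << "draw with depth=" << s.depth_test
              << " blend=" << s.blending << "\n";
}

// "Vulkan-style": all state decided at creation, immutable afterwards.
struct Pipeline {
    const bool depth_test;
    const bool blending;
};

Pipeline create_pipeline(bool depth, bool blend) {
    return Pipeline{depth, blend};   // validated once, up front
}

void vk_style_draw(const Pipeline& p) {
    // nothing can change here; the driver just binds and draws
    std::cout << "draw with depth=" << p.depth_test
              << " blend=" << p.blending << "\n";
}

int main() {
    MutableState gl;
    gl_style_draw(gl);

    Pipeline p = create_pipeline(true, true);
    vk_style_draw(p);
    return 0;
}
```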

>old good new bad
Imagine being this much of a seething brainlet. Go back to playing pong you autistic manchild, let normal people discuss games that are actually interesting.

Proofs?

Baiting retard. I get 200-300fps in Doom, which uses Vulkan, and it's silky smooth. I have no fucking idea where you got your source from since you've provided none, you absolute fucking retard.

Hey bro, just wanted to know if there is a way I can upvote your post? I really liked it, thank you.

>Proofs
>source

Attached: enjoy your brain damage bro.png (852x904, 112K)

The 3770/K doesn't have the performance for that CPU hog of a game. I get similar lows with a 2700X/Radeon VII thanks to AMD's utterly fucked DX11 driver. Swapping in a 1080 Ti just about keeps things above 60. No other game is this poorly optimised. Lock it to 30 and play at higher res and/or settings.

This has to be bait, right? No one could seriously call Mario 64 a garbage game.

go back to Jow Forumsboomers

>being this much of a genetic failure
Lmao. End your life if you are like this. I can play for hours without any issues, works with my brain.

>lol bro who cares everyone's complaining about headaches that last up to a week, it's probably nothing I'll just keep playing my gaem XD

Did you know that only one of your google results is from a date when Doom used Vulkan? Also it has never used DX12.

You are a dipshit who doesn't play video games and whose opinion means nothing.

Switch AA off, it taxes the CPU a lot; your CPU is bottlenecked.
Here is something to kill you: my 1600X runs this game at 60+ on a 1070 at 1440p.

No modern graphical API causes input lag, retard.

There are certain game engines that do, but id Tech 6 is not one of them.

Most games released since the Xbox One launched have had significant input lag because they're on the UE4 engine, which for most of its lifespan had a shitload of input lag. When a developer produces a game that is 90% middleware, the game will have however much input lag the middleware has.
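Back-of-envelope version of that claim: every frame of buffering between input sampling and display adds one frame time of latency. The queue depths below are made-up illustration values, not measurements of UE4 or any other engine:

```cpp
#include <cstdio>

int main() {
    const double fps = 60.0;
    const double frame_ms = 1000.0 / fps;   // ~16.7 ms per frame at 60 fps

    // Each extra buffered frame adds one frame time of lag.
    // Queue depths are hypothetical, purely to show the arithmetic.
    for (int queued = 1; queued <= 4; ++queued) {
        std::printf("%d frame(s) buffered -> ~%.1f ms added latency\n",
                    queued, queued * frame_ms);
    }
    return 0;
}
```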

If you went to a doctor and told him that you've had a headache that lasted a week he'd give you a brain MRI

False. AA is purely GPU side. Whether it's MSAA/SSAA or a post processing FXAA, it takes place entirely on the GPU

on what, low settings? your post is suspicious, sir.

>Mom the new graphics APIs are giving me a brain disease

>everyone
That's where you're wrong, kiddo. It's just failures like you and 4 other guys. Cope harder!

How is his post suspicious? A Zen chip is significantly more powerful than an Ivybridge chip.

I've once had a headache that lasted a month and didn't go to a doctor, I'm fine now.

On ultra, except distance is on high; it goes down to 55 if distance is on ultra. I haven't played it since the first patch though, probably runs better now. Origins sure runs a stable 60 after patches.

really? I thought MSAA is CPU bound, remember it wrong then

Cope harder with what? The fact that my brain has evolved to recognize hand eye lag as a sign of poisoning or organ failure?

Pic related hits 96% GPU util at times using the very high preset at 1440p. Dropping the res to 1080p to test CPU perf sees it dropping to 68fps at times. The 3600 is significantly faster than the 1600 and the 1080 Ti is quite a bit faster than a 1070.

Attached: pic.png (3069x1668, 455K)

>my brain has evolved to recognize hand eye lag as a sign of poisoning or organ failure
I dunno what you think that sentence means but yeah, looks like you better cope with it as best you can.

Fuck you're right
I switched DotA to vulkan and was feeling sick after playing for more than a match at a time.
Switched back to dx11 and it feels much more responsive now

>I thought MSAA is CPU bound, remember it wrong then
No, SSAA (Super-Sampled) is just rendering at a higher resolution and then downsampling. MSAA (Multi-Sampled) is an optimization of SSAA where only the edges of objects are supersampled, not the middle of triangles. Both happen on the GPU and are actually options baked into the GPU; usually there's no shader code written to do it.

FXAA (Fast Approximate) is a post-processing pass that converts hard edges into soft edges; but again this is a shader step that runs on the GPU and costs the CPU basically nothing to invoke.

Then there are temporal techniques like TXAA (Temporal AA), which also use differences between frames.

But there is no AA algorithm that works on the CPU.
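To make the SSAA part concrete, here's a minimal CPU-side sketch of "render at 2x resolution, then average each 2x2 block down to one pixel." As said above, real implementations do this on the GPU; this just shows the math on a tiny grayscale image with a diagonal edge:

```cpp
#include <cstdio>
#include <vector>

// Average each 2x2 block of a (2*w) x (2*h) image down to one pixel.
std::vector<float> downsample2x(const std::vector<float>& hi, int w, int h) {
    std::vector<float> lo(w * h);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int sx = 2 * x, sy = 2 * y, W = 2 * w;
            lo[y * w + x] = 0.25f * (hi[sy * W + sx]       + hi[sy * W + sx + 1] +
                                     hi[(sy + 1) * W + sx] + hi[(sy + 1) * W + sx + 1]);
        }
    }
    return lo;
}

int main() {
    // 4x4 "high-res" image with a hard diagonal edge between 0 and 1
    std::vector<float> hi = {1,1,1,1,
                             0,1,1,1,
                             0,0,1,1,
                             0,0,0,1};
    std::vector<float> lo = downsample2x(hi, 2, 2);
    // Pixels on the edge come out as 0.75 instead of a hard 0/1 step,
    // i.e. the edge is softened -- which is all SSAA really does.
    for (float v : lo) std::printf("%.2f ", v);
    std::printf("\n");
    return 0;
}
```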

Either bait or retarded

Uh, this hasn't been true since GTA V.

>too retarded to turn vsync off

Don't feed the dx11 shill

They can use more than 4 cores
They just won't use them effectively

Jesus Christ shut up you piece of shit he told you he switched back to DX11 he never said anything about vsync why do you keep running your retarded mouth do you need attention or something do you have no friends at all? Go talk to your mom or something.

ivy bridge bottlenecks a 1080ti

Still doesn't sound suspicious.

I mean, the fact that you think a Ryzen 3600 should be barely better than an Ivybridge CPU sounds much more odd.

>turn off vsync
>no more lag with Vulkan
>bruh I'm too dumb to do this therefore my outdated API is best
SEETHING RETARD

No fucking way I'm reading that shit.

>gets BTFO
>can't hear you lalalaalala
Lmao

type some more greentext bitch motherfucker

What? I never alluded to that, wtf are you on about. The 1600X isn't strong enough to keep to 60, true, but the 1070 can't handle 1440/60. My 1080 Ti can just about do it.

Attached: creed.png (3529x2009, 2.74M)

Want to provide a link instead of your inspect element addled garbage?

How old are these benchmarks?

Get a modern CPU.

The game also has fire effects, which tank performance.

Attached: ACO_1080p.png (1405x1517, 82K)

I feel like you're stuck in the '90s. Even Carmack had to eat his own words, since using a dozen cores for a single game running at 200 FPS *IS* in fact possible, while evenly distributing the load.
20 years ago, people didn't even believe running a game on two cores would be possible, but here we are, utilizing 8 or more cores evenly for a game.
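A minimal sketch of what "evenly distributing the load" can look like: fan independent per-entity work out over every hardware thread with std::thread. Real engines use job systems with work stealing, and how evenly it scales depends on the workload; this only shows the basic mechanism (the entity count and the update itself are made up):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t entities = 100000;          // hypothetical entity count
    std::vector<float> health(entities, 100.0f);

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            // each worker takes a strided share of the entities,
            // so no two threads ever touch the same index
            for (std::size_t i = w; i < entities; i += workers)
                health[i] -= 1.0f;                // stand-in for a per-entity update
        });
    }
    for (auto& t : pool) t.join();

    std::cout << "updated " << entities << " entities on "
              << workers << " threads\n";
    return 0;
}
```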

Not true, you can see people testing with clickers and slow motion cameras. It's bullshit and you know it.

Try games with Vulkan, like DOOM. It uses all 32 fucking threads on my 2950X evenly and loads my 5700 XT to 99%, while OpenGL 4.5 uses only a couple of threads and loads the video card to like 30-35%.