You will never need more than a dual core for gaming

You will never need more than a dual core for gaming.

Attached: Intel-Core-2-Duo-E6600-e6600-CPU-Processor-2-4Ghz-4M-1066GHz-Socket-775.jpg_640x640.jpg (631x640, 135K)

2GB of RAM is more than enough, go with 4GB if you are rich.

T. user 2007.

Bottlenecks something as low-end as a GTX 660.
Kill yourself

Wrong.
Insurgency Sandstorm uses 8 cores on my 1800X.

I have a Core 2 Duo and it's already dropping frames when watching YouTube 1080p 60fps. It happens less when using Chrome vs Firefox.

I made fun of what people said in 2008.

Granted, by the time games took advantage of quad cores, the old quad cores were outdated, at least at stock clocks.

I love Insurgency and Day of Infamy, but Sandstorm is a load of shit in terms of performance and visuals. The design changes to incorporate AI helos, the AP-rounds-vs-armor situation, and ditching their best game modes (Ambush and Elimination) in favor of Push on maps that are mostly badly designed (DoI and Insurgency Source have some bad map design too) mean that overall, the very best thing I can say about Sandstorm is that they pushed the release date back to December. If they had called it launched in September, they would have been murdered in reviews. But it's still not enough time to fix the game; I think they're under publisher or other financial pressure, not adapting well to UE4, and also making some bad design choices. The audio is still great, but almost everything else points me towards refunding.

Attached: reloading is for fags.webm (976x550, 2.98M)

>You don't need gaymes
ftfy

Multicore games literally don't exist. The only difference you see in benchmarks is that Intel uses older-gen chips for i3 and i5 and still calls them 8th or 9th gen, for example.

The multicore gaming meme is so bad that China doesn't let Intel market its 9900K as "Intel's best gaming CPU", because China knows multicore games are impossible.

John Carmack did a lecture on latency last year; he said true multicore games will never ever happen due to the huge latency increase.

I upgraded from my E6600 to a Q9550 just last year, hoping it'll keep me at least till 2021.
I would have bought the Q9650, but it was double the price. Anyway, I absolutely wanted the Q9550 for that sweet 12MB cache. I even slightly upgraded my motherboard from a 1066MHz to a 1333MHz FSB.

Link to the Carmack lecture?

One thing I'm nitpicking about is that my Q9550 isn't the E0 stepping, it's C1.
I especially would have loved the low-power S version, which cuts 30 watts off the power consumption.

>E6600
>Only 2.4GHz
A true 6600 is supposed to be an overclocker. Mine runs at 3.0GHz stock and even then I overclocked it to 3.2GHz.

Gaming computers are retarded when consoles are sold as loss leaders.

...yeah, mine stutters on 720p 60fps. I'll be upgrading at the end of the year when the holiday sales hit. Stuck around with this boy for over a decade. Makes me sad. Reminds me of my own mortality and eventual obsolescence.


I'm 20.

Mine has 6 cores. I'm sure it still bottlenecks but I don't care as I can get a solid 60 FPS on any games I care about playing.

Attached: Z400.png (683x533, 45K)

you will never need more than 640KB of RAM

Is that still running on LGA 775?

FCLGA1366

youtube.com/watch?v=lHLpKzUxjGk

It's mostly about latency, but he gets to multicore at one point and draws to scale how much latency even one extra core adds, and that's not even for the rendering engine, which is where you would actually get performance gains.

To have a multicore rendering engine, the game would literally have like 1-2 seconds of lag and run at literally 2-3fps.

>You will never need more than a dual core for gaming.
Only the retardiest of retards said that.
Like the people making fun of the PS3's Cell because "multi-core gaming" or "developing games that utilize multiple cores" would "never be a thing". As if it wasn't obvious that single-core was at its limits and multi-core was the logical step; after GPUs got unified shaders, CPUs were next in line.

>Games properly using multi-core
Kek, where?
Quad cores were released a decade ago. Games still fail to properly use even 2 cores

Except not everything in the game has to run perfectly in sync; you can have job threads for everything from AI to physics.
Plus you can have the internal engine and the actual front-end graphics API (like Vulkan) on separate threads.
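
A minimal sketch of that job-thread idea in C++, using plain std::thread and a shared queue. This is purely illustrative and not taken from any shipping engine; the JobSystem class and the AI/physics lambdas are made up for the example.

#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Tiny job system: worker threads pull independent tasks (AI tick, physics
// step, audio mix, ...) off a shared queue while the main thread keeps going.
class JobSystem {
public:
    explicit JobSystem(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { WorkerLoop(); });
    }

    ~JobSystem() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }

    void Submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }

private:
    void WorkerLoop() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // runs on a worker, not on the render thread
        }
    }

    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    JobSystem jobs(std::thread::hardware_concurrency());
    jobs.Submit([] { /* AI tick */ });
    jobs.Submit([] { /* physics step */ });
    // meanwhile the main thread would keep submitting draw calls to Vulkan/DX
}

The main/render thread keeps talking to the graphics API while the workers drain independent tasks; nothing has to run in lock-step.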

Did you even watch his lecture?

Plenty of games utilize 4 cores, but beyond 4 there are very few.

As a matter of fact, the only software that I've ever seen use all the cores properly is waifu2x (before I switched to caffe with CUDA)

Dude can't even spell *latency*, what did you expect?

wowhead.com/news=287727/new-multithread-optimizations-coming-in-patch-8-1-tides-of-vengeance

Even the
>tiny
>indie
>company
is laughing at you corelets.

Games will utilize one or two cores more heavily but have an extra 2-3 cores running other things. Just open Task Manager while gaming and select the per-core view.

>have extra 2-3 cores running other things
Yeah, they use the 1st core at 80% and the others at like 15%.
Complete waste

What are you nigglets smoking? Dried up diarrhea?
This is Doom 2016. Just Doom 2016 running with highest settings at 4k and staring into the distance.

Attached: doom2016.png (567x360, 32K)

>One example
Great. Now look at all the games released in the last decade that don't do that.

This is the same CPU at idle, with just Jow Forums and this thread open.
Notice the difference in how much and how many of the cores are utilized? Is it 1 or is it most of them? (Yes, spread-out usage is better, as cores can run at lower frequencies, use less power, and produce less heat.)

Attached: idle.png (563x360, 13K)

>utilizing all 16 cores to render your Top 10 Best youtube video

do you get microstuttering?

>released in the last decade
Nobody claimed that every game made in the last decade utilizes multiple cores properly. Can you quote the post that did?
Modern games and games released in the last few years that need the power do though.

Here is GTA V for example, a game several years old already that also had PS3 and Xbox 360 ports.

Attached: gtav.png (562x367, 22K)

FreeSync monitor with no FPS cap.

Witcher 3 in Novigrad is also a good example of multithreading use; there are plenty of games that use multiple cores nowadays.

>that also had PS3 and Xbox 360 ports
Which makes it even less excusable for games to not properly use multicore.
The PS3 was released in late 2006 and was already multi-core.

It's good enough. Ignore the 6 threads with low usage; I'm pretty sure it's smart enough not to use the hyperthreaded logical cores. It seems to be using two main cores and four extra.

Attached: witcher3.png (560x354, 25K)

dual core shitters should be hanged on lamp posts

DX: Mankind Divided

Attached: mankinddivided.png (556x351, 16K)

It's really hard to develop for multicore. I read the Factorio dev blog. Interesting read.

Forza Horizon 3

Attached: forza3.png (563x363, 38K)

It's hard if you're a streetshitting indian.

Wildlands runs at 60% CPU usage evenly across 6 cores on my 4.1GHz Wintel.

>that horrendous dark theme

wtf
I don't get that kind of nice evenly spread CPU usage with my AyyMD CPU!
What do?

You upgraded to a 10-year-old processor??

I use it purposefully, but I'm glad you dislike it.

Attached: ss.png (1191x825, 190K)

Hei hei, it's still an upgrade.

At this point even a FX chip would have been a better "upgrade".

>4k
Well no shit, Sherlock. Your system is GPU bound. Turn the resolution way down and then report the CPU usage. Unless you have a monster GPU, your point is moot.

No practical difference at 1080p, mostly because of Vulkan.
The point wasn't CPU usage by the way, Sherlock, it was core count utilization.

Attached: doom1080p.png (559x355, 18K)

Core count utilization? It's utilizing all of your cores. Half of them aren't even real cores, so it makes Task Manager look wonky.

Take your image, remove half the boxes, and double the graph values in the remaining boxes. That is your true CPU utilization.

And you still didn't list your GPU. I'm going to assume it's 2-4 years old and mid-tier.

In that case, turn DOOM to absolute minimum graphics, like 640x480 @ lowest, then report CPU usage.

Task Manager doesn't tell you the game is multicore, you retard. Watch JayzTwoCents' video of disabling cores on an Extreme Edition processor all the way down to 2 cores.

He gets the same FPS in everything, even on 2 cores, and he tried like 15 different games with DX12 modes and DX12 patches that supposedly added "multicore".

It doesn't exist. Get over it.

lol look at anon's picture, you're literally too retarded

Half of the cores in user's pic don't even exist.
Try again

>vulkan
That'd be nice for developers to implement

Yes, how many cores a game uses.

>Take your image, remove half the boxes, and double the graph values in the remaining boxes. That is your true CPU utilization.
You could just look at the total CPU usage percentage, dummy. We are talking about how multicore-friendly games are.

We are not talking about a CPU bottleneck or anything related. You seem to be having a hard time understanding what's going on.

>and you still didn't list your GPU
You could have just asked in the first place; it's an RX 580.

>In that case, turn DOOM to absolute minimum graphics, like 640x480 @ lowest, then report CPU usage.
Not what we are talking about but since you're so curious, here's the picture.

Been there, done that. Ryzen Master makes such testing really easy.
The more cores I disable, the more the remaining cores get utilized; that's kind of the point of scalability and SUPPORT for more cores. If the core count gets too low and a single core's utilization maxes out, then the framerate will start to suffer.
You seem to be having a tough time understanding what's going on here.

Yes, that's called hyperthreading/simultaneous multithreading. That's the point of it, different threads can run on both. Two virtual threads for a single physical core.

Attached: doom-low-400p.png (556x349, 23K)
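
The scaling behaviour described above (disable cores, the remaining ones get busier, and FPS only drops once a single core saturates) is basically Amdahl's law. A tiny sketch with a made-up parallel fraction, just to show why the curve flattens; the 70% figure is an assumption for illustration, not a measurement from any game:

#include <cstdio>

// Amdahl's law: overall speedup = 1 / ((1 - p) + p / n),
// where p is the fraction of a frame's CPU work that can run in parallel
// and n is the number of cores.
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double p = 0.70;  // assumed parallel fraction, purely for illustration
    const int core_counts[] = {1, 2, 4, 6, 8, 16};
    for (int n : core_counts)
        std::printf("%2d cores -> %.2fx speedup\n", n, amdahl(p, n));
    // Prints roughly 1.00x, 1.54x, 2.11x, 2.40x, 2.58x, 2.91x: the curve
    // flattens fast, which is why dropping a few cores barely changes FPS
    // until a single core finally maxes out.
}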

>It's utilizing all of your cores.
That was the point of anon's picture.

Thing is, you will get mostly the same graphs on 2, 4, 6, 8, or 16 cores, and probably the same performance.

>Two virtual threads for a single physical core.

That's the point you're missing. Doom detected 6 cores, so it's running in 6-core mode, not 12. Forcing 12-core mode wouldn't make it run any faster, as that fake hyperthreading doesn't get more performance out of 1 core by running 2 threads at once; it just switches between the threads faster. 1 core can only run 1 thread at a time, and forcing it to run 2 just adds overhead, not performance or efficiency.

>Look at a test to see if games truly use multi core CPUs
>They used different CPUs instead of using only one and manually disabling cores
>mfw
It's Intel levels of rigging

Attached: Smile.jpg (600x600, 94K)

Try forcing 12 core mode and see what happens.

steamcommunity.com/app/379720/discussions/0/351660338733413841/

I want to be unironic but it's so hard when people are being dummies.
Sweety, that's exactly the point of having multiple threads over physical cores.

No it's not, dipshit. Do you even know how virtual threads work on 1 core?

It depends. Look at Witcher 3 for example in pic related: red are the physical cores, green are the hyperthreaded ones. It depends on how the game itself handles them.

We are talking about core count utilization, not speed. What's so hard to understand?
Also no, that's what hyperthreading is for.

Again, what's so hard to understand?
en.wikipedia.org/wiki/Hyper-threading
>The main function of hyper-threading is to increase the number of independent instructions in the pipeline; it takes advantage of superscalar architecture, in which multiple instructions operate on separate data in parallel.

Attached: 1539173753195.png (560x354, 29K)
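
For what it's worth, the portable way to see what the OS exposes is std::thread::hardware_concurrency(), which counts logical processors (hardware threads), not physical cores, so a 6-core/12-thread chip reports 12. A minimal sketch; the note about OS-specific queries is an assumption about how you would dig deeper, not something from this thread:

#include <iostream>
#include <thread>

int main() {
    // Reports logical processors (hardware threads). It does not separate
    // physical cores from their SMT/Hyper-Threading siblings; for that you
    // would need OS-specific queries such as GetLogicalProcessorInformation
    // on Windows or parsing /proc/cpuinfo on Linux.
    std::cout << "logical processors: "
              << std::thread::hardware_concurrency() << '\n';
}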

Hyperthreading is like feeding one mouth with two spoons

That's what I was doing; by default it uses all cores, including the hyperthreaded ones. The picture showed it perfectly utilizing all cores, hyperthreaded ones included.

Attached: threads.png (1226x93, 77K)

Playing MHW and FFXV maxes out my quad-core to 99%

>hyperthreading doesn't get more performance out of 1 core by running 2 threads at once

Attached: 0008.png (200x180, 7K)

>doesn't know how hyperthreading works
>doesn't understand that making more threads for the same application doesn't magically make code faster

Attached: 1508532704780.gif (245x239, 359K)

>make code faster
not how HT works

What are you even arguing at this point? That all modern 3D games should be raping your processor to absolute 100%?

No, just making fun of a retard who doesn't know how things work.
I couldn't care less about the /v/ tards here.

Just here to misrepresent people's posts via greentext and post anime reaction pictures? Speaking of /v/tards...

>that all modern 3D games should be raping your processor to absolute 100%?
But they are, if optimized properly and run uncapped with an equivalently performing GPU.

not if it's a piece of shit that can't use multiple cores

Even a perfectly optimized game won't get true 100% CPU usage, even if it shows 100% in Task Manager.

Obviously, but this is not a problem, since we don't need games running at several hundred frames per second when our screens or graphics cards can't keep up anyway. This thread started as a joke that games will never use multiple cores, and that has already been proven false to anyone who still thought it was true.

>anime reaction pictures
>violated heroine
>anime
kek

Then slap a 1030 or a 750 Ti in there and call it a day.
Video playback sucks nowadays without hardware video decode.

This. The codecs aren't even meant for software decoding, and every modern GPU or SoC has built-in support for them.
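
If you want to see which hardware decode backends are actually available, FFmpeg's libavutil can enumerate the device types a build was compiled with. A hedged sketch, assuming the FFmpeg development headers are installed; the build command and file name are mine, not from the thread:

// Build: g++ hwprobe.cpp -lavutil
extern "C" {
#include <libavutil/hwcontext.h>
}
#include <cstdio>

int main() {
    // Walks the hardware device types this FFmpeg build supports
    // (VAAPI, DXVA2/D3D11VA, CUDA/NVDEC, VideoToolbox, ...), i.e. the
    // backends a player can use to offload video decoding to the GPU.
    AVHWDeviceType type = AV_HWDEVICE_TYPE_NONE;
    while ((type = av_hwdevice_iterate_types(type)) != AV_HWDEVICE_TYPE_NONE)
        std::printf("hw device type: %s\n", av_hwdevice_get_type_name(type));
}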

Faster CPUs and more memory only encourage lazy coders to write more shit code.

False. DXVK really needs a quad-core.

>bottleneck
Kys.

Sure, if you wanted to play games from 2004.

This. I played the beta and it was horseshit.
If the full game comes out and isn't an improvement, I'll get a refund.

In 2007 a friend said I was dumb for getting a Q6600; then I showed him my computer running UT3 and WoW simultaneously on two monitors and he thought it was witchcraft.

>that fake hyperthreading doesn't get more performance out of 1 core by running 2 threads at once,
Yes it does, idiot; that's literally the entire point. Not to mention that Hyperthreading is the fake SMT, not the other way around.

I upgraded from an Exxx dual core to this; it's still solid.

>Minecraft on dual core Pentium
>Absolutely unplayable LAN lagfest
>FX-6300 @ 4GHz
>Absolutely no lag, even when loading chunks while flying
Explain yourselves, Jow Forumstards.

A quad core i5 is Perfect for Games™

I use a based FX-6300 too.

Based

Attached: 1442691134843.jpg (493x655, 24K)

It's actually very good for how cheap it is. I'll upgrade in 2020 when based Zen 3 comes out, mostly because I live in a hot country and I need a lower TDP.

Why do you have 5 USB drives?

Bump