Help a brainlet out

when the new nvidia cards have GDDR6
why do pcs still use ddr4 ram? why cant pcs also have GDDR6 on the motherboard?

Attached: 81yuvhTTlkL._SX425_.jpg (425x266, 20K)
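short version, in case the video dies: GDDR6 on a graphics card and plain DDR4 on the motherboard are different designs. GDDR runs much wider buses and higher per-pin transfer rates (at the cost of latency and power), which is what GPUs want. a rough peak-bandwidth sketch with typical illustrative numbers, not exact specs for any particular part:

```python
# Rough peak-bandwidth comparison: system DDR4 vs a GPU's GDDR6.
# All numbers are typical/illustrative, not exact specs for any one product.

def peak_bandwidth_gb_s(transfer_rate_gt_s, bus_width_bits):
    """Peak bandwidth in GB/s = transfer rate (GT/s) * bus width in bytes."""
    return transfer_rate_gt_s * bus_width_bits / 8

# DDR4-3200, dual channel (2 x 64-bit)
ddr4 = peak_bandwidth_gb_s(3.2, 128)      # -> 51.2 GB/s

# GDDR6 at 14 GT/s on a 256-bit bus (roughly an RTX 2070-class card)
gddr6 = peak_bandwidth_gb_s(14.0, 256)    # -> 448.0 GB/s

print(f"DDR4 dual channel: {ddr4:.1f} GB/s")
print(f"GDDR6 256-bit:     {gddr6:.1f} GB/s ({gddr6 / ddr4:.0f}x)")
```

so it's not that the motherboard is "behind", it's a different trade-off: system RAM optimizes latency and capacity, GDDR optimizes raw bandwidth.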

:^)
youtu.be/pbgvzVgfoSc

thanks

what the frick, i still use ddr3 ram in my main pc

I use DDR2

Attached: Q6600.jpg (1080x1080, 60K)

man that's slow as fuck

not really. I used a core2duo overclocked to 4.5GHz from 2008-2017 and everything ran at 75+ fps, besides BF4 at 45+ fps and BF1 at 30+

upgraded because pubg ran at like 24fps, but since they've made it more optimized it runs at 50fps. kinda wish I had waited for pubg to be optimized instead of upgrading, because I think CPUs have been basically the same performance in games since 2011, and the next big jump in architecture is prob 2020 or something, when intel releases the shit they've had in RnD for the last decade.

>core2duo overclocked to 4.5ghz
man, why do they even make new CPUs anymore

low key think intel pays games like DayZ and Pubg to make shit games to force people to upgrade.

DayZ ran like shit, but when they put it on PS4 and upgraded the PC version the FPS went from like 24 to 75+, and a similar thing happened to PUBG.

I knew heaps of people that got new computers for DayZ, and same for PUBG. luckily I waited for DayZ to be optimised and was rewarded with triple the fps without upgrading, but for pubg I didn't wait and could have had a similar experience. something about a game's FPS increasing by 200% is really dodgy, and I don't think it could be pure incompetence. seems intentional to me, and both seemed to happen about a year after release.

I kinda wish I had kept the system. once I got 7th gen I pushed the core2duo to 5GHz and it ran for a while. I think the motherboard died but the CPU still survived it. you would be surprised how much you can actually push old hardware.


i wish i had kept it and tested the 4.5GHz core2duo with a 1060 or something. wouldn't be surprised if it performed about the same as my new cpu. games use shit all cpu, and the ones that do are dodgy as fuck. Metro for instance is one of the few games to perform worse on 2 cores than 4+, but if you do a config file edit it performs the same on all core counts.


some dodgy shit is going on with "intel gaming" i think.

my man.

Attached: x3220.png (810x615, 101K)

it's honestly not that far out of the realm of possibility that intel approach games like PUBG and DayZ and make shady deals. both those games didn't know they would sell a lot, and i'm sure intel earns way more from people upgrading to play them than the devs would get from selling more copies to people on crappier computers. obviously in the 2000s intel and nvidia supported developers doing actually cutting edge shit and made hardware optimized for limit-pushing games like hl2 and crysis, but i legit think in the 2010s they gave up doing that and probably now just gimp games on purpose to drive sales.

learn from the techcuck.

like intel/nvidia selling 2mil people a $500 cpu and a $500 gpu off the back of a game that sells 1mil copies is 2 billion dollars in revenue, and it probably only costs them like $20 to make a 7gen intel CPU (including RnD, because it's mostly similar to the 2011 architecture) or maybe $100 to build a 1080 (including RnD)

that's like 1.7 billion dollars for doing a shady deal with some DayZ or Pubg programmer/publisher. they would totally do that.
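the napkin math above roughly checks out if you read it as 2 million upgraders each buying both a $500 cpu and a $500 gpu (my reading, the post is ambiguous about copies vs buyers):

```python
# sanity-checking the napkin math above, at face value:
# 2 million people each buy a $500 cpu and a $500 gpu because of one game
buyers = 2_000_000
revenue = buyers * (500 + 500)   # $2.0 billion
cost = buyers * (20 + 100)       # ~$240 million to actually make the parts
profit = revenue - cost          # ~$1.76 billion, i.e. "like 1.7 billion"
print(revenue, cost, profit)
```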

Wait when did pubg get optimized?

like a "we pay you to make your game shit for 1 year regardless of how popular it gets" contract. 1.7 billion dollars richer.


if you were making pubg or dayz, even if you knew they would be successful, you would take a 100mil check from intel because it's instant cash and you don't have to wait for your game to sell 9mil copies to people with crap computers to earn that 100mil.

not sure exactly when, might have been patch 1.0 in December or slightly after, early this year. my fps went from like 80 to 120 on a 1060, and from 24fps to 50fps on my HD 6970

I run ddr3 alongside my OCd 4690k

but the DayZ jump was even bigger. i would literally get like 23fps on my core2duo 4.5GHz and 6970, and then, basically just like pubg, a year after launch they suddenly patched it to DX11 i think and the fps went up to 70-80


the fact the "performance" patch improves performance on old hardware more than new hardware just seems dodgy to me, like they are actually doing something else and the "new version of UE4" or "DX11" is a cover and they're really just flipping a switch. how the hell can a new version of UE4 double fps on old hardware but only add 50% on new hardware, and how can going from dx9 to dx11 in dayz triple fps on old hardware? i imagine on new hardware the improvement would be less noticeable so no one calls it dodgy.
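fwiw that exact pattern falls out of a boring bottleneck model, no conspiracy needed: frame time is roughly max(cpu time, gpu time), so a patch that cuts CPU work helps a CPU-bound (old) rig way more than a GPU-bound (new) one. toy numbers, purely illustrative:

```python
# frame time ~= max(cpu work, gpu work) per frame, in milliseconds;
# fps = 1000 / frame time. Numbers below are made up to illustrate the shape.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

# suppose the patch halves CPU work per frame in both cases
old_rig = (fps(40, 12), fps(20, 12))   # CPU-bound: 25 -> 50 fps, a 2x jump
new_rig = (fps(10, 8), fps(5, 8))      # GPU-bound: 100 -> 125 fps, only +25%
print(old_rig, new_rig)
```

same optimization, wildly different percentage gains, just because the old rig was sitting on the CPU bottleneck.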

I bet. Intel does have enough money to throw at game companies to endorse their new CPUs. It wouldn't surprise me if they paid developers to push unoptimized or "CPU specific" code, then the developers roll back that codebase once intel quits paying them for that game. That goes to show that Intel architecture hasn't fundamentally changed; rather they added a whole bunch of bullshit on top of their existing CPUs and rebranded them as next gen.

i don't think it's intel being shit, i think it's just been impossible to make a better CPU for gaming since like the late 2000s, or maybe even mid 2000s. i really do feel like going and buying some old CPU hardware and getting it working with modern GPUs to test.

it always seems to be CPU limited games that this shit happens with. never seen or felt that nvidia did something dodgy like this, other than maybe paying Crytek to make crysis ridiculous, but at least we got a pretty game, and with GPU shit you can generally turn it off.

leave

>how the hell can a new version of UE4 double fps on old hardware but only add 50% on new hardware, and how can going from dx9 to dx11 in dayz triple fps on old hardware? i imagine on new hardware the improvement would be less noticeable so no one calls it dodgy.
I wish I knew the specific differences between the DirectX API versions. It seems like 11 might be more optimized, or works extremely well on old hardware, since that hardware has been around long enough to be really well understood.
Diminishing returns, dude.

i think the best times to buy a CPU were 2001, 2008, and 2011. buying at other times, or multiple times between or after those points, was a retarded waste of money.

like buying an AthlonXP 3200+ or something
like buying a core2duo that can oc to 4.5 in 2008 (forget the model number, sorry)
like buying a 2500k.

sure, you can show benchmarks of a 2500k vs an 8350k or whatever, but in games they perform the exact same despite being years apart.

intel scam the fuck out of gamers, but i'm not sure it's their fault. i don't think they literally know what to do to improve gaming performance unless they spend billions on a "true" new architecture, not the incremental shit they reuse for 6-8 years at a time.

you're just pissed off you bought like 5 cpus in the last decade and realized it was utterly pointless. you could have got 2 and had the exact same performance in games, or got 1 and had similar, or good enough.

yer but the reason it makes no sense is DayZ and PUBG were CPU limited, not GPU limited, and then suddenly they got massive CPU improvements. going from DX9 to DX11 can't suddenly fix the fact that Arma2.5 or whatever they use for dayZ is a shithouse engine for cpus.

DayZ is an Arma 2 TC mod... DX10 moved a lot of shit that was done brute force on the CPU to the GPU, and 11 and 12 continue that trend. A properly made modern game offloads as much as it can to the GPU for maximum performance.
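one concrete mechanism (a toy model, all costs made up): older API usage tends to pay CPU overhead per draw call, and the batching/instancing that DX10+ era engines lean on collapses that overhead. on a CPU-limited rig that alone can look like a "magic" performance patch:

```python
# toy cost model with made-up numbers: CPU microseconds spent per frame
CALL_OVERHEAD_US = 10.0   # cost to validate/submit one draw call
PER_OBJECT_US = 0.5       # cost to prepare one object's data

def frame_cpu_us(objects, instanced):
    if instanced:
        # one big instanced call for the whole batch
        return CALL_OVERHEAD_US + objects * PER_OBJECT_US
    # one draw call per object
    return objects * (CALL_OVERHEAD_US + PER_OBJECT_US)

n = 5000
naive = frame_cpu_us(n, instanced=False)   # 52500 us -> ~19 fps CPU ceiling
batched = frame_cpu_us(n, instanced=True)  # 2510 us -> ~398 fps CPU ceiling
print(naive, batched)
```

the GPU does the same rendering work either way; only the CPU-side submission cost changes, which is why CPU-bound scenes benefit the most.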

Just download faster ram moran