Has the gap closed in 2018? Is paying the Intel tax still worth it for gamers?
It's getting closer. AMD works, but if you want the best framerates (144Hz+) and single-core perf, then Intel.
The single thread gap was
not really, i'd say. the gap is too small now. objectively speaking Intel still gets better performance in certain games due to superior single-core perf, but overall you're getting more bang for your buck with Ryzen. unless literally *all* you do on your computer is gaming, that is. if you do any sort of multi-processing, editing, or want to run a lot of tasks at once, then a 2600 or a 2700 is much more affordable and performs better than any of the mainstream Intel options.
what if im 90% gayman but i have other shit open too like chrome tabs and youtube vids but i don't do any editing at all? i'm thinking about upgrading from my i5 2500k
Real Talk:
Clock for clock, Zen/Zen+ is faster than Skylake/*lake in everything except AVX2, where it is half the speed of Skylake.
Most Intel CPUs can be clocked up to 5GHz; Zen+ still only manages 4.2GHz (golden samples of both can go higher)
Software that heavily relies on AVX2 will be faster on Skylake; software that favors single-core performance will likely also be faster there due to the ~1GHz clockspeed advantage.
You'll pay more than twice as much for an equivalent *lake CPU
If you buy a *lake CPU now, the motherboard won't be compatible with Intel's next generation; if you buy AM4 now, your motherboard will still be good through at least Zen3.
Zen2 doubles the width of the AVX units, so the AVX2 advantage *lake has will be gone. Presumably the clockspeed advantage should be wiped out too, but we don't have real information on clockspeeds yet, only speculation and hype.
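To put numbers on what "half the AVX2 width" means, here's a paper-napkin peak-fp32-throughput sketch. The core counts, clocks, and the 2-FMA-pipes-per-core assumption are illustrative, not measured figures:

```python
def peak_fp32_gflops(cores, ghz, simd_width_bits, fma_pipes=2):
    """Paper-napkin theoretical peak, assuming fp32 FMA throughput."""
    lanes = simd_width_bits // 32            # fp32 lanes per SIMD op
    flops_per_cycle = lanes * fma_pipes * 2  # an FMA counts as mul + add
    return cores * ghz * flops_per_cycle

# illustrative 8-core parts: *lake at 5.0 GHz with full 256-bit AVX2,
# Zen+ at 4.2 GHz executing AVX2 at an effective 128-bit width
print(peak_fp32_gflops(8, 5.0, 256))  # full-width AVX2, ~1280 GFLOPS peak
print(peak_fp32_gflops(8, 4.2, 128))  # half-width, roughly 538 GFLOPS peak
```

Napkin math only, but it shows why AVX2-heavy software is the one place where the gap is still large, and why doubling the unit width in Zen2 erases it.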
How much are you willing to spend on the system?
If it's a lot, go with the i9 9900K, it's the best 8-core CPU available.
If it's tight, consider the following CPUs:
> i7 8700K or 9700K
> Ryzen 2700X or 2600X
> i5 9600K or 8600K
Between the i5 and the Ryzen 5, I would go Ryzen since it's more future-proof.
You can't go wrong with any of those alternatives... just get a decent GPU.
>Zen+ still only manages 4.2
The 2700X does 4.3 out of the box tho...
Not even close.
is there any point buying a 2600x over a 2600? surely the 2600 OCs to X levels easily anyway. i'm willing to spend more on my processor, maybe not 9900k levels, but i'd consider a 9700k; it's just that that cpu is like $600 in aus, and the 2600 is $240. is there really $360 worth of difference between them for mostly gaming? i'm planning to pair it with a 2070 or 1080ti
For seconds at a time with XFR.
Manual OC won't do 4.3 on most samples.
The 2600X will get better clocks with XFR, as the 2600 will hit its 65W TDP wall.
If you're going to manually OC, just get the 2600.
i probably won't upgrade again for another 5~ years (this 2500k has lasted me since q1 2011), so should i drop the extra shekels for a 9700k, or will the 2600 do me fine? i don't have a hard budget but i'm not trying to go broke
intel is far behind if your system is patched. If it isn't patched and you're vulnerable to 300 attacks then it's slightly ahead.
>5 years
I prefer the Ryzen 7 2700
Go for the 2600. The catch is that to get the most out of Ryzen you need to buy fast RAM, and B-die RAM at that. This can cost more than the processor itself.
Oh, but then again, Intel chips also benefit from faster RAM, so you're fucked either way.
how fast we talking here
3000 CL15 or 3200 CL16 minimum for Ryzen. 3200 CL14 would be great.
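If the speed/CL pairs are confusing, compare them as absolute first-word latency instead. Rough sketch (those "MHz" ratings are really MT/s, and DDR does two transfers per clock, so one transfer cycle lasts 2000/rate nanoseconds):

```python
def cas_latency_ns(transfer_rate_mts, cas):
    """Absolute CAS latency in nanoseconds for a DDR kit."""
    return cas * 2000.0 / transfer_rate_mts

# kits mentioned in this thread
for label, rate, cl in [("3200 CL14", 3200, 14), ("3000 CL15", 3000, 15),
                        ("3200 CL16", 3200, 16), ("2800 CL16", 2800, 16),
                        ("2400 CL17", 2400, 17)]:
    print(f"{label}: {cas_latency_ns(rate, cl):.2f} ns")
```

3200 CL14 works out to 8.75 ns against 14.17 ns for 2400 CL17, and Ryzen's Infinity Fabric clock also scales with memory speed, which is why the fast-RAM advice matters more on AM4.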
>only have 2800 CL16
i fucked up didn't i
I mean, it won't brick its performance; you'll lose 5-10%
xfr will oc the cpu on its own as well as or better than most people's manual OCs.
2400 CL17 here
#yolo #idgaf
>if u have tight budget get expensive unlocked CPU that requires expensive motherboard
here, look at first gen threadripper and first gen ryzen prices: 16 core threadrippers can be had for sub-$500 and the 1800x for sub-$200
the 2000 series will go the same way, and so will the 3000, along with every single one after it.
buy a good enough cpu now, then upgrade to the 7nm ryzen new, wait for the final gen am4 to come out, or go used. this way you have an upgrade path, so it's not a full system rip-out and rebuild
techspot.com
the i7 is better for vidya, but the 2700x is not far behind (and also cheaper). pick one
Some people consider $1200 for a tower a tight budget.
No. Intel is not worth buying anymore. Too many backdoors and security bugs. Patching those out puts their performance below AMD's, and performance per $ even further below.
>had course on low level optimization
>lecturer had shitload of experience in the industry
>says amd's profiling and debugging tools are meh and that intel's are amazing
>prefers cuda vastly over opencl
Now, I trust this guy not to be an Intel or Nvidia shill, but just a guy giving his honest opinion. From a consumer perspective I'd go AMD without a second thought, but the developer in me wants nice tools and a workflow that works, so I guess I'm kinda stuck with the overpriced Intel and Nvidia stuff.
patched? explain.
he's right about cuda at least. don't know about intel, in my experience their tools are fairly mediocre
He means patched with all the security bug fixes that have come out in the last year. Intel has had, like, a dozen, and the result is that performance has dropped considerably in certain operations. Games mostly took only a small hit, though.
how do you patch a cpu?
And what sort of security? just crypto rng shit?
Have you been living under a rock?
If he had to ask at all in the first place he has. Probably /v/ermin.
>being this retarded
where have you been the past 5 years?
good goys buy covfefe lake. you're not a bad goy, are you?
fucking retard lol
>microcode
>kernel fixes
their fix is to not use sections of the cpu and features of the cpu. intel fucked itself.
For me yes, intel offers nothing but the best in terms of gaming performance, which me, being a gamer, really needs to push out the FPS.
AMD is nowhere near close, and my opponents that use amd get destroyed by me in seconds
that isn't true; intel holds many high-ranking slots in gaming performance, but even their best CPU is beaten by others in many areas of gaymen
>gaymers
>caring about security of their systems
>gaymers
>thinking they have a choice whether their system is updated or not
>gaymers
>thinking
they gotta think how to restock shelves at their grocery & retail jobs to support their manchild habit
Honestly i never really cared for low-end CPU microcode shit, i was just curious how they patched it, since i assumed it was hard coded.
as you lads said, they shut down bits of the cpu, but even then, how do they do that? does the cpu have a bios of sorts? or does it get done via the motherboard?
again though, how was this shit insecure? just RNG/crypto related algorithms, right?
>bios
Before you roast me, i mean more of a firmware. Like a CMOS
>i probably won't upgrade again for another 5~ years
I would definitely wait for the Ryzen 3000 launch. It's most likely less than half a year away, and may well provide a significant upgrade over current offerings.
just get more ram and oc
>Honestly i never really cared for low-end CPU microcode shit, i was just curious how they patched it
Mostly by having the operating system flush TLBs and caches at context switches, which is what leads to the massive performance penalties.
That being said, for the same reason it mostly only affects programs that do a lot of context switching, i.e. mostly I/O-bound programs. Programs that are primarily CPU-bound aren't much affected.
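On Linux you can see exactly which of these mitigations your kernel has applied, via the standard sysfs vulnerabilities interface. A minimal sketch (returns an empty dict on kernels or OSes that don't expose it):

```python
from pathlib import Path

def mitigation_status(base="/sys/devices/system/cpu/vulnerabilities"):
    """Map each CPU vulnerability the kernel knows about to its status string."""
    root = Path(base)
    if not root.is_dir():
        return {}  # old kernel, or not Linux at all
    return {f.name: f.read_text().strip() for f in sorted(root.iterdir())}

for vuln, status in mitigation_status().items():
    print(f"{vuln}: {status}")  # e.g. "meltdown: Mitigation: PTI"
```

On a patched Intel box you'll see entries like `meltdown`, `spectre_v1`, `spectre_v2`, and `l1tf`, each with either "Mitigation: ..." or "Not affected".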
>Just RNG/crypto related algorithms, right?
No, not at all.
>If it's a lot, go with the i9 9900K, it's the best 8-core CPU available.
>Same price as a 9820X which is a 10 core
>Same price as a 1920X which is a 12 core
So what if it is the best 8 core
Thanks intel, thanks for the patches
What am I supposed to read from this? That the security patches made it faster?
The opposite. Left is with patches, right is without them.
Well anyway, as said, it does affect strictly I/O-bound programs the most, so it's not exactly surprising that a pure disk test would be the most affected.
interesting read, thanks.
There was a piece going around saying Intel doesn't want all the patches applied because they don't want the performance loss.
Make of that what you will.
>like a CMOS
>not like a bios
holy shit you need to google what cmos and bios are.. retard
you can just NOT USE shit, like you can NOT USE your left hand for a day.
>Same price as a 9820X which is a 10 core
requires a far more expensive motherboard. X299 ain't cheap.
The 9900k CAN be run in a ~$100 motherboard if you felt like it. Though i'd recommend spending at least $180-250 for that level of CPU just to have decent VRM cooling if nothing else.
Yeah, but for performance per dollar it's definitely the Ryzen 2600/2600X. Still frustrating how Intel skyrocketed their prices.
>Is paying the Intel tax still worth it for gamers?
depends, if you're poor like Jow Forums then no it's not worth it
otherwise inlel is still top in gayming
On the CPU is a mask ROM which contains the factory firmware. After the mask ROM is burned you cannot modify it, but there is a slab of SRAM which can hold diff-patches between the factory firmware and the most up-to-date version of the firmware. Every time power is removed, the firmware reverts to the factory default, so the diff-patches have to be reapplied by the BIOS/EFI or OS at system startup.
2990wx, or the 9900k if you're poor
Note: while you can fix the firmware, you cannot fix fundamental issues with the hardware itself. You can only work around those issues, or maybe disable the hardware if there's a disable bit, but the hardware will always be fucked.
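You can also check which patch revision is currently loaded: Linux reports it in a `microcode` field in /proc/cpuinfo. A small sketch that parses it (the `0xb4` value in the comment is just an example revision):

```python
def microcode_revision(cpuinfo_text):
    """Return the loaded microcode revision string, or None if absent."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            return line.split(":", 1)[1].strip()
    return None

try:
    with open("/proc/cpuinfo") as f:
        print(microcode_revision(f.read()))  # e.g. "0xb4" after a BIOS/OS update
except FileNotFoundError:
    pass  # not on Linux
```

If the BIOS and the OS both carry microcode blobs, whichever applies last (normally the OS, early in boot) determines the revision you see here.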
The gap has been matched or exceeded in nearly all applications that aren't games. Zen2 is expected to match or exceed in gaming too, and to pull ahead permanently in multi-threaded applications EVEN with Intel's compiler, which does shady shit. If there were an unbiased compiler used in the industry at large, the gap across all products (standard/HEDT/enterprise & gaming) would be even larger.
You can fully expect for Intel to pull every shady trick it has in the book if AMD's enterprise market share gets within 15-20% of total available share.
Intel's FY'17 Revenue was $62.8Bn. This encompasses everything they do, but around 50% of this is purely enterprise. The other 50% is broken down into memory, networking, hedt, desktop, and mobile. So $31.4Bn is their approximate yearly revenue from server sales of their Xeons across their entire stack (both new products and previous gens).
As of this month, wccftech.com
1/2
Microsoft, Amazon, Google, and Baidu can all relegate that to the compute-heavy instances they offer their customers. But for everything from low-end instances, lambda/firecracker-esque offerings, and containerization abstraction suites, all the way up to high core/thread counts & performance in non-AVX workloads, Zen2 EPYC completely BTFOs Intel in cost, perf/watt, and core/thread capabilities.
Additionally, Zen2 EPYC is further Spectre-hardened over Zen1 and, like Zen1, is completely immune to Meltdown. So there are no additional patches that need to be added to the system to improve security. Unlike Intel's situation, where in order to be maximally secure you're looking at a 20% perf drop, which for TCO-sensitive platforms is a major catastrophe in cost/perf/watt risk/gain models.
Further, AMD has secured revenue streams YoY till 2020 minimum from Microsoft & Sony with the Xbox One/S/X & PS4/Pro models. AMD gets royalties on their licensed IPs & designs for each SoC sold. Then, in 2020, AMD has secured further console revenue with the Xbox Scarlet & PS5 platforms. If Navi ends up around Vega 64 performance with a 150W TDP, you're looking at minimum GPU performance between the 1080 and the Ti in standard gaming scenarios PC-side. Console-side, with 5-7 years of dedicated optimization, you're looking at 1080Ti to Titan Xp levels of performance. Assuming a dedicated 1080Ti perf target, that's roughly a 3x GPU perf uptick between current and next-gen console offerings. The addition of Zen2 (likely) as the SoC's CPU over Jaguar is effectively a 75% increase in IPC clock for clock. The combined performance difference from PS4 to PS5 will be akin to going from PS2 to PS4.
Next-gen is expected to last till 2027, so that's another 7 years of dedicated revenue from that. wepc.com
2/3
Are you autistic? CMOS is a memory medium for the BIOS.
It's just a medium for the settings of the board.
Interesting that it keeps a copy of the differences, but it makes sense: that way a restart can't brick the CPU, I guess?
Figured as much for the hardware; you can only make the electrical signals go the way they were designed to, and obviously they'd set it up so you can configure things, but only within the scope of access they intentionally designed in.
no shit dumbfuck... read what he said you stupid piece of shit.
Why are you so angry?
You can safely assume that if Xbox Scarlet & PS5 both cost around $499, then ~30-40% of that cost will be the SoC. But because it's a joint design agreement (AMD + Microsoft and AMD + Sony), you can (paper napkin math) halve that for the royalty split. So 15-20% of the $499 will go to AMD and the other 15-20% will return to Microsoft & Sony for their respective hardware sales. Assuming a conservative royalty estimate of 15%, that's $74.85 per unit sold for either company. Assuming they sell at a 1:1 ratio and move, say, 2M units combined in 2020, you're looking at $149.7M in revenue for 2020. Assuming exactly 2M combined units YoY for 7 years, for a total of 14M hardware units by EoL, you'd be looking at combined revenue of about $1.05Bn. That's a fuckton of money that can be reinvested into paying back debt and into new R&D paths to improve the uArch for CPU/GPU and the chiplet-based (revolutionary) integrative (evolutionary) designs.
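Spelling that napkin math out, with every input being the post's own assumption rather than a known figure:

```python
unit_price   = 499.00     # assumed console MSRP
royalty_rate = 0.15       # conservative guess at AMD's cut
units_year   = 2_000_000  # assumed combined Xbox + PS units per year
years        = 7          # assumed generation lifetime

per_unit = unit_price * royalty_rate
year_one = per_unit * units_year
lifetime = year_one * years

print(per_unit)  # ~$74.85 per console to AMD
print(year_one)  # ~$149.7M in 2020
print(lifetime)  # ~$1.05Bn over the generation
```

The numbers check out internally; the big unknowns are the royalty rate and whether 2M combined units per year is anywhere near right.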
PC market sales: if AMD claws back, say, 15% of the consumer market by 2025, that would be an immense amount of revenue. According to extremetech: extremetech.com
3/4
So yes, you are autistic?
Just because it stores bios info doesn't mean it can't be used to store other shit, you autist.
Any non-autistic person of sound non-autistic mind would notice, since they are not autistic, that i mean CMOS as an example of a small memory storage chip, rather then if you were autistic, whatever the fuck autistic thing your autistic mind thought of you autistic fuck autism
retard
autist
Caveats:
*Everything here is paper napkin math; take it with a grain of salt rather than quoting it with confidence. But it would be fair to estimate that AMD will, by 2020-2022, see their operating revenue increase into the range of $6-8Bn/year. That much is fair, given their current growth and expected offerings across the markets.
**Additionally, all of the above paints a very clear picture for the console market. Given how deep the integration of Zen and Navi will be for Sony & Microsoft, plus the relationship MS has cultivated with ATi and then AMD (Xbox 360, Xbox One/S/X) and Sony has with AMD (PS4/Pro), coupled with the fact that Microsoft and Sony both got burned by Nvidia's unquantifiable faggotry in console space (Xbox original and PS3 respectively), neither partner wants to work with Nvidia ever again. It's why Nvidia is so fucking buttblasted by all the gains AMD has made in console space, and why they keep branching out into AI, cars, deep learning, RTX, tensors, etc.
Which means that Microsoft & Sony have pretty much locked AMD in for the rest of console existence with AMD-designed SoCs. Zen is insanely good, and Zen2 will go into the next consoles. By the time those hit EoL, Zen5 will be out on the market, which means Xbox Scarlet II & PS6 will likely be Zen5 on 3nm EUV plus a two-generations-later successor to Arcturus (whatever that becomes). 3nm EUV with Zen5 and the chiplet + I/O approach means the successors to Xbox Scarlet & PS5 will likely see 16c/32t, 24c/48t, or even 32c/64t, plus the mid-range GPUs of 2027 offering GTX 1080Ti quad-SLI (at 100% scaling) performance for ~$250 targets in consoles. So another revolutionary leap forward in performance. Around that time, expect such a GPU to be able to do multi-sample ray tracing in games (say 10-15 rays per pixel (rpp)) while offering a minimum 4K60 baseline, or 5-10rpp for 6K60, or 1-5rpp for 8K60, and so on.
***AMD gains market permanence.
4/4
Intel's profiling and debugging tools are amazing because of all the shady shit they pulled over the last decade and a half so that AMD CPUs always looked like they performed like ass, even though they never really did. Additionally, by doing so, they gained a near market monopoly, and thus could dictate what happened where, and when. Further, with that monopoly and like 15 years of uncontested reign at the top, they could spend the time creating these "amazing" toolsets for debugging & profiling.
If AMD takes the crown and sits on it for the next 15 years, that professor of yours would say the same shit then, if he was still alive. His statement, in the moment, isn't wrong; but I'd still call him a faggot for ignoring the extreme lengths of anti-competitive and anti-consumer practices Intel engaged in that ultimately allowed them to do what he claims to be true.
>If there were an unbiased compiler used in the industry at large
If only there were such a thing.
You still get more FPS with an Intel CPU
>that intel's are amazing
I'd be curious to know what they can do that I can't do with tools like callgrind, gprof and perf.
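For comparison, the open-tooling workflow is pretty uniform across those tools: record (sample or instrument), then report sorted by inclusive cost. A minimal sketch using Python's built-in profiler as a stand-in for the same record-then-report pattern that perf and callgrind follow:

```python
import cProfile
import io
import pstats

def hot_loop(n):
    """Deliberately CPU-bound work so the profiler has something to find."""
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
hot_loop(200_000)
profiler.disable()

# sort by cumulative (inclusive) time, like `perf report` or kcachegrind
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(3)
print(out.getvalue())
```

Intel's pitch with VTune is mainly hardware-event detail (cache misses, port pressure, etc.) on top of this same loop, which `perf stat -d` also covers to a fair extent.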
AMD users are the vegans of tech enthusiasts basically.
>dislike being cucked over
>vegans
Your choices are limited to one.
Jow Forums - consumer electronics
The new Coffee Lake refresh should be at least a little better, as Intel has actually started fixing their issues. Some of them are still mitigated in software, but I assume the perf is still better because you don't need as many mitigations. Do you have benchmarks of 8th and 9th generation CPUs to compare?
Still, I'm mad at Intel, and I'm considering either a Ryzen 7 or a Threadripper box for myself. I wish Intel would release their discrete GPUs already, as they will probably be the only ones with good drivers. It's either Nvidia with garbage proprietary drivers that barely work and don't support modern standards, nouveau with performance so bad I'd have trouble playing Quake 3 Arena even on a modern and supposedly powerful GPU, or AMD with the sub-par performance of their Linux drivers, plus the lies about them being free software when they contain a blob loader.
The prices of Intel processors are ridiculous (I can buy a first-generation Ryzen 7 for less than the cost of the CPU I use at work, an i5-8400, which isn't the newest chip either). Considering that Intel has managed to build a reputation for making vulnerable chips, I don't want to lose perf every time someone finds another one that requires a software mitigation.
Oh, and I'm worried about DDR5 and whether AMD will have to release a new socket for it (Intel would do it anyway).
>as they will probably be the only ones with good drivers
Seeing the quality of their IGP OpenGL drivers, I won't be holding my breath for that.
>don't support modern standards
There are many reasons to be mad at nVidia's blob drivers, but I've never seen that being one of them. What standards have you been missing that I haven't seen?
>AMD with their sub-par performance with their drivers for linux
While I don't have a GPU for them, word on the street for quite some while now has been that the ordinary open source Linux drivers are really good these days, thanks to AMD having both documentation for their hardware as well as contributing to the drivers themselves.
>lies about it being free software when it contains a blob loader
While needing no blob would certainly be better, it's a very big improvement over the nVidia drivers to have the software side of the driver be free and open. Having open firmware is pretty much just a cherry on top.
>DDR5 and whether AMD will have to release a new socket for it
They'll definitely have to release a new socket to support DDR5, since the electrical interface is different. Whether they'll keep DDR4 compatibility, and in that way allow newer processors to keep working on DDR4 boards, is perhaps the better question. I wouldn't bet on them doing that, but they did after all do exactly that with Socket AM2+, so who knows.
>being this stupid
yes I love 144Hz on my 60Hz monitor. YOU CAN SEE THE DIFFERENCE I SWEAR.
>being this autistic
Intel is still better for games, though not sufficiently so to justify some of their insane pricing. If you're going to be GPU limited as well it will matter less or not at all, for instance there's very, very, very little difference at 4K. AMD is much better value in general, I would say you shouldn't buy Intel unless you require something very specific which AMD cannot do, whatever that could even be.
It's not used universally across the majority of the industry. So while it exists, it doesn't exist in an area where it actually matters.
Autism aside, it would trigger pretty much anyone's autism to call non-volatile memory "CMOS" in 2018. It was already pretty retarded in 1988.
Even on Windows, MSVC and ICC coexist in harmony, so it's not like ICC "dominates the industry" either.
>What standards have you been missing
KMS. My old GeForce card doesn't support it with their proprietary drivers, and when I used them I had to rely on hacks with setting the resolution in GRUB and then telling Linux to not change it. Oh, and don't forget that time when the sway developer told Nvidia to fuck off when they added their own implementation of some buffer-related stuff that was already in OpenGL and expected people to write code just for their shitty GPUs.
>it's a very big improvement over the nVidia drivers
Nvidia cards with nouveau are considered fully libre; rms recommends either Nvidia or Intel because you don't need ANY blobs to run those GPUs.
The state of Jow Forums on 4channels of cu_ck, TOP KEK.
>KMS
Oh, you meant on that side of the interface. Then yes, agreed. RandR was also a long missing feature, though I think it is somewhat up-to-spec these days.
>Nvidia cards with nouveau are considered fully libre
While true, support is pretty mediocre, and newer cards are hardly supported at all. Which is not a critique of the nouveau project, which I think is doing admirable work, but I'd prefer using AMD cards thanks to them having public documentation, than cards from nVidia which have to be supported entirely with reverse-engineering, even if the former require a firmware blob.
I mean, it's not even like nVidia cards don't require firmware, it's just that it's stored in non-volatile memory on the chip itself rather than having to be loaded at runtime. The distinction is fairly academic at that point.
I currently have "Ballistix 16GB Sport LT Series DDR4 2666 MHz UDIMM Memory Kit" and just upgraded to a Ryzen 5 2600 + B450 mobo. What type of ram do I need to go with the Ryzen?
CMOS is not like a bios. Stop making things up.
Not anymore. That was a Release Candidate of the 4.20 kernel, and they fixed that.
Why do people keep forgetting that beta/RC releases are for performing basic Quality Control?
>I mean, it's not even like nVidia cards don't require firmware, it's just that it's stored in non-volatile memory on the chip itself rather than having to be loaded at runtime. The distinction is fairly academic at that point.
I guess you haven't looked into what rms thinks about these things. Software that can't be replaced is fine in his view, so such firmware would be acceptable, just like pocket calculators. Oh, and keep in mind that the lack of a blob really matters on the free software side of things.
You do know there's a full operating system implemented in Intel CPUs, right?
what the fuck are you on about? i never said it was. i asked if cpus had something like a bios to store firmware or some shit, then later corrected that by bios i meant cmos. i even said "CMOS is a memory medium for the BIOS."
I thought it was more of a programming language than an OS?
Fair enough; I meant more that it's a small little memory medium for storing settings and shit.