So will the AMD bottlenecking meme finally stop now?

Also I'd rather have a few fewer frames at 1080p than deal with Intel, which has shown blatant contempt for consumers; their EULA for a vital Spectre security patch, which would have banned users from benchmarking their own systems, is only one of many examples.

youtube.com/watch?v=q7t0kA5VJ7o

Attached: AMD bottlenecking.png (1408x685, 564K)

delet this you fucking idiot, if we dont have gayms we have literally nothing

Attached: 1519151364375.png (691x750, 33K)

Got myself a 1600X at the beginning of this year.
Gladly paid 250 euros instead of 300+ for the Intel equivalent that indeed provides a few more frames. The AMD motherboard was cheaper too.

Show us Doom and Ashes of the Singularity benches too, OP.

>stock clocks
>pure hair: off
What did they mean by this?

Both are bottlenecking the 2080ti. Wait for the 9900k or 9700k to BTFO the streetshitters.

>1600x
>not going with the chad 2600

2600x wasn't out yet
also
>marginal improvements

better to wait for proper zen2

Pure Hair:
>en.wikipedia.org/wiki/TressFX#PureHair
AMD's version of HairWorks, pretty computationally expensive. Probably leave it off to avoid the impact on frames.

>stock clocks
You know what this means. It doesn't quite tell the whole story, of course - the 8700k has a ton of OC headroom, which does dampen OP's point.

Just gtfo with this shit. Nobody is or should be paying a couple grand for just two fucking components.

lol, buying raisin cpu

Attached: System.png (1631x931, 181K)

> the 8700k has a ton of OC headroom
But, but, but muh 1%!!!
Hardware reviews should include both OC and non-OC results, because the majority of people don't OC their shit.

>AMD's version of HairWorks
say no more
>You know what this means. It doesn't quite tell the whole story, of course - the 8700k has a ton of OC headroom, which does dampen OP's point.
Every time I see benches like these they also use some super fast RAM on the AMD system, and oops, stock settings on the Intel ones.

>AMD's version of HairWorks
>say no more
let's see
>License MIT License[1]
>github.com/GPUOpen-Effects/TressFX
ohh my, it's literally botnet... who knows what kind of Furry optimizations they snuck into that proprietary blob...

faggots.

Hey Julia, are you one of those disgusting eastern european crackheads ?

>stock clocks
yeah how is the 8700K even running at all without a heatsink?

What? We don't do drugs here. That's a thing for lefttards.

How often do you post this image?

Why are you asking?

>AMD's version of hairworks
>Completely DIY solution that's harder to implement
I guess most devs will just use hairworks, and whatever is part of nvidia's gameworks.
>ree why are devs only using goyworks

Cuz you never post feet, but that's the second or third time I've seen that image.

Most devs use HairWorks because Novidia offers either money or man-hours from their employees to help the studio with the game.
Novidiots would even defend a closed-source library over an open-source one.

>stock clocks when it supports my point
>max OC with liquid nitrogen when it doesn't

Attached: 1445871083869.jpg (495x503, 100K)

>offers either money
>nvidia offers them money
>intel offers them money
>software devs don't want to sell software to consumers, just intel, and nvidia
The state of amdfags
>or man hours from their employees to help the studio with the game.
NVIDIA has better customer service, so they are bad.
No one complained when AMD did the same thing with Ashes of the Singularity, or about the fact that AMD should have the advantage since they make the console CPUs/GPUs.

This just shows AMD is superior, since a woman chose Intel. Toothpaste CPUs.

GAYTRACING DOESN'T MATTER GOY!!!

Attached: 1489161877622.gif (256x256, 425K)

>toothpaste
not on the 6950x

newfag

i don't give a shit if jayztwoshekels is shilling for either brand, he's a dumb faggot

how does it feel having paid 3x the price for 1920X performance?

probably feels good when using it with content creating software.

>6-core vs 8-core
>almost a tie
>AMD WINS!!!
AYYMD Logic
Just wait for 9900k

Considering that a Meme Lake CPU like 8700 or even 7700 mops the floor with AyyMD trash, pretty good.

Intel has the core-speed advantage. Crying about Ryzen having too many cores is the same as crying about Intel having too-fast cores and demanding that all tests be done at equal clocks.
The reality is that whatever edge one CPU has over the other should be presented to people so they can know the real-world performance.
Objective core-for-core and clock-for-clock tests are good and have their place, they just aren't what end users care most about.

i3 vs thread killer?

The one valid critique is where they have room to improve. Intel can't get better yields or move to a smaller node, so more cores and higher frequencies are hard to attain right now. AMD has great yields thanks to the chiplet design, and they have 7 nm products working.

There's also the recent security flaws in Intel chips, and the artificial segmentation of LGA1151: Coffee Lake CPUs require 300-series chipsets even though they're easily compatible with 2015 hardware.

>So will the AMD bottlenecking meme finally stop now?

When it stops being true.

Attached: 1537979120825.png (630x371, 35K)

>The one valid critique is where do they have room to improve.
>if I have one fault is that I am not perfect, yet
Jesus, Porsche, tone down the ego.

Fuck off you stupid ugly attention whoring tranny.

>Having more cores is not OK
>Having more GHz is tho
Explain your logic

this tells me less than nothing

should have at least waited for Zen+

Not him but at the time of my upgrade from Incel to AMD there was no option but to pull the trigger, and fuck if I was gonna buy a shitty ass 7700K.

what's wrong with Intel's best bang for your buck consumer enthusiast CPU, goyim?

>content creating software.
get a real job

>Explain your logic
He can't, he's braindead.

Attached: 1525353398975.png (1066x600, 429K)

Reminder this tranny namefaggot was most likely neglected by his parents and is seeking attention on an anonymous imageboard.

It's getting harder and harder to find a purpose for that AMD hardware, huh.

Good.

The 1080ti only has 1.5% GPU marketshare on steam HW survey after how many years it's been out? I'm willing to bet the 2080ti will be registered in the sub-1% category.

Remember when AMDrones were saying Ryzen is meant for 1440p gaming and that 1080p is obsolete to cover up their shit performance.

Well, as expected, now that more powerful GPUs are out it's a bottleneck even at 1440p.

this is why you should always benchmark CPUs at low res

Attached: 1440pbottleneck.png (1252x726, 402K)

>inb4 some AMDnigger claims that 20 fps is meaningless

>"there were lots of cores sitting there in single-digit utilization doing nothing, so this is obviously an anomaly with the Far Cry title."

Attached: 51c6dd2e37ad746aa68123f315ebf13c.png (916x483, 276K)

More like HairWorks is a really shitty knock-off of TressFX.

>BUT I NEED 4 FPS

Any other videos about this? I'm not giving my (view) to JayzTwoBraincells.

>AMD losing FPS at 1080p compared to Intel
>has more FPS than Intel at other resolutions
This sounds like something fixed by game developers optimizing the game a little; the performance is there on the hardware side.

Pretty much game by game you will get mixed results. Things to look at: 1. the 2700X is cheaper than the 8700K, and 2. it does better in real-world applications, i.e. actual work.

>b-but muh intlel for gaymen
IT'S OVER INTLEL IS FINISHED

>jayz literally says the game engine is shit
SEETHING intlel shill

>pretty sure it does better in real world applications like for actual work
Like with what?

Wow, you saved 50 whole Euros and the only downside is that you got noticeably shittier performance.

2600 is a kino CPU.
I'm looking to buy an AMD GPU soon as well. FreeSync and the like offer way better value propositions than their competitors.

Go back to installing security patches

Irrelevant
Only thing that matters is price.

>AMD Ryzen 7 2700X $329.99

>Intel Core i7-8700K $399.99

Basically the same in games, with Ryzen offering substantial performance gains over its Intel counterpart in productivity and multitasking. Also, the AM4 platform will be supported next gen, while the Intel boards will be a dead end.

Looks to me like a clear AMD win.

Attached: 1471396486391.png (578x691, 182K)

>offering substantial performance gains over its Intel counterpart in productivity and multitasking
>productivity
>multitasking
Like what? Can I do Photoshop, Premiere, Davinci Resolve 15, and Cakewalk better with AMD?

>Can I do Photoshop, Premiere, Davinci Resolve 15, and Cakewalk better with AMD?
crickets

Attached: dovlisoh.jpg (1024x1276, 132K)

General rendering, encoding & compression, or doing multiple things at the same time. Adobe software is heavily single-threaded, so it runs about the same on both.

Is it better on AMD or Intel?

Adobe is in bed with Intel. Don't expect AMD to perform better in their software.

Fun fact: Adobe software used to have better multithreaded performance before the Ryzen release. But *something* happened to their render engines.

Adobe software runs about the same on whatever you put it on, from dual-core i3s to 28-core Xeons. At least until you start trying to multitask. Then the AMD system would pull ahead due to core count.

Attached: pic_disp.jpg (657x723, 105K)

>Adobe is in bed with Intel.

Not really. Just a fucked-up ancient codebase with very limited multithreading.

Attached: 04b02ee2ef6b313d9dd849d644ee5ad2.png (696x678, 174K)

>Adobe software runs about the same on whatever you put it on, from dual-core i3s to 28-core Xeons
I guess I'll try out the i3 then. What's a good gpu?

Depends on your budget. You can get used gtx 970s extremely cheap on Amazon right now.

Aren't the Tomb Raider games optimized especially for AMD CPUs though?

>GTX 970 vs GTX 1060 6GB vs RX 580 8GB?
The RX 580 should be better for Adobe, no?

>Aren't the Tomb Raider games optimized especially for AMD CPUs though?
No. They are just properly multithreaded, like all games should be. Games like Far Cry 5 are not optimized for anyone. They are just shit, and Intel CPUs currently deal with legacy code and bad programming practices better.

Attached: 05f54730e4792fc68ccf83a0b66dbc6f.png (799x647, 86K)

I agree Intel bro, please delid thread OP.

Attached: 0000000002.jpg (638x599, 123K)

Yeah if you can find it for the right price.

pugetsystems.com/labs/articles/Photoshop-CC-2018-NVIDIA-GeForce-vs-AMD-Radeon-Vega-1197/
I guess for me I'll go with a gtx 1060 this time.

go with whatever works for your use case

The benchmark has it pretty even in most stuff, outside of Adaptive Wide Angle (AMD doing poorly for whatever reason).
And the GTX 1060 is easier to find, so I kind of have to go with it + a DeckLink.

>pretends he didn't see "stock clocks"

Toothpaste cpus

The vast majority of people do not OC.

>le gaymen
Grow up, kiddies. Ryzen is the best option and cheaper.

Ryzen is best. Just stay away from all the software that proves otherwise.
Maybe you like to run a database, or crack passwords, but with software that doesn't use CUDA?

what the FUCK why can't amada be real

Attached: 1488432385535.png (1920x1200, 655K)

So not even AMD-sponsored titles from AAA studios will get AMD optimization, why bother

Attached: you.png (653x726, 18K)

this tells me less than nothing

Attached: 1501473416935.png (568x612, 187K)

>far cry engine
>ever not unoptimized garbage
intel only wins the "garbage game engines" race lmao

Not hard to find the 8GB for $200 after MIR.

Unfortunately the garbage game engine games are usually the most fun. And the "properly coded masterpieces" are shit games that never needed to be made for PC.

That's just your wrong opinion (´ヮ´)

>Ubishit
>fun
pick only one

Attached: 1481915065259.jpg (932x576, 52K)

Truly the most slavic of anons.

it tells you your post is fucking shit, m8

This is an Nvidia driver problem affecting Ryzen.

Jow Forums told me Nvidia never has driver problems and that the scots are fags.