Gay Mans Next Ass is getting DESTROYED over Steve's latest idiotic video

>Youtube compressed
>Uses slow motion scenes
>Ignores the fact that AMD CPUs can do it whilst Intel can't.
>Does not understand how video compression fucking works.

youtube.com/watch?v=jzaXvEPyKd0
reddit.com/r/Amd/

Read the comments for the lulz.

And here is an explanation of video compression for dumb fucks like Steve.
youtube.com/watch?v=r6Rp-uo6HmI

Attached: BoilingMaleCrustacean-small[1].gif (500x288, 1.26M)

Additional cancer
reddit.com/r/Amd/comments/c2qwik/gamers_nexus_explaining_amds_misleading_marketing/

based

>e-celebs and reddit on one thread
Someone nuke Jow Forums

AMD BTFO

INTEL BTFO

Wait until the Twatter screenshots start flooding in...

amd btfo turns out it was too good to be true, imagine believing chinese lies

No one cares.

>solved 12 captchas to tell everybody how nobody cares that amd got btfo

surprised nobody has mentioned the mini-drama about tom scott and trump's election

Twitch streamers do.

>r/Amd
fuck off faggot

More like GN just annihilated AMDrones.

>reddit.com/r/Amd/
Not r/AMD, it was r/Intel that was killing him after he tweeted about this. He didn't listen and doubled down.

Only ButtHurt AMD fans seem to attack our boy Steve.
They still remember him trash talking the 1800x and their victim complex went into overdrive.
Conveniently forgetting about all the good things he's said about zen 2.
Even in the stream he said Zen 2 didn't need to cheat to beat the 9900k, but AMD decided
>Ha ha look at the 9900k at 1.6fps
It's a shit test and Gaymers Nexus goes full REEEE when people test with bad methodology

I don't give a shit how hard he sucks Zen2 dick, he simply lied too many times and doesn't even apologize for being wrong and absolutely retarded.

But he did a similar test when comparing 2700x vs 9900k.
Steve is just being autistic.

He's a cat owner so his mental illness is showing.

He showed the 2700x being every bit as fast though.
AMD didn't need to lie to make the zen 2 chips faster.
They already won that.

They have not lied though. Only with Navi which is a $500 Polaris replacement.

based

fuck pets

oh, it's you again

HOLY FUCK


HAHAHHAHAHAHHAHAHAHAH

AMD GETTING DESTROYED

Daily reminder that this is the same fat fuck who says that AMD is affected by Meltdown (WHICH IS FALSE. MELTDOWN = INTEL HARDWARE ONLY)

I bet he's just buttmad that he will not get a Ryzen 3000 CPU sample from AMD or from his contacts at the motherboard manufacturers.

Steve intentionally broke the review NDA embargo for a Ryzen CPU, and that's why AMD said no to further free review samples for Steve, so Steve mooched off motherboard manufacturers instead. But now the motherboard manufacturers say no to fat fuck assmad Steve too.

>"WAAH! I have to buy AMD Zen 2 CPU's and miss the launch day reviews because I broke AMD NDA last year. WAAAH!"
kys Steve.

>BITRATE DOESNT MATTER

Checked

Don't forget it's summer, ladies

ever wonder why there are crazy cat ladies with double digit cats and never a crazy dog lady?

>our
Fuck off.

Add it kek

lmao
Why did the dumbass double down when other places were shooting him down beforehand? The worst part is not admitting he was wrong, I hate those people the most.

>steve being an autist whos wrong
>Fuck they know.. activate the drones
>AMD BTFO

Attached: 45632.jpg (700x565, 73K)

His egotistical, arrogant smirks throughout the videos really sealed the deal for me. I unsubbed after having been subbed to him since the Ryzen 1st gen launch. There is just something else about people who can't admit that they were wrong.

According to his Twitter, he apparently did record faster-moving scenes, but "they were too hard to align in Adobe (R) Premiere (C) CC 2019 (TM)". But hey, don't fret, because he says there is no visual difference in those faster moving scenes.

HE LITERALLY JUST PULLED A FUCKING "dude trust me" rofl

Pure coincidence..........................

He posted a correction like a day later dude.
AMD also doesn't send him shit because he took a dump on the 1800x when it came out.
Why have you formed this Victim complex?

dude trust me lmao

Attached: trust.png (649x544, 99K)

JUST BUY IT GOY!

fucking dumb fat fuck needs to fuck off, why do people even watch his trash videos when L1 exists? GaymensNexus' only good content is Buildzoid's VRM rants lol.

Steve is right and AYYMDPOORFAGS are wrong

NO ONE ENCODES IN SLOW PERIOD

A lot of people are super fucking butthurt over this but there's a few things I noticed (and did at the E3 presentation).
>AMD presented the benchmark as "slow" in x264
>implied to be run on the same PC
>heavily focused towards "streaming and sharing"
The big thing here would be that if you were streaming and sharing your gameplay from one PC then you'd be using GPU encoding first. Anyone who's used GPU encoding knows that you record at one bitrate and stream at another with little impact.
As for x264 slow: I've only ever used the slow preset when I'm using a 2nd PC, and that's about the only use case in streaming; in that case there isn't a game workload going on behind the scenes.
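
Since half the thread clearly hasn't touched an encoder: here's a rough sketch of the two setups being argued about, just driving ffmpeg from Python. Everything in it (file names, the 10k bitrate, the RTMP URL) is a made-up placeholder, not anything from AMD's demo or Steve's video, and it assumes an ffmpeg build with libx264 and NVENC available.

# Sketch of the two setups: CPU x264 slow on a dedicated 2nd PC vs GPU encoding on the gaming PC.
# All names below are hypothetical placeholders.
import subprocess

GAMEPLAY = "gameplay.mkv"                     # hypothetical captured/recorded source
RTMP_URL = "rtmp://live.example/stream-key"   # placeholder ingest URL

def x264_slow_second_pc():
    """What the dedicated streaming PC in a 2-PC setup typically runs:
    CPU (libx264) encoding on the 'slow' preset at ~10 Mbit/s."""
    subprocess.run([
        "ffmpeg", "-re", "-i", GAMEPLAY,
        "-c:v", "libx264", "-preset", "slow",
        "-b:v", "10000k", "-maxrate", "10000k", "-bufsize", "20000k",
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", RTMP_URL,
    ], check=True)

def gpu_encode_single_pc():
    """What a 1-PC setup usually does instead: hand the encode to the GPU
    (NVENC here; Quicksync/VCE are the equivalents) so the CPU stays free for the game."""
    subprocess.run([
        "ffmpeg", "-re", "-i", GAMEPLAY,
        "-c:v", "h264_nvenc",
        "-b:v", "10000k", "-maxrate", "10000k", "-bufsize", "20000k",
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", RTMP_URL,
    ], check=True)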

The whole presentation was flawed and that seems to be the point of the video, but people have got lost in the sauce somewhat. It's fair to say that the PC gaming community are a bunch of autistic children, you only need to look at just about any comment section on a build guide, troubleshooting page or even product launch to see that. So it makes sense that there'd be people up in a fuss about this.
I've got a shitty old 6700k and I have to record 1080p 120fps VODs sometimes. I use a capture card and use x264 to encode and I've never had dropped frames on this level and certainly not at a wimpy bitrate like 10k. The demo was misleading and a very fringe case. A wrong case if anything.

Problem is: no one who has a 1 PC setup is going to be streaming at 10k while playing in 1440p using their CPU to encode on slow. This was what was wrong with the E3 demo. This was what he went for, but even he got lost in the sauce because it's a long-ass video and he doesn't state this within the first 5 minutes.

in short
>fuck intel's power-hungry housefires & price creep
>fuck AMD's marketing
>fuck nvidia for boost 3.0 and shit drivers
>integer scaling still not available on intel, amd or nvidia graphics solutions
Wrong is wrong. Shed your fanboyism for one minute.

Attached: 1560205081120.jpg (500x500, 125K)

...

Only because Intel can't.

Oh no noononononoononn

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

AHAHHAHAHAHHAHAHHA

>HE LITERALLY USED 10,000Kbit/s STREAMING ENCODE BENCHMARK 2 YEARS AGO HIMSELF


DOUBLE STANDARD MUCH!??!?!?!??!?!? Yo, someone poke him on twitter because I don't use that cucked botnet

Attached: GAMER nexus exposed.jpg (3768x1953, 516K)

>wasting extra GPU cycles on encoding rather than game fps.

To add onto this.

Steve's benchmarks were flawed because he ran single player benchmarks with little to no action and slow camera pans. This is a valid criticism and could have been avoided by simply recording a demo in an FPS game or even doing what they've done for Overwatch in the past & entering a custom server to benchmark (now that it supports replays). This would allow them to get a repeatable result and benchmark each encoding preset on its own merits.
This is the biggest thing they fucked up on and since they're not actually hardcore gamers and just tech enthusiasts who make some money, this isn't a gigantic issue.

It's not like they can truly replicate AMD's claims since they used a shit tier benchmark and the fringest case I've ever seen of streaming. It's the type of stream you'd do if you had 10 minutes to prepare. You hear that CPU encoding is better for streaming, slow preset, yup, but you still want to hit 1440p/144 so you max out your CPU/GPU in the process.
They could however make the comparison a little better by testing one of the titles AMD listed, but I don't think firing up overwatch would go so well for AMD.

and not on slow

What a fucking retarded faggot piece of shit, someone send this over twitter or something

AMD were misleading in that it is an unrealistic use-case
They weren't actually wrong about their shown CPU being able to do it at the same price point as an Intel part
Steve is still insufferable though

Yeah, that's exactly what I was saying. So I'm glad you got it.
He can be as insufferable as he wants, the demo on e3 was still flawed and they were right to be called out for it. His 30 minute video has little substance to it though unfortunately because he's not familiar with the topic.

>Unrealistic use case
For Intards. Twitch streamers are gonna enjoy a nice upgrade. Nice try shill.

>This guy is a liar hack fraud! I know because *third-hand information regurgitated from a 14 year old on reddit that I don't actually understand*

lmao dumb metalhead got EXPOSED

>>wasting extra GPU cycles on encoding rather than game fps.
Unless you're maxing out your GPU at 100% utilization, it's really not an issue. Playing at 1440/144 isn't going to do that for most titles either, and if it is, just cap your FPS at 133 or something. You're not going to miss out on that single player action and you'll get consistent frametimes. If you're a streamer then you'll already know where you can push your setup and how to rein it in.

GPU encoding may have shit tier quality compared to CPU encoding but no-one streams with CPU encoding on a 1 PC system unless they don't have GPU encoding available or they're not playing a game. CPU encoding is inefficient and just straight up isn't worth it unless you REALLY need the slight increase in quality it provides without having to re-encode.
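
If anyone actually wants to put a number on the "GPU encoding looks worse at the same bitrate" claim instead of screeching about it, a sketch like this works: encode the same clip with x264 slow and with NVENC at the same bitrate, then score both against the source with ffmpeg's VMAF filter. Assumes an ffmpeg build with libx264, h264_nvenc and libvmaf compiled in; the clip name and the 6000k target are made up for illustration.

# Compare CPU (x264 slow) vs GPU (NVENC) encodes of the same clip at the same bitrate using VMAF.
import subprocess

SOURCE = "clip.mkv"  # hypothetical high-quality gameplay capture used as the reference

def encode(codec: str, out: str) -> None:
    # Same target bitrate for both encoders; preset only applies to libx264.
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", codec, *(["-preset", "slow"] if codec == "libx264" else []),
        "-b:v", "6000k", "-an", out,
    ], check=True)

def vmaf(distorted: str) -> None:
    # libvmaf prints the aggregate VMAF score in ffmpeg's log output (first input = distorted, second = reference).
    subprocess.run([
        "ffmpeg", "-i", distorted, "-i", SOURCE,
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)

encode("libx264", "cpu_slow.mp4")
encode("h264_nvenc", "gpu_nvenc.mp4")
vmaf("cpu_slow.mp4")   # typically scores higher at the same bitrate...
vmaf("gpu_nvenc.mp4")  # ...which is the whole quality-vs-performance trade-off being argued about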

it's only unrealistic because it could never have happened before, retard
>dude just buy a second PC for streaming

>hating cats
Fuck off, newfaggot.

>liking cats
Fuck off, beta cancer

>For Intards. Twitch streamers are gonna enjoy a nice upgrade. Nice try shill.
>Twitch streamers are gonna enjoy a nice upgrade. Nice try shill.
Twitch streamers are already using 2 PC setups so the best this gives them is a little more headroom and nothing more if they're using AMD. If you care about frames and you care about quality then there's only one option available to you.

>>dude just buy a second PC for streaming
That's literally what you do if you care about your stream looking presentable on twitch and you still require any sort of CPU/GPU performance. It sucks but that's just the way it is. If you can't afford a 2nd PC then streaming is just a hobby for you and GPU encoding will cut it.

It's done in dedicated hardware, retard.

Maybe for some current streamers. But times they are a changing.

Updated

>Multi-core doesn't matter!
>Productivity doesn't matter!
>Price/performance doesn't matter!
>Performance per watt doesn't matter!
>Power usage doesn't matter!
>Temperatures don't matter!
>Soldered dies don't matter!
>Stutters don't matter!
>Streaming doesn't matter!
>Data centers don't matter!
>Locked CPUs don't matter!
>OEMs don't matter!
>Hyperscalers don't matter!
>Upgradeability doesn't matter!
>Anti-competitive business practices don't matter!
>Locked platform features don't matter!
>Synthetic loads don't matter!
>PCI-e lanes don't matter!
>Burnt pins don't matter!
>Heat doesn't matter!
>1771w cooler doesn't matter!
>Server space doesn't matter!
>ECC support doesn't matter!
>Free RAID doesn't matter!
>NVMe RAID doesn't matter!
>StoreMI doesn't matter!
>IPC doesn't matter!
>7nm doesn't matter!
>HEDT doesn't matter!
>Stock coolers don't matter!
>Security doesn't matter!
>Games don't always matter!
>Enterprise doesn't matter!
>Hyperthreading doesn't matter!
>VMware doesn't matter!
>MySQL doesn't matter!
>Unix doesn't matter!
>Linux doesn't matter!
>Wafer yields don't matter!
>Benchmarks after full patches don't matter!
>Asian markets don't matter!
>Own fabs don't matter!
>Chipset lithography doesn't matter!
>Cray doesn't matter!
>Cisco doesn't matter!
>HPE doesn't matter!
>AZURE doesn't matter!
>5nm doesn't matter!
>TDP doesn't matter!
>10nm doesn't always matter!
>Cache doesn't matter!
>Integrated graphics doesn't matter!
>PCIE 4.0 doesn't matter!
>Germany doesn't matter!
>Processors don't matter!
>Custom Foundry Business doesn't matter!
>Stock voltage doesn't matter!
>Framerate doesn't matter!
>Amazon sales don't matter!
>Prime95 AVX doesn't matter!
>*NEW* Large corporations don't matter!
>*NEW* Your DATA doesn't matter!
>*NEW* Context switch duration doesn't matter!
>*NEW* 202x market outlooks don't matter!
>*NEW* VRM doesn't matter!
>*NEW* Bitrate doesn't matter!
>*NEW* Single core performance doesn't always matter!

Goddamn you amd basedboys sure pitch a fucking fit whenever someone stops blowing your lifestyle identity brand for even a second.

Well let's be real here. Intel shills have been doing this for many, many years.

>But somebody else acts like a faggot too! That's why it's fine if I blow my dog.

>Maybe for some current streamers. But times they are a changing.
They're not "a changing". You still need your CPU free to play the game on a 1 system setup, and you'd never use CPU encoding for that if you have a dedicated GPU. A dedicated GPU will do a much more efficient job at the price of quality.
Once you care about quality you have to make a trade off in CPU cycles and that's just unrealistic.

If times are a changing, they're changing for casual gamers rocking a Ryzen 3k series and no GPU. I'm not sure what games they're streaming and playing but I'm sure it'll matter.

fair point, but the case can still be made that it's a disproportionate amount of processing power for a result that's not worth bothering with

I'm surprised to see that a bunch of fanboys who have the edge in encoding know absolutely nothing about encoding. They're still trying to save their brand for free though.

OP and his e-friends on reddit.com are all faggots that want to fuck kids at drag queen nigger hour

I don't even know what that means

As opposed to a lesser amount of processing power that costs 3 times more just to be used for streaming? There is no "disproportionate amount of processing power", only a poor price/performance ratio that has now been thoroughly thrashed

stfu reddit nigger

Not that user, but you've pretty much got it and you pretty much get why no-one uses CPU encoding on a single PC system when GPU encoding is available.
It's reserved almost entirely for fast action scenes/increasing quality (within defined limits) on a second system for a reason. Streaming is probably the only thing that this performance metric matters in too.
CPU encoding just isn't worth it on a single system because performance is lost and it's barely worth it on a 2nd system. The benefit it does give is not using any resources on the "gaming PC". That's why you have a bit more headroom to work with on the 2nd PC.

The "slow preset" presented was a meme, and the use case AMD used was wrong. Steve coulda came out with this but he decided to get lost in his autism and make a flawed video about encoding instead. Shame he didn't ask anyone who streams or works in esports.

>but dude
>like every single streamer has a 2 pc setup already
Yea no they don't. Most of the popular streamers didn't start on two PCs. Sure, they have two now, but only because they became popular and got enough money to invest in making their streams look better.

Okay well nobody that uses a single pc streams with the slow cpu preset unless they are streaming a console via a capture card.

They sure as shit have a graphics card and truth is that they're not going to make it to the point of getting a 2 PC system if they have to rely on stream quality to get ahead. Even if

This. Slow preset is fringe use at best. If they were forced to use CPU on a single system then they'd be using fast/veryfast depending on the workload.

>well nobody that uses a single pc streams with the slow cpu preset
Yea cus it sucked until now
>unless they are streaming a console via a capture card.
Hey look another use case.

Agreed that steve should have actually done research on encoding.
Also agreed on the eternal fuck-up that is AMD's marketing division: we are now talking about whether or not the slow preset is useful rather than talking about an incredibly lowered bar of entry for high quality stream output

>Yea cus it sucked until now
No. Because people have had access to dedicated GPU/embedded encoding options for several years: NVENC, Quicksync, AMD's VCE. There are better options out there to do the job and they don't use as much power to get the job done.
In the case of streaming they allow you to maintain a high and steady framerate without a leech on performance.
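
For anyone who doesn't know what their box can actually do, ffmpeg will tell you which of those dedicated encoders your build exposes. The encoder names below are the usual ffmpeg ones (h264_nvenc for NVENC, h264_qsv for Quicksync, h264_amf for VCE on Windows, h264_vaapi on Linux); whether they show up depends on how your ffmpeg was built and what hardware you have, so treat this as a rough check, nothing more.

# List which H.264 encoders this ffmpeg build exposes (hardware availability is a separate question).
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

for name in ("h264_nvenc", "h264_qsv", "h264_amf", "h264_vaapi", "libx264"):
    print(name, "available" if name in out else "not in this build")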

it's not the bitrate brainlet

>AMD's VCE. There are better options out there to do the job and they don't use as much power to get the job done.
I probably should have mentioned that AMD's VCE probably uses more power to get the job done since AMD's GPUs are still TDP housefires. My bad. Don't care to research how VCE works and don't care.

Attached: 1559677633638.jpg (542x573, 108K)

It's not about whether anyone will use it. It's the fact that it CAN do it. It shows how powerful these new CPUs are, and that power will be used in other areas like the speed of video encoding. Not just streaming.

Great job.

Now when can I run W:ET at 2400fps?

It's like all the figgits who crow about 240Hz @ 1080p or that CS:GO runs 10 fps faster on Intel.

I swear to God CS:GO fags are the audiophiles of gaming.
I've seen people convinced that 500 FPS is so much more fluid than 300 FPS on their 240Hz displays.

:^)

github.com/forkstreasury/ZombieLoad

>bad methodology
armchair (((journalist (female)))) talking big

>our boy Steve

Yea hes not my boy, dudes a massive shill and his content is shit, please kys.

>I've seen people convinced that 500 FPS is so much more fluid than 300 FPS on their 240Hz displays.
There's a certain reasoning to it, and I can only speak for games I've been competitive in: higher FPS on most game engines = lower input latency.
Some games even tie their ping etc. to framerate nowadays. The frame times at 60, 144 and 300 Hz are 16.7, 6.94 & 3.33 ms respectively. Mouse movement is smoother, everything just feels a little more fluid if it's stable.
However, there's a point where it's a diminishing return, and that's why 240 fps is what people are after for the most part. No one's crying out for 480Hz displays.
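
The arithmetic behind those numbers, for anyone who wants to check it: frame time in ms is just 1000 / fps, and the diminishing returns fall straight out of that.

# Frame time (ms) = 1000 / fps, plus the latency you actually gain going from one rate to the next.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 144, 240, 300, 500):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# Diminishing returns in two lines: 144 -> 240 saves ~2.8 ms, 300 -> 500 saves ~1.3 ms.
print(frame_time_ms(144) - frame_time_ms(240))  # ~2.78 ms
print(frame_time_ms(300) - frame_time_ms(500))  # ~1.33 ms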

Since most games aren't played on LAN, it matters somewhat but not much. It's about comfort first.

they will once they have a CPU capable of it

Why do people keep parroting this? The absolute state of burger internet service is garbage. You know what makes everything work despite this huge flaw? Lower bitrates and better presets!

What's weird is the videos he does for Gamers Nexus are usually really high quality and informative. It's obvious the dude is not dumb, which really only leads me to think that he got a really nice sponsor deal or whatever from Intel to say, basically, whatever.

people use dual pc setups because current hardware isn't powerful enough, AMD's benchmark shows that's not true anymore incel

> The absolute state of burger internet service is garbage
You talk about people parroting memes and yet look what the fuck you are doing

>Its obvious the dude is not dumb, which really only leads me to think that he got a really nice sponsor deal or whatever from Intel to say basically, whatever.
He had a point, he just got lost in the sauce and failed to dumb it down and get across the main points.

>You know what makes everything work despite this huge flaw? Lower bitrates and better presets!
It sure does, which is why people use 2nd PCs and capture cards for the most part.

People use dual PC setups because it means no frame drops related to streaming and no excess load on their one system. If you can't afford a 2nd system and a capture card then you use GPU encoding. Even if CPU encoding gets to the stage where you can competitively play games at a high framerate while using MEDIUM x264 on the same CPU, you still won't see it spread as widely as you want. It'll go to the channels who do dank Sekiro playthroughs, but no one playing games to their limit will use it if they have the money there.

Also to be clear: FUCK ELGATO AND FUCK THE CAPTURE CARD INDUSTRY.

>Also to be clear: FUCK ELGATO AND FUCK THE CAPTURE CARD INDUSTRY.
lol, thats the truth.
I'm happy with my USB3HDCAP though, works really fucking well for the price.

>e-celebs on MY Jow Forums

Wrong. GPU encoding is worse in every way except for the performance gain, but you can play games at 120+ fps while encoding @ faster or medium with a good CPU

Are you actually fucking retarded. Come on. Fess the fuck up.
>wrong GPU encoding is worse in every way except for the performance gain but you can play games at 120+fps while encoding @faster or medium with a good CPU
Okay. Let's break this right down.
>GPU encoding is worse in every way except for the performance gain
but performance is king for most people and especially those who don't have a 2nd PC.
>but you can play games at 120+fps while encoding @faster or medium with a good CPU
See above

Now, why did I ask if you were retarded? Pretty simple all round, this is what I said:
>People use dual PC setups because it means no frame drops related to streaming and no excess load on their one system. If you can't afford a 2nd system and a capture card then you use GPU encoding.
Pretty much true. If you can't afford a 2nd system and capture card then you'd use GPU encoding and you'll get no frame drops from either CPU or GPU encoding. I'm acknowledging that GPU encoding can and will drop frames at 100% load.

>but you can play games at 120+fps while encoding @faster or medium with a good CPU
Great, you're doing that while you most likely have a GPU capable of encoding the stream itself. It won't look the best but it'll do the job, and medium isn't that much of an upgrade. You're playing at 120 fps, which implies that you think you're competitive or that it matters, but you're encoding on your CPU so it looks good for your 2 viewers, and you don't have enough income to afford a shitbox and a capture card.
Why are you playing at 120 fps and why are you even streaming? I'll accept your answer if you're in esports but otherwise you're a very, very fringe case.

If they could make capture cards more standard, I'd be happy. I'm glad the Chinese are slowly trying to break into the market, but they can't take over the high end yet, sadly.

Yes, there is a point in doing that, because the stream looks much better for the bitrate using CPU encoding vs GPU encoding, and depending on the game/settings you can easily manage playable FPS for yourself.
Your only argument is that people can choose to lower their stream quality by using GPU encoding. You're fucking retarded; there is only ONE reason to do that, and it's if your CPU is not capable enough. Otherwise you can have the best quality for your stream and playable FPS for yourself.