Hentai lovers of this world unite

this kills the kikes

Attached: 0oj5irf4vn631.jpg (1024x768, 124K)

Other urls found in this thread:

hwbot.org/benchmark/hwbot_x265_benchmark_-_1080p/
tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-11.html
click.intel.com/tuningplan/purchase-a-plan
twitter.com/AnonBabble

what does it mean

absolutely based

what is this

does this mean I can encode two videos in real time

G-G-GOYIM!!!

Attached: 1505147990486.jpg (329x329, 57K)

really oys my veys

I'll report this thread

Attached: 1494162098217.png (653x726, 84K)

We can now watch hentai in 50k fps?

22 minutes?

most likely the length of the video encode.

I'm confused, but I guess I'm happy for you, OP. . . maybe? I dunno.

How many FPS did Intel get at the same settings?

1.6 FPS

Attached: 5781710+_ba13c5e43afdb30300394e217c4c5511.png (555x770, 340K)

hwbot.org/benchmark/hwbot_x265_benchmark_-_1080p/

why are celerons doing better than a xeon?

Is this ok?

Attached: bm.jpg (875x495, 112K)

Roughly 30% better than the 2600, equivalent to the 1800X, and slightly weaker than the 2700X.

The Ryzen 3600 is shaping up to be very nice.

The 8700K gets the same FPS with the same core/thread count, but the Ryzen is running at a much lower clock speed.

D-DELET THIS, RYZEN GOOD, INTEL BAD

Now post 3800X ;)

D-DELETE THIS POST NOW

Attached: navi is finally here bros.png (653x726, 49K)

Attached: 4.9.png (855x494, 37K)

>Taking pictures of a screen with a cell phone
>Instead of using the PrtSc key on your keyboard

Attached: 1494959297869.jpg (497x640, 110K)

Now do the test at 3.0 GHz, because that's the point of the test.

The test is at 4.1 GHz, not 3 GHz.

That's the point: Intel fanboys don't realize what's right in front of them.

The fact is that AMD now leads in IPC, which means a Ryzen 3000 owner doesn't need an AIO to get 100+ FPS at 1080p (I know, who plays at that resolution?). It stresses the parts less. i7 users can brag about their shiny number 5, except it stresses the non-CPU parts by drawing more power.

pedoweebs

av1 or bust

>5.0ghz

what did amdlets mean by this?

Attached: Screen Shot 2019-06-27 at 07.01.47.png (834x1882, 449K)

Not even a single Epyc or Threadripper CPU. AMD eternally btfo.

>page 3
oh no no no

Attached: Screen Shot 2019-06-27 at 07.08.03.png (2138x1156, 506K)

Man, I'm happy AMD's picking themselves up, but the shilling's getting obnoxious lmao

A stock AMD chip competing with literal cascade and LN2 setups, and he thinks it was prudent to actually post this against AMD.

It's also 32 cores versus 18. Do you have a point, or is the truth that you don't know a fucking thing about any of this?

this multi bus is fucking insane

>ANOTHER poorshitter amd shill thread
gotta give you fags credit for your persistence haha

Remind me, are these $200 CPUs?

>SHOT ON MI 8 SE AI DUAL CAMERA
sounds fancy, but it can't even do flicker filtering?

Attached: 2019-06-27-161602_403x175_scrot.png (403x175, 56K)

A 3.8 GHz 32-core vs a 5.5 GHz 18-core.

It's like it's our first day of kindergarten and we don't know that performance scales with frequency.
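Back-of-the-envelope, ignoring IPC and scaling losses: 32 cores × 3.8 GHz ≈ 121.6 core·GHz of aggregate clock versus 18 cores × 5.5 GHz ≈ 99 core·GHz, so the core-count gap matters at least as much as the frequency gap here.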

does this test use all cores?

9900k - 67fps - 16t - 5GHz
4.1875 fps/thread - 0.8375 fps/thread/GHz

3600 - 49fps - 12t - 4.2GHz
4.0833 fps/thread - 0.9722 fps/thread/GHz

delta: +16% in favor of the 3600
ohnonononono
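
If anyone wants to sanity-check that delta, here's a minimal Python sketch using only the figures quoted above (67 fps / 16 threads / 5 GHz for the 9900K, 49 fps / 12 threads / 4.2 GHz for the 3600); the variable names are just placeholders:

# per-thread, per-GHz throughput from the posted numbers
intel = 67 / 16 / 5.0   # ~0.8375 fps per thread per GHz (9900K)
amd = 49 / 12 / 4.2     # ~0.9722 fps per thread per GHz (3600)
print(f"delta: {amd / intel - 1:+.0%}")  # prints "delta: +16%", in favor of the 3600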

This only shows that AMD can’t overclock for shit.

>amd cant clock for shit

So running it at stock means it can't clock for shit?

what kind of retarded thinking is that LOL

So how much more do I have to spend upgrading to gain 16.107 more fps in this benchmark?

Attached: Untitled.png (855x479, 27K)

But kikes live hentai

Quick mafs kills the kike

IT GETS BETTER

tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-11.html

Power draw at load:
9900k - 42.4W
3600 - 30.2W (the same figure as the 2600, but we all know the real number will be lower)

42.4W for 0.8375 fps/thread/GHz => 0.0198 fps/thread/GHz/W
30.2W for 0.9722 fps/thread/GHz => 0.0322 fps/thread/GHz/W

delta: +62% better power efficiency for AMD
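
Same sketch, next step: divide those per-thread, per-GHz numbers by the load power draws cited from the Tom's Hardware link (42.4 W and 30.2 W). This is a back-of-the-envelope metric from this thread, not an official efficiency figure:

# per-thread, per-GHz throughput divided by load power draw
intel_eff = 0.8375 / 42.4   # ~0.0198 fps/thread/GHz/W (9900K)
amd_eff = 0.9722 / 30.2     # ~0.0322 fps/thread/GHz/W (3600)
print(f"delta: {amd_eff / intel_eff - 1:+.0%}")  # prints "delta: +63%"; the +62% above is rounding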

You were the one complaining about overclocking on intel cpus... if AMD can’t do more than stock and gets beaten by an overclocked intel, well, tough luck bitch.

>DOWNCLOCK YOUR CPU NOW! IT'S NOT FAIR AMDBROS

OP is actually a pretty good score

Why is Intel so fucking pozzed and shit?

Attached: 1556325821006.png (1432x326, 98K)

click.intel.com/tuningplan/purchase-a-plan
>spend a shitton of money for a pozzed cpu
>software to overclock your house fires because muh 5niggahz
>"it's 20$ if you want a warranty over our own software"
kek

and even better

9900k - $410 (with 15% discount from $485)
mobo, cheapest acceptable - $100
16gb cheap ram - $60
total - $570

3600 - $220
mobo - $100
16gb 3733MHz RAM - $150
total - $470

$570 for 0.0198 fps/thread/GHz/W => 0.0000347 fps/thread/GHz/W/$
$470 for 0.0322 fps/thread/GHz/W => 0.0000685 fps/thread/GHz/W/$

delta: roughly +97% performance per watt per dollar in AMD's favor
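
Last step of the sketch: divide by the platform totals worked out above ($570 for the Intel build, $470 for the AMD build). Admittedly a contrived metric, but it's the one this chain of posts is computing:

# efficiency figure divided by platform cost
intel_val = 0.0198 / 570   # ~3.47e-05 fps/thread/GHz/W/$
amd_val = 0.0322 / 470     # ~6.85e-05 fps/thread/GHz/W/$
print(f"delta: {amd_val / intel_val - 1:+.0%}")  # prints "delta: +97%", in AMD's favor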

it's a literal holocaust and you'd have to be braindead to buy intel after ruzen 3 is released

>Sirs pls do the needful and test intel at lower clock speeds even though they are capable of 1000mhz over AMD

Actually pathetic.

>per watt

What the fuck are AMDrones doing?

We went through frames per thread, then frames per thread per GHz, then found how much power each processor needs for that performance, and then how much that performance per watt costs.

It's simple, really. But don't stress it; keep buying Intel.

>buy 6900k
>years later find its pozzed
>years later Windows disables overclocking
I will buy a 3700x even if it's slightly worse on gaymes rather than anything Intel desu.

Attached: 1492855909643.png (1228x1502, 944K)

Damn I want to buy AMD now

Attached: 1560930402850.jpg (800x1200, 157K)

We know how you're calculating it; we're just confused why you think per-watt is relevant for the average person outside of minor utility bill differences.

Nobody buys Intel really.

The patches have almost no effect on games and a gaymer desktop is the last place these vulnerabilities would ever be exploited. So if you can’t handle a 1-2 (if that) FPS drop then just disable the patches.

You know Intel will just come back with an even more pozzed uarch to beat Ryzen. Throw more speed holes cuz Intel fanboys are braindead monkeys anyway.

Attached: 1560272812074.jpg (480x853, 141K)

to showcase architecture differences?
heat output maybe?

Why do AMDreets get so angry that I use intel?

You paid less than me, and got a little less performance, good on you.

Yeah, they added security patches; I looked at benchmarks for what I was using and found they had no effect. Your CPU didn't need as many, congratulations.

I’m happy for you, everyone has a budget and you stuck to it, as did I. Why can’t you be happy for me?

I have a cooler from a previous system that keeps it under 60°C at load. I heard your chips run cool too. That's great.

Next gen I'll probably just buy Intel again, assuming their 3XXX competitor offers 10-15% better performance (given their current products are trading blows with brand-new AMD CPUs).

No need to get so angry over company wars.

ssd performance is cut in half doe.

3200 MHz RAM is fine and only like $80.
And the 9900K needs a cooler, a $100+ one, to get temps comparable to AMD's stock cooler.

Imagine buying a 3700X now, and 3 years later someone finds an exploit that needs a patch that causes a performance drop. What will you buy then?

I don’t see this “cut in half” thing on my 970 Pro and Evo.

>ryzen
>xiaomi
this is refugee tier

Attached: almspls.jpg (850x551, 42K)

I decided not to give Intel shills the chance to say
>no magical RAM
>90°C is perfectly acceptable

Accounting for cooling and 3200 MHz RAM takes Ryzen 3000 to over a 150% gain over Intel on that metric. And if we also consider that all the testing was done on beta BIOSes and Windows doesn't have that second optimisation implemented yet, it's literally a disaster for Intel.

The 6900K was turbo fucked. You bought it for $1000 to get multicore performance, and you spent extra on an HEDT mobo.

Then AMD released a $500 CPU that performed the same at the thing you bought it for, which soon fell in price to $350, and now 6-8 cores is the standard in consumer desktops from both Intel and AMD.

Worst buy ever.

Hentai is for degenerate virgins
Have sex

>Doesn't understand what a leak is.

My SATA 860 barely changes, if at all, in CrystalDiskMark. Maybe an NVMe M.2 would see a hit, but then again I doubt anyone can feel the difference between SATA and NVMe.

YEA BUT 365 FPS INSTEAD OF 355 FPS DOE

Maybe you should try the BIOS update next time.

Attached: 1561491994436.png (1316x1634, 85K)

You're assuming that AMD CPUs can't clock higher than their stock 4.2 GHz? Are you mad?

>disabling your ht has no power over your games

Fuck off faggot

>needs ln2 to compete with amd
That's pretty pathetic desu desu.

This. Intel's attempts at "competing" are fucking depressing.

Attached: 87ececdb623ee58299c8f843d3c81667.gif (300x192, 1.33M)

Hit right in the feels