Threadripper vs Intel i9

I'm looking for a new, fast CPU for encoding videos to x265; pricing isn't an issue. I've heard that AMD's CPUs are slow at video encoding, especially compared to Intel, since Intel's CPUs handle AVX workloads much faster and have lower memory latency. Which one would be better out of the upcoming lineups?

Attached: 1535923656590.png (450x266, 62K)

If HEVC encoding speed is your only concern then go with Intel.
openbenchmarking.org/showdown/pts/x265
openbenchmarking.org/showdown/pts/svt-hevc

thanks, but why does a 16 core 2950x beat a 32 core 2990wx by a large margin?

The encoder probably can't use the 64 threads efficiently (at least not for 1080p footage; might be different for 4K) and afaik the 2950x beats the 2990wx when it comes to single-core performance.

How is i5-4690k faster than i7-8700 and r5 2600?

Those are Phoronix tests; does that explain enough? I manually ran the same x265 Bosphorus_1920x1080_120fps_420_8bit_YUV.y4m /dev/null command that the test profile uses on an R7 [email protected] and got a 3-run average of 31.64 fps.
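If you want to reproduce it, the invocation is roughly this (assuming you have the x265 CLI and the Bosphorus sample clip; x265 defaults to the medium preset and prints the average fps at the end):
x265 --input Bosphorus_1920x1080_120fps_420_8bit_YUV.y4m --output /dev/null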

Threadripper 3000.
Intel bugs are a serious problem.

>recommending intel in 2019
you're an evil person

encoding on CPU is still a thing?

Those x265 benchmarks are horse shit. In what world do two E5-2660 v2 Xeons, which are 10-core Ivy Bridge processors, get beaten by a single dual-core i5-2520M Sandy Bridge laptop chip?

If you're going to post shill benchmarks, at least make sure they're even vaguely grounded in reality first.

Attached: trash.jpg (680x680, 81K)

Literally barely any torrents with h265 movies.

Or, even better: i7 8700 has a supposed score of 7, while an i7 8700k magically scores 26.

Didn't realise k SKUs encoded over 3 times faster

Why not use NVENC for HEVC?
It's a lot faster than a CPU will ever be.

worse filesize/video quality

>worse filesize/video quality
Really?
I tried re-encoding some stuff to HEVC with NVENC in Shotcut yesterday (without giving it a bitrate limit) and I honestly couldn't distinguish it from the original file in terms of quality.
Encoding the same thing on my CPU (6700K) would take 8 hours and produce a massive file, since with no bitrate cap the CPU encoder was pushing about 450+ mbit/s for 4K HEVC; with the GPU (GTX 1070) it only took about 45 minutes.
I couldn't see any difference at all between the two files, other than the NVENC version being much smaller because it was smarter about bitrate and capped it around 75 mbit/s versus the 450+ mbit/s from the CPU encode.
I'd suggest giving it a go. Maybe it was really bad before, but it works really well from what I've tested so far.
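If you want to try the same thing outside Shotcut, the rough ffmpeg equivalent of a quality-targeted NVENC encode would be something like this (the CQ value is just a guess, tweak to taste):
ffmpeg -i input.mp4 -c:v hevc_nvenc -preset slow -rc vbr -cq 24 -b:v 0 -c:a copy output.mkv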

Actually, it's because of a faster AVX2 implementation.

>why does a 16 core 2950x beat a 32 core 2990wx by a large margin?

32c Threadripper is basically a 4-node NUMA system with memory in only 2 of the 4 nodes, which completely fucks with bandwidth intensive software that can't into the topology. If you want >16c AMD, I strongly advise waiting for 3rd gen Threadripper, which will actually be UMA since all the DDR4 controllers will be on the IO die. Also, AVX IPC will more than double, plus better clocks, perf/Watt, etc.
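If you do end up encoding on a 2990WX in the meantime, on Linux you can at least inspect the layout and pin a job to a die that actually has memory attached, something along these lines (node numbers are just an example, check numactl --hardware for your system first):
numactl --hardware
numactl --cpunodebind=0 --membind=0 ffmpeg -i in.mkv -c:v libx265 -preset slow -crf 20 out.mkv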

What? I already have 2645 HEVC movies in my collection (I'm a hoarder). I've been replacing existing H264 with H265 as new encodes come out. Not a reason...

Forgot to mention the 2100+ TV downloads

I already tested it out, at the same settings I got a larger filesize

Careful OP, not everything that shines is gold. While Intel AVX-512 processors do outperform their Threadripper counterparts, you'll seldom see videos being encoded with more than 2-4 CPU cores at a time. Instead, video files are split into pieces and each set of 2-4 CPU cores works on them separately on specialized x265 server encoders. This is done because x264 cannot efficiently use more than 2-4 CPU cores. x265 bumps this up to 6-8 cores depending on preset/bitrate control, but it's still limited.

If you take, say, a 2 hour 4K movie, split it into 1 hour pieces, and encode each piece inside a separate VM on a 16-core Threadripper setup, you'll effectively get double the encoding speed compared to encoding the entire movie on all 16 cores.
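As a rough sketch, the splitting itself can be done with ffmpeg without re-encoding (cuts land on keyframes) and then each chunk gets its own encoder instance; the chunk length, CRF, and thread pool size here are just examples:
ffmpeg -i movie.mkv -map 0 -c copy -f segment -segment_time 3600 -reset_timestamps 1 part%02d.mkv
ffmpeg -i part00.mkv -c:v libx265 -preset slow -crf 18 -x265-params pools=8 -c:a copy out00.mkv &
ffmpeg -i part01.mkv -c:v libx265 -preset slow -crf 18 -x265-params pools=8 -c:a copy out01.mkv &
wait
Afterwards you join the pieces back together with the concat demuxer (-f concat) and -c copy.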

No matter what settings you use, splitting video encoding across thousands of GPU cores is going to massively degrade quality, simply because of how video encoding works. HW encoding is faster because it takes the most barebones params, roughly the ultrafast SW preset, and does very quick integer/single-precision math on them.

It's simply not worth it when it needs 8-10x the bitrate compared to SW encoding using the slow preset.

Hardware encoding uses fixed encode settings; software encoding on the CPU lets you tweak a ton of encode settings you simply don't have the option to mess with on hardware encoders like Nvidia's NVENC, Intel's Quick Sync, or AMD's VCE.

specialized x264*

>If you take, say, a 2 hour 4K movie, split it into 1 hour pieces, and encode each piece inside a separate VM on a 16-core Threadripper setup, you'll effectively get double the encoding speed compared to encoding the entire movie on all 16 cores.
yeah but realistically, no one is going to be doing this at home for their own private encode collection.

If you're an encoding GROUP that is doing REGULAR encode releases, then sure this might make some sense.

why the hell would you spin up a separate vm for each encode

What would be a good bitrate for 4K60 with the HEVC codec?
I thought that being around 40 to 75 mbit/s was pretty good actually, better than the 450+ mbit/s from the software encoding.

I suppose that makes sense, although I don't really get many quality options in Shotcut the way I export, other than VBR rate control (leaving this at 100% quality vs the source file), GOP and B-frames.
Maybe pure FFmpeg has more options for NVENC but I haven't tried that yet since I need the timeline stuff that Shotcut has.

I guess but it sure beats forking over a grand for a CPU.

Setting up each encoder in their own VM you can be sure they'll get their own dedicated CPU core affinity.

>better than the 450+ mbit/s from the software encoding.
that's due to your software encode settings, change your settings, do a longer encode, and you'll get MUCH higher quality at a much lower bitrate.
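For example, with ffmpeg and libx265, something like this (the CRF value is just a starting point) will come out far smaller than an uncapped 100% VBR encode at comparable quality:
ffmpeg -i input.mp4 -c:v libx265 -preset slow -crf 20 -c:a copy output.mkv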

>options for NVENC
Doesn't exist, because again, it's a hardware encoder, it has fixed preset encode settings that can't be changed.

>I guess but it sure beats forking over a grand for a CPU.
Most people just wait for their encodes to finish over a longer period of time.

Hell, MOST people doing their own encoding at home have a dedicated encoding rig with a mid-range CPU and they just let it run encodes 24/7

4K is really just 4 1080p resolutions stitched together so 1080p30fps to 4K30fps requires 4X the bitrate. Double that if 60 FPS like you mentioned.

I don't recommend enforcing a custom bitrate though, you're better off letting CRF determine the best bitrate for any given scene in the video. If you want to cram the movie into a specific file size then 2-pass VBR encoding would be ideal, but CRF will get you more constant quality.

In x265 28 CRF is yify-tier, 22 CRF youtube quality, and 16 CRF high quality. Anything lower than that gives you marginal quality improvements.

Also 10-bit precision encoder is highly recommended.
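Rough ffmpeg sketches of both approaches, assuming your build has 10-bit libx265 (the numbers are just examples):
ffmpeg -i input.mp4 -c:v libx265 -preset slow -crf 18 -pix_fmt yuv420p10le -c:a copy out.mkv
and for 2-pass VBR aimed at a target file size:
ffmpeg -y -i input.mp4 -c:v libx265 -preset slow -b:v 8M -x265-params pass=1 -an -f null /dev/null
ffmpeg -i input.mp4 -c:v libx265 -preset slow -b:v 8M -x265-params pass=2 -c:a copy out.mkv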

>Setting up each encoder in their own VM you can be sure they'll get their own dedicated CPU core affinity.
how about don't be a retard and just use taskset to set the affinity
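e.g. something like this to pin an encode to the first four cores (the core list is just an example):
taskset -c 0-3 ffmpeg -i in.mkv -c:v libx265 -preset slow -crf 20 out.mkv
or taskset -pc 0-3 <pid> to retarget one that's already running.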

>that's due to your software encode settings, change your settings, do a longer encode, and you'll get MUCH higher quality at a much lower bitrate.
Yeah I should fiddle around more with the cpu encoding settings, 100% vbr quality just means it's going to throw as much bitrate at the problem till it thinks it's solved.

>Doesn't exist, because again, it's a hardware encoder, it has fixed preset encode settings that can't be changed.
That's a shame, I was wondering if there was a way to tweak some settings with it.

>4K is really just 4 1080p resolutions stitched together so 1080p30fps to 4K30fps requires 4X the bitrate. Double that if 60 FPS like you mentioned.
Yeah I figured that much, although HEVC is probably a lot better in terms of filesize/bitrate than h264 which is what I record my footage in.

>I don't recommend enforcing a custom bitrate though, you're better off letting CRF determine the best bitrate for any given scene in the video. If you want to cram the movie into a specific file size then 2-pass VBR encoding would be ideal, but CRF will get you more constant quality.
I usually let it take 100% quality VBR and let the program set the bitrate itself; it probably knows way better than I do which parts need more bitrate.

>In x265 28 CRF is yify-tier, 22 CRF youtube quality, and 16 CRF high quality. Anything lower than that gives you marginal quality improvements.
I don't think Shotcut has an option for setting CRF like Handbrake does.
We have Average Bitrate, Constant Bitrate, Quality Based VBR (which is what I use at 100% quality) and Constrained VBR.

>Also 10-bit precision encoder is highly recommended.
Yeah, there doesn't seem to be an option for 10-bit HEVC export in Shotcut; I might need to ask the guys on the Shotcut forum about this.
It would be nice since it would get rid of the halos you sometimes get in darker scenes.

Careful OP, this tripfag has done nothing but shill for AMD for weeks

All blu-rays now are 10-bit h265's, so all new movie rips are

>All blu-rays now are 10-bit h265's, so all new movie rips are
just UHD blurays, standard 1080p blurays are still h264.

Was assuming Windows; setting process CPU affinity through Task Manager never seemed to work for me.

You're better off installing ffmpeg and running a script to do encoding senpai, even lets you choose 10-bit x265.
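A minimal sketch of that kind of script, assuming a folder of mkv files and an ffmpeg build with 10-bit libx265 (CRF/preset are just examples):
for f in *.mkv; do
  ffmpeg -i "$f" -c:v libx265 -preset slow -crf 18 -pix_fmt yuv420p10le -c:a copy "${f%.mkv}.x265.mkv"
done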

>Threadripper vs Intel

Attached: 1551946516712.jpg (3072x1728, 1010K)

>You're better off installing ffmpeg and running a script to do encoding senpai, even lets you choose 10-bit x265.
The only problem is that I want to make a second version with commentary as well (one version with just gameplay and one with commentary added), which is why I'm using Shotcut.
Just converting the gameplay footage isn't the issue, but I need Shotcut to make some changes to audio levels when adding the commentary.
Currently looking into what I would need to add in Shotcut's Other tab, the place where I believe the FFmpeg options go, such as movflags=+faststart, rc=constqp, vglobal_quality=0, strict_gop=1,
to enable 10-bit HEVC NVENC encoding, which my 1070 should be able to do.
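I'm not sure of the exact key names Shotcut's Other tab expects for this, but the equivalent standalone ffmpeg command for 10-bit NVENC HEVC would be roughly (the QP value is just an example):
ffmpeg -i input.mp4 -c:v hevc_nvenc -profile:v main10 -pix_fmt p010le -rc constqp -qp 22 -c:a copy output.mkv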

Why GPU encoding though? You sure you're okay with 8-10X bigger file size?

>NVENC
it's like you hate quality encoding

This has been disproven several times over; the performance regression is purely a fault of Windows' thread scheduler.

The filesize is actually a lot smaller than with CPU encoding, and it finishes much faster.
I would like to find out how to do 10-bit, but I suspect it's done in the "other" tab where you can pass FFmpeg options.

CPU encoding takes like 10 hours for a 12 minute video on a 6700k.
The quality is identical to the NVENC version, and the NVENC version is like 1/20th of the size, because the encoder goes crazy with bitrate when using CPU encoding with the VBR rate at 100%.
Also I need to encode these videos twice, since I make a version with commentary and a version without commentary.
This means with CPU encoding I'd be using 100% of my CPU for like 20 hours straight, compared to about 1.5 hours with NVENC, and I can record additional episodes while it's transcoding since it uses almost no CPU.
Honestly, for me the choice is made pretty easily. I would like the option to encode in 10-bit from the get-go just to get rid of the banding issues you can sometimes get in dark scenes, but other than that I'm pretty happy with it.
CPU encoding probably gives better quality when you really fine-tune it, but I couldn't see any real difference.

>CPU encoding takes like 10 hours for a 12 minute video on a 6700k.
Then you're doing it wrong, you need to have the proper settings.

My 5820k would do an entire 2 hour movie in less than 10 hours.

You're mistaken. Try a 16 CRF video file with NVENC encoding and then 16 CRF in Handbrake with the slow CPU preset, then compare file sizes; both will have identical quality.

This placebo preset is just that, a placebo.

can't you just do pic related on each encoding instance? Better than using a VM

Attached: 1551939550864.png (640x374, 201K)

Hasn't worked well for me, something about the x265/ffmpeg process initiating with all cores available from the start. Might be different with loonix, but having a hypervisor manage 2 Windows VMs worked better for me, especially if each has its own separate HDD/SSD storage to use.

Maybe ffmpeg can set CPU affinity with some added param; that would be really useful, especially with scripts.
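Afaik ffmpeg itself doesn't have an affinity flag, but you can cap the encoder's threads (e.g. -x265-params pools=4) and set the affinity from the OS when launching it; on Windows, something like this from cmd should work (F is a hex mask for cores 0-3, adjust for your layout):
start /affinity F ffmpeg -i in.mkv -c:v libx265 -preset slow -crf 18 -x265-params pools=4 out.mkv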

I've decided to wait and see how the new TR will perform. I've actually been wanting to encode multiple video files at the same time, over 10 at once, setting each to 1-4 cores, 4-8 threads.

>NVENC = x265
lol)

One word?
>Windows
Wendell from Level1Techs made a video showcasing how Windows is bad at dealing with multithreaded workloads and assigning them, plus a partial fix for it.
m.youtube.com/watch?v=M2LOMTpCtLA

Do you need to add the quotes when adding FFmpeg commands to another program?
When going through the documentation, in order to make it export as 10-bit I need to add ‘bt2020_10bit’.
Do I need to keep the quotes around it? None of the other commands in the part of the program where these go have quotes around them.

>--pools
>--threads
moved to ffmpeg from Handbrake, usually set it up to use more b-frames for extra compression without quality loss

>-pix_fmt yuv420p10le

Thanks going to try this now.

Sadly the encoding fails immediately using that.

do you have the latest ffmpeg build? otherwise something is wrong with your settings

>do you have the latest ffmpeg build? otherwise something is wrong with your settings
Yeah, I need to dig a little deeper into what FFmpeg build is in Shotcut.

download ffmpeg and use it in your cmd if using windows, it'll be better

>download ffmpeg and use it in your cmd if using windows, it'll be better
Yeah, I guess I'll have to figure out FFmpeg in order to get what I want.