I recently built a new computer and I got a GTX1080Ti.
I've always used XMEDIA-RECODE for encoding video.
I tried encoding H265 using the GPU. HOLY FUCK is it fast. I mean 250 fps on full HD rips.
And h265 is amazing, I literally can't see any difference at half the bitrate of h264.
How is it even possible?
So I'm going to convert all my videos to h265.
That is all.

Attached: h264-v-h265-hero-composite.jpg (800x400, 47K)
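For what it's worth, a minimal sketch of that bulk conversion driven through ffmpeg's NVENC encoder instead of XMEDIA-RECODE, assuming an ffmpeg build with NVENC support; the directory names and the CQ value of 22 are placeholders:

    # batch_hevc_nvenc.py -- hypothetical batch converter.
    # Assumes ffmpeg is on PATH and was built with NVENC support.
    import subprocess
    from pathlib import Path

    SRC = Path("videos")         # placeholder input directory
    DST = Path("videos_h265")    # placeholder output directory
    DST.mkdir(exist_ok=True)

    for clip in SRC.glob("*.mkv"):
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "hevc_nvenc",        # NVIDIA hardware H.265 encoder
            "-rc", "vbr", "-cq", "22",   # quality-targeted rate control
            "-c:a", "copy",              # pass the audio through untouched
            str(DST / clip.name),
        ], check=True)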

>hardware encoding

Attached: 1407066562942.gif (320x214, 1.78M)

Why not?

Not as effective as CPU encoding when it comes to pure compression efficiency.

and it looks like donkey ass.

The end result looked the same to me. Are we talking about 2-3% difference here or what?

get new eyes

hardware encoding = fast or veryfast preset software encoding, i.e. low compression efficiency + worse quality.

>The end result looked the same to me.
Which means bugger all. People even say that about YIFY encodes.

Yeah gpu encoding is fast as shit even on my shitty laptop gpu. Too bad it looks like ass. If you can't see the difference you might need glasses.

No GPU encoder ever created can match software encoding with x264 for h.264 video streams or x265 for h.265 video streams.

If you give any care to visual quality at all, you'll know that GPU encoding is nothing but a placebo and can never match the visual quality of software encoding.

It's sad because after all these years you'd like to think there's at least ONE fucking GPU-based encoder that can provide decent output.

But, alas, there isn't one, and there never will be.

Post a webm?

>ITT: OP discovers hardware encoding
btw, chances are you could do hardware encoding on your old computer with Quicksync
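For reference, a minimal sketch of what a Quicksync encode could look like through ffmpeg, assuming a build with Quick Sync (QSV) support and an Intel iGPU; the filenames and quality value are placeholders:

    import subprocess

    # Hypothetical Quicksync H.265 encode via ffmpeg's hevc_qsv encoder.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "hevc_qsv",          # Intel Quick Sync HEVC encoder
        "-global_quality", "25",     # quality target (placeholder value)
        "-c:a", "copy",
        "qsv_out.mkv",
    ], check=True)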

>I literally can't see any difference with half the bitrate from h264.
Being blind helps

Attached: x.webm (960x540, 1.73M)

Hardware encoding looks good some of the time but I've noticed it can sometimes completely shit the bed with encoding certain types of complex scenes. NVENC and Quicksync at their highest quality are about on par with x264 at Superfast preset settings but it is not a completely 1:1 comparison. I've also noticed NVENC and Quicksync seem to shit the bed in different ways where software encoding is far more consistent.
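One way to put a number on the "shit the bed" cases instead of eyeballing: run ffmpeg's ssim filter against the untouched source for each encode. A minimal sketch, assuming the encodes are frame-aligned with the source; the filenames are placeholders:

    import subprocess

    # Score each encode against the source with ffmpeg's ssim filter.
    # Each run prints an "SSIM ... All:<score>" line; closer to 1.0 is better.
    for encode in ["nvenc_out.mkv", "x264_superfast.mkv"]:
        subprocess.run([
            "ffmpeg", "-i", encode, "-i", "reference.mkv",
            "-lavfi", "ssim", "-f", "null", "-",
        ], check=True)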

Looks smooth. I tried hardware encoding and got *stuttery frame rates. Going to revisit it.

*Stuttery is not a real word.

Your GTX1080Ti lacks B-frame support for H.265 encoding, so you aren't even getting proper H.265 out of your hardware encoding. The RTX 2000 series supposedly has B-frame support though.

developer.nvidia.com/nvidia-video-codec-sdk

What's new with Turing GPUs and Video Codec SDK 9.0

Up to 3x Decode throughput with multiple decoders on professional cards (Quadro & Tesla)
Higher Quality Encoding - H.264 & H.265 (Refer to devblog for more details)
Higher encoding efficiency (15% lower bitrate than Pascal)
HEVC B-frames support
HEVC 4:4:4 decoding support
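If you do have a Turing card, ffmpeg exposes HEVC B-frames on hevc_nvenc through the usual -bf option. A minimal sketch; the value 3 is just an illustrative choice, and pre-Turing NVENC can't encode HEVC B-frames at all:

    import subprocess

    # Hypothetical Turing-era encode with HEVC B-frames enabled.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-c:v", "hevc_nvenc",
        "-bf", "3",              # up to 3 consecutive B-frames (Turing+)
        "-c:a", "copy",
        "turing_bframes.mkv",
    ], check=True)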

Turing rocks

>h265
>webm

Attached: 1485048311354.jpg (532x711, 69K)

What's wrong with YIFY?

Freeze-frame a high-movement scene and you'll see.

Compare:
Straight Blu-ray
H264
H265
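A minimal sketch of pulling the same still out of all three versions for that comparison, assuming the files are in sync; the filenames and the timestamp are placeholders:

    import subprocess
    from pathlib import Path

    # Grab one frame from the same high-motion timestamp in each version.
    for name in ["bluray.mkv", "h264.mkv", "h265.mkv"]:
        subprocess.run([
            "ffmpeg", "-ss", "00:12:34",   # placeholder timestamp
            "-i", name,
            "-frames:v", "1",              # a single still
            Path(name).stem + ".png",
        ], check=True)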

Nothing, if you don't care about quality.

DON'T use HW encoding, you mong. You can get 6-8x smaller file sizes by using SW CPU encoding for the same visual quality.

Try this on a sample piece of footage: CRF 22, slower preset, 10-bit with normal SW CPU encoding versus CRF 22 on GPU encoding. The file size difference is massive and both will have the same quality.

The reason HW encoding exists is for streaming, that's about it.
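A minimal sketch of that CRF 22 test through ffmpeg, assuming a build with both libx265 and NVENC; hevc_nvenc has no true CRF mode, so -cq is used here as the closest stand-in:

    import subprocess

    SRC = "sample.mkv"  # placeholder clip

    # Software encode: x265 at CRF 22, slower preset, 10-bit.
    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "libx265", "-crf", "22", "-preset", "slower",
        "-pix_fmt", "yuv420p10le",      # 10-bit pixel format
        "-c:a", "copy", "sw_x265.mkv",
    ], check=True)

    # Hardware encode: NVENC at constant quality 22 (CRF analogue).
    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "hevc_nvenc", "-rc", "vbr", "-cq", "22",
        "-c:a", "copy", "hw_nvenc.mkv",
    ], check=True)

Compare the two file sizes afterwards and judge the quality for yourself.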

The lack of b-frames means that it will look like ass; the fewer cores used, the better. A GPU uses thousands of cores while a CPU uses far fewer.

Everything

This. Using GPU encoding is akin to using the ultrafast CPU preset.

>What's wrong with YIFY?
Wew. You must have really good glasses.