Image codecs

Does my web app need any more image encoders?

Attached: encoders.png (199x220, 3K)

Other urls found in this thread:

github.com/kornelski/pngquant

i want to know why any "web app" needs any image encoders at all

i made a manga browser for my manga library

the original files can be like 6MB PNG each. i need to compress them or the browser crashes

Attached: hmapp.png (545x535, 197K)

if it's just for thumbnailing purposes, use jpeg turbo

How the fuck can you read comics with such shit art?
I'm no expert but that looks like utter shit

don't use js
problem solved

use feh or a PDF reader

so is this basically just LANraragi?

using for:
1. thumbnails
2. browser viewing (resized to 1000px height)
3. batch re-encoding/resizing large comics on the server, for better performance.

ive found WebP to be superior in quality and size, but libjpeg-turbo to be crazy fast.

Attached: resized_browser_viewing.png (578x600, 260K)

I hope you're keeping the source archives and are just re-encoding for performance.
Can't be too careful with that scare earlier this year.

yes, libjpeg-turbo is designed to be fast as shit, especially for realtime stuff like vnc
but it's a good solution for anything that isn't permanent storage, so transmitting thumbs and low res previews is good usage as well
just use high quality settings, like 98 to avoid artifacts
if you want high quality AND small size, then webp is the best of the formats in the OP, but it's far, far slower. i hope you have a server-side cache of webp's if you want to go this route, else your server will be spending a lot of (cpu) time compressing pictures
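the tradeoffs above (jpeg-turbo at quality 98 for fast previews, webp for best size at the cost of encode time, originals left untouched) could be sketched as a small settings table. only the quality-98 figure comes from this thread; the helper name, the webp quality of 80, and the cache flag are illustrative assumptions:

```python
# Illustrative mapping of each use case discussed in the thread to
# encoder settings. Only jpeg quality 98 is from the thread; the webp
# quality (80) and the cache flag are placeholder assumptions.
def encoder_settings(purpose):
    settings = {
        # fast path: libjpeg-turbo at a high quality to avoid artifacts
        "preview": {"format": "jpeg", "quality": 98},
        # slow path: webp gives the best size/quality, so cache the output
        "thumbnail": {"format": "webp", "quality": 80, "cache": True},
        # source archives stay untouched; only derived copies are re-encoded
        "archive": {"format": "original"},
    }
    return settings[purpose]

print(encoder_settings("preview"))  # {'format': 'jpeg', 'quality': 98}
```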

Don't resize it. Use jpegoptim or optipng to reduce size without reducing quality. Also, offer a way to read the pages in the original size if the user wishes to.
Also, use a CDN and forget about needing performance in your web server.
Also, browsers can handle a lot of shit, so a 6MB png will not crash the browser. If the browser crashes, fix your javascript. Better yet, remove it.

always

>a 6mb png will not crash the browser
no, but 6MB x 200 will crash either the browser or the server, or both. it's been a while since i implemented resizing/re-encoding; it was one of the first things i did.

Attached: 7z_arch.png (529x419, 23K)

>browsers can handle a lot of shit
doesn't mean you should dump 200 thumbnails which are all browser-downscaled 8MP files
just because you can doesn't mean you should
having a few cached sizes so users only download enough for their display by default is a good idea, this saves time and bandwidth for both sides

Don't load all files at once. Keep the one the user is viewing + 5 or so pages before and after. Unload the rest. That's what you do when you have lots of items in a list.
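the "current page plus a few either side" idea above is just a sliding window over page indices. a minimal sketch (function name and 0-based indexing are my choices, not from the thread):

```python
# Sliding window of pages to keep loaded, per the advice above:
# the current page plus `radius` pages before and after, clamped to
# the document bounds. Everything outside the window gets unloaded.
def loaded_window(current, total_pages, radius=5):
    lo = max(0, current - radius)
    hi = min(total_pages - 1, current + radius)
    return list(range(lo, hi + 1))

print(loaded_window(10, 200))  # pages 5 through 15 stay loaded
print(loaded_window(0, 200))   # clamped at the start: pages 0 through 5
```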

Please do create separate thumbnails. Don't just use the original image as a thumbnail and resize it on the browser. That's just moronic.

> use high quality settings, like 98 to avoid artifacts
I've considered that route. ill have to try some benchmarks and quality checks to see if the 98 quality setting would be worth it.

>webp is slow
yeah, its slow, but still usable. creates the best thumbnails, but now im not sure if i need my thumbs to be that great.

>i hope you have a server-side cache of webp's
ive considered this too, but was worried i would be creating too many unnecessary files in comparison to creating them on the fly. maybe ill look into it.

i briefly considered using guetzli as well, but by all accounts its slow as fuck.

Attached: guetzli.png (294x320, 10K)

your png better be: github.com/kornelski/pngquant

>ive considered this too, but was worried i would be creating too many unnecessary files in comparison to creating them on the fly. maybe ill look into it.
caching is a big deal
if you can spare the space, there's plenty of performance benefits to it
spending 100ms to generate a thumbnail on the cpu might seem quick, but that's 80ms slower than if it was fetched from disk, and the resource cost of 100ms of scarce cpu time vs. 15kb of disk space? yea, caching usually wins
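the generate-once-then-serve-from-disk idea above is easy to sketch. this is a minimal cache-on-miss helper, not anyone's actual implementation: `make_thumb` is a stand-in for the real encoder, and the hashing/naming scheme is made up for the example:

```python
import hashlib
import os
import tempfile

# Minimal generate-on-miss disk cache: pay the expensive encode once,
# then serve the cached bytes from disk on every later request.
CACHE_DIR = tempfile.mkdtemp()

def cached_thumb(source_path, size, make_thumb):
    # Cache key derived from the source path and requested size.
    key = hashlib.sha1(f"{source_path}:{size}".encode()).hexdigest()
    path = os.path.join(CACHE_DIR, key + ".webp")
    if os.path.exists(path):              # cache hit: cheap disk read
        with open(path, "rb") as f:
            return f.read()
    data = make_thumb(source_path, size)  # cache miss: expensive encode...
    with open(path, "wb") as f:           # ...paid exactly once
        f.write(data)
    return data
```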

>ive considered this too, but was worried i would be creating too many unnecessary files in comparison to creating them on the fly
Bandwidth is more expensive than disk space. If your user doesn't get to see all the pages, then you just wasted a lot of bandwidth sending them to him.

>Don't load all files at once
i dont. i only load the first 20 images as thumbnails, and then the first 3 resized. after that its all dynamic. if i click on the next button, i get another 20 thumbs.
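the 20-thumbs-per-click batching described above is a plain slice. a sketch (helper name and the demo filenames are invented; only the batch size of 20 is from the post):

```python
# Fetch one batch of thumbnails, as in "click next, get another 20".
def thumb_batch(items, batch, per_page=20):
    start = batch * per_page
    return items[start:start + per_page]

pages = [f"page_{i:03}.png" for i in range(55)]
print(len(thumb_batch(pages, 0)))  # 20
print(len(thumb_batch(pages, 2)))  # 15 (last, partial batch)
```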

i think i tried it last year, but it didnt perform as well as the built-in PNG encoder, and the file sizes were unusable.

i should probably remove that & webp lossless from the encoders list.

Attached: thumb_generator.png (913x516, 56K)

here, have some webp. paste this in your address bar

data:image/webp;base64,UklGRqQBAABXRUJQVlA4IJgBAAAwCQCdASpkAAwAPwFqrVArJiSislv9UWAgCWoAsR9yCBbjVrXaDLJ4zPB4wlnGMXdF/EImv3/RSomQqArsYaVVIASu9DfMbknwlsVbYbjE3qgAAP74Vms1zcMjURpmmleQ1wTfFVLPeMEWZb8h9EFwH10ReIUAz0+Ye8fMIbK8usc+NhZ34ZItfB/7P2nWY0E/fUTDH6Xah8iGRvadxHBlSVyt5kiTFkaM5UWBH0zxbxU7zoMdR9MnBBsIFUqLW71iZ9huUT6IEUxZCY659/k6Yr+7oo0gQyJfeo3JILz7Z6kg3Ur2aPK03vQ2W2E/PBYQyGEeWXgBa4ybG7BznOd9biCXAMk73DkfZ6nnrrzatsfw+RNWqZGhSdOuQRjnfHcGK+52uYa+NQ6iSGefus37fu6q+Qt3y+EsqEQJbMlkaiWptghmi3NT7QYzJav4J+QCw20cLke3DpDzLejViCEm1dimcILnyDKwFDShmiLW73e8d80TIAbgBDvbNk51jVNI7aC6FQ26dv6qmxz+KakjGDd3ThRAAAA=

>in addition to unusable file sizes.
what? it always debloats png's for me, and it works fine

what is that supposed to be

a thumbnail of whatever you call this thing. the little flap they put on the bottom of some comics.

figured the text would fit within the text limit

Attached: thumb_thingie.png (498x641, 231K)

Bump