Remember to pngcrush your PNG files. With some images (text, illustrations, anime screenshots) the space and bandwidth savings can be huge. It is amazing how you can go from this...
Pngcrush
...to this with no loss of quality.
Even in more typical cases you still get a 5%-15% reduction in file size. Over dozens of images it adds up.
>not using ECT
Try it on the OP pic.
Use optipng.
how does it work?
I use PNGGauntlet on all my screenshots and many other images. It runs images through several optimizers and picks the smallest result.
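The same "run several optimizers, keep the smallest" idea can be scripted by hand. A minimal POSIX shell sketch (the function name is mine; the optimizer output files in the usage comment are just examples, substitute whatever tools you actually ran):

```shell
# given several candidate output files, print the path of the smallest one,
# e.g.: smallest out-optipng.png out-zopflipng.png out-ect.png
smallest() {
  best="$1"
  for f in "$@"; do
    # wc -c counts bytes; keep whichever file is smaller
    if [ "$(wc -c < "$f")" -lt "$(wc -c < "$best")" ]; then
      best="$f"
    fi
  done
  printf '%s\n' "$best"
}
```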
This doesn't seem to be really doing a better job than basic irfanview
I see loss of quality just in the thumbnails you idiot
PNG is a lossless format.
dumb mobile poster
I ran OP's image and it came out just a little smaller.
>pngcrush
>1,154,468 bytes
>ECT
>1,106,195 bytes
>it's still above 1mb and under 2mb
Good joke.
neat
It tries different compression settings and finds the best.
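A typical invocation looks like this (a sketch; the file names are placeholders, and the wrapper function is mine, not part of pngcrush):

```shell
# hedged sketch of a pngcrush run:
# -brute tries every filter/zlib combination instead of a heuristic subset
#   (much slower, sometimes a little smaller)
# -reduce applies lossless color-type and bit-depth reductions
crush() {
  pngcrush -brute -reduce "$1" "${1%.png}.crushed.png"
}
# crush screenshot.png   # writes screenshot.crushed.png
```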
optipng, 1153798 bytes
Okay, so
pngcrush ~ optipng ~ IrfanView < PNGGauntlet < ECT
webp created with ffmpeg -lossless 1 comes out at 799 458 bytes
my.mixtape.moe
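The full command is presumably something like this (a sketch; assumes an ffmpeg build with libwebp, and the explicit `-c:v libwebp` and file names are my additions):

```shell
# lossless WebP via ffmpeg's libwebp encoder;
# -lossless 1 switches the encoder into lossless mode
webp_lossless() {
  ffmpeg -i "$1" -c:v libwebp -lossless 1 "${1%.png}.webp"
}
# webp_lossless op.png   # writes op.webp
```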
Running zopflipng on OP right now
zopflipng, 1113952
i just use pngout
Lossy.
>lossy
get the fuck out
How do you do it?
Bigger images are good because they screw over mobilefags on data plans.
jpeg, 12079 bytes
>lossy
imagemagick doesn't do too bad.
How to make images bigger?
Make yourself smaller.
command?
use
convert input.png -define png:compression-level=9 output.png
>Error: Duplicate file exists.
lel
Did you get the pngout plugin to work with x64 IrfanView? I know the plugin is 32-bit, but I thought 32-bit plugins were still supported. It just doesn't load for me.
Thank you
Webp on 4chin when, chinkmoot-san
2032465 PNG, original
1174552 PNG, convert -define png:compression-level=9
1157823 PNG, IrfanView
1154468 PNG, pngcrush
1153798 PNG, optipng
1126247 PNG, PNGGauntlet
1113952 PNG, zopflipng
1106195 PNG, ECT
799458 WebP lossless
711923 FLIF
Anybody here who can test pingo?
zopfli
1108569 bytes
Just tested pngcrush and most of the crushed images have fucked thumbnails. Is that normal?
pingo is the best all-around png optimizer. Looks like open sores loses again.
the best lossless image posted so far is compressed with an open sores format
see the posts above, and flif.info
kys retard
don't bother with the plugin, use the standalone instead; last I used it, IrfanView made slightly bigger files for some reason
But why would I save bandwidth when it's unlimited?
No.
So you use just over ~10GB of your drive instead of ~20GB when you have 10k+ PNGs.
how do you check if an image is actually lossless compared to the source?
I think it happens when I leave the file manager open while the files are being written.
They stayed that way even after cleaning the thumbnail's cache though, weird.
Just used it again on the same files, without having the file manager open on the directory, and the thumbnails came out fine.
compare
see
or similar on gimp
You should convert each one to raw bitstream and compare checksums with original to ensure they are truly lossless.
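One way to do that with ImageMagick (a sketch; the helper function is mine, `rgba:-` decodes to a raw pixel stream so both files get hashed in the same channel layout regardless of how they were stored):

```shell
# decode to raw RGBA bytes and hash them; metadata and compression
# settings don't affect the result, only the actual pixel values do
pixel_hash() {
  convert "$1" rgba:- | sha256sum | cut -d ' ' -f 1
}
# [ "$(pixel_hash original.png)" = "$(pixel_hash crushed.png)" ] && echo lossless
```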
does answer my question, but still, just get another hdd if you have space issues nigga. What are you, poor?
nigger I already have 5TB + 128GB SSD, but that's no reason not to save space if I can
kill yourself weeb, i'm dead serious
What are you whining about then?
So why should I compress pictures to save on bandwidth when it's unlimited?
my.mixtape.moe
1179569 BPG lossless
and remember to pngquant --nofs your-screencaps.png
man compare
>and remember to pngquant --nofs your-screencaps.png
and remember to
optipng -fix -o2 -strip all your-screencaps.png
convert image1.png image1.pnm
convert image2.png image2.pnm
diff image1.pnm image2.pnm
If you have imagemagick you can as well just use compare.
Who is best girl?
Why does compare give me the same picture as when I compare the original image to itself?
/tmp $ du *png
400K Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums-or8.png
924K Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums.png
1.3M total
/tmp $ optipng -fix -o2 -strip all 'Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums.png'
** Processing: Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums.png
1150x9322 pixels, 4x8 bits/pixel, RGB+alpha
Reducing image to 3x8 bits/pixel, RGB
Input IDAT size = 942224 bytes
Input file size = 943661 bytes
Trying:
zc = 9 zm = 8 zs = 0 f = 0 IDAT size = 777489
Selecting parameters:
zc = 9 zm = 8 zs = 0 f = 0 IDAT size = 777489
Output IDAT size = 777489 bytes (164735 bytes decrease)
Output file size = 777546 bytes (166115 bytes = 17.60% decrease)
/tmp $ du *png
400K Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums-or8.png
760K Screenshot_2018-08-16 g - pngcrush - Technology - Jow Forums.png
1.2M total
gee thx
compare op.png original.png -compose src difference.png
Got things still a little bit smaller with ECT. Now 1,105,685 bytes.
Perhaps I can improve it a little more with genetic filtering, but that'll take a while.
What are the ideal settings for ect? Thanks.
>start using pngcrush on my Cardcaptor Sakura screenshot folders, 1060 files
>it takes A LOT of time. A LOT
>ends up saving only ~300MB
JESUS CHRIST FUCK YOU Jow Forums
The one who sucks your penis but no one else's.
>my Cardcaptor Sakura screenshot folders, 1060 files
Please share!
I have a 100KB/s upload speed so I will have to refuse.
Damn. Well, at least share your top 30 or some reaction images.
ect -9 -strip --pal_sort=120 --allfilters-b input.png
This will result in the best compression ECT can offer, but it takes ages, especially for large pictures. Started this command about an hour ago for OP's pic. Still going.
ect -9 -strip --pal_sort=120 --allfilters input.png
Still slow but usable. Often produces the same results as the first command.
ect -9 -strip input.png
This will be enough for most users. Sometimes it provides the same results as the commands above, but usually it produces a marginally larger file (we're talking negligible differences).
>ends up saving only ~300MB
If your current files are about 1 MB and you ended up saving 300, it means you went from around 1300 MB to 1000. That doesn't sound bad at all.
Not worth waiting at least 2 hours for.
I usually run these tasks in the background with a high level of niceness, so you don't have to wait to use your computer.
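For example (a sketch; the function is mine, and it assumes GNU find/xargs plus that ect rewrites files in place, as it does by default):

```shell
# recompress every PNG under "$1" at the lowest CPU priority;
# nice -n 19 keeps the machine responsive while it grinds away
batch_ect() {
  find "$1" -name '*.png' -print0 |
    nice -n 19 xargs -0 -r -n 1 ect -9 -strip
}
# batch_ect ~/screenshots &
```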
>Often produces the same results as the first command.
Case in point. Using --allfilters-b instead of --allfilters provides no additional compression for OP's pic, but it takes about 90 minutes instead of 2.