Re: [Gimp-user] Why does saves as JPG default to quality 85?

On 03/24/2012 12:50 AM, Patrick Shanahan wrote:
* Ofnuts <ofnuts laposte net> [01-01-70 12:34]:
On 03/23/2012 09:20 PM, Liam R E Quin wrote:
every time you load and
save a JPEG file the quality is reduced and information is lost.
Not true... if nothing changes, the algorithm is stable (decoded values get
re-encoded to the same values). You lose quality only if you recompute
something different: changed settings, changed pixel values, changed 8x8
block boundaries (e.g. an image crop).
Well, it is true and very easily proven.

Open a jpg file and save it.  Then open the just-saved file and save it
again at the same compression level.  Then do a diff, or simply check the
file sizes.
I wrote this Perl script a while ago:

#!/usr/bin/perl
# Runs successive JPEG saves on an image to evaluate JPEG losses
# Author: Ofnuts
#
# Note: parts of this script (argument assignments, loop body, marker-dot
# coordinates) were lost in the mail archive; the reconstructed lines are
# marked below and are a plausible reading, not the original code.

use strict;
use warnings;

my $defaultquality = 92;  # default quality for "convert"
my $defaultsteps   = 10;  # default number of save steps

my $dotcolor = "white";   # change this if the top of the image is too clear

my $argc = scalar @ARGV;
usage() if ($argc > 2);
my $quality = $defaultquality;
my $steps   = $defaultsteps;
if ($argc > 1) {
    $steps = $ARGV[1];                       # reconstructed assignment
    usage() if ($steps !~ /^\d{1,3}$/);
}
if ($argc > 0) {
    $quality = $ARGV[0];                     # reconstructed assignment
    usage() if ($quality !~ /^(\d{1,2}|100)$/);
}

print "Running $steps steps at JPEG quality $quality\n";

for (my $step = 0; $step < $steps; $step++) {
    my $in  = sprintf("%03d", $step);        # matches the step000.jpg naming
    my $out = sprintf("%03d", $step + 1);

    # dot coordinates: one small marker dot per step near the top-left corner
    # (reconstructed: the exact positions in the original are lost)
    my $cxc = 10 + 10 * $step;  # circle center
    my $cyc = 10;
    my $cxr = $cxc + 3;         # point on the circle (radius 3)
    my $cyr = $cyc;

    print "Step " . ($step + 1) . " of $steps\n";
    system("convert step$in.jpg -fill $dotcolor -draw \"circle $cxc,$cyc,$cxr,$cyr\" step$out.png");
    system("convert step$out.png -quality $quality step$out.jpg");
    system("compare step000.jpg step$out.jpg diff0-$out.jpg");
    system("compare step001.jpg step$out.jpg diff1-$out.jpg");
}

sub usage {
    print "\n";
    print "Usage: $0 [quality [steps]]\n";
    print "\n";
    print "Where:\n";
    print " - 'quality' is the quality factor of the JPEG compression\n";
    print "          (1-100, 100 is best, default is $defaultquality)\n";
    print " - 'steps' is the number of successive steps to perform\n";
    print "         (default is $defaultsteps)\n";
    print "\n";
    print "Produces:\n";
    print " - successive saves of a JPEG image to test JPEG-induced losses.\n";
    print " - compare images with the original file and the 1st JPEG save.\n";
    print "\n";
    print "Starts from a 'step000.jpg' file in the current directory.\n";
    exit 1;
}

My observations & conclusions were:

I wrote this short utility to check the usual claim that JPEG image quality degrades with successive saves.

This utility saves an image multiple times, each time after making a minor, very localized change to it. To rule out the possibility that "convert" cleverly minimizes losses when re-saving a JPEG, the image is first saved to a lossless format (PNG) and then converted from PNG to JPEG. The resulting image is then compared with the original image (diff0-*) and with the result of the first step (diff1-*); red pixels are the changed pixels.

Now for the interesting part. This dispels some misunderstandings:

- In all cases, most of the damage occurs on the 1st save. The subsequent saves show very little difference from the first step, even at very low quality settings. Save steps beyond the third do not add any loss... The JPEG algorithm is "stable": the decoded values eventually get re-encoded the very same way.
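This stability can be seen in miniature: JPEG's lossy step is the quantization of DCT coefficients, and quantization is idempotent, so values that have already been quantized re-quantize to themselves. A toy Python sketch of just that step (not a real codec, and the quantization step value is arbitrary):

```python
# Toy sketch: one JPEG-style quantization step is idempotent.
def quantize(coeff, q):
    # encoder side: divide by the quantization step and round
    return round(coeff / q)

def dequantize(level, q):
    # decoder side: multiply back
    return level * q

q = 16  # one entry of a hypothetical quantization table
for coeff in range(-100, 101):
    first = quantize(coeff, q)          # first save: lossy
    decoded = dequantize(first, q)      # what the decoder reconstructs
    second = quantize(decoded, q)       # second save, same settings
    assert second == first              # no further loss
```

Real saves still differ slightly between the 1st and 2nd step because of 8-bit rounding and block effects at the edited pixels, which is consistent with the observation that the loss settles after the first couple of saves rather than on save one exactly.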

- The amount of "damage" is very low at reasonable quality settings (75 or above). To get an experimental "feel":

-- load the original image and the result of any step in a photo editing software that supports layers
-- obtain the "difference" between the two layers
-- the resulting image looks like a very uniform black to the naked eye
-- use a "threshold" transform and lower the threshold value until recognizable patterns appear (besides the marker dots at top left)
-- at 90 quality, using the result of the 10th step, the first white pixel shows up at threshold 20 (an artefact at the lower border, due to the picture height not being a multiple of 8); the first pixel in the image proper appears at 11
-- at 75 quality, the difference produces a recognizable ghost of the linnet; the threshold method shows that most differences are below 20
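The difference-plus-threshold procedure above can be sketched numerically. A minimal Python version on a handful of made-up grayscale pixel values (not taken from the test images):

```python
# Sketch of the "difference" layer + threshold inspection, on made-up pixels.
original = [120, 121, 119, 200, 201, 50]
resaved  = [121, 118, 119, 205, 201, 48]

# "difference" layer: per-pixel absolute difference
diff = [abs(a - b) for a, b in zip(original, resaved)]

def above_threshold(diff, t):
    """Indices of pixels that would show up white after a threshold at t."""
    return [i for i, d in enumerate(diff) if d >= t]

print(above_threshold(diff, 20))  # -> []         (nothing visible yet)
print(above_threshold(diff, 5))   # -> [3]        (first pixel appears)
print(above_threshold(diff, 1))   # -> [0, 1, 3, 5]
```

Lowering the threshold reveals more and more differing pixels, exactly as in the layer-based inspection described above.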


- Global image changes (white balance, contrast, colors) are a whole different matter, not addressed here (though, IMHO, the problem with JPEG in these operations is more the 8-bit-per-channel limit it puts on the picture, which in turn leads to a comb-like histogram)
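The comb effect is easy to illustrate: after a contrast stretch on 8-bit data there are more output codes than distinct input levels, so some codes are never produced and the histogram shows regular gaps. A small Python sketch with a synthetic ramp (hypothetical values, not from the test images):

```python
# Sketch: a contrast stretch on 8-bit data yields a comb-like histogram.
# 56 input levels are spread over the full 0-255 range, so most of the
# 256 output codes can never be hit.
values = list(range(100, 156))                  # synthetic 8-bit ramp
stretched = [round((v - 100) * 255 / 55) for v in values]

used = sorted(set(stretched))
gaps = [c for c in range(256) if c not in used]
print(len(used), "codes used,", len(gaps), "codes skipped")
# -> 56 codes used, 200 codes skipped
```

Every skipped code is an empty bin in the histogram, which is the comb pattern one sees after stretching an 8-bit image.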

- The original JPEG uses 1:1:1 subsampling (i.e., no chroma subsampling), and so does 'convert' by default.

-- Unless reproduced by different means, these results only apply when the same software is used throughout.


My step000.jpg is here:
