Mozilla wants to make web pages load faster with new JPEG encoder

Websites are built from HTML, JavaScript and other code, but in terms of file size, images make up most of a page's "weight". As such, web browsers sometimes struggle to load pages with a lot of images. This week, Mozilla announced a new project designed to help with that.

A future version of Firefox could load Neowin and other sites faster with the "mozjpeg" encoder.

It's called "mozjpeg" and, according to Mozilla's research blog, it's an attempt to create a JPEG encoder that improves image compression while remaining compatible with the majority of existing decoders. Mozilla points out that JPEG is still the image format most websites use, and while some people have suggested replacements for it, none is likely to take over anytime soon.

The blog states:

Given this situation, we wondered if JPEG encoders have really reached their full compression potential after 20+ years. We talked to a number of engineers, and concluded that the answer is “no,” even within the constraints of strong compatibility requirements. With feedback on promising avenues for exploration in hand, we started the ‘mozjpeg’ project.

Mozilla has already released version 1.0 of "mozjpeg", which adds support for the Perl script known as "jpgcrush". The script has been shown to reduce the file size of JPEG images by as much as 10 percent. Mozilla is also inviting others to contribute to the project.
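
For those who want to try it, here's a rough sketch of recompressing an existing JPEG with mozjpeg from Python, assuming the project's cjpeg and djpeg command-line tools have been built and are on the PATH (the file names and quality setting are purely illustrative):

import os
import subprocess

SRC = "photo.jpg"      # hypothetical input JPEG
TMP = "photo.ppm"      # intermediate uncompressed image
OUT = "photo.moz.jpg"  # recompressed output

# cjpeg expects an uncompressed input format, so decode to PPM first.
subprocess.run(["djpeg", "-outfile", TMP, SRC], check=True)
# Re-encode with mozjpeg's cjpeg at a nominal quality of 75.
subprocess.run(["cjpeg", "-quality", "75", "-outfile", OUT, TMP], check=True)

before, after = os.path.getsize(SRC), os.path.getsize(OUT)
print(f"{before} -> {after} bytes ({100 * (before - after) / before:.1f}% smaller)")

Note that a full decode/re-encode round trip is itself slightly lossy; jpgcrush's own trick is lossless, rearranging how the progressive scans are coded.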

Microsoft has made its own attempts to decrease image load times in web browsers. It claimed a few months ago that its JPEG decoding improvements cut the load times of website JPEG images by 45 percent in IE11 compared to older versions of the browser.

Source: Mozilla

48 Comments

Sounds like a good idea; it will be interesting to see how it plays out. Can you do this on a whole folder, or is it per file?

"Mozilla wants to make web pages load faster with new JPEG encoder" ...I'd rather they implemented WebP instead! - that would make a lot more sense!!

GreatMarkO said,
"Mozilla wants to make web pages load faster with new JPEG encoder" ...I'd rather they implemented WebP instead! - that would make a lot more sense!!

If the world concedes to Google's video codec, then we are screwed. Get used to forced advertising, mandatory tracking and other 'treats' Google is slowly building into VP8.

If Google truly doesn't have long-term nefarious intentions (which our insiders believe they do), they should be willing to hand it over to a standards body and relinquish control over future versions and modifications that could significantly alter how it works.

If it becomes the 'de facto' standard, open or not, Google will mandate its implementation and feature sets, which could do anything from breaking software they don't like to requiring advertising streams and content tracking for it to continue to work.

Open doesn't mean 'everyone gets to contribute' to how it works. In this sense it only means Google will let you see the code and offer your suggestions; it doesn't mean they have to use any of them or adhere to a common set of standards or practices.

Even Microsoft gave up WM9 (aka VC-1) just to assure the world they wouldn't screw with the codec in the future for their own purposes or allow variations that break existing equipment.

Google needs to do the same, because right now it looks like they want their standard to take off so they can then leverage that control for more information collection and back-end advertising deals.

How exactly would you get "forced advertising, mandatory tracking and other treats" from using a WebP (or WebM) encoder based on the BSD licensed reference implementation, protected by an irrevocable patent promise[1]?

I'm not sure how you would expect Google to mandate its implementation requiring ad streams or whatever; they have no foothold whatsoever in there to do such a thing.

Even in the hypothetical event of Google eventually saying "hey, fancy adding these cool new tracking features?" you could just say "not interested, #### off" and continue using your WebP encoder as usual.

[1] http://www.webmproject.org/license/

ichi said,
How exactly would you get "forced advertising, mandatory tracking and other treats" from using a WebP (or WebM) encoder based on the BSD licensed reference implementation, protected by an irrevocable patent promise[1]?

I'm not sure how you would expect Google to mandate its implementation requiring ad streams or whatever; they have no foothold whatsoever in there to do such a thing.

Even in the hypothetical event of Google eventually saying "hey, fancy adding these cool new tracking features?" you could just say "not interested, #### off" and continue using your WebP encoder as usual.

[1] http://www.webmproject.org/license/

By being the 'default' standard implementation of the codec, they could force any feature they want by breaking any non-conforming version in Chrome/Android.

So if your VP8 implementation didn't add in a Google feature, they could simply make it no longer work on the products that give it leverage.

As for the BSD licensing, there is nothing that prevents them from adding in any crap feature.

This is another one of the arguments where 'open' is not truly 'open' in practice.

As you say, people could say, "not interested, #### off;" however, when it no longer works on ChromeOS, inside Chrome or on Android devices, people will implement anything Google wants.

This already happens with Google services, GMail among them, that started out rather simple and open and are now fairly encumbered with very specific Google advertising and tracking requirements.

Example: YouTube isn't available as an app on Windows Phone because the Microsoft-written version didn't properly return the tracking information for the advertising. This stuff happens, and when dealing with more intrinsic technology like codecs, it needs to be in the hands of a standards group to prevent abuse. (The HTML5 excuse Google tried to use is also not accurate; however, even if it were, it is itself an example of the 'de facto' standard demanding extraneous features that are unevenly applied.)

Mobius Enigma said,

By being the 'default' standard implementation of the codec, they could force any feature they want by breaking any non-conforming version in Chrome/Android.

So if your VP8 implementation didn't add in a Google feature, they could simply make it no longer work on the products that give it leverage.

As for the BSD licensing, there is nothing that prevents them from adding in any crap feature.

See, that's the thing: because it's BSD licensed and royalty free there's nothing Google can do to stop you from coding around whatever extra targeting or advertising feature they might hypothetically add.

Say Google included some kind of unique ID to images through the WebP decoder for tracking purposes (because really, what else might they do with images regarding tracking and ads?). Everyone else can just implement the reference WebP decoder skipping the UID part, and it will be perfectly compatible with Google's WebP images.

If Google were to subvert the WebP and WebM formats for their own purposes they would be in no better position than they would be now if they decided to do the same with current "standard" image and video formats.

Because they could just introduce those "features" in their own JPEG encoders and decoders and, as you say, make it so non-compliant implementations don't work on ChromeOS, Chrome and Android.

It wouldn't follow the JPEG reference implementation, but if your argument holds true then "people will implement anything Google wants".

Mobius Enigma said,

This already happens with Google services, GMail among them, that started out rather simple and open and are now fairly encumbered with very specific Google advertising and tracking requirements.

GMail has always been about tracking and advertisement, and has been so right out in the open from its beginning.

Mobius Enigma said,

Example: YouTube isn't available as an app on Windows Phone because the Microsoft-written version didn't properly return the tracking information for the advertising. This stuff happens, and when dealing with more intrinsic technology like codecs, it needs to be in the hands of a standards group to prevent abuse. (The HTML5 excuse Google tried to use is also not accurate; however, even if it were, it is itself an example of the 'de facto' standard demanding extraneous features that are unevenly applied.)

That's apples to oranges. YouTube is a service subject to a TOS; VP8 is not.

Every year we hear about a new JPEG improvement: JPEG 2000, JPEG XR, etc. Now this. It will never happen.

1) Bandwidth today is built around the needs of video streamers, not still-image porn surfers.
2) JPEG is good enough for what it gets used for.
3) We have better image standards, but the chicken-and-egg problem will always mean JPEG, GIF and PNG remain kings for decades to come.

A lossy compression algorithm is basically judged by: amount of compression = original file size × new quality / new file size. A curve is generally drawn with the original file size fixed, the new file size on the x-axis, and the new quality on the y-axis. They'll probably draw curves for existing encoders and compare them to their own to show the improvement.

A really good example of this (done for audio rather than images) can be found here: https://hacks.mozilla.org/2012...ts-an-audio-codec-standard/
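
For illustration, here's a toy Python sketch of how one might collect the data points for such a curve using Pillow; the input file name is hypothetical, and a real comparison would measure perceptual quality (e.g. SSIM) rather than trusting the encoder's quality knob:

import io
from PIL import Image

img = Image.open("photo.png").convert("RGB")
for quality in range(50, 100, 5):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)  # encode in memory
    print(f"quality={quality:3d}  size={buf.tell():7d} bytes")

Run this against the same source image with each encoder under test and the printed points trace out one curve per encoder.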

If I remember accurately...

Due to GPU disparities, Microsoft delayed shipping its GPU-assisted JPEG decoding, even though Vista and Win7 were already using more of the GPU for local decoding.

With Win8's reduced reliance on GPU hardware they moved forward with it in IE11, and even adapted a CPU/software variation for IE11 on Win7.

If it is the blog post I remember, I think they even dipped into the color and progressive-scan changes based on the human eye's color sensitivity.

But then mozjpeg is about encoding, not decoding, so whatever Microsoft does with the decoder pipeline can still reap the benefits of using mozjpeg-encoded files (assuming mozjpeg turns out to be a good encoder).

ichi said,
But then mozjpeg is about encoding, not decoding, so whatever Microsoft does with the decoder pipeline can still reap the benefits of using mozjpeg-encoded files (assuming mozjpeg turns out to be a good encoder).

As you might have guessed, I skimmed the article, skipping over the important bits about it being the encoder.

I know part of what Microsoft was doing was pulling the discernible aspects of the JPEG first where possible, leaving for later the lower-resolution color channels, like blue, that our eyes don't see well. I wonder if this encoding essentially forces that for all basic decoders. (Time for me to actually do some old-school research.)

Thanks for your post; now I am more curious about what they are doing.
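
For the curious, a toy illustration of that chroma idea, assuming Pillow (the input file is hypothetical): JPEG encoders typically store color at lower resolution than brightness, and Pillow exposes that choice through its subsampling option.

import io
from PIL import Image

img = Image.open("photo.png").convert("RGB")
# subsampling=0 keeps full-resolution color (4:4:4);
# subsampling=2 stores color at quarter resolution (4:2:0), the common default.
for name, sub in [("4:4:4", 0), ("4:2:0", 2)]:
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=85, subsampling=sub)
    print(f"{name}: {buf.tell()} bytes")

Most viewers will struggle to tell the two apart, which is exactly the point: the eye's color resolution is lower than its brightness resolution.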

Hello,


Mozilla points out that the JPEG file standard is still the one that most websites use and while some people have suggested a replacement for it, that won't be happening anytime soon.

Hi. My name is Portable Network Graphics. I was born on 14 October 1996 and the country I live in called "The Internet" made me a legal adult in 2004. Kthxbi

PNG is a great format with MANY pros and very few cons, but for a photograph JPG will give better results.

riahc3 said,
Hello,


Hi. My name is Portable Network Graphics. I was born on 14 October 1996 and the country I live in called "The Internet" made me a legal adult in 2004. Kthxbi

PNG is good for "design" elements on pages. Stuff like buttons, logos etc. Any time there is something resembling a photograph it's better to use JPG

PhilTheThrill said,

PNG is good for "design" elements on pages. Stuff like buttons, logos etc. Any time there is something resembling a photograph it's better to use JPG

And the biggest reason is that JPEG's lossy compression can make the file 10x smaller or so.
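
A quick way to see that gap for yourself, assuming Pillow is installed (the file name stands in for any real photograph):

import io
from PIL import Image

img = Image.open("photo.png").convert("RGB")
png_buf, jpg_buf = io.BytesIO(), io.BytesIO()
img.save(png_buf, format="PNG")                # lossless
img.save(jpg_buf, format="JPEG", quality=85)   # lossy
print(f"PNG : {png_buf.tell()} bytes")
print(f"JPEG: {jpg_buf.tell()} bytes")

For photographic content the JPEG will usually come out many times smaller; for flat graphics like logos the PNG often wins.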

Good that they're finally copying this feature from IE. Perhaps their implementation might even be better, who knows? All I care about is that the two browsers I use the most keep improving.

Romero said,
Good that they're finally copying this feature from IE. Perhaps their implementation might even be better, who knows? All I care about is that the two browsers I use the most keep improving.

IE's improvements were about DECODING jpegs.

Mozilla on the other hand is talking about improving the ENCODER to produce smaller files that are still compatible with current decoders.

They are different projects that don't compete with each other. What you might get is a cumulative benefit when using both (decoding mozjpeg images with IE's decoding pipeline).

Oh, so decoding JPEGs in FF will still be no faster? I guess site owners should hurry up and start using Mozilla's encoder so IE users can benefit even more.

Romero said,
Oh, so decoding JPEGs in FF will still be no faster? I guess site owners should hurry up and start using Mozilla's encoder so IE users can benefit even more.
Correction: so all users can benefit more.*

That's the point of Mozilla, to improve the web as a whole, not just for themselves.

Pluto is a Planet said,
Correction: so all users can benefit more.*
Of course, but with IE's decoder improvements that means double benefit for its users.

Hello,

max22 said,

I never left.


Worst choice you ever made. Like I said in a thread: Firefox eats memory like a fat kid that loves cake.

The only reason I still use FF is Firebug, but if I don't need it: IE and Chrome FTW.

riahc3 said,
Worst choice you ever made. Like I said in a thread: Firefox eats memory like a fat kid that loves cake.

Broken add-ons perhaps? My copy has 20-something add-ons installed and uses a fraction of what Chrome does for the same setup.

riahc3 said,
Hello,

Worst choice you ever made. Like I said in a thread: Firefox eats memory like a fat kid that loves cake.

So what? I have 16GB of RAM.

riahc3 said,
Hello,

Worst choice you ever made. Like I said in a thread: Firefox eats memory like a fat kid that loves cake.


With per-tab processes, a browser like Chrome would use even more memory.

Also, memory usage can vary depending on the add-ons you use and the websites you visit.

InsaneNutter said,

I never joined them when it was the cool thing to do.

I never did then either, but I've been using some variation of it a lot lately. Using pcxfirefox right now; have Cyberfox on a few computers and regular Firefox on only one other.

I think all of them run better than IE 9, 10 or 11, for some reason, on older machines anyway. I see a lot of older machines here!

riahc3 said,
Hello,

Worst choice you ever made. Like I said in a thread: Firefox eats memory like a fat kid that loves cake.

The only reason I still use FF is Firebug, but if I don't need it: IE and Chrome FTW.


In my experience Chrome eats more memory than Firefox ever has. It's just less noticeable because of all the processes it uses... But light it certainly ain't.

Pluto is a Planet said,
Right now with four tabs open: http://imageshack.com/a/img593/4839/kbvq.png

I have IE11 open right now with 14 tabs, and 459.7MB... 14 separate processes. You have Firefox open with 4 tabs, 183.9MB... all in one process.

IE11 average: 32.8MB per tab
FF average: 46.0MB per tab

Hard to compare without having the same sites open, though.
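
One way to make that comparison fairer is to sum resident memory across all of a browser's processes and divide by the number of open tabs. A sketch using the psutil library (the process names and tab counts here are illustrative):

import psutil

def total_rss_mb(name_fragment):
    """Sum resident memory (MB) over processes whose name contains the fragment."""
    total = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = (proc.info["name"] or "").lower()
        mem = proc.info["memory_info"]
        if name_fragment in name and mem is not None:
            total += mem.rss
    return total / (1024 * 1024)

for browser, tabs in [("iexplore", 14), ("firefox", 4)]:
    mb = total_rss_mb(browser)
    print(f"{browser}: {mb:.1f} MB total, {mb / tabs:.1f} MB per tab")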