
How to output compressed content using PHP?


Question

In my latest PHP script I decided to use compression for the output; one of the reasons is that I'm using lots of JavaScript, and all that code compresses really well. I don't have any concrete question, but I need someone who already knows the subject to enlighten me a bit on the best way to compress the output via PHP.

For instance, at first sight, I decided to go with the method I thought was best: use an .htaccess file with the following:

php_flag zlib.output_compression on
php_value zlib.output_compression_level 9

This way, all the output from PHP files would be compressed, and this would work for everything I'm doing, including JavaScript and CSS. However, this method brings some issues to the table:

1) I just tested my script on a remote server and it didn't work. It was a free PHP5 server, and something tells me the webhost has blocked PHP ini modifications through .htaccess. And there are many servers like this one...

2) I would like to avoid the use of .htaccess. For instance, what if someone is using IIS? This will certainly not work for them.

Another option would be to use the ini_set() function to set the PHP configuration, but in all the tests I did, something didn't work, and I don't understand why. All my scripts aggregate the content to be output to the browser in a single variable, and then I simply echo that variable. I tried to use ini_set() to set "zlib.output_compression" to "On" and "zlib.output_compression_level" to "9" right before echoing the content variable, but something didn't work. Maybe the syntax for ini_set() was incorrect? I mean, are the dots separating the ini setting, the underscores, the double quotes, and even the case sensitivity all in place? I don't know if ini_set() is sensitive to all these things. Anyway, this would have been the best option for every browser, OS and server, as long as the zlib module was enabled, of course.
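For reference, this is roughly what I tried (the setting names come straight from the PHP manual; as far as I can tell, they have to be applied before any output is sent, and whether a given host allows changing them at runtime varies):

```php
<?php
// Must run before any output has been sent, or the setting
// has no effect on the current request.
if (extension_loaded('zlib') && !headers_sent()) {
    ini_set('zlib.output_compression', 'On');
    ini_set('zlib.output_compression_level', '9');
}

$content = '<html><body>example aggregated page</body></html>';

echo $content; // the single variable holding the aggregated output
```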

I also found another way to do it, using the ob_start("ob_gzhandler") function, but:

1) I saw lots of code examples and I don't really know the best way to use it. I mean, I was able to use it, but the examples add several lines of code and I didn't really understand what they were for or whether they were really necessary. I tried commenting them out and the script still worked with output compression, but I don't really know if those lines would be important in the future.

2) The official PHP documentation says: "Note: You cannot use both ob_gzhandler() and zlib.output_compression. Also note that using zlib.output_compression is preferred over ob_gzhandler()." That's the reason I would prefer to use zlib.output_compression instead of ob_gzhandler. But I would also like to understand why they recommend zlib over ob_gzhandler.
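For completeness, the minimal usage I ended up with boils down to this (the extra lines in the examples I saw were mostly manual Accept-Encoding checks, which, as far as I can tell, ob_gzhandler already performs internally):

```php
<?php
// ob_gzhandler inspects the client's Accept-Encoding header itself
// and falls back to uncompressed output when gzip isn't supported.
ob_start('ob_gzhandler');

$output = '<p>example page content</p>';

echo $output; // everything echoed from here on is buffered

// PHP flushes the buffer at shutdown, but you can do it explicitly:
ob_end_flush();
```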

Now, the idea is to compress all the output done by PHP, all CSS and JavaScript files. The PHP output is very simple: all the pages in my script are accessed through index.php, which decides which page to show based on the query arguments; then a single echo is all that's required to show that page. For the CSS and JavaScript, I opted to create 2 different PHP files, named scripts.php and styles.php. Both do the exact same thing. scripts.php gathers the content from various .js files and outputs all of them. styles.php does the same, with the difference that only one .css file is read and output; but if some day I have more than one .css file (for instance, for browser compatibility), I will gather them all the same way I'm doing for JavaScript. The Content-Type is correctly defined using the header() function before the output is done in both files.
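To illustrate, scripts.php is essentially this (the file names are just placeholders):

```php
<?php
// Content-Type must be sent before any output.
header('Content-Type: application/javascript; charset=utf-8');

// Placeholder list of script files to concatenate.
$files = array('lib/framework.js', 'app/main.js');

$output = '';
foreach ($files as $file) {
    if (is_file($file)) {
        $output .= file_get_contents($file) . "\n";
    }
}

echo $output;
```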

Recently, a friend of mine suggested the following code:

if (isset($_SERVER['HTTP_ACCEPT_ENCODING'])
	&& strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
	header("Content-Encoding: gzip");
	echo gzencode($output);
} else {
	echo $output;
}

And I want to know if there is any issue one might not be aware of when outputting content using this gzencode() method and that specific header. Would I have any problems using it like this, or is this the way to go?

Well, I'm open to suggestions, but more importantly, I would like to be enlightened (if possible) about PHP output compression.

P.S.: Sorry for this big testament and for the bad English; I had to translate this from a Portuguese post I made on another forum and I'm kind of in a rush...

17 answers to this question

Recommended Posts

  • 0

I can't think of anything bad about using it.

I should note that you might output things like U+0007 and such. :p

Since you're using an if statement to check whether gzip is accepted (via the Accept-Encoding request header, available as $_SERVER['HTTP_ACCEPT_ENCODING']) before applying the gzip header, I think your code is pretty safe to use. I can't see any reason NOT to use it.

Oh, and you might convert it to hexadecimal or something first. Maybe something like rawurlencode() would suit your needs.

  • 0

You are talking about the latest method with gzencode() that I mentioned, right?

But does this method work like ob_gzhandler() and zlib.output_compression, or is it different from both?

About the hexadecimal thing: are you suggesting using rawurlencode() on $output? What would be the benefits of doing that?

  • 0

You could always ask your host if they have mod_gzip installed on their Apache server. This will gzip the output of your PHP scripts (when someone requests the page) and send the gzipped content to the user's web browser, where it will be unzipped (most browsers support this: IE6/7, Firefox, Opera, Safari, etc.). It comes at a small processing cost, but it presumably caches the content and can make a webpage 1/5 its original size.

To compress/speed up JavaScript there are a few things you can do. Make sure all your commonly used JS is in a separate include file; then it will be cached by the user's web browser. You can also try gzipping your JS includes manually, which from experience I can say works nicely. Look at this blog post for how to do it. Use 7-Zip with ultra compression to get nice small sizes. The only disadvantage is that you will need to unzip/rezip the file if you decide to alter it. I tend to use this technique for compressing JavaScript frameworks (which don't really get edited).

If you're caching your gzipped content with your scripts and returning the cached content first (unless the page has changed), then your scripts are fine. If you're just compressing everything on the fly to save on size, I personally don't see much point in it.

Edited by ziadoz
  • 0

Dammit, I thought I had already answered this, now I'll have to write everything again... :s

  ziadoz said:
You could always ask your host if they have mod_gzip installed on their Apache server. This will gzip the output of your PHP scripts (when someone requests the page) and send the gzipped content to the user's web browser, where it will be unzipped (most browsers support this: IE6/7, Firefox, Opera, Safari, etc.). It comes at a small processing cost, but it presumably caches the content and can make a webpage 1/5 its original size.

That won't be possible, because this is a script I intend to release to the public, and I don't want to depend on a module and have to document it. I want something simpler.

  ziadoz said:
To compress/speed up JavaScript there are a few things you can do. Make sure all your commonly used JS is in a separate include file; then it will be cached by the user's web browser. You can also try gzipping your JS includes manually, which from experience I can say works nicely. Look at this blog post for how to do it. Use 7-Zip with ultra compression to get nice small sizes. The only disadvantage is that you will need to unzip/rezip the file if you decide to alter it. I tend to use this technique for compressing JavaScript frameworks (which don't really get edited).

I know how to compress JavaScript; for instance, the downloadable packages will have compressed JavaScript files using the JS Packer. But I still want to compress the output, because I've read that using those 2 methods together makes JS files really small.

  ziadoz said:
If you're caching your gzipped content with your scripts and returning the cached content first (unless the page has changed), then your scripts are fine. If you're just compressing everything on the fly to save on size, I personally don't see much point in it.

About the caching, isn't that a browser thing? I mean, do I have to code anything for the cache to work? As far as my understanding of caching goes, I open a page in the browser and it gets automatically cached; when I refresh the page, if it doesn't have any modifications, the cached version is used, unless it was overwritten with something else (cache limits). Or do I have to put some code in place to let the browser know it needs to cache the compressed output?

Compressing everything on the fly will result in smaller file sizes for the user visiting the page. If the compression doesn't slow the page load much, then it will speed things up, because the user will have fewer KB to download... Well, that's what I think happens.

But anyway, something no one has managed to answer yet (and I even posted this on some other forums) is this:

  Quote
zlib.output_compression is preferable over ob_gzhandler, but is it preferable over gzencode()? I can't use it anyway, so it all comes down to this:

Is ob_gzhandler preferable over gzencode or the other way around?

If anyone knows the answer to this question, it would help me a lot...

  • 0
  Quote
Is ob_gzhandler preferable over gzencode or the other way around?

The project I manage (Project Beehive Forum) registers its own output handler and uses gzencode(). The decision to go this route was made a long time ago, only after struggling to implement ob_gzhandler due to bugs in older versions of PHP4. These bugs have probably all been resolved by now, but I'm reluctant to switch away from something that is working really well.

So yeah, if you're trying to write an application that is going to be used on many, many different PHP versions, and you can't be confident that ob_gzhandler is going to work all the time, I'd go with writing your own.

If you want to look at our implementation you can view gzipenc.inc.php in the Beehive Forum CVS Repository.
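The general shape of a custom gzencode()-based handler looks something like this (a simplified sketch, not our actual gzipenc.inc.php code):

```php
<?php
// Simplified sketch of a custom output handler built on gzencode();
// not the actual Beehive implementation.
function gzip_output_handler($buffer)
{
    $encoding = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';

    if (strpos($encoding, 'gzip') !== false && !headers_sent()) {
        header('Content-Encoding: gzip');
        header('Vary: Accept-Encoding');
        return gzencode($buffer);
    }

    return $buffer; // client doesn't accept gzip: send as-is
}

ob_start('gzip_output_handler');
```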

  • 0
  Quote
Dammit, I thought I had already answered this, now I'll have to write everything again... :s

You said you were open to suggestions.

  Quote
Compressing everything on the fly will result in smaller file sizes for the user visiting the page. If the compression doesn't slow the page load much, then it will speed things up, because the user will have fewer KB to download... Well, that's what I think happens.

I know how it works. What I meant is server-side caching. If you have 10 users all requesting the same page, that's 10 on-the-fly compressions of exactly the same page. But if the page output hasn't changed, it makes sense to store the compressed version of the content on the server and return that to the 10 users instead of re-compressing the output each time. I don't know if PHP's zlib already does this, but you could look into it.

Edited by ziadoz
  • 0

You misunderstood me... More than 12 hours ago I wrote you a reply and I thought it was posted, but it wasn't. So I had to rewrite everything. Of course I'm open to suggestions...

About the caching, let's say zlib doesn't do this automatically. Does PHP have functions to code it manually, or are you saying I would need to save the compressed output to some file and work from there?

  • 0
  Nazgulled said:
You misunderstood me... More than 12 hours ago I wrote you a reply and I thought it was posted, but it wasn't. So I had to rewrite everything. Of course I'm open to suggestions...

About the caching, let's say zlib doesn't do this automatically. Does PHP have functions to code it manually, or are you saying I would need to save the compressed output to some file and work from there?

Apologies, I thought you were annoyed before. :)

Yes, I mean saving the file, perhaps comparing the page output via an md5 sum against the stored compressed version. I don't know if this will necessarily be any better, though (since you still have to generate the PHP page output first to compare it against what is stored); it's just an idea. I read about it in a blog somewhere, but I can't find the link.
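The idea could be sketched like this (the function name and cache layout are made up; it keys the stored gzip by an md5 of the uncompressed page):

```php
<?php
// Hypothetical server-side cache of gzipped output, keyed by an md5
// of the uncompressed page. Function name and layout are made up.
function cached_gzip($output, $cacheDir)
{
    $file = $cacheDir . '/' . md5($output) . '.gz';

    if (is_file($file)) {
        return file_get_contents($file);   // reuse stored compression
    }

    $gz = gzencode($output, 9);            // compress once...
    file_put_contents($file, $gz);         // ...and store for next time
    return $gz;
}
```

You still have to generate the page to compute the md5, so this only saves the compression step, not the page generation.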

  • 0

Yeah, caching would be a good idea, but it may require a lot of work...

You know the software that's running these forums, Invision? I always liked them and the way they code, and back when they released the software free of charge I would look at their source code and learn from it. I don't know exactly how they do it, but I always thought that the way they gzip the pages (notice the "GZIP? Yes" text in the bottom right corner) is the way to go, or at least one good way to do it...

But I don't really know how they compress the page, nor whether the code is even readable anymore since they went commercial. And I wouldn't want to infringe any copyright laws.

  • 0

@Kudos

That's what I was thinking of using, but that "warning" in the PHP manual about preferring zlib.output_compression over ob_gzhandler left me thinking...

@ziadoz

I just read that article, and both modes work the same way (as the author states); the only difference is that you get to play with the HTML before output. Otherwise, they are equal.

  • 0

Well, to put it simply, you have 2 options: A) get a server that supports zlib.output_compression, or B) use ob_gzhandler. Personally I use ob_gzhandler over zlib because of the restrictions of shared hosting and because I always use output buffering anyway. Your post has got me thinking, and since I now run my own servers I may switch over :)

  • 0

Yeah, I think I'll just add an option in the configuration area to enable compression or not, using ob_gzhandler, because it looks like it's the same as doing gzencode() according to the article ziadoz posted. Then, in the documentation, I can let users know they can still enable zlib compression with some .htaccess settings while leaving compression disabled in the configuration. Then they can do whatever they want.

Thanks for all your help guys!

This topic is now closed to further replies.