• 0

How to output compressed content using PHP?


Question

In my latest PHP script I decided to use compression for the output; one of the reasons is that I'm using lots of JavaScript, and all that code compressed looks sweet. I don't have a concrete question, but I need someone who already knows the subject to enlighten me a bit on the best way to compress output via PHP.

For instance, at first sight, I decided to go with the method I thought was best: use an .htaccess file with the following:

php_flag zlib.output_compression on
php_value zlib.output_compression_level 9

This way, all the output from PHP files would be compressed, and this would work for everything I'm doing, including JavaScript and CSS. Anyway, this method brings some issues to the table:

1) I just tested my script on a remote server and it didn't work. It was a free PHP5 server, and something tells me the web host has blocked PHP ini modifications through .htaccess. And there are many servers like this one...

2) I would like to avoid the use of .htaccess. For instance, what if someone is using IIS? This certainly won't work for them.

Another option would be to use the ini_set() function to set the PHP configuration, but in all the tests I did, something didn't work and I don't understand why. All my scripts aggregate the content to be output to the browser in a single variable, and then I simply echo that variable. I tried using ini_set() to set "zlib.output_compression" to "On" and "zlib.output_compression_level" to "9" right before echoing the content variable, but it didn't work. Maybe the syntax for ini_set() was incorrect? I mean, are the dots separating the ini setting, the underscores, the double quotes and even the case sensitivity in place? I don't know if ini_set() is sensitive to all these things. Anyway, this would have been the best option for every browser, OS and server, as long as the zlib module was enabled, of course.
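
For reference, what I tried looked roughly like this (just a sketch; $output is the variable where I aggregate all the content):

ini_set("zlib.output_compression", "On");
ini_set("zlib.output_compression_level", "9");

echo $output;

From what I can tell, the setting names are passed as plain case-sensitive strings, dots and underscores exactly as they appear in php.ini, and the calls have to happen before any byte of output is sent.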

I also found another way to do it, using the ob_start("ob_gzhandler") function, but:

1) I saw lots of code examples and I don't really know the best way to use it. I mean, I was able to use it, but the examples added several lines of code and I didn't really understand what they were for or whether they were really necessary. I tried commenting them out and the script still worked with output compression, but I don't really know if those lines would be important in the future (a minimal sketch of the bare usage follows after this list).

2) The official PHP documentation says: "Note: You cannot use both ob_gzhandler() and zlib.output_compression. Also note that using zlib.output_compression is preferred over ob_gzhandler()." That's the reason I would prefer to use zlib.output_compression instead of ob_gzhandler, but I would also like to understand why they recommend zlib over ob_gzhandler.
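
For what it's worth, the bare minimum I got working looked like this (a sketch; it assumes the zlib extension is available, since ob_gzhandler needs it):

ob_start("ob_gzhandler");

echo $output;

Everything else in the examples I found seemed to be extra header checks on top of this, and apparently ob_gzhandler inspects the browser's Accept-Encoding header by itself, falling back to uncompressed output when gzip isn't accepted.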

Now, the idea is to compress all the output done by PHP: all the pages plus the CSS and JavaScript files. The PHP output is very simple: all the pages in my script are accessed through index.php, which decides from the query arguments which page to show, and then a single echo is required to output that page. For the CSS and JavaScript, I opted to create two different PHP files, named scripts.php and styles.php. Both do the exact same thing. scripts.php gathers the content from various .js files and outputs all of them. styles.php does the same, with the difference that only one .css file is read and output; but if some day I have more than one .css file (for instance, for browser compatibility), I will gather them all the same way I'm doing for JavaScript. The Content-Type is correctly defined using the header() function before the output is done in both files.
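
To give an idea, scripts.php is basically this (a simplified sketch; the file names are made up):

header("Content-Type: text/javascript");

$output = "";
foreach (array("one.js", "two.js", "three.js") as $file) {
	$output .= file_get_contents($file) . "\n";
}
echo $output;

styles.php is the same thing with a text/css Content-Type and a single .css file.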

Recently, a friend of mine suggested the following code:

// check whether the browser accepts gzip encoded content
if (isset($_SERVER["HTTP_ACCEPT_ENCODING"])
	&& strpos($_SERVER["HTTP_ACCEPT_ENCODING"], "gzip") !== false) {
	header("Content-Encoding: gzip");
	echo gzencode($output);
} else {
	echo $output;
}

And I want to know if there is any issue one might not know about with outputting content using this gzencode() method and that specific header. Would I have any problems using it like this, or is this the way to go?

Well, I'm open to suggestions, but more importantly, I would like to be enlightened (if possible) about PHP output compression.

P.S.: Sorry for this big testament and for the bad English; I had to translate this from a Portuguese post I made on another forum and I'm kind of in a rush...

17 answers to this question

Recommended Posts

  • 0

I can't think of anything bad about using it.

I should note that you might output things like U+0007 and such. :p

Since you're using an if statement to check whether gzip is accepted via the Accept-Encoding header before applying the gzip header, I think your code is pretty safe to use. I can't see any reason NOT to use it.

Oh, and you might convert it to hexadecimal or something first. Maybe something like rawurlencode() would suit your needs.

  • 0

You're talking about the gzencode() method I mentioned last, right?

But does this method work like ob_gzhandler() and zlib.output_compression, or is it different from both?

About the hexadecimal thing, are you suggesting using rawurlencode() on $output? What would be the benefits of doing that?

  • 0

You could always ask your host if they have mod_gzip installed on their Apache server. This will gzip up the output of your PHP scripts (when someone requests the page), then send the gzipped content to the user's web browser, where it will be unzipped (most browsers support this: IE6/7, Firefox, Opera, Safari, etc.). It comes at a small processing cost, but it presumably caches the content and can make a web page 1/5 of its original size.

To compress/speed up JavaScript there are a few things you can do. Make sure all your commonly used JS is in a separate include file; then it will be cached by the user's web browser. You can also try gzipping your JS includes manually, which from experience I can say works nicely. Look at this blog post for how to do it. Use 7-Zip with ultra compression to get nice small sizes. The only disadvantage is that you will need to unzip/re-zip the file if you decide to alter it. I tend to use this technique for compressing JavaScript frameworks (which don't really get edited).
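
For example, serving a pre-gzipped framework file through PHP could look something like this (a sketch; the file names are placeholders):

// send the pre-compressed copy only if the browser accepts gzip
if (isset($_SERVER["HTTP_ACCEPT_ENCODING"])
	&& strpos($_SERVER["HTTP_ACCEPT_ENCODING"], "gzip") !== false) {
	header("Content-Type: text/javascript");
	header("Content-Encoding: gzip");
	readfile("framework.js.gz");
} else {
	header("Content-Type: text/javascript");
	readfile("framework.js");
}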

If you're caching your gzipped content with your scripts and returning the cached content first (unless the page has changed), then your scripts are fine. If you're just compressing everything on the fly to save on size, I don't see that much point in it personally.

Edited by ziadoz
  • 0

Dammit, I thought I had already answered this, now I'll have to write everything again... :s

  ziadoz said:
You could always ask your host if they have mod_gzip installed on their Apache server. This will gzip up the output of your PHP scripts (when someone requests the page), then send the gzipped content to the user's web browser, where it will be unzipped (most browsers support this: IE6/7, Firefox, Opera, Safari, etc.). It comes at a small processing cost, but it presumably caches the content and can make a web page 1/5 of its original size.

That won't be possible because this is a script I intend to release to the public, and I don't want to depend on a module and have to write documentation about it. I want something simpler.

  ziadoz said:
To compress/speed up JavaScript there are a few things you can do. Make sure all your commonly used JS is in a separate include file; then it will be cached by the user's web browser. You can also try gzipping your JS includes manually, which from experience I can say works nicely. Look at this blog post for how to do it. Use 7-Zip with ultra compression to get nice small sizes. The only disadvantage is that you will need to unzip/re-zip the file if you decide to alter it. I tend to use this technique for compressing JavaScript frameworks (which don't really get edited).

I know how to compress JavaScript; for instance, the downloadable packages will have JavaScript files compressed with the JS Packer. But I still want to compress the output because I've read that using those two methods together makes JS files really small.

  ziadoz said:
If you're caching your gzipped content with your scripts and returning the cached content first (unless the page has changed), then your scripts are fine. If you're just compressing everything on the fly to save on size, I don't see that much point in it personally.

About the caching, isn't that a browser issue? I mean, do I have to code anything for the cache to work? As far as my understanding of caching goes, I open a page in the browser and it gets automatically cached; when I refresh the page, if it doesn't have any modifications, the browser uses the cached version, provided it wasn't overwritten with something else (cache limits). Or do I have to put some code in place to let the browser know it needs to cache the compressed output?

Compressing everything on the fly will result in smaller file sizes for the user visiting the page. If the compression doesn't slow the page load much, then it will speed things up, because the user will have fewer kilobytes to download... Well, that's what I think happens.

But anyway, something no one has answered me yet (and I even posted this on some other forums) is this:

  Quote
zlib.output_compression is preferable over ob_gzhandler, but is it preferable over gzencode? I can't use it anyway, so it all comes down to this:

Is ob_gzhandler preferable over gzencode or the other way around?

If anyone knows the answer to this question, it would help me a lot...

  • 0
  Quote
Is ob_gzhandler preferable over gzencode or the other way around?

The project I manage (Project Beehive Forum) registers its own output handler and uses gzencode. The decision to go this route was made a long time ago, and only after struggling to implement ob_gzhandler due to bugs in older versions of PHP4. These bugs have probably all been resolved by now, but I'm reluctant to switch from something that is working really well.

So yeah, if you're trying to write an application that is going to be used on many, many different PHP versions, and you can't be confident that ob_gzhandler is going to work all the time, I'd go with writing your own.

If you want to look at our implementation you can view gzipenc.inc.php in the Beehive Forum CVS Repository.

  • 0
  Quote
Dammit, I thought I had already answered this, now I'll have to write everything again... :s

You said you were open to suggestions.

  Quote
Compressing everything on the fly will result in smaller file sizes for the user visiting the page. If the compression doesn't slow the page load much, then it will speed things up, because the user will have fewer kilobytes to download... Well, that's what I think happens.

I know how it works. What I meant is server-side caching. If you have 10 users all requesting the same page, that's 10 on-the-fly compressions of exactly the same page. But if the page output hasn't changed, it makes sense to store the compressed version of the content on the server and return that to the 10 users instead of re-compressing the output again. I don't know if PHP's zlib already does this or not, but you could look into it.

Edited by ziadoz
  • 0

You misunderstood me... More than 12 hours ago I wrote you a reply and thought it was posted, but it wasn't, so I had to rewrite everything. Of course I'm open to suggestions...

About the caching, let's say zlib doesn't do this automatically. Does PHP have functions to code it manually, or are you saying I would need to save the compressed output to some file and work from there?

  • 0
  Nazgulled said:
You misunderstood me... More than 12 hours ago I wrote you a reply and thought it was posted, but it wasn't, so I had to rewrite everything. Of course I'm open to suggestions...

About the caching, let's say zlib doesn't do this automatically. Does PHP have functions to code it manually, or are you saying I would need to save the compressed output to some file and work from there?

Apologies, I thought you were annoyed before. :)

Yes, I mean saving the file and perhaps comparing an md5 sum of the page output against the stored compressed version. I don't know if this will necessarily be any better though (since you have to actually get the PHP page output first to compare it against what is stored); it's just an idea. I read about it on a blog somewhere, but I can't find the link.
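
Something along these lines, maybe (just a sketch; the cache paths are made up, and $output is the fully rendered page):

$hash = md5($output);
$cacheFile = "cache/page.gz";
$hashFile = "cache/page.md5";

// re-compress only when the rendered page actually changed
if (!is_file($hashFile) || file_get_contents($hashFile) !== $hash) {
	file_put_contents($cacheFile, gzencode($output, 9));
	file_put_contents($hashFile, $hash);
}

header("Content-Encoding: gzip");
readfile($cacheFile);

Like I said, you still have to render the page to compute the md5, so the only thing you save is the compression step.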

  • 0

Yeah, caching would be a good idea, but it may require a lot of work...

You know the software powering these forums, Invision? I've always liked them and the way they code, and back when they released the software free of charge, I would look at their source code and learn from it. I don't know exactly how they do it, but I always thought the way they gzip the pages (notice the "GZIP? Yes" text in the bottom right corner) is the way to go, or at least one good way to do it...

But I don't really know how they compress the page, nor whether the code is still readable now that they've gone commercial. And I didn't want to infringe any copyright.

  • 0

@Kudos

That's what I was thinking of using, but that "warning" in the PHP manual about preferring zlib.output_compression over ob_gzhandler left me thinking...

@ziadoz

Just read that article, and both methods work the same way (as the author states it); the only difference is that you get to play with the HTML before output. Otherwise, they are equal.

  • 0

Well, to put it simply, you have two options: A) get a server that supports zlib.output_compression, or B) use ob_gzhandler. Personally I use ob_gzhandler over zlib because of the restrictions of shared hosting and because I always use output buffering anyway. Your post has got me thinking, and since I now run my own servers, I may switch over :)

  • 0

Yeah, I think I'll just add an option in the configuration area to enable compression or not, using ob_gzhandler, because it looks like it's the same as doing gzencode() according to the article ziadoz posted. Then, in the documentation, I can let users know that they can still enable zlib compression with some .htaccess settings while keeping compression disabled in the configuration. That way, they can do whatever they want.
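
Something as simple as this should do it (a sketch; $config is my settings array, and the option name is made up):

// only buffer through ob_gzhandler when zlib compression isn't already on,
// since the manual says you can't use both at the same time
if (!empty($config["gzip_output"]) && !ini_get("zlib.output_compression")) {
	ob_start("ob_gzhandler");
}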

Thanks for all your help guys!

This topic is now closed to further replies.