Ubuntu to get new packaging format and app installer


According to a recent message posted to the Ubuntu Devel mailing list by Colin Watson, Installer Team leader, Ubuntu might get a new, simplified packaging format and app installer that should make it easier for developers to get their software into Ubuntu. At least initially this will target the Ubuntu phone and tablet, but it should be usable elsewhere too, even on non-Ubuntu or non-Linux systems.

Existing packages won't change, and Ubuntu will continue to use dpkg and apt, staying in sync with Debian and so on.

"Click packages" (the new packaging format) is aimed at making it easier to build packages for Ubuntu: no dependencies between applications, no maintainer scripts and each app will be installed in its own directory.

The new package format needs a new installer, and there is already a proof-of-concept low-level app package installer that is entirely new code. Highlights of what it can do so far:

  • no dependencies between apps; single implicit dependency on the base system by way of a Click-Base-System field;
  • installs each app to an entirely separate directory;
  • entirely declarative: maintainer scripts are forbidden;
  • base package manager overhead, i.e. the time required to install a trivial package containing a single small file, is about 0.15 seconds on a newish x86 laptop and about 0.6 seconds on a Nexus 7 (and that's with the current prototype implementation in Python; a later implementation could be in C and would then be faster still);
  • not limited to installing as root, although there may be similar constraints elsewhere to ensure that apps can't edit their own code at run-time;
  • packages built by feeding the intended output directory tree to a simple Python tool, plus a manifest.json file (a rough sketch of the idea follows this list);
  • building packages requires only the Python standard library, with the intent that it should be possible to build these packages quite easily on non-Ubuntu or even non-Linux systems;
  • binary packaging format sufficiently similar to existing one that we could add support to higher-level tools with minimal effort;
  • strawman design for hooks into system packages, which will be entirely declarative from the app's point of view;
  • unit-tested from the start.
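
To make the build workflow more concrete, here is a rough sketch of the idea in Python. The announcement does not spell out the real tool's interface, so everything named below (the required manifest fields, the ".click" file suffix, the per-app install path mentioned in the comments) is assumed for illustration only; the point it demonstrates is that an output directory tree plus a manifest.json can be turned into a self-contained package using nothing but the Python standard library.

    # Rough illustration only: the manifest fields, archive layout, and ".click"
    # extension used here are assumptions, not Canonical's specification.
    import json
    import tarfile
    from pathlib import Path

    def build_click_like_package(app_dir: str, output: str) -> None:
        """Bundle an app's output directory tree plus its manifest.json into one archive."""
        app_path = Path(app_dir)
        manifest = json.loads((app_path / "manifest.json").read_text())

        # Purely declarative: no maintainer scripts, no dependencies on other apps.
        for field in ("name", "version"):  # hypothetical required fields
            if field not in manifest:
                raise ValueError(f"manifest.json is missing '{field}'")

        # Ship the whole tree as a single self-contained unit; the installer would
        # later unpack it into its own per-app directory.
        with tarfile.open(output, "w:gz") as tar:
            tar.add(app_path, arcname=f"{manifest['name']}-{manifest['version']}")

    if __name__ == "__main__":
        build_click_like_package("hello-app", "hello-app_1.0.click")

Nothing in that sketch needs anything outside the Python standard library, which is the property that would let such packages be built on non-Ubuntu or even non-Linux systems.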

The Ubuntu developers have also looked into similar existing tools such as Listaller and 0install, but there are some things they prefer to do differently. For example, Listaller is dependency-based, whereas they want this to be as independent as possible, and 0install would need some system integration problems to be solved first; so instead, they've decided to create a new installer.

The proof-of-concept installer is currently under 300 lines of Python and obviously still needs work. The prototype will be ready in time for UDS next week, and there is also going to be a UDS session to discuss it.

Source: http://www.webupd8.o...simplified.html

Even though the article says it will continue to be compatible with Debian, mark my words: this is their first step toward breaking ties with Debian in the long run. I've been saying that for half a year now.

So if I understand correctly, they are basically trying to do what PC-BSD is doing with AppCafe?

Yeah, they are trying to get away from Debian. They want to be their own distro.

Even though the article says it will continue to be compatible with Debian, mark my words: this is their first step toward breaking ties with Debian in the long run. I've been saying that for half a year now.

I definitely agree with your assertion. The general consensus among Debian Developers seems to be that Canonical has a severe case of "not invented here" syndrome. They have been carrying more and more of their own patches that never get submitted back upstream. Some prominent packages, such as MySQL, are permanently out of sync between Debian and Ubuntu because of Canonical's decision to stick with Upstart, which introduces slight changes to the Debian packaging format that cannot be imported directly upstream without nasty patches to dpkg. The main reason that Unity does not appear in most other distros, including Debian, is that it requires patches to a large number of other packages that potentially break other functionality or introduce other bugs. Ubuntu carries these patches permanently and does not attempt to submit them back upstream, causing problems for everyone involved - including Canonical.

The primary reason that Canonical used Debian as the base for Ubuntu was that they didn't have the resources to maintain all the packages in the Debian repository themselves - now they are starting to believe that they can. Debian is very focused on doing things "the right way" to create the best technical solution for the universal operating system. Canonical is focused on differentiating themselves at all costs.

Haha - oh Christ.

Well, that'd be a laugh if people got annoyed by this and decided to jump ship to other distros that run faster and have less junk in them, like Fedora or SUSE or whatnot.

Haha - oh Christ.

Well, that'd be a laugh if people got annoyed by this and decided to jump ship to other distros that run faster and have less junk in them, like Fedora or SUSE or whatnot.

Part of what made Ubuntu so popular to begin with was that it built on Debian's technical excellence, making it attractive and easy for the average person to use and shipping partially stabilized snapshots of Debian Unstable every six months. Now that they are moving away from that base, maybe another Debian derivative will take up the mantle.

I definitely agree with your assertion. The general consensus among Debian Developers seems to be that Canonical has a severe case of "not invented here" syndrome. They have been carrying more and more of their own patches that never get submitted back upstream. Some prominent packages, such as MySQL, are permanently out of sync between Debian and Ubuntu because of Canonical's decision to stick with Upstart, which introduces slight changes to the Debian packaging format that cannot be imported directly upstream without nasty patches to dpkg. The main reason that Unity does not appear in most other distros, including Debian, is that it requires patches to a large number of other packages that potentially break other functionality or introduce other bugs. Ubuntu carries these patches permanently and does not attempt to submit them back upstream, causing problems for everyone involved - including Canonical.

The primary reason that Canonical used Debian as the base for Ubuntu was that they didn't have the resources to maintain all the packages in the Debian repository themselves - now they are starting to believe that they can. Debian is very focused on doing things "the right way" to create the best technical solution for the universal operating system. Canonical is focused on differentiating themselves at all costs.

That's nasty... this could potentially mess up Ubuntu, or Linux as a whole, to some degree. It's kind of sad that Ubuntu is so obviously snubbing Debian and the general user base just so they can be "different". Ubuntu only got its start thanks to Debian's code in the first place.

Then again, this might help set some kind of sane standard for the many apps distributed as .run and .bundle auto-installers. There's a lot of software (especially proprietary) that could use some kind of package manager like this one instead of each vendor rolling its own.

Debian packages are by far the best of the Linux packaging / distribution formats, but I've still found too many dependencies and hang-ups to say that they compare to what you find in Windows / OSX for the average user.

TBH, I miss the days when 90+% of installs on OSX were just dragging and dropping an icon. I always thought that was the most logical and user friendly install I've seen, as well as handling updates inside the application (instead of through an application control panel), but apparently no one designing OSes agrees.

Haha - oh Christ. Well, that'd be a laugh if people got annoyed by this and decided to jump ship to other distros that run faster and have less junk in them, like Fedora or SUSE or whatnot.

I'm really keeping tabs on Cloverleaf Linux. It's a SUSE-based distro that's going to focus on ease of use and features. I feel it can give Ubuntu a run for its money.

Edit: Oh snap! Dead link in OP. :/

Then again, this might help set some kind of sane standard for the many apps distributed as .run and .bundle auto-installers. There's a lot of software (especially proprietary) that could use some kind of package manager like this one instead of each vendor rolling its own.

I disagree, there is no need, NO NEED, for this at all.

RPM was the original package format; it still works OK-ish, but other things have come along with better features.

DPKG is pretty good, have to say I like it.

I'm a fan of Arch's method of just tar.xz'ing the files along with any scripts that need to run when the package is installed, which I think is pretty similar to Gentoo's approach (Gentoo doesn't usually build binary packages, but you can force it to if desired and then install them on other computers without compiling).

I see nothing gained by having yet another package container when the existing formats are fine for the job - nothing is lacking. Plus this is from the company that started Mir and claimed so much garbage without even reading about Wayland, with a team seemingly incapable of understanding what Wayland is and does.

I don't have much hope for this, and I REALLY DO hope that other distros take Ubuntu's place; they are becoming too big for their boots and seem to think that what they say goes in the GNU/Linux world.

Wait, why are so many people griping about this? Just use a different distribution if you can't fit any more sand in your crack. When did the FOSS community become so damn sensitive to the actions of distro developers? The article itself has more "reinventing the wheel" complaints than I can keep track of.

As a very--VERY--casual Linux user, I don't give a crap how many other solutions there might already be for this if they aren't being used. Major Linux distributions are still plagued by the community's obsessive addiction to the decades-old affliction of "DLL hell". God forbid you want installing software to just be installing the software.

I disagree, there is no need, NO NEED, for this at all.

RPM was the original package format; it still works OK-ish, but other things have come along with better features.

DPKG is pretty good, have to say I like it.

I'm a fan of Arch's method of just tar.xz'ing the files along with any scripts that need to run when the package is installed, which I think is pretty similar to Gentoo's approach (Gentoo doesn't usually build binary packages, but you can force it to if desired and then install them on other computers without compiling).

I see nothing gained by having yet another package container when the existing formats are fine for the job - nothing is lacking. Plus this is from the company that started Mir and claimed so much garbage without even reading about Wayland, with a team seemingly incapable of understanding what Wayland is and does.

I don't have much hope for this, and I REALLY DO hope that other distros take Ubuntu's place; they are becoming too big for their boots and seem to think that what they say goes in the GNU/Linux world.

There's no need, but the fact is it takes a big entity like Canonical to push a package system for software with no dependencies that proprietary software vendors might be willing to embrace. Gentoo's Portage might do the job just fine, but I don't see it being adopted anywhere other than Gentoo.

There's no need, but the fact is it takes a big entity like Canonical to push a package system for software with no dependencies that proprietary software vendors might be willing to embrace. Gentoo's Portage might do the job just fine, but I don't see it being adopted anywhere other than Gentoo.

What's wrong with Debian packages? Even jailbroken devices use Debian packages!

"Wait, why are so many people griping about this? Just use a different distribution if you can't fit any more sand in your crack. When did the FOSS community become so damn sensitive to the actions of distro developers? The article itself has more "reinventing the wheel" complaints than I can keep track of."

Because it's Ubuntu: they'll push this like it's the next best thing, and unfortunately Ubuntu is by far the most widely used Linux distribution, which means everyone listens to them over all the other distros put together.

TBH, I miss the days when 90+% of installs on OSX were just dragging and dropping an icon. I always thought that was the most logical and user friendly install I've seen, as well as handling updates inside the application (instead of through an application control panel), but apparently no one designing OSes agrees.

To this very day it's still largely the same software that uses installers on OS X: Apple's app suites, Adobe Creative Suite, Microsoft Office, etc. Some apps actually went from installers to drag 'n' drop - VMware Fusion, for example. Personally I haven't experienced a major shift from drag 'n' drop to installers. The one big thing that always annoyed me about OS X is the lack of a built-in uninstaller that automatically tracks and gets rid of support files.

Debian packages are by far the best of the Linux packaging / distribution formats, but I've still found too many dependencies and hang-ups to say that they compare to what you find in Windows / OSX for the average user.

TBH, I miss the days when 90+% of installs on OSX were just dragging and dropping an icon. I always thought that was the most logical and user friendly install I've seen, as well as handling updates inside the application (instead of through an application control panel), but apparently no one designing OSes agrees.

YES, since I moved to OS X, I love how everything is in the ONE folder!

What's wrong with Debian packages? Even jailbroken devices use Debian packages!

There's nothing wrong with Debian packages, but if you'll be installing packages with no dependencies on phones and tablets, you don't need the overhead of building indexes, which sometimes takes more time than the package install itself.

There's nothing wrong with Debian packages, but if you'll be installing packages with no dependencies on phones and tablets, you don't need the overhead of building indexes, which sometimes takes more time than the package install itself.

It's really a question of speed and ease versus security and integration. Self-contained packages like the ones Canonical is considering introducing should be much faster to package, because everything is contained in a single directory, and much faster to install, because there is no indexing, hashing, or system integration to slow things down. On the other hand there is an inherent lack of system security features that can be applied to this approach, such as strict checksums and signature checking, and the packages do not integrate well with the system. Some of the benefit of shared libraries and other deduplication features of modern package managers is lost with this less integrated approach. Strict control over all installed programs is also sacrificed.
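
To make the checksum point concrete, here is a minimal sketch of the kind of integrity check an index-based package manager performs before installing anything, and which a bare self-contained app bundle would have to re-implement on its own. The file names and the source of the trusted digest are made up for illustration; real package managers take signed digests from their repository metadata.

    import hashlib
    import sys

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        """Hash a package file in chunks so large archives don't need to fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_package(package_path: str, expected_sha256: str) -> bool:
        """Refuse anything whose digest does not match the trusted value."""
        return sha256_of(package_path) == expected_sha256.lower()

    if __name__ == "__main__":
        # Usage: verify.py <package-file> <expected-sha256>
        # (the expected digest would normally come from signed repository metadata)
        if len(sys.argv) == 3:
            print(verify_package(sys.argv[1], sys.argv[2]))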

Canonical's proposed package system makes sense - and even looks attractive - from a proprietary software perspective, but it is also what Debian and most other distributions would consider the "wrong way" of doing things. Distributions strive to make software more integrated, not less. Despite the fact that it is a successful commercial vendor of open-source software, or perhaps because of it, Red Hat has rejected this approach in favor of its more traditional package management system. In fact, one of Red Hat's certifications covers properly patching and packaging software for RHEL - not working around the package manager with a secondary packaging system.

On a side note, I'd be interested to hear what the Gentoo developers think of this plan. They tend to consider even Debian's policies too liberal, so I'm sure they will disagree with Canonical's proposed approach. The interesting part will be their technical reasoning. That's always my favorite part of talking to Gentoo developers and users: they are generally well informed and very opinionated.

On the other hand there is an inherent lack of system security features that can be applied to this approach, such as strict checksums and signature checking, and the packages do not integrate well with the system. Some of the benefit of shared libraries and other deduplication features of modern package managers is lost with this less integrated approach. Strict control over all installed programs is also sacrificed.

I was thinking along similar lines as far as security goes. One of Linux's "safety nets" against malware is that the majority of software a user gets comes from the distribution's repository - not 100% of it, of course, but I'd guess the overwhelming majority. "Portable" programs like these are convenient, but I could easily picture (just as an example) gullible users being tricked into installing things - say, a torrent for a Half-Life 3 beta or Microsoft Office for Linux. You know it's fake, I know it's fake, but the random clueless guy who decided to try Linux because he saw Steam was available might not, and a self-contained portable app with no fear of missing dependencies just makes it that much easier. Click, click, done - infected with a lovely rootkit or pop-up ads.

I'm also curious about the relative bloat... some dependencies can be rather large depending on the program. It can be heavy enough mixing, say, GTK and Qt stuff, but multiple copies and versions for the various programs? That could potentially get rather fat. WinSxS and the GAC have a similar problem: absurdly handy, especially if you were around for the 9x days, and at the same time they can get rather piggish with enough programs installed.

I also get what you're saying about possibly not integrating well. Portable programs work rather well on Windows because you have a known set of APIs available and can easily have your program "fit in" - it's extremely unlikely you'll find somebody running a different shell than Explorer, for example. With Linux you have an absurd number of possibilities to deal with, which can cause a given program to either run perfectly or explode on startup.

On the plus side, though, it would also solve a couple of problems with dependency issues and conflicts... dependency hell isn't specific to old versions of Windows, and it can be a pain in the butt depending on the distro.

I can see Canonical's reasoning, more or less. Personally I'd be OK with it as long as it's "in addition to" and not "instead of" their existing package management: there if you want or need it, but not in your face if you don't. Think something like Chakra's bundle system - a one-click, self-contained installer for "out of repository" programs that doesn't pollute the file system but isn't a replacement for the standard repos.

There are no dependency problems on Linux as long as you stay within your distribution's repository. Problems occur when you try to grab software outside of the package management system, which Canonical's proposed system would pseudo-fix for closed-source programs not in the main repository. It's somewhat like the basic premise behind personal package archives, but taken to the next level.

That was not my point with regard to bloat, however. The point of shared libraries is that only one copy is loaded in memory no matter how many programs are using it. When each application bundles its own copy of a library, there will potentially be many copies loaded in memory simultaneously, negating the performance gain that would otherwise come from resource sharing. There is also the matter of keeping each dependency up to date. Flaws are regularly found in software, and application developers are far less likely to keep on top of patches issued for the libraries they use as long as their application "just works". The maintainers of libraries and other development packages in a distribution, on the other hand, are dedicated to their piece of software and generally keep it updated and secure. Those two points summarize the reasoning behind Debian's policy of banning all software in the main repository from embedding libraries or using internal forks without bulletproof justification to the contrary.
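
As a rough illustration of that deduplication point, here is a sketch that walks a hypothetical tree of self-contained app directories (the /opt/bundled-apps path is made up, not a real Ubuntu location) and reports shared libraries that multiple apps ship as byte-identical copies - copies a distribution would normally replace with a single shared package:

    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def find_duplicate_libraries(apps_root):
        """Group bundled *.so files by content hash; more than one path means duplication."""
        copies = defaultdict(list)
        for lib in Path(apps_root).rglob("*.so*"):
            if lib.is_file():
                digest = hashlib.sha256(lib.read_bytes()).hexdigest()
                copies[digest].append(lib)
        return {digest: paths for digest, paths in copies.items() if len(paths) > 1}

    if __name__ == "__main__":
        # Hypothetical root where self-contained apps would live.
        for digest, paths in find_duplicate_libraries("/opt/bundled-apps").items():
            print(f"{len(paths)} identical copies of {paths[0].name}:")
            for path in paths:
                print(f"  {path}")

Even when the bundled copies are byte-identical on disk, each application still loads its own file at run time, so the memory sharing described above is generally not recovered.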
