Long Term Use



One of the things that always bothered me about Windows was that after a while things got slow or bugged up.

Installing, uninstalling, changing, and editing things causes the drive and everything to get confused, which is why it's good to wipe and start over once in a while.

It's also the reason I started minimizing the applications I use, and moving more things to the cloud/web.

My question, though, is: is Linux the same way, or does the system work differently than that? Will installing apps, removing apps, and so forth cause problems long term like Windows?


Installing, uninstalling, changing, and editing things causes the drive and everything to get confused, which is why it's good to wipe and start over once in a while.

Then use portable apps as much as you can. Problem solved. When I can avoid installing an app, I'm quite happy. I hate the registry.


Try defragging, although it should be set up to run automatically from Vista onwards. I've never experienced any slowdowns over time since Vista. Also make sure you have enough free space (20 GB or more should easily suffice). Truthfully, I only found this noticeable on XP, not on earlier Windows versions like 2000 or 98.

Edit: Linux wouldn't/shouldn't slow down no matter how many applications you install. I find it keeps itself cleaner than Windows, although Vista, 7 and 8 drastically improved on this.


Linux won't require defragging since it uses a different type of filesystem.

Other than that, any operating system will be slowed down by an excess of background processes running. It's all about maintenance and keeping it lean. The whole "Windows rot" thing is not very true, I find. A loaded registry and used hard disk space don't affect performance much. That said, feel free to run Tweaking.com Windows Repair and check the health of your hard drive with something like HDD Sentinel; you could be experiencing a bigger issue.


You might experience a slowdown on Linux as you add programs and services, but it should decrease as you remove things. There are some issues with uninstalling, however: when you install program x and it needs dependency y, it will install x and y. If you go to remove x one day, it won't necessarily remove y, so some manual cleanup there might be good.
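
On a Debian/Ubuntu-style system, that manual cleanup is usually just a couple of commands (a rough sketch; "someapp" is only a placeholder package name):

sudo apt-get remove someapp      # remove the package you installed explicitly
sudo apt-get autoremove          # remove dependencies nothing else needs anymore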


I don't really experience this issue in Windows or Linux these days. You can slow down any OS if you add too many startup items and background tasks. Linux applications do tend to uninstall more cleanly than Windows applications, though, due to the unified package management.


Linux won't require defragging since it uses a different type of filesystem.

...

Linux (and OS X, etc.) still requires defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).


Linux (and OS X, etc.) still requires defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).

That is not entirely correct. On HFS+ volumes, Mac OS X defragments files smaller than 20 MB on the fly. Defragmentation as such is not required on Mac OS X; Apple indicates as much (http://support.apple.com/kb/HT1375) and independent reports agree (http://osxbook.com/s...gmentation.html).

Linux, depending on the file system, may be less fragmentation-resistant, but ext4 is said to be resistant. Anyway, thankfully, SSDs will make defragmentation a maintenance operation of a bygone era.


I did a clean install of Windows, tweaked it to how I wanted it to look and run (no system restore, no multiple users, etc.), then added just a few basic programs like Office, and then ghosted the partition (to both a second hidden partition and to a USB key). Now I play around with my computer as much as I want to, even infecting it with viruses on purpose to see what damage they do, and then in only a few minutes I can restore it back to a clean working system. I also never install programs that insist on starting up when Windows starts up. If I can't disable them from auto-starting, I don't use them; I find an alternative program that will do the same thing. Only a hardware upgrade will make mine run faster.


That is not entirely correct. On HFS+ volumes, Mac OS X defragments files smaller than 20 MB on the fly. Defragmentation as such is not required on Mac OS X; Apple indicates as much (http://support.apple.com/kb/HT1375) and independent reports agree (http://osxbook.com/s...gmentation.html).

Linux, depending on the file system, may be less fragmentation-resistant, but ext4 is said to be resistant. Anyway, thankfully, SSDs will make defragmentation a maintenance operation of a bygone era.

As mentioned, that's only for files 20 MB or smaller, and it's not part of the filesystem; it's OS logic (no functional difference to the end user, but there's a logical distinction). And ext4 is only resistant to fragmentation as long as you pre-allocate the files (yay, extents) to their final length (same with other file systems). If you write a solid 20 MB of data, then write 1 KB to the middle, it's not going to be able to actually place it in the middle of the 20 MB block; it'll be placed somewhere else.
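
To illustrate the pre-allocation point, a minimal sketch on an ext4 volume, assuming the fallocate utility is available ("demo.bin" is just a placeholder file name):

fallocate -l 20M demo.bin   # reserve 20 MB of disk space up front, ideally as one contiguous extent
# later writes that fall inside that 20 MB range land in the reserved extent,
# so they are much less likely to end up as separate fragments elsewhere on the disk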


As mentioned, that's only for files 20 MB or smaller, and it's not part of the filesystem; it's OS logic (no functional difference to the end user, but there's a logical distinction).

Splitting the OS and its proprietary filesystem into two separate entities is a complicated proposition, seeing how HFS+ is strictly a Mac filesystem and its implementation has been improved as OS X has evolved (is HFS+ journaling an OS or a filesystem feature, for instance? If a new feature of NTFS such as quotas or compression is released together with a specific version of Windows and not backported, does it matter that it is technically ntfs.sys and not Windows if one can't be used without the other?). The defragmentation-on-copy is implemented at the kernel level. OS X uses an assortment of other features, such as delayed allocation, to make defragmentation a moot point. In normal circumstances there is simply not a sufficient performance gain to justify the wasted time and extensive disk activity caused by defragmentation.

And ext4 is only resistant to fragmentation as long as you pre-allocate the files (yay, extents) to their final length (same with other file systems). If you write a solid 20 MB of data, then write 1 KB to the middle, it's not going to be able to actually place it in the middle of the 20 MB block; it'll be placed somewhere else.

Even if a terribly written program decided to do that, defragmentation would probably do more damage than it's worth to a Linux system as a whole, unless the defragmenter were extremely well written and took into account that files are spread out as a strategy to resist fragmentation and improve seek times. The benefits of manual defragmentation, at least under modern versions of Linux and especially Mac OS X, are not readily apparent under normal use. Even under Windows, the built-in defragmentation tool has become less and less thorough because there is a time-spent/performance-gained trade-off at play here.

To me it's a myth. If it were necessary to reformat all the time, I would have just stopped using computers altogether.

I do way too many tweaks to be redoing them all the time, nor do I like reinstalling.

People just assume that because after you do a fresh install, the computer is faster. Well, obviously; there's nothing on it.

It's very simple to keep your computer in top-notch shape without the need to re-format.

Most people whose computers get sluggish have a crapload of things that start up with Windows and run in the background.

The best thing to do is think... do I really need that, is it worth the RAM it's using, etc. etc.

Rule of thumb... if it's important, keep it. If it's crap, get rid of it. Simple as that.


Well, Linux (Ubuntu) has a janitor program that can remove old, unused kernels as well as programs that are no longer needed. Also, when you update, it will tell you if some packages are no longer needed. As far as fragmentation goes, the way Linux handles files means a defragmenter isn't really a necessity. (I personally still have an install from five years ago on an 800 MHz PIII with 512 MB of memory and a slow hard drive, and it is still fast.)

Not to mention, removing the leftovers is easier too.

Show hidden files, then go into your (user_name) directory, find the folder for the app you removed, and delete it. Settings files are the only thing most apps keep when removed, and they don't take up more than a few kilobytes anyhow.
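
A rough sketch of both kinds of cleanup on an Ubuntu-style system (".someapp" is only a placeholder; always check what you are deleting first):

dpkg -l 'linux-image-*'            # list the installed kernel packages
sudo apt-get autoremove --purge    # on recent releases this can also clear old, unused kernels
ls -a ~                            # show the hidden settings folders in your home directory
rm -rf ~/.someapp                  # delete the leftover settings folder for an app you removed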


Thanks guys, I guess I started more of a conversation than a simple question, lol.

Probably won't do anything about it; I am quite light on app usage anyway.

But where do I find that janitor-like program?


You might experience a slowdown on Linux as you add programs and services, but it should decrease as you remove things. There are some issues with uninstalling, however: when you install program x and it needs dependency y, it will install x and y. If you go to remove x one day, it won't necessarily remove y, so some manual cleanup there might be good.

I believe most distros provide a mechanism for removing dependencies when they are no longer needed. I'm not overly familiar with the packaging systems of other distributions beyond superficial use, but Debian's package manager keeps track of whether a package was installed manually or automatically. That way, if you install an application, the package you explicitly requested will be marked as manually installed and all other packages that it requires will be marked as automatically installed. Then, if you choose to remove that package, APT knows that the automatically installed dependencies are no longer necessary. For example, you could run sudo apt-get remove vlc to uninstall VLC, then run sudo apt-get autoremove to remove all of its automatically installed dependencies.
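
As a concrete sketch of how those markings look in practice (using VLC as in the example; exact package names vary by release):

apt-mark showmanual | grep vlc   # packages you asked for explicitly
apt-mark showauto | grep vlc     # dependencies that were pulled in automatically
sudo apt-get remove vlc          # remove the application itself
sudo apt-get autoremove          # then remove its now-unneeded automatic dependencies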

If you suspect that you have applications installed that you no longer require, but you aren't exactly sure, APT has provisions to handle that too. You could either run sudo dpkg --get-selections to list all the packages installed on your system and decide for yourself what you no longer need, or use deborphan --guess-all to try to automatically determine which packages should be uninstalled, regardless of their manual/automatic selection status. Both of these methods require you to know what each package does and should not be used unless you are sure you know what you are removing, or you run a high risk of damaging your system. If used correctly, however, they can be very powerful. As a general rule of thumb when dealing with deborphan output, never remove a package whose name starts with lib unless you are absolutely sure it's unnecessary!
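
A hedged example of that workflow (review the output yourself before removing anything; the package name at the end is illustrative only):

sudo dpkg --get-selections | less   # browse everything installed on the system
deborphan --guess-all               # list packages deborphan thinks are orphaned
# after reviewing, remove individual entries explicitly, e.g.:
# sudo apt-get remove --purge some-orphaned-package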

Linux (and OS X, etc.) still requires defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).

As others have pointed out already, most modern file systems do not require explicit defragmentation. EXT2/3/4, HFS+, and UFS do not have online defragmentation utilities included in their tool suites because it is largely unnecessary. They can be explicitly defragmented, however, using the file system check utility (fsck) provided with each file system. An EXT4 volume generally has fragmentation of less than 2% until it reaches over 95% capacity, at which point the file placement algorithms used to prevent fragmentation break down. (Incidentally, the current fragmentation on my primary EXT4 volume is reported as 0.19%.) The other aforementioned file systems have similar characteristics (but not quite the same).
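
If you want to see that fragmentation figure for yourself, a minimal sketch on an ext4 volume (assuming the device is /dev/sdb1; substitute your own, and ideally run it against an unmounted filesystem):

sudo fsck.ext4 -fn /dev/sdb1   # -f forces a full check, -n keeps it strictly read-only
# the summary line reports something like "12345/67890 files (0.2% non-contiguous)",
# which is the fragmentation percentage referred to above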


I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".


Windows XP does suffer from this. I performed a lot of tests on my own equipment, and after approximately 6-9 months the machine would be a lot slower than when it was first installed.

Windows 7, however, I've found to be considerably better; the performance of the machine is pretty much stable for years at a time. It's something I discussed with my friend the other day, who uses his machine for graphic design and gaming, and he said the same thing.

I have recently re-installed my home PC with Windows 7 and decided to keep it as clean as possible (although this is not to say that the previous installation wasn't stable and fast; I was replacing HDDs on my home PC). I haven't installed Microsoft Office as I'm moving towards Google Docs for reporting, documents and other office files. This machine is purely for games and virtualisation, and I expect it will run for a good few years until it's either replaced or a hardware component dies.

My main machine has been a Mac since around 2005, and in my experience none of these machines experienced a slowdown over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files and many of the resources are bundled with the application (i.e. DLLs). This not only makes it easier to back up but, I believe, has a big impact on performance when compared to the Windows registry and DLL hell. The work in Windows 7 (started with Vista) did a lot to improve the reliability and consistency of the registry, and they also did a lot to minimise and remove DLL hell, which is why Win7 is a lot better at providing a consistent experience.

I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: ensure that the files are as defragmented as possible when placing them onto the disk, starting from the middle of the disk and working your way outwards (instead of at the start, which is where DOS and Windows work). This works well as long as you have 10% disk space left; on the Mac especially, if you go past this and nearly fill your drive, then performance goes off a cliff's edge. From my own personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up hard disk space, the performance of the system never recovered; I found my Mac would constantly thrash the disks even with plenty of space, so I would always recommend keeping at least 10 GB free, with 20 GB being the sweet spot.


My main machine has been a Mac since around 2005, and in my experience none of these machines experienced a slowdown over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files and many of the resources are bundled with the application (i.e. DLLs). This not only makes it easier to back up but, I believe, has a big impact on performance when compared to the Windows registry and DLL hell. The work in Windows 7 (started with Vista) did a lot to improve the reliability and consistency of the registry, and they also did a lot to minimise and remove DLL hell, which is why Win7 is a lot better at providing a consistent experience.

I think you have a fundamental misunderstanding of how OS X and UNIX applications work. First, don't confuse traditional UNIX applications with OS X applications; Apple handles its application bundles in a much different way than its FreeBSD upstream does. Second, most programs installed on your Mac don't have libraries bundled with them. They are more self-contained and easier to back up and restore than most Windows applications, but that's mostly due to system architecture, not a conscious decision by each application developer. Finally, your assumption that "DLL Hell" is still wreaking havoc in Windows is a little outdated. It is true that Windows 7 doesn't suffer from this type of problem, but neither did Windows 2000 through Windows Vista. You can read a good, well-documented summary of the situation on Wikipedia.

I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: ensure that the files are as defragmented as possible when placing them onto the disk, starting from the middle of the disk and working your way outwards (instead of at the start, which is where DOS and Windows work). This works well as long as you have 10% disk space left; on the Mac especially, if you go past this and nearly fill your drive, then performance goes off a cliff's edge. From my own personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up hard disk space, the performance of the system never recovered; I found my Mac would constantly thrash the disks even with plenty of space, so I would always recommend keeping at least 10 GB free, with 20 GB being the sweet spot.

When an HFS+ partition fills beyond 90%, there is no recovering the performance by deleting files, as you noted. OS X won't ever automatically defragment your whole hard disk, but don't make the mistake of assuming that's the way other, similar file systems work as well. EXT2/3/4 and UFS are capable of recovering from the scenario you described. You can read more about this failure of HFS+, as well as some of its more egregious problems, in Ars Technica's Mac OS X 10.7 Lion review.


From my experience this is universal, regardless of what device you are using, from Windows to iOS.


I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".

Yeah, I started using that Janitor feature; even if it doesn't do the best job, it makes me feel better, lol. Just like CCleaner on Windows.


ALL OSes require maintenance, but in MY usage scenarios I have kept my Gentoo machine running for 5 years without refreshing, whereas Windows normally gets about 1 year between refreshes, if I can find the time.

