
Long Term Use

Posted

One of the things that always bothered me about Windows was that after a while things got slow or buggy.

Installing, uninstalling, and changing things causes the drive and the system to get cluttered, which is why it's good to wipe and start over once in a while.

It's also the reason I started minimizing the applications I use, and moving more things to the cloud/web.


My question, though, is: is Linux the same way, or does the system work differently than that? Will installing apps, removing apps, and so forth cause long-term problems like it does on Windows?


Posted

[quote]
Installing, uninstalling, and changing things causes the drive and the system to get cluttered, which is why it's good to wipe and start over once in a while.
[/quote]

Then use portable apps as much as you can. Problem solved. When I can avoid installing an app, I'm quite happy. I hate the registry.

Posted

Yeah, I already stated that I take care when choosing apps.

That was not my question.


Posted

Try defragging, although it should be set up automatically on Vista and later. I've never experienced any slowdowns over time since Vista. Also make sure you have enough free space (20 GB or more should easily suffice). Truthfully, I only found this noticeable on XP, not on earlier Windows versions like 2000 and 98.

Edit: Linux wouldn't/shouldn't slow down no matter how many applications you install. I find it keeps itself cleaner than Windows, although Vista, 7, and 8 drastically improved on this.


Posted

Linux won't require defragging since it uses a different type of filesystem.
Other than that, any operating system will be slowed down by an excess of background processes running. It's all about maintenance and keeping things lean. The whole "Windows rot" thing is not very true, I find; a loaded registry and used hard disk space don't affect performance much. That said, feel free to run Tweaking.com Windows Repair and to check the health of your hard drive with something like HDD Sentinel; you could be experiencing a bigger issue.


Posted

I've never personally experienced slowdowns on Linux.


Posted

You might experience a slowdown on Linux as you add programs and services, but it should go away as you remove things. There are some issues with uninstalling, however: when you install program X and it needs dependency Y, it will install both X and Y. If you remove X one day, it won't necessarily remove Y, so some manual cleanup there can be a good idea.
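A minimal sketch of that cleanup, assuming a Debian-based system (the package name here is just a placeholder):

[code]
# Remove program X; APT may leave its automatically installed dependency Y behind
sudo apt-get remove some-package

# Then ask APT to sweep up dependencies nothing depends on anymore
sudo apt-get autoremove
[/code]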

Posted

I don't really experience this issue in Windows or Linux these days. You can slow down any OS if you add too many startup items and background tasks. Linux applications do tend to uninstall more cleanly than Windows applications, though, thanks to unified package management.


Posted

[quote name='Michael Lacey' timestamp='1357450287' post='595435680']
Linux won't require defragging since it uses a different type of filesystem.
...
[/quote]
Linux (and OS X, etc.) still require defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).


Posted

[quote name='The_Decryptor' timestamp='1357510858' post='595437058']
Linux (and OS X, etc.) still require defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).
[/quote]That is not entirely correct. On HFS+ volumes, Mac OS X defragments files smaller than 20 MB on the fly. Defragmentation as such is not required on Mac OS X; Apple indicates as much ([url="http://support.apple.com/kb/HT1375"]http://support.apple.com/kb/HT1375[/url]) and independent reports agree ([url="http://osxbook.com/software/hfsdebug/fragmentation.html"]http://osxbook.com/s...gmentation.html[/url]).

Linux, depending on the file system, is less fragmentation-resistant, though ext4 is said to resist it well. Anyway, thankfully, SSDs will make defragmentation a maintenance operation of a bygone era.

Posted

I did a clean install of Windows, tweaked it to how I wanted it to look and run (no System Restore, no multiple users, etc.), added just a few basic programs like Office, and then ghosted the partition (to both a second hidden partition and a USB key). Now I play around with my computer as much as I want, even infecting it with viruses on purpose to see what damage they do, and in only a few minutes I can restore it back to a clean working system. I also never install programs that insist on starting up when Windows starts; if I can't disable them from auto-starting, I don't use them and find an alternative program that does the same thing. Only a hardware upgrade will make mine run faster.


Posted

[quote name='Newinko' timestamp='1357512157' post='595437082']
That is not entirely correct. On HFS+ volumes, Mac OS X defragments files smaller than 20 MB on the fly. Defragmentation as such is not required on Mac OS X; Apple indicates as much ([url="http://support.apple.com/kb/HT1375"]http://support.apple.com/kb/HT1375[/url]) and independent reports agree ([url="http://osxbook.com/software/hfsdebug/fragmentation.html"]http://osxbook.com/s...gmentation.html[/url]).

Linux, depending on the file system, is less fragmentation-resistant, though ext4 is said to resist it well. Anyway, thankfully, SSDs will make defragmentation a maintenance operation of a bygone era.
[/quote]

As mentioned, that's only for files 20 MB or smaller, and it's not part of the filesystem; it's OS logic (no functional difference to the end user, but there's a logical break). And ext4 is only resistant to fragmentation as long as files are pre-allocated (yay extents) to their final length (same with other file systems). If you write a solid 20 MB of data, then write 1 KB into the middle, it's not going to be able to actually place it in the middle of the 20 MB block; it'll be placed somewhere else.


Posted

[quote name='The_Decryptor' timestamp='1357514462' post='595437178']As mentioned, that's only for files 20 MB or smaller, and it's not part of the filesystem; it's OS logic (no functional difference to the end user, but there's a logical break).[/quote]Splitting the OS and its proprietary filesystem into two separate entities is a complicated proposition, seeing how HFS+ is strictly a Mac filesystem and its implementation has improved as OS X evolved (is HFS+ journaling an OS feature or a filesystem feature, for instance? If a new NTFS feature such as quotas or compression ships with a specific version of Windows and is not backported, does it matter that it is technically ntfs.sys and not Windows, when one can't be used without the other?). The defragmentation-on-copy is implemented at the kernel level. OS X uses an assortment of other features, such as delayed allocation, to make defragmentation a moot point. In normal circumstances there is simply not a sufficient performance gain to justify the wasted time and extensive disk activity that defragmentation causes.

[quote]And ext4 is only resistant to fragmentation as long as files are pre-allocated (yay extents) to their final length (same with other file systems). If you write a solid 20 MB of data, then write 1 KB into the middle, it's not going to be able to actually place it in the middle of the 20 MB block; it'll be placed somewhere else.[/quote]Even if a terribly written program decided to do that, defragmentation would probably do a Linux system more harm than good overall, unless the tool were extremely well written and took into account that files are spread out deliberately as a strategy to resist fragmentation and improve seek times. The benefits of manual defragmentation, at least under modern versions of Linux and especially Mac OS X, are not readily apparent under normal use. Even under Windows, the built-in defragmentation tool has become less and less thorough, because there is a time-spent/performance-gained trade-off at play.

Posted

To me it's a myth. If it were necessary to reformat all the time, I would have just stopped using computers altogether.
I do way too many tweaks to be redoing them all the time, nor do I like reinstalling.
People just assume it works because the computer is faster after a fresh install. Well, obviously: there's nothing on it.
It's very simple to keep your computer in top-notch shape without the need to reformat.
Most people whose computers get sluggish have a crapload of things that start up with Windows and run in the background.
The best thing to do is think: do I really need that? Is it worth the RAM it's using? And so on.
Rule of thumb: if it's important, keep it. If it's crap, get rid of it. Simple as that.

Posted

Well, Linux (Ubuntu) has a janitor program that can remove old unused kernels as well as packages that are no longer needed. Also, when you update, it will tell you if some packages are no longer needed. As for fragmentation: the way Linux handles files, a defragmenter is not really a necessity. (I personally still have an install from five years ago on an 800 MHz PIII with 512 MB of memory and a slow hard drive, and it is still fast.)

Not to mention, removing leftovers is easier.

Show hidden files, then go into your home directory, find the folder for the application you removed, and delete it. Settings files are usually the only thing an application keeps when removed, and they take up no more than a few kilobytes anyhow.
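A quick sketch of that cleanup from a terminal (the folder name here is hypothetical; check what is actually in your home directory first):

[code]
# List hidden files and folders in your home directory
ls -a ~

# A removed app often leaves a small settings folder behind,
# e.g. ~/.someapp -- delete it once you're sure you no longer need it
rm -r ~/.someapp
[/code]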


Posted

Thanks guys, I guess I started a conversation more than I asked a simple question, lol.

Probably won't do anything about it; I am quite light on app usage anyway.

But where do I find that janitor-like program?


Posted

[quote name='bman' timestamp='1357527717' post='595437466']
Thanks guys, I guess I started a conversation more than I asked a simple question, lol.

Probably won't do anything about it; I am quite light on app usage anyway.

But where do I find that janitor-like program?
[/quote]


[url="https://launchpad.net/ubuntu/precise/+package/computer-janitor"]https://launchpad.ne...omputer-janitor[/url]


https://apps.ubuntu.com/cat/applications/precise/computer-janitor-gtk/
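If you'd rather install it from a terminal, something like this should work on Ubuntu releases that still carry the package (the package name is taken from the links above):

[code]
# Install the graphical Computer Janitor tool
sudo apt-get install computer-janitor-gtk
[/code]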


Posted

[quote name='cybertimber2008' timestamp='1357486953' post='595436332']
You might experience a slowdown on Linux as you add programs and services, but it should go away as you remove things. There are some issues with uninstalling, however: when you install program X and it needs dependency Y, it will install both X and Y. If you remove X one day, it won't necessarily remove Y, so some manual cleanup there can be a good idea.
[/quote]

I believe most distros provide a mechanism for removing dependencies once they are no longer needed. I'm not overly familiar with the packaging systems of other distributions beyond superficial use, but Debian's package manager keeps track of whether a package was installed manually or automatically. That way, if you install an application, the package you explicitly requested is marked as manually installed and all the other packages it requires are marked as automatically installed. Then, if you choose to remove that package, APT knows that the automatically installed dependencies are no longer necessary. For example, you could run [i]sudo apt-get remove vlc[/i] to uninstall VLC, then run [i]sudo apt-get autoremove[/i] to remove all of its automatically installed dependencies.

If you suspect that you have applications installed that you no longer require, but you aren't exactly sure, APT has provisions to handle that too. You could either run [i]sudo dpkg --get-selections[/i] to list all the packages installed on your system and decide for yourself what you no longer need, or use [i]deborphan --guess-all[/i] to try to automatically determine which packages should be uninstalled regardless of their manual/automatic status. Both of these methods require you to know what each package does and should not be used unless you are sure of what you are removing, or you run a high risk of damaging your system. Used correctly, however, they can be very powerful. As a general rule of thumb when dealing with [i]deborphan[/i] output, never remove a package whose name starts with [i]lib[/i] unless you are absolutely sure it's unnecessary!
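For instance, a quick way to inspect that manual/automatic bookkeeping yourself, assuming an APT recent enough to ship [i]apt-mark[/i]:

[code]
# Packages you explicitly asked for
apt-mark showmanual

# Packages pulled in automatically as dependencies
apt-mark showauto

# Guess at orphaned packages; treat anything starting with "lib" with care
deborphan --guess-all
[/code]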

[quote name='The_Decryptor' timestamp='1357510858' post='595437058']
Linux (and OS X, etc.) still require defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).
[/quote]

As others have pointed out already, most modern file systems do not require explicit defragmentation. EXT2/3/4, HFS+, and UFS do not include online defragmentation utilities in their tool suites because defragmentation is largely unnecessary. They can be explicitly defragmented, however, using the file system check utility ([i]fsck[/i]) provided with each file system. An EXT4 volume generally stays below 2% fragmentation until it fills beyond 95% capacity, at which point the file placement algorithms used to prevent fragmentation break down. (Incidentally, the fragmentation on my primary EXT4 volume is currently reported as 0.19%.) The other aforementioned file systems have similar, though not identical, characteristics.
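If you want to check your own numbers, here is a minimal sketch (the device name is a placeholder, and [i]fsck[/i] should be run against an unmounted volume):

[code]
# fsck's summary line reports something like "x/y files (z% non-contiguous)"
sudo fsck.ext4 -fn /dev/sdXN

# With newer e2fsprogs, e4defrag can score fragmentation on a mounted filesystem
sudo e4defrag -c /home
[/code]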


Posted

I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".


Posted

Windows XP does suffer from this. I performed a lot of tests on my own equipment, and after approximately 6-9 months the machine would be a lot slower than when it was first installed.

Windows 7, however, I've found to be considerably better: the performance of the machine stays pretty much stable for years at a time. It's something I discussed the other day with a friend who uses his machine for graphic design and gaming, and he said the same thing.

I recently re-installed my home PC with Windows 7 and decided to keep it as clean as possible (not that the previous installation wasn't stable and fast; I was replacing hard drives). I haven't installed Microsoft Office, as I'm moving towards Google Docs for reporting, documents, and other office files. This machine is purely for games and virtualisation, and I expect it will run for a good few years until it's either replaced or a hardware component dies.

My main machine has been a Mac since around 2005, and in my experience none of these machines slowed down over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files, and many of the resources (the equivalent of DLLs) are bundled with the application. This not only makes backup easier but, I believe, has a big impact on performance compared with the Windows registry and DLL hell. The work done in Windows 7 (started in Vista) greatly improved the reliability and consistency of the registry and did a lot to minimise DLL hell, which is why Windows 7 is much better at providing a consistent experience.

I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: keep files as unfragmented as possible when placing them onto the disk, starting from the middle of the disk and working outwards (instead of from the start, which is where DOS and Windows write). This works well as long as you have 10% of your disk space left; on the Mac especially, if you go past this and nearly fill your drive, performance goes off a cliff edge. From personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up disk space, the performance of the system never recovered; the Mac would constantly thrash the disks even with plenty of free space. So I would always recommend keeping at least 10 GB free, with 20 GB being the sweet spot.


Posted

[quote name='REM2000' timestamp='1357900967' post='595447122']
My main machine has been a Mac since around 2005, and in my experience none of these machines slowed down over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files, and many of the resources (the equivalent of DLLs) are bundled with the application. This not only makes backup easier but, I believe, has a big impact on performance compared with the Windows registry and DLL hell. The work done in Windows 7 (started in Vista) greatly improved the reliability and consistency of the registry and did a lot to minimise DLL hell, which is why Windows 7 is much better at providing a consistent experience.
[/quote]

I think you have a fundamental misunderstanding of how OS X and UNIX applications work. First, don't confuse traditional UNIX applications with OS X applications; Apple handles its application bundles much differently than its FreeBSD upstream does. Second, most programs installed on your Mac don't have libraries bundled with them. They [i]are[/i] more self-contained and easier to back up and restore than most Windows applications, but that's mostly due to system architecture, not a conscious decision by each application developer. Finally, your assumption that "DLL Hell" is still wreaking havoc in Windows is a little outdated. It [i]is[/i] true that Windows 7 doesn't suffer from this type of problem, but neither do Windows 2000 through Windows Vista. You can read a good, well-documented summary of the situation [url="http://en.wikipedia.org/wiki/DLL_Hell"]on Wikipedia[/url].

[quote name='REM2000' timestamp='1357900967' post='595447122']
I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: keep files as unfragmented as possible when placing them onto the disk, starting from the middle of the disk and working outwards (instead of from the start, which is where DOS and Windows write). This works well as long as you have 10% of your disk space left; on the Mac especially, if you go past this and nearly fill your drive, performance goes off a cliff edge. From personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up disk space, the performance of the system never recovered; the Mac would constantly thrash the disks even with plenty of free space. So I would always recommend keeping at least 10 GB free, with 20 GB being the sweet spot.
[/quote]

When an HFS+ partition fills beyond 90%, there is no recovering the performance by deleting files, as you noted. OS X won't ever automatically defragment your hard disk, but don't make the mistake of assuming that's how other, similar file systems behave. EXT2/3/4 and UFS are capable of recovering from the scenario you described. You can read more about this failure of HFS+, as well as some of its more egregious problems, in [url="http://arstechnica.com/apple/2011/07/mac-os-x-10-7/12/#hfs-problems"]Ars Technica's Mac OS X 10.7 Lion review[/url].


Posted

From my experience this is universal, regardless of what device you are using, from Windows to iOS.


Posted

[quote name='brn' timestamp='1357886915' post='595446966']
I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".
[/quote]

Yeah, I started using that Janitor feature. Even if it doesn't accomplish much, it makes me feel better, lol. Just like CCleaner on Windows.


Posted

All OSes require maintenance, but in my usage scenarios I have kept my Gentoo machine running for 5 years without refreshing, whereas Windows normally gets about 1 year between refreshes, if I can find the time.
