Long Term Use



#16 OP +bman

bman

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 03-January 03
  • Location: Ottawa, Ontario
  • OS: Windows 8.1 & Android

Posted 07 January 2013 - 03:01

Thanks guys, I guess I started a conversation more than a simple question lol

Probably won't do anything about it, I am quite light on app usage anyway.

But where do I find that janitor-like program?


#17 redvamp128

redvamp128

    Neowinian Senior

  • Joined: 06-October 01

Posted 07 January 2013 - 03:17

Thanks guys, I guess I started a conversation more than a simple question lol

Probably won't do anything about it, I am quite light on app usage anyway.

But where do I find that janitor-like program?



https://launchpad.ne...omputer-janitor


https://apps.ubuntu....er-janitor-gtk/

#18 +Karl L.

Karl L.

    xorangekiller

  • Tech Issues Solved: 15
  • Joined: 24-January 09
  • Location: Virginia, USA
  • OS: Debian Testing

Posted 07 January 2013 - 04:04

You might experience a slowdown on Linux as you add programs and services, but it should decrease as you remove things. There are some issues with uninstalling, however - when you install program x and it needs dependency y, it will install x and y. If you go to remove x one day, it won't necessarily remove y, so some manual cleanup there might be good.


I believe that most distros provide a mechanism for removing dependencies when they are no longer needed. I'm not overly familiar with the packaging systems of other distributions, beyond superficial use, but Debian's package manager keeps track of whether a package was installed manually or automatically. That way, if you install an application, the package you explicitly requested will be marked as manually installed and all other packages that it requires will be marked as automatically installed. Then if you choose to remove that package, APT knows that the automatically installed dependencies are no longer necessary. For example, you could run sudo apt-get remove vlc to uninstall VLC, then run sudo apt-get autoremove to remove all of its automatically installed dependencies.
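As a minimal sketch of that workflow (VLC is just the example from above; any manually installed application works the same way, and apt-mark is optional but handy for seeing how APT has marked things):

# Remove the application that was explicitly (manually) installed
sudo apt-get remove vlc

# Remove the automatically installed dependencies that are now orphaned
sudo apt-get autoremove

# Optionally, inspect how APT has marked your packages
apt-mark showmanual   # packages you asked for yourself
apt-mark showauto     # packages pulled in as dependencies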

If you suspect that you have applications installed that you no longer require, but you aren't exactly sure, APT has provisions to handle that too. You could either run sudo dpkg --get-selections to list all the packages installed on your system and decide for yourself what you no longer need, or use deborphan --guess-all to try to automatically determine which packages should be uninstalled regardless of their manual/automatic selection status. Both of these methods require you to know what each package does and should not be used unless you are sure you know what you are removing; otherwise you run a high risk of damaging your system. If used correctly, however, they can be very powerful. As a general rule of thumb when dealing with deborphan output, never remove a package whose name starts with lib unless you are absolutely sure it's unnecessary!
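A rough illustration of both approaches (deborphan lives in the repositories and is not installed by default, and its output must be reviewed by hand before you remove anything):

# List every package currently installed on the system
sudo dpkg --get-selections | less

# Guess at orphaned packages of every type, not just unused libraries
deborphan --guess-all

# Only after carefully reviewing that list could you feed it to APT, e.g.
# sudo apt-get remove $(deborphan --guess-all)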

Linux (and OS X, etc.) still require defrags; the only way they could avoid it would be if the filesystem defragged each file when it was modified (which is too slow, so nobody does it outside of certain areas).


As others have pointed out already, most modern file systems do not require explicit defragmentation. EXT2/3/4, HFS+, and UFS do not include online defragmentation utilities in their tool suites because it is largely unnecessary. Fragmentation can still be measured, however, using the file system check utility (fsck) provided with each file system. An EXT4 volume generally has fragmentation of less than 2% until it reaches over 95% capacity, at which point the file placement algorithms used to prevent fragmentation break down. (Incidentally, the fragmentation on my primary EXT4 volume is currently reported as 0.19%.) The other aforementioned file systems have similar, though not identical, characteristics.
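If you want to check a figure like that yourself, here is a rough sketch; the device name and mount point below are placeholders, e2fsck should only be run against an unmounted volume, and e4defrag only applies to EXT4:

# Read-only check of an unmounted EXT4 volume; the summary line reports the
# percentage of non-contiguous (i.e. fragmented) files
sudo e2fsck -fn /dev/sdb1

# On a mounted EXT4 volume, e4defrag can report a fragmentation score instead
sudo e4defrag -c /home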

#19 brn

brn

    Neowinian

  • Joined: 28-December 12
  • Location: Florida
  • OS: Ubuntu
  • Phone: HTC Aria

Posted 11 January 2013 - 06:48

I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".
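(If you prefer the terminal, the rough APT equivalents of those clean-up options are the following; this is just the stock tooling, nothing Ubuntu Tweak specific:)

# Delete cached .deb files left behind after installation
sudo apt-get autoclean   # only removes cached packages that are now obsolete
sudo apt-get clean       # empties the package cache entirely

# Remove automatically installed dependencies that nothing needs any more
sudo apt-get autoremove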

#20 REM2000

REM2000

    Neowinian Senior

  • Joined: 20-July 04
  • Location: UK

Posted 11 January 2013 - 10:42

Windows XP does suffer from this; I performed a lot of tests on my own equipment, and after approximately 6-9 months the machine would be a lot slower than when it was first installed.

Windows 7, however, I've found to be considerably better: the performance of the machine is pretty much stable for years at a time. It's something I discussed with a friend the other day who uses his machine for graphic design and gaming, and he said the same thing.

I have recently reinstalled my home PC with Windows 7 and decided to keep it as clean as possible (although this is not to say that the previous installation wasn't stable and fast; I was replacing HDDs on my home PC). I haven't installed Microsoft Office, as I'm moving towards Google Docs for reporting, documents and other office files. This machine is purely for games and virtualisation, and I expect it will run for a good few years until it's either replaced or a hardware component dies.

My main machine has been a Mac since around 2005, and in my experience none of these machines experienced a slowdown over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files, and many of the resources are bundled with the application (i.e. DLLs). This not only makes it easier to back up but, I believe, has a big impact on performance when compared to the Windows registry and DLL hell. The work in Windows 7 (started with Vista) did a lot to improve the reliability and consistency of the registry, and they also did a lot to minimise and remove DLL hell, which is why Win7 is a lot better at providing a consistent experience.

I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: ensure that files are as defragmented as possible when placing them onto the disk, starting from the middle of the disk and working outwards (instead of at the start, which is where DOS & Windows work). This works well as long as you have 10% disk space left; on the Mac especially, if you go past this and nearly fill your drive, then performance goes off a cliff's edge. From my own personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up disk space, the performance of the system never recovered; I found my Mac would constantly thrash the disks even with plenty of free space. So I would always recommend keeping at least 10GB free, with 20GB being the sweet spot.
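(If you want to keep an eye on that headroom from a terminal, a quick check on a Mac or Linux box is simply:)

# Show human-readable free space for the boot volume; keep it
# comfortably under ~90% used
df -h /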

#21 +Karl L.

Karl L.

    xorangekiller

  • Tech Issues Solved: 15
  • Joined: 24-January 09
  • Location: Virginia, USA
  • OS: Debian Testing

Posted 11 January 2013 - 19:20

My main machine has been a Mac since around 2005, and in my experience none of these machines experienced a slowdown over time, with one exception which I will come to a little later. I believe this is down to the fundamental way the Mac, and Unix as a whole, deals with applications: all settings are files, and many of the resources are bundled with the application (i.e. DLLs). This not only makes it easier to back up but, I believe, has a big impact on performance when compared to the Windows registry and DLL hell. The work in Windows 7 (started with Vista) did a lot to improve the reliability and consistency of the registry, and they also did a lot to minimise and remove DLL hell, which is why Win7 is a lot better at providing a consistent experience.


I think you have a fundamental misunderstanding of how OS X and UNIX applications work. First, don't confuse traditional UNIX applications with OS X applications: Apple handles its application bundles in a much different way than its FreeBSD upstream does. Second, most programs installed on your Mac don't have libraries bundled with them. They are more self-contained and easier to back up and restore than most Windows applications, but that's mostly due to system architecture, not a conscious decision by each application developer. Finally, your assumption that "DLL Hell" is still wreaking havoc in Windows is a little outdated. It is true that Windows 7 doesn't suffer from this type of problem, but neither did Windows 2000 through Windows Vista. You can read a good, well-documented summary of the situation on Wikipedia.

I will say the only exception to the rule with Macs and Unix machines when it comes to performance is, as mentioned earlier, to do with the file system. The method is a good one: ensure that files are as defragmented as possible when placing them onto the disk, starting from the middle of the disk and working outwards (instead of at the start, which is where DOS & Windows work). This works well as long as you have 10% disk space left; on the Mac especially, if you go past this and nearly fill your drive, then performance goes off a cliff's edge. From my own personal experience, I have accidentally filled a Mac's hard disk a couple of times, and both times, even after freeing up disk space, the performance of the system never recovered; I found my Mac would constantly thrash the disks even with plenty of free space. So I would always recommend keeping at least 10GB free, with 20GB being the sweet spot.


When an HFS+ partition fills beyond 90%, there is no recovering the performance by deleting files, as you noted. OS X won't ever automatically defragment your hard disk, but don't make the mistake of assuming that's how other, similar file systems behave as well. EXT2/3/4 and UFS are capable of recovering from the scenario you described. You can read more about this failure of HFS+, as well as some of its more egregious problems, in Ars Technica's Mac OS X 10.7 Lion review.

#22 Geoffrey B.

Geoffrey B.

    LittleNeutrino

  • Tech Issues Solved: 6
  • Joined: 25-July 05
  • Location: Newark, Ohio
  • OS: Windows 8.1
  • Phone: Nokia Lumia 928

Posted 11 January 2013 - 19:22

From my experience this is universal, regardless of what device you are using, from Windows to iOS.

#23 OP +bman

bman

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 03-January 03
  • Location: Ottawa, Ontario
  • OS: Windows 8.1 & Android

Posted 11 January 2013 - 20:16

I have never experienced any slowdown, but then again, I'm anal about what I install and always remove unneeded packages. Ubuntu Tweak has a 'Janitor' function that removes a lot of crap, and Synaptic has an option to remove packages after installation. Not sure about KDE or other systems since I'm an "Ubuntu Guy".


Yeah, I started using that Janitor feature; even if it doesn't do the best job, it makes me feel better lol. Just like CCleaner on Windows.

#24 tim_s

tim_s

    Default

  • Joined: 07-January 13
  • OS: OSX (Macbook Pro i7), Windows 7 (Gaming), Gentoo
  • Phone: Samsung Galaxy SIII, iPhone 4s

Posted 14 January 2013 - 20:10

ALL OSes require maintenance, but in MY usage scenarios I have kept my Gentoo machine running for 5 years without refreshing, whereas with Windows it's normally 1 year between refreshes, if I can find the time.
