
Few basic questions


7 replies to this topic

#1 Guth

    Alba Gu Bráth

  • Joined: 30-December 05
  • Location: Scotland
  • OS: Windows 8 Pro
  • Phone: Nokia Lumia

Posted 06 January 2013 - 13:58

Just looking for answers to a few things I've always wondered about but Google hasn't helped with.

1. Is a long uptime bad for performance? (Google only returns talk about servers.)
I sometimes don't restart my PC for a week or two on end. Is this bad for performance? Will it make my computer slow down?
I sometimes think it feels snappier after a restart, but that may be placebo.


2. Why does restarting sometimes fix errors, especially after installing a program? Also, why do some programs require restarts and some don't?

3. Dual core processors: is this like 2 CPUs in one? Is a 2GHz dual core the same as having 2x 1GHz CPUs?

4. Defragging. Why isn't this so important anymore? Also, what "frags" the computer if it needs defragged lol?

5. What's more important, the amount of RAM or the speed at which it works?

6. At school we learned how data is stored: 1s and 0s, for numbers and letters.
I don't get how pictures are stored as 1s and 0s. How does colour work lol
< got a google answer for this one
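For anyone finding this thread later, the usual answer can be sketched in a few lines of Python. This is a toy illustration only: real image formats like PNG or JPEG add compression on top, but the core idea is that each pixel's colour is three numbers (red, green, blue), each fitting in one byte, so every colour is ultimately just 24 bits.

```python
def pack_rgb(r, g, b):
    """Pack three 0-255 channel values into a single 24-bit integer."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(pixel):
    """Recover the (r, g, b) channels from a packed 24-bit pixel."""
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

orange = pack_rgb(255, 165, 0)
print(bin(orange))           # the actual 1s and 0s that get stored
print(unpack_rgb(orange))    # -> (255, 165, 0)
```

A whole picture is then just a grid of these packed numbers, one per pixel.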

Sorry, I know these are really basic, but I've just always wondered.


#2 Hum

    totally wAcKed

  • Tech Issues Solved: 10
  • Joined: 05-October 03
  • Location: Odder Space
  • OS: Windows XP, 7

Posted 06 January 2013 - 14:06

Long on-times keep the system heated up -- which could wear parts out faster.

Heat is not good for a drive's magnetic surface.

I would leave it off when you won't be using the PC for a couple of hours -- unless you're doing a download.

A once-a-day restart may be good, because it can shut off sneaky programs running in the background that didn't completely stop when you exited them.

Fixing errors may require taking out bad Registry entries -- I use CCleaner to keep it clean.

Running an error check may put back missing entries or fix bad ones.

You don't defrag SSDs, as far as I know.

I still defrag a regular hard drive.

It gets 'fragmented' because it is random-access storage -- the system may write anywhere there is empty space.

So larger files may get written to 5 or 6 places on the drive.

I believe the file index then needs to record an entry for each fragment location, slowing down file access.

Over time this can become quite a tangled mess, stressing the mechanical parts of your drive as the head searches back and forth.


3. Dual core processors: is this like 2 CPUs in one? Is a 2GHz dual core the same as having 2x 1GHz CPUs?


Dual cores should be a little faster than 2 separate processors, since the electrons have a shorter distance to travel, less path switching, etc.

I would just refer to the Benchmarks for the CPUs that you want to compare:

http://www.cpubenchmark.net/

5. What's more important, the amount of RAM or the speed at which it works?


I would say, both. ;)

#3 Nick H.

    Neowinian Senior

  • Tech Issues Solved: 16
  • Joined: 28-June 04
  • Location: Switzerland

Posted 06 January 2013 - 14:13

1. Is a long uptime bad for performance? (Google only returns talk about servers.)
I sometimes don't restart my PC for a week or two on end. Is this bad for performance? Will it make my computer slow down?
I sometimes think it feels snappier after a restart, but that may be placebo.

Sometimes if a program doesn't shut down correctly, some of the resources it was using will still be taken up even though the program isn't running anymore. A restart is sure to free up those resources. (At least, I think that's correct.)

2. Why does restarting sometimes fix errors, especially after installing a program? Also, why do some programs require restarts and some don't?

Some updates require a restart of your system because they modify system files that are currently in use. Restarting the computer frees those files, allowing them to be modified.

I can't answer question 3. I've asked the guys here about processors before, though, and they've been able to explain it to me.

4. Defragging. Why isn't this so important anymore? Also, what "frags" the computer if it needs defragged lol?

You have to understand how a mechanical HDD works. An HDD is made up of separate disks that are always spinning, and a magnetic head that reads and writes information to the disks. However, let's say you have a file that is 4GB. The magnetic head doesn't wait for the disk to spin to a location that has exactly 4GB of free space. Instead, it drops parts of the file into any free space it comes across first. This means the file is spread out around the disk; it is fragmented. Defragging the HDD is basically telling it to move the pieces of each file around so that the entire file is in one location.
You're right, it's not as necessary as it once was, especially with SSDs, but as I've avoided that technological step for the moment I haven't looked into why you don't need to defrag them. Again, someone else here should be able to answer that.
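That first-come-first-served behaviour can be shown with a toy model in Python (purely illustrative; real filesystems allocate far more cleverly, but the fragmentation effect is the same):

```python
# Toy disk: 12 blocks, each either free (None) or holding a file's name.
disk = [None] * 12

def write_file(name, size):
    """First-fit: put each block of the file in the first free slot found."""
    placed = []
    for i, slot in enumerate(disk):
        if slot is None and len(placed) < size:
            disk[i] = name
            placed.append(i)
    return placed

def delete_file(name):
    for i, slot in enumerate(disk):
        if slot == name:
            disk[i] = None

write_file("A", 4)           # fills blocks 0-3
write_file("B", 4)           # fills blocks 4-7
delete_file("A")             # leaves a 4-block hole at the front
locs = write_file("C", 6)    # fills the hole, then spills past B
print(locs)                  # -> [0, 1, 2, 3, 8, 9]: "C" is fragmented
```

Defragging is the step that would shuffle "C" back into one contiguous run so the head reads it in a single pass.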

5. What's more important, the amount of RAM or the speed at which it works?

I've never really understood that myself. I look forward to reading an answer from someone.

#4 OP Guth

    Alba Gu Bráth

  • Joined: 30-December 05
  • Location: Scotland
  • OS: Windows 8 Pro
  • Phone: Nokia Lumia

Posted 06 January 2013 - 14:27

Thanks for the answers guys! Really interesting.

I never knew that HDDs just put the data anywhere! That's so interesting.

Computers are so amazing. I figure myself tech savvy (as in I can use them quite well), but these things never cease to amaze me.

So I should still defrag on Windows 8? I read somewhere it wasn't so important now, but I never thought about it at the time; I guess it was talking about SSDs.

I have a 1TB HDD as my Windows install disk with only 30% free now, so I think I will do a defrag tonight :)

#5 Hardcore Til I Die

    Neowinian Senior

  • Joined: 18-February 07
  • Location: England

Posted 06 January 2013 - 14:39

3. Dual core processors: is this like 2 CPUs in one? Is a 2GHz dual core the same as having 2x 1GHz CPUs?

4. Defragging. Why isn't this so important anymore? Also, what "frags" the computer if it needs defragged lol?


3 - a single core processor can only work on one instruction stream at a time. A dual core can work on two at a time, a tri-core three, a quad core four, a hexa-core six, etc. This means an application can use "multi-threading" so the processor runs multiple things at the same time. A 2GHz dual core isn't the same as a single 4GHz core, but if software is written to take advantage of both cores (i.e. it makes use of multi-threading) its total throughput can approach that.

The other benefit is that two processing cores running at 50% speed would use less power than one screaming along as fast as it can.
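A rough Python sketch of the work-splitting idea above (illustrative only: CPython's GIL means pure-Python CPU-bound threads don't truly run in parallel, so real speedups there need multiple processes, but the way a job gets divided across workers is the same):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """One independent chunk of the overall job."""
    lo, hi = bounds
    return sum(range(lo, hi))

# Split one big job into two halves; the OS is free to schedule
# each worker on its own core.
halves = [(0, 500_000), (500_000, 1_000_000)]
with ThreadPoolExecutor(max_workers=2) as pool:
    total = sum(pool.map(partial_sum, halves))

print(total)   # same answer as doing it all in one go
assert total == sum(range(1_000_000))
```

Software that isn't written this way just leaves the second core idle, which is why a dual core doesn't automatically double performance.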

4. Defragging is just as important as it always has been if you want the best performance out of an HDD; it's just that HDDs are so fast nowadays that they're pretty efficient even when working with fragmented files. Ten years ago, drives were a lot slower and the overhead of working with fragmented files was more noticeable.


You're right, it's not as necessary as it once was, especially with SSDs, but as I've avoided that technological step for the moment I haven't looked into why you don't need to defrag them. Again, someone else here should be able to answer that.


SSDs have no moving parts, so a file is just as quick to access no matter where it is on the disk, whether it's spread across several places or all in one.

With an HDD it's quicker to access a file when it's all in one place, as the magnetic head doesn't have to keep moving around.

#6 Nick H.

    Neowinian Senior

  • Tech Issues Solved: 16
  • Joined: 28-June 04
  • Location: Switzerland

Posted 06 January 2013 - 14:42

SSDs have no moving parts, so a file is just as quick to access no matter where it is on the disk, whether it's spread across several places or all in one.

With an HDD it's quicker to access a file when it's all in one place, as the magnetic head doesn't have to keep moving around.

I knew the part about HDDs, and once I applied some logic to the idea of SSDs I came to the same conclusion. Thanks for clarifying. (Y)

#7 +Phouchg

    Resident Misanthrope

  • Tech Issues Solved: 9
  • Joined: 28-March 11
  • Location: Neowin Detainment Camp

Posted 06 January 2013 - 14:43

1. See above. These are called memory leaks: because of the complexity of sharing resources among all running programs, the OS thinks some resources are still in use, because the software never told it they're no longer needed (it was terminated prematurely, probably because of an error... or a careless developer just didn't bother, as happens). Eventually the system starts to take a while to keep up with all the bogus data.
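A minimal sketch of the leak pattern in Python: a long-lived cache that only ever grows, so memory stays claimed for as long as the process is up and only a restart (or a fix) releases it:

```python
# A long-lived cache that is written to but never evicted.
cache = {}

def handle_request(n):
    # Each "request" stashes a result and forgets to clean it up later.
    cache[n] = [0] * 1000
    return len(cache[n])

for i in range(100):
    handle_request(i)

print(len(cache))   # -> 100 entries still held in memory
```

Multiply that by weeks of uptime and many sloppy programs, and you get the gradual slowdown a restart appears to cure.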

2. Because some software developers are careless eejits. Save for driver installation and system updates, I allow no restart to take place and it works just fine.

3. It's two cores, both running at the stated 2 GHz, in a single package. Kind of like two separate processors, but still sharing certain resources like cache and... to put it simply... the scheduler, the thing that tells the processor what to do next.
Beware of hyperthreading, though - it reports twice as many cores as it really has, and gets away with it most of the time, because modern CPUs are very complex devices and can actually make progress on one task while waiting for data to arrive for another.

4. Good answers already by good people above.

5. For general use, the amount. RAM is already the fastest type of memory, except for the CPU's own cache (which is for temporary use, however). RAM bandwidth in modern computers exceeds 20 GB/s, and access time is a few tens of nanoseconds. There's little use in speeding that up when an average SSD is 100 times slower to read/write.
Case in point: I used to run 8 GB at 1333 MT/s CL6 (heavily OC'ed). I'm now running 16 GB at 1600 MT/s CL9 (20% "slower"), and there's no difference whatsoever.

#8 Mike

    Neowinian Senior

  • Joined: 11-August 02

Posted 06 January 2013 - 14:54

2. Restarting shouldn't fix anything, and shouldn't be used to fix anything except an issue with the OS itself. If a program has a memory leak, just restart that program: any resources a process was using or had reserved are freed when it terminates. If anything, restarting will slow things down, because the OS caches files it has read in RAM; after a restart that cache is empty, so things take (slightly) longer to open. That's why benchmarks list both cold start times and hot start times for applications.

5. More RAM will stop swapping to the hard drive, which speeds things up when you're hitting the limit of your RAM. Faster RAM speeds up storing and reading things in RAM itself, so both can affect performance. I would always go for the fastest RAM possible, and enough RAM plus a bit extra so your PC won't start swapping daily.