So I have a friend I've been banging my head against a brick wall with.
He is adamant that all digital data degrades over time.
He thinks that if you copy a digital file enough times, its 'quality' becomes poorer, even if there is no hardware or software fault involved.
He claims to have MP3s that have gotten worse over the years because he has copied them from one hard drive to a new one many times. This is his core belief: digital audio *WILL* degrade when copied, no matter what method is used, even if it's a perfect FLAC file. His example: copy it, say, 10 million times and you'll notice the quality loss (not re-encoded, not compressed, literally just copied).
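If it helps the argument, you can actually demonstrate this to him. Here's a minimal Python sketch (the file name "song.flac" is just a placeholder, and 10,000 copies stands in for his 10 million to keep it quick): it copies a file through a long chain of copies and checks after every single copy that the result is bit-for-bit identical to the original by comparing SHA-256 hashes.

```python
import hashlib
import shutil

def sha256(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

original = "song.flac"              # placeholder file name
reference = sha256(original)

names = ["copy_a.tmp", "copy_b.tmp"]
src = original
for i in range(10_000):             # bump this up if he insists on millions
    dst = names[i % 2]              # alternate between two temp files
    shutil.copyfile(src, dst)       # an ordinary byte-for-byte copy
    assert sha256(dst) == reference, f"copy {i} differs from the original!"
    src = dst                       # the next copy is made from this copy

print("every copy in the chain is bit-identical to the original")
```

If even one bit ever changed in any copy, the hash would change and the assert would fire. It never does, because copying is just reading bytes and writing the same bytes.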
The best counter-example I could come up with: imagine a torrent of an application. If it were to lose 'quality', it wouldn't just get subtly worse; corrupt an executable and it simply ceases to run at all. His counter-argument is that the app would still install and run, but would become less stable. Whaaaa??!!
He also thinks that file-sharing networks will, by design, degrade the 'quality' of any given file.
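Which is ironic, because file-sharing networks are built to guarantee the opposite. BitTorrent, for example, splits a file into fixed-size pieces, publishes a hash for every piece in the .torrent metadata, and a client throws away and re-downloads any piece that doesn't match. Here's a rough, simplified sketch of that idea (real clients use SHA-1 piece hashes from the .torrent file; the file names and 256 KiB piece size here are just illustrative):

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB pieces, a common choice in .torrent files

def piece_hashes(path):
    """Hash every fixed-size piece of a file, like a .torrent's piece list."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha1(piece).hexdigest())
    return hashes

def verify(path, expected_hashes):
    """Accept the download only if every piece matches its published hash."""
    return piece_hashes(path) == expected_hashes

# expected = piece_hashes("original.iso")   # published by the uploader
# verify("downloaded.iso", expected)        # any flipped bit -> mismatch -> piece is re-fetched
```

So a completed torrent is, by construction, a verified bit-for-bit match of what the uploader shared, no matter how many peers it passed through.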
Now, I've tried to point out that if this were the case, computers simply wouldn't work: they'd be corrupting data constantly, and after a few boots your machine would fail entirely, because information is constantly being copied back and forth between the hard drive and RAM. It's a friggin' fundamental rule of computing that digital data does not degrade when copied!
Can any of you think of better examples to get across to this person that the way he thinks about digital degradation is simply wrong?...
Then again, this is also a person who claims to have run a hard drive with the platters exposed for six months, with a copy of Windows XP on it, without any sort of error.... hmmmm.