Largest file you have ever seen (in size)?



Hey

What's the largest file (by size) you have ever encountered?

As a reminder, these are the maximum file sizes on some common (and not so common) file systems (a quick way to test one of these limits follows the list):

Tape for the Elektronika BK: 64 kB

FAT32: 4 GB

ext3: 2 TB

ext4: 16 TB

exFAT: 127 PB

HFS Plus: 8 EB

NTFS: 16 EB

GPFS: 512 YB
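If you want to see one of those limits in practice, FAT32's is the easiest to hit. A minimal sketch, assuming a FAT32-formatted volume mounted at /mnt/usb (the path is hypothetical):

    # Writing past 4 GiB - 1 on FAT32 fails with "File too large" (EFBIG),
    # so dd stops just short of the 5 GiB it was asked for.
    dd if=/dev/zero of=/mnt/usb/too-big.bin bs=1M count=5120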


You could just generate a .txt file with the same character over and over in it until it hits the maximum size, and then claim to be the victor of this question. The funny part is that it would compress down to something like 100 kB zipped, since it's just a repeating character plus header info.
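For what it's worth, that trick is a one-liner on any Unix-ish box. A sketch, assuming GNU coreutils and ~10 GB of free space (file names made up):

    # 10 GB of the same character: yes repeats "a" forever, head caps it.
    yes a | head -c 10G > huge.txt

    # DEFLATE crushes a single repeating pattern down to a tiny fraction.
    zip huge.zip huge.txt
    ls -lh huge.txt huge.zip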



or make a gigantic virtualbox system file...
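VirtualBox will happily preallocate one of those in a single command; a sketch (the file name and size are made up, and --size is in MB):

    # Fixed-size 2 TB virtual disk: one giant file on the host filesystem.
    VBoxManage createmedium disk --filename big.vdi --size 2000000 --variant Fixed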

If you want to waste your time just to post in this thread, go ahead.

Thing is, I doubt you have enough space, drives, and/or need for 512 YB...


I've dealt with some text files that are several hundred GB in size. They're mostly collections used for research.

I'm currently working with an 800 GB text file containing nothing but Tweets. My biggest problem is actually just finding somewhere to store all the data. I doubt many people have dealt with files > 3 TB, simply because the storage is difficult (unless you're using tapes, etc.).
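Files that size rule out anything that tries to load them whole; stream-oriented tools go through them in constant memory. A sketch, with tweets.txt as a stand-in name:

    # Count the Tweets without ever holding the file in RAM.
    wc -l tweets.txt

    # Filter while streaming; memory use stays flat regardless of file size.
    grep -F 'keyword' tweets.txt > matching-tweets.txt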



Are these just the Tweets themselves, or Tweets with metadata (person who sent it, ID of the Tweet, time, etc.)?

Just curious :)



Why do you guys have to make things so difficult?

"cat /dev/urandom > blob"

At least I would call it blob...
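One caveat: that one-liner runs until the disk is full or you Ctrl-C it. A bounded variant, assuming GNU head, if you want a specific size:

    # 100 GB of random bytes, then stop; head caps the stream.
    head -c 100G /dev/urandom > blob

And unlike the repeated-character file, random data won't compress at all, so the 100 GB is honest.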


300-something GB partition image as a single file.

Have several of them (though usually not so large) on the service machine's DAS (it stores whole disk images, or clients' stupid "important files", while their disks are being RMA'd).
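For reference, a sketch of how such an image is typically taken (the device and destination path are hypothetical):

    # Copy a client's whole disk into a single image file on the DAS;
    # conv=noerror,sync keeps going past bad sectors on a dying drive.
    dd if=/dev/sdb of=/das/client-disk.img bs=4M conv=noerror,sync status=progress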


I don't usually work with large files, but it was probably a RAW image file or a PSD file.


Around 200 GB, an uncompressed video file. That would be the easiest way to get a very large file, I think: a 1080p capture source runs at around 150 MB/s, which is 9 GB per minute, or roughly 13 TB per day. So leave it running for two and a half months or so and you would hit a petabyte of data.
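The back-of-the-envelope math, for anyone who wants to check it (decimal units, with 1 PB = 10^9 MB):

    echo '150*60' | bc                  # 9000 MB = 9 GB per minute
    echo '150*60*60*24' | bc            # 12960000 MB, ~13 TB per day
    echo '10^9 / (150*60*60*24)' | bc   # ~77 days of capture to reach 1 PB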


Are these just the Tweets themselves, or Tweets with metadata (person who sent it, ID of the Tweet, time, etc.)?

Just curious :)

Basically, yes. I have access to the Twitter gardenhose, which spits out roughly 10% of all Tweets as they are posted.

This page lists all the details which are included with the ~34M Tweets I get to analyse every day:

https://dev.twitter.com/docs/platform-objects/tweets
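As a rough sanity check on the volume (the per-Tweet JSON size here is my assumption, not from the post):

    # ~34M Tweets/day at an assumed ~2.5 kB of JSON each:
    echo '34000000 * 2500' | bc    # 85000000000 bytes, ~85 GB per day

which would put an 800 GB file on the order of ten days of the gardenhose.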


I did not understand a word you said....

He's probably talking about the 42 kB zip file. I actually have it on my machine, as I'd like to test it out one day. Details:

One example of a Zip bomb is the file 42.zip which is a zip file consisting of 42 kilobytes of compressed data, containing five layers of nested zip files in sets of 16, each bottom layer archive containing a 4.3 gigabyte (4 294 967 295 bytes; ~ 3.99 GiB) file for a total of 4.5 petabytes (4 503 599 626 321 920 bytes; ~ 3.99 PiB) of uncompressed data.

You can read more about it here: http://en.wikipedia.org/wiki/Zip_bomb
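For the curious, the numbers in that quote do check out: five layers of 16 gives 16^5 bottom-layer files of 2^32 - 1 bytes each:

    echo '16^5' | bc                # 1048576 files at the bottom layer
    echo '16^5 * 4294967295' | bc   # 4503599626321920 bytes, ~3.99 PiB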

