"DR" <so*******************@yahoo.comwrote in message
news:eZ**************@TK2MSFTNGP02.phx.gbl...
Why is it substantially slower to load 50 GB of gzipped data (a 20 GB
gzipped file) than to load 50 GB of unzipped data? I'm using
System.IO.Compression.GZipStream, and it's not maxing out the CPU while
loading the gzip data! I'm using the default buffer of the stream that I
open on the 20 GB gzipped file and pass into the GZipStream ctor. Reading
through System.IO.Compression.GZipStream takes an hour, while just loading
a 50 GB file of plain data takes a few minutes!
Define "loading." Do you have 50+ GB of ram? But seriously, you're adding
a lot of extra processing when opening compressed data. With uncompressed
data, you're just retrieving as much as you can hold in memory (virtual or
otherwise)... or simply copying data into memory.
Try opening a 5 MB file; it should "load" about as fast as the 50 GB
uncompressed file. That's because you're not actually "loading" the whole
50 GB at once.
With the compressed file, on the other hand, every byte has to be run
through the decompressor before any real progress can be made.
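If the decompression itself turns out to be the bottleneck, the best you can do is keep it fed: give the underlying FileStream a large buffer so disk reads come in big chunks, and pull big chunks of uncompressed data out of the GZipStream per call. Here's a rough sketch of what I mean (the file name and the 64 KB buffer sizes are just placeholders, not something I've tuned against your data):

```csharp
using System;
using System.IO;
using System.IO.Compression;

class GzipReadDemo
{
    static void Main()
    {
        // "data.gz" stands in for your 20 GB file.
        using (var file = new FileStream("data.gz", FileMode.Open,
                   FileAccess.Read, FileShare.Read, 1 << 16)) // 64 KB disk reads
        using (var gzip = new GZipStream(file, CompressionMode.Decompress))
        {
            var buffer = new byte[1 << 16]; // 64 KB of *uncompressed* data per call
            long total = 0;
            int n;
            while ((n = gzip.Read(buffer, 0, buffer.Length)) > 0)
            {
                total += n; // process buffer[0..n) here instead of just counting
            }
            Console.WriteLine(total); // total uncompressed bytes seen
        }
    }
}
```

If the CPU still isn't maxed out with reads like this, the time is going somewhere else (disk seeks, or whatever you're doing with each chunk), not into GZipStream itself.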
Personally, if I had 50 GB of data that I needed to compress (I don't have
that much across two hard drives, by the way, and I can listen to music for
a week straight without hearing the same song twice), I wouldn't use a zip
format.
-Roger Frost