Mail Archives: djgpp/1996/11/09/21:49:55
> Robert Babcock (babcock AT shell1 DOT cybercom DOT net) wrote:
> : mert0407 AT sable DOT ox DOT ac DOT uk (George Foot) writes:
> : If the goal is to shrink the distribution files without requiring the
> : use of a utility which may not be easily available to DOS users, you
> : could first make uncompressed ZIP files, then compress those.
>
> Sorry, I don't really understand tar (yes, I'm a DOS user...), but I
> thought the point of the original article was that tar could achieve
> better compression ratios than zip? The quoted figures certainly looked
> impressive...
All tar does is take each file, along with its name, attributes, date
and so on, and concatenate them into one big file (hence "archive").
It doesn't do any compression at all. On Unix you then compress this
tar file with compress or gzip. The reason this can get significantly
better compression than zip is that zip compresses each file
separately, ignoring any similarity between files. In source
distributions that similarity can be quite great.
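As a rough sketch (assuming GNU tar, gzip and Info-ZIP's zip are
available, and using a made-up source directory djgpp-src just for
illustration), the two approaches look something like:

  tar cf djgpp.tar djgpp-src     (archive only, no compression)
  gzip djgpp.tar                 (compress the whole archive in one go)

  zip -r djgpp.zip djgpp-src     (compresses each file separately)

The gzip step sees one big stream, so repeated text shared between
files gets squeezed out; the zip run starts fresh for every file.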
The proposition above is to zip all the files with no compression
(this just emulates what tar does), then zip that big file again,
this time with compression.
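With Info-ZIP's zip that would be along these lines (again, the file
names are only illustrative):

  zip -r -0 inner.zip djgpp-src  (-0 = store only, no compression)
  zip -9 djgpp.zip inner.zip     (-9 = compress the stored archive)

The outer zip then gets the same cross-file benefit as gzipping a tar
file, while the recipient still only needs an unzip program.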
I would still go for a new archiver :) The question is, what would
the minimum requirements for this be? As in, what's the smallest
amount of physical memory we can expect on the user's computer?
Malcolm