Mail Archives: djgpp/1996/11/09/21:49:55

Message-Id: <199611100241.PAA26841@papaioea.manawatu.gen.nz>
Comments: Authenticated sender is <malcolm AT mail DOT manawatu DOT gen DOT nz>
From: "Malcolm Taylor" <malcolm AT manawatu DOT gen DOT nz>
Organization: Grafik Software
To: mert0407 AT sable DOT ox DOT ac DOT uk (George Foot), djgpp AT delorie DOT com
Date: Sun, 10 Nov 1996 15:38:28 +1200
Subject: Re: Why not to use 'tar' before packing DJGPP?
Reply-to: malcolm AT manawatu DOT gen DOT nz

> Robert Babcock (babcock AT shell1 DOT cybercom DOT net) wrote:
> : mert0407 AT sable DOT ox DOT ac DOT uk (George Foot) writes:
> : If the goal is to shrink the distribution files without requiring the
> : use of a utility which may not be easily available to DOS users, you
> : could first make uncompressed ZIP files, then compress those.
> 
> Sorry, I don't really understand tar (yes, I'm a Dos user...), but I 
> thought the point of the original article was that tar could achieve 
> better compression ratios than zip? The quoted figures certainly looked 
> impressive...

All tar does is take each file, together with its name, attributes, date and so
on, and concatenate them into one big file (hence "archive"). It doesn't do
any compression at all. On Unix you then compress this tar file with
compress or gzip (GNU zip). The reason this can get significantly better
compression than zip is that zip compresses each file separately,
ignoring any similarity between files. In source distributions that
similarity can be considerable.
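
As a rough illustration only, here is a minimal sketch of the tar-then-compress
idea in Python (the file names are made up; the actual discussion is about the
command-line tools tar and gzip, not Python):

  import gzip
  import shutil
  import tarfile

  sources = ["foo.c", "foo.h", "bar.c"]   # hypothetical file names

  # tar only bundles the files plus their names, attributes and dates
  # into one big file -- no compression happens at this stage.
  with tarfile.open("src.tar", "w") as tar:
      for name in sources:
          tar.add(name)

  # The compression is a separate step over the whole archive at once,
  # so redundancy shared between the files can be exploited.
  with open("src.tar", "rb") as f_in, gzip.open("src.tar.gz", "wb") as f_out:
      shutil.copyfileobj(f_in, f_out)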
The proposition above is to zip all the files with no compression (which
just emulates what tar does) and then zip that big file again, this time
with compression.
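
The same idea, again as a rough Python sketch with made-up file names (in
practice you would simply run zip once with compression turned off and then
zip the result a second time):

  import zipfile

  sources = ["foo.c", "foo.h", "bar.c"]   # hypothetical file names

  # Step 1: emulate tar -- store everything in one archive, no compression.
  with zipfile.ZipFile("bundle.zip", "w", compression=zipfile.ZIP_STORED) as z:
      for name in sources:
          z.write(name)

  # Step 2: zip the big file again, this time with compression, so that
  # similarity between the original files can help the compressor.
  with zipfile.ZipFile("bundle2.zip", "w", compression=zipfile.ZIP_DEFLATED) as z:
      z.write("bundle.zip")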

I would still go for a new archiver :) The question is what the minimum 
requirements for this would be. As in: what's the smallest amount of 
physical memory we can expect on the user's computer?

Malcolm
