General archive commands


Table of Contents

cmd:tar & cmd:gzip 
Usage 
Help 
Help 
Update 
Backing up your data with tar? Fine, but forget compression 
Exclude files from tar 
How do I exclude a file from a tar archive? 
GNU tar and hidden directories 
Backing up whole disk 
How to remove a file from a tar archive 
Expand archive into directories 
Expand archive into directories 
Preserve file links 
Preserve file links 
Preserve file links 
Preserving symbolic link dirs 
Preserving symbolic link dirs 
Preserving symbolic link dirs 

cmd:tar & cmd:gzip 

Usage 

short form 

tar -cvzf tfile files
tar -xvzf tfile
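A complete round trip with the short form might look like this (the directory and file names below are placeholders, not from the original):

```shell
# Scratch directory so nothing real is touched (paths are placeholders).
mkdir -p /tmp/tar-demo/extracted
cd /tmp/tar-demo
echo "hello" > notes.txt
echo "world" > data.txt

# c = create, v = verbose, z = gzip, f = archive file name.
tar -cvzf tfile.tgz notes.txt data.txt

# t = list the contents without extracting.
tar -tvzf tfile.tgz

# x = extract; -C picks the target directory.
tar -xvzf tfile.tgz -C extracted
```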

traditional form 

And if your tar lacks support for the "z" flag, I strongly recommend using

gzip -d -c file.tar.gz | tar xvf -
tar cvf - files | gzip >file.tgz

instead of the two commands you suggest above, both for performance and disk-usage reasons.

for i in *gz ; do  gzip -d -c "$i" | tar xvf - ; done

Help 

Main operation mode:
 -t, --list              list the contents of an archive
 -x, --extract, --get    extract files from an archive
 -c, --create            create a new archive
 -d, --diff, --compare   find differences between archive and file system
 -r, --append            append files to the end of an archive
 -u, --update            only append files newer than copy in archive
 -A, --catenate          append tar files to an archive
Operation modifiers:
 -k, --keep-old-files       don't overwrite existing files when extracting
 -U, --unlink-first         remove each file prior to extracting over it
     --recursive-unlink     empty hierarchies prior to extracting directory
 -S, --sparse               handle sparse files efficiently
 -O, --to-stdout            extract files to standard output
Device selection and switching:
 -f, --file=ARCHIVE             use archive file or device ARCHIVE
Archive format selection:
 -z, --gzip, --ungzip               filter the archive through gzip
 -Z, --compress, --uncompress       filter the archive through compress
     --use-compress-program=PROG    filter through PROG (must accept -d)
Local file selection:
 -h, --dereference            dump instead the files symlinks point to
     --no-recursion           avoid descending automatically in directories
 -N, --newer=DATE             only store files newer than DATE
     --newer-mtime            compare date and time when data changed only
Informative output:
 -v, --verbose         verbosely list files processed
 -w, --interactive     ask for confirmation for every action
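As a quick illustration of the -d (--diff) modifier listed above, which compares an archive against the file system — the file names here are placeholders:

```shell
# Scratch setup; paths and file names are placeholders.
mkdir -p /tmp/tar-diff
cd /tmp/tar-diff
echo "v1" > config.txt
tar -cf snap.tar config.txt

# Nothing reported: archive and file system still match.
tar -df snap.tar

# After a change, -d exits non-zero and names the differing file.
echo "v2" > config.txt
tar -df snap.tar || echo "config.txt no longer matches the archive"
```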

Help 

taken from the Debian User's Guide:

Large-Scale Copying

tar -cSpf - /usr/local | tar -xvSpf - -C /destination

The first tar command will archive the existing directory and pipe it to the second. The second command will unpack the archive into the location you specify with -C.
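A local sketch of the same pipe (with placeholder paths) makes each side's role visible; to copy between machines, the second tar can be wrapped in something like ssh host 'tar -xSpf - -C /destination':

```shell
# Placeholder source and destination trees.
mkdir -p /tmp/copy-src /tmp/copy-dst
echo "sample" > /tmp/copy-src/file.txt

# The first tar writes an archive to stdout (-f -); the second reads it
# from stdin and unpacks under the -C directory. -S handles sparse
# files efficiently, -p preserves permissions.
(cd /tmp && tar -cSpf - copy-src) | tar -xSpf - -C /tmp/copy-dst
```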

Update 

Once you've performed a full backup of your files with tar, you can perform an incremental backup to keep the backup archive up-to-date.

Suppose you performed the full backup on Feb. 20, 2000, and a week later you wanted to add to the archive all the files that were created or altered since then. Use tar's -u option for this purpose. For example:

tar -uvf /home/bryan/backup/full-backup.tar /home/bryan

This command adds all the files that aren't present within the archive or that were altered since the creation of the backup archive (full-backup.tar).
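The full-then-incremental sequence can be sketched as follows (a stand-in directory replaces /home/bryan):

```shell
# Placeholder layout standing in for /home/bryan.
mkdir -p /tmp/bkp/home
echo "original" > /tmp/bkp/home/a.txt

# Full backup.
tar -cvf /tmp/bkp/full-backup.tar -C /tmp/bkp home

# A week later, a new file appears...
echo "added later" > /tmp/bkp/home/b.txt

# ...and -u appends only what is new or changed since the full backup.
tar -uvf /tmp/bkp/full-backup.tar -C /tmp/bkp home

# Listing shows both the original and the appended entries.
tar -tvf /tmp/bkp/full-backup.tar
```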

Backing up your data with tar? Fine, but forget compression 

It's possible to set up a system for backing up your data with the tar utility, but don't be tempted to use tar's built-in compression commands. Sure, you can create a tar archive that's compressed with gzip (to do so, type tar -czf followed by a space, the name of the archive you want to create, and the files you want to archive). But tar's update (-u) option won't work with compressed archives, so you won't be able to implement a backup system that makes periodic updates of only those files that have been created or changed since the last update.

Is this a horrendous flaw? Not really. Compression is widely discouraged for backups anyway: a single corrupted block in a gzip-compressed archive can make everything after it unreadable, whereas damage to an uncompressed tar archive usually costs you only the affected files. For mission-critical data, it's not worth the risk.
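If you want compressed copies anyway, one workaround (a sketch with placeholder paths, not from the original) is to keep the master archive uncompressed so -u keeps working, and compress a copy for storage:

```shell
# Placeholder paths throughout.
mkdir -p /tmp/wk
echo "data" > /tmp/wk/f.txt

# Keep the master archive uncompressed so -u keeps working.
tar -cf /tmp/wk/backup.tar -C /tmp/wk f.txt

# Compress a *copy* for storage; the master stays untouched.
gzip -c /tmp/wk/backup.tar > /tmp/wk/backup.tar.gz

# Later updates still go against the uncompressed master.
echo "more" > /tmp/wk/g.txt
tar -uf /tmp/wk/backup.tar -C /tmp/wk g.txt
```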