How can we gzip every file separately?
I don't want to have all of the files combined into one big tar archive.
You can use gzip *
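A minimal sketch of the behavior, using throwaway files in a temporary directory (the file names are made up for illustration). Each file is compressed into its own .gz, and the originals are replaced:

```shell
# Sketch: gzip * compresses each file into its own .gz archive,
# replacing the originals (add -k to keep them).
tmp=$(mktemp -d)
cd "$tmp"
echo alpha > one.txt
echo beta  > two.txt
gzip *
ls   # one.txt.gz  two.txt.gz
```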
Note: add the -k (--keep) option to keep the original files.

An easy and very fast answer that will use all your CPU cores in parallel:
parallel gzip ::: *
GNU Parallel is a fantastic tool that deserves far more use in a world where CPUs are gaining more cores rather than more speed. There are loads of examples that we would all do well to take 10 minutes to read... here
find . -name XYZ -print0 | parallel -0 gzip

After seven years, this highly upvoted comment still doesn't have its own full-fledged answer, so I'm promoting it now:
gzip -r .
This has two advantages over the currently accepted answer: it works recursively if there are any subdirectories, and it won't fail with "Argument list too long" if the number of files is very large.
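A quick sketch of the recursive behavior, using a throwaway directory tree (all paths below are invented for illustration):

```shell
# Sketch: gzip -r descends into subdirectories and compresses
# every file it finds, each into its own .gz.
tmp=$(mktemp -d)
mkdir -p "$tmp/logs/archive"
echo "top"    > "$tmp/a.txt"
echo "nested" > "$tmp/logs/archive/b.txt"
gzip -r "$tmp"
ls "$tmp"               # a.txt.gz  logs
ls "$tmp/logs/archive"  # b.txt.gz
```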
Add -k if you want to keep the original files.

If you want to gzip every file recursively, you could use find piped to xargs:
$ find . -type f -print0 | xargs -0r gzip
gzip -9r .

find . -type f -print0 | xargs -0r gzip is better. To match the behavior of gzip *, you may also need -maxdepth 1 in find.

The following command can be run multiple times inside a directory (without "already has .gz suffix" warnings) to gzip whatever is not already gzipped:
find . -maxdepth 1 -type f ! -name '*.gz' -exec gzip "{}" \;
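A small sketch of why this is safe to repeat (throwaway files for illustration): the ! -name '*.gz' test skips files that were already compressed, so a second run only touches new files.

```shell
# Sketch: the first run gzips a.txt; the second run skips a.txt.gz
# (no "already has .gz suffix" warning) and gzips only the new b.txt.
tmp=$(mktemp -d)
cd "$tmp"
echo one > a.txt
find . -maxdepth 1 -type f ! -name '*.gz' -exec gzip "{}" \;
echo two > b.txt
find . -maxdepth 1 -type f ! -name '*.gz' -exec gzip "{}" \;
ls   # a.txt.gz  b.txt.gz
```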
A more useful application of find is gzipping rolling logs: e.g. every day or every month you want to gzip the rolled logs but not the current logs.
# Considering that current logs end in .log and
# rolled logs end in .log.[yyyy-mm-dd] or .log.[number]
find . -maxdepth 1 -type f ! -name '*.gz' ! -name '*.log' -exec gzip "{}" \;
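To see the effect, here is a sketch with invented log file names: the current app.log is excluded by ! -name '*.log', while the rolled copies are compressed.

```shell
# Sketch: current *.log files are left alone; rolled logs
# (.log.<number> or .log.<date>) get gzipped.
tmp=$(mktemp -d)
cd "$tmp"
echo current > app.log
echo rolled  > app.log.1
echo dated   > app.log.2024-01-01
find . -maxdepth 1 -type f ! -name '*.gz' ! -name '*.log' -exec gzip "{}" \;
ls   # app.log  app.log.1.gz  app.log.2024-01-01.gz
```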