The most common solution is to use find together with xargs to create the backup.

Something like this:

find /var/www/html/ -type f -name 'my*.csv' -print0 | xargs -0 tar cfvzP /backupdir/backupfile

The more files you have, especially more than about 1,000, the more likely this is to produce odd results:

  1. The number of files in the archive doesn't match the output of find.
  2. xargs splits the input into batches, so tar is invoked several times and each invocation overwrites the archive, leaving a much smaller tar.gz than expected (a quick way to see the batching is sketched after this list).
  3. Only the files from the last batch make it into the archive, so depending on your find output you may end up with just the final hundred or so files, or worse.
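You can see the batching for yourself without touching any real files. The sketch below is only an illustration: it feeds 5,000 fake names into xargs and counts how many separate invocations result. The use of seq, the count of 5,000, and the artificially low -s 4096 limit are assumptions chosen to make the splitting visible; with real paths the default limit is simply reached once you have enough files.

# Each output line is one separate invocation xargs would make,
# i.e. one separate "tar c" overwriting the archive.
# -s 4096 lowers the command-line size limit so the splitting shows
# up even with a small list of short names.
seq 1 5000 | sed 's/^/file-/' | xargs -s 4096 echo | wc -l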

If you have a very large number of files, say 4,000 to 10,000, you may also want to break the backup into several smaller archives; one way to do that is sketched below.
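A minimal sketch of that approach, assuming GNU split and GNU tar, no newlines in the file names, and illustrative values for the list location, the chunk size of 2,000 and the /backupdir target:

# Write the list of matching files, split it into chunks of 2000 names,
# and create one compressed archive per chunk.
find /var/www/html -type f -name 'my*.csv' -print > /tmp/csv-list.txt
split -l 2000 /tmp/csv-list.txt /tmp/csv-chunk.
for chunk in /tmp/csv-chunk.*; do
    tar -czf "/backupdir/backup-$(basename "$chunk").tar.gz" -T "$chunk"
done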

Here is a solution that works no matter how many files match (within reason):


find /var/www/html -type f -name 'my*.csv' -print0 | tar -czvf /backupdir/backupfile --null -T -
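To confirm the archive is complete, compare the number of files find reports with the number of entries in the archive. A hedged check, assuming GNU tar, no newlines in the file names, and the same paths as above:

# These two counts should match if every file made it into the archive.
find /var/www/html -type f -name 'my*.csv' | wc -l
tar -tzf /backupdir/backupfile | wc -l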
