bug-cpio

Re: [Bug-cpio] cpio can't read large files


From: Sean Fulton
Subject: Re: [Bug-cpio] cpio can't read large files
Date: Sun, 15 Jan 2006 16:08:20 -0500
User-agent: Mozilla Thunderbird 1.0.7 (Windows/20050923)

I will give it a shot. Thank you very much for the help. I will report back how it works.

sean


Sergey Poznyakoff wrote:

Sean Fulton <address@hidden> wrote:

We also had an issue with tar/find combination as follows:

the index list is:
/directory
/directory/file1
/directory/file2

Cpio just backs up the names as listed, but tar would back up the contents of the directory (/directory/file1, /directory/file2) and then the listed files again.

Oh, yes, I forgot to mention this. By default, if tar encounters a
directory in the file list, it recursively descends into that
directory and backs up its contents. So, if you simply pipe the find
output to tar, each file gets added twice. To avoid this, pass the
--no-recursion option to tar. The corrected invocation from my
previous post is:

find $filesys -mount $SKIPSTRING -print |
tee $LISTFILE |
tar --rsh-command=/usr/bin/ssh \
   --one-file-system \
   -c -z -f $BU_HOST:/backups/$MYHOST/$DAY/$c_filesys.cpio.gz \
   -P -T - \
   --no-recursion \
   $filesys
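For anyone unsure of the effect, here is a minimal demonstration of the double-archiving problem (the /tmp paths and archive names are invented for illustration; GNU tar is assumed):

```shell
# Set up a throwaway directory with two files (hypothetical paths).
rm -rf /tmp/norec-demo && mkdir -p /tmp/norec-demo/dir
touch /tmp/norec-demo/dir/file1 /tmp/norec-demo/dir/file2
cd /tmp/norec-demo

# Without --no-recursion: tar descends into "dir" AND also adds the
# files named individually in the list, so each file is stored twice.
find dir -print | tar -c -T - -f dup.tar
tar -tf dup.tar

# With --no-recursion: tar adds exactly the names it reads, once each.
find dir -print | tar -c --no-recursion -T - -f clean.tar
tar -tf clean.tar
```

Listing dup.tar shows dir/file1 and dir/file2 twice; clean.tar lists each name once.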

Also, as I mentioned previously, if /directory had no contents, tar would skip it, which caused a lot of problems on restore because upload and temp directories weren't in the archive. Does the current version address that?

Tar archives every file or directory name it is given, regardless of
its contents.
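A quick way to check this yourself (the /tmp path and directory name are invented for the example; GNU tar assumed):

```shell
# An empty directory named in the file list is archived, not skipped.
rm -rf /tmp/empty-demo && mkdir -p /tmp/empty-demo/uploads
cd /tmp/empty-demo
printf 'uploads\n' | tar -c --no-recursion -T - -f out.tar
tar -tf out.tar    # the listing includes the empty "uploads" directory
```

On restore, the empty directory is recreated, so temp and upload directories survive a round trip through the archive.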

Regards,
Sergey


--
Sean Fulton
GCN Publishing, Inc.
Internet Design, Development and Consulting For Today's Media Companies
http://www.gcnpublishing.com
(914) 937-2451, x203





