[Bug-gnu-utils] Tar very slow with many files / hard links


From: Samuli Karkkainen
Subject: [Bug-gnu-utils] Tar very slow with many files / hard links
Date: Wed, 20 Sep 2000 22:28:55 +0300

When trying to read a largish set of files, the majority of which are hard
links, tar version 1.13.11 (from Red Hat 6.1) becomes very slow. The
problem appears when I do the following (*):

mount /backup
cp -ax / /backup/1
cp -al /backup/1 /backup/2
cp -al /backup/2 /backup/3
[...]
cp -al /backup/49 /backup/50
tar -c -f /dev/st0 /backup

So there are about 200,000 inodes, each with at least 50 directory
entries, making some 10,000,000 directory entries. The tar command
above starts fine, but at some point it becomes very, very slow and
consumes all available CPU. It takes days to finish, when it should
finish in a few hours.

A glance at the sources indicates that hard links are tracked in a
linked list, so every file with multiple links is compared against
every link recorded so far: a linear scan per file, and therefore
quadratic time over the whole archive.
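
For illustration, here is a minimal sketch of that pattern (hypothetical
code, not tar's actual source): each archived file whose link count is
greater than one is checked against every (device, inode) pair seen so
far, so N hard links cost on the order of N^2 comparisons. A hash table
keyed on (dev, ino) would make each lookup roughly constant time instead.

  #include <stdlib.h>
  #include <string.h>
  #include <sys/types.h>
  #include <sys/stat.h>

  /* Hypothetical sketch, not tar's actual code: one node per
     previously archived multiply-linked file.  */
  struct link_entry
    {
      dev_t dev;
      ino_t ino;
      char *name;               /* name the inode was archived under */
      struct link_entry *next;
    };

  static struct link_entry *link_list;  /* global singly linked list */

  /* Return the archive member name this inode was first stored under,
     or NULL if it has not been seen yet.  This is a linear scan, so
     with millions of recorded links each call gets slower and slower.  */
  const char *
  find_link (const struct stat *st)
  {
    struct link_entry *p;
    for (p = link_list; p; p = p->next)
      if (p->dev == st->st_dev && p->ino == st->st_ino)
        return p->name;
    return NULL;
  }

  /* Record a newly archived file so later links to it can be found.  */
  void
  remember_link (const struct stat *st, const char *name)
  {
    struct link_entry *e = malloc (sizeof *e);
    e->dev = st->st_dev;
    e->ino = st->st_ino;
    e->name = strdup (name);
    e->next = link_list;
    link_list = e;
  }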


(*) That's a slight simplification. In place of "cp -al olddir newdir"
there is actually
  cp -al olddir newdir
  rsync --archive --hard-links --whole-file --sparse --one-file-system \
        --delete --force / newdir
This way I end up with a snapshot of the root filesystem in which files
unchanged since the previous snapshot are hard links into that previous
snapshot.

-- 
  Samuli Kärkkäinen                   |\      _,,,---,,_
 address@hidden /,`.-'`'    -.  ;-;;,_------
http://www.woods.iki.fi              |,4-  ) )-,_. ,\ (  `'-'
                                     '---''(_/--'  `-'\_)

