
extracting millions of symlinks

From: Ondrej Dubaj
Subject: extracting millions of symlinks
Date: Mon, 2 Dec 2019 10:07:33 +0100

I would like to ask about the behaviour of tar when extracting millions of symlinks. We have a case where extracting 10 million symlinks to 1000 files ends with tar having allocated approximately 3 GB of memory. Is this expected behaviour? The memory is allocated over the whole course of the extraction and is not freed until the very end. There are no major leaks during extraction, but I would rather be certain that this is OK, as there is already an issue filed on the Red Hat Bugzilla with a full description here:


Is there a way for the memory to be freed during extraction as well, once it is no longer needed?
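For anyone who wants to observe this locally, a small-scale sketch of the workload described above might look like the following (the counts are scaled down and the file/link naming scheme is purely illustrative, not taken from the original report):

```shell
#!/bin/sh
# Sketch: build a tree with a few real files and many symlinks to them,
# archive it, then extract it again. Sizes are scaled down from the
# reported 10 million links so the script finishes quickly.
set -e
dir=$(mktemp -d)
cd "$dir"
mkdir tree

# A handful of real target files...
for i in $(seq 0 9); do touch "tree/file$i"; done

# ...and many symlinks pointing at them.
for i in $(seq 1 10000); do ln -s "file$((i % 10))" "tree/link$i"; done

tar -cf tree.tar tree
rm -rf tree

# Extract the archive. To watch peak memory, prefix this command with
# `/usr/bin/time -v` (GNU time) and look at "Maximum resident set size".
tar -xf tree.tar
```

Scaling the loop counts back up toward the reported numbers should make the memory growth during extraction visible.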
Thanks for your help!

Best regards,
Ondrej Dubaj
