bug-coreutils

bug#35531: problem with ls in coreutils


From: L A Walsh
Subject: bug#35531: problem with ls in coreutils
Date: Fri, 03 May 2019 22:26:29 -0700
User-agent: Thunderbird

On 5/1/2019 3:03 PM, Viktors Berstis wrote:
> When running "ls" or "ls -U" on a windows directory containing 50000 
> files, ls takes forever.  Something seems to be highly inefficient in there.
>   
---
    It sounds like you are running ls with no options
(nothing in the environment and no switches on the command line).

    Is this the case?  If so, I'm stumped, unless whoever
compiled that build had it set to do some things by default.

    Basically, on Windows, anything that you might get away with on
linux with a stat call takes an 'open' call on Windows, and that gets
costly.  That includes anything that appends a classifier to the end
of the file name (like ls -F, --classify or --file-type) or that would
display any of the date or size information (ls -l would be right
out!).  The only thing 'ls' can display without such a penalty is the
file name.  However, that only applies to stock ls, and since we don't
know what options might have been enabled for that 'ls' (including any
default use of switches such as those mentioned above), it's
hard to say exactly what the problem is.
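One way to see whether per-file metadata lookups are the bottleneck is
to time a names-only listing against one that must stat (or, on
Windows, open) every entry.  A rough sketch -- the /tmp path and the
file count here are made up for illustration:

```shell
# Build a throwaway directory with many files (adjust count to taste).
mkdir -p /tmp/ls-test
cd /tmp/ls-test
for i in $(seq 1 5000); do : > "f$i"; done

# Names only, unsorted: no per-file metadata needed.
time ls -U > /dev/null

# Long listing: forces a stat (open on Windows) per file.
time ls -lU > /dev/null
```

If the second run is wildly slower than the first on your box, the
per-file metadata calls are the likely culprit rather than the
directory read itself.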

    A suggestion -- try installing a minimal snapshot of 'Cygwin'
('cygwin.org') and running env -i /bin/ls on cygwin's command line
in that directory to see how fast that is.  If it is slow, then
something excessively weird is going on, courtesy of closed-source
Windows.  My hunch is that it would be fast, but since I don't know
the cause, I can't say whether that would help or not.
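To put that test in concrete terms -- the demo directory below is a
hypothetical stand-in for the slow Windows directory:

```shell
# Hypothetical demo directory standing in for the slow one.
dir=/tmp/ls-demo
mkdir -p "$dir" && : > "$dir/example-file"

# env -i scrubs the environment so aliases, LS_COLORS, etc. cannot
# add per-file work; -U skips sorting.  Compare against the native ls.
time env -i /bin/ls -U "$dir" > /dev/null
```

Running the same timed command with the problem directory substituted
for "$dir" gives a fair baseline, since nothing inherited from the
environment can turn on extra per-file lookups.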

    One further possibility that I'd think unlikely: the directory could
be very fragmented and take a long time to read (5 minutes?!  really
unlikely -- it almost has to be the per-file stat/open calls)...though
the figures you are stating sound out of bounds for a fragmented
directory.
Still, if you grab the 'contig' tool from the Sysinternals site
(a Microsoft subsite), it can show you the number of fragments a file
is split into -- and it can be used on directories:
/prog/Sysinternals/cmd> contig -a -v .

Contig v1.6 - Makes files contiguous
Copyright (C) 1998-2010 Mark Russinovich
Sysinternals - www.sysinternals.com
------------------------
Processing C:\prog\Sysinternals\cmd:
Scanning file...
Cluster: Length
0: 3
File size: 12288 bytes
C:\prog\Sysinternals\cmd is in 1 fragment
------------------------
Summary:
     Number of files processed   : 1
     Average fragmentation       : 1 frags/file


========
Other than those options, I'm not sure what else to suggest to narrow
it down, but I thought I'd at least mention a few possibilities.

Good luck!





