Re: cut fails with "cut: memory exhausted" when taking a large slice
From: Jim Meyering
Subject: Re: cut fails with "cut: memory exhausted" when taking a large slice
Date: Wed, 21 Apr 2004 21:35:21 +0200
Mordy Ovits <address@hidden> wrote:
> When I try to cut a 412,569,600 byte chunk from the middle of a 9GB file, cut
> fails. I believe it is trying to allocate memory for the full 412MB, which
> is not the best way to do it. It's best to do it in chunks, like cp does.
>
> Also, does cut support files larger than 4GB?
>
> This is on Linux 2.4.22 with glibc 2.3 and coreutils-5.0-6.1.92mdk
>
> Even if it's a bug, can you recommend another tool that will allow taking a
> large slice from a huge file?
Thanks for the report.
This is fixed in more recent releases.
cut did have a pretty serious limitation up until 5.0.90,
so since your version looks like it's based on an earlier
version, I'm not surprised that it failed.
From the NEWS file for coreutils-5.0.91:
** Fewer arbitrary limitations
cut requires 97% less memory when very large field numbers or
byte offsets are specified.
If you want to continue using cut, you'll have better
luck with the latest:
ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.gz
ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.bz2
Or, just use head and tail with their --bytes=N options.
e.g., head --bytes=N < FILE | tail --bytes=412569600
where N is chosen so that the head command outputs everything
in the file up to and including the desired range of bytes.
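As a small sketch of that head/tail pipeline (the sample file, offsets, and
lengths here are made up for illustration; the principle is the same for a
412,569,600-byte slice of a 9GB file):

```shell
# Create a 26-byte sample file to stand in for the huge file.
printf 'abcdefghijklmnopqrstuvwxyz' > sample.txt

# Goal: a 10-byte slice starting at byte offset 10 (bytes 11..20).
# Choose N = offset + length = 10 + 10 = 20, so head emits everything
# up to and including the end of the desired range, then tail keeps
# only the last 10 bytes of that, i.e. the slice itself.
head --bytes=20 < sample.txt | tail --bytes=10
```

This prints "klmnopqrst". Because head and tail both stream their input in
fixed-size buffers, the pipeline's memory use stays small no matter how large
the slice is.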