Re: Reading huge files
From: Giorgos Keramidas
Subject: Re: Reading huge files
Date: Thu, 11 Jan 2007 00:33:51 +0200
User-agent: Gnus/5.11 (Gnus v5.11) Emacs/22.0.92 (berkeley-unix)
On Wed, 10 Jan 2007 22:27:57 +0200, Eli Zaretskii <eliz@gnu.org> wrote:
>> From: Mathias Dahl <mathias.dahl@gmail.com>
>> Date: Wed, 10 Jan 2007 08:39:16 +0100
>>
>> Eli Zaretskii <eliz@gnu.org> writes:
>>
>> > If this is a 32-bit machine, then the Lisp integer overflows at
>> > 256MB. If that's a 64-bit machine, then you don't need any tricks
>> > to edit 700MB files.
>>
>> Okay. Then I probably never scrolled that far when I tested :)
>
> If you never scrolled past 256MB, then you indeed wouldn't notice.
>
>> I guess
>> the head + tail trick would work though, I suspect those tools do
>> not have this limitation.
>
> Yes, that should work.
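The head + tail trick mentioned above could be sketched as follows; the file name and byte counts are placeholders, not values from the thread:

```shell
# Carve off just the start and the end of a very large file, so each
# piece fits comfortably within Emacs' buffer-size limit.
head -c 104857600 huge-file > huge-file.head   # first 100 MB
tail -c 104857600 huge-file > huge-file.tail   # last 100 MB
```

Each piece can then be visited and edited as an ordinary file.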
FWIW, I've also used split(1) with great success when I had to
process *huge* log files and tcpdump output, ranging from 500 MB
to almost 1 GB of text.
Handling multi-hundred-megabyte files works fine, for instance,
with something like:
freebsd % split -b 50m huge-file
or with:
linux % split -b 50000000 huge-file
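Once the pieces have been processed, plain cat(1) puts them back together, since split's default output names (xaa, xab, ...) sort in the right order. A minimal round-trip sketch, again with a placeholder file name:

```shell
# Split into 50 MB pieces, then reassemble and verify the round trip.
split -b 50m huge-file
cat x?? > huge-file.rebuilt
cmp huge-file huge-file.rebuilt   # exits 0 when the files are identical
```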