Re: opening large files (few hundred meg)

From: Stefan Monnier
Subject: Re: opening large files (few hundred meg)
Date: Wed, 30 Jan 2008 15:01:44 -0500
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/23.0.50 (gnu/linux)

>> Perhaps you could process the file in chunks, using the optional args
>> to insert-file-contents to put subsets of the file into a buffer.
>> I haven't tried this myself, so I am not even sure it would work.

> No need to try: it won't work.  As I wrote earlier in this thread, the
> problem is that Emacs cannot address offsets into the buffer larger
> than 0.5 gig, and this problem will cause the arguments to
> insert-file-contents to overflow exactly like when you read the entire
> file.

You don't have to live within the built-in limits of insert-file-contents:
you can extract parts of the file using `dd' first (using Elisp floats to
represent the larger integers).
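For instance, `dd' can pull out a fixed-size slice at an arbitrary byte
offset, which Emacs can then visit as a small file.  A minimal sketch
(the file names and offsets here are just illustrative; in real use the
offset would be far beyond Emacs's buffer-position range):

```shell
# Create a small demo file, then extract a 4-byte slice starting at
# byte offset 8: bs is the block size, skip the number of blocks to
# skip, count the number of blocks to copy.
printf 'ABCDEFGHIJKLMNOP' > /tmp/bigfile.bin
dd if=/tmp/bigfile.bin of=/tmp/chunk.bin bs=4 skip=2 count=1 2>/dev/null
cat /tmp/chunk.bin
```

With a large block size (e.g. bs=1048576) the skip count stays small
even for multi-gigabyte offsets, so it fits comfortably in an Elisp
integer or float.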

Also it'd be easy enough to extend insert-file-contents (at the C level)
to accept float values for BEG and END (or pairs of integers) so as to
be able to represent larger values.

It's quite doable.  The way I see it, a large-text-buffer would
generally have 3 chunks of N megabytes each, point being in the
middle one.  The 1st and 3rd chunks would be covered with
a `point-entered' property that would automatically slide the window
forward or backward to bring point back into the middle chunk.
That wouldn't be sufficient to make it all work, but it's probably
a good starting point.
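A rough sketch of how the edge chunks might be marked, assuming a
hypothetical large-file mode (all the `large-file--' names and the
slide logic are illustrative, not an existing implementation):

```elisp
;; Hypothetical sketch: cover the first and third chunks with a
;; `point-entered' property so that moving point into them triggers
;; a window slide back toward the middle chunk.
(defvar large-file--chunk-size (* 8 1024 1024)
  "Size in bytes of each of the three chunks (N megabytes).")

(defun large-file--slide (old new)
  "Slide the three-chunk window so that NEW lands in the middle chunk.
Would recompute the file offset from NEW, re-insert the surrounding
chunks (e.g. via `dd' or an extended `insert-file-contents'), and
restore point.  Left unimplemented in this sketch."
  (ignore old new))

(defun large-file--mark-edges ()
  "Attach `point-entered' to the first and third chunks."
  (let ((n large-file--chunk-size))
    (put-text-property (point-min) (min (point-max) (+ (point-min) n))
                       'point-entered #'large-file--slide)
    (put-text-property (max (point-min) (- (point-max) n)) (point-max)
                       'point-entered #'large-file--slide)))
```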

