

From: Eli Zaretskii
Subject: Re: opening large files (few hundred meg)
Date: Thu, 31 Jan 2008 00:04:02 +0200

> From: Stefan Monnier <>
> Date: Wed, 30 Jan 2008 15:01:44 -0500
> >> Perhaps you could process the file in chunks, using the optional args
> >> to insert-file-contents to put subsets of the file into a buffer.
> >> I haven't tried this myself, so I am not even sure it would work.
> > No need to try: it won't work.  As I wrote earlier in this thread, the
> > problem is that Emacs cannot address offsets into the buffer larger
> > than 0.5 gig, and this problem will cause the arguments to
> > insert-file-contents to overflow exactly like when you read the entire
> > file.
> You don't have to use the built-in limits of insert-file-contents: you
> can extract parts of the file using `dd' first (using Elisp floats to
> represent the larger integers).

I was responding to a suggestion to use the optional args of
insert-file-contents to slice the file.  There are lots of other ways
of doing that, but they are unrelated to insert-file-contents being
able to read just a portion of a file, and to my response, which you
quoted.
> Also it'd be easy enough to extend insert-file-contents (at the C level)
> to accept float values for BEG and END (or pairs of integers) so as to
> be able to represent larger values.
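The "pairs of integers" encoding Stefan mentions can be illustrated outside
Emacs. In this hypothetical sketch, a byte offset too large for an old Emacs
fixnum is split into a high and a low part of 28 bits each, and reassembled
losslessly (the 28-bit split and the 3 GB offset are illustrative values, not
anything from the thread):

```shell
offset=3000000000                     # ~3 GB, well past the 0.5 GB limit
bits=28                               # width of each half, illustrative

# Split the large offset into two small integers...
high=$(( offset >> bits ))
low=$(( offset & ((1 << bits) - 1) ))

# ...and reassemble it; no information is lost.
rebuilt=$(( (high << bits) | low ))

echo "high=$high low=$low rebuilt=$rebuilt"
```

A C-level insert-file-contents extended this way would accept the two halves
(or a float) and combine them into a native 64-bit offset internally.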

One can hack Emacs to do anything -- this is Free Software, after
all.  But the OP wanted a way to visit large files without any
hacking, just by using existing facilities.

