help-gnu-emacs


Handling large files with emacs lisp?


From: Klaus-Dieter Bauer
Subject: Handling large files with emacs lisp?
Date: Tue, 4 Jun 2013 14:52:55 +0200

Hello!

Is there a way in Emacs Lisp to handle large files (hundreds of MB)
efficiently? I am looking specifically for a function that allows
processing file contents either sequentially or (better) with random
access.

Looking through the code of `find-file' I found that
`insert-file-contents' and `insert-file-contents-literally' seem to be
pretty much the most low-level functions available to Emacs Lisp. When
files approach GB size, however, inserting the whole file contents is
undesirable, even assuming a 32-bit Emacs were able to handle such
large buffers.

Using the BEG and END parameters of `insert-file-contents', however,
still takes time linear in BEG. So implementing buffered processing of
large files, by keeping only part of the file in a temporary buffer at
a time, doesn't seem feasible either.
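For concreteness, here is a minimal sketch (my own, hypothetical code,
not something from Emacs itself) of the kind of chunked processing I
mean, built on the BEG/END arguments of
`insert-file-contents-literally'. It works, but each iteration pays
the linear cost in BEG described above:

```elisp
;; Hypothetical sketch: call FUNC once per CHUNK-byte piece of FILE.
;; FUNC receives the byte offset of the chunk; the chunk's text is in
;; the current (temporary) buffer.  CHUNK defaults to 1 MiB.
(defun my-process-file-in-chunks (file func &optional chunk)
  "Call FUNC on successive CHUNK-byte pieces of FILE."
  (let* ((chunk (or chunk (* 1024 1024)))
         (size (nth 7 (file-attributes file)))  ; file size in bytes
         (beg 0))
    (while (< beg size)
      (with-temp-buffer
        ;; Read only bytes BEG..END of the file into the temp buffer.
        (insert-file-contents-literally file nil beg
                                        (min (+ beg chunk) size))
        (funcall func beg))
      (setq beg (+ beg chunk)))))
```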

I'd also be interested in why there is this linear time dependence. Is
it a limitation of how fseek works, or of how `insert-file-contents'
is implemented? I've read[1] that fseek "just updates pointers", so
random reads in a large file, especially on an SSD, should be
constant-time, but I couldn't find further verification.
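The dependence is easy to measure from Lisp, by the way. This is a
quick sketch of my own (the file path is a placeholder) that times a
small read at increasing offsets using the built-in benchmark library;
if reads were constant-time, the numbers should be roughly equal:

```elisp
;; Sketch: time reading 4 KiB of FILE starting at byte offset BEG.
(require 'benchmark)
(defun my-time-read-at (file beg)
  "Return the seconds needed to read 4 KiB of FILE at offset BEG."
  (benchmark-elapse
    (with-temp-buffer
      (insert-file-contents-literally file nil beg (+ beg 4096)))))

;; Compare, e.g., the start of the file against an offset deep inside:
;; (my-time-read-at "/path/to/bigfile" 0)
;; (my-time-read-at "/path/to/bigfile" (* 512 1024 1024))
```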

kind regards, Klaus

PS: I'm well aware that I'm asking for something that likely wasn't
    within the design goals of Emacs Lisp. It is interesting to push
    the limits though ;)

------------------------------------------------------------

[1] https://groups.google.com/d/msg/comp.unix.aix/AXInTbcjsKo/qt-XnL12upgJ

