
Re: opening large files (few hundred meg)

From: Tim X
Subject: Re: opening large files (few hundred meg)
Date: Thu, 31 Jan 2008 16:57:31 +1100
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/23.0.50 (gnu/linux)

Xah Lee <> writes:
> But now i have a problem, which i “discovered” this week. What to do
> when the file is huge? Normally, one can still just open huge files,
> since these days memory comes in a few gigs. But in my particular
> case, my file happens to be 0.5 gig, such that i couldn't even open it
> in emacs (presumably because i need a 64 bit OS/hardware. Thanks). So,
> given the situation, i'm thinking, perhaps there is a way to use
> emacs lisp to read the file line by line just as perl or python. (The
> file is just an apache log file and can be processed line by line, can
> be split, can be fed to sed/awk/grep with pipes. The reason i want to
> open it in emacs and process it using elisp is more just an
> exploration, not really a practical need.)

I can understand the motivation. However, as you point out in your post,
the log file you want to process is line oriented, and, as you also
point out, perl is good at line-oriented text processing. (Actually, it
can handle other things just fine as well, as exemplified by the many
modules that deal with large multi-line structures, such as XML files.)
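For what it's worth, the line-by-line approach the original post has in
mind is trivial in perl or python, since a lazily-read file keeps memory
flat no matter how big the log is. A minimal sketch (the apache log
lines and the per-IP counting here are illustrative assumptions, not
anything from the post):

```python
# A minimal sketch of line-oriented log processing, as the quoted
# post describes doing in perl or python.  The counting logic and
# log format below are assumptions for illustration only.
from collections import Counter

def count_requests_by_ip(lines):
    """Tally the first field (the client IP) of each apache-style
    log line, skipping blank lines."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if fields:
            counts[fields[0]] += 1
    return counts

# Iterating over the file object reads one line at a time, so even a
# 0.5 gig log never needs to fit in memory at once:
#
#     with open("access.log") as f:
#         for ip, n in count_requests_by_ip(f).most_common(5):
#             print(ip, n)
```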

As mentioned, I can understand the motivation to do something just to
see if it can be done. However, I fail to see any real practical use in
an emacs mode that would allow editing of extremely large files. As you
pointed out, the emacs solution is good when the programmer/user wants
to move around, change text, and maybe even change structure using
emacs' support for various structures. However, I can't see anybody
doing this type of editing on files that are hundreds of megs in size,
and if they are, they really need to re-think what they are doing. I
cannot think of a single use case where you would have hand-edited files
that are hundreds of megs in size. Files of this type are typically
generated by applications, through things like logging. You don't get
hand-crafted XML files that are 500Mb in size unless you're mad or enjoy
inflicting pain on yourself.

My personal stance is that you should use the most appropriate tool for
the job, not simply the tool you find the coolest or the one you are
most comfortable with - something about "if the only tool you have is a
hammer, everything looks like a nail" comes to mind.

I can't see a use case for editing extremely large files with a text
editor, and I think there are plenty of good tools for this already.
Moreover, once you move to a 64 bit platform, the maximum file size
increases to the point where there would be even less need/demand for
special modes to edit files too large to be read into emacs in one go.
Personally, I'd rather see effort put towards other areas which would
prove more beneficial. For example, it would be great to see Emacs w3
revived and efforts put in to add javascript support, so that you could
visit more sites without having to leave emacs and get all that emacs
goodness as well. It would be good to see an interface into various
package management systems, such as apt for Debian-based systems. It
would be good to see further development, or more developers working on
some of the really good packages to make them even better (e.g. auctex,
planner-mode, org-mode, glient, SES, VM, etc.), or just a whole new mode
to add functionality that hasn't yet been thought of and which may have
a real benefit to users.

Note that I'm not trying to start an argument, and, as I stated, I can
fully appreciate the desire to just see if it can be done. I just don't
see any real benefit apart from the intellectual exercise (which may be
sufficient justification for many).


tcross (at) rapttech dot com dot au
