From: Rado S
Subject: Re: [Lynx-dev] how to write mailcap entry for mutt AND lynx
Date: Wed, 29 Nov 2006 15:54:50 +0100
User-agent: Mutt/1.5.13cvs (2006-08-15)

=- Henry Nelson wrote on Wed 29.Nov'06 at 11:16:19 +0900 -=

> I recently installed "pdftotext" so I could read pdf attachments
> in mutt. It's been fantastic. To get it to work automagically, I
> put the following line in my personal mailcap:
>      application/pdf; pdftotext -layout %s -; copiousoutput
> This is great for mutt, but in lynx the output is a continuous
> stream, which when it ends, there's none of the document left on
> the screen.
> Mostly I'd prefer lynx to do as it did before I changed mailcap,
> i.e., not know how to render it and offer to download the file.
> It would also be okay to render the file, but keep the text
> loaded as the current document so I could do searches and
> navigate on the document.

The problem is that the two treat the data differently:
mutt filters via mailcap, i.e. it takes the output of the mailcap
command as input for its own pager,
while lynx just dumps the output.

I'd prefer that lynx did the same, i.e. not dump the output but
feed the mailcap output into lynx's own pager (maybe via a temp file).

> So, per the Subject, how do you write a mailcap entry that will
> work for both mutt and lynx? Many thanks.

As it stands, there is no way with standard mailcap alone. You
might do what I do: use wrapper scripts for both mutt and lynx,
each of which sets an environment variable; in mailcap, use a
dispatcher script that checks this variable and either dumps the
output for mutt's pager or pipes it to less in the case of lynx.

If lynx paged the mailcap output the way mutt does, I could get
rid of this ugly construction.
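A minimal sketch of the wrapper scheme described above. The
variable name MAILCAP_CALLER and the script name pdfview are made
up for illustration; any names would do:

```shell
#!/bin/sh
# The mutt and lynx wrappers just export a marker and exec the real
# program (one wrapper per program, e.g. on your PATH ahead of them):
#   MAILCAP_CALLER=mutt exec mutt "$@"
#   MAILCAP_CALLER=lynx exec lynx "$@"
#
# The mailcap entry then points at this dispatcher instead of
# calling pdftotext directly:
#   application/pdf; pdfview %s; copiousoutput

# pdfview: convert a PDF to text, paging it only when lynx is the caller
case "$MAILCAP_CALLER" in
    lynx) pdftotext -layout "$1" - | less ;;  # interactive pager for lynx
    *)    pdftotext -layout "$1" - ;;         # plain dump; mutt pages it itself
esac
```

The point of the dispatch is that mutt's copiousoutput handling
already pages the stream, so mutt gets the raw dump, while lynx
(which would otherwise scroll it off the screen) gets less.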

© Rado S. -- You must provide YOUR effort for your goal!
Even if it seems insignificant, in fact EVERY effort counts
for a shared task, at least to show your deserving attitude.
