
Re: [Pan-users] reading headers efficiently due to bandwidth limitation


From: Adam Shrode
Subject: Re: [Pan-users] reading headers efficiently due to bandwidth limitation
Date: Fri, 08 Oct 2004 15:34:19 -0500
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7) Gecko/20040715

mansh9 mansh9 wrote:

Hi Fellow PAN users,

A Newbie looking for some help:

. My ISP has limited NNTP bandwidth, so I no longer have the luxury of downloading a large number of headers quickly (some of my preferred groups have more than a million headers). Is there a way to pre-screen files before downloading, i.e. only download headers that match a selection criterion?

. Is there a way to control how many headers Pan downloads at a time? I figure that if I can limit each download to a reasonable 300K headers, then for a 1.2-million-header newsgroup I would do it in 4 passes. The reason is that when I try to download too many headers, Pan just closes after 8-10 hours; after I restart it, it downloads for a few more hours and then closes again. Maybe I need to set a switch somewhere.

. Can I batch Pan to download files automatically if they meet certain selection criteria, i.e. from the command line or through a script?

Thanks in advance


I don't think there is, although you do have 'brag' -- but I believe that works on attachments, not body text. Pan's filtering options only apply to headers you have already downloaded, not to what you are about to download.

You can control how many headers are downloaded for each group by right-clicking the group and selecting "More Download Options." The third radio button lets you download only a specified number of the most recent headers.

Pan doesn't have command-line functionality. You can use something like nget for this: it lets you match messages against a regexp and download them. Set it up as a cron job and you're all set. Different tools for different jobs.
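As a rough sketch of what that could look like -- the group name and pattern below are made-up placeholders, and you should check nget's own documentation for the exact flags your version supports before relying on this:

```shell
# Hypothetical example: fetch articles from one group whose subjects
# match a regexp. Group name and pattern are placeholders, and the
# exact option names may differ in your nget version.
nget -g alt.binaries.example -r 'some_pattern.*\.jpg'

# To run it unattended, a crontab entry (edit with `crontab -e`)
# could invoke the same command once a day at 3 AM and log the output:
0 3 * * * nget -g alt.binaries.example -r 'some_pattern.*\.jpg' >> $HOME/nget.log 2>&1
```

The advantage over a GUI newsreader here is that the filtering happens before the download, so only matching articles cost you bandwidth.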

Regards



