[Pan-users] Re: Download of DAMAGED incompletes....

From: Duncan
Subject: [Pan-users] Re: Download of DAMAGED incompletes....
Date: Fri, 06 Feb 2004 11:35:39 -0700
User-agent: Pan/ (As She Crawled Across the Table)

Alberto BARSELLA posted <address@hidden>,
excerpted below, on Thu, 05 Feb 2004 20:11:47 +0100:

>       a new problem which does not seem to be addressed by the last
> version (.91).  I have some binaries on my newsserver which are missing
> some parts, but not in the "normal" way.  After downloading headers,
> they are in place, but when I download the articles, I'm confronted
> with:
> Thu, 05 Feb 2004 20:01:03 - Getting article "<SUBJECT HERE> yEnc
> (11/27)" body failed: 423 No such article number in this group
> I have no idea what is the cause, but pan is unable to handle the
> download.

OK.  This is caused when the overview index on the server gets out of sync
with the actual message store.  

There are a number of reasons it can happen, but here on Cox's news
servers, for instance, the hiwinds software they were running was somehow
bugged in its handling of cancel messages, causing uncanceled messages to
get deleted from the spools as well.  At the time, the system wasn't
powerful enough to do full spam filtering, and Cox was using the services
of the spam cancelbots to help, but due to this bug, they had to turn
cancel processing off.  They've upgraded since then, and are working on a
spam filter now (currently deployed but just adding a header, thus
allowing in-production testing -- another reason I'd like PAN to be able
to filter on non-overview headers.. oh, well..), but the hiwinds software
simply doesn't seem very reliable when doing cancels on as large a volume
as Cox is now handling on their incoming feeds -- a terabyte a day at
times, I think.  Thus, they've kept cancels off, which suits me just fine
given the very real abuses of the cancel system out there.

Anyway, that's the basic problem: the overview index, from which the
overviews (aka headers, but that's inaccurate since it isn't ALL the
headers, just the ones in the overview) are displayed, is out of sync
with the messages actually in the spool.  So when you attempt to download
such a message, there isn't one by that number available, and you get the
error.
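To make the mismatch concrete, here's a toy sketch in Python (not Pan's
actual code; the article numbers, subjects, and spool contents are made
up) of an overview index that still lists an article the spool no longer
has:

```python
# Toy model: the overview index and the message spool as two dicts.
# Article 102 is gone from the spool (say, a buggy cancel removed it),
# but the overview index still lists it.
spool = {101: b"body of 101", 103: b"body of 103"}
overview = {101: "subj yEnc (1/3)", 102: "subj yEnc (2/3)", 103: "subj yEnc (3/3)"}

def fetch_body(art_num):
    """Stand-in for the server's ARTICLE/BODY command."""
    if art_num not in spool:
        raise LookupError(f"423 No such article number in this group ({art_num})")
    return spool[art_num]

missing = []
for num in overview:          # what the reader sees after a header download
    try:
        fetch_body(num)       # what happens on the article download
    except LookupError:
        missing.append(num)
print(missing)  # -> [102]
```

The header download reads only the overview index, so everything looks
fine until the body fetch hits the spool and comes back with the 423.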

> It stops at this segment and I get nothing saved to disk.
> I've tried to save article text instead of attachment, but same problem.
> I've tried with manual decode, but since a fatal error occurs in
> mid-download the download is aborted. I've also tried to expand threads
> and download manually ALL parts one by one, which works, except that I
> can't download part (01/27), since adding that part automatically adds
> the entire thread.

There are two basic approaches to handling article d/l and saves.  You are
using the one that doesn't work in cases like this: attempting to d/l and
save the attachment directly.  The other way, d/ling to cache and THEN
saving, should work better, but of course requires a larger cache,
especially if done routinely.  It's the way I work, and even working with
for-the-most-part single-part images, I still have the cache set at 4 gig,
with a normal cache size of 1-2 gig.  If one were to do that routinely
with mp3s, it'd probably need to be closer to 8 gig max, 4 gig regular
size, and for those doing it with CD ISOs, MPEGs, or DVD images, the PAN
max of 20 gig might feel a bit small at times.

So.. if you do this much, consider increasing your cache size.  At a
minimum, it needs to be as big as the combined size of the as-yet-unsaved
attachments you are working on.

Anyway, what you do is this.  Instead of hitting save attachment directly
from the overview and letting it download and auto-save, just hit
download, not save.  With multi-parts, you WILL need to expand them so you
can select all the parts when telling it to d/l, or it will only get the
first part of each multi-part.

After the parts that are there are downloaded, THEN use either save, if
they are all there, or manual decode, if necessary because a part or more
is missing. 

A variation on this technique was pointed out earlier on this list.  If
you have multiple servers available, but either none of them has the
entire post or you want to d/l the parts available from your
bundled-service ISP server and get the rest from your pay-per-gig server,
use the above technique to d/l the parts you can from the first server
and save them to a PAN folder.  Then go to the next server, get the
missing parts, and save THEM to the folder, then the next server if you
have three, and so on, until you have all the parts, or at least all the
parts available on the servers you have.  Finally, use save or manual
decode from the PAN folder you've been saving them to.
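That "try each server in turn, keep what you get" routine can be sketched
the same toy way; the server names and part lists below are placeholders,
and a plain dict stands in for the PAN folder:

```python
# Each server carries only some parts of a 4-part post.
servers = {
    "isp-server": {1: b"part1", 2: b"part2", 4: b"part4"},  # bundled ISP server
    "pay-server": {2: b"part2", 3: b"part3"},               # pay-per-gig server
}
wanted = range(1, 5)   # parts (1/4) through (4/4)
folder = {}            # stands in for the PAN folder you save to

for name, store in servers.items():        # cheapest server first
    for part in wanted:
        if part not in folder and part in store:
            folder[part] = store[part]     # "save to the folder"

missing = [p for p in wanted if p not in folder]
print(sorted(folder), missing)  # -> [1, 2, 3, 4] []
```

Listing the cheap server first means the pay-per-gig server only ever
supplies the parts the first one didn't have.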

Another hint..  For MPEGs and similar files that work without the entire
thing, you can download the first part or few parts, then manual decode,
and preview what you have to see whether you want to bother with the
rest.  (I haven't tried this, but it WAS one of the reasons given when
the manual decode function was requested, and it SHOULD work.  It's
possible you may have to move the parts to a folder, as above, to keep
PAN from trying to go get the rest, tho.  Again, I'm not sure, as I
haven't tried it.)
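And for the curious, the decode step itself is simple enough to sketch.
Here's a minimal yEnc decoder in Python, written for illustration only
(no CRC check and no =ypart offset handling, so don't trust it with real
downloads):

```python
def yenc_decode(data: bytes) -> bytes:
    """Bare-bones yEnc decode: each byte was stored as (b + 42) mod 256,
    with '=' escaping the few critical characters (NUL, CR, LF, '=')
    by adding a further 64."""
    out = bytearray()
    for line in data.split(b"\r\n"):
        if line.startswith(b"=y"):             # skip =ybegin/=ypart/=yend
            continue
        it = iter(line)
        for b in it:
            if b == 0x3D:                      # '=' escape: undo the +64
                b = (next(it) - 64) & 0xFF
            out.append((b - 42) & 0xFF)
    return bytes(out)

# Decoding just the first parts you have, in order, yields a playable
# prefix for formats like MPEG that tolerate truncation.
encoded = (b"=ybegin line=128 size=5 name=test.bin\r\n"
           + bytes((b + 42) % 256 for b in b"Hello")  # no critical bytes here
           + b"\r\n=yend size=5\r\n")
print(yenc_decode(encoded))  # -> b'Hello'
```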

Duncan - List replies preferred.   No HTML msgs.
"They that can give up essential liberty to obtain a little
temporary safety, deserve neither liberty nor safety." --
Benjamin Franklin
