Re: Wget

From: address@hidden
Subject: Re: Wget
Date: Sun, 23 Jan 2022 10:52:19 +0200 (SAST)


I see. Fascinating story. But it is probably too late to fix that issue now 
(old wget, old servers, old antivirus on an old router: they are all possible 
sources of errors).

Also, as was pointed out here by others (was it Tim?) as well as by me: the 
continuation mechanism is by its very nature vulnerable to a number of 
conditions, some of them errors, others just "life happening" (like files 
being changed between downloads). So if and when you use the -c option, be 
aware that what you get may not be what you want or need and may in fact be 
broken. So check your results.

Your examples are all 7z archives. Most archive files are easily checked, since 
they give you an error when you unpack them. I am not so sure about ISO files, 
but the ones I have dealt with so far had MD5 checksums available, and I highly 
recommend using them.
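Checksum verification is easy to script. Here is a minimal sketch in Python; the helper names (`md5_of`, `verify`) are my own, not part of wget or any distribution's tooling:

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading it in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_md5):
    """True if the file's digest matches the published checksum."""
    return md5_of(path) == expected_md5.strip().lower()
```

The same pattern works with `hashlib.sha256` for the SHA-256 sums most mirrors publish nowadays.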

All in all, my feeling is that there is nothing here that needs fixing, and 
probably nothing much that can be fixed. In any case, the situation has 
probably improved a lot since then, with most servers now handling the protocol 
correctly and disruptions less likely thanks to better (and faster) networks. 
Plus you no longer share your computer with others (I hope).

As I suggested in my previous post, there is one way to check for error 
situations. It is not 100% certain, but it may be very helpful in many cases. I 
call it overlapping stitching. It would have to be incorporated into wget 
(or rather wget2, since it is a new feature).

Start your resumed download a few bytes before the end of what you already have 
and compare those bytes. If they don't match, you are obviously not downloading 
from the same file. Possible actions are to break off with an error message, or 
to restart the download from the beginning, either abandoning or saving the old 
fragment.
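The overlapping-stitch idea could be sketched roughly like this in Python with plain urllib. Everything here is my own illustration, not wget code: the function names, the 64-byte overlap size, and the assumption that the server honors HTTP Range requests:

```python
import os
import urllib.request

OVERLAP = 64  # bytes re-fetched from the server for comparison

def plan_resume(local_size, overlap=OVERLAP):
    """Byte offset a resumed request should start at, so that the first
    `overlap` bytes of the response can be compared with the local tail."""
    return max(local_size - overlap, 0)

def tails_match(local_tail, remote_overlap):
    """True if the re-fetched overlap equals the end of the partial file."""
    return local_tail == remote_overlap

def resume_with_overlap(url, path, overlap=OVERLAP):
    """Resume a download, verifying the overlap before appending (sketch;
    assumes the server returns 206 Partial Content for the Range header)."""
    size = os.path.getsize(path)
    start = plan_resume(size, overlap)
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
    with urllib.request.urlopen(req) as resp, open(path, "r+b") as f:
        f.seek(start)
        local_tail = f.read(size - start)
        remote_overlap = resp.read(size - start)
        if not tails_match(local_tail, remote_overlap):
            raise ValueError("overlap mismatch: remote file differs, aborting")
        f.seek(size)  # append only the genuinely new bytes
        while chunk := resp.read(1 << 16):
            f.write(chunk)
```

Comparing 64 bytes is of course not proof of identity, but a mismatch is proof of a problem, which is exactly the cheap early warning described above.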

Also remember that some file types will easily reveal whether they are broken 
(most media files and compressed archives, for instance). Others may be 
editable or salvageable in another way (typically text files or some types of 
office documents). Others still, such as executables, could be rather 
disastrous when used without checking (think of a DLL with some critical 
function that runs into execution exceptions; that could bring down your system 
for good).
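Archive self-checking can be automated as well. A small sketch using Python's standard-library `zipfile` as a stand-in (the stdlib has no 7z reader, and the helper name is mine); the same idea applies to `7z t` or `gzip -t` on the command line:

```python
import zipfile

def archive_ok(path):
    """True if the file is a well-formed ZIP and every member passes
    its CRC check; False for truncated or corrupted downloads."""
    try:
        with zipfile.ZipFile(path) as zf:
            return zf.testzip() is None  # testzip() returns the first bad member, or None
    except zipfile.BadZipFile:
        return False
```

A download cut short by a dropped connection typically loses the central directory at the end of the archive, so this check catches exactly the kind of breakage a failed -c resume produces.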

In closing, let me remind us older ones of the days of the modem or acoustic 
coupler, when after hours of downloading you had 60% of the file, the next 
attempt started from byte 0 again and made it to the 20% mark, and the third 
one did not even get to 10% ... For all those files out there that were static, 
it was a huge achievement to have a tool that would, like a honey badger, not 
let go until it had the whole thing. That is why I started using wget, and to 
this day I have never stopped. And with a bit of "batch scripting" you can also 
save yourself a lot of work ...


----- Original Message -----
From: "kmb697" <kmb697@yandex.ua>
To: "Tim Ruehsen" <tim.ruehsen@gmx.de>
Cc: "bug-wget" <bug-wget@gnu.org>
Sent: Saturday, January 22, 2022 11:54:55 AM
Subject: Re: Wget

I can't reproduce that situation now.
I had no Internet at my house in the years 2010-2015.
I used a shared work computer at my workplace (not my house).
Several people could use this computer.
They could shut it down or reboot it.
I had no control over whether they rebooted (or shut down) the computer.
I downloaded Linux ISOs and other stuff there.
I ran wget secretly.

I haven't worked there (at that place) for several years now.
And I know the firm no longer uses that computer.

I used wget because I downloaded miscellaneous programs from sourceforge.net.
Wget can get the link to a file from sourceforge.net.
Other downloaders can only get an HTML page in that event.
My links, for example:
And others.
I also downloaded from other places.

I want to say to you: maybe continuation is not reliable in wget.
Maybe wget has a bad continuation algorithm.

An antivirus was running on the router in that network at the time.
Maybe it damaged files.

No MITM attacks.
