
lynx-dev Recursive dump

From: Vieri Di_paola
Subject: lynx-dev Recursive dump
Date: Fri, 14 Jan 2000 17:24:55 +0100 (CET)


Can Lynx recursively dump all the pages of a specified site? For instance,
can I recursively download all the web pages of a given site while
excluding any other sites? I do not wish to use wget for this task
(is wget capable of excluding sites?).
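(For what it's worth, wget does seem able to restrict recursion to named
domains. A minimal sketch, assuming wget is installed; the host name below
is only a placeholder:)

```shell
# Hedged sketch: www.example.com is a placeholder host.
# -r   recursive retrieval
# -np  never ascend to the parent directory
# -D   comma-separated list of domains to follow; with -r, links to any
#      other site are not followed
wget -r -np -D example.com http://www.example.com/

# --exclude-domains takes the opposite approach: name the sites to skip
wget -r --exclude-domains ads.example.com http://www.example.com/
```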

I know that the -localhost option disables URLs that point to remote
hosts. How can I complete the following command line in order to do the
recursive job?
  lynx -dump -source -localhost

Or should I use -realm, which restricts access to URLs in the starting
realm? What is a realm (excuse my ignorance)?

Should I use -traversal and -crawl?

If what I asked is possible, can I get the same directory structure as in
the remote host?
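(From what I can tell of the manual page, -crawl and -traversal are meant
to be combined for this. A sketch, with a placeholder start URL:)

```shell
# Hedged sketch: the start URL is a placeholder.
# -traversal  follow links, staying on the same http server as the start page
# -crawl      with -traversal, write each page reached to a lnk<n>.dat file
lynx -crawl -traversal http://www.example.com/
```

As I understand it, lynx writes flat lnk*.dat files in the current
directory rather than reproducing the remote directory layout, whereas
wget -r does mirror the directory structure; someone please correct me if
that is wrong.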


Vieri Di Paola
