importing a huge directory tree

From: Strouhal, Glenn
Subject: importing a huge directory tree
Date: Mon, 16 Oct 2000 13:03:27 -0600

Hello all,

Although I read the docs, I'm a CVS virgin and am running out of ideas.

I have CVS compiled, installed, and init'ed, and I can import my home
directory as a test.  But when I tried to import the current production
website across the network, I ran out of memory (930+ MB used) on the
production webserver.  Whoops.

I have a .cvsignore file.  I thought it would have slimmed down the import
significantly more than it did, but maybe it doesn't work the way I thought,
or its syntax is different from what I used.
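For reference, here is roughly the kind of file I have (the specific
patterns are just examples of what I tried, not known-good entries; as I
understand it, each line holds whitespace-separated shell-style glob
patterns matched against file names, with no comment syntax):

```
*.bak
*.old
*~
*.gz *.tar *.zip
core
tmp
```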

1) what are the rules, tips, tricks, etc that other people have used to
import large websites?
2) My plan was to import the website from its doc root, excluding backups
and binaries via the .cvsignore file.  Is there a better way?
3) Is there any way to specify a maximum amount of memory to use, or to
commit the import after so many files?
4) Anything else you figure I forgot or overlooked?


