Re: importing a huge directory tree

From: TC
Subject: Re: importing a huge directory tree
Date: Mon, 16 Oct 2000 15:38:11 -0700

This can't be all source! It must be the images...
So create the directory structure with just the source files (php, asp, htm,
html, cgi, etc.) and import that structure. Then check it out somewhere, copy
the images into the working copy, and do cvs add / cvs commit directory by
directory with the images. Don't forget to protect the jpg and gif files as
binary, either with -kb on the add or by creating cvswrappers entries. I
think they would look something like this:
*.gif -k 'b'
*.bmp -k 'b'
*.ico -k 'b'
*.jpg -k 'b'
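The per-directory add/commit pass described above can be sketched as a dry
run. Everything here uses throwaway demo paths, and the cvs commands are only
printed, never executed, so nothing touches a real repository:

```shell
# Dry-run sketch of the per-directory "cvs add -kb && cvs commit" loop.
# WEBROOT here is a made-up demo tree; point it at your real checkout.
WEBROOT=$(mktemp -d)                       # stand-in for the checked-out tree
mkdir -p "$WEBROOT/img"
touch "$WEBROOT/img/logo.gif" "$WEBROOT/img/photo.jpg"

cmds=$(find "$WEBROOT" -type d | while read -r dir; do
    # list the binary files in this directory, if any
    imgs=$(cd "$dir" && ls -- *.gif *.jpg 2>/dev/null)
    [ -n "$imgs" ] || continue
    echo "cd $dir && cvs add -kb" $imgs "&& cvs commit -m 'add images'"
done)
printf '%s\n' "$cmds"
```

Once the printed commands look right, you can pipe the output to sh (or drop
the echo) inside the real checkout to run them for real.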

-----Original Message-----
From: Strouhal, Glenn <address@hidden>
To: 'address@hidden' <address@hidden>
Date: October 16, 2000 12:13 PM
Subject: importing a huge directory tree

>Hello all,
>Although I've read the docs, I'm a CVS virgin and am running out of ideas.
>I have CVS compiled, installed, and init'ed, and I can import my home
>directory as a test.  When trying to import the current production website
>across the network, I ran out of memory (930+ MB used) on the production
>webserver.  Whoops.
>I have a .cvsignore file.  I thought it would have slimmed down the import
>significantly more than it did, but maybe it doesn't work the way I thought
>and its syntax is different from what I used.
>1) what are the rules, tips, tricks, etc that other people have used to
>import large websites?
>2) My plan was to import the website from its doc root, excluding backups
>and binaries using the .cvsignore file.  Is there a better way?
>3) Is there any way I can specify a MAXMEM to use, or commit the import
>after so many files?
>4) Anything else you figure I forgot or overlooked?
>Info-cvs mailing list
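On the .cvsignore question above: the file is a list of shell-style glob
patterns, one per line, matched against file and directory names. The
patterns below are purely illustrative, not taken from the original setup:

```
*.bak
*.old
*.tar.gz
backups
core
```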
