Subject: checksum failure on heavily loaded client and large files
From: Peter Toft
Date: Tue, 4 Jul 2006 23:53:38 +0200 (CEST)
I am having problems with CVS; the thread starts at
http://lists.gnu.org/archive/html/info-cvs/2006-06/msg00005.html
with a second round at
http://lists.gnu.org/archive/html/info-cvs/2006-06/msg00103.html
What I see:
* a CVS client (CVS 1.12.9 or 1.12.13),
* a server running the same CVS version,
* commits over a somewhat slow network with SSH as the transport (SSH 3.6),
* a single-CPU Linux Red Hat AW 3 client under heavy CPU load,
* a 16-CPU Linux Red Hat AS 3 server,
* XML files on this order of size at version 1.12:
wc test.xml,v
245334 388016 2144072 test.xml,v
wc test.xml.mdb
59377 67126 431144 test.xml
* Roughly speaking, each new version of "test.xml" has fundamentally new
content, i.e. a diff would show that almost all (if not all) lines have
changed.
* The CVS server and client are in different time zones, but both should
be NTP-synced.
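Because nearly every line changes between revisions, the delta the server sends is essentially as large as the file itself, so the client is applying a near-total rewrite as a patch. A minimal sketch (synthetic data, not our real XML) using Python's difflib shows how the patch size approaches the combined size of both revisions in this case:

```python
import difflib

# Two synthetic revisions where almost every line differs,
# mimicking the "fundamentally new content" case described above.
old = [f"<row id='{i}' value='{i * 2}'/>\n" for i in range(1000)]
new = [f"<row id='{i}' value='{i * 3}'/>\n" for i in range(1000)]

patch = list(difflib.unified_diff(old, new, "test.xml", "test.xml"))

# When (almost) all lines change, the patch carries nearly every old
# line plus every new line, so it is roughly twice the file size.
print(len(old), len(new), len(patch))
```

In other words, patching saves almost no bandwidth here compared with sending the whole file, while still exercising the client-side patch-and-verify path under load.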
In this setup I see, from time to time, that a cvs update gives:
P test.xml
cvs update: checksum failure after patch to ./test.xml; will refetch
cvs client: refetching unpatchable files
cvs update: warning: `test.xml' was lost
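For context, this failure mode matches CVS's patch path: the server sends a diff plus a checksum of the target revision, the client applies the patch and recomputes the checksum, and on a mismatch it discards the result and refetches the whole file. A minimal sketch of that verification step (the function name is hypothetical, not from the CVS source):

```python
import hashlib

def verify_after_patch(patched_bytes: bytes, server_md5: str) -> bool:
    """Mimic the client-side check: hash the patched file and compare
    it with the checksum the server sent alongside the diff."""
    return hashlib.md5(patched_bytes).hexdigest() == server_md5

# A mis-applied or corrupted patch yields a different digest, which is
# what triggers "checksum failure after patch ...; will refetch".
good = b"<doc>new revision</doc>\n"
server_md5 = hashlib.md5(good).hexdigest()

assert verify_after_patch(good, server_md5)
assert not verify_after_patch(b"<doc>garbled result</doc>\n", server_md5)
```

So the message itself is a safety net working as designed; the open question is why the patched result ends up wrong in the first place on the loaded client.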
I am getting increasingly worried about this.
All help is highly appreciated!
Best
Peter Toft, Ph.D. address@hidden http://pto.linux.dk
"First they ignore you, then they ridicule you, then they fight you, then you
win."
-- Mahatma Gandhi