
[Getfem-users] Getfem in parallel, run error


From: Guerric Beaugrand
Subject: [Getfem-users] Getfem in parallel, run error
Date: Mon, 10 Jun 2013 10:45:43 +0200

Dear Getfem users,

I think I need your help. I have two processor cores, and I successfully compiled the library in parallel mode (MPICH2, MUMPS 4.10). When I run the elastostatic test program with the command:
mpirun -np 2 ./elastostatic ./elastostatic.param
I get the following errors:

####################################################################
min = 2.22462e-06
Nb elt to be refined : 392
Input Error: Incorrect ncuts.
Partition time 0.00189495
Trace 2 in getfem_models.cc, line 1827: Mass term assembly for Dirichlet condition
Trace 2 in getfem_models.cc, line 1869: Source term assembly for Dirichlet condition
Input Error: Incorrect ncuts.
Partition time 0.0017879
Total number of variables : 8980
Trace 2 in getfem_models.cc, line 3265: Stiffness matrix assembly for isotropic linearized elasticity
terminate called after throwing an instance of 'gmm::gmm_error'
  what():  Error in getfem_assembling_tensors.cc, line 1818 : 
error: the convex 683 is not part of the mesh
Total number of variables : 8980

=====================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 134
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
APPLICATION TERMINATED WITH THE EXIT STRING: Aborted (signal 6)
####################################################################

I tried several versions of METIS, but it changes nothing. I saw that Daniele Bettella had the same problem (https://mail.gna.org/public/getfem-users/2008-07/msg00006.html), but the answer is not in the archives.
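In case it helps to rule out the METIS install itself, below is the kind of minimal standalone METIS 4 test I have in mind. This is only a sketch based on my assumptions: I have not checked which METIS entry point GetFEM actually calls, the file name metis_check.c is just something I made up, and the metis.h shipped with some packages may declare the API slightly differently.

/* Partition a 4-vertex cycle into 2 parts with the METIS 4 C API.
 * Build with something like:  cc metis_check.c -lmetis -o metis_check  */
#include <stdio.h>
#include <metis.h>                          /* METIS 4.x header (defines idxtype) */

int main(void)
{
  int nvtxs = 4;                            /* vertices 0..3 forming a cycle      */
  idxtype xadj[5]   = {0, 2, 4, 6, 8};      /* CSR adjacency structure            */
  idxtype adjncy[8] = {1, 3, 0, 2, 1, 3, 0, 2};
  int wgtflag = 0, numflag = 0;             /* no weights, C-style numbering      */
  int nparts  = 2;                          /* number of parts (must be >= 2)     */
  int options[5] = {0};                     /* options[0] = 0 -> METIS defaults   */
  int edgecut;
  idxtype part[4];

  METIS_PartGraphKway(&nvtxs, xadj, adjncy, NULL, NULL,
                      &wgtflag, &numflag, &nparts, options,
                      &edgecut, part);

  printf("edgecut = %d, partition = %d %d %d %d\n",
         edgecut, part[0], part[1], part[2], part[3]);
  return 0;
}

If a small test like this runs cleanly with nparts = 2, then the "Incorrect ncuts" message presumably comes from the way the library is being called rather than from the METIS build itself.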

Has someone already solved this problem? Any ideas?

Thank you in advance,

Regards,
Guerric
