From: yoshi
Subject: Trouble with MPITB for MatlabR2007b + lam/mpi7.1.4+ icc/ifort10.0.026+openSUSE10.2
Date: Sun, 24 Feb 2008 18:31:57 -0800 (PST)

Dear Sirs,

I am a beginner with MPITB.

Sorry if my question is silly.

I am having trouble with MPITB for Matlab (MPITB-FC3-R14SP1-LAM711.gz).

Last week I installed icc/ifort 10.0.026 and LAM/MPI 7.1.4 on openSUSE 10.2
on a quad-core Xeon machine. I used LAM 7.1.4 because I could not find
LAM 7.1.1. The LAM 7.1.4 test suite (lamtests) and the himenoBMT benchmark
both ran fine.

Because none of the MPI_*.mexglx files under
$(MATLAB)/toolbox/mpitb/mpi/MEX would work, I ran the Makefile with the
extension changed from mexglx to mexa64, successfully built the
MPI_*.mexa64 files, and put them into $(MATLAB)/toolbox/mpitb/mpi/MEX.
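
For reference, a single binding can also be rebuilt by hand from the Matlab
prompt along the lines of the sketch below; the source file name and the LAM
include/library paths are assumptions for a typical install and need
adjusting to the real layout:

  % Minimal sketch: rebuild one MPITB binding as a 64-bit MEX file.
  % MPI_Init.c and the /usr/local/lam paths are placeholders; point them
  % at the actual MPITB sources and LAM installation.
  mex -O MPI_Init.c -I/usr/local/lam/include ...
      -L/usr/local/lam/lib -lmpi -llam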

I am trying to run the TUTORIAL but cannot get it to work.

MPI_COMM_WORLD, MPI_COMM_SELF, and MPI_COMM_NULL give trouble, even though
MPI_COMM_WORLD.mexa64, MPI_COMM_SELF.mexa64, and MPI_COMM_NULL.mexa64 were
all compiled and built successfully.
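
One quick check along these lines (only the function names are taken from
above; which -all is standard Matlab) shows whether Matlab is actually
resolving the new 64-bit files, since a stale .mexglx copy earlier on the
path would shadow them:

  % List every file on the path that satisfies each name; an old
  % 32-bit copy or wrong directory ordering would show up here.
  which -all MPI_COMM_WORLD
  which -all MPI_COMM_SELF
  which -all MPI_COMM_NULL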

What else should I do to use MPITB with Matlab?

The output from TUTORIAL is below:

Set SSI rpi to tcp with the command:
  putenv(['LAM_MPI_SSI_rpi=tcp']), MPI_Init
Help on MPI: help mpi
>> [info flag]=MPI_Initialized             % info=0, flag=0

info =

     0


flag =

     0

>> MPI_Init

ans =

     0

>> [info flag]=MPI_Initialized             % flag=1

info =

     0


flag =

     1

>>
......................................
>> [info rank]=MPI_Comm_rank(MPI_COMM_WORLD)
??? Error using ==> MPI_COMM_WORLD
Out of memory. Type HELP MEMORY for your options.

>> [info size]=MPI_Comm_size(MPI_COMM_SELF)
??? Error using ==> MPI_COMM_SELF
Out of memory. Type HELP MEMORY for your options.

>>
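
If it helps, the full error identifier hidden behind the "Out of memory"
message can be captured with a sketch like this (using the lasterror API of
that Matlab era; the idea is only to see whether the MEX file fails to load
rather than memory really running out):

  % Trap the failing call and print the complete error information.
  try
    MPI_COMM_WORLD;
  catch
    err = lasterror;
    disp(err.identifier)
    disp(err.message)
  end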

-- 
View this message in context: 
http://www.nabble.com/Trouble-with-MPITB-for-MatlabR2007b-%2B-lam-mpi7.1.4%2B-icc-ifort10.0.026%2BopenSUSE10.2-tp15672952p15672952.html
Sent from the Octave - General mailing list archive at Nabble.com.


