help-octave

Parallel octave - working fine? (fwd)


From: Brian Dushaw
Subject: Parallel octave - working fine? (fwd)
Date: Sat, 16 Aug 2003 03:23:55 -0500

(I tried to send this to the parallel-octave author at
address@hidden, but it bounced back saying no such
person exists there - perhaps someone on this mailing list
has the answer to this question...)

The parallel octave toolbox is at:
http://www.aoki.ecei.tohoku.ac.jp/octave/


---------- Forwarded message ----------
Subject: Parallel octave - working fine?

Hello,
   I have been poking around with octave on our cluster computer lately, and
I ran across your fine system for implementing communications between 
master/slave
computers running octave.  This has compiled and run for me just fine - I use 
octave 2.1.35 (Redhat linux 7.3) and octave-parallel_0.7.2 (0.7.3 would not
compile using octave 2.1.35).
   To get my feet wet with parallel octave, I thought I would write a script
to mimic the pi.f test code that comes with MPI.  I've attached this test
code; you may find it useful as an example for people.
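
For readers without the attachment: the quadrature that the MPI pi.f example
(and hence a test like this) is based on is the midpoint rule applied to the
integral of 4/(1+x^2) over [0,1].  A minimal serial sketch in octave (NMAX and
the variable names here are illustrative, not necessarily those in testpi.m):

```octave
% Serial midpoint-rule estimate of pi: integrate 4/(1+x^2) over [0,1].
% This is the same quadrature the MPI pi.f example uses.
NMAX = 100000;             % number of subintervals (illustrative)
h = 1 / NMAX;              % width of each subinterval
x = ((1:NMAX) - 0.5) * h;  % midpoints of the subintervals
mypi = h * sum(4 ./ (1 + x.^2));
printf("estimate of pi = %.10f\n", mypi);
```

Vectorizing the sum this way, rather than looping over i, is also the
idiomatic (and much faster) way to write it in octave.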

   I am also writing because it sure looks like there is a hang-up somewhere
in the communications - the single-processor part of the script takes much
less time to compute pi than the parallel version that uses 6 computers.  I
don't understand this - can you point to a way that I might redesign this
code to get the desired speed-up in computation?  The parallel version takes
6 times LONGER than the single-processor version, and making NMAX very large
does not give the parallel version any advantage.


  I've set up my startup script, .octaverc, to have the following two lines:
myid=0;
hosts=["n50";"n51";"n52";"n53";"n54";"n55"];
where myid is 1, 2, 3, 4, or 5 for the slave nodes.  The "hosts" variable is
there so that "sockets = connect(hosts);" can pick it up when opening the
connections.  (These are the names of the 6 computers in our cluster, of
course.)

  Thanks again for parallel octave!  Just what I was looking for!

B.D.

Attachment: testpi.m
Description: Text document

