POVray is a free ray tracer that is computationally intensive, which makes it a good demonstration application for a cluster. Packages are available for both MPI and PVM. The steps below cover only the MPI installation of POVray. Please read ahead before installing.
Instructions for Linux Beowulf Cluster
Serial version installation
Install POVray into /usr/local for a single local user, with no MPI or PVM. Download the package povlinux.tgz.
$ tar zxvf povlinux.tgz
$ cd povray31
$ ./install
$ mv /usr/local/lib/povray31 /usr/local/
$ cd /usr/local/povray31
$ cp povray.ini $HOME/.povrayrc
$ cp -R scenes $HOME
$ cd /usr/local/bin
$ ln -s x-povray povray
$ cd /etc
$ vi profile
Add the following into /etc/profile
export POVPATH=/usr/local/povray31
export POVINI=${HOME}/.povrayrc
export POVINC=${POVPATH}/include
Do the following
$ source /etc/profile
OR restart the computer at this point
$ cd $HOME/scenes/advanced/
$ povray -I[input.pov] -L${POVINC} -w640 -H480 +D
Note
- The -w switch sets the render width (in pixels)
- The -H switch sets the render height
- The +D switch tells povray to display the image while rendering
MPI Version Installation
Patch POVray31 for MPI usage by downloading povuni_s.tgz and mpi-povray-1.0.patch.gz.
$ cp povuni_s.tgz /usr/local/
$ cd /usr/local/
$ tar zxvf povuni_s.tgz
$ rm povuni_s.tgz
$ cp mpi-povray-1.0.patch.gz /usr/local/povray31
$ cd /usr/local/povray31
$ gunzip mpi-povray-1.0.patch.gz
$ cat mpi-povray-1.0.patch | patch -p1
$ cd /usr/local/povray31/source/mpi-unix
$ which mpicc
$ make
At this point, check that the directory containing mpicc is included in the PATH set in your /etc/profile and that PATH is exported as an environment variable. This should already be done if the xCat installation instructions were followed. Reboot the computer at this point if required. Also make sure that MPICH was compiled to use ssh, and that the MPI-POV patch is built against that ssh-enabled MPICH.
Note: if you are using IA64, remove the -m386 flag from the compile FLAGS in the makefile.
$ vi /etc/profile
Add the following into the /etc/profile file
export RSHCOMMAND="ssh"
Source the /etc/profile file OR reboot the machine.
$ cp /usr/local/povray31/source/mpi-unix/mpi-x-povray /usr/local/bin
$ cd $HOME/scenes/advanced
$ mpirun -np 6 /usr/local/bin/mpi-x-povray -Ichess2.pov +L${POVINC} -w640 -H480 +D
From here you should see the POVray output drawn on screen immediately. If you are looking for benchmarks, write a script that renders a few *.pov files and collects the aggregate timings; plots of these will show how the cluster is scaling.
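A minimal sketch of such a benchmark script, run from $HOME/scenes/advanced (the scene list, process counts and log file name are only placeholders):

#!/bin/sh
# Hypothetical benchmark sketch: render a few scenes at several process
# counts and append the wall-clock timings to bench.log.
SCENES="chess2.pov woodbox.pov"
for np in 2 4 6
do
  for scene in $SCENES
  do
    echo "np=$np scene=$scene" >> bench.log
    # +D is omitted so that the display does not skew the timings
    /usr/bin/time -p mpirun -np $np /usr/local/bin/mpi-x-povray \
        -I$scene +L${POVINC} -w640 -H480 2>> bench.log
  done
done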
Somehow the MPI version of POVray does not use the management node's processor during rendering; this is unlike the PVM version, as far as I can remember.
Remember to edit /usr/local/mpich/share/machines.LINUX to include all the cluster nodes to be used for computation. The syntax is "hostname:proc".
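For example, for three dual-processor compute nodes (the hostnames are placeholders):

node1:2
node2:2
node3:2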
Also note that the programs can be run as an ordinary user; they do not have to be run as root. To make maintenance of the user directories easier, I suggest exporting the following directories:
/usr /home /etc
Then you do not have to keep creating users on every child node; in my cluster, the same user exists on all the nodes.
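Assuming the directories are shared over NFS from the management node, a minimal /etc/exports sketch might look like the following (the 192.168.0.0/24 compute network and the mount options are assumptions for illustration):

/usr   192.168.0.0/255.255.255.0(ro,no_root_squash)
/home  192.168.0.0/255.255.255.0(rw,no_root_squash)
/etc   192.168.0.0/255.255.255.0(ro)

Run exportfs -ra (or restart the NFS server) after editing the file.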
Installation on MacOSX Tiger
The sources used here are found in the attachments. A DMG file with compiled versions of MPI-POV is also included.
Serial version installation
POVray 3.6.1 compiles cleanly from the official Unix sources. Use the normal steps (configure, make, make install) to get it running.
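A minimal sketch of those standard steps, assuming the 3.6.1 Unix source tarball is already unpacked and you are in its top-level directory (configure may need additional options, e.g. --prefix, for a non-default install location):

$ ./configure
$ make
$ sudo make install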
The steps below pertain to 3.1g. Obtain the Unix sources from the POVRay website.
In the source/unix/makefile include the new CFLAGS for OSX
# MacOSX Darwin8.x
CFLAGS = -O3 -c -ansi $(SRCINC) $(LIBPNGINC) $(ZLIBINC)
Comment out the other CFLAGS definitions.
Now, in source/libpng/makefile.std, make the following change to the CFLAGS:
# MacOSX Darwin8.x
CFLAGS=-I$(ZLIBINC) -O3 -c -ansi
Copy the makefile.std into source/libpng/makefile
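For example, run from the top of the povray31 source tree:

$ cp source/libpng/makefile.std source/libpng/makefile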
Finally, in source/zlib/Makefile, do the same for the CFLAGS:
# MacOSX Darwin8.x
CFLAGS = -O3 -c -ansi
To COMPILE,
- Make the directory that you want to install POVray into:

$ sudo mkdir -p /usr/local/povray31g

- Configure and make the zlib component:

$ cd ~/temp/povray31/source/zlib
$ ./configure --prefix=/usr/local/povray31g
$ make
$ sudo make install

- Compile the libpng component:

$ cd source/libpng
# make sure that your makefile is there
# edit the makefile to change the PREFIX to your install location
# PREFIX=/usr/local/povray31g
$ make
$ sudo make install

- Compile the POVRay component:

$ cd source/unix
# edit the makefile to customize your install location
# POVLIBDIR = /usr/local/povray31g/lib
# POVPATH = /usr/local/povray31g
$ make newunix
# NOTE: at this stage, I used a GCC 4.0.2 build that is NOT the one provided by the Xcode tools
$ sudo mkdir -p /usr/local/povray31g/bin
$ sudo cp x-povray /usr/local/povray31g/bin/
$ sudo mkdir -p /usr/local/povray31g/etc
$ sudo cp povrayrc /usr/local/povray31g/etc/
One minor change is worth noting. Instead of copying the compiled x-povray binary into the /usr/local/povray31g/bin directory by hand, I did a
$ sudo make install
However, make will complain that the man directory is missing, so you just have to create it:
$ sudo mkdir -p /usr/local/povray31g/man/man1
From there, remember the povrayrc file? I copied that to my home directory:
$ cp /usr/local/povray31g/etc/povrayrc ~/.povrayrc
Then I added the following Library_Path entries to the RC file:
Library_Path=/usr/local/povray31g/lib ;; (+Ldir)
Library_Path=/usr/local/povray31g/include
Library_Path=/usr/local/povray31g/src/povray31/include
However, do note that we do not have any example files at this point, so I grabbed them from the povlinux.tgz package on the official POVRAY website and placed the povray31 sources under /usr/local/povray31g/src/ (so the include files end up in /usr/local/povray31g/src/povray31/include). Once this is done, I could run POVRay.
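One way to do that, assuming povlinux.tgz unpacks into a povray31/ directory as it does on Linux:

$ tar zxvf povlinux.tgz
$ sudo mkdir -p /usr/local/povray31g/src
$ sudo cp -R povray31 /usr/local/povray31g/src/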
Another note concerns the way to specify the POVRAY file for rendering. I initially had problems as described below, but found the solution.
Problem: povray doesn’t run, gives error:
Never found section in file //. INI file error. Bad option syntax or error opening .INI/.DEF file 'Persistence'.
Solution: It turns out you need to specify the .pov file with the ‘I’ option, e.g. +Iwaves.pov or -Iwaves.pov, otherwise povray tries to interpret the POV file as an .ini file.
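For example, with the Tiger install above and the scene files copied into /usr/local/povray31g/src/povray31 as described earlier:

$ /usr/local/povray31g/bin/x-povray +I/usr/local/povray31g/src/povray31/scenes/advanced/woodbox.pov +D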
MPI Version Installation
Note: there is NO MPI version for POVray 3.6.1! You still need 3.1g in order to build the MPI version of POVRAY.
Get the patches for MPI povray at http://www.verrall.demon.co.uk/mpipov/
The instructions from the website are as follows:
- Download the patch (mpi-povray-1.0.patch.gz).
- Unpack the source: tar xvfz povuni_s.tgz
- Apply the patch: cd povray31; gzip -dc mpi-povray*.patch.gz | patch -p1
- cd povray31/source/mpi-unix
- Monkey with the Makefile to get the right options for your platform and MPI implementation. (MPICH provides mpicc which makes life easier.)
- Build a binary: make newxwin
- Run it: mpirun -np ? ./mpi-x-povray [options]
Note: you need to include the CFLAGS described earlier in the makefile in the source/mpi-unix directory. Also note that I needed to change a few things in render.c, povray.c, optin.c and optout.c in order to make things compile cleanly; mainly it is changing the include file path from
mpipov.h to mpipov/mpipov.h
Line 62 in render.h was also commented out (it is something about a duplicate declaration of an extern DBL maxclr).
The GCC used must be the same one that was used to compile your earlier successful serial version; otherwise you will get the bus error again.
I got around this by changing my MPICC settings using
CC=mpicc -cc=/usr/local/gcc-4.0.2/bin/gcc
Testing
Once you have the MPI-based POVRAY compiled, you will have a binary called mpi-x-povray. Copy it into your path and recheck the MPI settings.
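For example, using the same install prefix as the serial build:

$ sudo cp mpi-x-povray /usr/local/povray31g/bin/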
- Go to an X11 window.
- Copy the /usr/local/povray31g/etc/povrayrc file to ~/.povrayrc
- Edit the ~/.povrayrc file to suit your needs. Pay special attention to the library paths.
Once that is done, use an xterm from X11 and fire up mpi-povray. My test command was
$ mpirun -np 2 /usr/local/povray31g/bin/mpi-x-povray +I/usr/local/povray31g/src/povray31/scenes/advanced/woodbox.pov +D
With this you can clearly see the advantages of parallel processing.