Programming/openMPI | 2017. 2. 5. 12:39

Hmm... first I should try to figure out how this thing is structured.

There is no mpich2 package on the odroid side... maybe because the version is different?

Anyway, mpiexec and mpirun are both symlinked to the same executable:

lrwxrwxrwx 1 root root 24  2월  5 12:43 /usr/bin/mpirun -> /etc/alternatives/mpirun*

lrwxrwxrwx 1 root root 25  2월  5 12:43 /usr/bin/mpiexec -> /etc/alternatives/mpiexec*

lrwxrwxrwx 1 root root 21  2월  5 12:43 /etc/alternatives/mpirun -> /usr/bin/mpirun.mpich*

lrwxrwxrwx 1 root root 22  2월  5 12:43 /etc/alternatives/mpiexec -> /usr/bin/mpiexec.mpich* 

lrwxrwxrwx 1 root root 13  2월  7  2016 /usr/bin/mpirun.mpich -> mpiexec.hydra*

lrwxrwxrwx 1 root root 13  2월  7  2016 /usr/bin/mpiexec.mpich -> mpiexec.hydra*
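
On Debian/Ubuntu those /etc/alternatives links are managed by the alternatives system, so the chain above can also be confirmed like this (just a quick sanity check, no options beyond --display needed):

$ update-alternatives --display mpirun
$ update-alternatives --display mpiexec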


$ mpiexec --help


Usage: ./mpiexec [global opts] [local opts for exec1] [exec1] [exec1 args] : [local opts for exec2] [exec2] [exec2 args] : ...


Global options (passed to all executables):


  Global environment options:

    -genv {name} {value}             environment variable name and value

    -genvlist {env1,env2,...}        environment variable list to pass

    -genvnone                        do not pass any environment variables

    -genvall                         pass all environment variables not managed

                                          by the launcher (default)


  Other global options:

    -f {name}                        file containing the host names 
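
Of the global options above, -genv looks handy for pushing an environment variable to every rank. A minimal sketch based on the help text (OMP_NUM_THREADS and ./prog are just placeholders):

$ mpiexec -genv OMP_NUM_THREADS 2 -n 4 ./prog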


The host file is written like below; I wonder if name resolution via winbind would work here too?

ub3:4  # this will spawn 4 processes on ub3

ub2:2  # this will spawn 2 processes on ub2

ub1    # this will spawn 1 process on ub1

ub0    # this will spawn 1 process on ub0 
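
So launching against that file would look something like this (the file name hostfile and the binary name are my assumptions; -f is the option documented above, and -n 8 matches the 4+2+1+1 slots listed):

$ mpiexec -f hostfile -n 8 ./my_mpi_program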


[link: https://help.ubuntu.com/community/MpichCluster]

[link: http://www.brianjp93.com/blog/building-a-beowulf-cluster-ubuntu-1404-server-lts/]

[link: https://www.digitalocean.com/.../how-to-create-a-beowulf-cluster-using-ubuntu-12-04-vps-instances]

[link: http://techtinkering.com/2009/12/02/setting-up-a-beowulf-cluster-using-open-mpi-on-linux/]

Skimming through the setup guides...

The machine that runs the build becomes the host node; it creates a directory shared over NFS, and the worker nodes mount that directory (rough sketch below).
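
A minimal sketch of that NFS share, assuming ub0 is the host node and /mirror is the shared path (both names come from the hostfile above and the linked guides, not from my box):

# on the host node (ub0): export the shared directory
$ echo '/mirror *(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
$ sudo exportfs -ra

# on each worker node: mount it at the same path
$ sudo mount ub0:/mirror /mirror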

It also seems you have to set up SSH keys so that logins work without a password, something like this (user name mpiuser and host ub1 are placeholders):
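
$ ssh-keygen -t rsa                 # generate a key pair; leave the passphrase empty
$ ssh-copy-id mpiuser@ub1           # install the public key on each worker node
$ ssh mpiuser@ub1 hostname          # should now run without a password prompt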

Since this stuff has been around forever... would rlogin or telnet work too? Judging from the launcher list in the -info output below, rsh is available, but telnet is not.


$ mpiexec -info

HYDRA build details:

    Version:                                 3.2

    Release Date:                            Wed Nov 11 22:06:48 CST 2015

    CC:                              gcc   -Wl,-Bsymbolic-functions -Wl,-z,relro 

    CXX:                             g++   -Wl,-Bsymbolic-functions -Wl,-z,relro 

    F77:                             gfortran  -Wl,-Bsymbolic-functions -Wl,-z,relro 

    F90:                             gfortran  -Wl,-Bsymbolic-functions -Wl,-z,relro 

    Configure options:                       '--disable-option-checking' '--prefix=/usr' '--build=arm-linux-gnueabihf' '--includedir=${prefix}/include' '--mandir=${prefix}/share/man' '--infodir=${prefix}/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--disable-silent-rules' '--libdir=${prefix}/lib/arm-linux-gnueabihf' '--libexecdir=${prefix}/lib/arm-linux-gnueabihf' '--disable-maintainer-mode' '--disable-dependency-tracking' '--enable-shared' '--enable-fortran=all' '--disable-rpath' '--disable-wrapper-rpath' '--sysconfdir=/etc/mpich' '--libdir=/usr/lib/arm-linux-gnueabihf' '--includedir=/usr/include/mpich' '--docdir=/usr/share/doc/mpich' '--with-hwloc-prefix=system' 'CPPFLAGS= -Wdate-time -D_FORTIFY_SOURCE=2 -I/build/mpich-52KREu/mpich-3.2/src/mpl/include -I/build/mpich-52KREu/mpich-3.2/src/mpl/include -I/build/mpich-52KREu/mpich-3.2/src/openpa/src -I/build/mpich-52KREu/mpich-3.2/src/openpa/src -D_REENTRANT -I/build/mpich-52KREu/mpich-3.2/src/mpi/romio/include' 'CFLAGS= -g -O2 -fstack-protector-strong -Wformat -Werror=format-security -O2' 'CXXFLAGS= -g -O2 -fstack-protector-strong -Wformat -Werror=format-security -O2' 'FFLAGS= -g -O2 -fstack-protector-strong -O2' 'FCFLAGS= -g -O2 -fstack-protector-strong -O2' 'build_alias=arm-linux-gnueabihf' 'MPICHLIB_CFLAGS=-g -O2 -fstack-protector-strong -Wformat -Werror=format-security' 'MPICHLIB_CPPFLAGS=-Wdate-time -D_FORTIFY_SOURCE=2' 'MPICHLIB_CXXFLAGS=-g -O2 -fstack-protector-strong -Wformat -Werror=format-security' 'MPICHLIB_FFLAGS=-g -O2 -fstack-protector-strong' 'MPICHLIB_FCFLAGS=-g -O2 -fstack-protector-strong' 'LDFLAGS=-Wl,-Bsymbolic-functions -Wl,-z,relro' 'FC=gfortran' 'F77=gfortran' 'MPILIBNAME=mpich' '--cache-file=/dev/null' '--srcdir=.' 'CC=gcc' 'LIBS=-lpthread '

    Process Manager:                         pmi

    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist

    Topology libraries available:            hwloc

    Resource management kernels available:   user slurm ll lsf sge pbs cobalt

    Checkpointing libraries available:       blcr

    Demux engines available:                 poll select 
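
With all of that in place, a quick smoke test might look like this (hello.c being any standard MPI hello-world; mpicc is the compiler wrapper shipped with the mpich package):

$ mpicc -o hello hello.c
$ mpiexec -f hostfile -n 4 ./hello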

