Code Saturne build

10 years 10 months ago #5003 by Torben
Code Saturne build was created by Torben
Hello. I have decided to manually build Code_Saturne 1.3.3 in order to get the MPI functionality. Now the GUI and basic functionality seem to work, and calculations can be added and executed. But ...

The chr.med file and other result files remain in the tmp_Saturne folder instead of being moved to the study folder, as I would expect them to be.

Any hints on what could be missing? I enclose the resume file.

Post edited by: Torben, at: 2010/12/01 13:26
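As a workaround, the files can be copied back into the study by hand. A minimal sketch (demo directories stand in for the real paths, e.g. tmp_Saturne/X.CAS1.12011213 and X/CAS1/RESU; adjust to your own layout):

```shell
# Sketch: copy result files from the scratch run directory back into the
# case's RESU directory by hand. Throwaway demo directories are used here;
# substitute the real paths (these are placeholders, not verified), e.g.
#   TMP_RUN=$HOME/tmp_Saturne/X.CAS1.12011213
#   RESU=$HOME/X/CAS1/RESU
TMP_RUN=$(mktemp -d)
RESU=$(mktemp -d)
touch "$TMP_RUN/chr.med" "$TMP_RUN/listing"   # stand-ins for real result files

for f in "$TMP_RUN"/chr.med "$TMP_RUN"/listing*; do
    [ -e "$f" ] && cp "$f" "$RESU/"
done
ls "$RESU"
```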

10 years 10 months ago #5004 by Torben
Replied by Torben on topic Re:Code Saturne build
Sorry, apparently I didn't enclose the resume file, so here it is listed:

[code:1]========================================================
STARTING TIME : 12011213
CS_HOME : /home/code_saturne/Noyau/ncs-1.3.3
ECS_HOME : /home/code_saturne/Enveloppe/ecs-1.3.3
#============================================================================
#
# Code_Saturne version 1.3
#
#
#
# This file is part of the Code_Saturne Kernel, element of the
# Code_Saturne CFD tool.
#
# Copyright (C) 1998-2008 EDF S.A., France
#
# contact: (email address hidden by the forum's spam protection)
#
# The Code_Saturne Kernel is free software; you can redistribute it
# and/or modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2 of
# the License, or (at your option) any later version.
#
# The Code_Saturne Kernel is distributed in the hope that it will be
# useful, but WITHOUT ANY WARRANTY; without even the implied warranty
# of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with the Code_Saturne Kernel; if not, write to the
# Free Software Foundation, Inc.,
# 51 Franklin St, Fifth Floor,
# Boston, MA 02110-1301 USA
#
#============================================================================
#
# Architecture
NOM_ARCH=`uname -s`
if [ "$NOM_ARCH" = "Linux" ] ; then
  if [ "`hostname | cut -c1-6`" = "tantal" ] ; then
    NOM_ARCH=Linux_CCRT
  elif [ "`uname -m`" = "ia64" ] ; then
    NOM_ARCH=Linux_IA64
  elif [ "`domainname 2>/dev/null`" = "cluster-chatou" ] ; then
    NOM_ARCH=Linux_Ch
  elif [ -d /bgl/BlueLight/ppcfloor ] ; then
    NOM_ARCH=Blue_Gene_L
  elif [ -d /bgsys/drivers/ppcfloor ] ; then
    NOM_ARCH=Blue_Gene_P
  else
    MACHINE=`uname -m`
    case "$MACHINE" in
      *86)    NOM_ARCH=Linux ;;
      x86_64) NOM_ARCH=Linux_x86_64 ;;
      ia64)   NOM_ARCH=Linux_IA64 ;;
      *)      NOM_ARCH=Linux_$MACHINE ;;
    esac
  fi
fi

# Code_Saturne version

if [ "$NOM_ARCH" = "Linux_CCRT" -o "$NOM_ARCH" = "Linux_IA64" ] ; then
  CS_ROOT=/home/cont002/saturne
elif [ "$NOM_ARCH" = "Blue_Gene_L" ] ; then
  CS_ROOT=/gpfs2/home/saturne
elif [ "$NOM_ARCH" = "Blue_Gene_P" ] ; then
  CS_ROOT=/gpfs/home/saturne
else
  CS_ROOT=/home/saturne
fi
CS_HOME=/home/code_saturne/Noyau/ncs-1.3.3
ECS_HOME=/home/code_saturne/Enveloppe/ecs-1.3.3
CSGUI_HOME=/home/code_saturne/Interface/ics-1.3.3
SYRCS_HOME=/home/code_saturne/opt/syr_cs-2.1.1
#
if [ "$NOM_ARCH" = "Linux" -a -d /home/prevalcs/HOMARD ] ; then
  CSHOMARD_HOME=/home/prevalcs/HOMARD
else
  CSHOMARD_HOME=
fi
#
# Path
PATH=$CS_HOME/bin:$ECS_HOME/bin:$PATH

# Libraries for the GUI (if necessary)
if [ -d ${CS_ROOT}/opt/libxml2-2.6.19/arch/${NOM_ARCH}/lib ] ; then
  LD_LIBRARY_PATH=$CS_ROOT/opt/libxml2-2.6.19/arch/$NOM_ARCH/lib:$LD_LIBRARY_PATH
fi

# Export variables
export NOM_ARCH CS_ROOT CS_HOME ECS_HOME CSGUI_HOME
export SYRCS_HOME CSHOMARD_HOME
export PATH LD_LIBRARY_PATH

# Paths and libraries for MPI
if [ "$NOM_ARCH" = "Linux" -o "$NOM_ARCH" = "Linux_x86_64" ] ; then
  CS_MPI_PATH=None/bin
elif [ "$NOM_ARCH" = "Linux_Ch" ] ; then
  CS_MPI_PATH=None/bin
else
  CS_MPI_PATH=None/bin
fi
export CS_MPI_PATH
USER : tpa
ARCHITECTURE : Linux
========================================================
MACHINE :
Linux localhost 2.6.32-21-generic #32-Ubuntu SMP Fri Apr 16 08:10:02 UTC 2010 i686 GNU/Linux
N PROCS : 1
PROCESSORS : default
========================================================
CASE : CAS1
CONFIG. :
DATA : /home/tpa/X/CAS1/DATA
FORT : /home/tpa/X/CAS1/FORT
RESU : /home/tpa/X/CAS1/RESU
REP. RUN : /home/tpa/tmp_Saturne/X.CAS1.12011213
EXECUTABLE : cs13.exe
LIB :
f COMPILER :
c COMPILER :
f OPTIONS :
End date : 12011213
========================================================[/code:1]

10 years 10 months ago #5005 by Claus
Replied by Claus on topic Re:Code Saturne build
Just a note: I use 2.0RC2 and can use MPI just fine, on Ubuntu 10.04 that is.

Can you attach your listing file instead? That will show whether the calculation was terminated prematurely.
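Also, the resume shows CS_MPI_PATH=None/bin, which suggests no MPI installation was found when the kernel was configured. A quick sketch of a sanity check before reconfiguring (the tool names assume an OpenMPI install):

```shell
# Sketch (tool names assume an OpenMPI install): report whether an MPI
# launcher and compiler wrapper are visible on the PATH. If they are not,
# the configure step will typically fall back to a build without MPI.
check_mpi_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: found at $(command -v "$1")"
    else
        echo "$1: NOT found"
    fi
}
check_mpi_tool mpirun
check_mpi_tool mpicc
```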

Regards,

Claus

Code_Aster release : STA11.4 on OpenSUSE 12.3 64 bits - EDF/Intel version

10 years 10 months ago #5006 by Torben
Replied by Torben on topic Re:Code Saturne build
So I recompiled Code_Saturne, and that seems to have fixed the problem. OpenMPI also runs very nicely. So far I have not measured the gain in performance, but thank you for your time.
By the way, it seems that Code_Aster as delivered with Salome-MECA 2010.2 uses both CPUs, apparently independently of OpenMPI, so maybe it uses a different technology.

10 years 10 months ago #5007 by Claus
Replied by Claus on topic Re:Code Saturne build
In some cases it's close to 100%(!) increase in speed here when using MPI in CS.

About the Aster bundled with MECA: yes, I've noticed that as well. It is some trick that the Intel compiler does, but the version I've compiled myself using the Intel compilers does not use the cores as efficiently. I'm really curious what makes the Salome-MECA version run so efficiently.

Glad you got CS working :)

Regards,

Claus

Code_Aster release : STA11.4 on OpenSUSE 12.3 64 bits - EDF/Intel version
