Build SalomeMeca2024 MPI
------------------------
J. Cugnoni, 27.7.2025

1) Download the sif image from Code-aster.org

2) Create a sandbox to edit the SIF image content:

singularity build --sandbox SalomeMeca2024_custom salome_meca-lgpl-2024.1.0-1-20240327-scibian-11.sif

3) Enter a writable environment within the sandbox:

singularity shell --writable SalomeMeca2024_custom

4) Follow the guide here:
https://gitlab.com/codeaster-opensource-documentation/opensource-installation-development/-/blob/main/devel/compile.md

# in short:
cd /opt/
mkdir codeaster
cd /opt/codeaster
git clone https://gitlab.com/codeaster/src.git
git clone https://gitlab.com/codeaster/devtools.git

# build (all prerequisites are already in the container)
cd src
./waf configure
./waf install -j 8
./waf install test -n zzzz506c
# check the test output to make sure that the job ran properly with MPI

# add the custom MPI version to the list of available installs
echo "vers : testing_mpi:/opt/codeaster/install/mpi/share/aster" >> /opt/salome_meca/V2024.1.0_scibian_univ/tools/Code_aster_frontend-202410/etc/codeaster/aster

# fix for the wrong as_run version being used for the testing_mpi version => maybe another solution would be better, but it is hard to find how to modify the final SalomeMeca environment variables properly
mv /usr/local/bin/as_run /usr/local/bin/as_run_23
ln -s /usr/local/bin/as_run /opt/salome_meca/V2024.1.0_scibian_univ/tools/Code_aster_frontend-202410/bin/as_run

# fix run_aster to run in MPI mode under the SalomeMeca AsterStudy environment: after many debugging steps, I found that the "OMPI_xxxx" variables set when running an MPI job from AsterStudy cause mpiexec to fail...
# also the wrong version of as_run is used (the 2023 one coming from SalomeMeca) and it fails to identify the correct path to the testing_mpi Code_Aster install => need to prepend PATH with /usr/local/bin to find the right as_run
# PATCH: to fix this, we need to modify run_aster_main.py in /opt/codeaster/install/mpi/lib/aster/run_aster/run_aster_main.py
# at line ~476: comment out "proc = run(cmd, shell=True, check=False)" and add the following lines:
#   cmdpfx = "lst=`env | grep OMPI_ | cut -d = -f 1`; for item in $lst; do echo 'unset ' $item; unset $item; done; export PATH=/usr/local/bin:$PATH; "
#   proc = run(cmdpfx+cmd, shell=True, check=False, capture_output=False)

# thanks to ChatGPT, here is a sed command to automate this change (complicated...):
cd /opt/codeaster/install/mpi/lib/aster/run_aster
sed -i -E '/^[[:space:]]*proc = run\(cmd, shell=True, check=False\)/ {
s/^([[:space:]]*)proc = run\(cmd, shell=True, check=False\)/\1cmdpfx = "lst=`env | grep OMPI_ | cut -d = -f 1`; for item in \$lst; do echo '\''unset '\'' \$item; unset \$item; done; export PATH=\/usr\/local\/bin:\$PATH; "\
\
\1proc = run(cmdpfx+cmd, shell=True, check=False, capture_output=False)/
}' run_aster_main.py

5) Leave the sandbox & build the new image:

exit
singularity build SalomeMeca2024_custom.sif SalomeMeca2024_custom
# delete the sandbox files (only when you are sure all is good...; if not, you can still modify the image with: singularity shell --writable SalomeMeca2024_custom)
rm -rf SalomeMeca2024_custom

6) Install the launcher & test:

singularity run --app install SalomeMeca2024_custom.sif
./SalomeMeca2024_custom
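Optionally, the sandbox edits of steps 2-5 could be scripted so the custom image is rebuilt in one pass. The following Apptainer/Singularity definition file (hypothetical name SalomeMeca2024_custom.def) is an untested sketch that reuses the same paths and commands as above and assumes the build works non-interactively in %post; the as_run and run_aster_main.py patches of step 4 would still have to be appended there (omitted here for brevity). Build it with: singularity build SalomeMeca2024_custom.sif SalomeMeca2024_custom.def

# SalomeMeca2024_custom.def -- sketch only, not tested
Bootstrap: localimage
From: salome_meca-lgpl-2024.1.0-1-20240327-scibian-11.sif

%post
    # same build steps as in step 4 above
    mkdir -p /opt/codeaster
    cd /opt/codeaster
    git clone https://gitlab.com/codeaster/src.git
    git clone https://gitlab.com/codeaster/devtools.git
    cd src
    ./waf configure
    ./waf install -j 8
    # register the new install for the SalomeMeca frontend
    echo "vers : testing_mpi:/opt/codeaster/install/mpi/share/aster" >> /opt/salome_meca/V2024.1.0_scibian_univ/tools/Code_aster_frontend-202410/etc/codeaster/aster
    # ...add the as_run fix and the run_aster_main.py sed patch from step 4 here...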
Fixes / debugging steps:

- In case of an ERROR related to the user name not being specified, delete the folders ~/.astkrc and ~/.asterstudy, delete the pertinent Salome 2024 subfolders and files in ~/.config/aster/, and also delete /tmp/salome*

- The MPI version runs correctly from a shell session (select the MPI version & number of MPI CPUs in AsterStudy, export the case, and run as_run on myStudy.export from the case directory), like so:

singularity shell SalomeMeca2024_custom
cd caseDirectory
/opt/codeaster/install/mpi/bin/run_aster RunCase.export

- The MPI version with mpi_nb_cpus=1 works fine from AsterStudy directly, but not with mpi_nb_cpus > 1 => issue with the run script assuming a batch system, I guess...

# main issue discovered with the MPI version:
- Tried to trace / debug the AsterStudy code and all seems fine there: the "ismpi" test is ok, the "isremote" test as well, and the job configuration, export files, launch script etc. are all ok.
- However, the MPI version does not start correctly when run from the Salome environment (AsterStudy, but also from any shell with the Salome Meca environment variables set).
- But even from a Salome shell, it runs well when launched as "env -i /opt/codeaster/install/mpi/bin/run_aster export", which resets the environment variables to defaults.
- The solver runs well when launched from a session started by "singularity shell ImageName"; this shell has only a basic set of environment variables, which supports the previous finding.
- Tried to find which environment variable is breaking the MPI execution of the Code_Aster testing version (nb_mpi_cpus=1 works... but not >1), but could not find the culprit.
- Next test: patch the run_aster script to clean the environment before execution.
- Finally found one issue: the wrong version of as_run was used in the launcher script due to PATH; fix = prepend PATH with /usr/local/bin to use the new as_run.
- Second issue (critical, in fact): mpiexec fails without any message, with just error code 1, when run from the AsterStudy launcher environment; fix = unset all OMPI_xxxx environment variables.
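For manual runs from a Salome shell (i.e. without relying on the run_aster_main.py patch), the same two fixes can also be applied by a small wrapper script around run_aster. This is a sketch with a hypothetical name run_aster_clean.sh; it only repeats what the cmdpfx patch above does:

#!/bin/bash
# run_aster_clean.sh -- sketch: strip the OMPI_* variables inherited from the
# SalomeMeca/AsterStudy environment, prefer the new as_run in /usr/local/bin,
# then hand over to the MPI build of run_aster
for var in $(env | grep '^OMPI_' | cut -d= -f1); do
    unset "$var"
done
export PATH=/usr/local/bin:$PATH
exec /opt/codeaster/install/mpi/bin/run_aster "$@"

Usage (from the case directory): ./run_aster_clean.sh RunCase.export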