AsterO'dactyle
Hello all,
I have some good news and some bad news.
I managed to compile a stable MPI version of CA 16.7 inside the container salome_meca-lgpl-2024.1.0-1-20240327.
The bad news is: I only managed to do this WITHOUT Metis and without PETSc. I can live without PETSc, but without Metis an MPI version does NOT make much sense, now does it?
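For context, my understanding is that the mesh partitioner is selected at read time, so a Metis-free parallel build might still be usable through PT-SCOTCH. Below is a minimal, untested sketch of what I would try; it assumes the v16 Python command-file syntax and the PARTITIONNEUR keyword of LIRE_MAILLAGE (please check the LIRE_MAILLAGE documentation for the exact options):

# Minimal sketch, assuming the code_aster v16 Python command-file syntax
# and the PARTITIONNEUR keyword of LIRE_MAILLAGE; 'PTSCOTCH' should avoid
# any dependency on (Par)Metis.
from code_aster.Commands import *
from code_aster import CA

CA.init()

# Read the MED mesh and partition it across the MPI processes with PT-SCOTCH.
mesh = LIRE_MAILLAGE(UNITE=20, FORMAT="MED", PARTITIONNEUR="PTSCOTCH")

CA.close()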
However, I noticed the following: even the stable 16.7 shipped in this container has no Metis. Why is that? Is this intentional? I suppose so. The startup output of this stable 16.7 looks like this (why does it show 'testing' in the first line?):
-- CODE_ASTER -- VERSION : STABILIZED DEVELOPMENT (testing) --
Version 16.7.0 modified on 10/12/2024
revision 5f2fdcddabc5 - branch 'HEAD'
Copyright EDF R&D 1991 - 2025
Run on : Fri Jan 10 10:43:02 2025
Machine name : HP-Z8-G4-Workstation
Architecture : 64bit
Processor type : x86_64
Operating system : Linux-6.8.0-51-generic-x86_64-with-glibc2.31
Message language : fr (UTF-8)
Python version : 3.9.2
NumPy version : 1.19.5
MPI parallelism : inactive
OpenMP parallelism : active
Number of OpenMP processes used : 1
HDF5 library version : 1.10.3
MED library version : 4.1.1
MFront library version : 4.1.0
MUMPS library version : 5.5.1c
PETSc library : not available
SCOTCH library version : 7.0.1
My questions for the moment (I have not done a test run yet):
- Does this MPI version even make sense without Metis? Will Scotch work (it never worked in any of my CA builds in the past; cf. the PT-SCOTCH sketch above)?
- I downloaded the prerequisites archive codeaster-prerequisites-20240327-oss.tar.gz; it contains a Metis tarball, metis-5.1.0.tar.gz, but I cannot untar it: I get an error along the lines of 'not an archive'. So I cannot include Metis here (see the small check after this list).
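About that 'not an archive' error, here is a small stdlib-only check to see whether the tarball is actually a gzip file or is truncated or mis-downloaded (a sketch; the path is simply where the file sits for me):

# Stdlib-only sanity check for the Metis tarball: missing gzip magic bytes
# mean the file is truncated or corrupted, which would explain tar's
# "not an archive" error. The path is just an example.
import tarfile

path = "metis-5.1.0.tar.gz"

with open(path, "rb") as f:
    magic = f.read(2)
print("gzip magic bytes present:", magic == b"\x1f\x8b")
print("readable as a tar archive:", tarfile.is_tarfile(path))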
AsterO'dactyle, could you please comment? Thanks a lot,
Mario.
Update: I will do some testing with this MPI version today. The basic setup is this:
-- CODE_ASTER -- VERSION : STABILIZED DEVELOPMENT (testing) --
Version 16.7.0 modified on 10/01/2024
revision n/a - branch 'n/a'
Copyright EDF R&D 1991 - 2025
Run on : Fri Jan 10 10:33:42 2025
Machine name : HP-Z8-G4-Workstation
Architecture : 64bit
Processor type : x86_64
Operating system : Linux-6.8.0-51-generic-x86_64-with-glibc2.31
Message language : fr (UTF-8)
Python version : 3.9.2
NumPy version : 1.19.5
MPI parallelism : active
Rank of current process : 0
Number of MPI processes used : 1
OpenMP parallelism : active
Number of OpenMP processes used : 1
HDF5 library version : 1.10.9
MED library version : 4.1.1
MFront library version : 4.2.0
MUMPS library version : 5.6.2.2c
PETSc library : not available
SCOTCH library version : 7.0.4
That's disappointing: I run into an error even before files are copied:
WARNING: If MPI_Abort is called during execution, result files could not be copied.
Running: mpiexec -n 2 --tag-output /opt/salome_meca/V2024.1.0_scibian_univ/tools/Code_aster_MPI-1670/bin/run_aster --wrkdir /tmp/run_aster_jeby4z5j --status-file /home/mario/.tmp_run_aster/run_aster_w8modc0o/status /home/mario/.tmp_run_aster/run_aster_w8modc0o/export.0
++ using container salome_meca-lgpl-2024.1.0-1-20240327-scibian-11.sif
++ prepend system 'libGL*' libs
What can I do? It seems it cannot write to the /tmp folder. Maybe the CA version being used is not the correct one? Perhaps someone with more knowledge can help? (A small writability check is sketched below.)
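To narrow it down, here is a stdlib-only sketch to run inside the container; it checks whether the scratch directories appearing in the log above are writable (paths copied from my run, adjust as needed):

# Check that the scratch directories used by run_aster (paths taken from
# the log above) are writable from inside the container.
import os
import tempfile

for d in ("/tmp", os.path.expanduser("~/.tmp_run_aster")):
    try:
        # Creating and deleting a temporary file proves write access.
        with tempfile.NamedTemporaryFile(dir=d):
            pass
        print(d, ": writable")
    except OSError as exc:
        print(d, ": NOT writable ->", exc)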