SIGSEGV fault for large problems

4 years 8 months ago #2436 by gcdiwan

(I believe "masterinverse") without telling you (not very pretty, I know), which is why mpi_poisson.py works.

If this is sequential, I am guessing I will only be limited by memory. I will give this a try, but it will probably be too slow.

or with how the job is started.


Even with
Code:
mpirun -n 8 ngspy mpi_poisson.py
or
Code:
mpirun -n 8 python3 mpi_poisson.py
I have no success and I get SIGSEGV faults.
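(A quick way to rule out a mismatch between the MPI used to start the job and the MPI the binaries were linked against is to compare the launcher, the compiler wrapper, and the library NGSolve actually loads. The shared-library path below is only a placeholder for your install:)
Code:
# compare launcher, wrapper, and linked MPI; the .so path is a placeholder
which mpirun && mpirun --version
which mpicc && mpicc --version
ldd /path/to/ngsolve/install/lib/libngsolve.so | grep -i mpi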

Are you using the MUMPS built with NGSolve or a separate MUMPS install? We have had issues with MUMPS 5.1 for larger problems. Upgrading to 5.2 resolved those.

If you use a separate MUMPS install, you have to make sure that MUMPS and NGSolve have been built with the same MPI libraries.



There's no other MUMPS installation on the cluster besides the one that CMake downloads during the build, and that version is 5.2.0.

Thinking about compatibility: do you think the following is an issue?
For NGSolve I seem to have (in my ngsolve-src/build/CMakeCache.txt):
Code:
CMAKE_CXX_COMPILER:FILEPATH=/opt/gridware/depots/3dc76222/el7/pkg/compilers/gcc/8.3.0/bin/g++
CMAKE_C_COMPILER:FILEPATH=/opt/gridware/depots/3dc76222/el7/pkg/compilers/gcc/8.3.0/bin/gcc
CMAKE_Fortran_COMPILER:FILEPATH=/opt/gridware/depots/3dc76222/el7/pkg/compilers/gcc/8.3.0/bin/gfortran

whereas in the Makefile.inc for MUMPS (build/ngsolve-src/build/dependencies/src/project_mumps/Makefile.inc) I have:
Code:
CC = /opt/gridware/depots/3dc76222/el7/pkg/mpi/openmpi/2.0.2/gcc-4.8.5/bin/mpicc
FC = /opt/gridware/depots/3dc76222/el7/pkg/mpi/openmpi/2.0.2/gcc-4.8.5/bin/mpif90
FL = /opt/gridware/depots/3dc76222/el7/pkg/mpi/openmpi/2.0.2/gcc-4.8.5/bin/mpif90
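(Assuming those are the Open MPI compiler wrappers the path suggests, their --showme option prints the underlying compiler invocation, which makes any mismatch with the CMakeCache entries explicit:)
Code:
# --showme prints the real compiler command behind the Open MPI wrapper
/opt/gridware/depots/3dc76222/el7/pkg/mpi/openmpi/2.0.2/gcc-4.8.5/bin/mpicc --showme
/opt/gridware/depots/3dc76222/el7/pkg/mpi/openmpi/2.0.2/gcc-4.8.5/bin/mpif90 --showme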

Thanks for your help, much appreciated!
4 years 8 months ago #2437 by lkogler
What does "mpicc --version" produce?
If MPI was built with gcc 4.8.5 and NGSolve with gcc 8.3 that might lead to problems.
Ideally you want to use the same compilers.
Do you have access to an MPI version that has been compiled with a compiler version that can also compile NGSolve (e.g. gcc 7.3/8.3)?
4 years 8 months ago #2438 by JuliusZ
I can confirm this issue.
I had the same problem and compiling openmpi with gcc 8.3.0 fixed it.
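(For anyone hitting the same issue, a minimal sketch of such a build, assuming you have an Open MPI source tarball and a user-writable prefix; the version placeholder and prefix are illustrative only:)
Code:
module load libs/gcc/8.3.0                      # make gcc 8.3.0 the active compiler
tar xf openmpi-<version>.tar.gz && cd openmpi-<version>
./configure CC=gcc CXX=g++ FC=gfortran --prefix=$HOME/opt/openmpi-gcc83
make -j8 && make install
# put the new MPI first on the paths before rebuilding MUMPS/NGSolve
export PATH=$HOME/opt/openmpi-gcc83/bin:$PATH
export LD_LIBRARY_PATH=$HOME/opt/openmpi-gcc83/lib:$LD_LIBRARY_PATH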
4 years 8 months ago #2439 by lkogler
Maybe we should show a warning when the compiler versions mismatch.
4 years 8 months ago #2440 by JuliusZ
Probably.
But maybe a brief section in the documentation would already be sufficient.
I will share my experience once I have gained more experience with MPI and also ngs-petsc, so that users can install working versions from the start.
4 years 8 months ago #2441 by gcdiwan

What does "mpicc --version" produce?
If MPI was built with gcc 4.8.5 and NGSolve with gcc 8.3 that might lead to problems.
Ideally you want to use the same compilers.


I have:
Code:
mpicc --version
gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-16)

Do you have access to an MPI version that has been compiled with a compiler version that can also compile NGSolve (e.g. gcc 7.3/8.3)?


I am actually confused by the output of module load for the MPI build of NGSolve. When I load it, I see:
Code:
module load apps/ngsolve_mpi/6.2/gcc-8.3.0+python-3.8.0+mpi-2.0.2+atlas-3.10.3+scalapack_atlasshared-2.0.2
libs/gcc/8.3.0
 | OK
apps/python/3.8.0/gcc-4.8.5
 | -- libs/gcc/system ... VARIANT (have alternative: libs/gcc/8.3.0)
 | OK
libs/atlas/3.10.3/gcc-8.3.0
 | -- libs/gcc/8.3.0 ... SKIPPED (already loaded)
 | OK
mpi/openmpi/2.0.2/gcc-4.8.5
 | -- libs/gcc/system ... VARIANT (have alternative: libs/gcc/8.3.0)
 | OK
libs/scalapack_atlasshared/2.0.2/gcc-8.3.0+openmpi-2.0.2+atlas-3.10.3
 | -- libs/gcc/8.3.0 ... SKIPPED (already loaded)
 | -- mpi/openmpi/2.0.2/gcc-4.8.5 ... SKIPPED (already loaded)
 | -- libs/atlas/3.10.3/gcc-8.3.0 ... SKIPPED (already loaded)
 | OK
apps/ffmpeg/4.1.3/gcc-4.8.5
 | -- libs/gcc/system ... VARIANT (have alternative: libs/gcc/8.3.0)
 | OK


So I thought I had an 8.3 version available (as an alternative), but I see only 4.8.5 builds:
Code:
module load mpi/openmpi/1.
mpi/openmpi/1.10.2/gcc-4.8.5
mpi/openmpi/1.4.5/gcc-4.8.5
mpi/openmpi/1.6.5/gcc-4.8.5
mpi/openmpi/1.8.5/gcc-4.8.5

So most likely I need an OpenMPI built with gcc 8.3?
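(Listing all OpenMPI modules should confirm whether a gcc-8.3 build already exists; note that module avail writes to stderr, hence the redirect. If one has to be built, NGSolve can then be reconfigured against it; USE_MPI and USE_MUMPS are the documented NGSolve CMake switches, the other paths are placeholders:)
Code:
module avail 2>&1 | grep -i openmpi            # list every available OpenMPI build
# with the new MPI and gcc 8.3 loaded, reconfigure and rebuild NGSolve
cmake <path-to-ngsolve-src> -DUSE_MPI=ON -DUSE_MUMPS=ON \
      -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ -DCMAKE_Fortran_COMPILER=gfortran
make -j8 && make install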