ForceElement example fails when using MPI
cpppetsc::Mesh::fromConnectivity() fails for a connectivity that contains elements of different order when run in parallel (n > 1). The bug can be reproduced by executing mpiexec -n 2 /mnt/io/build/examples/ae108-examples-ForceElement.
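
For illustration, here is a minimal sketch of the input shape that triggers the failure. The container layout and the commented-out fromConnectivity call are assumptions for illustration only, not the verified ae108 API:

```cpp
// Minimal sketch (assumed shapes, not the verified ae108 API): a 2D
// connectivity mixing a 4-vertex quadrilateral with a 1-vertex "force
// element", i.e. elements of different order.
#include <vector>

int main() {
  using size_type = int;

  // Two elements of different order in the same connectivity.
  const std::vector<std::vector<size_type>> connectivity = {
      {0, 1, 2, 3}, // 4-corner cell
      {1},          // 1-corner force element
  };

  // Hypothetical call shape; parameter names are illustrative only:
  // const auto mesh = Mesh::fromConnectivity(/*dimension=*/2, connectivity,
  //                                          /*numberOfVertices=*/4, ...);

  // In parallel (n > 1), partitioning such a mesh fails inside
  // DMPlexDistribute with "Invalid number of face corners 1 for dimension 2"
  // (see the error log below).
  (void)connectivity;
}
```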
Activity
- Bastian Telgen added Bug label
- Bastian Telgen assigned to @webmanue
- The issue author (Author, Maintainer) posted the full error output:
```
root@90167695a505:/mnt/io/build/examples# mpiexec -n 2 ae108-examples-ForceElement
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Invalid number of face corners 1 for dimension 2
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.12.4, Feb, 04, 2020
[0]PETSC ERROR: ae108-examples-ForceElement on a named 90167695a505 by Unknown Thu Feb 24 15:55:26 2022
[0]PETSC ERROR: Configure options --build=x86_64-linux-gnu --prefix=/usr --includedir=${prefix}/include --mandir=${prefix}/share/man --infodir=${prefix}/share/info --sysconfdir=/etc --localstatedir=/var --with-silent-rules=0 --libdir=${prefix}/lib/x86_64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --shared-library-extension=_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lumfpack -lamd -lcholmod -lklu" --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lptscotcherr" --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/x86_64-linux-gnu/hdf5/openmpi -L/usr/lib/openmpi/lib -lhdf5 -lmpi" --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-hypre=1 --with-hypre-include=/usr/include/hypre --with-hypre-lib=-lHYPRE_core --prefix=/usr/lib/petscdir/petsc3.12/x86_64-linux-gnu-real --PETSC_ARCH=x86_64-linux-gnu-real CFLAGS="-g -O2 -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-Bsymbolic-functions -Wl,-z,relro -fPIC" MAKEFLAGS=w
[0]PETSC ERROR: #1 DMPlexGetNumFaceVertices() line 3492 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plex.c
[0]PETSC ERROR: #2 DMPlexCreateNeighborCSR() line 547 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plexpartition.c
[0]PETSC ERROR: #3 DMPlexCreatePartitionerGraph_Native() line 54 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plexpartition.c
[0]PETSC ERROR: #4 DMPlexCreatePartitionerGraph() line 429 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plexpartition.c
[0]PETSC ERROR: #5 PetscPartitionerPartition() line 979 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plexpartition.c
[0]PETSC ERROR: #6 DMPlexDistribute() line 1631 in /build/petsc-zg3KH7/petsc-3.12.4+dfsg1/src/dm/impls/plex/plexdistribute.c
[0]PETSC ERROR: #7 handleError() line 44 in ../cpppetsc/src/include/ae108/cpppetsc/ParallelComputePolicy.h
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 63.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
```
Edited by webmanue
- Bastian Telgen changed the description
@telgenb Great find, thanks! The problem also occurs on Ubuntu Jammy.
Unfortunately, this example is one of those that don't have CI checks yet (see #64 (closed)), since its output is HDF5. However, this specific failure would have been caught by the simpler check that the return code is zero.
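
As a hedged illustration of that simpler check, the sketch below runs the example under MPI and fails when the exit code is non-zero. The executable path and mpiexec invocation are taken from this issue; framing the check as a standalone C++ driver (rather than the project's actual CI configuration, which is not shown here) is an assumption:

```cpp
// Hypothetical sketch of the "return code is zero" check; how ae108's CI
// actually invokes its examples is not shown in this issue.
#include <cstdlib>

int main() {
  // Run the example under MPI, exactly as in the reproduction command above.
  const int status = std::system(
      "mpiexec -n 2 /mnt/io/build/examples/ae108-examples-ForceElement");

  // A non-zero status (e.g. after MPI_ABORT with errorcode 63) fails the check.
  return status == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
}
```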