\hspace{0.5cm}The paper \cite{MEYER2021103936} outlines a procedure for generating random realizations of larger flow networks, taking as input an existing base network obtained from a scan of a porous medium. Such a network consists of spherical pores connected by cylindrical throats. The \emph{netflow} package, which implements the above algorithm, also provides functionality for solving and analyzing the flow through such networks. The solver is now to be parallelized in a distributed fashion so that it can support larger networks with millions of pores.

\section{Parallel Flow Solver}

\subsection{PETSc Interface}

\hspace{0.5cm}In order to interface the chosen C API of PETSc with the \emph{netflow} Python module, we rely on Cython to wrap the C source, which is subsequently compiled with all required PETSc compilation and linking flags. This allows us to invoke a \verb|solve_py| function from Python that delegates the relevant parameters, namely the system matrix and the right-hand-side vector, to C code.
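
The data handed across the language boundary can be illustrated with a small sketch. The following snippet is a hypothetical, serial stand-in using \emph{scipy} rather than the actual Cython wrapper: it assembles a sparse system in CSR form and extracts the flat buffers that a function like \verb|solve_py| would typically forward to C for PETSc's row-by-row matrix assembly.

```python
# Hypothetical sketch of the data crossing the Python/C boundary.
# The real netflow wrapper is written in Cython; scipy stands in here
# to show the CSR buffers (data, indices, indptr) that a function such
# as solve_py could hand to C without copying the matrix entries.
import numpy as np
import scipy.sparse as sp

# Toy 4-pore pressure system (symmetric, diagonally dominant).
A = sp.csr_matrix(np.array([
    [ 2.0, -1.0,  0.0,  0.0],
    [-1.0,  3.0, -1.0,  0.0],
    [ 0.0, -1.0,  3.0, -1.0],
    [ 0.0,  0.0, -1.0,  2.0],
]))
b = np.array([1.0, 0.0, 0.0, 1.0])

# Flat buffers a wrapper can pass on to the C solver:
data, indices, indptr = A.data, A.indices, A.indptr
print(len(data), len(indptr))  # nnz entries, n_rows + 1 row pointers
```

On the C side, these three arrays together with the right-hand-side vector suffice to reconstruct the distributed PETSc objects row by row.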

\subsection{Solver}

\hspace{0.5cm}The actual solver, written in C, then utilizes PETSc to iteratively approximate the solution of the system in parallel across the available MPI processes. To avoid duplicating the fairly large and sparse system matrix, it is assembled only on the root rank and then communicated in parts to the corresponding ranks through PETSc's \verb|Assembly| routines. The iterative method chosen to solve the pressure system arising from the flow network is GMRES with a left algebraic multigrid preconditioner supplied via hypre \cite{hypre-web-page}.
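
The structure of this solve can be sketched with a serial analogue. The snippet below is illustrative only: it uses scipy's GMRES on a toy one-dimensional chain of pores, with an ILU factorization standing in for the hypre algebraic multigrid preconditioner that the actual PETSc-based solver employs.

```python
# Serial analogue of the pressure solve, sketched with scipy's GMRES.
# The production solver runs PETSc's parallel GMRES preconditioned with
# hypre's algebraic multigrid; here ILU is a simple stand-in
# preconditioner for illustration.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 50  # toy 1-D chain of pores with unit throat conductances
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csc")
b = np.zeros(n)
b[0], b[-1] = 1.0, 1.0   # boundary pressures enter the right-hand side

ilu = spla.spilu(A)                           # stand-in preconditioner
M = spla.LinearOperator((n, n), ilu.solve)    # apply it as an operator
p, info = spla.gmres(A, b, M=M)
assert info == 0  # 0 signals convergence
print(np.max(np.abs(A @ p - b)))              # residual of the solve
```

For this toy system the exact solution is a uniform unit pressure, so the residual check directly reflects the solver tolerance.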

\subsection{Results}

\hspace{0.5cm}In order to assess the quality of the pressure solution obtained by this solver, we study the fluxes induced by the pore pressures for a given base network containing 2636 pores. In particular, we look at the sum of all in- and outgoing fluxes per pore, obtained from the function \verb|flux_balance| and aggregated over all pores of the network. As expected from conservation of mass, the mean is close to $0$ ($\approx 10^{-10}$) and the maximum is $\approx 10^{-7}$, in agreement with the existing single-core solvers implemented in \emph{netflow}.
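
The idea behind such a flux-balance diagnostic can be sketched as follows. This is a hypothetical serial version on a toy network, not netflow's actual \verb|flux_balance| implementation: for each pore, the signed fluxes through its incident throats, $q_{ij} = g_{ij}(p_i - p_j)$, are accumulated, and conservation of mass requires the sum to vanish at every interior pore.

```python
# Hypothetical flux-balance check on a toy network: for a pressure
# solution p and throat conductances g, the net flux into each interior
# pore should vanish by conservation of mass.
import numpy as np

# Toy chain network: 4 pores, 3 throats, unit conductances.
throats = [(0, 1), (1, 2), (2, 3)]           # (pore_i, pore_j) pairs
g = np.array([1.0, 1.0, 1.0])                # throat conductances
p = np.array([1.0, 2.0 / 3.0, 1.0 / 3.0, 0.0])  # exact linear profile

balance = np.zeros(len(p))
for (i, j), gij in zip(throats, g):
    q = gij * (p[i] - p[j])   # flux from pore i to pore j
    balance[i] -= q           # flux leaving pore i
    balance[j] += q           # flux entering pore j

# Interior pores (1 and 2) must balance; the boundary pores carry the
# in-/outflow imposed by the pressure boundary conditions.
print(np.abs(balance[1:3]).max())
```

Aggregating these per-pore balances over all pores, as done in the results above, yields the mean and maximum imbalance used to validate the parallel solver against the serial ones.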