### Minor change to thesis

parent 3680384f
```diff
@@ -63,12 +63,13 @@ PetscErrorCode solve(const PetscScalar *data, const PetscInt *col_indices,
   // Initialize Mat object from csr arrays
   Mat An;
-  // Set matrix values from arrays in csr format (no assembly needed)
+  // Set matrix values from arrays in csr format
   ierr = MatCreate(PETSC_COMM_WORLD, &An); CHKERRQ(ierr);
   ierr = MatSetSizes(An, n_local, n_local, n_rows, n_rows); CHKERRQ(ierr);
   ierr = MatSetFromOptions(An); CHKERRQ(ierr);
   ierr = MatSetUp(An); CHKERRQ(ierr);
   // Assemble Mat on root
   if (rank == root) {
     for (PetscInt i = 0; i < n_rows; ++i) {
       PetscInt start = row_ptr[i];
...
```
```diff
@@ -20,5 +20,5 @@
 \abx@aux@defaultrefcontext{0}{MEYER2021103936}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{petsc-web-page}{none/global//global/global}
 \abx@aux@defaultrefcontext{0}{hypre-web-page}{none/global//global/global}
-\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\contentsline {figure}{\numberline {1}{\ignorespaces Pressures $p_{\mathrm {in}}$ and $p_{\mathrm {out}}$ are applied to in-pores and out-pores respectively, driving the network flow. The resulting pressure system is solved with the respective solvers from above and the mean total pore flux is shown in each case.}}{3}\protected@file@percent }
+\@writefile{lof}{\defcounter {refsection}{0}\relax }\@writefile{lof}{\contentsline {figure}{\numberline {1}{\ignorespaces Pressures $p_{\mathrm {in}}$ and $p_{\mathrm {out}}$ are applied to in-pores and out-pores respectively, driving the network flow. The resulting pressure system is solved with the respective solvers from above and the mean total pore flux is shown in each case. With PETSc using 4 processes to solve the system. (AMG = algebraic multi-grid, CG = conjugate gradients, ILU = incomplete LU-preconditioning + GMRES)}}{3}\protected@file@percent }
 \newlabel{fig:balance}{{1}{3}}
```
```diff
-This is pdfTeX, Version 3.14159265-2.6-1.40.20 (TeX Live 2019/Debian) (preloaded format=pdflatex 2021.4.27)  18 OCT 2021 22:18
+This is pdfTeX, Version 3.14159265-2.6-1.40.20 (TeX Live 2019/Debian) (preloaded format=pdflatex 2021.4.27)  20 OCT 2021 11:18
 entering extended mode
  restricted \write18 enabled.
  %&-line parsing enabled.
...
@@ -748,7 +748,7 @@ Package logreq Info: Writing requests to 'thesis.run.xml'.
 Here is how much of TeX's memory you used:
  18979 strings out of 483107
  379453 string characters out of 5964630
- 978636 words of memory out of 5000000
+ 979636 words of memory out of 5000000
  33735 multiletter control sequences out of 15000+600000
  537760 words of font info for 42 fonts, out of 8000000 for 9000
  59 hyphenation exceptions out of 8191
...
@@ -764,7 +764,7 @@ public/amsfonts/cm/cmsy10.pfb>
-Output written on thesis.pdf (3 pages, 149813 bytes).
+Output written on thesis.pdf (3 pages, 151422 bytes).
 PDF statistics:
  64 PDF objects out of 1000 (max. 8388607)
  45 compressed objects within 1 object stream
...
```
```diff
@@ -42,15 +42,15 @@
 \vspace{5ex}
 \begin{multicols}{2}
 \section{Introduction}
-\hspace{0.5cm}The paper from \cite{MEYER2021103936} outlines a procedure for generating random realizations of larger flow networks, taking an existing base network, obtained from a scan of a porous medium, as input. Said network consists of spherical pores that are connected by cylindrical throats. The \emph{netflow} package that implements the algorithm from above also provides functionality for solving and analyzing the flow through such networks. The former of which is now to be parallelized in a distributed fashion so that it may support larger networks with millions of pores.
+\hspace{0.5cm}The paper from \cite{MEYER2021103936} outlines a procedure for generating random realizations of larger flow networks, taking an existing base network, obtained from a scan of a porous medium, as input. Said network consists of spherical pores that are connected by cylindrical throats. The \emph{netflow} package that implements the algorithm from above also provides functionality for solving and analyzing the flow through such networks. The former is now to be parallelized in a distributed fashion so that it may support larger networks with up to 100 million pores.
 \section{Parallel Flow Solver}
 \subsection{PETSc Interface}
-\hspace{0.5cm}In order to interface the chosen C API of PETSc with the \emph{netflow} Python module, we rely on Cython to wrap the C source in Python, that is subsequently compiled with all required compilation and linking flags of PETSc. This allows us to invoke a \verb|solve_py| function from Python delegating the relevant parameters, namely the system matrix and right-hand-side vector, to C code.
+\hspace{0.5cm}In order to interface the chosen C API of PETSc with the \emph{netflow} Python module, we rely on Cython to wrap the C source in Python, which is subsequently compiled with all required compilation and linking flags of PETSc.
+This allows us to invoke a \verb|solve_py()| function from Python, delegating the relevant parameters, namely the system matrix and right-hand-side vector, to the C function \verb|solve()|.
 \subsection{Solver}
-\hspace{0.5cm}The actual solver written in C then utilizes PETSc to iteratively approximate the solution of the system in parallel with the available MPI processes. To avoid data duplication of the fairly large and sparse system matrix, it is only assembled on the root rank and then communicated in parts to the corresponding ranks through PETSc's \verb|Assembly| routines. The iterative method chosen to solve the pressure system, arising from the flow network, is GMRES together with a left algebraic multi-grid preconditioner supplied via hypre \cite{hypre-web-page}.
+\hspace{0.5cm}The actual solver written in C then utilizes PETSc to iteratively approximate the solution of the system in parallel with the available MPI processes. For this purpose, the system matrix and r.h.s.\ have to be assembled into the PETSc objects \verb|Mat| and \verb|Vec| respectively. To avoid data duplication, this is only done on the root rank and then communicated in parts to the corresponding ranks through PETSc's \verb|Assembly| routines. The iterative method chosen to solve the non-symmetric pressure system, arising from the flow network, is GMRES together with a left algebraic multi-grid preconditioner supplied via hypre \cite{hypre-web-page}.
 \subsection{Results}
-\hspace{0.5cm}In order to assess the quality of the pressure-solution obtained by this solver, we study the fluxes induced by the pore pressures for a given base network comprised of 2636 pores. In particular, we look at the sum of all in- and out-going fluxes per pore, obtained from the function \verb|flux_balance|, and aggregated over all pores of the network. As expected from the conservation of mass, the mean is close to $0$ ($\approx 10^{-10}$) and the maximum is $\approx 10^{-7}$, which is in complete agreement with existing single-core solvers implemented in \emph{netflow}, see Figure ~\ref{fig:balance}.
+\hspace{0.5cm}In order to assess the quality of the pressure solution obtained by this solver, we study the fluxes induced by the pore pressures for a given base network comprised of 2636 pores. In particular, we look at the sum of all in- and out-going fluxes per pore, obtained from the function \verb|flux_balance()|, and aggregated over all pores of the network. As expected from the conservation of mass, the mean is close to $0$ ($\approx 10^{-10}$) and the maximum is $\approx 10^{-7}$, which is in complete agreement with the existing single-core solvers implemented in \emph{netflow}; see Figure~\ref{fig:balance}.
 \end{multicols}
...
@@ -59,7 +59,7 @@
 \begin{figure}[h]
 \centering
 \includegraphics[width=0.8\textwidth]{plots/flux_balance.png}
-\caption{Pressures $p_{\mathrm{in}}$ and $p_{\mathrm{out}}$ are applied to in-pores and out-pores respectively, driving the network flow. The resulting pressure system is solved with the respective solvers from above and the mean total pore flux is shown in each case.}
+\caption{Pressures $p_{\mathrm{in}}$ and $p_{\mathrm{out}}$ are applied to in-pores and out-pores respectively, driving the network flow. The resulting pressure system is solved with the respective solvers from above and the mean total pore flux is shown in each case, with PETSc using 4 processes to solve the system. (AMG = algebraic multi-grid, CG = conjugate gradients, ILU = incomplete LU preconditioning + GMRES)}
 \label{fig:balance}
 \end{figure}
...
```