Commit

adding msi_matrix_getField
seegyoung committed Sep 14, 2017
1 parent e5d5686 commit 7639b2c
Showing 6 changed files with 91 additions and 40 deletions.
6 changes: 6 additions & 0 deletions api/msi.cc
@@ -257,6 +257,12 @@ msi_matrix* msi_matrix_create(int matrix_type, pField f)
#endif
}

pField msi_matrix_getField(pMatrix mat)
{
return mat->get_field();
}


//*******************************************************
void msi_matrix_assemble(pMatrix mat)
//*******************************************************
7 changes: 5 additions & 2 deletions api/msi.h
@@ -28,7 +28,7 @@ void msi_finalize(pMesh m);

// field creation with multiple variables
pField msi_field_create (pMesh m, const char* /* in */ field_name,
int /*in*/ nv, int /*in*/ nd, pShape shape=NULL);
int /*in*/ nv, int /*in*/ nd);
int msi_field_getNumVal(pField f);
int msi_field_getSize(pField f);

@@ -59,6 +59,8 @@ typedef msi_matrix* pMatrix;
/** matrix and solver functions with PETSc */
pMatrix msi_matrix_create(int matrix_type, pField f);
void msi_matrix_delete(pMatrix mat);
pField msi_matrix_getField(pMatrix mat);

void msi_matrix_assemble(pMatrix mat);

void msi_matrix_insert(pMatrix mat, int row, int column, int scalar_type, double* val);
@@ -68,9 +70,10 @@ void msi_matrix_addBlock(pMatrix mat, pMeshEnt elem, int rowVarIdx, int columnVa
void msi_matrix_setBC(pMatrix mat, int row);
void msi_matrix_setLaplaceBC (pMatrix mat, int row, int size, int* columns, double* values);

void msi_matrix_multiply(pMatrix mat, pField inputvec, pField outputvec);

void msi_matrix_solve(pMatrix mat, pField rhs, pField sol);
int msi_matrix_getNumIter(pMatrix mat);
void msi_matrix_multiply(pMatrix mat, pField inputvec, pField outputvec);

// auxiliary
void msi_matrix_write(pMatrix mat, const char* file_name, int start_index=0);
Binary file removed doc/MSI.pdf
103 changes: 70 additions & 33 deletions doc/msi-api.tex
@@ -74,9 +74,13 @@ \subsection{Initialization and Finalization}
\begin{verbatim}
void msi_start(
pMesh /* in */ m,
pOwnership /* in */ o=NULL)
pOwnership /* in */ o=NULL,
pShape /* in */ s=NULL)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a mesh and ownership handle, initialize MSI services for solver-PUMI interactions. If the ownership is not provided, the default is set to $NULL$. If the ownership is $NULL$, the PUMI's default ownership rule is used (a part with the minimum process rank is the owning part of duplicate copies).
Given a mesh, an ownership handle, and a field shape handle, initialize MSI services for solver-PUMI interactions. The ownership and the field shape are optional (default: \textit{NULL}). The ownership rule determines the owner among duplicate copies of a mesh entity created by partitioning or ghosting. If the ownership handle is \textit{NULL}, PUMI's default rule is used (the part with the minimum process rank owns the duplicate copies). The field shape defines the node distribution where coordinates and field values (DOFs) are stored. If the shape is \textit{NULL}, the field shape stored in the mesh (returned by \texttt{pumi$\_$mesh$\_$getShape}) is used; otherwise, the field shape of the mesh is set to $s$.

Combining the ownership rule and the field shape, local and global node numberings are generated.
See the PUMI User's Guide for more information on ownership rules, field shapes, and numberings.

Note that the following operations should be performed prior to this function.
\begin{itemize}
@@ -87,7 +91,7 @@ \subsection{Initialization and Finalization}
\end{itemize}
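
For illustration, a minimal initialization sketch follows; the PUMI call names and signatures (\texttt{pumi$\_$start}, \texttt{pumi$\_$geom$\_$load}, \texttt{pumi$\_$mesh$\_$load}, \texttt{pumi$\_$size}) and the file names are assumptions to be checked against \texttt{PUMI.h}.
\begin{verbatim}
// minimal sketch; PUMI call names/signatures are assumptions -- see PUMI.h
#include <mpi.h>
#include "PUMI.h"
#include "msi.h"

int main(int argc, char** argv)
{
  MPI_Init(&argc, &argv);
  pumi_start();
  pGeom g = pumi_geom_load("model.dmg");                // hypothetical files
  pMesh m = pumi_mesh_load(g, "mesh.smb", pumi_size());
  msi_start(m);  // NULL ownership and shape: PUMI defaults are used
  // ... field and matrix operations; see the finalization sketch below ...
  return 0;
}
\end{verbatim}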

\begin{verbatim}
int msi_scorec_finalize()
int msi_finalize()
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Finalize the MSI services and clear all internal data. Note that the following operations should follow to complete the finalization.
\begin{itemize}
@@ -97,32 +101,6 @@ \subsection{Initialization and Finalization}
\item MPI finalization
\end{itemize}
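
A matching teardown sketch; \texttt{pumi$\_$mesh$\_$delete} and \texttt{pumi$\_$finalize} are assumed PUMI calls.
\begin{verbatim}
msi_finalize(m);      // clear MSI internal data first
pumi_mesh_delete(m);  // assumed PUMI call for mesh deletion
pumi_finalize();
MPI_Finalize();
\end{verbatim}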

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{Mesh Entity}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
In terms of mesh entity operation, MSI provides the operations only related to field ID of individual mesh entity as those are not supported in PUMI. For the rest of mesh entity operations including setting/getting field data (DOF) over nodes, use the API's in \texttt{PUMI.h}.


\begin{verbatim}
void msi_ment_getFieldID (
pMeshEnt /* in */ e,
pField /* in */ f,
int /* in */ i,
int* /* out */ start_dof_id,
int* /* out */ end_dof_id_plus_one)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a mesh entity handle, field handle and the index of node $i$, return the starting local ID and the ending local ID plus one for field data (DOF) of $i^{th}$ node of the entity.

\begin{verbatim}
void msi_ment_getGlobalFieldID (
pMeshEnt /* in */ e,
pField /* in */ f,
int /* in */ i,
int* /* out */ start_dof_id,
int* /* out */ end_dof_id_plus_one)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a mesh entity handle, field handle and the index of node $i$, return the starting global ID and the ending global ID plus one for field data (DOF) of $i^{th}$ node of the mesh entity.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{Field}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
@@ -145,9 +123,9 @@ \subsection{Field}
int /* in */ nd,
pShape /* in */ s=NULL)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given field name, the number of values per node ($nv$), the number of field data per value of node ($nd$), and a shape function, \emph{(i)} create a field for all nodes (owned, non-owned part boundary and ghost), \emph{(ii)} switch the memory space for field data to a contiguous array, and \emph{(iii)} initialize the field data. The \emph{size}, the number of field data per node is \textit{nv}$*$\textit{nd}. The size of contiguous array for each process is \textit{size}$*$\textit{nn} for real number and \textit{size}$*$\textit{nn}$*$2 for complex number, where \textit{nn} is the number of local nodes on each process. The field data type (real or complex) is determined at time of configuration. See Section~\ref{install} for how to configure MSI with complex number.
Given a mesh instance, a field name, the number of values per node ($nv$), the number of field data per value ($nd$), and a shape function, \emph{(i)} create a field for all nodes (owned, non-owned part boundary, and ghost), \emph{(ii)} switch the memory space for field data to a contiguous array, and \emph{(iii)} initialize the field data. The \emph{size}, the number of field data per node, is \textit{nv}$*$\textit{nd}. The size of the contiguous array on each process is \textit{size}$*$\textit{nn} for real numbers and \textit{size}$*$\textit{nn}$*$2 for complex numbers, where \textit{nn} is the number of local nodes on the process. The field data type (real or complex) is determined at configuration time. See Section~\ref{install} for how to configure MSI with complex numbers.

The field shape is optional and the default is \textit{NULL}. If it is \textit{NULL}, the field shape of the mesh is used (a field shape returned by \texttt{pumi$\_$mesh$\_$getShape}).
The field shape is optional (default: \textit{NULL}). If it is \textit{NULL}, the field shape of the mesh is used (a field shape returned by \texttt{pumi$\_$mesh$\_$getShape}).

If $nv$ is 1, \texttt{msi$\_$field$\_$create (m, field$\_$name, nv, nd, s)} is equivalent to \texttt{pumi$\_$field$\_$create (m, field$\_$name, nd, PUMI$\_$PACKED, s)} followed by \texttt{pumi$\_$field$\_$freeze} and \texttt{pumi$\_$ment$\_$setField} with value $0$ for all nodes.
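
As an illustration, a field with two values of three data each per node (size 6) can be created and queried as follows, using the updated signature in \texttt{api/msi.h}; the field name is arbitrary.
\begin{verbatim}
pField f = msi_field_create(m, "velocity", 2, 3);  // nv=2, nd=3
int nv = msi_field_getNumVal(f);   // 2
int size = msi_field_getSize(f);   // nv*nd = 6
\end{verbatim}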

@@ -162,24 +140,83 @@ \subsection{Field}
Given a field handle, return the field size, \textit{nv}$*$\textit{nd}.


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{Node}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{verbatim}
int msi_node_getID (
pMeshEnt /* in */ e,
int /* in */ n)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a mesh entity handle and the index of node $n$, return the local number of the $n^{th}$ node of the mesh entity.

\begin{verbatim}
int msi_node_getGlobalID (
pMeshEnt /* in */ e,
int /* in */ n)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a mesh entity handle and the index of node $n$, return the global number of the $n^{th}$ node of the mesh entity.
The global numbering is based on the ownership handle provided in \texttt{msi$\_$start}.
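
For example, given a vertex handle \texttt{e} (obtained through a PUMI mesh iterator, not shown here), the numbers of its single node can be read as follows.
\begin{verbatim}
int lid = msi_node_getID(e, 0);        // local number of node 0
int gid = msi_node_getGlobalID(e, 0);  // global number of node 0
\end{verbatim}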

\begin{verbatim}
void msi_node_setField (
pField /* in */ f,
pMeshEnt /* in */ e,
int /* in */ n,
int /* in */ size_dof,
double*  /* in */  dof_data)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a field handle, a mesh entity handle, the index of node $n$, the size of the DOF data, and a double array containing the DOF data, set the field data (DOF) of the $n^{th}$ node of the entity.

\begin{verbatim}
int msi_node_getField(
pField /* in */ f,
pMeshEnt /* in */ e,
int /* in */ n,
double* /* out */ dof_data)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a field handle, a mesh entity handle, and the index of node $n$, fill the array \textit{dof$\_$data} with the field data (DOF) of the $n^{th}$ node of the entity.
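
A sketch of writing and reading back the DOFs of node 0 of an entity, assuming the size-6 field from the earlier example:
\begin{verbatim}
double dof_in[6] = {1, 2, 3, 4, 5, 6};
msi_node_setField(f, e, 0, 6, dof_in);
double dof_out[6];
msi_node_getField(f, e, 0, dof_out);  // fills dof_out with the 6 DOFs
\end{verbatim}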

\begin{verbatim}
void msi_node_getFieldID (
pField /* in */ f,
pMeshEnt /* in */ e,
int /* in */ n,
int* /* out */ start_dof_id,
int* /* out */ end_dof_id_plus_one)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a field handle, a mesh entity handle, and the index of node $n$, return the starting local ID and the ending local ID plus one for the field data (DOF) of the $n^{th}$ node of the entity.

\begin{verbatim}
void msi_node_getGlobalFieldID (
pField /* in */ f,
pMeshEnt /* in */ e,
int /* in */ n,
int* /* out */ start_dof_id,
int* /* out */ end_dof_id_plus_one)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a field handle, a mesh entity handle, and the index of node $n$, return the starting global ID and the ending global ID plus one for the field data (DOF) of the $n^{th}$ node of the mesh entity.
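
These IDs give the local and global DOF ranges of a node, e.g. for addressing matrix rows; a sketch, assuming the \texttt{msi$\_$node} naming above:
\begin{verbatim}
int ls, le, gs, ge;
msi_node_getFieldID(f, e, 0, &ls, &le);        // local DOF IDs [ls, le)
msi_node_getGlobalFieldID(f, e, 0, &gs, &ge);  // global DOF IDs [gs, ge)
\end{verbatim}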

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\subsection{PETSc Matrix and Solver}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

\begin{verbatim}
pMatrix msi_matrix_create (
int /* in */ matrix_type,
pField /* in */ field)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a matrix type and a field handle, create a matrix and return its handle. The matrix type indicates the purpose of the matrix: 0 for matrix-vector multiplication and 1 for solver. The input field handle is used to retrieve the numbering (row/column ID) for matrix manipulation. The status of matrix is \textit{MSI$\_$NOT$\_$FIXED} so the matrix values can be modified.
Given a matrix type (0 for matrix-vector multiplication, 1 for solver) and a field handle, create a matrix and return its handle. The input field handle is used to set up the matrix size and the row/column IDs. The initial status of the matrix is \textit{MSI$\_$NOT$\_$FIXED}, so the matrix values can be modified.

\begin{verbatim}
void msi_matrix_delete (pMatrix /* in */ matrix)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a matrix handle, delete the matrix.

\begin{verbatim}
pField msi_matrix_getField (pMatrix /* in */ matrix)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
Given a matrix handle, return the field handle associated with the matrix.
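
A sketch of the matrix lifecycle with these calls, using matrix type 1 (solver) and the field from the earlier examples:
\begin{verbatim}
pMatrix A = msi_matrix_create(1, f);  // solver matrix over field f
pField f2 = msi_matrix_getField(A);   // returns f
// ... insert/add values, assemble, solve ...
msi_matrix_delete(A);
\end{verbatim}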

\begin{verbatim}
void msi_matrix_assemble (pMatrix /* in */ matrix)
\end{verbatim}\vspace{-.5cm}\hspace{1cm}
5 changes: 3 additions & 2 deletions openmpi-gcc4.4.5-real-config.sh
@@ -1,4 +1,5 @@
PREFIX=/fasttmp/seol/openmpi-gcc4.4.5-install
PREFIX=/lore/seol/openmpi-gcc4.4.5-install
ZOLTAN_DIR=$PREFIX
PETSC_DIR=/lore/seol/petsc-3.5.4
PETSC_ARCH=real-openmpi1.6.5
cmake .. \
@@ -10,7 +11,7 @@ cmake .. \
-DCMAKE_Fortran_FLAGS="-fpic "\
-DSCOREC_INCLUDE_DIR=$PREFIX/include \
-DSCOREC_LIB_DIR=$PREFIX/lib \
-DZOLTAN_LIBRARY="$PREFIX/lib/libzoltan.a" \
-DZOLTAN_LIBRARY="$ZOLTAN_DIR/lib/libzoltan.a" \
-DPARMETIS_LIBRARY="$PETSC_DIR/$PETSC_ARCH/lib/libparmetis.a" \
-DMETIS_LIBRARY="$PETSC_DIR/$PETSC_ARCH/lib/libmetis.a" \
-DENABLE_PETSC=ON \
10 changes: 7 additions & 3 deletions readme
@@ -1,3 +1,7 @@

User's Guide: http://www.scorec.rpi.edu/~seol/MSI
Repository: https://github.com/SCOREC/msi (access permission required)

=============================
To build MSI library with petsc:
=============================
@@ -9,13 +13,13 @@ make install
=============================
To compile MSI test program with petsc located in test/petsc:
=============================
export LD_LIBRARY_PATH=:/usr/local/openmpi/latest/lib:/fasttmp/seol/openmpi-gcc4.4.5-install/lib:/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib:/usr/lib/gcc/x86_64-linux-gnu/4.4.5:/usr/lib/x86_64-linux-gnu:
export LD_LIBRARY_PATH=:/usr/local/openmpi/latest/lib:/lore/seol/openmpi-gcc4.4.5-install/lib:/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib:/usr/lib/gcc/x86_64-linux-gnu/4.4.5:/usr/lib/x86_64-linux-gnu:

#real
/usr/local/openmpi/latest/bin/mpicc ../test/petsc/main.cc -o petsc -DDEBUG -DMSI_PETSC -I/usr/local/openmpi/latest/include -I/lore/seol/petsc-3.5.4/real-openmpi1.6.5/include -I/lore/seol/petsc-3.5.4/include -I/fasttmp/seol/openmpi-gcc4.4.5-install/include -Wl,--start-group,-rpath,/fasttmp/seol/openmpi-gcc4.4.5-install/lib -L/fasttmp/seol/openmpi-gcc4.4.5-install/lib -lmsi -lpumi -lcrv -ldsp -lph -lsize -lsam -lspr -lma -lapf_zoltan -lparma -lmds -lapf -llion -lmth -lgmi -lpcu -lzoltan -Wl,--end-group -L/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib -lpetsc -Wl,-rpath,/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lHYPRE -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lsuperlu_4.3 -lsuperlu_dist_3.3 -lflapack -lfblas -lparmetis -lmetis -lpthread -lssl -lcrypto -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lz -lmpi_f90 -lmpi_f77 -lgfortran -lm -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lnuma -lrt -lnsl -lutil -lgcc_s -lpthread -ldl
/usr/local/openmpi/latest/bin/mpicc ../test/petsc/main.cc -o petsc -DDEBUG -DMSI_PETSC -I/usr/local/openmpi/latest/include -I/lore/seol/petsc-3.5.4/real-openmpi1.6.5/include -I/lore/seol/petsc-3.5.4/include -I/lore/seol/openmpi-gcc4.4.5-install/include -Wl,--start-group,-rpath,/lore/seol/openmpi-gcc4.4.5-install/lib -L/lore/seol/openmpi-gcc4.4.5-install/lib -lmsi -lpumi -lcrv -lph -lsam -lspr -lma -lapf_zoltan -lparma -lmds -lapf -llion -lmth -lgmi -lpcu -lzoltan -Wl,--end-group -L/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib -lpetsc -Wl,-rpath,/lore/seol/petsc-3.5.4/real-openmpi1.6.5/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lHYPRE -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lsuperlu_4.3 -lsuperlu_dist_3.3 -lflapack -lfblas -lparmetis -lmetis -lpthread -lssl -lcrypto -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lz -lmpi_f90 -lmpi_f77 -lgfortran -lm -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lnuma -lrt -lnsl -lutil -lgcc_s -lpthread -ldl

#complex
/usr/local/openmpi/latest/bin/mpicc ../test/petsc/main.cc -o petsc -DDEBUG -DMSI_PETSC -DPETSC_USE_COMPLEX -I/usr/local/openmpi/latest/include -I/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/include -I/lore/seol/petsc-3.5.4/include -I/fasttmp/seol/openmpi-gcc4.4.5-install/include -Wl,--start-group,-rpath,/fasttmp/seol/openmpi-gcc4.4.5-install/lib -L/fasttmp/seol/openmpi-gcc4.4.5-install/lib -lmsi_complex -lpumi -lcrv -ldsp -lph -lsize -lsam -lspr -lma -lapf_zoltan -lparma -lmds -lapf -llion -lmth -lgmi -lpcu -lzoltan -Wl,--end-group -L/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/lib -lpetsc -Wl,-rpath,/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lsuperlu_4.3 -lsuperlu_dist_3.3 -lflapack -lfblas -lparmetis -lmetis -lpthread -lssl -lcrypto -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lz -lmpi_f90 -lmpi_f77 -lgfortran -lm -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lnuma -lrt -lnsl -lutil -lgcc_s -lpthread -ldl
/usr/local/openmpi/latest/bin/mpicc ../test/petsc/main.cc -o petsc -DDEBUG -DMSI_PETSC -DPETSC_USE_COMPLEX -I/usr/local/openmpi/latest/include -I/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/include -I/lore/seol/petsc-3.5.4/include -I/lore/seol/openmpi-gcc4.4.5-install/include -Wl,--start-group,-rpath,/lore/seol/openmpi-gcc4.4.5-install/lib -L/lore/seol/openmpi-gcc4.4.5-install/lib -lmsi_complex -lpumi -lcrv -lph -lsam -lspr -lma -lapf_zoltan -lparma -lmds -lapf -llion -lmth -lgmi -lpcu -lzoltan -Wl,--end-group -L/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/lib -lpetsc -Wl,-rpath,/lore/seol/petsc-3.5.4/complex-openmpi1.6.5/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lsuperlu_4.3 -lsuperlu_dist_3.3 -lflapack -lfblas -lparmetis -lmetis -lpthread -lssl -lcrypto -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lz -lmpi_f90 -lmpi_f77 -lgfortran -lm -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -L/usr/local/openmpi/1.6.5-ib/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/gcc/x86_64-linux-gnu/4.4.5 -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lnuma -lrt -lnsl -lutil -lgcc_s -lpthread -ldl

=============================
To run MSI test program with petsc
