
Forum: open-discussion

RE: Multithreaded Pastix inside PETSc
By: Xavier Lacoste on 2009-11-12 10:30
[forum:103379]
Hello,

The patch has been applied to the PETSc sources, available via Mercurial:

hg clone http://petsc.cs.iit.edu/petsc/petsc-dev (http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html)

XL.

RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-10-22 06:04
[forum:103221]
Yes, thanks!
It's working well now.
The missing call was pastix_checkMatrix(...) to symmetrize the pattern, I guess!
You can send a mail to petsc-maint so they can correct it in the next release.
Thanks again!

--
Desire

RE: Multithreaded Pastix inside PETSc
By: Xavier Lacoste on 2009-10-21 18:00
[forum:103216]
Hi,

Could you please try adding this block at the end of the function MatConvertToCSC in the file petsc-dev/src/mat/impls/aij/mpi/pastix/pastix.c, just before the return statement?

{
  pastix_int_t   *tmpcolptr;
  pastix_int_t   *tmprows;
  pastix_float_t *tmpvalues;

  /* The scalar sizes must match, since we memcpy() between PETSc and
     PaStiX arrays (the same is implicitly assumed for PetscInt and
     pastix_int_t). */
  if (sizeof(PetscScalar) != sizeof(pastix_float_t)) {
    SETERRQ2(PETSC_ERR_SUP,"sizeof(PetscScalar) %d != sizeof(pastix_float_t) %d",
             (int)sizeof(PetscScalar), (int)sizeof(pastix_float_t));
  }

  /* pastix_checkMatrix() may reallocate the arrays it is given while
     symmetrizing the pattern, so work on plain malloc()ed copies rather
     than on the PetscMalloc()ed arrays. */
  tmpcolptr = malloc((*n+1)*sizeof(pastix_int_t));
  tmprows   = malloc(nnz*sizeof(pastix_int_t));
  tmpvalues = malloc(nnz*sizeof(pastix_float_t));
  memcpy(tmpcolptr, *colptr, (*n+1)*sizeof(PetscInt));
  memcpy(tmprows,   *row,    nnz*sizeof(PetscInt));
  memcpy(tmpvalues, *values, nnz*sizeof(PetscScalar));
  ierr = PetscFree(*row);CHKERRQ(ierr);
  ierr = PetscFree(*values);CHKERRQ(ierr);

  /* Symmetrize the pattern (API_YES asks for the matrix to be corrected). */
  pastix_checkMatrix(MPI_COMM_WORLD, API_VERBOSE_NO,
                     ((isSym != 0) ? API_SYM_YES : API_SYM_NO), API_YES,
                     *n, &tmpcolptr, &tmprows, &tmpvalues, NULL);

  /* Copy the corrected arrays back; the new number of nonzeros is
     (*colptr)[*n]-1 (the column pointers are 1-based). */
  memcpy(*colptr, tmpcolptr, (*n+1)*sizeof(PetscInt));
  free(tmpcolptr);
  ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscInt), row);CHKERRQ(ierr);
  memcpy(*row, tmprows, ((*colptr)[*n]-1)*sizeof(PetscInt));
  free(tmprows);
  ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscScalar), values);CHKERRQ(ierr);
  memcpy(*values, tmpvalues, ((*colptr)[*n]-1)*sizeof(PetscScalar));
  free(tmpvalues);
}

I'm not sure it's the best way to write it, but it seems to work. I'll ask the petsc-dev team whether they would do it like that.

Thanks

RE: Multithreaded Pastix inside PETSc
By: Xavier Lacoste on 2009-10-21 16:15
[forum:103215]
Hi,

I'm currently having a look at it, and it seems the problem comes from the fact that PaStiX always needs a symmetric pattern for its matrix, and I didn't check for that when I worked on the interface...

I'm currently working on a patch to correct that.

XL.

RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-10-21 16:06
[forum:103214]
Hi all!
Finally, the problem doesn't come from PETSc but from my installation or from my code.
The version is pastix_release_1789.
I have just sent Xavier a small code I wrote to test the installed version.
I'm checking the installed version of Scotch.

--
Desire


RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-10-20 16:03
[forum:103201]
Please find an example here: http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/ksp/pc/examples/tests/ex2.c.html
The source code is at $PETSC_DIR/src/ksp/pc/examples/tests/ex2.c.
You can modify the input matrix to make it nonsymmetric, then call PaStiX using:
-ksp_type preonly -pc_type lu
-pc_factor_mat_solver_package pastix -ksp_monitor
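
For instance, something like this (assuming the example is built as ex2 in that directory):

mpiexec -n 2 ./ex2 -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package pastix -ksp_monitor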

Regards

--
Desire

RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-10-20 15:45
[forum:103200]
Hello Xavier,
Thanks for your answer.
The version of PETSc is 3.0.0-p2.
As I test several methods (GMRES, MUMPS, SuperLU, PaStiX), I'm using the following sequence of PETSc functions (a sketch putting them together follows the list).

To create the matrix:
MatCreateMPIAIJ(...)
MatSetValues(...)

To create the solver context:
KSPCreate(...)
KSPSetOperators(...)
KSPSetFromOptions(...)

To create the preconditioner context:
KSPGetPC(...)
PCSetFromOptions(...)

To solve the system:
KSPSolve(...)

PaStiX is called with these options:
-ksp_type preonly
-pc_type lu
-pc_factor_mat_solver_package pastix
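
Here is a minimal sketch of that sequence put together, against the PETSc 3.0.0-era API; the 10x10 size, the preallocation numbers and the right-hand side of ones are made up for illustration:

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, (char*)0, (char*)0);CHKERRQ(ierr);

  /* Create and fill the distributed matrix (10x10 here, made up). */
  ierr = MatCreateMPIAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                         10, 10, 5, PETSC_NULL, 5, PETSC_NULL, &A);CHKERRQ(ierr);
  /* ... MatSetValues(...) for each local row ... */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Solver context; -ksp_type, -pc_type and
     -pc_factor_mat_solver_package are picked up here. */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  /* Preconditioner context. */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetFromOptions(pc);CHKERRQ(ierr);

  /* Solve with a right-hand side of ones. */
  ierr = MatGetVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(ksp);CHKERRQ(ierr);
  ierr = MatDestroy(A);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = VecDestroy(b);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}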

--
Desire

RE: Multithreaded Pastix inside PETSc
By: Xavier Lacoste on 2009-10-20 15:10
[forum:103199]
Re-hello,

Can I have a look at the code you use to call PaStiX from PETSc?

Which version of PETSc?

I don't use PETSc often; I don't really remember how to use it...

XL.

RE: Multithreaded Pastix inside PETSc
By: Xavier Lacoste on 2009-10-20 14:55
[forum:103198]
Hello,

It seems that we have to check and correct our PaStiX call in PETSc for unsymmetric matrices.

I'll add this to our TODO list and let you know when it has been checked, and what to modify in the PETSc interface to PaStiX to correct it.

Thanks,

XL.

RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-10-12 07:42
[forum:103117]
Hello.
Please excuse me for this question, as it is again related to PaStiX inside PETSc.
I can now solve my systems inside PETSc using the multithreaded version of PaStiX, but only with symmetric matrices.
For unsymmetric ones, I get the following errors:

Check : ordering OK
Check : Graph Symmetry
Check : Sort CSC OK
ERROR: graphCheck: arc data do not match

ERROR: pastix : graphCheck

ERROR: graphCheck: arc data do not match

ERROR: pastix : graphCheck

-----------------------------------------------
The input options are the following:

+--------------------------------------------------------------------+
+ PaStiX : Parallel Sparse matriX package +
+--------------------------------------------------------------------+
Matrix size 1080 x 1080
Number of nonzeros 23094
+--------------------------------------------------------------------+
+ Options +
+--------------------------------------------------------------------+
Version : exported
SMP_SOPALIN : Defined
VERSION MPI : Defined
PASTIX_BUBBLE : Not defined
STATS_SOPALIN : Not defined
NAPA_SOPALIN : Defined
TEST_IRECV : Not defined
TEST_ISEND : Defined
THREAD_COMM : Not defined
THREAD_FUNNELED : Not defined
TAG : Exact Thread
FORCE_CONSO : Not defined
RECV_FANIN_OR_BLOCK : Not defined
OUT_OF_CORE : Not defined
DISTRIBUTED : Not defined
FLUIDBOX : Not defined
METIS : Not defined
INTEGER TYPE : int
FLOAT TYPE : double
+--------------------------------------------------------------------+


The PaStiX interface in PETSc can be found here:
http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/mat/impls/aij/mpi/pastix/pastix.c.html

The man page is here: http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MAT_SOLVER_PASTIX.html

Nowhere is it said that the pattern of the input matrix should be symmetric.

I apologize again if this post is off-topic.

Desire

RE: Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-09-29 12:19
[forum:103005]
Mathieu, thanks for this answer.
I will try it and let you know how it goes.
Desire

RE: Multithreaded Pastix inside PETSc
By: Mathieu Faverge on 2009-09-29 12:12
[forum:103004]
Hi,

The latest release of PETSc includes the PaStiX 5.1.1 release, so you can try to use PaStiX that way, but it is not supported.

> First, the local matrices are nonsymmetric (value and structure). Is there any problem with this?

No, it's not a problem. PaStiX symmetrizes its internal structure for the factorization, but users can provide a nonsymmetric pattern.

> could you advise me on a way to use multithreaded PaStiX to solve those local systems on SMP nodes?

If you want to use PaStiX on SMP nodes, you just have to set IPARM_THREAD_NBR to the number of cores and launch one MPI process per node (a quick sketch is below). Be careful to link with a sequential BLAS library to get good performance.
I have never tried multithreaded PaStiX with PETSc, but I think it should be OK.
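
In the native interface, that looks roughly like this (a sketch assuming the PaStiX 5.x C API from pastix.h; the CSC matrix, the perm/invp arrays and the right-hand side are assumed to be set up already, and the 8 cores are made up):

#include <mpi.h>
#include "pastix.h"

void solve_with_threads(pastix_int_t n, pastix_int_t *colptr,
                        pastix_int_t *rows, pastix_float_t *values,
                        pastix_int_t *perm, pastix_int_t *invp,
                        pastix_float_t *rhs)
{
  pastix_data_t *pastix_data = NULL;
  pastix_int_t   iparm[IPARM_SIZE];
  double         dparm[DPARM_SIZE];

  /* First call: API_TASK_INIT with IPARM_MODIFY_PARAMETER = API_NO
     fills iparm/dparm with the default values. */
  iparm[IPARM_MODIFY_PARAMETER] = API_NO;
  iparm[IPARM_START_TASK]       = API_TASK_INIT;
  iparm[IPARM_END_TASK]         = API_TASK_INIT;
  pastix(&pastix_data, MPI_COMM_WORLD, n, colptr, rows, values,
         perm, invp, rhs, 1, iparm, dparm);

  /* One MPI process per node, one thread per core (8 here). */
  iparm[IPARM_THREAD_NBR] = 8;

  /* Run the remaining steps, from ordering to solve, then clean up. */
  iparm[IPARM_START_TASK] = API_TASK_ORDERING;
  iparm[IPARM_END_TASK]   = API_TASK_CLEAN;
  pastix(&pastix_data, MPI_COMM_WORLD, n, colptr, rows, values,
         perm, invp, rhs, 1, iparm, dparm);
}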

I hope I have answered your questions. If not, don't hesitate to ask for more information.

Mathieu


Multithreaded Pastix inside PETSc
By: DESIRE NUENTSA WAKAM on 2009-09-29 11:43
[forum:103002]
Hello PETSc developers.
I'm working on a package that uses a multiplicative Schwarz preconditioner with a parallel version of the GMRES method. The package is written using PETSc.
At present, I'm using sequential and distributed packages (SuperLU, UMFPACK, SuperLU_DIST, MUMPS) to solve the local systems induced by that preconditioner.
Now, I want to try some multithreaded packages such as PaStiX. First, the local matrices are nonsymmetric (in values and structure). Is there any problem with this?
Also, since PETSc is not thread-safe (http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/petscthreads.html), could you advise me on a way to use multithreaded PaStiX to solve those local systems on SMP nodes?
Thanks in advance for your answers.
Désiré