
Forum: help

RE: PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-23 17:25
[forum:148468]
Hi XL,

I am going to try this next week and will let you know how it goes.
Thank you, Mathieu Faverge, for your comment. Indeed, what I am actually comparing is wall-clock time. The number of operations was just for me to understand why it is faster or slower than other solvers like MUMPS.

Best regards,
Serban


RE: PASTIX MPI and ParMETIS
By: Xavier Lacoste on 2014-04-23 07:15
[forum:148466]
Hello Serban,

Yes, the Metis 5 interface is different and we haven't updated the code to use it.

Maybe you could get an ordering from Metis 5 through PETSc and give it to PaStiX using API_ORDER_PERSONAL?
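
For instance, if you can call Metis 5 directly, something along these lines could produce the permutation arrays (untested sketch; n, xadj, adjncy, perm and iperm are placeholder names, and you should check that the perm/iperm convention of Metis matches what PaStiX expects for perm/invp):

#include <stdlib.h>
#include <metis.h>

/* Graph of the matrix in CSR form: n vertices, 0-based xadj/adjncy,
   without the diagonal entries. */
idx_t  nvtxs = n;
idx_t *perm  = malloc(n * sizeof(idx_t));   /* permutation         */
idx_t *iperm = malloc(n * sizeof(idx_t));   /* inverse permutation */
int    status = METIS_NodeND(&nvtxs, xadj, adjncy, NULL, NULL, perm, iperm);
if (status != METIS_OK) { /* handle the error */ }
/* perm/iperm can then be handed to PaStiX with API_ORDER_PERSONAL. */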

Mathieu Faverge wanted to add that the number of operations is not the only parameter to look at. You obtain a smaller number of operations with Metis, but there can be less parallelism in the elimination tree...

XL.

RE: PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-22 13:32
[forum:148465]
Hello XL,

Got a bit further.
I made the modification you suggested and METIS is now called. However, I am getting the following error:

Check : Duplicates OK
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
Input Error: Incorrect objective type.
calling metis...
Input Error: Incorrect objective type.
Time to compute ordering 0.00388 s

WARNING: metis or personal ordering can't be used without kass, forced use of kass.

[5]PETSC ERROR: ------------------------------------------------------------------------
[5]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[5]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[5]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[5]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[5]PETSC ERROR: likely location of problem given in stack below
...

I suspect that might be because PETSc (3.4.4) is compiled with METIS 5 while I compiled PASTIX using METIS 4. I am now trying to "convince" PETSc to compile with METIS 4; maybe that will solve the problem.
I tried to compile PASTIX with METIS 5 but got compilation errors. Is it correct to assume that METIS 5 is not supported by PASTIX?

Cheers,
Serban

RE: PASTIX MPI and ParMETIS
By: Xavier Lacoste on 2014-04-17 03:38
[forum:148449]
Hello,

Yes, you have to get a column permutation array, construct the reverse permutation array (such that invp[perm[i]] == i) and use it when calling PaStiX with
lu->iparm[IPARM_START_TASK] = API_TASK_ORDERING;
and
lu->iparm[IPARM_ORDERING] = API_ORDER_PERSONAL.
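
For example (untested sketch with placeholder names; the 0- or 1-based numbering has to match what PaStiX expects, see IPARM_BASEVAL):

/* perm[] is the column permutation you obtained; build its inverse. */
int i;
for (i = 0; i < n; i++)
    invp[perm[i]] = i;                        /* so that invp[perm[i]] == i */

lu->iparm[IPARM_ORDERING]   = API_ORDER_PERSONAL;
lu->iparm[IPARM_START_TASK] = API_TASK_ORDERING;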

Regards,

XL

RE: PASTIX MPI and ParMETIS
By: Xavier Lacoste on 2014-04-17 03:34
[forum:148448]
Hello Serban,

Yes, this should be good with that setting...
It should be placed next to:
lu->iparm[IPARM_START_TASK] = API_TASK_ORDERING;
lu->iparm[IPARM_END_TASK] = API_TASK_NUMFACT;

You should see a message like:
"metis or personal ordering can't be used without kass, forced use of kass."

XL.

RE: PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-16 15:37
[forum:148447]
Hello XL,

Thank you again for the reply.

Regarding the number of operations, here is some data that I got for one of the matrices that I am trying to solve:

========== pastix + metis ============
~/Programs/pastix_release_6eafa91_2/src/example/bin/simple -mm mat.mm -t 1 -ord metis
Number of operations (LLt) 1.864e+11

========== pastix + scotch ============
~/Programs/pastix_release_6eafa91_2/src/example/bin/simple -mm mat.mm -t 1 -ord scotch
Number of operations (LLt) 2.59558e+11

========== pastix + pt-scotch ============
~/Programs/pastix_release_6eafa91_2/src/example/bin/simple_dist -mm mat.mm -t 1 -ord scotch
Number of operations (LLt) 3.68213e+11

========== PETSc + pastix + scotch ============
/work/serban/benchtools/solver_mpi -mat mat.petscbin -matrhs rhs.petscbin -matsol sol.petscbin -mat_type sbaij -ksp_monitor -ksp_type preonly -pc_type cholesky -pc_factor_mat_solver_package pastix -mat_pastix_threadnbr 1 -mat_pastix_verbose 1
Number of operations (LLt) 2.86562e+11

As you can see above, METIS looks better than SCOTCH, which looks better than PT-SCOTCH. This is why I would like to use METIS or ParMETIS.

I have followed your suggestion about adding METIS support to PETSc. I have succeeded in compiling METIS support into the PASTIX built inside PETSc. PETSc now shows METIS = Defined, so I take it that all is OK.

+--------------------------------------------------------------------+
+ PaStiX : Parallel Sparse matriX package +
+--------------------------------------------------------------------+
Matrix size 894555 x 894555
Number of nonzeros in A 28282032
+--------------------------------------------------------------------+
+ Options +
+--------------------------------------------------------------------+
Version : exported
SMP_SOPALIN : Defined
VERSION MPI : Defined
PASTIX_DYNSCHED : Not defined
STATS_SOPALIN : Not defined
NAPA_SOPALIN : Defined
TEST_IRECV : Not defined
TEST_ISEND : Defined
THREAD_COMM : Not defined
THREAD_FUNNELED : Not defined
TAG : Exact Thread
FORCE_CONSO : Not defined
RECV_FANIN_OR_BLOCK : Not defined
OUT_OF_CORE : Not defined
DISTRIBUTED : Defined
METIS : Defined
WITH_SCOTCH : Defined
INTEGER TYPE : int32_t
FLOAT TYPE : double
+--------------------------------------------------------------------+

What I haven't yet managed is to modify src/mat/impls/aij/mpi/pastix/pastix.c in order to use METIS (hard-coded at this point).
I've added
lu->iparm[IPARM_ORDERING] = API_ORDER_METIS;
all over the place; however, it does not seem to make a difference, so I think I am doing it wrong.
What is the correct way?

Cheers,
Serban

RE: PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-16 15:45
[forum:148446]
Hello again,

I was actually about to ask you about providing the ordering, since PETSc does support ParMETIS.
I see there is this function called MatGetOrdering: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/MatOrderings/MatGetOrdering.html
The output of this is a permutation vector. Is this something that PASTIX could use via API_ORDER_PERSONAL?
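
For example, would something along these lines be usable (untested sketch; perm/invp are placeholder arrays and I am not sure about the numbering base PaStiX expects)?

IS             rperm, cperm;
const PetscInt *idx;
PetscInt       i, n;
PetscErrorCode ierr;

ierr = MatGetOrdering(A, MATORDERINGND, &rperm, &cperm);CHKERRQ(ierr);
ierr = ISGetSize(cperm, &n);CHKERRQ(ierr);
ierr = ISGetIndices(cperm, &idx);CHKERRQ(ierr);
for (i = 0; i < n; i++) {
  perm[i]      = idx[i];   /* column permutation                      */
  invp[idx[i]] = i;        /* inverse permutation: invp[perm[i]] == i */
}
ierr = ISRestoreIndices(cperm, &idx);CHKERRQ(ierr);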

Cheers,
Serban

RE: PASTIX MPI and ParMETIS
By: Xavier Lacoste on 2014-04-15 08:40
[forum:148436]
Hello,

If I'm up to date with the PETSc repository, the version used is the centralized one (pastix()). The matrix is centralized and then pastix() is called (which calls Scotch).
I have prepared a patch to use the distributed one, but it's not finished yet... (The compilation script needs updating to get the new version...)

In the build step the Metis option is not activated, so PaStiX cannot call it (a few lines would need to be added to the config.in file in the compilation script in PETSc):

# uncomment the following lines for using metis ordering
#VERSIONORD = _metis
#METIS_HOME = /home/ramet/metis-4.0.3
#CCPASTIX := $(CCPASTIX) -DMETIS -I$(METIS_HOME)/Lib
#EXTRALIB := $(EXTRALIB) -L$(METIS_HOME) -lmetis

With that, one would be able to use Metis if src/mat/impls/aij/mpi/pastix/pastix.c is also modified to get the argument with PetscOptionsInt() and set IPARM_ORDERING to API_ORDER_METIS...
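
For example, something like this (untested sketch; the option name -mat_pastix_ordering is hypothetical, and I use PetscOptionsGetInt() here instead of a PetscOptionsBegin()/PetscOptionsInt() block; exact signatures depend on the PETSc version):

PetscInt  ord = API_ORDER_SCOTCH;
PetscBool set = PETSC_FALSE;

ierr = PetscOptionsGetInt(NULL, "-mat_pastix_ordering", &ord, &set);CHKERRQ(ierr);
if (set) lu->iparm[IPARM_ORDERING] = ord;   /* e.g. API_ORDER_METIS */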

This would only require changes in the PETSc sources. But if you want ParMETIS, we have to add it in PaStiX. It will be done, but I can't give you a date for the next release... Aren't the orderings better with Scotch than with PT-Scotch? Is Scotch also worse than Metis/ParMetis?

One could even use one's own ordering by filling perm/invp manually and calling the ordering step (IPARM_START_TASK = API_TASK_ORDERING) with IPARM_ORDERING = API_ORDER_PERSONAL, but I don't know how to get an ordering from PETSc.
This change would also be made in src/mat/impls/aij/mpi/pastix/pastix.c.

Regards,

XL

RE: PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-14 15:39
[forum:148435]
Dear Xavier,

Thank you for the quick reply and for the clarification.

I am using PASTIX via PETSc with a distributed matrix, which, if my understanding is correct, uses the dpastix interface. It would be nice if ParMETIS were supported, since for the matrices that I am trying to solve, ParMETIS seems to perform better (i.e., generate less fill-in) than PT-SCOTCH.

Best regards,
Serban



RE: PASTIX MPI and ParMETIS
By: Xavier Lacoste on 2014-04-14 13:40
[forum:148434]
Hello,

I think there is a misunderstanding. Indeed, the pastix() and dpastix() interfaces both use MPI. Only the way of giving the matrix differs between these interfaces: the first one requires the full matrix as input, whereas the second one requires only some columns. But both of them use a distributed matrix internally.

PaStiX should work with Metis with MPI + Threads through the pastix() interface.

However, neither ParMetis nor Metis is yet available with the distributed interface of PaStiX (dpastix() or Murge). It is on the to-do list, but I don't know when it'll be done...

XL.

PASTIX MPI and ParMETIS
By: Serban Georgescu on 2014-04-14 12:48
[forum:148433]
Hello,

I have a question regarding the use of METIS in PASTIX.

It seems that METIS can only be used with the multi-threaded version of PASTIX and not with the MPI version, which, if I am not mistaken, only works with PT-Scotch.

Are there any plans to add ParMETIS support to PASTIX in the future?

Thank you,
Serban