[overture] Re: parallel ins with PETSc

  • From: Joel Guerrero <joegi.geo@xxxxxxxxx>
  • To: overture@xxxxxxxxxxxxx
  • Date: Mon, 7 Jan 2008 20:28:37 -0800 (PST)

OK, the PETSc option is set to on and I'm not copying the files from the
Overture installation.  At the moment I'm getting this error (which, by the
way, I have no clue what it is related to):



joegi@linux-imac:~/cfd/overture/cg_parallel.v22/ins> make -j2
g++ -fpic -g -w -I./src -I../common/src -I../common/shared 
-I../common/moving/src -I../common/chemistry -I../common/dataBase -I./src 
-I/home/joegi/cfd/overture/Overture_parallel.v22/include 
-I/home/joegi/cfd/overture/A++P++-0.7.9d/P++/install//include 
-I/home/joegi/openmpi-1.2.4//include -DUSE_PPP 
-I/home/joegi/cfd/overture/HDF5-1.6.5_parallel/include 
-I/home/joegi/cfd/overture/petsc-2.3.2_parallel//include  
-I/home/joegi/cfd/overture/petsc-2.3.2_parallel//bmake/linux-gnu-opt 
-DOVERTURE_USE_PETSC   -fpermissive -o obj/PETScSolver.o -c 
/home/joegi/cfd/overture/Overture_parallel.v22/Oges/PETScSolver.C
make[1]: Entering directory `/home/joegi/cfd/overture/cg_parallel.v22/common'
make[1]: Nothing to be done for `lib'.
make[1]: Leaving directory `/home/joegi/cfd/overture/cg_parallel.v22/common'
In file included from /usr/include/c++/4.2.1/iosfwd:48,
                 from /usr/include/c++/4.2.1/bits/stl_algobase.h:70,
                 from /usr/include/c++/4.2.1/bits/stl_tree.h:67,
                 from /usr/include/c++/4.2.1/map:65,
                 from /home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/mpicxx.h:35,
                 from /home/joegi/openmpi-1.2.4//include/mpi.h:1783,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petsc.h:95,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscis.h:7,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscvec.h:9,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscmat.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscpc.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscksp.h:6,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/include/PETScSolver.h:9,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/Oges/PETScSolver.C:9:
/usr/include/c++/4.2.1/bits/stringfwd.h:48: error: template with C linkage
/usr/include/c++/4.2.1/bits/stringfwd.h:51: error: template with C linkage
/usr/include/c++/4.2.1/bits/stringfwd.h:54: error: template with C linkage
/usr/include/c++/4.2.1/bits/stringfwd.h:58: error: template specialization with C linkage
/usr/include/c++/4.2.1/bits/stringfwd.h:63: error: template specialization with C linkage
In file included from /usr/include/c++/4.2.1/iosfwd:49,
                 from /usr/include/c++/4.2.1/bits/stl_algobase.h:70,
                 from /usr/include/c++/4.2.1/bits/stl_tree.h:67,
                 from /usr/include/c++/4.2.1/map:65,
                 from /home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/mpicxx.h:35,
                 from /home/joegi/openmpi-1.2.4//include/mpi.h:1783,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petsc.h:95,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscis.h:7,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscvec.h:9,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscmat.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscpc.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscksp.h:6,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/include/PETScSolver.h:9,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/Oges/PETScSolver.C:9:
/usr/include/c++/4.2.1/bits/postypes.h:80: error: template with C linkage
/usr/include/c++/4.2.1/bits/postypes.h:94: error: template with C linkage
/usr/include/c++/4.2.1/bits/postypes.h:197: error: template with C linkage
/usr/include/c++/4.2.1/bits/postypes.h:202: error: template with C linkage


  [... 100s of lines with similar errors ...]


In file included from /home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/mpicxx.h:168,
                 from /home/joegi/openmpi-1.2.4//include/mpi.h:1783,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petsc.h:95,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscis.h:7,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscvec.h:9,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscmat.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscpc.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscksp.h:6,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/include/PETScSolver.h:9,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/Oges/PETScSolver.C:9:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions.h:60: error: declaration of C function ‘void MPI::Init()’ conflicts with
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions.h:57: error: previous declaration ‘void MPI::Init(int&, char**&)’ here
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions.h:88: error: declaration of C function ‘int MPI::Init_thread(int)’ conflicts with
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions.h:85: error: previous declaration ‘int MPI::Init_thread(int&, char**&, int)’ here
In file included from /home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/mpicxx.h:245,
                 from /home/joegi/openmpi-1.2.4//include/mpi.h:1783,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petsc.h:95,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscis.h:7,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscvec.h:9,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscmat.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscpc.h:6,
                 from /home/joegi/cfd/overture/petsc-2.3.2_parallel//include/petscksp.h:6,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/include/PETScSolver.h:9,
                 from /home/joegi/cfd/overture/Overture_parallel.v22/Oges/PETScSolver.C:9:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:95: error: ‘void MPI::Init(int&, char**&)’ should have been declared inside ‘MPI’
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h: In function ‘void MPI::Init(int&, char**&)’:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:95: error: declaration of C function ‘void MPI::Init(int&, char**&)’ conflicts with
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions.h:60: error: previous declaration ‘void MPI::Init()’ here
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h: At global scope:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:102: error: ‘void MPI::Init()’ should have been declared inside ‘MPI’
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h: In function ‘void MPI::Init()’:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:102: error: declaration of C function ‘void MPI::Init()’ conflicts with
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:95: error: previous declaration ‘void MPI::Init(int&, char**&)’ here
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h: At global scope:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:147: error: ‘int MPI::Init_thread(int&, char**&, int)’ should have been declared inside ‘MPI’
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h: In function ‘int MPI::Init_thread(int&, char**&, int)’:
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:147: error: declaration of C function ‘int MPI::Init_thread(int&, char**&, int)’ conflicts with
/home/joegi/openmpi-1.2.4//include/openmpi/ompi/mpi/cxx/functions_inln.h:137: error: previous declaration ‘int MPI::Init_thread(int)’ here
make: *** [obj/PETScSolver.o] Error 1


Any clue?

joel


----- Original Message ----
From: Bill Henshaw <henshaw@xxxxxxxx>
To: overture@xxxxxxxxxxxxx
Sent: Tuesday, January 8, 2008 11:17:25 PM
Subject: [overture] Re: parallel ins with PETSc


Hi Joel:

   You should NOT copy PETScSolver.C or buildEquationSolvers.C from
the Overture directory.  The cg make system will automatically find
these files and compile them for you.  All you should need to do is set

    usePETSc   := on

in cg/ins/Makefile. When this flag is set, the above files will be
compiled in the correct way.
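
For example, the change and rebuild might look like this (a minimal sketch;
the exact surrounding Makefile lines and the need for a full rebuild are
assumptions, not taken from this thread):

    # cg/ins/Makefile: enable the PETSc-aware build of the Oges files
    usePETSc   := on

    # then rebuild cgins so PETScSolver.C and buildEquationSolvers.C are
    # recompiled with -DOVERTURE_USE_PETSC
    cd cg/ins
    make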

Sorry for the confusion.

...Bill


Joel Guerrero wrote:
> Hi Bill,
> 
> Should PETScSolver.C also be recompiled by cg?
> 
> So, according to your message, I should change the cg makefile so that
> buildEquationSolvers.C is recompiled during cg compilation?  Because so
> far I'm just copying these two files (PETScSolver.C and
> buildEquationSolvers.C) from the Overture installation dir.
> 
> 
> Have a nice day,
> 
> Joel
> 
> ----- Original Message ----
> From: Bill Henshaw <henshaw@xxxxxxxx>
> To: overture@xxxxxxxxxxxxx
> Sent: Tuesday, January 8, 2008 3:05:39 PM
> Subject: [overture] Re: parallel ins with PETSc
> 
> 
> Hi Joel,
> 
>   Your error message is different now since Oges cannot find PETScNew
> (which is the parallel PETSc solver):
> 
>  > Oges::buildEquationSolvers:ERROR: solver PETScNew is not currently available
> 
> Check the file Overture/oges/buildEquationSolvers.C to see where this
> error occurs. You may need to recompile this file. A new version of
> this file should be compiled by cg with the compile flag
> -DOVERTURE_USE_PETSC, and this new version over-rides the version that
> is in the Overture library.
> 
> ...Bill
> 
> 
> 
> Joel Guerrero wrote:
>> Hi,
>>
>> Sorry for bringing up this subject again, but I still have problems
>> and this is driving me nuts.  With the serial version of cg ins I
>> don't have problems linking it with PETSc, but with the parallel
>> version I still have the same problem, which is:
>> initialize the solution...
>>  interpolateAndApplyBoundaryConditions (start) steps=-1 t=0: number of array ID's has increased to 684
>>>>>>> Cgins::project: project the initial conditions <<<<
>>  $$$$$$$$$$$$$$$ Cgins: updateToMatchGrid(CompositeGrid & cg) $$$$$$$$$$$$
>> Oges::buildEquationSolvers:ERROR: solver PETScNew is not currently available
>>  You may have to copy and edit the file Overture/Oges/buildEquationSolvers.C
>>  and a file like PETScEquationSolver.C (if you are trying to use PETSc)
>>  and then link the files to your application in order to get a non-standard solver
>>  See the Oges documentation for further details
>> error
>> Overture::abort: I am now going to purposely abort so that you can get a traceback from a debugger
>> 7:31775] [14] ../../../../cg_parallel.v22/ins/bin/cgins(_ZN3MPI3Win4FreeEv+0xdd) [0x8059151]
>> [linux-p8i7:31775] *** End of error message ***
>> mpirun noticed that job rank 0 with PID 31775 on node linux-p8i7 exited on signal 6 (Aborted). 
>> 1 additional process aborted (not shown)
>>
>>
>> The compilation process goes fine; all the necessary files are
>> present, and so are all the required libraries.  I also checked the
>> shared library dependencies, and the executable is properly linked to
>> the PETSc libraries.  At this point I don't know what the problem
>> could be.  Do I need to change something in the source code in order
>> to use the parallel PETSc solver?
>>
>> Regards,
>>
>> joel
>>
>> ----- Original Message ----
>> From: Bill Henshaw <henshaw@xxxxxxxx>
>> To: overture@xxxxxxxxxxxxx
>> Sent: Sunday, January 6, 2008 6:16:25 PM
>> Subject: [overture] Re: parallel ins with PETSc
>>
>>
>> Joel:
>>
>>  > Oges::buildEquationSolvers:ERROR: solver PETSc is not currently
>>  available
>>
>>    It looks like you are trying to use the serial "PETSc" solver
>> instead of the parallel version. Use the "PETScNew" option when
>> specifying the solver to use; "choose best iterative solver" should
>> also work:
>>
>>     pressure solver options
>>      * PETScNew
>>      choose best iterative solver
>>
>> You could also run with gdb and set a break in OgesParameters::set(
>>  OptionEnum option, int value, real rvalue )
>> to make sure that the solver is set to PETScNew at some point.
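>> For example, a minimal gdb session might look like the following sketch
>> (the executable path and the .cmd argument are placeholders, not taken
>> from this thread):
>> 
>>    gdb ../cg_parallel.v22/ins/bin/cgins
>>    (gdb) break OgesParameters::set
>>    (gdb) run yourscript.cmd
>>    # ... when the breakpoint hits, inspect the arguments ...
>>    (gdb) print option
>>    (gdb) print value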
>>
>> Many cmd files in ins/cmd run in parallel -- grep for "mpirun" in
>> *.cmd for examples.
>>
>> ...Bill
>>
>> Joel Guerrero wrote:
>>> Hi,
>>>
>>> The PETSc option is switched on in the makefile.  Also,
>>> buildEquationSolvers.o and PETScSolver.o are properly linked.  The
>>> compilation process goes fine, but when I try to use parallel ins I
>>> always get the same error.  Besides the file cicp.cmd, is there
>>> another parallel example?
>>> cheers,
>>>
>>> jg 
>>>