[overture] Re: PETSc Error without invoking PETSc

  • From: Bill Henshaw <henshaw@xxxxxxxx>
  • To: overture@xxxxxxxxxxxxx
  • Date: Fri, 20 Jun 2008 08:26:34 -0700

It's not really a PETSc error: PETSc installs a signal handler when it
is initialized, so it catches the segmentation fault and prints its own
message even though the PETSc solvers are never called. If you run
cgins under gdb you can see where the error actually occurs.
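For example, something along these lines should work -- the command-file
name below is only a placeholder for whatever you are actually running:

   gdb --args cgins yourCommandFile.cmd
   (gdb) run
      ... gdb stops when the segmentation fault occurs ...
   (gdb) backtrace

gdb intercepts the SIGSEGV before PETSc's handler reports it, so the
backtrace shows the function where the bad memory access happens.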

...Bill

DOMINIC DENVER JOHN CHANDAR wrote:
Hi,
I'm experiencing a weird error when I run cgins with PETSc turned off in the command file. As seen below, both the implicit solver and the pressure solver are Yale, but I still get a PETSc segmentation-fault error. Any idea what's going on? Regards,
Dominic
******************************************************************
 Cgins version 0.1
 -----------------
Solving: incompressible Navier Stokes, standard model
cfl = 0.900000, tFinal=1.000000e+01, tPrint = 1.000000e-02
Time stepping method: implicit
 recompute dt at least every 100 steps.
use 2nd order artificial dissipation, ad21=1.00e+00 ad22=1.00e+00
 Interpolation type: interpolate computational variables
 Order of accuracy in space = 2
 Order of extrapolation for interpolation neighbours = 3
 Order of extrapolation for second ghost line = 3
Implicit time stepping. Order of predictor corrector=2
   implicit factor = 0.50, (.5=Crank-Nicolson, 1.=Backward Euler)
Implicit solver =yale, direct sparse solver, no pivoting (parallel), direct sparse solver, no pivoting, tolerance=0.000000e+00, max number of iterations=default
Implicit time stepping with some grids time integrated explicitly or semi-implicitly
                     box is time integrated explicitly
                   inbox is time integrated explicitly
                    wing is time integrated implicitly
                      sp is time integrated implicitly
                      np is time integrated implicitly
 Moving grid problem:
  Grid           inbox is moving : rigidBody
  Grid            wing is moving : rigidBody
  Grid              sp is moving : rigidBody
  Grid              np is moving : rigidBody
 nu=1.000000e-02, cdv=1.000000e+00, pressure solver=yale

 >>> t = 0.000e+00, dt = 3.64e-02, cpu = 0.00e+00 seconds (0 steps)
           p : (min,max)=( 1.000000e+00, 1.000000e+00)
           u : (min,max)=( 0.000000e+00, 0.000000e+00)
           v : (min,max)=( 0.000000e+00, 0.000000e+00)
           w : (min,max)=( 0.000000e+00, 0.000000e+00)
Divergence: divMax/vorMax = 0.00e+00 divl2Norm/vorMax=0.00e+00 vorMax=0.00e+00
***Save grid in the show file: numberOfComponentGrids=5
GridCollectionData:put:INFO: put the gridDistributionList. gridDistributionList.size()=5
GridDistributionList:put: numberOfElements=5
movie mode
**************** implicitPC: still need correct initial values for du/dt(t-dt) ******
**************** use values from du/dt(t) ******
InsParameters::getNormalForce: fluidDensity=1 --> 1
InsParameters::getNormalForce: fluidDensity=1 --> 1
InsParameters::getNormalForce: fluidDensity=1 --> 1
 ***moveGrids: build transform: cfg3: grid=1 dimension=[-2,42][-2,42]
 ***moveGrids: build transform: cfg3: grid=2 dimension=[-2,62][-2,42]
 ***moveGrids: build transform: cfg3: grid=3 dimension=[-2,62][-2,42]
 ***moveGrids: build transform: cfg3: grid=4 dimension=[-2,62][-2,42]
**** moveGrids: before gridGenerator->updateOverlap: Number of A++ arrays has increased to 1743
Mapping:getGrid called, remake grid for mapping hyperbolic-wingmain
Mapping:getGrid called, remake grid for mapping hyperbolic-sp
Mapping:getGrid called, remake grid for mapping hyperbolic-np
**** moveGrids: after gridGenerator->updateOverlap: Number of A++ arrays has increased to 1790
Mapping:getGrid called, remake grid for mapping hyperbolic-sp
Mapping:getGrid called, remake grid for mapping hyperbolic-sp
allocateWorkSpace: numberOfEquations=305100, nsp = 387040602, fillinRatio= 205.737, numberOfNonzeros = 1881237
applyBoundaryConditions (before applyBC's): number of array ID's has increased to 1870
**** solveForTimeIndependentVariablesINS: start: Number of A++ arrays has increased to 1830
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 2.3.2, Patch 3, Fri Sep 29 17:09:34 CDT 2006 HG revision: 9215af156a9cbcdc1ec666e2b5c7934688ddc526
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: oges on a linux-gnu named quad by pdominic Fri Jun 20 21:28:16 2008
[0]PETSC ERROR: Libraries linked from /home/pdominic/overturev22/petsc-2.3.2-p3/lib/linux-gnu-c-debug
[0]PETSC ERROR: Configure run at Tue Mar 11 18:05:14 2008
[0]PETSC ERROR: Configure options --with-cc=icc --with-cxx=icpc --with-fc=0 --with-mpi=0 LIBS="-lpthread -ldl -lutil -L /home/gcc/lib64 -lgcc_s" --download-c-blas-lapack=1 --with-shared=0
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
Abort
