[overture] Re: HDF library version mismatch error

  • From: "Adnan Qamar" <Adnan.Qamar@xxxxxxxxxxxx>
  • To: "overture@xxxxxxxxxxxxx" <overture@xxxxxxxxxxxxx>
  • Date: Tue, 30 Apr 2013 08:57:50 +0000

Hi Bill,
The attached file contains the output I obtained after running "check.p" on a 
fresh installation of Overture. It seems the MPI tests are failing with the same 
errors that I reported to you earlier. It appears that there are some issues 
between HDF5 1.8.8 and openmpi-1.5.3/4 (which I am using), as suggested by many 
people in various forums on the internet. I will try upgrading openmpi and see 
if that resolves the problem.
Best,
-Adnan

Adnan Qamar, PhD
Mechanical Engineering,
Physical Sciences and Engineering Division,
Building 4, Room 3216,
4700 King Abdullah University of Science and Technology (KAUST),
Thuwal 23955-6900, KSA.
Email: Adnan.Qamar@xxxxxxxxxxxx
Ph: 966-2-8084896
Fax: 96628021077@xxxxxxxxxxxxxxxx
________________________________
From: overture-bounce@xxxxxxxxxxxxx [overture-bounce@xxxxxxxxxxxxx] on behalf 
of Bill Henshaw [henshaw@xxxxxxxx]
Sent: Monday, April 29, 2013 6:36 AM
To: overture@xxxxxxxxxxxxx
Subject: [overture] Re: HDF library version mismatch error

Hi Adnan,
  The file Overture/configure.options should be there. You may need to rebuild 
the parallel version from scratch
and make those changes to the Makefile after you run configure on Overture but 
before you type make.
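
Roughly, the rebuild sequence would look like this (only a sketch; substitute 
the configure options and paths from your original build):

  cd /home/qamara/over_p/Overture.v25    # the parallel Overture source tree
  # re-run configure here with the same options you used before
  # (this should regenerate Overture/configure.options)
  # then edit DataBase/Makefile as described further down in this thread
  make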

...Bill

On 04/28/2013 12:38 AM, Adnan Qamar wrote:
Hi Bill,
Yes, I have compiled the serial version on the same cluster, and the file 
'n.hdf' was produced by that serial version of ogen. I have checked the same 
file (n.hdf) with the serial version on the cluster and it works fine; it has 
issues only with the parallel version. I have checked the file permissions and 
they seem to be in order. However, when I run the regression tests on the 
serial/parallel build by typing 'check.p', I get the following error.

unable to open /home/qamara/over_p/Overture.v25/configure.options
1 at ./checkop.p line 35.

It seems the configure.options file is missing.
-Best,
Adnan

Adnan Qamar, PhD
Mechanical Engineering,
Physical Sciences and Engineering Division,
Building 4, Room 3216,
4700 King Abdullah University of Science and Technology (KAUST),
Thuwal 23955-6900, KSA.
Email: Adnan.Qamar@xxxxxxxxxxxx
Ph: 966-2-8084896
Fax: 96628021077@xxxxxxxxxxxxxxxx
________________________________
From: overture-bounce@xxxxxxxxxxxxx [overture-bounce@xxxxxxxxxxxxx] on behalf 
of Bill Henshaw [henshaw@xxxxxxxx]
Sent: Sunday, April 28, 2013 5:50 AM
To: overture@xxxxxxxxxxxxx
Subject: [overture] Re: HDF library version mismatch error

Hi Adnan,
  This is a curious error. Does the file n.hdf have the correct read 
permissions?

   Have you tried building the serial version of Overture on your cluster and 
running the
regression tests with check.p ?

Regards,
  Bill

On 04/27/2013 04:20 AM, Adnan Qamar wrote:
Hi Bill,
I have changed the order of the -I flags as you suggested. After recompiling I 
am getting different errors: plotStuff just quits, whereas cgins prints some 
errors. The errors I am getting are pasted below.

PLOTSTUFF ERROR
===================================================
A++ Internal_Index bounds checking: ON
Type: `plotStuff [-noplot] [-nopause] [-plot3d] [-ovText] fileName 
[file[.cmd]]' to read the show file called fileName,
                                                                       and 
optionally read a command file.
  or: `plotStuff [-noplot] [-nopause] [-plot3d] [-ovText] file.cmd' to run the 
command file (with first command the show file name).
Successfully opened /home/qamara/.overturerc for reading
Unknown keyword `ndwindow*width' in the .overturerc file.
Not using the colour `MEDIUMGOLDENROD'
User commands are being saved in the file `plotStuff.cmd'
mount the showfile: n.hdf
ShowFileReader::ERROR: unable to open an old file = n.hdf (or n.hdf.hdf) (or 
n.hdf.show)
error
Overture::abort: I am now going to purposely abort so that you can get a 
traceback from a debugger
Segmentation fault (core dumped)
 =============================================================

CGINS ERROR
===============================================================
$grid
OvertureParser::result = [n.hdf]
readOrBuildTheGrid:Try to read the overlapping grid file : n.hdf
 ***** Mounting file n.hdf****
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1522 in H5Fopen(): unable to open file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Sat Apr 27 
13:54:24 2013
, name = 'n.hdf', tent_flags = 0
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
getFromADataBase:ERROR: unable to open an old file = n.hdf (or n.hdf.hdf ),  
(or n.hdf.show )
readOrBuildTheGrid:ERROR return from getFromADataBase
Error occured in file src/readOrBuildTheGrid.C line 50.
error
Overture::abort: I am now going to purposely abort so that you can get a 
traceback from a debugger
Segmentation fault (core dumped)
==================================================================
Any suggestions on how to fix this?

Best,
-Adnan


Adnan Qamar, PhD
Mechanical Engineering,
Physical Sciences and Engineering Division,
Building 4, Room 3216,
4700 King Abdullah University of Science and Technology (KAUST),
Thuwal 23955-6900, KSA.
Email: Adnan.Qamar@xxxxxxxxxxxx
Ph: 966-2-8084896
Fax: 96628021077@xxxxxxxxxxxxxxxx
________________________________
From: overture-bounce@xxxxxxxxxxxxx [overture-bounce@xxxxxxxxxxxxx] on behalf 
of Bill Henshaw [henshaw@xxxxxxxx]
Sent: Thursday, April 25, 2013 5:58 AM
To: overture@xxxxxxxxxxxxx
Subject: [overture] Re: HDF library version mismatch error

Hi Adnan,

  As you noted, the problem occurs because the HDF 1.8.5 header files are being 
found first by the compiler when it compiles files, while the loader is finding 
the HDF 1.8.8 libraries.

  Here is my compile line when the files in Overture/DataBase are compiled:

g++ -fPIC -I/home/henshaw.0/Overture.g/include -I.   -DUSE_MESA 
-I/home/henshaw.0/A++P++/A++P++-4.3.2-64/A++/install/include 
-I/home/henshaw.0/software/OpenGL/Mesa-7.2.intel.gcc4.3.2/include 
-I/usr/include  -DBL_USE_DOUBLE -DBL_Solaris 
-I/usr/lib64/perl5/5.8.8/x86_64-linux-thread-multi/CORE   -g -DH5_USE_16_API 
-I/home/henshaw.0/software/hdf/hdf5-1.6.5-gcc4.3.2-64/include -c 
GenericDataBase.C HDF_DataBase.C DataBaseBuffer.C dbAccess.C dbFunctions.f 
kk_ptr.cc

  The -I flags tell the compiler where to look for .h files. In the above case, 
if there were HDF .h files
in /usr/include then these would be found before the .h files in
   /home/henshaw.0/software/hdf/hdf5-1.6.5-gcc4.3.2-64/include

You could edit the Overture/DataBase/Makefile and change the order of the -I 
flags so that the HDF include
directory is found first. I think you can make the following change in this 
file:

DataBase_date: $(Source)
#      $(CC) $(CCFLAGS) -DH5_USE_16_API -I$(HDF)/include -c $?
      $(CC) -DH5_USE_16_API -I$(HDF)/include $(CCFLAGS)  -c $?
      touch $@

Then in Overture/DataBase, type
  rm *.o *_date
and then type "make" in the Overture directory.

...Bill




On 04/23/2013 01:42 AM, Adnan Qamar wrote:
Hi Bill,
I am trying to install parallel Overture/CG (PETSc=on) on one of our HPC 
clusters. Everything appears to compile smoothly and all the executables are 
generated. However, when I run parallel ogen or cgins with a grid file created 
by a serial version of ogen, it gives me an HDF5 library version mismatch error 
(pasted below). The serial version of ogen is compiled with the same version of 
HDF5 (1.8.8) as the parallel version.

I believe the HPC cluster already has HDF5 1.8.5 installed by root, and somehow 
while compiling the parallel or serial Overture/CG it is picking up the 
originally installed HDF5 1.8.5 libraries instead of the library path I am 
providing via the HDF variable in the defenv file. I would appreciate it if you 
could point me to a possible fix for this issue.
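
A quick way to confirm which header versions are actually present (assuming the 
system copy lives under /usr/include) is to read them straight out of 
H5public.h:

  grep H5_VERS_INFO /usr/include/H5public.h
  grep H5_VERS_INFO /home/qamara/over_p/hdf5-1.8.8/include/H5public.h

The first should report 1.8.5 and the second 1.8.8 if the system headers really 
are shadowing the local installation at compile time.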
Best,
Adnan
===================================================
Unknown keyword `ndwindow*width' in the .overturerc file.
Not using the colour `MEDIUMGOLDENROD'
User commands are being saved in the file `plotStuff.cmd'
mount the showfile: n.hdf
Warning! ***HDF5 library version mismatched error***
The HDF5 header files used to compile this application do not match
the version used by the HDF5 library to which this application is linked.
Data corruption or segmentation faults may occur if the application continues.
This can happen when an application was compiled by one version of HDF5 but
linked with a different version of static or shared HDF5 library.
You should recompile the application or check your shared library related
settings such as 'LD_LIBRARY_PATH'.
You can, at your own risk, disable this warning by setting the environment
variable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.
Setting it to 2 or higher will suppress the warning messages totally.
Headers are 1.8.5, library is 1.8.8
        SUMMARY OF THE HDF5 CONFIGURATION
        =================================

General Information:
-------------------
           HDF5 Version: 1.8.8
          Configured on: Sun Apr 21 10:12:00 AST 2013
          Configured by: qamara@fen3
         Configure mode: production
            Host system: x86_64-unknown-linux-gnu
          Uname information: Linux fen3 2.6.32-358.2.1.el6.x86_64 #1 SMP Wed 
Feb 20 12:17:37 EST 2013 x86_64 x86_64 x86_64 GNU/Linux
               Byte sex: little-endian
              Libraries:
         Installation point: /home/qamara/over_p/hdf5-1.8.8

Compiling Options:
------------------
               Compilation Mode: production
                     C Compiler: /usr/lib64/openmpi/bin/mpicc
                         CFLAGS:
                      H5_CFLAGS: -std=c99 -pedantic -Wall -Wextra -Wundef 
-Wshadow -Wpointer-arith -Wbad-function-cast -Wcast-qual -Wcast-align 
-Wwrite-strings -Wconversion -Waggregate-return -Wstrict-prototypes 
-Wmissing-prototypes -Wmissing-declarations -Wredundant-decls -Wnested-externs 
-Winline -Wno-long-long -Wfloat-equal -Wmissing-format-attribute 
-Wmissing-noreturn -Wpacked -Wdisabled-optimization -Wformat=2 
-Wunreachable-code -Wendif-labels -Wdeclaration-after-statement 
-Wold-style-definition -Winvalid-pch -Wvariadic-macros -Wnonnull -Winit-self 
-Wmissing-include-dirs -Wswitch-default -Wswitch-enum -Wunused-macros 
-Wunsafe-loop-optimizations -Wc++-compat -Wstrict-overflow -Wlogical-op 
-Wlarger-than=2048 -Wvla -Wsync-nand -Wframe-larger-than=16384 
-Wpacked-bitfield-compat -O3 -fomit-frame-pointer -finline-functions
                      AM_CFLAGS:
                       CPPFLAGS:
                    H5_CPPFLAGS: -D_POSIX_C_SOURCE=199506L   -DNDEBUG 
-UH5_DEBUG_API
                    AM_CPPFLAGS: -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE 
-D_BSD_SOURCE
               Shared C Library: no
               Static C Library: yes
  Statically Linked Executables: no
                        LDFLAGS:
                     H5_LDFLAGS:
                     AM_LDFLAGS:
          Extra libraries:  -lz -lrt -lm
                Archiver: ar
               Ranlib: ranlib
           Debugged Packages:
            API Tracing: no

Languages:
----------
                        Fortran: no

                            C++: no

Features:
---------
                  Parallel HDF5: mpicc
             High Level library: yes
                   Threadsafety: no
            Default API Mapping: v18
 With Deprecated Public Symbols: yes
         I/O filters (external): deflate(zlib)
         I/O filters (internal): shuffle,fletcher32,nbit,scaleoffset
                            MPE:
                     Direct VFD: no
                        dmalloc: no
Clear file buffers before write: yes
           Using memory checker: no
         Function Stack Tracing: no
                           GPFS: no
      Strict File Format Checks: no
   Optimization Instrumentation: no
       Large File Support (LFS): yes
Bye...
Segmentation fault (core dumped)
==========================================================

Adnan Qamar, PhD
Mechanical Engineering,
Physical Sciences and Engineering Division,
Building 4, Room 3216,
4700 King Abdullah University of Science and Technology (KAUST),
Thuwal 23955-6900, KSA.
Email: Adnan.Qamar@xxxxxxxxxxxx
Ph: 966-2-8084896
Fax: 96628021077@xxxxxxxxxxxxxxxx
________________________________


________________________________

This message and its contents including attachments are intended solely for the 
original recipient. If you are not the intended recipient or have received this 
message in error, please notify me immediately and delete this message from 
your computer system. Any unauthorized use or distribution is prohibited. 
Please consider the environment before printing this email.



*** check.p : machine=linux, precision=double, parallel=parallel

====================================================================================
  check.p : This perl script will run the Overture regression tests 
 
    Usage: check.p 
[debug=true/false][grids=false][cadGrids=false][op=false][rap=false] 
        debug=true/false : see more detailed results printed to the screen if 
true. 
        grids=false    : turn off the test for grid generation.
        cadGrids=false : turn off the test for grid generation from CAD.
        op=false       : turn off the test for operators.
        rap=false      : turn off the test for rapsodi (CAD fixup tests)
 
==================================================================================

   
************************************************************************************************
   *** Test: build grids : build a collection of overlapping grids in the 
sampleGrids directory ***
   
************************************************************************************************
>>>cd /home/qamara/over_p/Overture.v25/sampleGrids...
>>>make the sample grids: generate.p  

================================================================================
This perl script will run ogen and create many different grids
It will check to see if the grids seem to be correctly generated.
  Usage: 
    generate.p [options] 
 Options 
   <gridName> : the name of a single grid to check. By default check all grids. 
  
   check=<checkFileDirectory> : directory in which to look for the check files, 
default=. 
   ogen=<name> : specify where the ogen executable is.   
   cmdFileDirectory=<dir> : directory where to find the command files.   
   -replace (replace check files with those currently generated)   
   -replaceAll (replace .dp and .sp check files with those currently generated) 
  
   -np=<num> : use this many processors when running in parallel  
==============================================================================

*** machine=linux, precision=double, parallel=parallel
Using the grid generator ../bin/ogen
checking square5.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./square5.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:17 2013
, name = 'square5.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21833 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating square5.cmd ****
checking square5CC.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./square5CC.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:17 2013
, name = 'square5CC.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21844 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating square5CC.cmd ****
checking square10.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./square10.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:17 2013
, name = 'square10.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21848 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating square10.cmd ****
checking square20.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./square20.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:18 2013
, name = 'square20.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21852 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating square20.cmd ****
checking square40.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./square40.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:18 2013
, name = 'square40.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21856 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating square40.cmd ****
checking channelShort.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./channelShort.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:18 2013
, name = 'channelShort.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21860 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating channelShort.cmd ****
checking sis.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./sis.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:18 2013
, name = 'sis.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21864 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating sis.cmd ****
checking cic.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cic.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:18 2013
, name = 'cic.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21868 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cic.cmd ****
checking cic2.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cic2.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:19 2013
, name = 'cic2.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21872 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cic2.cmd ****
checking cicCC.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cicCC.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:19 2013
, name = 'cicCC.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21876 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cicCC.cmd ****
checking cic.4.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cic.4.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:19 2013
, name = 'cic.4.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21880 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cic.4.cmd ****
checking cicAdd.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cicAdd.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:19 2013
, name = 'cicAdd.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21884 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cicAdd.cmd ****
checking cilc.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./cilc.cmd > ogen.out
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21888 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating cilc.cmd ****
checking qcic.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./qcic.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:20 2013
, name = 'qcic.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21892 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating qcic.cmd ****
checking valve.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./valve.cmd > ogen.out
STOP CPINIT: Error return from STINIT (r)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 21896 on
node fen3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
 *** There was an error generating valve.cmd ****
checking valveCC.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./valveCC.cmd > ogen.out
STOP CPINIT: Error return from STINIT (r)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 21899 on
node fen3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
 *** There was an error generating valveCC.cmd ****
checking oneValve.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./oneValve.cmd > ogen.out
STOP CPINIT: Error return from STINIT (r)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 21902 on
node fen3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
 *** There was an error generating oneValve.cmd ****
checking stir.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./stir.cmd > ogen.out
STOP CPINIT: Error return from STINIT (r)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 21906 on
node fen3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
 *** There was an error generating stir.cmd ****
checking twoBump.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./twoBump.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:21 2013
, name = 'twoBump.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21909 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating twoBump.cmd ****
checking box5.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./box5.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:21 2013
, name = 'box5.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21913 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating box5.cmd ****
checking box10.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./box10.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:21 2013
, name = 'box10.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21917 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating box10.cmd ****
checking box20.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./box20.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:21 2013
, name = 'box20.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21921 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating box20.cmd ****
checking box40.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./box40.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:22 2013
, name = 'box40.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21925 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating box40.cmd ****
checking bib.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./bib.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:22 2013
, name = 'bib.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21929 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating bib.cmd ****
checking sib.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./sib.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:22 2013
, name = 'sib.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21933 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating sib.cmd ****
checking twoBoxesInterface.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./twoBoxesInterface.cmd > 
ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:23 2013
, name = 'twoBoxesInterfacei111.order2.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21937 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating twoBoxesInterface.cmd ****
checking sibCC.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./sibCC.cmd > ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:23 2013
, name = 'sibCC.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21941 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating sibCC.cmd ****
checking revolve.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./revolve.cmd > ogen.out
STOP CPINIT: Error return from STINIT (r)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 21945 on
node fen3 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
 *** There was an error generating revolve.cmd ****
checking sphereInATube.cmd...
mpirun -np 1 ../bin/ogen noplot nopause abortOnEnd ./sphereInATube.cmd > 
ogen.out
HDF5-DIAG: Error detected in HDF5 (1.8.8) MPI-process 0:
  #000: H5F.c line 1440 in H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: H5F.c line 1211 in H5F_open(): unable to open file: time = Tue Apr 30 
11:48:24 2013
, name = 'sphereInATube.hdf', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: H5FD.c line 1086 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 999 in H5FD_mpio_open(): MPI_ERR_OTHER: known error not 
in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
ogen: HDF5_DataBase.C:4612: virtual int HDF_DataBase::put(const int&, const 
aString&): Assertion `fileID>0 && fullGroupPath.c_str()!=__null' failed.
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 21948 on node fen3 exited on signal 
11 (Segmentation fault).
--------------------------------------------------------------------------
 *** There was an error generating sphereInATube.cmd ****
********************************************************************
********************* There were 29  ERRORS *************************
**** NOTE: some errors may occur due to differences in         *****
**** machine precision. If the numbers are similiar then there *****
**** is probably no reason for concern. (try plotting the grid)*****
********************************************************************
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
ERROR return rt=7424

 An ERROR occured for the test=buildGrids
 You may want to check the appropriate log file=.
 
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

 ****** skipping grids from cad checks in parallel  ******

   
************************************************************************************************
   *** Test: test operators and grid functions (in the tests directory)         
                ***
     
************************************************************************************************
>>>cd /home/qamara/over_p/Overture.v25/tests...
>>>run checkop.p 

================================================================================
This perl script will run some regression tests on grid functions and operators.

  Usage: 
    checkop.p [<application>]  (or `perl checkop.p') 
  Notes: 
 Applications:                                                                
     tderivatives    : tests derivatives in the operators.          
     tbcc            : tests boundary conditions for coefficient matrices.   
     tbc             : tests explicit boundary conditions.                    
     tcm3            : tests coefficient matrix solver on a CompositeGrid    
     tcm4            : tests coefficient matrix solver  (systems) on a 
CompositeGrid    
==============================================================================

*** precision=double, parallel=parallel ****
Making tderivatives
mpiCC -fPIC  -I/home/qamara/over_p/Overture.v25/include -I.   -DUSE_MESA 
-I/home/qamara/over_p/A++P++-0.8.0/P++/install/include -I/usr/include 
-I/usr/include  -DBL_AUTO_INSTANTIATE -DBL_Solaris -I/usr/lib64/perl5/CORE   
-I/usr/include/openmpi-x86_64 -DUSE_PPP -g -DUSE_PPP -o tderivatives 
tderivatives.o    -Wl,-rpath,/home/qamara/over_p/Overture.v25/lib 
-L/home/qamara/over_p/Overture.v25/lib -lOverture -lOverture_static  
-Wl,-rpath,/home/qamara/over_p/hdf5-1.8.8/lib 
-L/home/qamara/over_p/hdf5-1.8.8/lib -lhdf5 -ljpeg -lz  
-Wl,-rpath,/home/qamara/over_p/A++P++-0.8.0/P++/install/lib 
-L/home/qamara/over_p/A++P++-0.8.0/P++/install/lib -lPpp -lPpp_static 
-Wl,-rpath,/usr/lib64/openmpi/lib -L/usr/lib64/openmpi/lib -lmpi -lmpi_cxx -lc 
-lm   -Wl,-rpath, /usr/lib64/perl5/CORE -L/usr/lib64/perl5/CORE -lperl -lresolv 
-lnsl -ldl -lm -lcrypt -lutil -lpthread -lc -Wl,-rpath,/usr/lib64 -L/usr/lib64 
-lGL -lGLU  -Wl,-rpath,/usr/lib64 -L/usr/lib64 -lXm -L/usr/lib64 -lXpm -lXp 
-lXt -lXmu -lXi -lXext -lX11 -lm
/usr/lib64/perl5/CORE: file not recognized: Is a directory
collect2: ld returned 1 exit status
make: *** [tderivatives] Error 1
Error making tderivatives 
>>>tests of operators and grid functions successful!

 ****** skipping rapsodi checks in parallel ****** 

   
************************************************************************************************
   ****************** Tests: passed = 1, failed=1 
***************************************** 
   
************************************************************************************************
