Re: Re: datafile size

  • From: Mladen Gogala <gogala@xxxxxxxxxxxxx>
  • To: Tanel Poder <tanel.poder.003@xxxxxxx>
  • Date: Tue, 02 Nov 2004 22:59:02 +0000

On 11/02/2004 05:48:05 PM, Tanel Poder wrote:
> Hi all!
>
> Btw, in 10g with bigfile tablespaces you can have datafile sizes up
> to 128 TB.

And they say that size doesn't matter? Can utilities like tar, cpio,
gzip and bzip2 operate on such monsters? I know that "rm -f" will not
have problems even with the largest file, but that's probably not
something that my boss would like to see.


>
> This means files with 2^32-1 blocks per datafile - it can be done
> because in a bigfile tablespace, the ROWID bits for relative fno are
> now used for block# as well. This effectively means that you can
> have only one datafile in a bigfile TS.
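A quick sanity check of the arithmetic above: with the 10 relative-fno
bits of the smallfile ROWID folded into the 22-bit block number, a
bigfile ROWID can address 2^32-1 blocks. Assuming Oracle's maximum
block size of 32 KB (an assumption for illustration; smaller block
sizes give proportionally smaller limits), that reproduces the 128 TB
figure:

```python
# Sketch: derive the bigfile datafile size limit from ROWID bit widths.
FNO_BITS = 10                 # smallfile relative file number bits
BLOCK_BITS = 22               # smallfile block number bits
BIGFILE_BLOCK_BITS = FNO_BITS + BLOCK_BITS   # 32 bits for block# in a bigfile TS

max_blocks = 2**BIGFILE_BLOCK_BITS - 1       # 2^32 - 1 addressable blocks
block_size = 32 * 1024                       # assume the 32 KB maximum block size
max_file_bytes = max_blocks * block_size

print(max_file_bytes / 2**40)  # just under 128 TiB
```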

You say that in the Highlander tablespaces, as far as data files are
concerned, there can be only one? Interesting.

--
Mladen Gogala
Oracle DBA


--
//www.freelists.org/webpage/oracle-l