With 20+ million directory objects, problems are inevitable; file systems are not designed for that kind of load. File names in a directory are searched either sequentially or by binary search, which is fine for a few thousand entries that fit in memory, but with 20+ million entries in a single directory, slow lookups are to be expected.
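A rough sketch of the point about lookup cost, using an in-memory simulation rather than a real file system (creating 20+ million actual files would not be practical here; the names, sizes, and helper functions are all made up for illustration):

```python
import bisect
import time

def make_entries(n):
    # Sorted synthetic file names, stand-ins for directory entries.
    return ["file_%010d.dat" % i for i in range(n)]

def linear_lookup(entries, name):
    # Sequential scan, as in an unindexed directory.
    for e in entries:
        if e == name:
            return True
    return False

def binary_lookup(entries, name):
    # Binary search over the sorted entry list.
    i = bisect.bisect_left(entries, name)
    return i < len(entries) and entries[i] == name

if __name__ == "__main__":
    entries = make_entries(1_000_000)  # far short of 20 million, but the trend holds
    target = entries[-1]               # worst case for the sequential scan

    t0 = time.perf_counter()
    linear_lookup(entries, target)
    linear_ms = (time.perf_counter() - t0) * 1000

    t0 = time.perf_counter()
    binary_lookup(entries, target)
    binary_ms = (time.perf_counter() - t0) * 1000

    print("linear: %.1f ms, binary: %.4f ms" % (linear_ms, binary_ms))
```

Even the binary search only stays cheap while the sorted entry list fits in memory; at tens of millions of entries per directory, every lookup starts paying for that.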
On 01/12/2018 06:11 PM, Kellyn Pot'Vin-Gorman wrote:
I’d only just heard of it recently, but the DBA experiencing it said it happened on any pull from an external table to do bulk loads. There are a number of bugs on Oracle Support matching the search criteria: “12.1 external table read”
Bug 21553593  20+ million directory objects causing slow reads from external tables
Bug 19597583  "external table reads" during local PDB 