RE: tar exclusions

A ludicrously large exclusion list could be the problem, but it's
probably a combination of naive matching in the exclusion-list code and
the performance hit you take when you do a readdir() on a directory
with lots of entries.  Especially on older UNIX filesystems, once you
get into the thousands or tens of thousands of files in a directory,
each opendir()/readdir() pass goes from a couple of lookups to a crawl
through a hideous linked-list structure.  Couple that with matching
every name against every exclusion pattern, and you've got a multi-hour
job.
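
Back-of-the-envelope, with made-up numbers: 200,000 directory entries
x 5,000 exclude patterns is 1,000,000,000 fnmatch()-style comparisons
before a single byte gets written.  You can also time the raw directory
scan by itself with something like this (the path is a placeholder):

time ls -f /path/to/bigdir > /dev/null

(-f tells ls not to sort, so that's essentially a pure
opendir/readdir loop.)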


If you ever find yourself in this situation again, you could run it
under strace/truss/whatever to watch the system calls, and we (or you)
could see where the time is being spent.
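
E.g. something like this on Linux (truss on Solaris is analogous; the
angle-bracket placeholders are from your original command, and
tar.syscalls is just an arbitrary output file name):

strace -c -f -o tar.syscalls tar -X <file> -cpvzf <tarfile> <directory>

-c gives a per-syscall time/count summary, and -f follows any child
processes tar spawns.  If getdents()/lstat() dominate the summary, it's
the directory walk and matching; if syscall time is small but the job
still crawls, the time is going to user CPU (i.e. the compression).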


Thanks,

Matt


________________________________

From: oracle-l-bounce@xxxxxxxxxxxxx
[mailto:oracle-l-bounce@xxxxxxxxxxxxx] On Behalf Of Niall Litchfield
Sent: Thursday, September 02, 2010 5:50 PM
To: ORACLE-L
Subject: tar exclusions


I figured this list would be a good place to ask. I recently had to tar
up a (15 GB) directory tree on Linux for transfer to a new host. The
tree had a rather large number of log and trace files that we didn't
wish to move. No problem: build an exclusion list and use


tar -X <file> -cpvzf <tarfile> <directory> 


I killed this after 10 hours. I then removed the old log and trace
files, saving only about 1 GB of space, and created a new (blank)
exclusion list


tar -X <file> -cpvf <tarfile> <directory>


took 40 mins. Is this likely to be a side effect of removing
compression (the server was pretty much idle apart from me, so I didn't
expect the CPU cycles to be an issue) or of having a ludicrously large
exclusion list?

-- 
Niall Litchfield
Oracle DBA
http://www.orawin.info
