Hi all,

Thanks for all the input you have provided; I've noted it down. I would like to clarify my question.

Our requirements: we have several customers running Oracle 9i and 8i (and, in the future, 10g) all around the world. Most of those databases are big (400GB-3TB). Sometimes we need to get a copy of such a database in order to investigate problems or to test new patches. The issue is how to get that copy from the customer site to the development center. What we need is the database structure (tables, code and more) plus a set of configuration tables (about 200MB or less).

* First option * (export/import)

Today we have a solution that works the way some of the answers suggested. On the customer side we do two exports: one without data, and one with the configuration tables only. This works without any problem. The problems start on the development side, where we first import the structure (which raises issues such as tablespaces and code dependencies) and then import the data (which raises another set of problems, such as constraints). This step requires the help of a DBA.

* Second option * (partial backup & duplicate restore)

What I would like to suggest (and I would like to hear your ideas on) is the following:

On the customer side - do an RMAN backup of the SYSTEM, undo and configuration tablespaces plus the archived logs - only those tablespaces, not all of the database's tablespaces. (Does RMAN support such a simple script?)

On the development side - do an RMAN restore & recover from that backup, restoring and recovering only those tablespaces (taking the other data files offline, because they are not needed). (Does RMAN support such a simple script?)

I think the second option will create bigger files to transfer over the internet, but it will be simpler to implement, and once developed it can be largely automatic, without human interaction.

I would like to get your comments.
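As a sketch of the customer-side piece of the second option: RMAN does let you back up a named subset of tablespaces plus the archived logs. The tablespace names UNDOTBS1 and CONFIG_DATA below are hypothetical placeholders for your actual undo and configuration tablespaces, and the paths are illustrative:

```
run {
  allocate channel c1 type disk;
  # back up only the tablespaces needed on the development side
  backup tablespace system, undotbs1, config_data
    format '/backup/partial_%U';
  # archived logs needed to recover them to a consistent point
  backup archivelog all format '/backup/arch_%U';
}
```

On the development side, after restoring the controlfile and only these datafiles, the datafiles of the remaining tablespaces would have to be taken offline (e.g. `alter database datafile '...' offline drop;`) before the clone can be opened - that offline step is the part most likely to need scripting around RMAN itself.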
Amihay Gonen
DBA

-----------------------------------------------------------
My old question:

Hi,

I have a requirement to develop a process to copy a production database (3TB and above) at a customer site to a QA or test environment (vendor side). The requirements are:

* Database structure - this includes tables, indexes, code (packages, triggers, ...), views, ... (select count(*) from DBA_OBJECTS = 150,000).
* Data of the main configuration tables (the total size of all these tables is less than 200MB).

This copy is done in order to investigate problems at customer sites and to verify that patches which will be released will run well. I'm thinking of two ways to develop such a process:

1. A full exp without data, plus a list of tables to export with data. Zip those files, transfer them to the vendor site, and import them into a clean database.

2. Take a hot backup of the SYSTEM and undo tablespaces and of the configuration tablespace (let's assume all the configuration tables are in the same tablespace). Zip those files, transfer them to the vendor site, create a new instance, then recover and open the database.

What do you think? Which is the best way to go? Please note that this process should run with minimal DBA involvement (DBAs are a rare resource :) ).

Amihay Gonen
DBA
--
//www.freelists.org/webpage/oracle-l
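For reference, the exp/imp approach in option 1 of the old question can be sketched with the standard export/import utilities. The schema and table names (APP.CONFIG, APP.PARAMS) and the password placeholder are illustrative only:

```
# Customer side: full export of structure only (no rows)
exp system/<password> FULL=Y ROWS=N FILE=struct.dmp LOG=struct.log

# Customer side: configuration tables with their data
exp system/<password> TABLES=(APP.CONFIG,APP.PARAMS) FILE=cfg.dmp LOG=cfg.log

# Vendor side: import the structure first, then the configuration data
imp system/<password> FULL=Y FILE=struct.dmp LOG=imp_struct.log
imp system/<password> FROMUSER=APP TOUSER=APP IGNORE=Y FILE=cfg.dmp LOG=imp_cfg.log
```

IGNORE=Y lets the data import proceed even though the tables already exist from the structure import; the constraint and code-dependency issues mentioned above would still need handling (for example, disabling constraints before the data import and re-enabling them afterwards).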