Server Utilities :: Data Migration Using Datapump?
May 11, 2011
I got an assignment to create an Oracle 11g database. I will be provided the full Data Pump export dump of an Oracle 10g database on Linux, and I need to import it into the 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the database with the SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
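One way to scope the work before importing, sketched below with assumed names (the DP_DIR directory object and the dump file name are placeholders): generate a SQL file of the DDL contained in the dump to see which tablespaces and users it expects, then run the real import with remapping for anything you chose not to pre-create.

impdp system directory=DP_DIR dumpfile=full10g.dmp full=y sqlfile=ddl_preview.sql
-- review ddl_preview.sql for CREATE TABLESPACE / CREATE USER statements, then:
impdp system directory=DP_DIR dumpfile=full10g.dmp full=y logfile=imp_full.log remap_tablespace=OLD_DATA_TBS:USERS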
As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump doesn't support XML objects. Is there any other way to export the full database, or any option to export XML objects using Data Pump?
We have three instances on the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on instances 2 and 3 it fails with an error relating to the directory.
Error:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
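A hedged sketch of the usual fix: the DIRECTORY object is stored once in the (RAC) database, not per instance, but the operating-system path must exist and be readable/writable on whichever node the Data Pump processes run. Either point the directory at storage visible from all nodes, or, on 11.2, keep the job on the node you are connected to with CLUSTER=N. The path and names below are assumptions.

-- the directory object is database-wide; the OS path must be valid on every node
CREATE OR REPLACE DIRECTORY dp_dir AS '/shared/dpdump';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;

-- alternatively, on 11.2, keep all Data Pump processes on the local instance
expdp system full=y directory=dp_dir dumpfile=full.dmp logfile=full.log cluster=n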
(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without building the indexes from scratch?
(2) I am trying to replace 'exp' and 'imp' with the Data Pump utilities, but I am confused about the parameters to be used.
(a) Can I load both data and metadata at the same time (using the CONTENT=ALL option)? (b) I am planning to implement this in two steps:
first load only the metadata using CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE, then load the data in a second pass (sketched below).
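A minimal sketch of both passes, assuming a schema-level dump for SCOTT and a directory object DP_DIR (both assumed names):

-- step 1: structures only, replacing any tables that already exist
impdp system directory=DP_DIR dumpfile=scott.dmp schemas=SCOTT content=METADATA_ONLY table_exists_action=REPLACE

-- step 2: rows only, appended into the tables created in step 1
impdp system directory=DP_DIR dumpfile=scott.dmp schemas=SCOTT content=DATA_ONLY table_exists_action=APPEND

CONTENT=ALL (the default) loads metadata and data in a single pass instead.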
I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
-- Create the directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';
-- Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
-- Script to import
DECLARE
  l_dp_handle1 NUMBER;
BEGIN
  l_dp_handle1 := dbms_datapump.open(operation => 'IMPORT',
[code]...
Errors
ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
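ORA-31634 usually means a Data Pump job with the same name already exists for that user, often one left behind by an earlier failed run, or a collision on the default job name when the import is rerun. A hedged sketch of how to clear and avoid it (the owner and master-table names below are assumptions to be taken from the first query):

-- list jobs Data Pump still knows about, including orphaned NOT RUNNING ones
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

-- an orphaned job is backed by a master table of the same name; dropping it removes the job
DROP TABLE scott.SYS_IMPORT_TABLE_01 PURGE;

-- for automated loads, give each run its own job name so reruns cannot collide
l_dp_handle1 := dbms_datapump.open(operation => 'IMPORT',
                                   job_mode  => 'TABLE',
                                   job_name  => 'STG_IMP_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));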
I'm trying to deploy a schema using the Data Pump API. The user from which the schema gets deployed has the direct privilege CREATE USER (not through a role), but I got an insufficient privileges error.
Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS" TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE"
But the user gets created successfully when I run the CREATE statement manually. So I created the user manually and ran the deployment procedure again, and got the below error for ROLE_GRANTS.
Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is: GRANT "EXP_FULL_DATABASE" TO "SCOTT1"
The user has EXP_FULL_DATABASE with the ADMIN option and IMP_FULL_DATABASE with the ADMIN option as direct grants. Which privileges does the user need to deploy the schema successfully?
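A hedged diagnostic sketch: before changing grants, list exactly what the deploying user holds directly versus through roles, since privileges acquired only through a role can behave differently inside definer's-rights PL/SQL. DEPLOY_USER is a placeholder for the account running the Data Pump API.

SELECT privilege, admin_option    FROM dba_sys_privs  WHERE grantee = 'DEPLOY_USER';
SELECT granted_role, admin_option FROM dba_role_privs WHERE grantee = 'DEPLOY_USER';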
I am using a Data Pump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints, import the data, and re-enable the constraints, but in this case the temp tablespace keeps filling up and I run out of space. Is there any method to do a full import including constraints and indexes?
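A minimal sketch of the one-pass alternative over the link (constraints and indexes are included by default in a schema-mode import), with the link, schema, and directory names assumed; and since it is usually the index builds that exhaust TEMP, growing TEMP is shown as well rather than splitting the import into metadata and data passes.

-- one pass, straight from the source over the database link (no dump file needed)
impdp system network_link=SRC_LINK schemas=HR directory=DP_DIR logfile=hr_net_imp.log

-- if sorts for index/constraint creation run TEMP out of space, add a tempfile
ALTER TABLESPACE temp ADD TEMPFILE '/u01/oradata/db/temp02.dbf' SIZE 4G;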
I have analyzed that the Data Pump estimation is 9.902 GB, but when I check the size of the .dmp file it shows 1.44 GB.
Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;; Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB
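A hedged explanation and check: the BLOCKS estimate is based on blocks allocated to the segments (so free space below the high-water mark counts), while the dump stores row data compactly, so a large gap between the two is normal. ESTIMATE=STATISTICS, which relies on up-to-date optimizer statistics, usually comes closer to the eventual dump size; with ESTIMATE_ONLY no dump file is written at all.

expdp system full=y directory=DP_DIR logfile=est_check.log estimate=statistics estimate_only=y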
Expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y
...
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER
[code]...
Then my export hangs. I checked the alert log and found nothing, then killed the job and reran it, but it is the same. I checked the status and it says EXECUTING.
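A hedged way to see where it is stuck rather than killing it again: reattach to the job for a per-worker status, and map the Data Pump sessions to their current wait events. The job name is whatever dba_datapump_jobs reports (SYS_EXPORT_FULL_01 is the usual default for a full export run as SYSTEM).

-- from the OS, reattach to the job and ask for progress
expdp system attach=SYS_EXPORT_FULL_01
-- then at the Export> prompt, enter: STATUS

-- from SQL, see what the worker sessions are waiting on
SELECT s.sid, s.serial#, s.event, s.seconds_in_wait
  FROM dba_datapump_sessions d
  JOIN v$session s ON s.saddr = d.saddr;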
I have a problem with DBMS_DATAPUMP.METADATA_FILTER. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i...). Let's suppose that the list is dynamic, so I do NOT want to use:
DBMS_DATAPUMP.metadata_filter(
  handle      => h1,
  NAME        => 'NAME_EXPR',
  VALUE       => 'IN (''a'', ''b'', ...)',
  object_type => NULL);
Instead I tried:
DBMS_DATAPUMP.metadata_filter(
  handle      => h1,
  NAME        => 'NAME_EXPR',
  VALUE       => 'IN (SELECT a.export_object_name
                        FROM my_export_table a, user_objects b
                       WHERE a.export_object_name = b.object_name
                         AND b.object_type = ''TABLE'')',
  object_type => NULL);
But it results in an error:
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
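A hedged workaround: the ORA-00942 suggests the subquery is only parsed when the worker process builds its metadata query, where my_export_table may not resolve (at minimum it would need a schema qualifier). One option is to build the IN list in the calling session and pass it to METADATA_FILTER as a literal, as in this sketch (LISTAGG needs 11.2; on older releases, or for lists longer than 4000 bytes, concatenate in a loop instead):

DECLARE
  h1     NUMBER;
  l_list VARCHAR2(32767);
BEGIN
  -- build the literal IN (...) expression from the driving table in this schema
  SELECT 'IN (' || LISTAGG('''' || a.export_object_name || '''', ',')
                     WITHIN GROUP (ORDER BY a.export_object_name) || ')'
    INTO l_list
    FROM my_export_table a, user_objects b
   WHERE a.export_object_name = b.object_name
     AND b.object_type = 'TABLE';

  h1 := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE');
  dbms_datapump.metadata_filter(handle => h1, name => 'NAME_EXPR', value => l_list);
  -- ... add_file, start_job, wait_for_job as in the rest of the script
END;
/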
I have a Windows server running Oracle 8.0.5. I have to migrate to another server running Oracle 11gR2 on Windows 2008. Some say I can't do export and import directly, and some say a database link does not work from 11gR2 to 8.
What would be the fewest steps to migrate? Somebody suggested using an Oracle directory and a VB or PowerBuilder program to send the data and DDL from 8 to 11gR2.
I have migrated entire databases from Postgres to Oracle, but some table names contain spaces. How can the spaces be removed, and how can I find all the tables in a schema with spaces in their names?
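A hedged sketch, assuming the migrated tables sit in the current schema: find the names containing spaces, then generate RENAME statements that replace the spaces (the double quotes are required precisely because of the spaces).

-- list tables whose names contain a space
SELECT table_name FROM user_tables WHERE INSTR(table_name, ' ') > 0;

-- generate the rename statements (spaces replaced with underscores)
SELECT 'ALTER TABLE "' || table_name || '" RENAME TO ' || REPLACE(table_name, ' ', '_') || ';'
  FROM user_tables
 WHERE INSTR(table_name, ' ') > 0;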
I'm trying to migrate Sybase 12.5 to Oracle 11gR2 on Linux. All tables are fine except one, which fails with the error:
"SQL Error: ORA-01841: (full) year must be between -4713 and +9999, and not be 0 01841. 00000 - "(full) year must be between -4713 and +9999, and not be 0" *Cause: Illegal year entered *Action: Input year in the specified range"
I have changed the data types from CHAR to VARCHAR2, etc.; as for the date column, it shows as DATE in the tool (I'm using SQL Developer from Oracle).
We are planning to migrate data from an application called Clintrace to another application called Argus Safety. Both applications relate to pharmacovigilance safety operations and their functionality is similar, so both databases hold the same data even though the table structures may differ. Both databases are Oracle: the Clintrace db is 9i and the Argus db is 11g.
I have two Oracle servers: one with SUSE Linux (Oracle 10.2.0.4.0) and one with Windows 2003 Server x64 (Oracle 11.2.0.1.0). I made a mistake by installing Oracle 11g x32 on an x64 server. Nevertheless it worked for about half a year; then my backups with Data Pump stopped working. I changed SGA_TARGET down to 1024M and Data Pump worked again. Now I want to replace the Windows server and import the Data Pump schemas into the Linux server.
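One point worth noting, sketched with assumed schema and directory names: a dump written by 11.2 expdp cannot be read by a 10.2 impdp unless the export is taken with the VERSION parameter matching the target release.

-- on the Windows 11.2.0.1 database: write a dump the 10.2 server can read
expdp system schemas=APP_SCHEMA directory=DP_DIR dumpfile=app_v102.dmp logfile=app_v102.log version=10.2

-- on the Linux 10.2.0.4 database: import it as usual
impdp system schemas=APP_SCHEMA directory=DP_DIR dumpfile=app_v102.dmp logfile=app_v102_imp.log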
I have database A (working in a live environment) and database B, a copy of database A (not live). I have restored the whole database A RMAN backup onto database B. Now I don't want to change anything in any schema; I only want to import the updated and new records into the tables in database B.
There are around 20 schemas. In the new database B I already have all the required database objects: procedures, functions, packages, indexes on all tables, and data in the tables. I just want to add the new and updated data (one common approach is sketched below).
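A hedged sketch of one common pattern, since impdp's TABLE_EXISTS_ACTION=APPEND only adds rows and does not update changed ones: import the source schema into a staging schema with REMAP_SCHEMA, then MERGE each table into the live one. All schema, table, and column names below are placeholders.

impdp system schemas=APP directory=DP_DIR dumpfile=app_from_A.dmp remap_schema=APP:APP_STG table_exists_action=REPLACE

-- then, per table, merge staging into the target
MERGE INTO app.customers t
USING app_stg.customers s
   ON (t.customer_id = s.customer_id)
 WHEN MATCHED THEN UPDATE SET t.name = s.name, t.updated_on = s.updated_on
 WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_on)
                       VALUES (s.customer_id, s.name, s.updated_on);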
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether writing the export to tape is possible, and if so, whether the data would be accessible later if needed.
I have exported the data of one user and am importing it into another schema on another server. It works fine for quite a number of table imports, but after some time it starts giving the error mentioned below.
IMP-00008: unrecognized statement in the export file: <
IMP-00008: unrecognized statement in the export file: <
IMP-00008: unrecognized statement in the export file: <ے
IMP-00008: unrecognized statement in the export file: +A
IMP-00008: unrecognized statement in the export file:
[code]...
load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no, right_no, maj_auth, weight, idm_ht, c_date, P_tkt)
The rows get inserted successfully, but the result sets are different. For example, when I run 'select len(weight) from trlc;' in SQL Server, I get the length as 0; but when I do the select in the Oracle database, I get the length as 1. Also, the result set varies for the query below:
select * from trlc where weight=' ';
(SQL Server returns 1 row but Oracle returns no rows)
Do I need to mention any conversion code for the weight field to accept ' ' value?
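Part of the difference is in the tools themselves: SQL Server's LEN() ignores trailing spaces (LEN(' ') is 0, while DATALENGTH(' ') is 1), whereas Oracle's LENGTH(' ') is 1, and Oracle treats the empty string as NULL. If the intent is for blank-only weight values to load as NULL in Oracle, a hedged tweak to the control file would be:

load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no, right_no, maj_auth,
 weight CHAR NULLIF weight=BLANKS,  -- load all-blank weight values as NULL
 idm_ht, c_date, P_tkt)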
I am working on Oracle 11gR2. Is there any way to go back to Oracle 9i using the import/export utilities? Should I take a downgraded export from Oracle 11g for Oracle 9i?
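For what it's worth, a Data Pump dump cannot be imported into 9i at all, since Data Pump only exists from 10g onward. The commonly suggested route, sketched below with an assumed TNS alias and worth testing first, is to run the 9i original export client against the 11g database over SQL*Net and import that file with 9i imp.

-- run from a 9i Oracle home or client, connecting to the 11g database
exp system@db11g file=down_to_9i.dmp full=y log=down_to_9i_exp.log
-- then on the 9i database
imp system file=down_to_9i.dmp full=y log=down_to_9i_imp.log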