Server Utilities :: Export XML Object Using Datapump
Jan 2, 2012
As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump doesn't support XML objects. Is there any other way to export the full database, or any option to export XML objects using Data Pump?
expdp directory=xxx dumpfile=aaa.dmp logfile=xxx.log FULL=Y

Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER
[code]...
Then my export hangs. I checked the alert log and found nothing, killed the job, and reran it, but got the same result. When I check the status of the job it says EXECUTING.
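One way to see what a seemingly hung job is actually doing is to attach to it and ask for its status; a minimal sketch, assuming the job is still named SYS_EXPORT_FULL_01 and owned by SYSTEM:

expdp system/password attach=SYS_EXPORT_FULL_01

Export> STATUS

The STATUS output shows the state of each worker and the object it is currently processing, which narrows down whether the job is waiting, still working, or truly stuck.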
I got an assignment to create an Oracle 11g DB. I will be provided the full Data Pump export dump of an Oracle 10g DB on Linux, and I need to import it into an 11g database on Windows. I have no information about the tablespaces, users etc. I have created the DB with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
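When the dump's tablespaces and users are unknown, one approach is to have impdp write out the DDL it would run before importing anything, then remap whatever is missing; a hedged sketch, with the directory, file and tablespace names as placeholders:

Step 1: extract just the DDL, to see which users and tablespaces the dump expects:

impdp system/password full=y directory=DP_DIR dumpfile=exp10g.dmp sqlfile=ddl_preview.sql

Step 2: run the real import, remapping any tablespace that does not exist in the new DB:

impdp system/password full=y directory=DP_DIR dumpfile=exp10g.dmp remap_tablespace=OLD_TS:USERS logfile=imp.log

SQLFILE performs no import at all; it only writes the DDL to the named file, so it is a safe way to inspect a dump first.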
We have three instances on the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on 2 and 3 it fails with an error relating to the directory.
Error:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
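A directory object is stored in the data dictionary, so it exists for the whole database, not for one instance; what differs per node is whether the operating-system path behind it actually exists there. A minimal sketch, assuming /shared/dpdir is a path visible from every RAC node (e.g. on ACFS or NFS):

CREATE OR REPLACE DIRECTORY dp_dir AS '/shared/dpdir';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

If the path only exists on node 1, an export started on nodes 2 or 3 fails with exactly the ORA-39070/ORA-29283 stack above.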
(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without building indexes from scratch?
(2) I am trying to replace 'exp' and 'imp' with the Data Pump utilities, but I am confused about the parameters to be used.
(a) Can I load both data and metadata at the same time (using the CONTENT=ALL option)? (b) I am planning to implement this in two steps:
first load only the metadata using CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE, then load the data, as sketched below.
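A hedged sketch of that two-step impdp sequence, with the schema, directory and file names as placeholders:

Step 1: metadata only, replacing any tables that already exist:

impdp system/password schemas=APP_OWNER directory=DP_DIR dumpfile=app.dmp content=metadata_only table_exists_action=replace

Step 2: data only, appending into the tables created in step 1:

impdp system/password schemas=APP_OWNER directory=DP_DIR dumpfile=app.dmp content=data_only table_exists_action=append

TABLE_EXISTS_ACTION only matters when the table is already there; in step 2 every table exists, so APPEND (or TRUNCATE) is the action that applies.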
I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
--Create Directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';
--Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
--Script to import
DECLARE
  l_dp_handle1 NUMBER;
BEGIN
  l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
[code]...
Errors
ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
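ORA-31634 usually means a previous run left a job (and its master table) behind under the same name. Two things help: give each run a unique job name, and clean up orphaned jobs. A minimal sketch, with the job-name prefix as an assumption:

-- Open the job with a name that cannot collide with an earlier run
l_dp_handle1 := dbms_datapump.open(
                  operation => 'IMPORT',
                  job_mode  => 'FULL',
                  job_name  => 'STG_IMP_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));

-- Find leftovers from failed runs
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

A job listed there as NOT RUNNING can be removed by dropping its master table, which has the same name as the job (DROP TABLE owner.job_name PURGE); that releases the name for reuse.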
I'm trying to deploy a schema using the Data Pump API. The user the deployment runs as has the CREATE USER privilege granted directly (not through a role), but I got an insufficient-privileges error.
Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS" TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE"
But the user gets created successfully when I run the CREATE statement manually. So I created the user manually and ran the deployment procedure again, and got the below error for ROLE_GRANTS.
Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is:
GRANT "EXP_FULL_DATABASE" TO "SCOTT1"
The user has EXP_FULL_DATABASE WITH ADMIN OPTION and IMP_FULL_DATABASE WITH ADMIN OPTION as direct grants. Which privileges does the user need to deploy the schema successfully?
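Before rerunning, it may help to confirm what the deploying user actually holds, since to grant a role onward you must hold it WITH ADMIN OPTION (or have GRANT ANY ROLE). A quick check, with DEPLOY_USER as a placeholder name:

-- Direct system privileges
SELECT privilege, admin_option FROM dba_sys_privs WHERE grantee = 'DEPLOY_USER';

-- Roles, and whether they were granted WITH ADMIN OPTION
SELECT granted_role, admin_option FROM dba_role_privs WHERE grantee = 'DEPLOY_USER';

The ORA-01932 above is raised precisely because the import tries to pass EXP_FULL_DATABASE on to SCOTT1, so the ADMIN_OPTION column in the second query is the one to verify.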
I am using Data Pump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints, import the data, and re-enable the constraints, but in this case the temp tablespace keeps filling up and I run out of space. Is there any method to do a full import including constraints and indexes?
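For reference, a one-shot network-mode import brings tables, indexes and constraints over in a single pass with no intermediate dump file; a hedged sketch, assuming a database link named SRC_LINK and a schema HR:

impdp system/password network_link=SRC_LINK schemas=HR directory=DP_DIR logfile=hr_net_imp.log

With NETWORK_LINK the data travels over the link directly, so only the log file is written locally; DIRECTORY is still required for that log.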
I have noticed that the Data Pump estimate is 9.902 GB, but when I check the size of the .dmp file it shows 1.44 GB.
Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;; Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB
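The BLOCKS method sizes the export from the blocks allocated to each segment, so tables with lots of free space below the high-water mark are counted at full size while the dump only contains the actual rows; a large gap between the two figures is normal. If a closer figure is wanted, statistics can be used instead; a small sketch, parameters as placeholders:

expdp system/password full=y estimate=statistics estimate_only=y directory=DP_DIR logfile=est.log

ESTIMATE_ONLY=Y computes the size without writing a dump file, and ESTIMATE=STATISTICS sizes from optimizer statistics instead of allocated blocks, which is usually closer for sparsely filled tables.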
I have a problem with DBMS_DATAPUMP.METADATA_FILTER. Suppose I need to export a huge list of tables (a, b, c, d, e, f, g, h, i, ...), and that the list is dynamic, so I do NOT want to hard-code it as in
DBMS_DATAPUMP.metadata_filter(
  handle      => h1,
  NAME        => 'NAME_EXPR',
  VALUE       => 'IN (''a'', ''b'', ...)',
  object_type => NULL);
Instead I tried passing a subquery:

DBMS_DATAPUMP.metadata_filter(
  handle      => h1,
  NAME        => 'NAME_EXPR',
  VALUE       => 'IN (SELECT a.export_object_name FROM my_export_table a, user_objects b WHERE a.export_object_name = b.object_name AND b.object_type = ''TABLE'')',
  object_type => NULL);

but it results in an error:
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
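A likely cause of the ORA-00942 is that the worker process evaluates the filter expression in a context where the unqualified name MY_EXPORT_TABLE does not resolve. One way around it is to resolve the list in the caller's own session first and hand Data Pump a plain literal IN-list; a hedged sketch, assuming 11gR2 for LISTAGG and that MY_EXPORT_TABLE belongs to the schema running the job:

DECLARE
  l_list VARCHAR2(32767);
  h1     NUMBER;
BEGIN
  -- Build 'A','B','C',... as one string in the current session,
  -- where MY_EXPORT_TABLE is visible
  SELECT LISTAGG('''' || a.export_object_name || '''', ',')
           WITHIN GROUP (ORDER BY a.export_object_name)
    INTO l_list
    FROM my_export_table a, user_objects b
   WHERE a.export_object_name = b.object_name
     AND b.object_type = 'TABLE';

  h1 := dbms_datapump.open(operation => 'EXPORT',
                           job_mode  => 'TABLE',
                           job_name  => 'DYN_TABLE_EXP');   -- placeholder name
  dbms_datapump.add_file(handle    => h1,
                         filename  => 'dyn_tables.dmp',
                         directory => 'DP_DIR');            -- placeholder directory
  -- The filter now contains only literals, so the worker
  -- never has to resolve MY_EXPORT_TABLE itself
  dbms_datapump.metadata_filter(handle => h1,
                                name   => 'NAME_EXPR',
                                value  => 'IN (' || l_list || ')');
  dbms_datapump.start_job(h1);
  dbms_datapump.detach(h1);
END;
/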
Every quarter, we export the data from the DEEXTRA user and import it into the INEXTRA user. This stopped working after I made changes to 2 tables and views in DEEXTRA.
I added new columns to the 2 tables and to the 2 associated views. After this change, my import process failed, with this message specifically for those views: "IMP-00041: Warning: object created with compilation warnings"
Before I changed the tables, the import process was working fine. My doubt is whether the views were created in INEXTRA before the tables during the import.
I had given the grants in the same way as for the other objects, so I believe there are no problems with privileges.
Did the table changes disturb the order of the objects in the export, so that the views are now exported before the tables? Or did the order get disturbed in the import, with the views created first and then the tables?
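Whichever way the ordering fell, a view created before its base table ends up merely invalid (imp uses CREATE FORCE VIEW), and once the tables are in place it can be recompiled. A small sketch, with the view name as a placeholder:

-- List what is invalid in the target schema after the import
SELECT object_name, object_type
  FROM dba_objects
 WHERE owner = 'INEXTRA' AND status = 'INVALID';

-- Recompile a specific view once its base tables exist
ALTER VIEW inextra.some_view COMPILE;

If a view still will not compile, comparing its column list against the changed tables usually shows which new column the grants or the view text are missing.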
D:\> impdp TESTIMP/TESTIMP TABLES=ENVDB DIRECTORY=Rini DUMPFILE=envdb_11g.dmp REMAP_SCHEMA=TESTIMP:STP11G REMAP_TABLESPACE=TBS_TESTIMP:TBS_STP11G

Import: Release 11.2.0.1.0 - Production on Thu Sep 9 15:16:45 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39166: Object TESTIMP.ENVDB was not found.
One of my friends gets an error while taking a Data Pump export backup of the full database.
Please find the error details below:
ORA-31693: Table data object "RADIOMIRCHI_PIP_HRMS"."GM_DEPT" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-00604: error occurred at recursive SQL level 3
ORA-21780: Maximum number of object durations exceeded.
[code]...
I am importing the dump. All the tables get imported, but when it comes to the part shown below it gets stuck, and nothing more is shown after it.
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
The dump I am importing is huge, around 46 GB.
SELECT sl.sid, sl.serial#, sl.sofar, sl.totalwork
  FROM v$session_longops sl, v$session s, dba_datapump_sessions d
 WHERE s.saddr = d.saddr
   AND s.sid = sl.sid
   AND s.serial# = sl.serial#
 ORDER BY start_time;
It shows locks for two SIDs. Should I kill the processes and import the dump again? I used the below command to import the dump.
I could see in the log a lot of IMP-00041: Warning: object created with compilation warnings
example:
IMP-00041: Warning: object created with compilation warnings
"CREATE FORCE VIEW "ARADMIN"."BMC_CORE_BMC_IMPACT" ("
""REQUESTID","SUBMITTER","CREATEDATE","ASSIGNEDTO","LASTMODIFIEDBY","MODIFIE"
"DDATE","STATUS","SHORTDESCRIPTION","CMDBROWLEVELSECURITY","INSTANCEID","CMD"
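These warnings usually just mean the views referenced objects that did not exist yet at the moment imp created them (hence CREATE FORCE VIEW). Once the whole import has finished, recompiling the schema normally clears them; a minimal sketch against the ARADMIN schema named in the log:

-- Recompile all invalid objects in ARADMIN after the import completes
EXEC DBMS_UTILITY.compile_schema(schema => 'ARADMIN');

Any object that stays invalid after this points at a real problem (a missing grant or base object) rather than simple creation-order noise.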
Data Pump on Windows 2003 / 11.2. I have a batch file that creates a daily dump of a schema in the DATA_PUMP_DIR; however, it doesn't! The script starts as follows:

REM Script to perform data pump export of the user01 schema and move to the
For some reason, when this is run from a remote server as a batch job it fails to create the file, although the output from the script shows no errors apart from the move statements, and the expdp output is all good (it states that the file was created in the expected location). If the expdp command is run on the server itself, it all works.
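expdp only ever writes through the server-side directory object, so when the batch runs on a remote machine the dump still lands on the database server, and the move step then finds nothing locally. Where the file actually went is one query, run from any client:

SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';

That path is on the database server's filesystem, which is where the batch file's move command would need to look.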
The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle 11g client. The application is able to start a Data Pump export, but as a matter of fact the dump files are always written to the Linux server. The directory object used is DATA_PUMP_DIR (which is the default directory).
Now we are supposed to change the Data Pump export so that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting the Data Pump export from the client always raises the errors
ORA-39002: ...
ORA-39070: ...
ORA-29283: ...
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: ...
Is it possible at all to start a Data Pump export from a Windows client and write the dump files to the Windows server itself? Or are the dump files always written to the database server?
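Data Pump always runs inside the database server, so a directory like c:\datapump is interpreted on the Linux machine, where it does not exist; that is what the ORA-29283 stack reflects. The usual workaround is to make a Windows share reachable from the Linux server and point a directory object at the mount; a hedged sketch, with the mount point /mnt/winshare as an assumption:

-- On the Linux server, after mounting the Windows share (e.g. via CIFS):
CREATE OR REPLACE DIRECTORY dp_win AS '/mnt/winshare/datapump';
GRANT READ, WRITE ON DIRECTORY dp_win TO app_user;

The export is still executed by the server, but the files physically end up on the Windows box through the share.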
I am trying to export schemas from 10g to 11g. The NLS_CHARACTERSET for 10g is WE8ISO8859P1 and the NLS_CHARACTERSET for 11g is WE8MSWIN1252. Is this fine, or do I need to change the character set so that the export/import will succeed?
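For what it is worth, Oracle documents WE8MSWIN1252 as a binary superset of WE8ISO8859P1, so data converts losslessly in that direction; the character sets on both sides can be confirmed with:

SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');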
I am having a problem when trying to export my DB, though I can run an import fine. I have run catalog.sql, catproc.sql, catexp.sql and utlrp.sql again. Is it because the client and DB versions are different? How can I solve this problem?
exp usr/pass file=exp_full.dmp log=exp_full.log full=y consistent=y

Export: Release 8.1.7.0.0 - Production on Wed Nov 28 13:40:04 2007
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Connected to: Oracle8i Enterprise Edition Release 8.1.6.0.0 - Production
With the Partitioning option
I was asked to do an export/import of some schemas from 10g (Linux) to 11g (AIX) using the original export/import method. I did not consider the character set and started the export and import. While exporting, I get a questionable-statistics error in the export log file. In the import log, I see an error like CREATE DATABASE LINK "xxxxxxxxxxxxx" CONNECT TO "xxxx" IDENTIFIED BY...
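The questionable-statistics warning (EXP-00091) is raised when the client's NLS settings do not match the database character set at export time; setting NLS_LANG before running exp, or skipping statistics altogether, avoids it. A hedged sketch for a Linux source, file and schema names as placeholders:

export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
exp usr/pass file=exp_schemas.dmp log=exp_schemas.log owner=schema1 statistics=none

STATISTICS=NONE simply leaves the (easily regathered) optimizer statistics out of the dump. The database-link error is a separate issue; the failing CREATE DATABASE LINK statement can be run by hand on the target with the correct password.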
I need to load master data from Excel sheets into our database, and we use Toad too. How can I export the data with the use of macros in Excel? How can I get data from Excel into Oracle?
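Toad has an import wizard that can read a spreadsheet or CSV directly, and the macro-free route many sites use is to save each sheet as CSV and load it with SQL*Loader; a hedged sketch, with the table and column names as placeholders:

-- master_data.ctl: control file for a comma-separated sheet export
LOAD DATA
INFILE 'master_data.csv'
APPEND INTO TABLE master_data
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(item_code, item_name, unit_price)

Run it as: sqlldr usr/pass control=master_data.ctl log=master_data.log. An Excel macro, if wanted at all, only needs to automate the "save as CSV" step.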