Server Utilities :: Datapump Export Taking Long Time (HUNG)?

Aug 23, 2012

expdp directory=xxx dumpfile=aaa.dmp logfile=xxx.log FULL=Y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER

[code]...

Then my export hangs. I checked the alert log and found nothing, killed the job and reran it, but got the same result. When I check the status it says EXECUTING.

View 15 Replies



Server Utilities :: DBMS_DATAPUMP.METADATA_FILTER And Long Table List?

Jun 9, 2010

I have a problem with DBMS_DATAPUMP.metadata_filter. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i...). The list is dynamic, so I do NOT want to use

DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (''a'', ''b'', ...)',
object_type => NULL);

In my_export_table there is the list:

CREATE TABLE my_export_table
(
EXPORT_OBJECT_NAME VARCHAR2(50 BYTE)
)

Now I'm trying to use this form:

DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (SELECT a.export_object_name FROM my_export_table a, user_objects b WHERE a.export_object_name = b.object_name AND b.object_type = ''TABLE'')',
object_type => NULL
);

but it fails with the following error:

Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist

[code]...
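
A likely cause of the ORA-00942 is that the NAME_EXPR value is evaluated later by the Data Pump worker process, in a different schema context, so the unqualified my_export_table inside the subquery is not found. Schema-qualifying the tables in the subquery (e.g. scott.my_export_table) is one fix; another is to avoid the subquery altogether by building a literal IN list first. A sketch of the latter (it assumes 11.2 for LISTAGG - on older releases concatenate in a loop - and very long lists may exceed the filter expression's length limits):

DECLARE
  h1     NUMBER;
  l_expr VARCHAR2(32767);
BEGIN
  -- Build the literal IN list in the caller's schema, so the worker
  -- never has to resolve my_export_table itself.
  SELECT 'IN (' ||
         LISTAGG('''' || a.export_object_name || '''', ', ')
           WITHIN GROUP (ORDER BY a.export_object_name) || ')'
    INTO l_expr
    FROM my_export_table a, user_objects b
   WHERE a.export_object_name = b.object_name
     AND b.object_type = 'TABLE';

  h1 := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.metadata_filter(handle => h1,
                                name   => 'NAME_EXPR',
                                value  => l_expr);
  -- ... add_file, start_job, etc. as in the original job
END;
/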

View 1 Replies View Related

Server Utilities :: Export XML Object Using Datapump

Jan 2, 2012

As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump doesn't support XML objects. Is there any other way to export the full database, or any option to export XML objects using Data Pump?

View 2 Replies View Related

Drop Statement On Table Taking Long Time

Sep 20, 2010

We are firing a normal DROP command on our database, which is version 10.2.0.4 running on AIX v5. The command is taking more time than usual.

When I monitor the session I can see a call being made to the procedure "aw_drop_proc". Could this be what is taking more time than usual?

We do not have any partitions on the nested tables. We have a pack of tables, and we drop this pack through a procedure. The pack comprises nested tables and normal tables. Dropping a nested table (with no rows) takes around 6 seconds, while a normal table (with no rows) takes 17 milliseconds. We do have a partition on the normal table.

The same operation takes much less time on Windows than on AIX.

View 5 Replies View Related

SQL Queries Taking Long Time After 11g Data Migration

Sep 14, 2010

I have upgraded an Oracle database from 9i to 11g using the export and import utilities. After the migration we are facing a performance issue in report generation: the first execution of a report takes a very long time, but when we generate the same report 2-3 times the execution time improves considerably compared to the first run.

2 days back I restarted the database and found the same issue. There are around 300 reports, and it is not possible to generate all of them 2-3 times every time we restart the database.
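
First executions are slow partly because every block has to be read into an empty buffer cache, but after an export/import migration the optimizer statistics may also be missing or stale, and the 11g optimizer is sensitive to that. A commonly suggested first check (a sketch, not a guaranteed fix; the schema name is illustrative) is to refresh dictionary and application statistics:

-- Run as a DBA after the export/import migration.
EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'REPORT_OWNER');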

View 5 Replies View Related

PL/SQL :: Stored Procedure Taking Long Time When Updates In Loop

Feb 5, 2013

I have the stored procedure below:

create or replace
PROCEDURE TESTPERFORMANCE (
o_statuscode OUT NUMBER,
o_statusdescription OUT VARCHAR2,
starttime out timestamp,
time_after_query_TESTJOB out timestamp,

[Code]...

This procedure takes around 35 minutes when there are 35,000 records to loop over (i.e. the cursor has 35,000 records) and the TESTJOBTRANSACTIONS table has 90,000 records. How can I reduce the execution time?
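
Row-by-row updates inside a cursor loop are the classic cause here: every iteration pays a context switch between PL/SQL and SQL. A common restructuring is to bulk-collect the keys once and apply the updates with FORALL. A minimal sketch under assumed names (the id and status columns of TESTJOBTRANSACTIONS are illustrative, since the full procedure is not shown):

DECLARE
  TYPE t_id_tab IS TABLE OF testjobtransactions.id%TYPE;
  l_ids t_id_tab;
BEGIN
  -- Collect the driving keys in one pass instead of fetching row by row.
  SELECT id BULK COLLECT INTO l_ids
    FROM testjobtransactions
   WHERE status = 'PENDING';          -- illustrative predicate

  -- One context switch for the whole batch instead of one per row.
  FORALL i IN 1 .. l_ids.COUNT
    UPDATE testjobtransactions
       SET status = 'PROCESSED'
     WHERE id = l_ids(i);

  COMMIT;
END;
/

If the per-row logic can be expressed in SQL, a single set-based UPDATE (or MERGE) is usually faster still.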

View 12 Replies View Related

Server Utilities :: Taking Export Dump Using Expdp Of Some Schemas Of Total Size 300GB

Mar 30, 2007

I'm taking an export dump using expdp of some schemas whose total size is 300GB. This is the par file:

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema01.dmp,expdp_schema02.dmp,expdp_schema03.dmp,expdp_schema04.dmp,expdp_schema05.dmp,expdp_schema06.dmp,expdp_schema07.dmp,expdp_schema08.dmp,expdp_schema09.dmp,expdp_schema10.dmp,expdp_schema11.dmp,expdp_schema12.dmp,expdp_schema13.d
[code]....

Here the biggest schema is 250GB and the total size of all the schemas is 300GB. The filesystem where I am writing the dump has 350GB of space, but even then the expdp failed, saying:

ORA-39095: Dump file space has been exhausted: Unable to allocate 8192 bytes

Why did it fail, and how do I restart it and make sure it runs successfully without error?
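
ORA-39095 is raised when Data Pump runs out of dump files to write to: each named file is capped by FILESIZE, and once every file in the DUMPFILE list is full the job stops. The usual way to avoid this is the %U substitution variable, which lets Data Pump create numbered files on demand. A sketch of the relevant par file lines (names follow the post's conventions):

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema%U.dmp

A stopped job can often be resumed rather than rerun: reattach with expdp attach=<job_name>, then use ADD_FILE to supply more dump files and START_JOB to continue, provided the job's master table still exists.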

View 4 Replies View Related

Server Utilities :: Impdp - Package Body Import Taking Huge Time?

Sep 13, 2012

I am trying to import a 4GB dump into Oracle 11gR2. It contains around 9,000+ package bodies, which take much longer than the other objects (about 8 to 12 hours), and the import also consumes a lot of system space (roughly 10GB).

I have tried both parallel and non-parallel runs. How can I improve the speed of the package body import?

Details about the schema and the import - number of objects in the schema:

SQL> select object_type,count(1) from user_objects GROUP BY ROLLUP( object_type);

OBJECT_TYPE COUNT(1)
------------------- ----------
FUNCTION 248
INDEX 5161
JAVA CLASS 471
JAVA RESOURCE 1
JAVA SOURCE 16
LIBRARY 1

ORA-00933: SQL command not properly ended

View 3 Replies View Related

Server Utilities :: Time Taken In Export And Import?

Feb 13, 2012

When I do a table export, it finishes in 30 minutes. When I do an import using the same dump file (that was created in 30 minutes), it takes more than 30 minutes.

Why does the import take more time than the export?

View 14 Replies View Related

Server Utilities :: Get Total Time When Using Export A Table?

Jul 12, 2011

How can I get the total time taken when exporting a table?

for example:

exp userid=test/test file=d:\test.dmp tables=(tb_test)
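
The export log itself starts with a timestamp banner, and on Unix/Linux the simplest measure is to let the shell time the whole client run. A sketch (paths are illustrative):

# Let the shell report elapsed time for the whole export.
time exp userid=test/test file=/tmp/test.dmp log=/tmp/test.log tables=(tb_test)

On Windows, echoing %TIME% before and after the exp command in a batch file gives the same information.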

View 4 Replies View Related

Server Utilities :: How To Add Date / Time In Export Dump File In Linux

Jul 11, 2013

I want to know how to add the date/time to the export dump file name in Linux using a parfile script. I keep getting an error about "invalid substitution variables".

my parfile is:

Dumpfile = Daily_Full_%U_`date "+%Y%m%d%H%S"`.dmp or
Dumpfile = Daily_Full_%U_`%date%`.dmp
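
expdp only expands its own substitution variables (such as %U) in DUMPFILE; shell backticks are never evaluated inside a parfile, which is what the error is complaining about. The usual workaround is to let the shell build the name and pass it on the command line (command-line parameters can be combined with a parfile). A minimal sketch, assuming the other settings live in daily_full.par:

#!/bin/sh
# Build the timestamp in the shell, where command substitution works.
STAMP=`date "+%Y%m%d%H%M%S"`
expdp system/password parfile=daily_full.par \
      dumpfile=Daily_Full_${STAMP}_%U.dmp \
      logfile=Daily_Full_${STAMP}.log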

View 1 Replies View Related

Server Utilities :: Datapump API?

Jun 15, 2011

I am trying to export the database using the Data Pump API DBMS_DATAPUMP. I want to export the entire database excluding some schemas.

My question is: how do I exclude some schemas using the Data Pump API?
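
With the API, schema exclusion in a FULL job is typically expressed as a SCHEMA_EXPR metadata filter. A minimal sketch (file and schema names are illustrative):

DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'FULL');
  DBMS_DATAPUMP.add_file(handle    => h,
                         filename  => 'full_minus_some.dmp',
                         directory => 'DATA_PUMP_DIR');
  -- Exclude the listed schemas from the full export.
  DBMS_DATAPUMP.metadata_filter(handle => h,
                                name   => 'SCHEMA_EXPR',
                                value  => 'NOT IN (''SCOTT'', ''HR'')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.detach(h);
END;
/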

View 1 Replies View Related

Server Utilities :: Data Migration Using Datapump?

May 11, 2011

I got an assignment to create an Oracle 11g DB. I will be provided the full Data Pump export dump of an Oracle 10g DB on Linux, and I need to import it into an 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the DB with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
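
When the source tablespaces and users are unknown, one way to find out before importing anything is to have impdp write the DDL it would execute into a script instead of running it. A sketch (file names are illustrative):

REM Extract the DDL (tablespaces, users, grants, ...) without importing.
impdp system/password directory=DATA_PUMP_DIR dumpfile=full10g.dmp ^
      full=y sqlfile=full10g_ddl.sql

Reviewing the generated script shows which tablespaces and users must exist; alternatively, REMAP_TABLESPACE can redirect segments into the tablespaces already created.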

View 28 Replies View Related

Server Utilities :: Oracle Directory And Datapump

Mar 31, 2010

We have three instances in the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on 2 and 3 it fails with an error relating to the directory.

Error -
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation

Is my only option to use shared storage space?
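
A directory object is stored once for the whole database, but the operating system path it points to must exist on whichever node the Data Pump worker runs. So the usual options are: point the directory at shared storage, create the same local path on every node, or (from 11.2) pin the job to the instance you started it from. A sketch of the first and last (path and credentials are illustrative):

-- Point the directory at a path every node can reach.
CREATE OR REPLACE DIRECTORY dp_dir AS '/shared/oracle/dpump';

-- 11.2 onward: keep all Data Pump processes on the local instance.
-- expdp system/password directory=dp_dir dumpfile=full.dmp full=y cluster=n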

View 6 Replies View Related

Server Utilities :: Datapump And Index Building

Jan 13, 2012

Currently we are using the exp and imp utilities to unload from production and load into the dev server. While importing, we follow the steps below:

(1) Load only data [by specifying INDEXES=N in the par file]
(2) Unlock statistics
(3) Load indexes, other objects [by specifying ROWS=N]

After doing these steps, the data, indexes and other objects are all loaded. To verify the indexes, we check DBA_INDEXES.

DBA_INDEXES :
-------------
OWNER INDEX_NAME TABLE_NAME STATUS LAST_ANALYZED
----- ---------- ---------- ------ -------------
MYSCH CP_INDEX_1 CP_TABLE_1 VALID 14/JAN/12

Questions:

(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without building the indexes from scratch?

(2) I am trying to replace exp and imp with the Data Pump utilities, but I am confused about the parameters to be used:

(a) Can I load both data and metadata at the same time (using the CONTENT=ALL option)?
(b) I am planning to implement this in two steps (see the sketch after this list):

first load only metadata using - CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE

then, load data - CONTENT=DATA_ONLY.
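
A sketch of that two-step Data Pump load, with illustrative file and schema names:

# Step 1: create the objects only, replacing tables that already exist.
impdp system/password directory=DP_DIR dumpfile=prod.dmp schemas=MYSCH \
      content=METADATA_ONLY table_exists_action=REPLACE

# Step 2: load the rows into the now-existing tables.
impdp system/password directory=DP_DIR dumpfile=prod.dmp schemas=MYSCH \
      content=DATA_ONLY table_exists_action=APPEND

Note that loading metadata first means the indexes exist during the data load, which is slower than the old exp/imp ordering; excluding indexes in step 1 and importing them in a third pass (EXCLUDE=INDEX / INCLUDE=INDEX) reproduces the data-before-indexes approach.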

View 1 Replies View Related

Server Utilities :: Using The Datapump Tool To Migrate 10g DB To 11g?

Jan 3, 2012

I am trying to use the datapump tool to migrate a 10g db to 11g. Everything works fine except for the "nameless" check constraints.

View 7 Replies View Related

Server Utilities :: DBMS-DATAPUMP Import Error?

Oct 25, 2010

I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.

--Create Directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';

--Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

--Script to import
DECLARE
l_dp_handle1 NUMBER;
BEGIN
l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',

[code]...

Errors

ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
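
ORA-31634 usually means a job with the same (default) name already exists: DBMS_DATAPUMP derives a job name from the user, and a failed earlier run can leave its master table behind. Two common fixes are to pass an explicit, unique job_name to OPEN and to clean up leftovers. A sketch (the job name format is illustrative):

DECLARE
  l_dp_handle1 NUMBER;
BEGIN
  -- A unique job name avoids clashing with an earlier run's master table.
  l_dp_handle1 := dbms_datapump.OPEN(
                    operation => 'IMPORT',
                    job_mode  => 'FULL',
                    job_name  => 'STG_IMP_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));
  -- ... add_file, start_job, etc. as in the original script
END;
/

-- Leftover jobs (and the master tables behind them) show up in:
SELECT owner_name, job_name, state FROM dba_datapump_jobs;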

View 6 Replies View Related

Server Utilities :: Rename Table While Using Datapump Import?

Mar 18, 2013

I have a Datapump Export File which was created in Schema mode.

I have to import the tables into a new database where I have to use the REMAP_SCHEMA parameter.

Additionally, I would like to add a prefix to the table names.

For example:

original tablename: THE_TABLE
Name after import: IMP_THE_TABLE

Is there a way to add a prefix while using Datapump Import?
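
From 11g onward, impdp's REMAP_TABLE parameter renames tables during import and can be combined with REMAP_SCHEMA. There is no built-in wildcard prefixing, so each table must be listed (or the parfile generated from a query against the source dictionary). A sketch using the post's names (schemas and file name are illustrative):

# Rename THE_TABLE to IMP_THE_TABLE while remapping the schema.
impdp system/password directory=DP_DIR dumpfile=schema_exp.dmp \
      remap_schema=OLD_OWNER:NEW_OWNER \
      remap_table=THE_TABLE:IMP_THE_TABLE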

View 5 Replies View Related

Server Utilities :: ORA-31685 Deploy Schema Using DATAPUMP API

Feb 7, 2011

I'm trying to deploy a schema using the Data Pump API. The user the schema is deployed from has the direct privilege CREATE USER (not through a role), but I got the insufficient privileges error below.

Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE".

But the user gets created successfully when I run the CREATE statement manually. So I created the user manually and ran the deployment procedure again, and got the error below for ROLE_GRANTS.

Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is:
GRANT "EXP_FULL_DATABASE" TO "SCOTT1"

The user has EXP_FULL_DATABASE with the ADMIN option and IMP_FULL_DATABASE with the ADMIN option as direct privileges. Which privileges does the user need to deploy the schema successfully?

View 1 Replies View Related

Server Utilities :: Datapump Import And Referential Integrity Constraints

May 23, 2012

I am using a Data Pump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints and import the data and re-enable the constraints, but in this case the temp tablespace keeps filling up and I run out of space. Is there any method to do a full import including constraints and indexes?
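
One common pattern is to keep referential constraints out of the import entirely and recreate them after the data is in, so nothing is validated mid-load. A sketch (schema and link names are illustrative, and SQLFILE behaviour over a link should be verified on your version):

# Pass 1: import everything except referential constraints.
impdp system/password network_link=SRC_DB schemas=APP_OWNER \
      exclude=ref_constraint

# Pass 2: write the constraint DDL to a script to run afterwards
# (optionally edited to ENABLE NOVALIDATE to skip revalidation).
impdp system/password network_link=SRC_DB schemas=APP_OWNER \
      include=ref_constraint sqlfile=ref_constraints.sql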

View 7 Replies View Related

Server Utilities :: Total Estimation Using BLOCKS Method - Datapump

Apr 5, 2013

The Data Pump estimate says 9.902GB, but when I check the size of the .dmp file it shows 1.44GB.

Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;;
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB

Why does Data Pump estimate so much more than the actual size?
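
The BLOCKS method sizes the estimate from the blocks allocated below each table's high water mark, so tables that once held more data (or have poorly packed blocks) inflate the figure, and the estimate never accounts for how compactly the rows fit in the dump. If table statistics are reasonably current, the STATISTICS method usually lands closer to the real size:

# Estimate from optimizer statistics instead of allocated blocks.
expdp system/password full=y directory=DP_DIR dumpfile=full_%U.dmp \
      estimate=statistics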

View 8 Replies View Related

Export/Import/SQL Loader :: Table Import Takes Long Time And Still Running?

Jun 22, 2012

My DB version: 10.2.0

OS: Windows Server 2003

I am trying to import a table for which I have the export dump file, which I took with expdp previously when I loaded that table on the same host,

using the command below:

expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

After that I zipped the dump and moved it to an external USB drive; now I need that table back, so I copied the dump over and unzipped it.

The command I am using to do the import is:

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log

But the import is still running, and it is not showing any count of rows imported.

I already created the tablespace the table was in before it was dropped, but when I check the space of that tablespace it is not being consumed. One error I got previously while performing this task is:

Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "CDR"."SYS_IMPORT_TABLE_03" successfully loaded/unloaded
Starting "CDR"."SYS_IMPORT_TABLE_03":  cdr/********@tsiindia directory=TEST_DIR dumpfile=CAT_IN_DATA_042012.DMP tables=CAT_IN_DATA_042012 logfile=impdpCAT_IN_DATA_042012.log

[code]....

I checked streams_pool_size: it showed zero, so I set it to 48M, and after that:

SQL> show parameter streams_pool_size

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------
streams_pool_size                    big integer 48M

But it still takes a long time.
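
Before concluding the import is hung, it is worth checking whether the job is actually moving. A sketch of the usual checks (run as a DBA):

-- Is the job alive, and in what state?
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

-- Is the worker progressing through a long-running operation?
SELECT sid, opname, sofar, totalwork, time_remaining
  FROM v$session_longops
 WHERE opname LIKE '%IMPORT%' AND sofar <> totalwork;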

View 13 Replies View Related

RAC & Failsafe :: RMAN Duplicate Taking Long To Finish

Sep 27, 2011

How can we tune our RMAN DUPLICATE? It is taking 10+ hours to finish. The current production database size is 800GB, and the approximate duration of the RMAN online backup (FULL, including archived logs) is 4 to 5 hours.

Here's the rman backup script we used:
sql 'alter system archive log current';
list archivelog all;
run
{
show all;
report schema;
backup database plus archivelog delete all input;
}
exit

Here's the rman duplicate command we used:
run
{
set until time "to_date(to_char(sysdate,'Mon DD YYYY') || ' 02:30:00', 'Mon DD YYYY HH24:MI:SS')";
allocate auxiliary channel ch1 type disk;
duplicate target database to testdb;
}
exit
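
One tuning step worth trying first: the duplicate script above allocates a single auxiliary disk channel, so datafiles are restored one at a time. Allocating several channels lets RMAN restore in parallel. A sketch (the right channel count depends on available I/O bandwidth and CPU):

run
{
set until time "to_date(to_char(sysdate,'Mon DD YYYY') || ' 02:30:00', 'Mon DD YYYY HH24:MI:SS')";
allocate auxiliary channel ch1 type disk;
allocate auxiliary channel ch2 type disk;
allocate auxiliary channel ch3 type disk;
allocate auxiliary channel ch4 type disk;
duplicate target database to testdb;
}
exit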

View 3 Replies View Related

Data Archive Script Is Taking Too Long To Delete A Large Table

Aug 8, 2013

We have data archive scripts that move data for a date range to a different table. Each script has two parts: first copy data from the original table to the archive table, then delete the copied rows from the original table. The first part executes very fast, but the deletion takes too long, around 2-3 hours. The customer analysed the delete query and says the script is not using an index and is doing a full table scan, even though the predicate is the primary key. More info below.

CREATE TABLE "APP"."MON_TXNS"    (    "ID_TXN" NUMBER(12,0) NOT NULL ENABLE,     "BOL_IS_CANCELLED" VARCHAR2(1 BYTE) DEFAULT 'N' NOT NULL ENABLE,     "ID_PAYER" NUMBER(12,0),     "ID_PAYER_PI" NUMBER(12,0),     "ID_PAYEE" NUMBER(12,0),     "ID_PAYEE_PI" NUMBER(12,0),     "ID_CURRENCY" CHAR(3 BYTE) NOT NULL ENABLE,     "STR_TEXT" VARCHAR2(60 CHAR),     "DAT_MERCHANT_TIMESTAMP" DATE,     "STR_MERCHANT_ORDER_ID" VARCHAR2(30 BYTE),     "DAT_EXPIRATION" DATE,     "DAT_CREATION" DATE,     "STR_USER_CREATION" VARCHAR2(30 CHAR),     "DAT_LAST_UPDATE"

[Code]...

Data is first moved to the table schema3.OTW, and then we delete all the rows in OTW from the original table. Below is the explain plan for the delete:

SQL> explain plan for
  2  delete from schema1.mon_txns where id_txn in (select id_txn from schema3.OTW);

Explained.

SQL> select * from table(dbms_xplan.display);

PLAN_TABLE_OUTPUT
-------------------------------------------------------------------------------------
Plan hash value: 2798378986

-------------------------------------------------------------------------------------
| Id  | Operation              | Name       | Rows  | Bytes | Cost (%CPU)| Time     |
-------------------------------------------------------------------------------------
|   0 | DELETE STATEMENT       |            |  2520 |   233K|    87   (2)| 00:00:02 |
|   1 |  DELETE                | MON_TXNS   |       |       |            |          |
|*  2 |   HASH JOIN RIGHT SEMI |            |  2520 |   233K|    87   (2)| 00:00:02 |
|   3 |    INDEX FAST FULL SCAN| OTW_ID_TXN |  2520 | 15120 |     3   (0)| 00:00:01 |
|   4 |    TABLE ACCESS FULL   | MON_TXNS   | 14260 |  1239K|    83   (0)| 00:00:02 |
-------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------
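
Worth noting: the optimizer prices the full scan at a cost of only 83 for roughly 14,000 rows, so the scan itself is unlikely to explain 2-3 hours; triggers, unindexed foreign keys referencing MON_TXNS, and undo/redo generation are the usual suspects for slow deletes. If the delete must also be kept gentle on undo, a common archival pattern is to delete in bounded batches (a sketch using the post's table names; the batch size is illustrative):

DECLARE
  l_rows PLS_INTEGER;
BEGIN
  LOOP
    DELETE FROM schema1.mon_txns t
     WHERE t.id_txn IN (SELECT o.id_txn FROM schema3.otw o)
       AND ROWNUM <= 10000;
    l_rows := SQL%ROWCOUNT;
    COMMIT;
    EXIT WHEN l_rows = 0;
  END LOOP;
END;
/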

View 6 Replies View Related

Possible To Export Data To Another Directory Other Than Datapump

Sep 8, 2013

Is it possible to export data to a directory other than the default Data Pump directory (DATA_PUMP_DIR) in Linux?
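
Yes: any directory object the exporting user can write to will do; DATA_PUMP_DIR is only the default. A minimal sketch (path and grantee are illustrative):

-- As a DBA: map a directory object to the desired path and grant access.
CREATE OR REPLACE DIRECTORY exp_dir AS '/u02/exports';
GRANT READ, WRITE ON DIRECTORY exp_dir TO scott;

-- Then reference it from expdp:
-- expdp scott/tiger directory=exp_dir dumpfile=scott.dmp logfile=scott.log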

View 5 Replies View Related

Error While Datapump Export Backup Of Full Database

Sep 5, 2012

One of my friends gets an error while taking a Data Pump export backup of the full database.

Please find the error details below:

ORA-31693: Table data object "RADIOMIRCHI_PIP_HRMS"."GM_DEPT" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-00604: error occurred at recursive SQL level 3
ORA-21780: Maximum number of object durations exceeded.
[code]......

View 1 Replies View Related

Export/Import/SQL Loader :: Exporting Query Using Datapump

May 29, 2013

How do I write a script for a Data Pump EXPORT of the list of tables below in the ADAM schema in the RMUAT2 database, using INCLUDE?

1) PSOPR,PSDEFN,PSMTR,PSQFPR,PSMNT,PSOPR
2) PMNT,PMTS
3) RMOT,RMST

etc. I want to export only a few tables in the schema.
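
A sketch of the par file approach (credentials are illustrative; extend the IN list to the full set of tables - inside a parfile the quotes need no shell escaping):

# export_adam.par
directory=DATA_PUMP_DIR
dumpfile=adam_tables.dmp
logfile=adam_tables.log
schemas=ADAM
include=TABLE:"IN ('PSOPR','PSDEFN','PSMTR','PSQFPR','PSMNT','PMNT','PMTS','RMOT','RMST')"

# Run it with:
# expdp system/password@RMUAT2 parfile=export_adam.par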

View 7 Replies View Related

Export/Import/SQL Loader :: DataPump Succeeds But Dump File Does Not Get Created

Aug 23, 2013

datapump on Windows 2003 / 11.2. I have a batch file that creates a daily dump of a schema in the DATA_PUMP_DIR - however, it doesn't! The script is as follows:

REM Script to perform data pump export of the user01 schema and move to the desktop.
SET CURRDATE=%DATE:~10,4%%DATE:~4,2%%DATE:~7,2%
SET CURRTIME=%TIME:~0,2%%TIME:~3,2%%time:~6,2%
SET CURRTIME=%CURRTIME: =0%
SET DATESTAMP=%CURRDATE%_%CURRTIME%
SET DUMP_FILE=user01_EXP_%DATESTAMP%.DMP
SET LOG_FILE=user01_EXP_%DATESTAMP%.LOG
expdp "sys/pwd@db as sysdba" directory=DATA_PUMP_DIR dumpfile=%DUMP_FILE% schemas=user01 logfile=%LOG_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%DUMP_FILE% C:\Users\Administrator\Desktop\%DUMP_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%LOG_FILE% C:\Users\Administrator\Desktop\%LOG_FILE%

For some reason, when this is run from a remote server as a batch it fails to create a file, although the output from the script has no errors apart from the MOVE statements, and the expdp output is all good (it states that the file was created in the expected location). If the expdp command is run on the server itself, it all works.

View 4 Replies View Related

Export/Import/SQL Loader :: Using Datapump On Windows Client When Database Is On Linux

Jun 13, 2013

The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle client 11g. The application is able to start a Data Pump export, but as a matter of fact the dump files are always written to the Linux server. The directory object is defined as DATA_PUMP_DIR (which is the default directory).

Now we are supposed to change the Data Pump export so that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting the Data Pump export from the client always raises the errors

ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation

Is it possible at all to start a Data Pump export from a Windows client and have the dump files written to the Windows server itself? Or are the dump files always written on the database server?

View 9 Replies View Related

Server Utilities :: Export Schemas From 10g To 11g?

Apr 27, 2011

I am trying to export schemas from 10g and import them into 11g. The NLS_CHARACTERSET for 10g is WE8ISO8859P1 and the NLS_CHARACTERSET for 11g is WE8MSWIN1252. Is this fine, or do I need to change the character set so that the export/import will succeed?

View -1 Replies View Related







Copyrights 2005-15 www.BigResource.com, All rights reserved