Server Utilities :: Impdp - Package Body Import Taking Huge Time?

Sep 13, 2012

I am trying to import a 4 GB dump into Oracle 11gR2. The schema contains around 9000+ package bodies, which take far longer to import than the other objects (about 8 to 12 hours), and the import also consumes a lot of system space (roughly 10 GB).

I have tried both parallel and non-parallel imports. How can I improve the speed of the package body import?

Details about the Schema & Import
No. of objects in the Schema:

SQL> select object_type,count(1) from user_objects GROUP BY ROLLUP( object_type);

OBJECT_TYPE COUNT(1)
------------------- ----------
FUNCTION 248
INDEX 5161
JAVA CLASS 471
JAVA RESOURCE 1
JAVA SOURCE 16
LIBRARY 1

ORA-00933: SQL command not properly ended
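One approach that is sometimes suggested for this situation (a rough sketch with placeholder names, not a verified fix) is to split the PL/SQL out of the main run so the slow step can be timed, parallelised and retried on its own:

impdp user/password directory=dp_dir dumpfile=app.dmp schemas=APP exclude=PACKAGE_BODY logfile=imp_main.log
impdp user/password directory=dp_dir dumpfile=app.dmp schemas=APP include=PACKAGE_BODY logfile=imp_pkg.log

While the second run is going, the job's progress can be watched from another session:

SELECT sid, serial#, opname, sofar, totalwork
FROM   v$session_longops
WHERE  opname LIKE 'SYS_IMPORT%' AND sofar <> totalwork;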




Server Utilities :: Impdp Takes Huge Time In Oracle11gR2

Feb 13, 2012

1) My database dump size is about 4 GB, which was provided by the vendor.
2) The dump contains 364949 objects in total, of which:

Table : 121316
LOB object : 121315
(Normal+LOB) indexes : 122317

3) Now when I run the import using SYSTEM or another user, it hangs at the stage below for 70+ hours:

impdp ntest/ntest directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log

Import: Release 11.2.0.1.0 - Production on Fri Feb 10 09:49:50 2012

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "NTEST"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "NTEST"."SYS_IMPORT_FULL_01": ntest/******** directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"NTEST" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
----
In this situation I observed the worker status and saw that some tables and some LOB objects, including LOB indexes, were being imported. The worker process does this in the background, but it does not show up in the import log file (I don't understand why it is not shown there). It imports one table, one LOB, one LOB index, then again one table, one LOB, one LOB index, and so on.

My observation is that it first inserts data into the LOB segments and then into the normal table, and only when it starts inserting into the normal table does that table appear in the import log file.

An example of our data:

Objects :
===================================================
LOB_FD17_RGS_TSTCD2 LOB
FD17_RGS_VERSION TABLE

(Here I see that each table has one LOB segment; in this way 121316 tables have 121316 LOBs.)

SQL> desc FD17_RGS_VERSION
Name Null? Type
----------------------------------------- -------- ----------------------------
RECID VARCHAR2(255)
XMLRECORD BLOB

Our observation is that inserting the BLOBs is probably the main cause of the slowness. Is there any patch, or any known bug, regarding BLOB/LOB objects in Oracle 11gR2?
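One thing that can be checked while the job appears to hang (a diagnostic sketch only, not a fix) is what the Data Pump worker sessions are actually waiting on:

SELECT d.job_name, s.sid, s.serial#, s.event, s.seconds_in_wait
FROM   dba_datapump_sessions d
JOIN   v$session s ON s.saddr = d.saddr;

If the workers spend most of their time on LOB-related waits, that would support the suspicion that the row-by-row BLOB inserts dominate the 70+ hours rather than a defect in impdp itself.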


Server Utilities :: ORA-04063 - Package Body Has Errors

Jan 30, 2013

I am getting the below error when I try to import the dmp file into my Oracle 11g database.

impdp system/password@orcl dumpfile=aaa.dmp directory=datapump remap_schema=dev:user

ORA-31626: job does not exist
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_INTERNAL_
LOGSTDBY"
ORA-06512: at "SYS.KUPV$FT", line 949
ORA-04063: package body "SYS.DBMS_LOGREP_UTIL" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOGREP_UTIL"
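These errors usually mean the SYS-owned Data Pump/LogMiner support packages are invalid in the target database, rather than anything being wrong with the dump file itself. A common first step (a hedged sketch; run as SYSDBA) is to recompile the invalid SYS objects and then retry the import:

sqlplus / as sysdba

ALTER PACKAGE SYS.DBMS_INTERNAL_LOGSTDBY COMPILE BODY;
ALTER PACKAGE SYS.DBMS_LOGREP_UTIL COMPILE BODY;
@?/rdbms/admin/utlrp.sql

SELECT object_name, object_type FROM dba_objects WHERE owner = 'SYS' AND status = 'INVALID';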


Server Utilities :: IMPORT (Impdp) Multiple Dump Files

Aug 2, 2012

I have more than 100 dump files to import into my Oracle 11g database. I know how to run impdp when the dumps share the same base name, but here all the dump file names are totally different (e.g. aa.dmp, bb.dmp, ...).
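It depends on whether the files belong together. If they are all pieces of a single export (one dump file set), impdp accepts them as a comma-separated list; if each file is an independent export, the usual workaround is a small shell loop that runs impdp once per file. A sketch with placeholder names:

impdp system/password directory=dp_dir dumpfile=aa.dmp,bb.dmp,cc.dmp logfile=imp_all.log

for f in *.dmp; do
  impdp system/password directory=dp_dir dumpfile=$f logfile=imp_${f%.dmp}.log
done

For very long file lists, putting the DUMPFILE value in a parameter file avoids command-line length limits.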


Server Utilities :: Datapump Export Taking Long Time (HUNG)?

Aug 23, 2012

Expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER

[code]...

Then my export hangs. I checked the alert log and found nothing, killed the job and reran it, but got the same result. When I check the job status it says EXECUTING.
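When an export job seems stuck, it can be attached from another window to see what it is actually doing (a sketch; the job name is whatever DBA_DATAPUMP_JOBS reports):

SELECT owner_name, job_name, state, attached_sessions FROM dba_datapump_jobs;

expdp system/password attach=SYS_EXPORT_FULL_01
Export> status

STATUS shows the current object path, the percent done and the worker states, which usually narrows down where the hang is.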


Server Utilities :: Import A Dump File Using Impdp Data Pump Utility On Oracle 10g

Feb 19, 2012

Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken using the traditional exp utility, and vice versa?


Server Utilities :: Restore A Package Only With Original Import Utility?

Jun 28, 2013

I am trying to restore a single package from a dump file that was exported by the original exp command. I know you can do it with Data Pump, but unfortunately this dump file was not exported with expdp. Can I do that? I really do not want to import the whole database.
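There is no object filter in the original import utility, but imp can be told to only list the DDL it would run, using SHOW=Y, and the CREATE PACKAGE / CREATE PACKAGE BODY statements can then be copied out of the log by hand (a sketch; schema and file names are placeholders, and the statements come out of the log quoted and line-wrapped, so they need some cleanup before they can be run):

imp system/password file=full.dmp fromuser=SRC_SCHEMA show=y log=src_ddl.log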


Server Utilities :: Time Taken In Export And Import?

Feb 13, 2012

When I do a table export, it finishes in 30 minutes. When I import using the same dump file (the one created in those 30 minutes), it takes more than 30 minutes.

Why does the import take more time than the export?


Server Utilities :: How To Find Estimated Time To Complete Import (imp)

Apr 17, 2012

I am using imp at schema level. The dump file contains 3 schemas with a total size of 120 GB, and the import is running now. How can I find the estimated completion time, so that I can tell the application developers when the activity will finish?
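The original imp utility does not report a percentage, but the larger table loads and index builds usually register in V$SESSION_LONGOPS, which gives a rough remaining-time figure per operation (a monitoring sketch only; not every phase of imp shows up there):

SELECT sid, opname, target, sofar, totalwork,
       ROUND(sofar / totalwork * 100, 1) AS pct_done,
       time_remaining
FROM   v$session_longops
WHERE  totalwork > 0 AND sofar <> totalwork;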


Backup & Recovery :: Taking A Dump Of Huge Table

Jul 10, 2012

Is it possible to take the dump of a huge table, say 500 GB, into multiple files and then import it into another database? If yes, how can we do it?

Note that it is a single table of 500 GB.
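Data Pump supports this directly: the table can be exported into several pieces with FILESIZE and the %U substitution variable, and impdp then reads the whole file set back (a sketch; the names, file size and parallel degree are placeholders):

expdp scott/tiger tables=BIG_TABLE directory=dp_dir dumpfile=big_%U.dmp filesize=20G parallel=4 logfile=exp_big.log
impdp scott/tiger tables=BIG_TABLE directory=dp_dir dumpfile=big_%U.dmp parallel=4 logfile=imp_big.log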


Server Utilities :: Exporting Huge Amount Of Data?

Jul 25, 2011

I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with fixed record length. There will be about 6 files, for a total of about 10 GB.

How can I export those views in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
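exp and expdp indeed only write their own binary dump format, so for fixed-length text the usual choices are SQL*Plus spool (simplest) or UTL_FILE (more control). A spool sketch, where the view and column names are placeholders and RPAD/LPAD produce the fixed record layout (pad the last column too, so every record has the same length):

SET PAGESIZE 0 LINESIZE 400 TRIMSPOOL ON FEEDBACK OFF HEADING OFF
SPOOL /tmp/extract_01.txt
SELECT RPAD(customer_name, 40)
       || RPAD(city, 30)
       || LPAD(TO_CHAR(balance, 'FM9999999990'), 12)
  FROM my_view_1;
SPOOL OFF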


Server Utilities :: Error While Impdp?

Oct 8, 2010

Getting the below error while running impdp:

Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema ADAS is needed to import this object, but is unaccessible
ORA-28031: maximum of 148 enabled roles exceeded

[code]...

ORA-06512: at "SYS.KUPW$WORKER", line 1342
ORA-06512: at line 2

Job "SYS"."SYS_IMPORT_FULL_01" stopped due to fatal error at 17:13:38


Server Utilities :: Impdp Changes The Ownership Of A DBA_JOB

Apr 30, 2009

When I export a schema that owns several DBA_JOBS jobs using expdp and then import it into another database, the jobs change ownership to the schema I used for the import.

For example :
The log_user, priv_user and schema_user used to be schema 'A',
but after the impdp (assuming I imported as SYSTEM) the log_user, priv_user and schema_user are now SYSTEM!

Is this expected behavior? How can I avoid it? The job will fail when executing as SYSTEM, and I have to manually recreate it as user A again.
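One commonly suggested workaround (a sketch only; the file and directory names are placeholders) is to run the import while connected as the schema that should own the jobs, so the DBMS_JOB entries are created under that user instead of SYSTEM:

impdp A/password directory=dp_dir dumpfile=a_schema.dmp schemas=A logfile=imp_a.log

User A needs read/write access on the directory object for this to work; otherwise the jobs have to be dropped and re-submitted as user A after the import.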


Server Utilities :: User Creation While Impdp

Jul 9, 2013

I have a question to clarify regarding user creation during export and import.

Will the user get created, along with roles and privileges, by default when using the impdp command?
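In a schema-mode dump the user definition, system grants, role grants and default role settings are part of the export and are created on import when the importing user has sufficient privileges. One way to see exactly what would be created, without touching the database, is to spool that DDL to a file first (a sketch; the dump file and directory names are placeholders, and the INCLUDE list is easiest to manage in a parfile):

impdp system/password directory=dp_dir dumpfile=exp.dmp sqlfile=user_ddl.sql include=USER,SYSTEM_GRANT,ROLE_GRANT,DEFAULT_ROLE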


Server Utilities :: Impdp Getting Grants And Privs

May 8, 2012

I did an export using the following parfile (see below). I want to import all the objects associated with this schema into another DB, but I don't want to overwrite any of the permissions, such as grants.

Is there a way I can get the grants into a SQL file before I do the import? If so, please provide an example.

cat exp_par

DUMPFILE=exp.dmp
LOGFILE=exp.log
DIRECTORY=DBBACKUP
schemas=t1
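Data Pump can handle both halves of this: write the grant statements into a script with SQLFILE, then run the real import with grants excluded (a sketch reusing the parfile values above; GRANT is the standard object-path name):

impdp system/password directory=DBBACKUP dumpfile=exp.dmp schemas=t1 sqlfile=t1_grants.sql include=GRANT

impdp system/password directory=DBBACKUP dumpfile=exp.dmp schemas=t1 exclude=GRANT logfile=imp_t1.log

The first command only writes t1_grants.sql and does not execute anything; the second performs the import without touching grants.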


PL/SQL :: Creating A Package Body

Jan 6, 2013

I'm trying to create a package body, but I keep getting a message saying the package body was created with compilation errors.

CREATE OR REPLACE PACKAGE BODY MY_PACK
IS
PROCEDURE MY_PROC(A NUMBER, B NUMBER)
IS
C NUMBER(4);
BEGIN
C:=A+B;
[code].......
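"Created with compilation errors" usually means either that there is no matching package specification for the body or that there is a PL/SQL error inside it, and SHOW ERRORS reveals which. A minimal matching spec/body pair based on the fragment above (the names come from the post; the specification and the END clauses are assumed):

CREATE OR REPLACE PACKAGE MY_PACK IS
  PROCEDURE MY_PROC(A NUMBER, B NUMBER);
END MY_PACK;
/

CREATE OR REPLACE PACKAGE BODY MY_PACK IS
  PROCEDURE MY_PROC(A NUMBER, B NUMBER) IS
    C NUMBER(4);
  BEGIN
    C := A + B;
  END MY_PROC;
END MY_PACK;
/

SHOW ERRORS PACKAGE BODY MY_PACK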


Server Utilities :: IMPDP Hangs On Table Data?

May 26, 2010

I'm trying to do a network datapump between oracle databases, and it seems to continually hang when it gets to the point where it should be processing table data.

C:>impdp DP_USER/DP_USER parfile=sde_webmap_2.par

Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 May, 2010 17:42:03

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "DP_USER"."SYS_IMPORT_FULL_01": DP_USER/******** parfile=sde_webmap_2.par
Estimate in progress using BLOCKS method...

[code]...

It just sits at this point indefinitely. The parfile, for those interested:

directory=datapumps
logfile=sde_webmap_2.log
network_link=backup
full=y
INCLUDE=SCHEMA:"IN ('SDE_WEBMAP_BUSINESS','SDE_WEBMAP_BUSINESS_A','SDE_WEBMAP_BUSINESS_B')"

And the results from V$SESSION_LONGOPS

69 SYS_IMPORT_FULL_01 IMPORT 0 1031 MB 5/26/2010 5:50:37 PM 5/26/2010 6:03:29 PM
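Since this is a NETWORK_LINK import, a quick sanity check while it sits there is whether the database link itself responds from the importing side, and what the job reports when attached (a diagnostic sketch using the link and job names shown above):

SELECT COUNT(*) FROM all_tables@backup;

impdp DP_USER/DP_USER attach=SYS_IMPORT_FULL_01
Import> status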


Server Utilities :: Indexes Parameters Original Impdp

Sep 25, 2011

As we know, the original imp utility has an INDEXES parameter that is used to control the generation of CREATE INDEX DDL. Is there a comparable parameter in impdp?
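There is no parameter with that name, but the same effect is available through SQLFILE with an INCLUDE filter, which writes the CREATE INDEX statements into a script without executing anything (a sketch with placeholder names):

impdp scott/tiger directory=dp_dir dumpfile=exp.dmp sqlfile=create_indexes.sql include=INDEX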


Server Utilities :: IMPDP - Big Table's Data Can't Be Imported

Oct 23, 2013

I was running impdp for my company's database. Everything is fine except one problem: a big table (61.1 GB) couldn't be imported. The log shows "imported TBL_XXXX 61.1G 0 out of 147653981 rows". The alert log had no warning message. I didn't know ...
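A commonly suggested next step (a sketch only; the owner, directory and dump file names are placeholders) is to re-run just that table with its own log file, so any row-level errors become visible, truncating the empty copy that is already there:

impdp system/password directory=dp_dir dumpfile=exp_%U.dmp tables=APP_OWNER.TBL_XXXX table_exists_action=truncate logfile=imp_tbl_xxxx.log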


Server Utilities :: Error Executing Impdp Utility

Nov 9, 2012

I exported three databases (servicedesk, report and mostcmdb) on server A to dump files and moved them to a similar dump location on server B. Now I want to import them into B by using:

SQL> impdp system/OSS_MOS7100 dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb

The error was:

SP2-0734: unknown command beginning "impdp syst..." - rest of line ignored.
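SP2-0734 comes from SQL*Plus: impdp is an operating-system utility, not a SQL*Plus command, so it has to be run from the shell prompt and pointed at a directory object (a sketch; dp_dir is a placeholder for a directory object that maps to the dump location on server B):

$ impdp system/password directory=dp_dir dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb logfile=mostcmdb_imp.log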


SQL & PL/SQL :: Compiling Package BODY Causes Invalidation?

Aug 25, 2010

Up until today we assumed that compiling a package BODY-only would not cause invalidation of other code. However when we compiled the body of the HTP package, the OWA package became invalid and required re-compilation.

Oracle Database 10g Release 10.2.0.4.0 - 64bit Production
PL/SQL Release 10.2.0.4.0 - Production
CORE    10.2.0.4.0      Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production


SQL & PL/SQL :: How To See The Contents Of Wrapped Package Body

Mar 31, 2011

I want to see the definition of a stored procedure that is inside the body of a package, but the package is wrapped, so I am not able to see the content of the package body. I am new to Oracle and I don't know how to decrypt a wrapped package. How can I read the content of a wrapped package body inside Oracle?


SQL & PL/SQL :: Error While Compiling Package Body?

Jun 28, 2011

When I compile the package body I get errors.

CREATE OR REPLACE PACKAGE BODY CEE_OSPCM_PKG AS
PROCEDURE CEEOT_WORK_TYPE_PRC IS
CURSOR CUR_WORK_TYPE IS
SELECT *

[code]...
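The compiler messages can be listed right after the failed compilation, either with SHOW ERRORS in SQL*Plus or from the USER_ERRORS view (this is the generic diagnostic step, nothing specific to this package):

SHOW ERRORS PACKAGE BODY CEE_OSPCM_PKG

SELECT line, position, text
FROM   user_errors
WHERE  name = 'CEE_OSPCM_PKG' AND type = 'PACKAGE BODY'
ORDER  BY sequence;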


SQL & PL/SQL :: Missing UTL_SMTP Package Body?

Mar 5, 2012

I tried to create the UTL_SMTP package using the '@?/rdbms/admin/utlmtp.sql' script, but no package body for UTL_SMTP was created; only the UTL_SMTP package specification was created.
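Assuming a standard installation (an assumption, so check the script names under $ORACLE_HOME/rdbms/admin first), the UTL_SMTP specification is created by utlsmtp.sql and the wrapped body by prvtsmtp.plb, both run as SYSDBA:

sqlplus / as sysdba
@?/rdbms/admin/utlsmtp.sql
@?/rdbms/admin/prvtsmtp.plb

SELECT object_name, object_type, status FROM dba_objects WHERE object_name = 'UTL_SMTP';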


Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)

May 29, 2012

I need to recreate/clone my database on a new machine. The two machines are not connected over the network.

Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

Step 2.
FTP dump files to Windows

Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y

I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"

Done in AIX:
create directory dp as '/bak'
grant read, write on directory dp to public;
grant exp_full_database to username;

Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;
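Two things commonly cause ORA-31619 in this kind of transfer: the files being FTPed in ASCII instead of binary mode, and the dump being written by a higher release (10.2.0.5) than the one reading it (10.1.0.2), which Data Pump only allows when the export is taken with a matching VERSION setting. A hedged sketch of the export side, reusing the original command:

expdp username/password@db1 full=y version=10.1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

After that, re-transfer the files in binary mode and retry the import on Windows.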


Server Utilities :: Increasing Parallel Process In Impdp In Runtime?

Apr 15, 2012

I started Data Pump without using the parallel option.

I issued the following command

impdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24

The total dumpfile size is 200 GB..

Now I want to add parallel processes to the job.
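A running job can have workers added without restarting it, by attaching to it and using the interactive PARALLEL command (the job name below is a typical default; the real one is listed in DBA_DATAPUMP_JOBS, and the bef_cob_%U.dmp file set needs enough pieces for the extra workers to be useful):

impdp t24/t24 attach=SYS_IMPORT_SCHEMA_01
Import> parallel=4
Import> status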


Server Utilities :: Faster Impdp Excluding Indexes And Constraints

May 24, 2012

We need to set up a test system using data from one of the offsite customer locations. The option we would like to use is impdp over a network link. Will the steps below make the import faster, given that we exclude indexes and constraints? (A rough sketch of the commands follows the list.)

Steps:
1) Import the schema metadata only, excluding constraints, ref_constraints and indexes.
2) Import the schema with data only.
3) Spool the DDL for constraints, ref_constraints and indexes from the source using dbms_metadata.get_ddl.
4) Execute the index creation DDL on the destination.
5) Execute the constraints and ref_constraints DDL on the destination.
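A rough sketch of those steps with Data Pump parameters (the schema name, database link and file names are placeholders; INDEX, CONSTRAINT and REF_CONSTRAINT are the standard object-path names):

impdp test_user/password network_link=cust_link schemas=APP content=METADATA_ONLY exclude=INDEX,CONSTRAINT,REF_CONSTRAINT logfile=imp_meta.log
impdp test_user/password network_link=cust_link schemas=APP content=DATA_ONLY logfile=imp_data.log

-- on the source, spool the DDL to run later on the destination
SELECT dbms_metadata.get_ddl('INDEX', index_name, owner) FROM dba_indexes WHERE owner = 'APP';
SELECT dbms_metadata.get_ddl('CONSTRAINT', constraint_name, owner) FROM dba_constraints WHERE owner = 'APP' AND constraint_type IN ('P','U','C');
SELECT dbms_metadata.get_ddl('REF_CONSTRAINT', constraint_name, owner) FROM dba_constraints WHERE owner = 'APP' AND constraint_type = 'R';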


Server Utilities :: Perform Impdp Command For Specific Function?

Jun 26, 2013

Below is my import command for importing a specific function from the export file, but I am getting the errors shown below.

impdp system/PASSWORD schemas=TNC6 directory=dumpdir dumpfile=FULL01-02-2011.dmp logfile=IMP.log include=FUNCTION:"IN ('TNC_IS_NUMBER')"

ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression
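ORA-39071 here is usually a quoting problem: the operating-system shell strips the double quotes before impdp sees them. Putting the INCLUDE clause in a parameter file avoids the shell entirely (a sketch; the parfile name is arbitrary and the other values are taken from the command above):

imp_func.par:
DUMPFILE=FULL01-02-2011.dmp
DIRECTORY=dumpdir
SCHEMAS=TNC6
LOGFILE=IMP.log
INCLUDE=FUNCTION:"IN ('TNC_IS_NUMBER')"

impdp system/PASSWORD parfile=imp_func.par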


Server Utilities :: Impdp Procedure Or Function 10.2.0.4 Error ORA-00904

Sep 16, 2011

I get the error ORA-00904: "USERNAME": invalid identifier. What is the problem?

My command impdp :
impdp datapump/password@%ORACLE_SID% DIRECTORY=datapump schemas=gcom INCLUDE=PROCEDURE remap_tablespace=gpao_indx:indx remap_tablespace=gpao_data:data DUMPFILE=%ORA_DUMPFILE% LOGFILE=Imp_%annee%%mois%%jour%%hh%%min%%sec%_%ORACLE_SID%.log

The result :
;;;
Import: Release 10.2.0.4.0 - Production on Thursday, 15 September, 2011 17:47:16

Copyright (c) 2003, 2007, Oracle. All rights reserved.
;;;
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Table maître "DATAPUMP"."SYS_IMPORT_SCHEMA_29" chargée/déchargée avec succès

[Code]....


Server Utilities :: EXP Versus IMPDP Interrogating Dump File

Jul 16, 2010

I have a bit of an issue with Oracle datapump dump files.

Today, I manage the export and import of oracle dump files. As part of the batch export process I have a script which essentially says:

For each schema related to my application in THIS instance, export the schema via the system user (the system user gives me privileges on all schemas).

On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the "export client version", "date the schema was dumped", and "what schema it was dumped from". All useful info presented in my UI.

Sample output: Begin

EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##

[code]....

Sample output: End

This matters in that I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on the naming convention I can determine the schema type (production or test). Additionally, and probably most importantly, I am assured of where the data has come from.

In looking at "expdp" and the dump file. Using the same method as above, it appears the data pump dump DOES NOT carry similar headers. Because of this, I am unable to return very little useful info from the dump file.

I realize I could run the impdp with the "sqlfile=myfile.sql" and then interrogate the sql file for the info. But on large dump files this would be fairly time consuming compared to a "head -20" on a dump file.
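For Data Pump files the header information is exposed through DBMS_DATAPUMP.GET_DUMPFILE_INFO rather than as readable text at the top of the file, so the rough equivalent of the head -20 check is a short PL/SQL call (a sketch; the file and directory names are placeholders, and the numeric item codes are described with the DBMS_DATAPUMP package, so this may need adjusting for your release):

SET SERVEROUTPUT ON
DECLARE
  info  sys.ku$_dumpfile_info;
  ftype NUMBER;
BEGIN
  dbms_datapump.get_dumpfile_info(
    filename   => 'UserXXX.dmp',
    directory  => 'DP_DIR',
    info_table => info,
    filetype   => ftype);
  FOR i IN 1 .. info.COUNT LOOP
    dbms_output.put_line(info(i).item_code || ' : ' || info(i).value);
  END LOOP;
END;
/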







