IMPDP - ORA-31626 - Job Does Not Exist?

Apr 3, 2012

I am trying to do an impdp using a network link and it fails with ORA-31626: job does not exist. It worked with a different database on the same server. The network link is there, the data pump directory exists, and the read and write privileges are granted to the oracle user. There are no other data pump jobs running:

SQL> select JOB_NAME,STATE from DBA_DATAPUMP_JOBS;

no rows selected

My database details:
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production

The whole error is listed below:

omdx16dd$ impdp oracle@xtst080 SCHEMAS=BRM REMAP_SCHEMA=BRM:MBR_SLN CONTENT=METADATA_ONLY DIRECTORY=DATA_PUMP_DIR NETWORK_LINK=po02

Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 03 April, 2012 17:20:33

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Password:

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-00942: table or view does not exist
ORA-00942: table or view does not exist

[code]...
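For reference, a quick sanity check worth running from SQL*Plus on the importing database before retrying (a minimal sketch; the link name po02 comes from the command above, everything else is generic):

SQL> select * from dual@po02;                                      -- does the network link resolve and log in?
SQL> select grantee, privilege from dba_tab_privs
     where table_name = 'DATA_PUMP_DIR';                           -- read/write on the directory object
SQL> select owner_name, job_name, state from dba_datapump_jobs;    -- any orphaned master tables left over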

View 1 Replies



IMPDP In 11g XE Database

Sep 20, 2012

I have a client database on a remote server, and I take a daily export dump of it. Now I have created a new Oracle 11g XE database on my local machine, and I want to import that dump into this XE database.

I will ftp the dump file to the local machine, no issues there. I just want to know whether the import into the XE database will be successful. I mean, can we import data into an XE database?
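For reference, Data Pump import works on XE just as on the other editions, subject to XE's storage limit (11 GB of user data in 11g XE). A minimal sketch of the local side, with illustrative names only:

SQL> create directory imp_dir as '/home/oracle/dumps';   -- hypothetical folder holding the ftp'd dump
SQL> grant read, write on directory imp_dir to app_user;

impdp app_user/password@XE directory=imp_dir dumpfile=daily_exp.dmp logfile=daily_imp.log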

View 2 Replies View Related

Expdp On 10G AIX / Impdp 11G Linux?

May 4, 2013

First of all, I'm not a DBA. When I try to import a dmp file generated on AIX with Oracle 10g into a RHEL6 11g machine, I get a lot of issues. How should I go about this?
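As a rough sketch of the usual path (all names here are illustrative): export on the 10g AIX side, transfer the file in binary mode, then import with the 11g impdp, which can read 10g dump files directly.

# on the AIX 10g source
expdp app_owner/password directory=dp_dir dumpfile=app_schema.dmp logfile=app_exp.log schemas=app_owner

# copy app_schema.dmp with binary ftp or scp, then on the RHEL6 11g target
impdp app_owner/password directory=dp_dir dumpfile=app_schema.dmp logfile=app_imp.log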

View 3 Replies View Related

Error IMPDP ORA-02219?

Feb 13, 2012

I've been trying desperately for a few days to import an Oracle 10.2.0.5 database from a Red Hat Enterprise Linux Server release 4.1 machine onto my Windows Server 2008 R2 x64 machine running 11.2.0.1. The expdp export is performed without errors with the following options:

DUMPFILE = expinclude.dmp
DIRECTORY = exp_dir
LOGFILE = expinclude.log
TABLES = (list of tables with partitions)
ESTIMATE = STATISTICS

When I try to import my dump, I get a multitude of errors like this:

ORA-39083: Object type TABLE:"VL_DATA"."LOG_VOUCHERS_MESSAGES" failed to create with error:
ORA-02219: invalid NEXT storage option value
Failing sql is:

After having scoured the forums on the net I have found very little info on this error (besides the error text itself, which turns up on a thousand sites). I tried multiple combinations for the import (excluding the indexes, importing only the structure, importing only the data, etc.) without success.
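One hedged thing to try for ORA-02219: ask impdp to drop the source storage clauses so segments are created with the target tablespace defaults (a sketch; only the directory and dump file names are taken from the post):

impdp system/password DIRECTORY=exp_dir DUMPFILE=expinclude.dmp LOGFILE=impinclude.log TRANSFORM=STORAGE:N

# or, to strip all physical attributes rather than just the STORAGE clause:
impdp system/password DIRECTORY=exp_dir DUMPFILE=expinclude.dmp LOGFILE=impinclude.log TRANSFORM=SEGMENT_ATTRIBUTES:N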

View 3 Replies View Related

How To Restore A Database Using Impdp

Oct 5, 2012

I'm a SQL DBA, so pretty new to Oracle.

I'm trying to restore a database using impdp (it was created using expdp).

Here's my impdp statement:

impdp system/oracle full=y directory=ATA_PUMP_IR dumpfile=IRISexp%U.dmp remap_datafile=/'+PDATA/iris/datafile/undotbs2.302.699230857/':\'D:\NickL\App\oradata\DT_IRIS_EXPORT_2012\DATAFILE\undotbs2.302.699230857'/

This is the error I get:

ORA-39083: Object type TABLESPACE failed to create with error:

ORA-01276: Cannot add file +PATA/iris/datafile/undotbs2.302.6992308597. File has an Oracle Managed Files file name.

View 1 Replies View Related

Multiple Errors During Impdp

Feb 11, 2013

I am trying to use the NETWORK_LINK option in datapump and import a table from one server to another. I gave the below command:

C:>impdp example/example@db DIRECTORY=DATA_PUMP_DIR
NETWORK_LINK=db.legal.regn.net remap_schema=BI:example
tables=BI.BI_DIRECT dumpfile=BI.dmp logfile=BI.log

Got the following errors :

ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation

Is this error related to permissions at the OS level (Windows 7 in my case)? I manually created the folder 'DATA_PUMP_DIR' in the specified directory path. Although the directory I created (DATA_PUMP_DIR) shows as read-only in the General tab of its properties, I am able to create files under the folder 'DATA_PUMP_DIR'.
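ORA-39070 with ORA-29283 usually means the directory object inside the database does not point to a path that the Oracle service on the database server can write to; creating a folder on the client machine does not help, because DATA_PUMP_DIR is resolved on the server. A minimal check-and-fix sketch (the path is illustrative):

SQL> select directory_name, directory_path from dba_directories
     where directory_name = 'DATA_PUMP_DIR';
SQL> create or replace directory DATA_PUMP_DIR as 'C:\oracle\dpump';   -- a server-side folder the Oracle service can write to
SQL> grant read, write on directory DATA_PUMP_DIR to example;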

View 3 Replies View Related

Recognized Way Of Replacing Entire DB With Impdp

Nov 15, 2012

I did an expdp on the prod DB and have been doing a straight impdp as a test, just to have data to work with, but it spews 214 errors. Mostly these:

ORA-31684 lots
ORA-39083 some
ORA-39151 lots
ORA-39082 lots
ORA-39111 lots
ORA-39112 lots

I can see that I can use REUSE_DATAFILES and TABLE_EXISTS_ACTION to overwrite tables by default, but is there a recognised way of replacing the entire DB with impdp? Do I just create the instance (with the init file) and not build the tables, or what? I'll experiment, but I'm interested in whether there is a DBA best practice for this sort of thing.
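For the table-level errors, a hedged sketch of the overwrite options mentioned above (generic names; this is only the impdp side, not a full refresh procedure):

impdp system/password full=y directory=dp_dir dumpfile=prod_full.dmp logfile=prod_imp.log table_exists_action=replace

# TABLE_EXISTS_ACTION accepts SKIP (the default), APPEND, TRUNCATE or REPLACE;
# REUSE_DATAFILES=Y additionally lets CREATE TABLESPACE statements reuse existing datafile names.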

View 3 Replies View Related

How Remap_datafile Parameter In IMPDP Works In 10g

Oct 15, 2010

I want to know how the REMAP_DATAFILE parameter of impdp works in Oracle 10g. Walk me through the whole scenario for two databases, one on a Linux platform and the other on a Windows platform: I want to import the tablespace data from the Linux server database into the Windows server database.
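A minimal sketch of how REMAP_DATAFILE is normally used for a Linux-to-Windows move; it only affects DDL that still references the old file names (for example CREATE TABLESPACE during a full import), and the quoting is usually easiest inside a parfile. All file names below are made up:

# remap.par (hypothetical)
FULL=Y
DIRECTORY=dp_dir
DUMPFILE=linux_full.dmp
REMAP_DATAFILE="'/u01/oradata/orcl/users01.dbf':'D:\oradata\orcl\users01.dbf'"

impdp system/password parfile=remap.par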

View 5 Replies View Related

Server Utilities :: Error While Impdp?

Oct 8, 2010

Getting the below error while running impdp:

Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema ADAS is needed to import this object, but is unaccessible
ORA-28031: maximum of 148 enabled roles exceeded

[code]...

ORA-06512: at "SYS.KUPW$WORKER", line 1342
ORA-06512: at line 2

Job "SYS"."SYS_IMPORT_FULL_01" stopped due to fatal error at 17:13:38

View 2 Replies View Related

Import Data From Dmp File Using Command IMPDP?

May 2, 2011

I am trying to import data from a dmp file using the impdp command. Here is the command:

impdp
USERID= core_edb_20112_ct/local
DIRECTORY=dir_core20112
DUMPFILE=CORE_EDB_20112_CT_20110426.DMP
LOGFILE=log_core02112_1.log
SCHEMAS=core_edb_20112_ct
REMAP_SCHEMA=core_edb_20112_ct:core_edb_20112_ct
PARALLEL=4
METRICS=Y TRANSFORM=OID:N
TRANSFORM=SEGMENT_ATTRIBUTES:N
REMAP_TABLESPACE=C64_EDB_TS:C64_EDB_TS

I am trying to import data into the following user: core_edb_20112_ct/local.
This user is already created, using the tablespace named C64_EDB_TS.
The dmp file resides in the location dir_core20112 (e:\oracle).

I am getting the following error while I try to import:

Import: Release 11.2.0.1.0 - Production on Mon May 2 12:47:54 2011

Copyright © 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-31694: master table "CORE_EDB_20112_CT"."SYS_IMPORT_FULL_01" failed to load/unload
ORA-02354: error in exporting/importing data
ORA-02368: the following file is not valid for this load operation
ORA-02369: internal number in header in file e:\oracle\core_edb_20112_ct_20110426.dmp is not valid.

The DMP is copied from a different network location into the local drive where the command is running.

View 1 Replies View Related

SQL & PL/SQL :: Export A Schema Using Expdp And Import Using Impdp

Apr 20, 2011

I have a process to export a schema using expdp and import it using impdp. Everything creates successfully except for a trigger: the trigger gives an error that the table or view does not exist.

The account I use to import the schema is different from the schema user, but it is a highly privileged account. I notice that the schema in the CREATE OR REPLACE TRIGGER line of code is remapped (I am using remapping in the impdp syntax), while the rest of the trigger body (which is just a sequence trigger for a primary key column) is not schema-qualified. To work around the issue, I have my bash script log into Oracle as the schema user after the import and execute the trigger code. Why do I have to do this for trigger code but not for other objects, like views, which create just fine?
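Rather than hard-coding the trigger in the bash script, one possible sketch is to pull the trigger DDL out of the target after the import and review the schema qualifiers there (object names are illustrative):

SQL> set long 100000 pagesize 0 linesize 200
SQL> spool fix_trigger.sql
SQL> select dbms_metadata.get_ddl('TRIGGER', 'MY_PK_TRG', 'NEW_SCHEMA') from dual;
SQL> spool off
-- review fix_trigger.sql (the body should reference NEW_SCHEMA objects) and run it as a DBA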

View 2 Replies View Related

Server Utilities :: Impdp Changes The Ownership Of A DBA_JOB

Apr 30, 2009

When I export a schema that owns several jobs (DBA_JOBS) using expdp and then import it into another database, the jobs change ownership to the schema I used for the import.

For example :
the log_user, priv_user and schema_user used to be schema 'A',
but after the impdp, assuming I imported as SYSTEM, the log_user, priv_user and schema_user are now SYSTEM!

Is this expected behavior? How can I avoid it? The job will fail executing as SYSTEM, and I need to manually create it as user A again.

View 5 Replies View Related

Server Utilities :: User Creation While Impdp

Jul 9, 2013

I have a question to clarify regarding user creation during export and import.

Will the user get created along with roles and privileges by default when using the impdp command?

View 6 Replies View Related

Server Utilities :: Impdp Getting Grants And Privs

May 8, 2012

I did an export using the following parfile (see below). I want to import all the objects associated with this schema into another DB, but I don't want to overwrite any of the permissions, such as grants.

Is there a way I can get the grants into a SQL file before I do the import? If so, provide an example.

cat exp_par

DUMPFILE=exp.dmp
LOGFILE=exp.log
DIRECTORY=DBBACKUP
schemas=t1
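A hedged sketch of one way to do it with the dump you already have: Data Pump can write DDL to a script instead of executing it, and the real import can then skip grants entirely (directory and dump file come from the parfile above; the sqlfile name is made up):

# write only the grant statements into a script; nothing is imported by this step
impdp system/password directory=DBBACKUP dumpfile=exp.dmp sqlfile=t1_grants.sql include=GRANT

# then do the real import without touching permissions
impdp system/password directory=DBBACKUP dumpfile=exp.dmp schemas=t1 exclude=GRANT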

View 1 Replies View Related

RAC & Failsafe :: Expdp / Impdp Performs Slow In Oracle11gR2?

Jan 23, 2012

I ran into a problem recently in an Oracle 11g R2 RAC database. Normally, when I export the sample user 'SCOTT', it hardly takes one minute. But in our RAC environment this export takes 20 to 40 minutes.

Here is the output:
---------------------------------------------------------------
[oracle@rac2 dump]$ expdp system/sys123 directory=test_dir dumpfile=scott1.dmp schemas=scott

Export: Release 11.2.0.1.0 - Production on Mon Jan 23 09:30:26 2012

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP,
Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/******** directory=test_dir dumpfile=scott1.dmp schemas=scott
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 192 KB

[Code] .......

On another machine (where I configured RAC again on Linux), I got the same problem. I also cannot find any relevant documents on Metalink. My host information:

OS : AIX 6.1
Storage : IBM (using ASM)
Database : Oracle 11g R2
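One hedged thing to test on 11.2 RAC: keep the Data Pump job on the instance you are logged into, since cross-instance workers that cannot all reach the dump directory are a common cause of exactly this symptom:

expdp system/sys123 directory=test_dir dumpfile=scott1.dmp schemas=scott cluster=n

# CLUSTER=N (available from 11.2) keeps all worker processes on the local instance;
# without it the job may start workers on other nodes that also need access to test_dir.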

View 4 Replies View Related

Server Utilities :: IMPDP Hangs On Table Data?

May 26, 2010

I'm trying to do a network datapump between oracle databases, and it seems to continually hang when it gets to the point where it should be processing table data.

C:>impdp DP_USER/DP_USER parfile=sde_webmap_2.par

Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 May, 2010 17:42:03

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "DP_USER"."SYS_IMPORT_FULL_01": DP_USER/******** parfile=sde_webmap_2.par
Estimate in progress using BLOCKS method...

[code]...

It just sits at this point indefinitely. The parfile, for those interested:

directory=datapumps
logfile=sde_webmap_2.log
network_link=backup
full=y
INCLUDE=SCHEMA:"IN ('SDE_WEBMAP_BUSINESS','SDE_WEBMAP_BUSINESS_A','SDE_WEBMAP_BUSINESS_B')"

And the results from V$SESSION_LONGOPS

69 SYS_IMPORT_FULL_01 IMPORT 0 1031 MB 5/26/2010 5:50:37 PM 5/26/2010 6:03:29 PM
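While it sits there, it may be worth checking what the Data Pump sessions are actually waiting on; a minimal monitoring sketch:

SQL> select s.sid, s.serial#, d.job_name, s.event, s.seconds_in_wait
     from dba_datapump_sessions d, v$session s
     where d.saddr = s.saddr;

-- waits such as "SQL*Net message from dblink" point at the remote side of the NETWORK_LINK;
-- "wait for unread message on broadcast channel" is normal idle time for the master process.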

View 3 Replies View Related

Server Utilities :: Indexes Parameters Original Impdp

Sep 25, 2011

As we know, there is a parameter named INDEXES in the original imp utility; it is used to generate CREATE INDEX DDL. Is there a comparable parameter in impdp?
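As far as I know there is no direct INDEXES/INDEXFILE equivalent in impdp, but the same effect can be sketched with the filtering and SQLFILE options (directory and dump file names are illustrative):

# generate the CREATE INDEX statements into a script without importing anything
impdp scott/tiger directory=dp_dir dumpfile=exp.dmp sqlfile=create_indexes.sql include=INDEX

# import the data while skipping the indexes themselves
impdp scott/tiger directory=dp_dir dumpfile=exp.dmp exclude=INDEX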

View 6 Replies View Related

Server Utilities :: IMPDP - Big Table's Data Can't Be Imported

Oct 23, 2013

I was running impdp for my company's database. Everything went fine except for one problem: a big table (61.1G) couldn't be imported. The log shows "imported TBL_XXXX 61.1G 0 out of 147653981 rows". The alert log had no warning message. I didn't know

View 12 Replies View Related

Server Utilities :: Error Executing Impdp Utility

Nov 9, 2012

I exported three databases (servicedesk, report and mostcmdb) on server A to dump files, and moved them to a similar dump location on server B. Now I want to import them into B by using:

SQL> impdp system/OSS_MOS7100 dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb

The error was:

SP2-0734: unknown command beginning "impdp syst..." - rest of line ignored.
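SP2-0734 suggests the command was typed at the SQL> prompt; impdp is an operating-system utility, not a SQL*Plus command. A minimal sketch of running it from the shell instead (the directory object name is an assumption):

$ impdp system/OSS_MOS7100 directory=dp_dir dumpfile=MOSTCMDB_0211.DMP logfile=mostcmdb_imp.log schemas=mostcmdb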

View 7 Replies View Related

Export/Import/SQL Loader :: Should Db_link Be Recreated After Impdp

Sep 17, 2013

I have imported schemas (impdp) from a production database (10gR2, RHEL 64-bit). One of the schemas has a db_link. The db_link points to a database that exists on the same server, both on the production server and on the new server where I imported the schemas.

When I run a simple query in production using this db_link, it works, but when I run the same query on the test server (where I imported the schemas), it gives me the following error: ORA-02019: connection description for remote database not found.

I run this in the prod database:

Select count(1) from SOME_TABLE@my_link;

When I run it in the new database, it gives the above ORA error, even if I qualify the table and db_link with the schema owner like this:

Select count(1) from the_owner.some_table@the_owner.my_link;

NOTE: I am not running these queries as the schema owner; I do not know the password. I am able to connect to both databases like this from the command prompt:

$ sqlplus user/password@db1
$ sqlplus user/password@db2

Does this mean that I need to recreate the db_link, perhaps every time I import?
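A quick way to see what link metadata actually got imported before deciding whether to recreate it (a sketch; MY_LINK is the placeholder used above):

SQL> select owner, db_link, username, host from dba_db_links;

-- A private link is only usable by its owning schema, and ORA-02019 also appears when the
-- HOST value is a TNS alias that is not defined in the test server's tnsnames.ora; those
-- are the usual suspects to check before recreating the link.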

View 6 Replies View Related

Export/Import/SQL Loader :: Impdp Custom Log Path

Oct 18, 2012

I am using the following command for importing data into an Oracle Database 11g on a Linux box:

impdp system/test-123 directory=test_dir dumpfile=test.dmp logfile=test.log full=yes remap_schema=test1:test

Here test_dir = /u01/app/oracle/product/11.2.0/db_1/DATABASE/test/dmp

By default the test.log file gets created in the directory mentioned above (test_dir), but my requirement is to have it written to a separate directory.
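The LOGFILE parameter accepts its own directory object, so the log can go somewhere other than the dump directory. A minimal sketch (the second directory and its path are made up):

SQL> create directory imp_log_dir as '/u01/app/oracle/admin/logs';

impdp system/test-123 directory=test_dir dumpfile=test.dmp logfile=imp_log_dir:test.log full=yes remap_schema=test1:test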

View 13 Replies View Related

Export/Import/SQL Loader :: Use Impdp To Restore To Different Database Name

Jun 20, 2012

Can I use impdp to restore to a different database name? If yes, what is the syntax? This is 10.2 on Linux.

View 2 Replies View Related

Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)

May 29, 2012

I need to recreate/clone my database on a new machine. The two machines are not connected over the network.

Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

Step 2.
FTP dump files to Windows

Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y

I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:P7DBfpac052912_dp01.dmp"

Done in AIX:
create directory dp as '/bak'
grant read, write on directory dp to public;
grant exp_full_database to username;

Done in Windows:
create directory dp as 'C:P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;

View 8 Replies View Related

Server Utilities :: Impdp Takes Huge Time In Oracle11gR2

Feb 13, 2012

1) My database dump size is about 4 GB, which was provided by the vendor.
2) In the dump there are 364949 objects in total, where

Table : 121316
LOB object : 121315
(Normal+LOB) indexes : 122317

3) Now when I run the import using SYSTEM or another user, it hangs at the stage below for 70+ hours:

impdp ntest/ntest directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log

Import: Release 11.2.0.1.0 - Production on Fri Feb 10 09:49:50 2012

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "NTEST"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "NTEST"."SYS_IMPORT_FULL_01": ntest/******** directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"NTEST" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
----
In this situation I observed the worker status and saw that some tables and some LOB objects, including LOB indexes, were being imported. The worker process does this in the background, but it does not show up in the import log file (I don't understand why it is not shown in the import logfile). It imports one table, one LOB, one LOB index, then again one table, one LOB, one LOB index, and so on.

My observation is that it first inserts data into the LOB segments and then into the normal table, and only when it starts inserting data into the normal table does that table's line appear in the import logfile.

An example of our data types:

Objects :
===================================================
LOB_FD17_RGS_TSTCD2 LOB
FD17_RGS_VERSION TABLE

(Here I see that each table has one LOB segment; in this way 121316 tables have 121316 LOBs.)

SQL> desc FD17_RGS_VERSION
Name Null? Type
----------------------------------------- -------- ----------------------------
RECID VARCHAR2(255)
XMLRECORD BLOB

Our observation is that inserting the BLOBs is probably what causes the slowness. Is there any patch, or is there any known bug regarding BLOB/LOB objects in Oracle 11gR2?

View 6 Replies View Related

Server Utilities :: Increasing Parallel Process In Impdp In Runtime?

Apr 15, 2012

I started Data Pump without using the parallel option.

I issued the following command

impdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24

The total dumpfile size is 200 GB.

Now I want to add parallel processes to the job.
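Parallelism can be changed on a running job from Data Pump's interactive mode; a minimal sketch, assuming the default job name (check dba_datapump_jobs for the real one):

# attach to the running job (the job name below is an assumption)
impdp t24/t24 attach=SYS_IMPORT_SCHEMA_01
Import> parallel=4
Import> status
Import> continue_client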

View 9 Replies View Related

Server Utilities :: Faster Impdp Excluding Indexes And Constraints

May 24, 2012

We need to set up a test system using data from one of the offsite customer locations. The option we would like to use is impdp with a network link. Will the steps given below make the import faster if we exclude indexes and constraints? (A rough command sketch follows the steps.)

Steps
Import the schema with metadata only, excluding constraints, ref_constraints and indexes
Import the schema with data only
Spool the DDL for constraints, ref_constraints and indexes from the source using dbms_metadata.get_ddl
Execute the index creation DDL on the destination
Execute the constraints and ref_constraints DDL on the destination
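As mentioned above, a rough sketch of how those steps might translate into commands (the schema, link, user and directory names are placeholders):

# steps 1 and 2: metadata without indexes/constraints, then data only, over the network link
impdp test_user/pwd network_link=cust_link schemas=app_owner content=metadata_only exclude=index exclude=constraint exclude=ref_constraint logfile=meta.log
impdp test_user/pwd network_link=cust_link schemas=app_owner content=data_only logfile=data.log

# step 3, on the source: spool the missing DDL, for example
#   select dbms_metadata.get_ddl('INDEX', index_name, owner) from dba_indexes where owner = 'APP_OWNER';
# steps 4 and 5: run the spooled index and constraint scripts on the destination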

View 1 Replies View Related

Server Utilities :: Perform Impdp Command For Specific Function?

Jun 26, 2013

Below is my import command for importing a specific function from an export file, but I am getting the errors below:

impdp system/PASSWORD schemas=TNC6 directory=dumpdir dumpfile=FULL01-02-2011.dmp logfile=IMP.log include=FUNCTION:"IN ('TNC_IS_NUMBER')"

ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression
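ORA-39071 here is very often just the operating-system shell eating the quotes; putting the filter in a parfile sidesteps the escaping entirely. A sketch (the parfile name is made up):

# func_imp.par
SCHEMAS=TNC6
DIRECTORY=dumpdir
DUMPFILE=FULL01-02-2011.dmp
LOGFILE=IMP.log
INCLUDE=FUNCTION:"IN ('TNC_IS_NUMBER')"

impdp system/PASSWORD parfile=func_imp.par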

View 4 Replies View Related

Server Utilities :: Impdp Procedure Or Function 10.2.0.4 Error ORA-00904

Sep 16, 2011

I get the error ORA-00904: "USERNAME": invalid identifier. What is the problem?

My impdp command:
impdp datapump/password@%ORACLE_SID% DIRECTORY=datapump schemas=gcom INCLUDE=PROCEDURE remap_tablespace=gpao_indx:indx remap_tablespace=gpao_data:data DUMPFILE=%ORA_DUMPFILE% LOGFILE=Imp_%annee%%mois%%jour%%hh%%min%%sec%_%ORACLE_SID%.log

The result:
;;;
Import: Release 10.2.0.4.0 - Production on Thursday, 15 September, 2011 17:47:16

Copyright (c) 2003, 2007, Oracle. All rights reserved.
;;;
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "DATAPUMP"."SYS_IMPORT_SCHEMA_29" successfully loaded/unloaded

[Code]....

View 2 Replies View Related

Export/Import/SQL Loader :: IMPDP Mapping To A Single Tablespace

Oct 30, 2012

I have a dumpfile from a database with hundreds of tablespaces. Do I need to remap all of them on impdp, or is there a way to point all tables to a default tablespace? I mean, the source database has 200 tablespaces; the target database has just 1.
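There is no wildcard for REMAP_TABLESPACE as far as I know, but the parameter can be repeated, so generating the clauses is one workable sketch (the target tablespace USERS and the exclusion list are placeholders):

-- on the source, generate one remap clause per tablespace
SQL> select 'REMAP_TABLESPACE=' || tablespace_name || ':USERS'
     from dba_tablespaces
     where tablespace_name not in ('SYSTEM','SYSAUX','TEMP','UNDOTBS1');

-- paste the output into the impdp parfile alongside DIRECTORY, DUMPFILE and the rest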

View 2 Replies View Related

Server Utilities :: EXP Versus IMPDP Interrogating Dump File

Jul 16, 2010

I have a bit of an issue with Oracle datapump dump files.

Today, I manage the export and import of oracle dump files. As part of the batch export process I have a script which essentially says:

For each schema related to my application in THIS instance, export the schema via the SYSTEM user (the SYSTEM user gives me privileges on all schemas).

On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the "export client version", "date the schema was dumped", and "what schema it was dumped from". All useful info presented in my UI.

Sample output: Begin

EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##

[code]....

Sample output: End

This matters in that I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on naming convention I can determine the schema type (production or test). Additionally, and probably most importantly, I am assured of where the data has come from.

Looking at expdp and its dump file using the same method as above, it appears the data pump dump does NOT carry similar headers. Because of this, I am able to extract very little useful info from the dump file.

I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time consuming compared to a "head -20" on a dump file.
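For what it's worth, Data Pump exposes header information through an API rather than in a grep-friendly text header; a hedged PL/SQL sketch using DBMS_DATAPUMP.GET_DUMPFILE_INFO (the file and directory names are placeholders, and the item codes come back as numbers that have to be decoded from the documentation):

set serveroutput on
declare
  l_info sys.ku$_dumpfile_info;   -- collection of (item_code, value) pairs describing the header
  l_type number;                  -- file type; per the docs 1 is a Data Pump file, 2 an original exp file
begin
  dbms_datapump.get_dumpfile_info('UserXXX.dmp', 'DP_DIR', l_info, l_type);
  dbms_output.put_line('file type: ' || l_type);
  for i in 1 .. l_info.count loop
    dbms_output.put_line('item ' || l_info(i).item_code || ' = ' || l_info(i).value);
  end loop;
end;
/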

View 4 Replies View Related






