Oracle Data Pump Dumps Consistency

Mar 8, 2011

By default, the Oracle expdp utility does not guarantee that a dump is consistent as of a single point in time. Tables are unloaded one after another, so each table is consistent within itself but not necessarily with the others.

When importing such a dump you can get errors like:

ORA-39083: Object type REF_CONSTRAINT failed to create with error...
ORA-02298: cannot validate (...) - parent keys not found

To make your Oracle Data Pump dumps consistent, use the FLASHBACK_SCN or FLASHBACK_TIME option.

Example:

expdp myschema/... DIRECTORY=expdp_dir DUMPFILE=mydb_20110308.dmp logfile=mydb_20110308.log SCHEMAS=myschema FLASHBACK_TIME=\"TO_TIMESTAMP\(TO_CHAR\(SYSDATE,\'YYYY-MM-DD HH24:MI:SS\'\),\'YYYY-MM-DD HH24:MI:SS\'\)\"
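The same effect can be had with FLASHBACK_SCN. A minimal sketch (the SCN shown is a placeholder; take the real value from your own database first):

SQL> SELECT current_scn FROM v$database;

expdp myschema/... DIRECTORY=expdp_dir DUMPFILE=mydb_20110308.dmp SCHEMAS=myschema FLASHBACK_SCN=1234567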




Data Type Consistency CASE / Decode?

Apr 6, 2012

CASE expects data type consistency; DECODE does not. The two clearly handle data types differently. Consider:

SQL> select decode(2,1,1, 2,'2', ' three' )"RESULT" from dual;

RESULT
---
2
SQL> select case 2
2 when 1 then 1
3 when 2 then 2
4 else 3
5 end "RESULT" from dual;

RESULT
2
SQL> select case 2
2 when 1 then 1
3 when '2' then 2
4 else 3
5 end "RESULT" from dual;

ERROR at line 3:
ORA-00932: inconsistent datatypes: expected NUMBER got CHAR

Does anyone know the cause of the error here? I mean: does CASE check data type consistency every time?

My thought is this: I am trying to get the same output by different methods. Is this an Oracle bug?

I want to know how Oracle handles the third query. I am testing the function purely against DUAL (a dummy table), so there is obviously no possibility of different column data types. My guess is that Oracle compares numbers to numbers and characters to characters, and that is why it throws the error. Am I right?

ORA-00932: inconsistent datatypes: expected NUMBER got CHAR

See this query:

SQL> select case 2
2 when 1 then 1
3 when '2' then 2
4 else 3
5 end "RESULT" from dual;

What exactly is wrong on line 3? Even when I consult reference books, some of these concepts are hard to understand.
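For what it's worth, a sketch of one way to make the third query valid, assuming the issue is simple-CASE type matching (the selector 2 is a NUMBER, so every WHEN expression must also be a NUMBER; an explicit conversion satisfies that):

SQL> select case 2
  2  when 1 then 1
  3  when to_number('2') then 2
  4  else 3
  5  end "RESULT" from dual;

RESULT
----------
         2

DECODE, by contrast, implicitly converts each search value to the data type of the first search value, which is why the DECODE version ran without error.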



Oracle Data Pump - Export Data From Schemas Or Tables

Oct 11, 2012

I need to export only the data from schemas or tables. How do I do that with Oracle Data Pump? When we use the SCHEMAS parameter, it exports the whole schema, not only the data, right?
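A minimal sketch of one way to do this, assuming the goal is row data without any DDL (directory and schema names are examples):

expdp myuser/... DIRECTORY=dp_dir DUMPFILE=data_only.dmp SCHEMAS=myschema CONTENT=DATA_ONLY

CONTENT=DATA_ONLY restricts the export to table rows; CONTENT=METADATA_ONLY does the opposite, and the default ALL exports both.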


Server Utilities :: Data Pump In Oracle 10g

Dec 9, 2010

I want to know how to use Data Pump in Oracle 10g, and in which situations the Data Pump concept should be used.
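For orientation, a minimal export/import round trip (all names and paths are examples; the directory object must point at a path the database server can write to):

SQL> CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpump';
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SCHEMAS=scott
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SCHEMAS=scott

Typical uses are logical backups, moving schemas between databases, and platform or version migrations.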


Export/Import/SQL Loader :: Dumps From Oracle 8i And 9i In 11g

Jun 20, 2013

I have a few exp dumps from earlier Oracle versions (8i and 9i), and now I want to import them into Oracle 11g. How do I do this, or are there some URLs/docs to refer to? I have read MOS Doc 132904.1. Where can I get the earlier exp/imp clients, and how do I use them against an Oracle 11g DB server?
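One point worth noting: dumps written by the classic exp utility must be imported with the classic imp utility, not with impdp, and imp still ships with 11g. A minimal sketch (file names are examples):

imp system/... FILE=old_9i_export.dmp FULL=Y LOG=imp_old.log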


Client Tools :: Oracle Data Pump Utility Executable Must Be Specified

Feb 4, 2011

I faced the following problem while exporting tables using Data Pump in TOAD:

"Oracle Data Pump Utility executable must be specified."


Server Utilities :: Oracle Data Pump Import Error

Jul 26, 2010

I am trying to import a database dump using the following command:

impdp system/xxxx@xxxx schemas=staging
remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaing.log
TRANSFORM=SEGMENT_ATTRIBUTES:n

It imports data fine up to some point; after that Oracle gives the following error:

Processing object type SCHEMA_EXPORT/JAVA_SOURCE/JAVA_SOURCE
ORA-39097: Data Pump job encountered unexpected error -1423
ORA-39065: unexpected master process exception in DISPATCH
ORA-01423: error encountered while checking for extra rows in exact fetch
ORA-04030: out of process memory when trying to allocate 123404 bytes (QERHJ has
h-joi,kllcqas:kllsltba)

ORA-39014: One or more workers have prematurely exited.
Job "SYSTEM"."SYS_IMPORT_SCHEMA_04" stopped due to fatal error at 11:42:03

I thought it was due to lack of memory, so I increased pga_aggregate_target from 512MB to 600MB, but I am still getting the same error.
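ORA-04030 is process-level memory at the OS, so a bigger pga_aggregate_target is not guaranteed to cure it (OS limits such as ulimit can also be the cap). Still, a sketch of checking and raising the target:

SQL> SELECT name, ROUND(value/1024/1024) mb
  2  FROM v$pgastat
  3  WHERE name IN ('aggregate PGA target parameter', 'total PGA allocated');

SQL> ALTER SYSTEM SET pga_aggregate_target = 1G;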


Server Utilities :: Import Constraints Only From Dump File Using Oracle Data Pump?

Nov 16, 2011

How can I import only the constraints from a dump file using Oracle Data Pump?
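A minimal sketch, assuming the dump already exists (names are placeholders):

impdp myuser/... DIRECTORY=dp_dir DUMPFILE=full.dmp INCLUDE=CONSTRAINT CONTENT=METADATA_ONLY

Adding SQLFILE=constraints.sql instead writes the constraint DDL to a script without executing anything.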


Server Utilities :: Import A Dump File Using Impdp Data Pump Utility On Oracle 10g

Feb 19, 2012

Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken using the traditional exp utility, and vice versa?


Server Utilities :: Transferring Changed Data From Database A To B By Data Pump?

Apr 1, 2011

I have database A (working in a live environment) and database B, a copy of it (not live). Last week I restored a whole RMAN backup of database A onto database B. Now I don't want to change anything in any schema; I only want to import new and updated records into the tables of database B.

There are around 20 schemas. Database B already has all the required database objects (procedures, functions, packages, indexes on all tables) and data in the tables; I just want to add new data and apply updates.

If I do the following in the source database:

expdp directory=dpump_dir dumpfile=table_data.dmp content=data_only schemas=ACCMAIN,HRMAIN,..... include=TABLE

and import into destination database B, will it add new data and update existing rows in the tables, without touching the table structures and indexes?
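A hedged note on what impdp can do with pre-existing tables: the TABLE_EXISTS_ACTION parameter controls the behavior, but none of its options merge rows.

impdp system/... DIRECTORY=dpump_dir DUMPFILE=table_data.dmp CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND

APPEND only inserts on top of what is there (so previously imported rows end up duplicated), TRUNCATE wipes and reloads, and REPLACE drops and recreates the table. Updating changed rows in place is not something impdp does; that would need a staging schema plus MERGE, or a replication tool.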


Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether writing the export to tape is possible, and if so, whether the data would still be accessible later if needed.


Export Using Data Pump

Jul 2, 2010

I would like to ask whether it is possible, using the Data Pump export utility, to export my full database plus some partitioned tables by selecting specific partitions. Can I have all these criteria in only one Data Pump export? If yes, is there an example?
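Partition selection itself looks like the sketch below (schema, table, and partition names are made up). Note, though, that FULL, SCHEMAS, and TABLES are mutually exclusive export modes, so the full export and the partition-level export would have to be separate jobs:

expdp system/... DIRECTORY=dp_dir DUMPFILE=parts.dmp TABLES=sales.orders:p2011q1,sales.orders:p2011q2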


Data Pump Import Error

Sep 20, 2010

I am getting the below error while doing an import through Data Pump. It looks like the shared pool does not have enough memory allocated.

Can I ignore this error? And what is the import trying to do with DELETE FROM "SYS"."IMPDP_STATS" and INSERT INTO "SYS"."IMPDP_STATS"? (See the sketch after the error listing.)

ERROR
=====
. . imported "DBA1KSD"."TDF" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT

[Code]....
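Since the failure appears to be in the statistics step (SYS.IMPDP_STATS looks like the staging table Data Pump uses when importing optimizer statistics), a commonly used workaround is to skip statistics on import and regather them afterwards. A sketch, reusing the schema name from the log above:

impdp ... EXCLUDE=STATISTICS

SQL> EXEC DBMS_STATS.GATHER_SCHEMA_STATS('DBA1KSD');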


Data Pump Index Creation

Aug 7, 2011

I am running Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production on RHEL5. I am busy with a Data Pump import, and from the log I can see that the import is busy with the constraints.

I am using the parameter EXCLUDE=INDEX during the import, and I have generated the index DDL.

Now I want to create the indexes manually while the import is still running.

Would this be advisable, and what would the impact be?
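Separately, if the index DDL ever needs regenerating, a sketch of pulling it from the dump without executing anything (file names are examples):

impdp myuser/... DIRECTORY=dp_dir DUMPFILE=full.dmp INCLUDE=INDEX SQLFILE=indexes.sql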


How To Use Data Pump Feature In Windows Xp

May 19, 2011

My operating system is Windows Xp Professional and the Oracle database version is 10.2.0.

My task is to export data using the export utility. Following are the steps I performed in SQL*Plus while connected to database db2:

SQL> CONN / AS SYSDBA

Connected.

SQL> ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;

User altered.

SQL> CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';

Directory created.

SQL> GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

Grant succeeded.

SQL> expdp scott/tiger@db2 tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

I was not able to export the tables, resulting in the following error.

SP2-0734: unknown command beginning "expdp scot..." - rest of line ignored.

What should I do to rectify the error?
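For reference, SP2-0734 appears because expdp is an operating-system executable, not a SQL*Plus command, so it must be launched from the Windows command prompt rather than at the SQL> prompt. A minimal sketch:

C:\> expdp scott/tiger@db2 tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

(Also note that '/u01/app/oracle/oradata/' is a Unix-style path; on Windows XP the directory object would need an existing Windows path.)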


Error In Data Pump Export?

Oct 1, 2011

I am exporting the schema of user kailas using Data Pump export, but I am facing an issue. Following are the details:

[oracle@localhost dbs]$ expdp kailas/kailas DIRECTORY=dump_dir dumpfile=testexp01.dmp logfile=dump_dir:mylog.log

Export: Release 10.2.0.1.0 - Production on Saturday, 01 October, 2011 16:50:07

Copyright © 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-31637: cannot create job SYS_EXPORT_SCHEMA_01 for user KAILAS
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT_INT", line 600
ORA-39080: failed to create queues "KUPC$C_1_20111001165007" and "KUPC$S_1_20111001165007" for Data Pump job
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPC$QUE_INT", line 1555
ORA-00832: no streams pool created and cannot automatically create one
[oracle@localhost dbs]$
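The bottom line of the stack, ORA-00832, says there is no streams pool and the instance could not create one automatically; the queues Data Pump creates need it. A common remedy (the size is an example) is to reserve one and restart, or to enable automatic SGA management (sga_target > 0) so the pool is sized automatically:

SQL> ALTER SYSTEM SET streams_pool_size = 64M SCOPE=SPFILE;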


Build And Load Differences Between Dumps?

Jun 19, 2012

We receive a full dump from our customer daily. We have to build the differences between tables (deltas) and load them into separate tables.

There are some questions, requirements, and restrictions:

1) Is it possible to merge the dumps?

2) What is the easiest way to build the differences? (See the sketch after this list.)

3) The whole process (import, merge, extract) should run in batch.

4) We have only basic knowledge and little time for doing this.
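For item 2, a hedged sketch, assuming each day's dump is imported into its own staging schema (all names here are hypothetical):

CREATE TABLE delta_mytab AS
SELECT * FROM stage_today.mytab
MINUS
SELECT * FROM stage_yesterday.mytab;

This captures rows that are new or changed since yesterday; deleted rows would need the same MINUS run in the opposite direction.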


Server Utilities :: Data Pump Job Parameters

Sep 2, 2011

How can I get, in PL/SQL (or with a script), the parameters that were given on a Data Pump command (export or import): mode (easy), tables/schemas list, exclude/include values, and so on?

Version 10.2 (preferred) or 11.2, as you like.
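A partial sketch: while a job exists, its operation and mode are visible in the dictionary, and the rest of the supplied parameters (filters such as include/exclude among them) are recorded in the job's master table, which is named after the job and lives in the job owner's schema (names below are examples):

SQL> SELECT owner_name, job_name, operation, job_mode FROM dba_datapump_jobs;

SQL> SELECT * FROM myuser."SYS_EXPORT_SCHEMA_01";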


Server Utilities :: Can Data Pump Utility Do Imp While Exp

Oct 14, 2013

I have met some problems with the Data Pump tools. We have a large DB (Oracle 10.2.0.5, single instance on HP-UX 11.11, about 8TB of data) that we want to migrate to Linux IA 64-bit, 10.2.0.5 RAC. The migration window is around 20 hours, and it cannot be assigned more.

If the impdp and expdp work could start together, that would save a lot of time. Are there any ways to implement this, or any other ways to speed up the process while keeping the data intact?
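One option that effectively runs the unload and the load as a single step is a network-mode import, which pulls the data straight across a database link with no intermediate dump file. A sketch (the link name is an example):

impdp system/... NETWORK_LINK=hpux_source FULL=Y PARALLEL=8 LOGFILE=dp_dir:net_imp.log

For file-based jobs, PARALLEL together with a DUMPFILE %U template is the usual lever; network mode trades dump-file I/O for link bandwidth.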


Server Utilities :: Data Pump Backup

Feb 22, 2011

Why is Data Pump faster than normal exp? One answer I know is that Data Pump uses block mode while exp uses byte mode; is there any other major reason? Also, say I have a database of size 10GB and want to take a Data Pump backup, but the condition is that I can have dump files of only 2GB each. Is there any way to take a full backup of the database in parts?
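For the 2GB condition, a sketch using FILESIZE with the %U substitution variable, which rolls over to a new dump file whenever the limit is reached (names are examples):

expdp system/... DIRECTORY=dp_dir DUMPFILE=full_%U.dmp FILESIZE=2G FULL=Y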


Consistency Not Working In Audit Logs And Datafile?

Dec 14, 2012

We have audit records for a transaction in the audit files, yet we do not see any changes in the table that the transaction affects. We used point-in-time recovery and the flashback feature to look for the changes in the table. DML audit granularity is ACCESS. The transaction is a Java application transaction, and we use Hibernate.

How can this be possibble?


Server Utilities :: Query To Check Data Pump

Aug 26, 2011

Can we query the database to check whether Data Pump ran at the scheduled time, in case we don't have access to the operating system to see the log file?
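While a job is running (or stopped but not yet cleaned up) it is visible in the dictionary; a sketch:

SQL> SELECT owner_name, job_name, operation, state FROM dba_datapump_jobs;

Once a job completes, its master table is dropped and it disappears from this view, so for history you would need the log file or your own bookkeeping around the job.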


Server Utilities :: Data Pump Import Error

Sep 25, 2010

We have a QA database on a VM server with the Windows 2003 operating system and Oracle 10.2.0.1 installed, along with limited disk space. We received an expdp file from a client that is large enough (40GB) that we had to copy it to a network drive. I created a new directory object called IMPDMP with the directory path (using UNC pathing) to \\server\share\folder\subfolder (our network-mapped P: drive; yes, I included the backslashes, but I have tried without them also). I have also included the parfile here. I checked the grants and they seem to be fine:

SQL> select * from session_roles where role like '%DATABASE' or role like 'DBA';

ROLE
------------------------------
DBA
EXP_FULL_DATABASE
IMP_FULL_DATABASE

SQL> select * from session_privs where privilege like '%DICT%';

PRIVILEGE
----------------------------------------
SELECT ANY DICTIONARY
ANALYZE ANY DICTIONARY
[code]....

My questions are this:

1) In interactive mode, does a dummy file expdat.dmp have to exist in the DATA_PUMP_DIR directory?

2) Does my export have to reside in the DATA_PUMP_DIR directory? (Again, there is no disk space to handle the DMP file.) One of the hard drives is just big enough to hold it, but since it also holds datafiles, the import would crash when they tried to extend.
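On question 2: no, the dump can live in any directory object the importing user can read. A sketch of pointing one at the UNC path (the path is an example; on Windows it is the Oracle service account, not the logged-in user, that must have rights on the share):

SQL> CREATE OR REPLACE DIRECTORY impdmp AS '\\server\share\folder\subfolder';
SQL> GRANT READ, WRITE ON DIRECTORY impdmp TO system;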


Enterprise Manager :: Cannot Fully Import Data Pump

Aug 12, 2010

I have an 11g Data Pump export supplied by another party. I am on Windows 7 (x64). I have experience with other databases, but not Oracle, and the complexity of it all is a bit overwhelming.

I downloaded and installed [URL]. I used the Database Configuration Assistant to create a database:

Template: Data Warehouse
Name/SID: database0
Password: password0

I then used the 'database0' Enterprise Manager:

Logged in as SYSTEM/password0 (Normal)
Import from Export Files
Entire Files
Host Credentials: myself (am Windows administrator)
All the rest defaults

The job appears to finish successfully. When I look at the schema (using RazorSQL), most tables seem to be there; however, a significant number are not. When I open the dump file in a text editor, the missing tables are clearly there, definitions and data. When I look in import.log, there are errors of this type:

error in creating database file '/db02/oradata/database0/stuff.dbf'
file create error, unable to create file
unable to open file
(OS 3) The system cannot find the path specified.
Failing sql is:
CREATE TABLESPACE "STUFF" DATAFILE '/db01/oradata/database0/stuff.dbf'

-- followed by the associated table creation errors.

So, does this mean that Unix paths are hardcoded into the dump, making it incompatible with import into a Windows-based system? Or are the paths symbolic, internal representations used by Oracle, and these errors a symptom of an earlier, undisclosed problem?

The thing is, when I view the schema, the tablespace "STUFF" exists, just none of its tables.
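The failing paths are carried in the dump simply as the text of the original CREATE TABLESPACE statements, and impdp can rewrite them. A sketch using the path from the log above (the Windows target path is an example; the quoting may need adjusting for the Windows shell, or the whole thing can go in a parameter file):

impdp SYSTEM/password0 DIRECTORY=dp_dir DUMPFILE=export.dmp FULL=Y REMAP_DATAFILE="'/db01/oradata/database0/stuff.dbf':'C:\oradata\database0\stuff.dbf'"

Each failing Unix path would need its own REMAP_DATAFILE mapping.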


Server Utilities :: Data Pump - How To Take The Consistent Backup

Jun 22, 2010

In Oracle 10g Data Pump (logical backup), how do we take a consistent backup? What parameter can we use?


Server Utilities :: Which Client Need To Use Data Pump Tools

Feb 13, 2012

I'm installing a new application-testing server. I have installed the 11g R2 Instant Client and the SQL*Plus client.

When I try to run an expdp command, I get this:

'expdp' is not recognized as an internal or external command

Now, I understand this is because I don't have the bin directory of a client installation in my OS PATH. My question is: which client exactly do I need in order to use the Data Pump utility, and where do I download it?

I've found lots of posts throughout the web from people who had issues with defining $ORA_HOME in $PATH, or who had a client incompatibility issue, but no answer to my specific question.


Server Utilities :: Data Pump Error ORA-31693

Jan 15, 2012

I'm getting the following message:

ORA-31693: Table data object "033"."EZMILRIKUZ" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option

My backup syntax is:
033/******@INTORCL DIRECTORY=exp_dir DUMPFILE=033.dmp LOGFILE=033.LOG FULL=N REUSE_DUMPFILES=Y FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

I thought there was a problem with the table, so I created a new one, and now I'm getting the same error on a different table (the third one in the list):
. . exported "033"."PRQ" 192.9 KB 479 rows
. . exported "033"."EZMIL" 558.8 KB 1229 rows
ORA-31693: Table data object "033"."MIL" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option

When I take off the FLASHBACK_TIME parameter it works fine, but I need this parameter.
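A hedged suggestion: ORA-00922 with FLASHBACK_TIME is often the shell or command interpreter mangling the nested quotes, and moving the value into a parameter file side-steps all escaping. A sketch:

# exp033.par
DIRECTORY=exp_dir
DUMPFILE=033.dmp
LOGFILE=033.LOG
REUSE_DUMPFILES=Y
FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

expdp 033/******@INTORCL PARFILE=exp033.par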


Server Utilities :: Data-pump Imp Full With Network_link?

Aug 25, 2011

Connected on Windows 2008 R2 with Oracle 11g R2, I want to execute an import that will do a full import from a Linux box with Oracle 10g called SUPORTE1.

I'm trying this on the Windows 2008 machine:

impdp system/manager@w2k811r2 full=y DIRECTORY=dpump NETWORK_LINK=SUPORTE1;

and I get the following errors:

ORA-39001: valor de argumento inválido (invalid argument value)
ORA-39200: O nome do link "SUPORTE1;" é inválido (the link name is invalid)
ORA-44004: nome SQL qualificado inválido (invalid qualified SQL name)

I have tested the connection and the DB link, and I created the directory.
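One observation from ORA-39200 above: the rejected link name is "SUPORTE1;" including the semicolon. impdp is an operating-system command, not SQL, so a trailing ';' becomes part of the parameter value. The sketch below simply drops it:

impdp system/manager@w2k811r2 full=y DIRECTORY=dpump NETWORK_LINK=SUPORTE1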


Server Utilities :: Data Pump Expdp To ASM Daily?

Jun 16, 2011

I succeeded in running expdp to an ASM diskgroup, as follows:

create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log

SUCCESS MESSAGE

. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59

Now I would like to run this daily and delete dumps older than 14 days, but the script below shows an error. What can I do to make it run?

#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History

[code]...
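Since the script body is elided above, here is only a generic sketch of the usual shape of such a job (every path, SID, and credential is an example; note that a fixed dump file name makes the second run fail unless REUSE_DUMPFILES=Y or a date stamp is used):

#!/bin/bash
# daily_expdp.sh - sketch of a daily Data Pump export to an ASM diskgroup
export ORACLE_HOME=/u01/app/oracle/product/11.2.0/db_1
export ORACLE_SID=FILTDB
export PATH=$ORACLE_HOME/bin:$PATH

STAMP=$(date +%Y%m%d)
expdp oraasfs/oraasfs2301 directory=asmexpdir \
      dumpfile=SBSR_EXP_${STAMP}.dmp tables=TM_SFS_CUST_01 \
      logfile=EXPDP_LOG:SBSR_EXP_${STAMP}.log

# ASM files are not visible to find(1); old dumps have to be removed
# through asmcmd instead, e.g.:
#   asmcmd rm +RECO/FILTDB/EXPDP/SBSR_EXP_20110601.dmp

A crontab entry such as 0 1 * * * /home/oracle/scripts/daily_expdp.sh would schedule it daily at 01:00.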







