Server Utilities :: Which Client Is Needed To Use The Data Pump Tools

Feb 13, 2012

I'm installing a new application-testing server; I have installed the 11g R2 Instant Client and the SQL*Plus client.

When I try to run an expdp command, I get this:

'expdp' is not recognized as an internal or external command

Now, I understand this is because I don't have the bin directory of a client installation in my OS PATH. My question is: which client exactly do I need for the Data Pump utilities, and where can I download it?

I've found lots of posts on the web from people who had issues defining $ORACLE_HOME in $PATH, or who had client-incompatibility problems, but no answer to my specific question.
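
The expdp and impdp binaries are not part of the Instant Client; they ship with the full Oracle Client (the "Administrator" install type) or with any database server installation. Since Data Pump always runs inside the server anyway, another option is to start the job from the existing SQL*Plus client through the DBMS_DATAPUMP API. A minimal sketch, assuming a SCOTT schema and the default DATA_PUMP_DIR directory object:

-- Server-side schema export started from any SQL*Plus session
-- (schema name and directory object are assumptions):
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.add_file(h, 'scott.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.add_file(h, 'scott.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
  DBMS_DATAPUMP.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.detach(h);
END;
/

Note that the dump file is written on the database server, in the path the directory object points to, not on the client machine.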




Client Tools :: Oracle Data Pump Utility Executable Must Be Specified

Feb 4, 2011

I faced the following problem while exporting tables using Data Pump in TOAD:

"Oracle Data Pump Utility executable must be specified."


Server Utilities :: Data Pump Job Parameters

Sep 2, 2011

How can I get, in PL/SQL (or with a script), the parameters that were given on a Data Pump command line (export or import): mode (easy), tables/schemas list, exclude/include values, and so on?

10.2 preferred, but 11.2 answers are welcome.
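
A reply-style sketch, heavily hedged: while the job exists (running, or stopped but not yet dropped), DBA_DATAPUMP_JOBS lists it, and the job's master table records the parameters it was started with. The master table's layout is undocumented and version-specific, so the second query is only a starting point, and the table name below is a hypothetical example:

-- Jobs currently known to the database:
SELECT owner_name, job_name, operation, job_mode, state
FROM   dba_datapump_jobs;

-- Peek into the job's master table (named after the job; undocumented layout):
SELECT object_name, value_t, value_n
FROM   system.sys_export_schema_01
WHERE  process_order < 0;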


Server Utilities :: Can Data Pump Utility Do Imp While Exp

Oct 14, 2013

I've met some problems with the Data Pump tools. We have a large database (Oracle 10.2.0.5, single instance on HP-UX 11.11, about 8 TB of data) that we want to migrate to 10.2.0.5 RAC on Linux IA64. The migration window is around 20 hours and cannot be assigned more hours.

If the impdp and expdp work could run together, that would save a lot of time. Is there any way to implement this, or any other way to speed things up while keeping the data intact?
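
One approach worth testing: impdp with NETWORK_LINK pulls the data straight out of the source instance over a database link and loads it into the target in the same pass, so no dump file is ever written and the "export" and "import" effectively run together. A sketch, run on the new Linux side; the link name and credentials are assumptions, and network mode has restrictions (LONG columns, for instance):

impdp system/****** FULL=y NETWORK_LINK=hpux_src PARALLEL=8 DIRECTORY=dp_dir LOGFILE=migrate.log

DIRECTORY is still required because the log file is written locally on the target.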


Server Utilities :: Data Pump Backup

Feb 22, 2011

Why is Data Pump faster than the original exp? One answer I know is that Data Pump uses block mode while exp uses byte mode; is there any other major reason? Also, say I have a database of size 10 GB and want to take a Data Pump backup, but the condition is that each dump file can be only 2 GB. Is there any way to take a full backup of the database in parts?
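
For the second part, expdp can split the dump into fixed-size pieces with FILESIZE plus the %U substitution variable, which numbers the files as they fill up. A sketch with assumed names:

expdp system/****** FULL=y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp FILESIZE=2G LOGFILE=full.log

Each of full_01.dmp, full_02.dmp, ... is capped at 2 GB, so a 10 GB database yields roughly five pieces, and impdp accepts the same DUMPFILE=full_%U.dmp wildcard when reading them back.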


Server Utilities :: Data Pump In Oracle 10g

Dec 9, 2010

Data Pump in Oracle 10g: I want to know how to use Data Pump, and in which situations the Data Pump concept applies.
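
In short: Data Pump (expdp/impdp) is the 10g replacement for exp/imp, used for logical backups, schema copies, and migrations. It runs inside the server and reads and writes files only through directory objects. A minimal first run, with all names being assumptions:

SQL> CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpump';
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;

$ expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp LOGFILE=exp.log SCHEMAS=scott
$ impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp LOGFILE=imp.log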


Server Utilities :: Transferring Changed Data From Database A To B By Data Pump?

Apr 1, 2011

I have database A (working in a live environment) and database B, a copy of A (not live). Last week I restored a whole RMAN backup of database A onto database B. Now I don't want to change anything in any schema; I only want to import the updated and new records into the tables in database B.

There are around 20 schemas. Database B already has all the required database objects (procedures, functions, packages, indexes on all tables) and the data in the tables; I just want to add the new data and the updated data.

If I do the following on the source database:

expdp directory=dpump_dir dumpfile=table_data.dmp content=data_only schemas=ACCMAIN,HRMAIN,..... include=TABLE

and import it into destination database B, will it add the new data and update the existing rows in the tables without touching the table structure and indexes?
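
A hedged note on the import side: with CONTENT=DATA_ONLY the table structure and indexes are untouched, but what happens to the rows depends on TABLE_EXISTS_ACTION. APPEND (the default here) only adds rows, so updated source rows arrive as duplicates or key violations rather than as updates; a true "new plus changed" refresh needs TRUNCATE (reload the whole table) or a different mechanism entirely. A sketch of the reload variant:

impdp system/****** DIRECTORY=dpump_dir DUMPFILE=table_data.dmp CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE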


Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. Is it possible to export to tape? If so, would the data be accessible if needed later?


Server Utilities :: Query To Check Data Pump

Aug 26, 2011

Can we run a query against the database to check whether Data Pump ran at its scheduled time, in case we don't have access to the operating system to see the log file?
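
While a job is running (or stopped but not yet cleaned up) it is visible in the dictionary, so a check that needs no OS access might look like this:

SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
FROM   dba_datapump_jobs;

Completed jobs disappear from this view, though, so for after-the-fact verification one common trick is to expose the log directory to the database as an external table and read the log from SQL.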


Server Utilities :: Data Pump Import Error

Sep 25, 2010

We have a QA database on a VM server with the Windows 2003 operating system and Oracle 10.2.0.1 installed, along with limited disk space. We received an expdp file from a client that is large enough (40 GB) that we had to copy it to a network drive. I created a new directory object called IMPDMP with a UNC directory path, \\server\share\folder\subfolder (our network-mapped P: drive; yes, I included the backslashes, but I have also tried without them). I have also included the parfile here. I checked the grants and they seem to be fine:

SQL> select * from session_roles where role like '%DATABASE' or role like 'DBA';

ROLE
------------------------------
DBA
EXP_FULL_DATABASE
IMP_FULL_DATABASE

SQL> select * from session_privs where privilege like '%DICT%';

PRIVILEGE
----------------------------------------
SELECT ANY DICTIONARY
ANALYZE ANY DICTIONARY
[code]....

My questions are these:

1) In interactive mode, does a dummy file expdat.dmp have to exist in the DATA_PUMP_DIR directory?

2) Does my export have to reside in the DATA_PUMP_DIR directory? Again, there is no disk space to handle the DMP file; one of the hard drives is just big enough, but since it also holds datafiles, the import would crash when trying to extend.


Server Utilities :: Data Pump - How To Take The Consistent Backup

Jun 22, 2010

Using Oracle 10g Data Pump (logical backup), how do we take a consistent backup? Which parameter can we use?
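
The usual answer is FLASHBACK_SCN (or FLASHBACK_TIME), which makes the whole export consistent as of a single SCN, much as CONSISTENT=Y did for the old exp. A sketch with assumed names; undo retention has to cover the run, or the export fails with ORA-01555:

SQL> SELECT current_scn FROM v$database;    -- say it returns 8843525

$ expdp system/****** FULL=y DIRECTORY=dp_dir DUMPFILE=full.dmp FLASHBACK_SCN=8843525 LOGFILE=full.log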


Server Utilities :: Data Pump Error ORA-31693

Jan 15, 2012

I'm getting the following message:

ORA-31693: Table data object "033"."EZMILRIKUZ" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option

My export syntax is:
033/******@INTORCL DIRECTORY=exp_dir DUMPFILE=033.dmp LOGFILE=033.LOG FULL=N REUSE_DUMPFILES=Y FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

I thought there was a problem with the table, so I created a new one, and now I'm getting the same error on a different table (the third one in the list):
. . exported "033"."PRQ" 192.9 KB 479 rows
. . exported "033"."EZMIL" 558.8 KB 1229 rows
ORA-31693: Table data object "033"."MIL" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option

When I take off the FLASHBACK_TIME parameter it works fine, but I need this parameter.
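
A hedged guess at the cause: passed on the command line, the quotes inside the FLASHBACK_TIME expression can be stripped or rearranged by the OS shell, so the server sees a broken expression (hence ORA-00922) somewhat unpredictably. A parameter file bypasses OS quoting entirely; a sketch, with the file name an assumption:

# 033.par
DIRECTORY=exp_dir
DUMPFILE=033.dmp
LOGFILE=033.LOG
REUSE_DUMPFILES=Y
FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

expdp 033/******@INTORCL PARFILE=033.par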


Server Utilities :: Data-pump Imp Full With Network_link?

Aug 25, 2011

I want to do the following: connected on Windows 2008 R2 with Oracle 11g R2, execute a full import from a Linux machine with Oracle 10g, over a database link called "SUPORTE1".

I'm trying this on the Windows 2008 machine:

impdp system/manager@w2k811r2 full=y DIRECTORY=dpump NETWORK_LINK=SUPORTE1;

and I get the following errors:

ORA-39001: invalid argument value
ORA-39200: Link name "SUPORTE1;" is invalid.
ORA-44004: invalid qualified SQL name

I tested the connection and the db link, and created the directory.


Server Utilities :: Data Pump Expdp To ASM Daily?

Jun 16, 2011

I succeeded in running expdp to an ASM disk group, like this:

create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log

SUCCESS MESSAGE

. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59

I'd like to run this daily and delete the dumps after 14 days, but the script shows an error. What could the solution be to run this script? (A sketch follows the excerpt below.)

#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History

[code]...
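
Since the script above is cut off, here is a minimal sketch of such a daily wrapper; every path, SID, and credential in it is an assumption. The key point is that files inside an ASM disk group cannot be deleted with rm; they have to be removed through asmcmd, run as the ASM software owner:

#!/bin/bash
export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1    # assumption
export ORACLE_SID=FILTDB                                  # assumption
export PATH=$ORACLE_HOME/bin:$PATH

TAG=$(date +%Y%m%d)
OLD=$(date -d "14 days ago" +%Y%m%d)     # GNU date syntax

expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP_${TAG}.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP_${TAG}.log

# Remove the dump taken 14 days ago from inside ASM:
export ORACLE_SID=+ASM                                    # assumption
asmcmd rm "+RECO/FILTDB/EXPDP/SBSR_EXP_${OLD}.dmp"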


Server Utilities :: Data Pump Worker Parallelism

Jun 17, 2010

While using impdp I set PARALLEL to 16. Currently the parallelism is 16, but there are only two workers, each with worker parallelism 1.

1st worker is executing and 2nd worker is waiting.

Is my impdp really running in parallel? What does worker parallelism mean?
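
A hedged explanation: PARALLEL sets the maximum number of workers, not a guarantee. Metadata is processed serially by a single worker, so the others wait during that phase, and table data can only be spread across as many workers as there are dump files to read. Worker parallelism, in turn, is the number of parallel-query slaves one worker uses on a single table. Exporting with a %U wildcard gives the import something to parallelize over; a sketch with assumed names:

expdp system/****** SCHEMAS=app DIRECTORY=dp_dir DUMPFILE=app_%U.dmp PARALLEL=16 LOGFILE=exp.log
impdp system/****** SCHEMAS=app DIRECTORY=dp_dir DUMPFILE=app_%U.dmp PARALLEL=16 LOGFILE=imp.log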


Server Utilities :: Multiple Directories / Which One To Use For Export Data Pump

Dec 12, 2011

There are multiple directory objects created on the server for Data Pump. Which one should be used for a Data Pump export?
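
Any directory object the exporting user can write to will do; Data Pump just resolves the name through the data dictionary, and DATA_PUMP_DIR is the default for privileged users. To see where each name points:

SELECT directory_name, directory_path
FROM   dba_directories
ORDER BY directory_name;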


Server Utilities :: Data Pump Export With Query And Date Value?

Dec 24, 2010

If I export data using the query below, it shows an error (a workaround sketch follows the log):

>expdp test1/test1 DIRECTORY=datapump DUMPFILE=expfull.dmp query=auth_test:"where TXNREQDTTIME<'20-MAY-10'" tables=auth_test

bash-3.00$ expdp test1/test1 DIRECTORY=datapump DUMPFILE=expfull-3.dmp query=auth_test:"where TXNREQDTTIME<'20-MAY-10'" tables=auth_test

Export: Release 10.2.0.1.0 - Production on Saturday, 25 December, 2010 5:10:06

Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "TEST1"."SYS_EXPORT_TABLE_01": test1/******** DIRECTORY=datapump DUMPFILE=expfull-3.dmp query=auth_test:"where TXNREQDTTIME<20-MAY-10" tables=auth_test
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB

[code]....
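
A hedged reading of that log: by the time the parameter reached the server, the quotes around '20-MAY-10' were gone (the Starting line shows TXNREQDTTIME<20-MAY-10), so the WHERE clause no longer contains a date literal. Moving QUERY into a parameter file avoids shell quoting, and an explicit TO_DATE avoids relying on NLS settings; a sketch, with the file name an assumption:

# exp_query.par
DIRECTORY=datapump
DUMPFILE=expfull-3.dmp
TABLES=auth_test
QUERY=auth_test:"WHERE txnreqdttime < TO_DATE('20-MAY-10','DD-MON-RR')"

expdp test1/test1 PARFILE=exp_query.par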


Server Utilities :: Use Data Pump For First Time On Production Database

Sep 13, 2011

I need to use Data Pump for the first time on my production database. Currently, on the testing database, when I take a schema-level export there are no errors or warnings in the log file, but when I import it, the import log file shows the following ORA warning. I searched Google, and the only fix I found is to recompile the invalid objects. How can I avoid these warnings in the log file?

"ORA-39082: Object type ALTER_PROCEDURE:"QUANTISV4"."P_CTM_ABN_INVST_EQUITY" created with compilation warnings"


Server Utilities :: Partition Table Import Through Data Pump?

Mar 11, 2013

I want to import a partitioned table through Data Pump.

I have a table with a RANGE partition and wanted to import it into another server with the same partitions. When I imported the table, it was created with the partitions, but the data does not appear to have been inserted partition-wise; I can only see the entire table's row count.


Server Utilities :: Oracle Data Pump Import Error

Jul 26, 2010

I am trying to import a database dump using the following command:

impdp system/xxxx@xxxx schemas=staging
remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaing.log
TRANSFORM=SEGMENT_ATTRIBUTES:n

It imports data fine up to some stage; after that Oracle gives the following error:

Processing object type SCHEMA_EXPORT/JAVA_SOURCE/JAVA_SOURCE
ORA-39097: Data Pump job encountered unexpected error -1423
ORA-39065: unexpected master process exception in DISPATCH
ORA-01423: error encountered while checking for extra rows in exact fetch
ORA-04030: out of process memory when trying to allocate 123404 bytes (QERHJ has
h-joi,kllcqas:kllsltba)

ORA-39014: One or more workers have prematurely exited.
Job "SYSTEM"."SYS_IMPORT_SCHEMA_04" stopped due to fatal error at 11:42:03

I thought it was due to lack of memory, so I increased pga_aggregate_target from 512 MB to 600 MB, but I am still getting the same error.


Server Utilities :: Missing Objects During Data Pump Import

May 18, 2011

I did a Data Pump export and import from one schema to a new schema in the same database. I had to use a different tablespace. I used the following parameters in the parfiles:

export parfile
directory
dumpfile
logfile
parallel

import parfile
directory
dumpfile
logfile
parallel
remap_schema
remap_tablespace

Do I need to use different parameters than the ones I used? Can I use both REMAP_SCHEMA and REMAP_TABLESPACE at the same time?
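
On the last question: yes, REMAP_SCHEMA and REMAP_TABLESPACE can be combined in a single import. A sketch of such a parfile, with every name an assumption:

# imp_remap.par
DIRECTORY=dp_dir
DUMPFILE=src_schema.dmp
LOGFILE=imp_remap.log
PARALLEL=4
REMAP_SCHEMA=olduser:newuser
REMAP_TABLESPACE=old_ts:new_ts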


Server Utilities :: Fatal Error In Data Pump Import?

Feb 15, 2007

When I ran the Data Pump import command, I got the error below:

Import: Release 10.1.0.2.0 - Production on Thursday, 15 February, 2007 3:54

Copyright (c) 2003, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_SQL_FILE_FULL_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_SQL_FILE_FULL_02": system/******** directory=data_pump dumpfile=prashant_dp.
dmp SQLFILE=prashant_imp.sql logfile=prashant_imp.log

[code]...


Server Utilities :: How To Setup Data Pump Resumable Error Timeout

Feb 4, 2011

Is there any way to set a Data Pump export timeout on a resumable error (as with the classic export)? The default timeout seems to be 2 hours (7200 s). I have tried setting the system parameter resumable_timeout to values from 0 to 60, but nothing changed.

I would like to script an export, but I just want the script to exit on errors like this one:

ORA-39171: Job is experiencing a resumable wait.
ORA-01691: unable to extend lob segment SYS.SYS_LOB0000145352C00039$$ by 128 in tablespace SYSTEM

As it is, the script has to wait two hours for expdp to time out.


Server Utilities :: Import Constraints Only From Dump File Using Oracle Data Pump?

Nov 16, 2011

How can I import only the constraints from a dump file using Oracle Data Pump?
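
A sketch of one way, with names assumed: restrict the import to constraint metadata with INCLUDE (REF_CONSTRAINT covers the foreign keys), or write the statements into a script with SQLFILE instead of executing them:

impdp system/****** DIRECTORY=dp_dir DUMPFILE=expfull.dmp INCLUDE=CONSTRAINT INCLUDE=REF_CONSTRAINT CONTENT=METADATA_ONLY LOGFILE=cons.log

impdp system/****** DIRECTORY=dp_dir DUMPFILE=expfull.dmp SQLFILE=constraints.sql INCLUDE=CONSTRAINT INCLUDE=REF_CONSTRAINT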


Server Utilities :: ORA-02374 / Conversion Error Loading Table (through Data-pump)

Jul 12, 2013

While importing a dump into the new database, errors occurred. Below are the errors:

ORA-02374: conversion error loading table "INS"."GENMST_FINANCIER_BRANCH"
ORA-12899: value too large for column TXT_IFSC_CODE (actual: 19, maximum: 15)
ORA-02372: data for row: TXT_IFSC_CODE : 0X'4644524C30303031353739A0A0A0A0'
[code]...

I would like to know, why such error occurred during the import.


Server Utilities :: Data Pump Error - ORA-39070 / Unable To Open The Log File

Nov 1, 2006

I'm getting an error when trying to use the new Data Pump Export/Import utility.

I am able to create a directory using SQL*Plus, and I get the "Directory created" message, but no directory actually gets created on the server.

SQL> CREATE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';

Directory created. But I don't see the directory created on the server.

Then on the server:

C:\Documents and Settings\Administrator>expdp ******/****** FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log
Export: Release 10.2.0.1.0 - Production on Wednesday, 01 November, 2006 1:51:55
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
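
A hedged diagnosis: CREATE DIRECTORY only records the path in the data dictionary; Oracle never creates the OS folder, and it validates the path only when a job first tries to use it, which is exactly where ORA-39070 and ORA-29283 show up. Create the folder yourself on the database server, make sure the account the Oracle service runs as can write to it, and re-point the directory object:

C:\> mkdir C:\Inetpub\datafile\datapump

SQL> CREATE OR REPLACE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
SQL> GRANT READ, WRITE ON DIRECTORY datapump TO system;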


Server Utilities :: Data Pump Import - How To Identify Dump File Tablespaces

Jul 5, 2012

I'm trying to import schemas from a dump file that came from a different environment.

What I have is:

1. dump file
2. log file of the export

I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails with a lot of ORA-00959: tablespace 'string' does not exist.

Now, I've read in OTN:

[URL]

that what you need to do in that case is to use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.

I don't see the name of the tablespace I'm getting the error for in the export log, and I don't know whether there are more tablespaces I will have to redirect with REMAP_TABLESPACE.

I don't want to run this three times, hit an error, find out from it which tablespace needs redirecting next, and only then start over...

How can I know, from the dump file and the log file, which tablespace names I need to remap to my own names? Or is the tablespace giving me the error the only one in the dump file?
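
One way to get the full list without trial and error: run the import in SQLFILE mode, which writes every DDL statement it would execute into a script without touching the database, then pull the tablespace names out of that script. A sketch with assumed names:

impdp system/****** DIRECTORY=dp_dir DUMPFILE=exp.dmp SQLFILE=ddl.sql FULL=Y

grep -o 'TABLESPACE "[^"]*"' ddl.sql | sort -u

Every name in that output that does not exist in the target database needs its own REMAP_TABLESPACE=old:new entry.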


Server Utilities :: Data Pump Import Order - Causing Constraint Violation

Nov 16, 2010

We are trying to import data into existing tables in a schema using Data Pump.

However, the foreign-key (child) tables are being imported first and then the master table data, thus violating the constraints.

Apparently the larger tables are imported first regardless of referential-integrity constraints, thus causing constraint violations (contrary to my understanding).

Is this normal behaviour during a Data Pump import?

Is it possible that the keys being sequence generated are causing this?

As I understand it, import commits after each table. In that case, can we defer the commits at the expense of large undo, set the constraints to deferrable, and try the import?
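
As far as I know this is normal: Data Pump does not order table data by referential dependencies (export tends to unload the largest tables first), and each table is committed separately, so data-only imports into pre-existing constrained tables run into exactly this. A common workaround is to disable the foreign keys around the import; a sketch with an assumed schema name:

SELECT 'ALTER TABLE '||owner||'.'||table_name||
       ' DISABLE CONSTRAINT '||constraint_name||';'
FROM   dba_constraints
WHERE  owner = 'APPOWNER' AND constraint_type = 'R';

Run the generated statements, run the import, then repeat with ENABLE (or ENABLE NOVALIDATE to skip rechecking the old rows).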


Server Utilities :: Import A Dump File Using Impdp Data Pump Utility On Oracle 10g

Feb 19, 2012

Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken using the traditional exp utility, and vice versa?


Client Tools :: How To Insert Data From SQL To Oracle Server

Dec 13, 2012

I have a problem: I have one SQL Server (2012 Express) already set up and one Oracle Database 10g server. Now I want to insert data from the SQL Server into the Oracle database through a linked server.

Some steps I have already done:
1. Set up the Oracle 10g database and configured the listener (finished)
2. Set up SQL Server 2012 Express on Windows 7 (finished)
3. Installed ODTwithODAC1020221 on the PC where SQL Server is set up (finished)
4. Created a linked server from SQL Server to the Oracle database (finished); I can select data from the Oracle database in SQL Server through the linked server.

However, when I insert data from SQL Server into the Oracle server, it does not succeed.

select * from OPENQUERY (QVHKTEST, 'SELECT * FROM QVSYSTEM')

After I run the script above, the result is OK. Here "QVHKTEST" is the alias of the linked server from SQL Server to the Oracle server, and "QVSYSTEM" is the table in the Oracle database that we access through the linked server in SQL Server.

Both databases contain a table with the same name, QVSYSTEM.
-----------
INSERT OPENQUERY (QVHKTEST, 'SELECT BODY_NO,
MERCHANDISE,
MODEL_NAME,
LINE_NAME,
DATE_ENTRY
FROM QVSYSTEM')
values('VNF4619829','3227B002CA','L1068','01','2012/09/26 03:18:11');

If I run the script above directly in a SQL query window, it inserts OK. This is the code of the trigger on the table on the SQL Server side:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:<Author Name: Phuong Do Minh >
-- Create date: <Create Date: 10/12/2012>
-- Description:<Description: After data insert into table qvsystem on SQL server
-- This trigger will fire and insert that data into table qvsystem
[code].......

But when I create an AFTER INSERT trigger on the table in SQL Server to insert the data from SQL Server into the Oracle server, it does not succeed, and SQL Server raises the error below:

OLE DB provider "OraOLEDB.Oracle" for linked server "QVHKTEST" returned message "New transaction cannot enlist in the specified transaction coordinator. ".
Msg 7391, Level 16, State 2, Procedure Insert_data, Line 16

The operation could not be performed because OLE DB provider "OraOLEDB.Oracle" for linked server "QVHKTEST" was unable to begin a distributed transaction.

I don't know how to configure them.







