Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files
Sep 24, 2010
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether exporting directly to tape is possible and, if so, whether the data would still be accessible later if needed.
View 4 Replies
Sep 2, 2013
I am trying to export a partition of a table and import it to another database. I get the below error when I try to import.
ORA-14400: inserted partition key does not map to any partition
If I export the table (for that particular partition) and import the table (after dropping it) at the destination, the partitions and subpartitions are created without any problem.
The table is range partitioned and subpartitioned by list, so I had to perform the operations below to retain the other data in the destination table.
1. Drop the existing partition
2. Create the partition and sub partition, same as source
3. Execute imp
In fact I had to perform step #2 because, even if I split the partition, the subpartitions get replicated in the new partition, which again throws the same error. Is there a better way of managing the partitions and subpartitions in the destination with the exp/imp utilities, so that I need not perform steps #1 and #2 manually?
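For reference, a sketch of steps 1-3 as described above; the table, partition, and subpartition names here are hypothetical:

ALTER TABLE sales DROP PARTITION p_2013_q1;
ALTER TABLE sales ADD PARTITION p_2013_q1
  VALUES LESS THAN (TO_DATE('2013-04-01','YYYY-MM-DD'))
  (SUBPARTITION p_2013_q1_east VALUES ('EAST'),
   SUBPARTITION p_2013_q1_west VALUES ('WEST'));
imp user/pass@dest FILE=part.dmp TABLES=sales:p_2013_q1 IGNORE=Y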
View 11 Replies
Apr 1, 2011
I have database A (working in a live environment) and database B, a copy of A (not live). Last week I restored a whole RMAN backup of database A onto database B. Now I don't want to change anything in any schema; I only want to import updated and new table records into database B.
There are around 20 schemas. Database B already has all the required database objects - procedures, functions, packages, indexes on all tables, and the table data; I just want to add the new and updated data.
If I run the following in the source database:
expdp directory=dpump_dir dumpfile=table_data.dmp content=data_only schemas=ACCMAIN,HRMAIN,..... include=TABLE
and then import into destination database B, will it add new data and update the existing data in the tables, without touching the table structures and indexes?
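For illustration, the import side might look like the sketch below. Note one assumption worth checking: a CONTENT=DATA_ONLY import can only append rows (via TABLE_EXISTS_ACTION=APPEND) or replace them (TRUNCATE); it does not merge or update existing rows in place.

impdp system/pass@dbB directory=dpump_dir dumpfile=table_data.dmp content=data_only table_exists_action=append schemas=ACCMAIN,HRMAIN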
View 5 Replies
Jul 25, 2011
I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with fixed record length. There will be about 6 files, for a total of about 10GB.
How can I export those views in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
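If spool turns out to be the way, a minimal SQL*Plus sketch for one fixed-length file (the view and column names are made up):

SET PAGESIZE 0 LINESIZE 300 TRIMSPOOL ON FEEDBACK OFF TERMOUT OFF
SPOOL /tmp/extract_01.txt
SELECT RPAD(cust_name, 40) || RPAD(cust_city, 30) || LPAD(balance, 12, '0') FROM big_view;
SPOOL OFF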
View 1 Replies
Nov 9, 2010
I would like to know if a dump file taken from 10g could be imported into an Oracle 8i database.
View 1 Replies
Sep 22, 2010
I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).
Another 10 schemas I need empty, with the exception of a table in some of them which needs to be exported/imported with all the data inside. Is it possible to do this with the Data Pump utilities (expdp, impdp)?
Afterwards I will be running some scripts to populate the DB instance with critical data / metadata.
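One hedged way to cut this up is three separate expdp runs (all schema, table, directory, and file names below are placeholders):

expdp system/pass schemas=FULL_SCHEMA directory=dp dumpfile=full_schema.dmp
expdp system/pass schemas=S1,S2,S3 content=metadata_only directory=dp dumpfile=empty_schemas.dmp
expdp system/pass tables=S4.KEEP_DATA directory=dp dumpfile=keep_data.dmp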
View 1 Replies
Sep 2, 2011
How can I get (or is there a script to get), in PL/SQL, the parameters that have been given on a Data Pump command (export or import): mode (easy), tables/schemas list, exclude/include values, and so on?
10.2 preferred, but 11.2 answers are fine too.
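A hedged PL/SQL sketch for a running job, assuming the documented DBMS_DATAPUMP.ATTACH/GET_STATUS interface and the params collection on the returned job description; the job name and owner are examples:

DECLARE
  h  NUMBER;
  js VARCHAR2(30);
  st sys.ku$_Status;
BEGIN
  h := DBMS_DATAPUMP.ATTACH('SYS_EXPORT_SCHEMA_01', 'SCOTT');   -- example job/owner
  DBMS_DATAPUMP.GET_STATUS(h, DBMS_DATAPUMP.KU$_STATUS_JOB_DESC, NULL, js, st);
  FOR i IN 1 .. st.job_description.params.COUNT LOOP            -- one row per parameter
    DBMS_OUTPUT.PUT_LINE(st.job_description.params(i).param_name || ' = ' ||
                         st.job_description.params(i).param_value_t);
  END LOOP;
  DBMS_DATAPUMP.DETACH(h);
END;
/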
View 2 Replies
Oct 14, 2013
I've met some problems with the Data Pump tools. We have a large DB (Oracle 10.2.0.5, single instance on HP-UX 11.11, about 8TB of data) that we want to migrate to 10.2.0.5 RAC on Linux Itanium 64-bit. The migration window is around 20 hours, and the window can NOT be assigned more hours.
We figure that if the impdp and expdp work could start together, that would save a lot of time. So I wonder if there are any ways to implement this, or any other ways to speed things up while keeping the data intact?
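One option that effectively runs the export and the import as a single step is a network-mode import, which pulls the data over a database link without ever writing a dump file; in this sketch the link, schema, and directory names are placeholders:

impdp system/pass@linuxrac network_link=hpux_src schemas=APP1,APP2 parallel=8 logfile=dp_dir:migrate.log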
View 4 Replies
Feb 22, 2011
Why is Data Pump faster than normal exp? One answer I know is that Data Pump uses block mode while exp uses byte mode; is there any other major reason? Also, say I have a database of size 10GB and want to take a Data Pump backup, but the condition is that I can only take dump files of size 2GB each. Is there any way to take a full backup of the database in parts?
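On the 2GB condition, a hedged sketch using FILESIZE together with the %U substitution variable, which makes expdp roll over to a new dump piece whenever the limit is reached:

expdp system/pass full=y directory=dp dumpfile=full_%U.dmp filesize=2G logfile=full_exp.log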
View 1 Replies
Dec 9, 2010
I want to learn about DATA PUMP in Oracle 10g: how to use it, and where the Data Pump concept should be used.
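As a starting point, a minimal schema export and re-import pair (the user, directory, and file names are placeholders):

expdp scott/tiger directory=dp dumpfile=scott.dmp logfile=scott_exp.log schemas=scott
impdp scott/tiger directory=dp dumpfile=scott.dmp logfile=scott_imp.log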
View 3 Replies
Aug 14, 2012
I am getting the following errors while importing data.
SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-06550: line 2, column 1:
PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
ORA-06550: line 2, column 1:
[code]......
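Since CTXSYS.DRIIMP belongs to Oracle Text, one hedged first check is whether Oracle Text is installed and valid in the target database:

SELECT comp_name, version, status FROM dba_registry WHERE comp_id = 'CONTEXT';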
View 6 Replies
Jul 22, 2012
OS: RHEL
DB: 11.2.0.2
Every time I try to refresh my production DB from an old expdp dump file using Data Pump, I face issues with grants and the creation of synonyms. My DB has three schemas which have lots of dependencies among them, and before refreshing them I drop the schemas and recreate them:
Drop user user_name cascade;
So I want to know: is there a script with which I can get all the grants of the DB before dropping the schemas, so that after the import I can grant the same again, and also a query with which I can get all the synonyms of the DB?
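A hedged sketch for capturing both beforehand, using DBMS_METADATA for the grants made to each schema (grants on the schemas' own objects can be pulled from DBA_TAB_PRIVS in a similar way) and DBA_SYNONYMS for the synonyms; the schema names are placeholders:

SET LONG 1000000 PAGESIZE 0 LINESIZE 200
SELECT DBMS_METADATA.GET_GRANTED_DDL('OBJECT_GRANT', 'SCHEMA1') FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', 'SCHEMA1') FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('ROLE_GRANT', 'SCHEMA1') FROM dual;
SELECT owner, synonym_name, table_owner, table_name
  FROM dba_synonyms
 WHERE table_owner IN ('SCHEMA1', 'SCHEMA2', 'SCHEMA3');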
View 8 Replies
Aug 26, 2011
Can we query the database to check whether a Data Pump job ran at its scheduled time, in case we don't have access to the operating system to see the log file?
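A hedged sketch: DBA_DATAPUMP_JOBS shows only running or stopped jobs, while V$SESSION_LONGOPS keeps rows for recently completed Data Pump operations until they age out of the cache:

SELECT owner_name, job_name, operation, state FROM dba_datapump_jobs;
SELECT sid, opname, start_time, sofar, totalwork
  FROM v$session_longops
 WHERE opname LIKE '%EXPORT%' OR opname LIKE '%IMPORT%';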
View 1 Replies
Sep 25, 2010
We have a QA database on a VM server with Windows 2003 and Oracle 10.2.0.1 installed, along with limited disk space. We received an expdp file from a client that is large enough (40GB) that we had to copy it to a network drive. I created a new directory called IMPDMP whose directory path (using UNC pathing) points to \\server\share\folder\subfolder (our network-mapped P: drive; yes, I included the backslashes, but I have also tried without them). I also included the parfile here. I checked the grants and they seem to be fine:
SQL> select * from session_roles where role like '%DATABASE' or role like 'DBA';
ROLE
------------------------------
DBA
EXP_FULL_DATABASE
IMP_FULL_DATABASE
SQL> select * from session_privs where privilege like '%DICT%';
PRIVILEGE
----------------------------------------
SELECT ANY DICTIONARY
ANALYZE ANY DICTIONARY
[code]....
My questions are these:
1) In interactive mode, does a dummy file expdat.dmp have to exist in the DATA_PUMP_DIR directory?
2) Does my export dump have to reside in the DATA_PUMP_DIR directory? (Again, there is no disk space to handle the DMP file; one of the hard drives is just big enough for it, but since it also holds datafiles, the import would crash when trying to extend.)
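A sketch of the setup being attempted (the UNC path is illustrative). One assumption worth stating: the Windows account that runs the Oracle service must itself be able to read the share, since mapped drive letters belong to the interactive logon session:

CREATE OR REPLACE DIRECTORY impdmp AS '\\server\share\folder\subfolder';
GRANT READ, WRITE ON DIRECTORY impdmp TO system;
impdp system/pass directory=IMPDMP dumpfile=client.dmp logfile=IMPDMP:client_imp.log full=y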
View 3 Replies
Jun 22, 2010
In Oracle 10g Data Pump (logical backup), how do we take a consistent backup? Which parameter can we use?
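A hedged sketch using FLASHBACK_TIME, which makes the export consistent as of the given point in time (FLASHBACK_SCN does the same with an SCN); a parfile keeps the quotes intact:

# consistent.par
directory=dp
dumpfile=full.dmp
full=y
flashback_time="TO_TIMESTAMP('2010-06-22 09:00:00','YYYY-MM-DD HH24:MI:SS')"

expdp system/pass parfile=consistent.par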
View 2 Replies
Feb 13, 2012
I'm setting up a new application-testing server; I have installed the 11gR2 Instant Client and the SQL*Plus client.
When I try to run an expdp command, I get this:
'expdp' is not recognized as an internal or external command
Now, I understand this is because I don't have the bin directory of a client installation in my OS PATH. My question is: which client exactly do I need in order to use the Data Pump utilities, and where can I download it?
I've found lots of posts throughout the web from people who had issues with defining $ORACLE_HOME/bin in the $PATH, or who had a client incompatibility issue, but no answer to my specific question.
View 4 Replies
Jan 15, 2012
I'm getting the following message:
ORA-31693: Table data object "033"."EZMILRIKUZ" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option
my backup syntax is:
033/******@INTORCL DIRECTORY=exp_dir DUMPFILE=033.dmp LOGFILE=033.LOG FULL=N REUSE_DUMPFILES=Y FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"
I thought there was a problem with the table, so I created a new one, and now I'm getting the same error on a different table (the third one in the list):
. . exported "033"."PRQ" 192.9 KB 479 rows
. . exported "033"."EZMIL" 558.8 KB 1229 rows
ORA-31693: Table data object "033"."MIL" failed to load/unload and is being skipped due to error:
ORA-00922: missing or invalid option
When I take off the FLASHBACK_TIME parameter it works fine, but I need this parameter.
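One hedged workaround is to move the parameters into a parfile, which keeps the OS shell from stripping the quotes inside FLASHBACK_TIME:

# exp033.par
directory=exp_dir
dumpfile=033.dmp
logfile=033.LOG
reuse_dumpfiles=y
flashback_time="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"

expdp 033/******@INTORCL parfile=exp033.par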
View 4 Replies
Aug 25, 2011
Connected to Windows 2008 R2 with Oracle 11gR2, I want to execute an import that will do a full import from a Linux server running Oracle 10g, reached over a database link called "SUPORTE1".
I'm trying this on the Windows 2008 machine:
impdp system/manager@w2k811r2 full=y DIRECTORY=dpump NETWORK_LINK=SUPORTE1;
and I get the following errors (originally in Portuguese):
ORA-39001: invalid argument value
ORA-39200: Link name "SUPORTE1;" is invalid
ORA-44004: invalid qualified SQL name
I tested the connection, db-link and created the directory.
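Note that ORA-39200 quotes the link name as "SUPORTE1;", which suggests the trailing semicolon on the command line was taken as part of the name; a sketch without it:

impdp system/manager@w2k811r2 full=y DIRECTORY=dpump NETWORK_LINK=SUPORTE1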
View 3 Replies
Jun 16, 2011
I succeeded in running expdp to an ASM diskgroup, like this:
create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log
SUCCESS MESSAGE
. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59
I would like to run this daily and delete dumps after 14 days, but the script below shows an error. What can the solution be to get this script running?
#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History
[code]...
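For comparison, a hedged sketch of such a script; it assumes a Linux host and that asmcmd is available to this user, and it date-stamps each dump so the copy taken 14 days ago can be removed by name (expdp will not overwrite an existing dump file, so unique names also avoid that failure):

#!/bin/bash
# Daily Data Pump export to ASM, keeping 14 days of dumps (sketch)
TODAY=$(date +%Y%m%d)
OLD=$(date -d '14 days ago' +%Y%m%d)
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP_${TODAY}.dmp \
      tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP_${TODAY}.log
asmcmd rm "+RECO/FILTDB/EXPDP/SBSR_EXP_${OLD}.dmp" 2>/dev/null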
View 9 Replies
Jun 17, 2010
While using impdp I had set PARALLEL to 16. Currently the parallelism is 16, but there are only two workers, each with worker parallelism 1.
1st worker is executing and 2nd worker is waiting.
Is my impdp running in parallel? What does worker parallelism mean?
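To watch the workers directly, one can attach to the running job and ask for its status (the job name here is an example; the actual name is listed in DBA_DATAPUMP_JOBS):

impdp system/pass attach=SYS_IMPORT_FULL_01
Import> status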
View 1 Replies
Jun 29, 2010
I have received a dump which I need to load into a newly created schema. There is a particular table with more than 4 million rows, while the other tables have hardly a few thousand rows each.
I want to import it in such a way that only 1000 rows get imported for this table and the other tables are not affected. Is there a way to do it?
Note: there are more than 200 tables in the dump.
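If it is a Data Pump dump, a hedged sketch using the import-side QUERY parameter (the table, directory, and file names are placeholders; a parfile avoids shell quoting issues):

# import.par
directory=dp
dumpfile=full.dmp
query=big_table:"WHERE ROWNUM <= 1000"

impdp newuser/pass parfile=import.par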
View 7 Replies
May 10, 2010
I took an export of one table (the export completed successfully, without warnings). When I import it into the production database, the data in the table does not come across. I have pasted the table structure, the import command, and the log file for the import below.
import command
imp user/user@database file=C:Tempuserdrop_id10g _id_rac_ra_headertables.dmp log=C:Tempuserdrop_id10g _id_rac_ra_headertablesimp.log tables=t_id_rac_ra_header fromuser=user touser=user statistics=none ignore=yes
My table structure in the production database is:
CREATE TABLE t_id_rac_ra_header
(ra_company VARCHAR2(10) NOT NULL,
ra_key NUMBER NOT NULL,
ra_doc_type VARCHAR2(50) NOT NULL,
ra_doc_number VARCHAR2(25) NOT NULL,
ra_doc_date DATE DEFAULT SYSDATE NOT NULL,
ra_reserve_key NUMBER,
[code].....
View 4 Replies
Sep 16, 2011
I have a rather complicated process to import text files into my DB. I'm given thousands of files every day, separated by "," and with 80 fields each. With a bash script I take the 45 fields I need and then split each file into x number of files, grouping the rows by three fields. Then I use SQL*Loader to insert them into the DB.
The problem is that now I must insert into two tables, and the WHEN clause doesn't allow the use of > and <.
To make things a little clearer, take this text file (already split, grouped, and ready to be inserted):
...
1,1,135,1900,0,12,114,2011/08/25 17:19:00,135,...
1,1,135,1900,0,13,119,2011/08/25 17:19:00,136,...
1,1,135,1900,0,14,117,2011/08/25 17:19:00,137,...
1,1,135,1900,0,15,113,2011/08/25 17:19:00,138,...
1,1,135,1900,0,16,119,2011/08/25 17:19:00,139,...
...
When field 6 is higher than or equal to 14, the row must go to table A. When field 6 is lower than 14, it must go to table B. I can't use external tables, as I'm on a different server.
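Since the files are already preprocessed in bash, one hedged option is to do the range test there and give each table its own data file (the file and control-file names are placeholders; $6 is field 6 in the sample above):

awk -F',' '$6 >= 14' input.dat > table_a.dat
awk -F',' '$6 <  14' input.dat > table_b.dat
sqlldr user/pass control=table_a.ctl data=table_a.dat
sqlldr user/pass control=table_b.ctl data=table_b.dat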
View 1 Replies
Dec 12, 2011
There are multiple directories created in the server for Data Pump. Which one should be used for a Data Pump export?
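To see what exists and where each one points (DATA_PUMP_DIR is the default a privileged user gets when no DIRECTORY parameter is given):

SELECT directory_name, directory_path FROM dba_directories ORDER BY directory_name;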
View 6 Replies
Dec 24, 2010
If I export data using the command below, it shows the error:
>expdp test1/test1 DIRECTORY=datapump DUMPFILE=expfull.dmp query=auth_test:"where TXNREQDTTIME<'20-MAY-10'" tables=auth_test
bash-3.00$ expdp test1/test1 DIRECTORY=datapump DUMPFILE=expfull-3.dmp query=auth_test:"where TXNREQDTTIME<'20-MAY-10'" tables=auth_test
Export: Release 10.2.0.1.0 - Production on Saturday, 25 December, 2010 5:10:06
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "TEST1"."SYS_EXPORT_TABLE_01": test1/******** DIRECTORY=datapump DUMPFILE=expfull-3.dmp query=auth_test:"where TXNREQDTTIME<20-MAY-10" tables=auth_test
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
[code]....
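The "Starting" line in the log shows the quotes around the date literal were stripped by the shell. A hedged workaround is a parfile, where the quoting survives (an explicit TO_DATE is added here for safety; it is not in the original command):

# exp_query.par
directory=datapump
dumpfile=expfull.dmp
tables=auth_test
query=auth_test:"WHERE TXNREQDTTIME < TO_DATE('20-MAY-10','DD-MON-RR')"

expdp test1/test1 parfile=exp_query.par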
View 4 Replies
Sep 13, 2011
I need to use Data Pump for the first time on my production database. Currently, on the testing database, when I take a schema-level export there are no errors or warnings in the log file, but when I import it, the import log file shows the following ORA warning. I searched Google, and the only way I found is to recompile the invalid objects. How can I avoid these warnings in the log file?
"ORA-39082: Object type ALTER_PROCEDURE:"QUANTISV4"."P_CTM_ABN_INVST_EQUITY" created with compilation warnings"
View 4 Replies
Mar 11, 2013
IMPORT PARTITION TABLE Through Data Pump.
I have a table with a RANGE PARTITION. I wanted to import it into another server with the same partitions.
But when I imported the table, the table was created with the partitions, yet the data does not appear to have been inserted partition-wise.
However, I can see the entire table's row count.
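To check whether the rows really did land in the expected partitions on the target, per-partition counts can be compared on both sides (the table and partition names are placeholders):

SELECT COUNT(*) FROM my_range_tab PARTITION (p_2013_q1);
SELECT table_name, partition_name, high_value FROM user_tab_partitions WHERE table_name = 'MY_RANGE_TAB';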
View 5 Replies
Jul 26, 2010
I am trying to import a database dump using the following command:
impdp system/xxxx@xxxx schemas=staging
remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaing.log
TRANSFORM=SEGMENT_ATTRIBUTES:n
It imports data fine up to some stage; after that, Oracle gives the following error:
Processing object type SCHEMA_EXPORT/JAVA_SOURCE/JAVA_SOURCE
ORA-39097: Data Pump job encountered unexpected error -1423
ORA-39065: unexpected master process exception in DISPATCH
ORA-01423: error encountered while checking for extra rows in exact fetch
ORA-04030: out of process memory when trying to allocate 123404 bytes (QERHJ has
h-joi,kllcqas:kllsltba)
ORA-39014: One or more workers have prematurely exited.
Job "SYSTEM"."SYS_IMPORT_SCHEMA_04" stopped due to fatal error at 11:42:03
I thought it was due to lack of memory, so I increased pga_aggregate_target from 512MB to 600MB, but I am still getting the same error.
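Two hedged observations: ORA-04030 is an operating-system process memory failure, and PGA_AGGREGATE_TARGET is only a target, so a small increase may not change anything. Since the job died while processing java sources, one workaround sketch is to import everything else first and handle the java sources in a separate run (EXCLUDE/INCLUDE=JAVA_SOURCE is assumed valid here, given the object path in the log):

impdp system/xxxx@xxxx schemas=staging remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=imp_nojava.log TRANSFORM=SEGMENT_ATTRIBUTES:n exclude=JAVA_SOURCE
impdp system/xxxx@xxxx schemas=staging remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=imp_java.log include=JAVA_SOURCE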
View 5 Replies
May 18, 2011
I did a Data Pump export and import from one schema to a new schema in the same database. I had to use a different tablespace. I used the following parameters in the parfiles:
export parfile
directory
dumpfile
logfile
parallel
import parfile
directory
dumpfile
logfile
parallel
remap_schema
remap_tablespace
Tell me whether I need to use different parameters than the ones I used. Can I use both remap_schema and remap_tablespace at the same time?
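For reference, REMAP_SCHEMA and REMAP_TABLESPACE can be given together in a single import run; a sketch of such a parfile, with placeholder names:

# imp_remap.par
directory=dp
dumpfile=src.dmp
logfile=imp_remap.log
parallel=4
remap_schema=old_owner:new_owner
remap_tablespace=old_ts:new_ts

impdp system/pass parfile=imp_remap.par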
View 1 Replies
Feb 15, 2007
When I ran the Data Pump import command, I got the error below:
Import: Release 10.1.0.2.0 - Production on Thursday, 15 February, 2007 3:54
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_SQL_FILE_FULL_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_SQL_FILE_FULL_02": system/******** directory=data_pump dumpfile=prashant_dp.
dmp SQLFILE=prashant_imp.sql logfile=prashant_imp.log
[code]...
View 8 Replies