Server Utilities :: Want To Convert Exp Script To Expdp Script
Jul 27, 2011
I have an export script which performs a normal export of the full database. Now I want to convert the same script to expdp (Data Pump). What changes do I need to take care of? As per my knowledge I have to perform the following tasks (a sketch follows the list):
1. Create a directory
2. Change the command from exp to expdp, with the directory name mentioned.
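A minimal sketch of the conversion, assuming a dump location of /u01/dp and a full export as SYSTEM (all names illustrative):
-- one-time setup, run as a DBA
CREATE DIRECTORY dp_dir AS '/u01/dp';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;
# old script line
exp system/password FULL=y FILE=/u01/dp/full.dmp LOG=/u01/dp/full.log
# Data Pump equivalent -- FILE/LOG become DUMPFILE/LOGFILE, resolved relative to the directory object
expdp system/password FULL=y DIRECTORY=dp_dir DUMPFILE=full.dmp LOGFILE=full.log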
View 2 Replies
May 29, 2013
I have a doubt about expdp and RMAN. Does the expdp utility back up at the block level, the way RMAN does? Which one is faster, expdp or RMAN?
View 16 Replies
Apr 13, 2011
I would like to export specific tables (not the entire schema), including metadata. I am using a parameter file for expdp:
Tables=emp,dept
Does this also include all the metadata, or should I also add the INCLUDE below in the parfile?
INCLUDE =Indexes,Sequences,Procedures,Views
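For what it's worth, a table-mode export already brings the dependent objects of the listed tables (indexes, constraints, grants, triggers) along automatically, while sequences, procedures and views are schema-level objects that table mode does not export at all. A minimal parfile sketch under that assumption (directory name illustrative):
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=emp_dept.dmp
LOGFILE=emp_dept.log
TABLES=emp,dept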
View 3 Replies
Oct 1, 2012
We had AIX on a 570 machine and database 10.2.0.4. We took an expdp backup and it took about two hours to complete every night.
Now we have upgraded to 10.2.0.5 on a 770 machine, and the same command takes 6 hours to complete, even though both the database and the hardware were upgraded.
The command is:
expdp T24SILK/oracle directory=backup dumpfile=exp_beod_T24_%U_$dt.dmp logfile=exp_T24_$dt.log EXCLUDE=TABLE:"LIKE '%TRACE'" parallel=6
View 1 Replies
May 27, 2011
I am trying to export a schema using the expdp command, but it hangs after a few minutes; it seems to get stuck somewhere. Even when I try with the plain SCOTT schema, it hangs.
View 16 Replies
Jun 4, 2010
I exported a table using the exp utility and it took 30 minutes to complete. The same export done with the expdp utility took 10 minutes.
How does that happen?
View 3 Replies
Jun 3, 2010
While trying to run expdp with a QUERY clause, I get the syntax-related errors shown below.
expdp system/xxxx SCHEMAS=LOG NETWORK_LINK=DBLINK1 INCLUDE=TABLE:"IN('DAILY_LOG')" QUERY=LOG.DAILY_LOG:"where entry_date< to_char(sysdate -1,'yyyymmdd')" DIRECTORY=dump DUMPFILE=log_exp.dmp logfile=log_exp.log
But it gives the following error:
ORA-31693: Table data object "LOG"."DAILY_LOG" failed to load/unload and is being skipped due to error:
ORA-00904: "YYYYMMDD": invalid identifier
I tried the same predicate in plain SQL with YYYYMMDD and it works fine; entry_date is a CHAR field. Where am I going wrong in the QUERY clause?
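A hedged reading of the ORA-00904: the shell stripped the single quotes around 'yyyymmdd' before expdp ever saw the command, so the database parsed YYYYMMDD as an identifier. Moving everything into a parameter file avoids shell quoting entirely; a sketch (the file name query.par is hypothetical):
SCHEMAS=LOG
NETWORK_LINK=DBLINK1
INCLUDE=TABLE:"IN('DAILY_LOG')"
QUERY=LOG.DAILY_LOG:"where entry_date < to_char(sysdate - 1,'yyyymmdd')"
DIRECTORY=dump
DUMPFILE=log_exp.dmp
LOGFILE=log_exp.log
Then invoke it as: expdp system/xxxx parfile=query.par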
View 4 Replies
Oct 5, 2013
I want to exclude only the data of some particular tables, not the complete table objects, when exporting with expdp.
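One common workaround (a hedged sketch; SCOTT.BIG_TAB is a hypothetical table name): a QUERY clause that matches no rows exports the table definition with zero rows of data, while every other table is exported in full. In a parfile:
SCHEMAS=scott
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=scott_nodata.dmp
QUERY=scott.big_tab:"WHERE 1=0"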
View 13 Replies
Aug 26, 2012
I have a server configured for German and English. When I connect with SQL*Plus I get German server output, but "alter session set nls_language='AMERICAN'" solves the issue for me.
I need the same for the expdp command, but I don't know how to do it. I tried adding a parameter nls_language, but expdp doesn't recognize it. Is it possible to somehow get the server output of expdp, as written to the log file, in English?
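expdp indeed has no nls_language parameter. One thing to try (an assumption on my part, not verified against this setup) is setting the NLS_LANG environment variable in the shell before invoking expdp, since the client messages honour it; the log file is written by a server process, so its contents may still follow the server settings:
export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1   # character-set part should match your client
expdp user/password directory=dp_dir dumpfile=exp.dmp logfile=exp.log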
View 5 Replies
Jun 16, 2011
I succeeded in running expdp to an ASM diskgroup, like this:
create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log
SUCCESS MESSAGE
. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59
I would like to run this daily and delete dumps after 14 days, but the script shows an error. What can be the solution to make it run? (A hedged sketch follows the script stub below.)
#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History
[code]...
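Since the dump files live in ASM rather than on a filesystem, plain find/rm cannot age them out. A minimal sketch of one approach, assuming GNU date and that asmcmd is usable from the script's environment (names taken from the post or illustrative; untested against this setup):
#!/bin/bash
# daily Data Pump export into the ASM directory object from the post
dt=$(date +%Y%m%d)
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP_${dt}.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP_${dt}.log

# remove dumps older than 14 days, using the date embedded in the file name
cutoff=$(date -d '14 days ago' +%Y%m%d)
asmcmd find +RECO/FILTDB/EXPDP 'SBSR_EXP_*.dmp' | while read f; do
  fdate=${f##*_}          # e.g. 20130816.dmp
  fdate=${fdate%.dmp}
  [ "$fdate" -lt "$cutoff" ] && asmcmd rm "$f"
done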
View 9 Replies
Aug 17, 2013
I want to take an export of table MESSAGE, filtered to the day of 17 Jul 2013 (just to limit the size). I used the following expdp command, but it is not working.
expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile=FA0001P_BG_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
But with a select query I am able to retrieve the rows for that specific date:
select * from MESSAGE where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
Here is the command with the syntax error:
[oracle@orcl log]$ expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile= DB_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
-bash: syntax error near unexpected token `('
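The bash error means the shell chokes on the unquoted parentheses before expdp ever runs; as with any QUERY clause, a parameter file sidesteps the shell. A sketch (msg.par is a hypothetical file name):
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=DB_16_08_2013.dmp
LOGFILE=FA0001P_BG_16_08_2013.log
TABLES=schema.MESSAGE
QUERY=schema.MESSAGE:"WHERE created_on BETWEEN to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') AND to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')"
Then: expdp SYSTEM parfile=msg.par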
View 3 Replies
Jun 5, 2012
I have two servers, 'A' and 'B'. On server A there is a schema named "test" having a table "t1". I want to import this t1 table to server B.
Is it possible to export a dump to a remote host using expdp?
I found that there is an option for this, "network_link". For testing it, I created a dblink from server "B" to "A" named "vxmldb".
When I use the command below on server B, I get the following error:
C:>expdp directory=data_pump_dir logfile=test.log network_link=vxmldb schemas=test dumpfile=test.dmp
Export: Release 11.1.0.6.0 - 64bit Production on Tuesday, 05 June, 2012 14:22:07
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: system/vxmldb@vxmldb
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39170: Schema expression 'TEST' does not correspond to any schemas.
In the above command:
directory ---> location on server "B"
network_link ---> dblink name created on server "B" to access server "A"
schemas ---> schema name to be exported; exists in the server "A" DB
username/password ---> higher-level username/password for server "A"
@connectString ---> connects to server "A"
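One quick sanity check (a sketch using the names from the post): confirm the schema is actually visible through the link, since ORA-39170 usually means the schema expression resolved against an unexpected database or did not match in case:
SELECT username FROM dba_users@vxmldb WHERE username = 'TEST';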
View 15 Replies
May 29, 2012
We are DB users (not DBAs) and have always used exp/imp before application upgrades.
I was googling around and read something like "Oracle Data Pump - Time to let go of Exp / Imp". It seems exp/imp is obsolete.
Our system doesn't have an "expdp" command:
> find . -name expdp
>
Is this because of a too-old SQL*Plus?
> sqlplus
SQL*Plus: Release 8.1.7.0.0 - Production on Tue May 29 16:05:28 2012
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Enter user-name: ^C^C
- Does our DBA need to give us privileges to run expdp/impdp?
- Is it true that an expdp/impdp dump ends up on the Oracle server (not on the client machine)?
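For what it's worth, a sketch of the grants typically involved (names illustrative): the directory object lives on the server, which is also where Data Pump writes its dump files, and the expdp/impdp binaries ship with the server or a full client install rather than with an 8.1.7-era client.
-- run by a DBA; APPUSER and DP_DIR are hypothetical
GRANT READ, WRITE ON DIRECTORY dp_dir TO appuser;
-- needed only to export schemas other than your own
GRANT EXP_FULL_DATABASE TO appuser;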
View 4 Replies
Feb 1, 2012
I have one prod and one devl database, with prod having a Streams setup.
I have to refresh devl from prod, but if I go by a full expdp then the db_links also get imported into devl and may cause problems there.
Is there any other way, using expdp/impdp, to exclude the Streams objects while doing the import?
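A hedged possibility (not verified against a Streams configuration): both expdp and impdp accept EXCLUDE, and DB_LINK is a valid object path in full mode, so at least the database links can be filtered out on the import side (names illustrative):
impdp system/password directory=dp_dir dumpfile=prod_full.dmp full=y EXCLUDE=DB_LINK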
View 1 Replies
Dec 2, 2010
For some days now I have been getting this error during a Data Pump export:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31687: error creating worker process with worker id 1
ORA-31687: error creating worker process with worker id 1
ORA-31688: Worker process failed during startup.
This error is random; if I retry after a few minutes, expdp works correctly.
View 8 Replies
May 1, 2010
If I use expdp with remap_schema, will it also remap grants and synonyms?
View 5 Replies
Apr 13, 2012
How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the dump file already exists, we get the error below:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists
There is a feature in 11g, reuse_dumpfiles=y, which doesn't work in 10g. Is there something that can overwrite an existing dump file in 10g?
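The usual 10g workaround (a sketch; paths and names hypothetical) is simply to delete the old file before each run, since expdp refuses to overwrite:
del C:\scott_emp.dmp
expdp scott/tiger directory=dp_dir dumpfile=scott_emp.dmp logfile=scott_emp.log
(On Unix, rm -f takes the place of del.)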
View 1 Replies
Aug 4, 2011
I am using expdp/impdp to migrate a 4 TB database from Solaris to Linux, but the import process is taking forever.
View 13 Replies
Mar 30, 2007
I'm taking an export dump using expdp of some schemas whose total size is 300GB. This is the par file:
DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema01.dmp,expdp_schema02.dmp,expdp_schema03.dmp,expdp_schema04.dmp,expdp_schema05.dmp,expdp_schema06.dmp,expdp_schema07.dmp,expdp_schema08.dmp,expdp_schema09.dmp,expdp_schema10.dmp,expdp_schema11.dmp,expdp_schema12.dmp,expdp_schema13.d
[code]....
Here the biggest schema is 250GB and the total size of all the schemas is 300GB. The location where I am taking the dump has 350GB of space, but even then the expdp failed, saying:
ORA-39095: Dump file space has been exhausted: Unable to allocate 8192 bytes
Why did it fail, and how do I restart it and make sure it runs successfully without error?
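A hedged note: with a fixed list of dump files, the job stops once FILESIZE times the number of files is exhausted, regardless of free disk; a %U substitution variable lets Data Pump create pieces as needed, and a stopped job can be resumed by attaching to it rather than starting over. A sketch (job name illustrative):
# in the parfile: create as many 30GB pieces as needed
DUMPFILE=expdp_schema%U.dmp
FILESIZE=32212254720
# to resume the stopped job: attach, add files, continue
expdp system/password attach=SYS_EXPORT_SCHEMA_01
Export> ADD_FILE=expdp_extra%U.dmp
Export> CONTINUE_CLIENT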
View 4 Replies
Feb 23, 2012
How do I find the versions of the exp and imp utilities of a database server from the Windows command prompt?
Note: currently I have 10.2.0.10 Oracle software installed on my local machine.
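One standard way: invoking either utility with help=y prints its release banner, whose first line (e.g. "Export: Release 10.2.0.1.0 - Production ...") is the utility's version:
C:\> exp help=y
C:\> imp help=y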
View 4 Replies
Oct 29, 2013
I want to load lakhs of records into a table. My problem is that after loading about a quarter of the records, the process abends due to the size of my rollback segment area, and I don't have the option to increase it. Is there any way to get intermediate commits when using the imp or sqlldr utilities, so the entire load completes without abending?
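Both utilities have standard parameters for periodic commits: imp commits after each array insert with COMMIT=Y (the array size is governed by BUFFER), and conventional-path SQL*Loader commits every ROWS rows. A sketch (file and user names hypothetical):
imp user/password file=exp.dmp fromuser=src touser=dst commit=y buffer=1048576
sqlldr user/password control=load.ctl rows=5000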
View 2 Replies
Feb 20, 2012
I am familiar with the tool netca. However, there is one more utility with the same functionality, netmgr. I checked with many DBAs for the exact difference but did not get a good answer, and Google did not give me the exact difference either. What is the exact difference between these two tools (netca and netmgr)?
View 1 Replies
Sep 3, 2013
In SQL Server there are 30 stored procedures. How do I convert all the stored procedures from SQL Server to Oracle 11g?
View 5 Replies
Oct 8, 2010
I'm converting a SQL Server db to Oracle 10g. So far it's going pretty well; however, I've hit a significant performance snag trying to run queries against the converted view whose SQL is below. In SQL Server it runs pretty quickly, 10-15 secs. The query presently returns about 1.7 million records. The code is below; assuming I've done nothing yet to optimize the database or the objects involved, where should I start?
BS
------------------
CREATE VIEW DST_TMP_VIEW_ACCT_XACTN_CRN AS
SELECT CUSTACCNUM,
FUNDID,
[code]...
View 2 Replies
Mar 24, 2010
How do I execute SQL*Loader when the data file resides on another server?
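One standard approach: SQL*Loader reads its input files on the machine where it runs and talks to the database over SQL*Net, so you can run sqlldr on the server that holds the data file and point it at the remote database through a TNS alias (names hypothetical):
sqlldr user/password@remote_db control=load.ctl data=/path/on/this/host/data.csv log=load.log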
View 1 Replies
Sep 15, 2010
I exported the data of one user and am importing it into another schema on another server. It works fine for quite a number of table imports, but after some time it starts giving me the errors below...
IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<ے
IMP-00008: unrecognized statement in the export file:
+A
IMP-00008: unrecognized statement in the export file:
[code]...
View 6 Replies
Nov 23, 2011
I have a requirement to read a flat text file (around 15,000 lines), residing at a client location, from the DB server, and write it into one cell of a table.
I tried UTL_FILE and DBMS_LOB, but I am not able to access the client location to read the file, since the path is resolved through an Oracle directory object.
e.g.:
my client machine is 192.168.1.1 and my DB server is on Unix, say 192.168.1.10.
file location is: \\192.168.1.1\share\abc.txt
So I created one Oracle directory, MY_DIR, having DIRECTORY_PATH '\\192.168.1.1\share'.
But neither UTL_FILE nor DBMS_LOB is able to access the file.
Error Message:
-------------
Unable to process CLOB -22288 ~ ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
A few details for reference:
-------------------------
File Location: \\192.168.1.1\share\abc.txt
Unix DB Server location: 192.168.1.10
Table : Test (filename varchar2(30), Content CLOB)
Oracle Dir: MYDIR
Directory_Path: \\192.168.1.1\share
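A hedged note: UTL_FILE and DBMS_LOB open paths on the database server itself, and a Windows UNC path means nothing to a Unix server. One approach (a sketch; mount point and options illustrative, needs root on the DB server) is to mount the client's share on the server and point the directory object at the mount:
# on the Unix DB server, as root -- CIFS mount of the Windows share
mount -t cifs //192.168.1.1/share /mnt/share -o ro,username=guest
-- then, in the database
CREATE OR REPLACE DIRECTORY my_dir AS '/mnt/share';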
View 7 Replies
Mar 16, 2011
I have a question regarding a difference in character sets when taking an export (logical backup) of a database directly on the server (Linux RHEL 2.1 AS) versus on a client (a Windows XP Professional machine where only an Oracle 9i client is installed). On the server it seems to be fine and okay, but on the client node I'm getting the following error for almost all tables:
EXP-00091: Exporting questionable statistics.
My questions are:
[1] Does it create any sort of problem if later on I import the data which was taken from the client node?
[2] Why is there a (marginal) difference in the dump (.dmp) file size?
[3] Is there any way to overcome it, or is it the natural behaviour and not a problem?
[4] If I'm using LONG or BLOB as the datatype for some of my tables, will they have any problem if I carry on like this?
Additional information about character sets, on the server node:
Export done in US7ASCII character set and AL16UTF16 NCHAR character set server uses WE8ISO8859P1 character set (possible charset conversion)
On the client node:
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set server uses US7ASCII character set (possible charset conversion)
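EXP-00091 is exp's warning that the client NLS_LANG character set differs from the database character set, so the exported optimizer statistics are flagged questionable; the usual remedy is to set NLS_LANG on the client to the database character set before exporting. A sketch (names hypothetical; the value follows the post's client-node message, which reports the server as US7ASCII):
C:\> set NLS_LANG=AMERICAN_AMERICA.US7ASCII
C:\> exp user/password@db full=y file=exp.dmp log=exp.log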
View 7 Replies
Mar 7, 2011
I'm importing data from SQL Server to Oracle. I used BCP to export the data from SQL Server. Below is the first record of table trlc from the csv file:
trlc.CSV
11032|100|Wman| | |2008-02-08| |
Using SQL*Loader to import into Oracle:
TRLC table in Oracle database:
est_no varchar2(10) default ' '
right_no number(4) default 0
maj_auth varchar2(15) default ' '
weight varchar2(10) default ' '
idm_ht varchar2(8) default ' '
c_date date
P_tkt varchar2(5) default ' '
sqlldr user/pwd@db02 control=trlc.ctl log=trlc.log
trlc.ctl:
load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no,right_no,maj_auth,weight,idm_ht,c_date,P_tkt)
The rows get inserted successfully, but the result sets differ. For example, when I run 'select len(weight) from trlc;' in SQL Server, I get a length of 0; when I do the equivalent select in the Oracle database, I get a length of 1. The result set also varies for the query below:
select * from trlc where weight=' ';
(SQL Server returns 1 row but Oracle returns no rows)
Do I need to specify any conversion code for the weight field so it accepts a ' ' value?
View 12 Replies
Jul 11, 2011
Here are the cards I've been dealt:
I've inherited a 10.2.0.1.0 instance running on a Windows 2003 box; it's running fine, no problems, other than that the system has been in production since 2005 and has gotten pretty old and tired. This old box has one tablespace on it, called "gateway".
I've installed 11.2.0.1 on a new (Windows 2008 R2 Enterprise 64-bit) server and created an empty database also called "gateway".
Now to move the data and views, objects, everything.
I've read up on a variety of migration techniques (oops, I mean "upgradation" LOL) and can follow the steps.
In short, I want to pull everything off of server A (10.2) and put it into production on server B (11.2). There seem to be quite a few options (see the sketch after the list):
1. Install 10.2 on my new server (server B), move the data over and get everything running, then install 11.2 and have it upgrade the database as part of the installation process.
2. Drop the empty tablespace on server B, stop the database on server A, copy the files over from the old home to the new one, and run DBUA or set the compatibility attribute.
3. Run some type of server-A-to-server-B utility that can bridge the two over the network; some type of mirroring technique?
4. Run file export scripts on server A, copy the files to server B and run various import scripts.
I tend to think option 3 would be best because both instances are in great health and are running right now. Is there a mechanism that allows the 11.2 instance to see and upgrade from a different server?
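The closest standard fit for option 3 that I know of (a sketch; link, alias and directory names hypothetical) is a network-mode import, where the 11.2 instance pulls directly from the 10.2 instance over a database link, with no intermediate dump file:
-- on the 11.2 side
CREATE DATABASE LINK old_gateway CONNECT TO system IDENTIFIED BY password USING 'gateway10';
Then:
impdp system/password full=y network_link=old_gateway directory=data_pump_dir logfile=migrate.log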
View 9 Replies