How To Convert Traditional Exp / Datapump
Sep 6, 2010
How do I convert the following traditional exp commands to datapump?
1. exp custom/custom FILE=expgt.dmp TABLES=raj.GENERAL_TABLE QUERY=\"WHERE to_date\(lchg_time\) \= \(select db_date from mtda\)\" indexes=n statistics=none
2. exp custom/custom FILE=expdata.dmp TABLES=raj.BAL_TABLE,raj.GEN_TABLE,raj.CREDITS_TABLE QUERY=\"WHERE to_date\(lchg_time\) \>= \(select db_date from mtda\)\" indexes=n statistics=none
In the second one, three tables go to the same dump file with the same WHERE clause.
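For what it's worth, a hedged datapump equivalent (assuming a directory object named dpdir already exists and the subquery stays visible to the exporting user; INDEXES=N maps to EXCLUDE=INDEX, STATISTICS=NONE to EXCLUDE=STATISTICS):

expdp custom/custom DIRECTORY=dpdir DUMPFILE=expgt.dmp TABLES=raj.GENERAL_TABLE QUERY=\"WHERE to_date\(lchg_time\) = \(select db_date from mtda\)\" EXCLUDE=INDEX EXCLUDE=STATISTICS

expdp custom/custom DIRECTORY=dpdir DUMPFILE=expdata.dmp TABLES=raj.BAL_TABLE,raj.GEN_TABLE,raj.CREDITS_TABLE QUERY=\"WHERE to_date\(lchg_time\) \>= \(select db_date from mtda\)\" EXCLUDE=INDEX EXCLUDE=STATISTICS

An unprefixed QUERY clause applies to every table in the job, which matches the requirement that all three tables share the same WHERE clause.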
View 3 Replies
Sep 18, 2013
I want to create an additional database alongside an existing one via the database creation utility, and I want to change the NLS character set from WE8MSWIN1252 to AL32UTF8. Will this affect the existing database, or will it work fine? I.e., the existing database is orcl, and the additional one I want to create is orcl02.
View 6 Replies
Nov 4, 2011
I am trying to write a script for Oracle startup and find the traditional dbstart/dbshut to be too simple. Basically, I need something that will start up a database in the same mode it was in when it was last shut down.
We are not running RAC, so CRS is not an option. The easy case first: if the DB is a primary and open, start it up normally. However, if the database is a snapshot standby, it could previously have been opened read-write, read-only, or simply mounted.
Is there any OS file that indicates the last state? I imagine I could poll the alert log for the last state...
If there isn't, I've thought about writing a script that polls the current state of the database a few times a day and records it, but that seems a bit inelegant.
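A rough sketch of that polling idea, assuming a hypothetical state file at /var/opt/oracle/db_state.txt (V$DATABASE exposes both DATABASE_ROLE and OPEN_MODE, which together capture the states described above):

#!/bin/sh
# Record the current role and open mode so a startup script can replay them.
STATEFILE=/var/opt/oracle/db_state.txt
sqlplus -s "/ as sysdba" <<EOF > $STATEFILE
set heading off feedback off pagesize 0
select database_role || ':' || open_mode from v\$database;
EOF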
View 1 Replies
Nov 23, 2012
I have just successfully finished a full import from an Oracle 9i DB to an Oracle 11gR2 DB. My export was a full DB export.
Prior to this import, my 11g was a newly created DB with the default SYS, SYSTEM etc. schemas. Their passwords were different from those in 9i.
However, I realised that after importing, the passwords in 11g were replaced by the passwords from 9i, including for the SYS and SYSTEM users...
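If the goal is simply to get the 11g passwords back, resetting them after the import is the straightforward fix (passwords below are placeholders):

ALTER USER sys IDENTIFIED BY new_sys_password;
ALTER USER system IDENTIFIED BY new_system_password;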
View 5 Replies
Nov 19, 2008
We have eight servers. One of them is for central management; the other seven are for development and test databases. Every night we start a process which exports database schemas (without data) onto the central server. An example of the process we run is below:
cd /data1/backup
mknod exp_pipe_oracle1 p
compress < exp_pipe_oracle1 > dev.dmp.Z &
exp user/****@dev file=exp_pipe_oracle1 owner=schema1,schema2,schema3 statistics=none rows=n log=dev.log
As you can see, we use @dev to connect to the remote database, but the dump file is created on the local (central) server. We are planning to use datapump instead of classic export in the future, but I couldn't find a way to take an export of a remote database and create the dump file locally. I have looked at the NETWORK_LINK parameter, but it doesn't seem to work for our case.
Is it possible to back up a remote database with datapump and have it create the dump file locally? (Of course we could use solutions such as NFS, but we would really prefer not to, if datapump can back up remotely.)
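One hedged suggestion: with datapump, NETWORK_LINK runs the job in the database you connect to and pulls the source data over a database link, so the dump file lands in a directory object of the connected (local) database. A sketch, assuming a database link dev_link from the central database to dev and a directory object backup_dir on the central server:

expdp user/****@central NETWORK_LINK=dev_link SCHEMAS=schema1,schema2,schema3 CONTENT=METADATA_ONLY EXCLUDE=STATISTICS DIRECTORY=backup_dir DUMPFILE=dev.dmp LOGFILE=dev.log

CONTENT=METADATA_ONLY mirrors the rows=n of the classic export above.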
View 5 Replies
May 21, 2012
I have a problem reading data from a dmp file into a target table.
I've created a dmp file using:
CREATE TABLE table_name
ORGANIZATION EXTERNAL
(TYPE oracle_datapump
DEFAULT DIRECTORY directory_name
LOCATION ('file_name.dmp')
) AS SELECT col1, col2, col3 from source_table;
[Code]...
The file is created via a db link from the target DB to the remote DB, and then the data is inserted into the target DB table target_table.
desc target_table
col1 varchar2(20)
col2 number
col3 number
There are no values longer than 20 characters in the source table, but when I run INSERT INTO target_table(col1, col2, col3) SELECT col1, col2, col3 FROM source_table; I get ORA-12899: value too large for column "target_table"."col1" (actual: 25, maximum: 20).
I guess it has something to do with how oracle_datapump stored the data in the dmp file. When I run SELECT col1 FROM source_table WHERE LENGTH(col1) > 20; I get two values which are clearly not longer than 20 characters.
Selected values are:
748473358
693197674
Where is the bug hidden, or is there any "normal" workaround?
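One hypothesis worth testing (an assumption, not a confirmed diagnosis): if the two databases use different character sets, a 20-character value can expand beyond 20 bytes during conversion, overflowing a byte-semantics column. Checking byte lengths and widening the column to character semantics would look like:

SELECT col1, LENGTH(col1) AS char_len, LENGTHB(col1) AS byte_len
  FROM source_table
 WHERE LENGTHB(col1) > 20;

ALTER TABLE target_table MODIFY (col1 VARCHAR2(20 CHAR));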
View 17 Replies
Jun 15, 2011
I am trying to export the database using the datapump API DBMS_DATAPUMP. I want to export the entire database, excluding some schemas.
My question is: how do I exclude some schemas using the datapump API?
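A minimal sketch of one commonly cited approach, a schema-mode job with a NOT IN filter (schema names and file name are placeholders; whether this fits your version is an assumption worth verifying):

DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.add_file(handle => h, filename => 'almost_full.dmp', directory => 'DATA_PUMP_DIR');
  -- Keep every schema except the listed ones.
  DBMS_DATAPUMP.metadata_filter(handle => h, name => 'SCHEMA_EXPR',
                                value  => 'NOT IN (''SCOTT'', ''HR'')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.detach(h);
END;
/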
View 1 Replies
Sep 8, 2013
Is it possible to export data to a directory other than the default datapump directory (DATA_PUMP_DIR) on Linux?
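For reference, any directory object will do; a sketch with a placeholder path and grantee:

CREATE OR REPLACE DIRECTORY exp_dir AS '/u01/app/exports';
GRANT READ, WRITE ON DIRECTORY exp_dir TO scott;

expdp scott/tiger DIRECTORY=exp_dir DUMPFILE=scott.dmp LOGFILE=scott.log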
View 5 Replies
Jul 2, 2013
What are the ways provided by Oracle for database migration, other than datapump? And which one is recommended?
View 1 Replies
Apr 30, 2013
I am receiving two large export files from a vendor, so I have no control over the contents. I need to import these into our database. The two export files are very similar, except that one has slightly different columns in it. So, export file 1 may have a table:
COLUMN_A
COLUMN_B
COLUMN_C
The second file may have:
COLUMN_A
COLUMN_B
COLUMN_D
At the destination, I have a table that has:
COLUMN_A
COLUMN_B
COLUMN_C
COLUMN_D
Is there a parameter that would let me interchangeably import either (or both) files into this destination table? This is my first attempt at data pump, but I know classic import has caused me issues here. I'm not sure if the same limitations exist. Will the missing columns cause it to fail?
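I can't promise every version tolerates the column mismatch, but the parameter that governs loading into a pre-existing table is TABLE_EXISTS_ACTION; a hedged sketch (directory, file and table names are placeholders):

impdp myuser/**** DIRECTORY=dp_dir DUMPFILE=vendor1.dmp TABLES=the_table TABLE_EXISTS_ACTION=APPEND

Testing one file against a scratch copy of the destination table would settle whether the missing/extra columns cause a failure.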
View 3 Replies
Oct 19, 2010
We have a daily partitioned table, and for backup we are using data pump (expdp). Our policy is to drop each partition after it is backed up (archiving).
We have archived dump files for one year. A few days back a developer changed the table structure and added one new column.
Now we are unable to restore old partitions. Is there a way to restore a partition if a new column has been added to / dropped from the current table?
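A hedged workaround sketch: import the archived partition into a staging table via REMAP_TABLE (available from 11.1; all names below are placeholders), then copy the rows back, supplying a value for the new column. This assumes the new column was added at the end of the table:

impdp system/**** DIRECTORY=dp_dir DUMPFILE=part_20121101.dmp TABLES=app.sales:P20121101 REMAP_TABLE=app.sales:sales_stage

INSERT INTO app.sales SELECT s.*, NULL FROM sales_stage s;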
View 4 Replies
Sep 27, 2011
I have 2 Oracle servers: one with SuSE Linux (Oracle 10.2.0.4.0) and one with Windows 2003 Server x64 (Oracle 11.2.0.1.0).
I made a mistake by installing 32-bit Oracle 11 on the x64 server. Nevertheless, it worked for about half a year. Then my datapump backups stopped working. I changed SGA_TARGET down to 1024M and datapump worked again. Now I want to replace the Windows server and import the datapump schemas into the Linux server.
Exp: expdp SCHEMA1/****@db1 DIRECTORY=dmpdir DUMPFILE=SCHEMA1_EXPDAT.DMP LOGFILE=SCHEMA1_EXP.LOG VERSION=10.2.0.3 REUSE_DUMPFILES=YES
Imp: impdp SCHEMA1/**** DIRECTORY=dmpdir DUMPFILE=SCHEMA1_EXPDAT.DMP LOGFILE=SCHEMA1_IMP.LOG
Log:
..importing SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
..importing SCHEMA_EXPORT/SEQUENCE/SEQUENCE
[code]...
View 1 Replies
May 11, 2011
I got an assignment to create an Oracle 11g DB. I will be provided with the full datapump export dump of an Oracle 10g DB on Linux. I need to import it into the 11g database on Windows. I have no information about the tablespaces, users etc. I have created the DB with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
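For what it's worth: a full import tries to recreate tablespaces using the source (Linux) datafile paths, which won't exist on Windows, so REMAP_DATAFILE is the usual companion (paths below are hypothetical):

impdp system/**** FULL=Y DIRECTORY=dp_dir DUMPFILE=full10g.dmp LOGFILE=full_imp.log REMAP_DATAFILE='/u01/oradata/orcl/users01.dbf':'D:\oradata\orcl11\users01.dbf'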
View 28 Replies
Jan 2, 2012
As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and datapump doesn't support XML objects. Is there any other way to export the full database, or any option to export XML objects using datapump?
View 2 Replies
Mar 31, 2010
We have three instances in the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a datapump export on instance 1 it works, but on instances 2 and 3 it fails with errors relating to the directory.
Error -
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
Is my only option to use shared storage space?
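The directory object itself lives in the data dictionary and is visible from every instance; what differs per node is the OS path behind it. So either the path must exist (and be writable by the oracle user) on every node where workers may run, or, if you are on 11.2, the job can be pinned to the connecting instance with CLUSTER=N, e.g.:

expdp system/**** DIRECTORY=dp_dir DUMPFILE=exp.dmp SCHEMAS=scott CLUSTER=N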
View 6 Replies
Jan 13, 2012
Currently we are using the "exp and imp" utilities to unload from production and load into the Dev server. While importing, we follow the steps below:
(1) Load only the data [by specifying INDEXES=N in the par file]
(2) Unlock statistics
(3) Load indexes and other objects [by specifying ROWS=N]
After doing these steps, the data, indexes and other objects are all loaded. To verify the indexes, we check DBA_INDEXES.
DBA_INDEXES :
-------------
OWNER INDEX_NAME TABLE_NAME STATUS LAST_ANALYZED
----- ---------- ---------- ------ -------------
MYSCH CP_INDEX_1 CP_TABLE_1 VALID 14/JAN/12
Questions:
(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without building from scratch?
(2) I am trying to replace 'exp' and 'imp' with the datapump utilities, but I am confused about the parameters to be used.
(a) Can I load both data and metadata at the same time (using CONTENT=ALL)?
(b) Or I am planning to implement this in two steps:
first load only metadata using CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE,
then load data using CONTENT=DATA_ONLY.
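A sketch of both routes with placeholder names; these are ordinary impdp parameters, not a tested recipe for this system:

impdp myuser/**** DIRECTORY=dp_dir DUMPFILE=prod.dmp SCHEMAS=mysch CONTENT=ALL TABLE_EXISTS_ACTION=REPLACE

or, in two passes as in (b):

impdp myuser/**** DIRECTORY=dp_dir DUMPFILE=prod.dmp SCHEMAS=mysch CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE
impdp myuser/**** DIRECTORY=dp_dir DUMPFILE=prod.dmp SCHEMAS=mysch CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND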
View 1 Replies
Jan 3, 2012
I am trying to use the datapump tool to migrate a 10g db to 11g. Everything works fine except for the "nameless" check constraints.
View 7 Replies
Sep 5, 2012
One of my friends gets an error while taking a datapump export backup of the full database.
Please find the error details below:
ORA-31693: Table data object "RADIOMIRCHI_PIP_HRMS"."GM_DEPT" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-00604: error occurred at recursive SQL level 3
ORA-21780: Maximum number of object durations exceeded.
[code]......
View 1 Replies
Oct 25, 2010
I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
--Create Directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';
--Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
--Script to import
DECLARE
l_dp_handle1 NUMBER;
BEGIN
l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
[code]...
Errors
ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
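ORA-31634 usually means the job name is already taken, either by a concurrent run or by an orphaned master table from an earlier failed run. A hedged fix is to pass an explicit, unique job_name to OPEN and to check for leftovers:

l_dp_handle1 := dbms_datapump.open(operation => 'IMPORT',
                                   job_mode  => 'FULL',  -- keep whatever mode the original job used
                                   job_name  => 'STG_IMP_' || to_char(sysdate, 'YYYYMMDDHH24MISS'));

-- Orphaned jobs show up here; dropping their master tables frees the names.
SELECT owner_name, job_name, state FROM dba_datapump_jobs;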
View 6 Replies
Mar 18, 2013
I have a datapump export file which was created in schema mode.
I have to import the tables into a new database, where I have to use the REMAP_SCHEMA parameter.
Additionally, I would like to add a prefix to the table names.
For example:
original tablename: THE_TABLE
Name after import: IMP_THE_TABLE
Is there a way to add a prefix while using Datapump Import?
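As far as I know there is no global prefix switch, but REMAP_TABLE (11.1 and later) renames tables individually and can be combined with REMAP_SCHEMA; one entry per table is needed. A sketch with placeholder schema names:

impdp newuser/**** DIRECTORY=dp_dir DUMPFILE=exp.dmp REMAP_SCHEMA=olduser:newuser REMAP_TABLE=olduser.THE_TABLE:IMP_THE_TABLE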
View 5 Replies
Feb 7, 2011
I'm trying to deploy a schema using the DATAPUMP API. The user the schema gets deployed from has the direct privilege CREATE USER (not through a role), but I got an insufficient-privileges error.
Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE".
But the user gets created successfully when I run the CREATE statement manually. So I created the user manually and ran the deployment procedure again, and got the below error for ROLE_GRANTS.
Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is:
GRANT "EXP_FULL_DATABASE" TO "SCOTT1"
The user has the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles as direct grants, both with ADMIN option. Which privileges does the user need to deploy the schema successfully?
View 1 Replies
May 29, 2013
How do I write a script for a datapump export of the below list of tables in the ADAM schema in the RMUAT2 database, using INCLUDE?
1) PSOPR,PSDEFN,PSMTR,PSQFPR,PSMNT,PSOPR
2) PMNT,PMTS
3) RMOT,RMST
etc. I want to export only a few tables in the schema.
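A sketch using a parameter file, which avoids the shell-escaping headaches of INCLUDE on the command line (directory and file names are placeholders):

# expdp system/**** PARFILE=adam_tabs.par
DIRECTORY=dp_dir
DUMPFILE=adam_tabs.dmp
LOGFILE=adam_tabs.log
SCHEMAS=ADAM
INCLUDE=TABLE:"IN ('PSOPR','PSDEFN','PSMTR','PSQFPR','PSMNT','PMNT','PMTS','RMOT','RMST')"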
View 7 Replies
May 23, 2012
I am using datapump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints and import the data and re-enable the constraints, but in this case the temp tablespace keeps filling up and I run out of space. Is there a method to do a full import including constraints and indexes?
View 7 Replies
Apr 5, 2013
The datapump estimate says 9.902 GB, but when I check the size of the .dmp file it shows 1.44 GB.
Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;;
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB
Why does datapump estimate so much more than the actual size?
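The BLOCKS method sizes every allocated block, including empty space below each segment's high-water mark, so it routinely overshoots the finished dump. An estimate based on optimizer statistics is often closer, e.g.:

expdp system/******** full=y exclude=statistics ESTIMATE=STATISTICS directory=DP_DIR dumpfile=expdp_test.dmp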
View 8 Replies
Aug 23, 2012
Expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER
[code]...
Then my export hangs. I checked the alert log and found nothing, killed the job and reran it, but got the same result. When I check the status it says EXECUTING.
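While the job claims to be EXECUTING, these dictionary queries (from another session) usually show whether it is moving or stuck:

SELECT owner_name, job_name, state, attached_sessions FROM dba_datapump_jobs;

SELECT opname, sofar, totalwork, time_remaining
  FROM v$session_longops
 WHERE opname LIKE 'SYS_EXPORT%' AND sofar <> totalwork;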
View 15 Replies
Aug 23, 2013
Datapump on Windows 2003 / Oracle 11.2. I have a batch file that creates a daily dump of a schema in the DATA_PUMP_DIR - however, it doesn't! The script is as follows:
REM Script to perform data pump export of the user01 schema and move to the desktop.
SET CURRDATE=%DATE:~10,4%%DATE:~4,2%%DATE:~7,2%
SET CURRTIME=%TIME:~0,2%%TIME:~3,2%%time:~6,2%
SET CURRTIME=%CURRTIME: =0%
SET DATESTAMP=%CURRDATE%_%CURRTIME%
SET DUMP_FILE=user01_EXP_%DATESTAMP%.DMP
SET LOG_FILE=user01_EXP_%DATESTAMP%.LOG
expdp "sys/pwd@db as sysdba" directory=DATA_PUMP_DIR dumpfile=%DUMP_FILE% schemas=user01 logfile=%LOG_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%DUMP_FILE% C:\USERS\ADMINISTRATOR\DESKTOP\%DUMP_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%LOG_FILE% C:\USERS\ADMINISTRATOR\DESKTOP\%LOG_FILE%
For some reason, when this is run from a remote server as a batch it fails to create a file, although the output from the script has no errors apart from the MOVE statements, and the expdp output is all good (it states that the file was created in the expected location). If the expdp command is run on the server itself, it all works.
View 4 Replies
Jun 13, 2013
The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle 11g client. The application is able to start a datapump export, but as a matter of fact the dump files are always written to the Linux server. The directory object is defined as DATA_PUMP_DIR (which is the default directory).
Now we are supposed to change the datapump export in such a manner that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting the datapump from the client always raises the errors:
ORA-39002: ...
ORA-39070: ...
ORA-29283: ...
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: ...
Is it possible at all to start a datapump export from a Windows client and write the dump files to the Windows server itself? Or are the dump files always written to the database server?
View 9 Replies
Jun 9, 2010
I have a problem with DBMS_DATAPUMP.metadata_filter. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i, ...), and let's suppose that the list is dynamic, so I do NOT want to use:
DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (''a'', ''b'', ...)',
object_type => NULL);
In my_export_table there is the list:
CREATE TABLE my_export_table
(
EXPORT_OBJECT_NAME VARCHAR2(50 BYTE)
)
Now I'm trying to use this form:
DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (SELECT a.export_object_name FROM my_export_table a, user_objects b WHERE a.export_object_name = b.object_name AND b.object_type = ''TABLE'')',
object_type => NULL
);
but it results in an error.
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
[code]...
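The ORA-00942 presumably means the expression is evaluated in a context where my_export_table is not visible. A hedged workaround is to resolve the list in the caller and pass a literal IN-list instead; this fragment is meant to slot into the existing block, with h1 as the handle opened above (LISTAGG needs 11.2):

DECLARE
  l_list VARCHAR2(32767);
BEGIN
  -- Build the literal IN-list up front, in the caller's context,
  -- instead of letting the worker process run the subquery.
  SELECT LISTAGG('''' || a.export_object_name || '''', ',')
           WITHIN GROUP (ORDER BY a.export_object_name)
    INTO l_list
    FROM my_export_table a, user_objects b
   WHERE a.export_object_name = b.object_name
     AND b.object_type = 'TABLE';

  DBMS_DATAPUMP.metadata_filter(handle => h1,
                                name   => 'NAME_EXPR',
                                value  => 'IN (' || l_list || ')');
END;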
View 1 Replies
Oct 18, 2013
I have a user named 'Rob', and this user has been assigned the role 'MY_SRC_ROLE'. I developed a table under the rob schema and granted access to this table via the role: GRANT DELETE, INSERT, SELECT, UPDATE ON rob.emp TO MY_SRC_ROLE; I have 100 more users, and they have been granted the role 'MY_SRC_ROLE'. These 100 users can now access the emp table via the role 'MY_SRC_ROLE' without any issues. Then I took a datapump export and performed a datapump import on the target server, which is also HP-UX with 11.2.0.3.
On the target server I have a user 'JACK' and a role called 'MY_WORK_ROLE'. 5000 users have been granted 'MY_WORK_ROLE' on this server. I used the remap tablespace and remap schema clauses in the datapump import script. Once I performed the import, due to the schema remap, I can see that JACK now owns the table 'emp'; however, the grants are still not there. I tried searching Google and the Oracle documentation for a way to remap ROLE GRANTS while doing a datapump import, but I couldn't find supporting syntax. Can I assume datapump import is not capable of handling this particular scenario? I was able to do it by manipulating a SQLFILE and replacing the role name in it, but I am looking for a solution within datapump itself. How can grants assigned to role 'X' be transferred to role 'Y' via datapump import?
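The SQLFILE manipulation can at least be scripted end to end; a sketch (dp_dir is a placeholder, role names from the scenario above):

impdp system/**** DIRECTORY=dp_dir DUMPFILE=exp.dmp SQLFILE=role_grants.sql INCLUDE=OBJECT_GRANT

sed 's/MY_SRC_ROLE/MY_WORK_ROLE/g' role_grants.sql > role_grants_fixed.sql

Then run role_grants_fixed.sql in the target database.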
View 2 Replies
Apr 8, 2011
How can I turn rows into columns in Oracle?
I wrote a query which returns the following output:
Query:
SELECT RMC_N_ID,NVL(DISCOUNT_N_ID,NULL) AS DISCOUNT_N_ID,NVL(NULL,'') AS LOADINGS_N_ID,NVL(NULL,'') AS WARCOMB_N_ID,NVL(NULL,'') AS ENDORSECOMB_N_ID FROM UWRMC_DISCOUNT WHERE RMC_N_ID =224 AND DISCOUNT_N_ID IS NOT NULL
UNION
SELECT RMC_N_ID,NVL(NULL,'') AS DISCOUNT_N_ID,NVL(LOADING_N_ID,NULL) AS LOADING_N_ID ,NVL(NULL,'') AS WARCOMB_N_ID,NVL(NULL,'') AS ENDORSECOMB_N_ID FROM UWRMC_LOADING WHERE RMC_N_ID =224
UNION
SELECT RMC_N_ID,NVL(NULL,'') AS DISCOUNT_N_ID,NVL(NULL,NULL) AS LOADING_N_ID ,WARCOMB_N_ID,NVL(NULL,'') AS ENDORSECOMB_N_ID FROM UWRMC_WARCOMB WHERE RMC_N_ID =224
UNION
SELECT RMC_N_ID,NVL(NULL,'') AS DISCOUNT_N_ID,NVL(NULL,NULL) AS LOADING_N_ID ,NVL(NULL,'') AS WARCOMB_N_ID, ENDORSECOMB_N_ID FROM UWRMC_ENDORSECOMB WHERE RMC_N_ID =224
Result:
RMC_N_ID DISCOUNT_N_ID LOADINGS_N_ID WARCOMB_N_ID ENDORSECOMB_N_ID
---------- ------------- ------------- ------------ ----------------
224 0 87 0 0
224 0 88 0 0
224 0 0 0 93
224 0 0 0 94
224 0 0 88 0
224 0 0 89 0
224 82 0 0 0
224 83 0 0 0
I want the result like below
RMC_N_ID DISCOUNT_N_ID LOADINGS_N_ID WARCOMB_N_ID ENDORSECOMB_N_ID
---------- ------------- ------------- ------------ ----------------
224 82 87 88 93
224 83 88 89 94
How can I get this?
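One hedged way to collapse the rows, assuming the zeros shown are really the NULLs produced by the NVL(NULL,'') expressions and that values should pair up by rank within each column (combined_result is a hypothetical view standing for the UNION query above):

SELECT rmc_n_id,
       MAX(discount_n_id)    AS discount_n_id,
       MAX(loadings_n_id)    AS loadings_n_id,
       MAX(warcomb_n_id)     AS warcomb_n_id,
       MAX(endorsecomb_n_id) AS endorsecomb_n_id
FROM (
  SELECT t.*,
         -- Rank each row within its own column group so the Nth value
         -- of every column lands on output row N.
         ROW_NUMBER() OVER (
           PARTITION BY rmc_n_id,
                        CASE WHEN discount_n_id IS NOT NULL THEN 'D'
                             WHEN loadings_n_id IS NOT NULL THEN 'L'
                             WHEN warcomb_n_id  IS NOT NULL THEN 'W'
                             ELSE 'E' END
           ORDER BY COALESCE(discount_n_id, loadings_n_id,
                             warcomb_n_id, endorsecomb_n_id)
         ) AS rn
  FROM combined_result t
)
GROUP BY rmc_n_id, rn
ORDER BY rmc_n_id, rn;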
View 3 Replies