Possible To Export Data To A Directory Other Than The Datapump Default?

Sep 8, 2013

Is it possible to export data to a directory other than the Data Pump default directory (DATA_PUMP_DIR) on Linux?
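A minimal sketch of one common approach, assuming a path such as /u01/exports already exists on the database server and the exporting user has the needed privileges (the directory name, path, and credentials below are placeholders):

-- as a DBA: create a directory object pointing at the new path and grant access
CREATE OR REPLACE DIRECTORY exp_dir AS '/u01/exports';
GRANT READ, WRITE ON DIRECTORY exp_dir TO scott;

-- then reference it from expdp instead of DATA_PUMP_DIR
expdp scott/tiger directory=exp_dir dumpfile=scott.dmp logfile=scott.log schemas=scott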

View 5 Replies



Server Utilities :: Oracle Directory And Datapump

Mar 31, 2010

We have three instances on the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on instances 2 and 3 it fails with an error relating to the directory.

Error -
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation

Is my only option to use shared storage space?
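The directory object itself lives in the data dictionary and is visible from every instance, but the operating-system path it points to must exist and be writable on whichever node actually services the Data Pump job. A hedged sketch of the two usual workarounds (paths and names are placeholders; the CLUSTER parameter requires 11.2):

-- option 1: point the directory at storage visible from all nodes (NFS/ACFS/shared mount)
CREATE OR REPLACE DIRECTORY dp_dir AS '/shared/exports';

-- option 2 (11.2+): pin the job to the instance you are connected to
expdp system/manager directory=dp_dir dumpfile=full.dmp logfile=full.log full=y cluster=n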

View 6 Replies View Related

Server Utilities :: Export XML Object Using Datapump

Jan 2, 2012

As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump doesn't support our XML objects. So is there any other way to export the full database, or any option to export XML objects using Data Pump?

View 2 Replies View Related

Error While Datapump Export Backup Of Full Database

Sep 5, 2012

One of my friends gets an error while taking a Data Pump export backup of the full database.

Error details below:

ORA-31693: Table data object "RADIOMIRCHI_PIP_HRMS"."GM_DEPT" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-00604: error occurred at recursive SQL level 3
ORA-21780: Maximum number of object durations exceeded.
[code]......

View 1 Replies View Related

Export/Import/SQL Loader :: Exporting Query Using Datapump

May 29, 2013

How do I write a script for a Data Pump export of the below list of tables in the ADAM schema in the RMUAT2 database, using INCLUDE?

1) PSOPR,PSDEFN,PSMTR,PSQFPR,PSMNT,PSOPR
2) PMNT,PMTS
3) RMOT,RMST

etc. I want to export only a few tables in the schema; a sketch follows.
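A hedged sketch of what such an export might look like when driven by a parameter file (which keeps the INCLUDE quoting simple); the credentials and directory are placeholders, and the duplicate PSOPR entry from the original list is assumed unintentional and listed once:

# adam_tables.par
directory=DATA_PUMP_DIR
dumpfile=adam_tables.dmp
logfile=adam_tables.log
schemas=ADAM
include=TABLE:"IN ('PSOPR','PSDEFN','PSMTR','PSQFPR','PSMNT','PMNT','PMTS','RMOT','RMST')"

expdp adam/password@RMUAT2 parfile=adam_tables.par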

View 7 Replies View Related

Server Utilities :: Datapump Export Taking Long Time (HUNG)?

Aug 23, 2012

Expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER

[code]...

Then my export hangs. I checked the alert log and found nothing, killed the job and reran it, but got the same result. I checked the status and it says EXECUTING.
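A hedged sketch of how one might check what the job is actually doing before killing it; the dictionary views are standard, but the job name in the attach example is a placeholder:

-- which Data Pump jobs exist and what state are they in?
SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
FROM   dba_datapump_jobs;

-- what is each worker session currently waiting on?
SELECT s.sid, s.serial#, s.event, s.seconds_in_wait
FROM   v$session s
JOIN   dba_datapump_sessions d ON d.saddr = s.saddr;

-- re-attach to the job from the command line to see progress or stop it cleanly
expdp system/manager attach=SYS_EXPORT_FULL_01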

View 15 Replies View Related

Export/Import/SQL Loader :: DataPump Succeeds But Dump File Does Not Get Created

Aug 23, 2013

Data Pump on Windows 2003 / 11.2. I have a batch file that creates a daily dump of a schema in DATA_PUMP_DIR - however, it doesn't! The script is as follows:

REM Script to perform data pump export of the user01 schema and move to the desktop.
SET CURRDATE=%DATE:~10,4%%DATE:~4,2%%DATE:~7,2%
SET CURRTIME=%TIME:~0,2%%TIME:~3,2%%TIME:~6,2%
SET CURRTIME=%CURRTIME: =0%
SET DATESTAMP=%CURRDATE%_%CURRTIME%
SET DUMP_FILE=user01_EXP_%DATESTAMP%.DMP
SET LOG_FILE=user01_EXP_%DATESTAMP%.LOG
expdp "sys/pwd@db as sysdba" directory=DATA_PUMP_DIR dumpfile=%DUMP_FILE% schemas=user01 logfile=%LOG_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%DUMP_FILE% C:\USERS\ADMINISTRATOR\DESKTOP\%DUMP_FILE%
MOVE F:\Oracle\product\11.2.0\db\RDBMS\log\%LOG_FILE% C:\USERS\ADMINISTRATOR\DESKTOP\%LOG_FILE%

For some reason, when this is run from a remote server as a batch it fails to create the file, although the script output shows no errors apart from the MOVE statements failing, and the expdp output is all good (it states that the file was created in the expected location). If the expdp command is run on the database server itself, it all works.

View 4 Replies View Related

Export/Import/SQL Loader :: Using Datapump On Windows Client When Database Is On Linux

Jun 13, 2013

The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle 11g client. The application is able to start a Data Pump export, but the dump files are always written to the Linux server. The directory object is defined as DATA_PUMP_DIR (which is the default directory).

Now we are supposed to change the Data Pump export so that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting Data Pump from the client always raises the errors

ORA-39002: ...
ORA-39070: ...
ORA-29283: ...
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: ...

Is it possible at all to start a Data Pump export from a Windows client and write the dump files to the Windows server itself? Or are the dump files always written to the database server?

View 9 Replies View Related

Export/Import/SQL Loader :: External Tables Loading Multiple Files From Directory One By One

Oct 4, 2013

I have the following situation: there is a directory named /dat/global/stock/, and inside it I will get files named differently, for example abcdef.112, dfgrt.2, ...

Here I want to load these files one by one into the external table and generate one more file based on some enrichment.

Step 1: take the first file and load it into the external table.
Step 2: enrichment.
Step 3: file generation.

The problem I am facing is that I usually get around 1000 files in that particular directory, so I need to pick the files up one by one and then put each one in another directory. How can I get the files one by one and generate the output file using the Oracle loader? One possible approach is sketched below.
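One common pattern is to keep a single external table and repoint its LOCATION at each incoming file before reading it. A minimal sketch, assuming an external table STOCK_EXT already exists over a directory object for /dat/global/stock/ and that the list of file names is handed to the procedure (getting the directory listing itself has to happen outside SQL, e.g. via a shell script or a scheduler job; all object names here are placeholders):

CREATE OR REPLACE PROCEDURE load_stock_file (p_file_name IN VARCHAR2) IS
BEGIN
  -- repoint the external table at the next file in /dat/global/stock/
  EXECUTE IMMEDIATE
    'ALTER TABLE stock_ext LOCATION (''' || p_file_name || ''')';

  -- enrichment + load into the permanent table
  INSERT INTO stock_enriched (col1, col2, load_file)
  SELECT col1, UPPER(col2), p_file_name
  FROM   stock_ext;

  COMMIT;
END load_stock_file;
/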

View 4 Replies View Related

Export/Import/SQL Loader :: Setting For External Table - Create File In Directory DIR-1?

Nov 20, 2012

Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production

I am new to external tables, so I have tried the following commands.

create directory dir_1 as 'E:\ora_dirt';
grant read, write on directory dir_1 to HR;
select * from all_directories;
create table emp_ext
(emp_id number,
emp_name varchar2(30)

[code]...

Since I am not able to see DIR_1 on the E: drive, I haven't created the 'emp.dat' file, and on executing a select on the external table I get the expected error: "ORA-29913: error in executing ODCIEXTTABLEOPEN callout ORA-29400: data cartridge error KUP-04043: table column not found in external source: EMP_ID".

How do I create that file in directory DIR_1?
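CREATE DIRECTORY only records the path in the data dictionary; it does not create the folder or the data file on disk. A hedged sketch of the usual sequence, assuming the folder and a small comma-separated emp.dat are created by hand first (the column names follow the original post, the access parameters are assumptions):

-- 1. create E:\ora_dirt in Windows Explorer (or: mkdir E:\ora_dirt)
-- 2. put a plain-text file E:\ora_dirt\emp.dat in it, e.g.:
--      1,Scott
--      2,Adams

-- 3. external table definition that matches that file
CREATE TABLE emp_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE oracle_loader
  DEFAULT DIRECTORY dir_1
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.dat')
);

SELECT * FROM emp_ext;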

View 2 Replies View Related

Server Utilities :: Data Migration Using Datapump?

May 11, 2011

I got an assignment to create an Oracle 11g database. I will be provided the full Data Pump export dump of an Oracle 10g database on Linux, and I need to import it into the 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the database with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
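A hedged sketch of one way to find out what the dump expects before importing it: the SQLFILE pass writes the DDL to a text file without executing anything, so the tablespace and user names can be read from it first (file names and the remap target below are placeholders):

REM first pass: extract the DDL only, to see which tablespaces/users the dump references
impdp system/password directory=DATA_PUMP_DIR dumpfile=full10g.dmp sqlfile=full10g_ddl.sql full=y

REM second pass: the real import, remapping any tablespaces you chose not to create
impdp system/password directory=DATA_PUMP_DIR dumpfile=full10g.dmp logfile=full10g_imp.log full=y remap_tablespace=OLD_TS:USERS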

View 28 Replies View Related

Export/Import/SQL Loader :: Can Grants Assigned To Role X Be Transferred To Role Y Via Datapump Import?

Oct 18, 2013

I have a user named 'Rob', and this user has been assigned a role 'MY_SRC_ROLE'. I developed a table under the Rob schema and granted access to this table via the role: GRANT DELETE, INSERT, SELECT, UPDATE ON rob.emp TO MY_SRC_ROLE; I have 100 more users and they have been granted the role 'MY_SRC_ROLE', so these 100 users can now access the emp table via the role without any issues. I then took a Data Pump export and performed a Data Pump import on the target server, which is also HP-UX with 11.2.0.3.

On the target server I have a user 'JACK' and a role called 'MY_WORK_ROLE'; 5000 users have been granted 'MY_WORK_ROLE' on that server. I used the remap tablespace and remap schema clauses in the Data Pump import script. Once I performed the import, thanks to the schema remap I can see that JACK now owns the table 'emp'; however, the grants are still not there. I tried searching Google and the Oracle documentation for a way to remap role grants during a Data Pump import, but I couldn't find supporting syntax. Can I assume Data Pump import is not capable of handling this particular scenario? I was able to do it by manipulating the SQLFILE output and replacing the role name in it, but I am looking for a solution within Data Pump itself. How can grants assigned to role 'X' be transferred to role 'Y' via Data Pump import?
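Data Pump has no REMAP_* transform for role names, so the SQLFILE route mentioned above is the usual workaround. A hedged sketch of that approach (file names are placeholders):

# extract only the object grants from the dump into an editable script
impdp jack/password directory=DATA_PUMP_DIR dumpfile=src.dmp sqlfile=grants.sql include=OBJECT_GRANT remap_schema=ROB:JACK

# edit grants.sql, replacing MY_SRC_ROLE with MY_WORK_ROLE, then run it
sqlplus jack/password @grants.sql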

View 2 Replies View Related

Export/Import/SQL Loader :: Export Data Pump On Remote Location

Oct 11, 2012

I have one prod server (11.1.0.6.0) on Windows 2003 R2 64-bit.

Server Name (PRODDB)

I do not have access to that prod server. I want to take a Data Pump export from my client machine, and due to a space issue on the prod server I want to keep the dump file on my client machine itself. I can take a traditional export and keep the dump file on my client machine, but I do not know how to achieve the same via Data Pump.

How do I generate the dump file on the client machine itself via Data Pump?

View 18 Replies View Related

Export/Import/SQL Loader :: Export Data Program (ociuldr) Cannot Run In 64-bit Win2008 Environment

Jun 29, 2012

The export data program "ociuldr" cannot run in a 64-bit Windows 2008 environment. Where can I download a 64-bit version of the "ociuldr" program? I have read some articles mentioning that ociuldr.exe needs to be recompiled into a 64-bit version. Finally, I also want to ask about the import data program, SQL*Loader ("sqlldr.exe"): can it run in a 64-bit environment, and where can I download a 64-bit version of the "sqlldr" program?

View 1 Replies View Related

PL/SQL :: Loading Data From Local Directory

Jan 22, 2013

I have a requirement where I receive .dat files and place them in a local directory. Is there a process/method by which the data in the .dat files can be read into my Oracle external tables? Any link to an example with clear steps?

View 9 Replies View Related

SQL & PL/SQL :: Read Blob Data From Database Into Directory?

Feb 12, 2010

I can store my video in my database, but I cannot write it back out to a file using this procedure; it fails with:

ORA-29285: file write error

CREATE OR REPLACE PROCEDURE Extract_bfile
(p_id IN NUMBER)
IS
vblob BFILE;
vstart NUMBER := 1;
bytelen NUMBER := 32000;
len NUMBER;

[code]....
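ORA-29285 usually means the file was opened in text mode or a chunk larger than 32,767 bytes was written. A hedged sketch of a working pattern for writing a BLOB out through UTL_FILE, assuming a directory object VIDEO_DIR and a table my_videos(id, video_blob); all of these names are placeholders, not the original poster's objects:

CREATE OR REPLACE PROCEDURE extract_blob (p_id IN NUMBER) IS
  v_blob   BLOB;
  v_file   UTL_FILE.FILE_TYPE;
  v_buffer RAW(32767);
  v_amount BINARY_INTEGER := 32767;
  v_pos    INTEGER := 1;
  v_len    INTEGER;
BEGIN
  SELECT video_blob INTO v_blob FROM my_videos WHERE id = p_id;
  v_len := DBMS_LOB.GETLENGTH(v_blob);

  -- open in binary mode ('wb') with the maximum line size
  v_file := UTL_FILE.FOPEN('VIDEO_DIR', 'video_' || p_id || '.mp4', 'wb', 32767);

  WHILE v_pos <= v_len LOOP
    DBMS_LOB.READ(v_blob, v_amount, v_pos, v_buffer);
    UTL_FILE.PUT_RAW(v_file, v_buffer, TRUE);   -- flush each chunk as it is written
    v_pos := v_pos + v_amount;
  END LOOP;

  UTL_FILE.FCLOSE(v_file);
END extract_blob;
/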

View 4 Replies View Related

Application Express :: 4.1 - Active Directory Data

Aug 22, 2013

I have one question: is there any way to get some user data from Active Directory? I already have an authentication scheme which interacts with AD, but now I need to get the e-mail address of the user who logs into the application. Our APEX version is 4.1.

View 2 Replies View Related

Data Guard :: Archivelog Source Directory In Standby Database

Jul 15, 2013

I created a standby database yesterday and everything is working correctly. I need to make some tweaks, however, and one of those is the directory in which the standby database expects the archivelogs to be.

I found that the archivelogs were being shipped to $ORACLE_HOME/dbs and named arch*.arc. Interestingly, a log switch ships the archivelog minus the "arch" at the front, and all archivelogs now do not have that format. I moved the archivelogs to the correct location and renamed them. I have been able to set the correct location using standby_archive_dest and can see that the archivelogs are shipped there.

Also, v$archived_log shows the correct path and filename. The problem I have is that when I come to apply the archivelogs, the standby still seems to think that they should be in $ORACLE_HOME/dbs and named with "arch" at the front. What parameter do I need to change to tell Oracle the correct path and filename to use when applying the archivelogs?
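A hedged sketch of the parameters usually involved on the standby side; the path is a placeholder, and logs that were moved or renamed by hand can be made known to the controlfile with REGISTER LOGFILE:

-- on the standby: make the local archive destination explicit
ALTER SYSTEM SET log_archive_dest_1 =
  'LOCATION=/u01/app/oracle/arch VALID_FOR=(ALL_LOGFILES,ALL_ROLES)' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_format = '%t_%s_%r.arc' SCOPE=SPFILE;

-- tell the standby about an archivelog sitting at the new path
ALTER DATABASE REGISTER LOGFILE '/u01/app/oracle/arch/1_123_456789.arc';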

View 9 Replies View Related

Export/Import/SQL Loader :: Export And Import Of Data Not Table And Data?

Sep 11, 2012

Export and import of data in Oracle Forms... I have created two buttons, one for export, whose trigger is like this:

declare
  alrt number;
  v_directory varchar2(200) := 'c:\backup'; -- that is, if the C drive is not the drive Windows is installed on
  path varchar2(100) := 'back_up'
    || to_char(sysdate, 'dd_mm_yyyy-hh24_mi_ss');
  v_exp varchar2(200) := 'exp hamada/hamada2013@orcl file = '
    || v_directory
    || '\'
    || path
    || '.dmp';
[code]....

This code works, but it exports not only the data but also the creation of the table. For example, I do an export and everything is good; I find the .dmp in the backup folder. But when I delete all data from my app and try to import this .dmp, it shows me an error saying that the table phone is already created. How can I export just the data of phone, not the table creation plus data? Or how can I import just the data from this .dmp?
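With the classic imp utility, the usual way to load data into tables that already exist is to tell it to ignore the creation errors. A hedged one-line sketch (the credentials follow the original post; the file name is a placeholder):

imp hamada/hamada2013@orcl file=c:\backup\back_up_01_01_2013-10_00_00.dmp full=y ignore=y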

View 3 Replies View Related

Oracle Data Pump - Export Data From Schemas Or Tables

Oct 11, 2012

I need to export only the data from schemas or tables; how do I do that with Oracle Data Pump? When we use the SCHEMAS parameter, this exports the whole schema, not only the data, right?
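A hedged sketch of the CONTENT parameter, which controls whether metadata, data, or both go into the dump file (credentials, schema and table names are placeholders):

# rows only, no DDL, for a whole schema
expdp system/password directory=DATA_PUMP_DIR dumpfile=hr_data.dmp schemas=HR content=DATA_ONLY

# rows only for specific tables
expdp system/password directory=DATA_PUMP_DIR dumpfile=emp_data.dmp tables=HR.EMPLOYEES,HR.DEPARTMENTS content=DATA_ONLY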

View 7 Replies View Related

How To Convert Traditional Exp To Datapump

Sep 6, 2010

How do I convert the following traditional exp/imp commands to Data Pump?

1. exp custom/custom FILE=expgt.dmp TABLES=raj.GENERAL_TABLE QUERY = \"WHERE to_date\(lchg_time\) \= \(select db_date from mtda\) \" indexes=n statistics=none

2. exp custom/custom FILE=expdata.dmp TABLES=raj.BAL_TABLE,raj.GEN_TABLE,raj.CREDITS_TABLE QUERY = \"WHERE to_date\(lchg_time\) \>= \(select db_date from mtda\) \" indexes=n statistics=none

In the second command, several tables go to the same dump file with the same WHERE clause.
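A hedged sketch of what the Data Pump equivalents might look like: QUERY applies the same predicate to every table listed, and EXCLUDE replaces indexes=n / statistics=none. A parameter file avoids most of the shell escaping (directory and file names are placeholders):

# expgt.par
directory=DATA_PUMP_DIR
dumpfile=expgt.dmp
tables=raj.GENERAL_TABLE
query="WHERE to_date(lchg_time) = (select db_date from mtda)"
exclude=INDEX
exclude=STATISTICS

# expdata.par
directory=DATA_PUMP_DIR
dumpfile=expdata.dmp
tables=raj.BAL_TABLE,raj.GEN_TABLE,raj.CREDITS_TABLE
query="WHERE to_date(lchg_time) >= (select db_date from mtda)"
exclude=INDEX
exclude=STATISTICS

expdp custom/custom parfile=expgt.par
expdp custom/custom parfile=expdata.par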

View 3 Replies View Related

Datapump Use In Remote Servers

Nov 19, 2008

We have eight servers. One of them is for central management; the other seven are for development and test databases. Every night we start a process which exports database schemas (without data) onto the central server. An example of the process we run is below:

cd /data1/backup
mknod exp_pipe_oracle1 p
compress<exp_pipe_oracle1>dev.dmp.Z&
exp user/****@dev file=exp_pipe_oracle1 owner=schema1,schema2,schema3 statistics=none rows=n log=dev.log

As you can see, we use @dev to connect to the remote database, but the dump file is created on the local (central) server. We are planning to use Data Pump instead of classic export in the future, but I couldn't find a solution that could take a database export from a remote server and create the dump file locally. I have looked at the NETWORK_LINK parameter, but it doesn't seem to work for our case.

Is it possible to back up a remote database with Data Pump and have it create the dump file locally? (Of course we could use solutions such as NFS, but we would really prefer not to, if Data Pump has the ability to back up remotely.)
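For reference, a hedged sketch of how NETWORK_LINK is usually set up for exactly this layout: expdp connects to the central database, pulls the data over a database link from the remote instance, and writes the dump file to a directory object on the central server (link, schema, credential and file names are placeholders; CONTENT=METADATA_ONLY corresponds to rows=n in the old script):

-- on the central database, once per remote instance
CREATE DATABASE LINK dev_link CONNECT TO user IDENTIFIED BY "****" USING 'dev';

-- run on the central server; the dump file lands in the central server's DATA_PUMP_DIR
expdp user/****@central network_link=dev_link schemas=schema1,schema2,schema3 content=METADATA_ONLY directory=DATA_PUMP_DIR dumpfile=dev_meta.dmp logfile=dev_meta.log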

View 5 Replies View Related

SQL & PL/SQL :: Datapump Field Length

May 21, 2012

I have a problem reading data from a dmp file into a target table.

I've created a dmp file using:

CREATE TABLE table_name
ORGANIZATION EXTERNAL
(TYPE oracle_datapump
DEFAULT DIRECTORY directory_name
LOCATION ('file_name.dmp')
) AS SELECT col1, col2, col3 from source_table;

[Code]...

The file is created via a db_link from the target DB to the remote DB, and then the data are inserted into the target DB table target_table.

desc target_table
col1 varchar2(20)
col2 number
col3 number

There are no values longer than 20 characters in the source table, but when I run insert into target_table(col1, col2, col3) as select col1, col2, col3 from source_table; I get ORA-12899: value too large for column "target_table"."col1" (actual: 25, maximum: 20).

I guess it has something to do with how the oracle_datapump driver stored the data in the dmp file. When I run select col1 from source_table where length(col1) > 20; I get two values which clearly are not longer than 20 characters.

Selected values are:

748473358           
693197674           

Where is the bug hidden, or is there any "normal" workaround?
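A hedged diagnostic sketch: the character count and the byte count can differ (multi-byte characters, trailing blanks, or BYTE vs CHAR column semantics), and DUMP shows exactly what is stored, so comparing them on the offending rows usually points at the cause:

-- byte length vs character length vs raw contents of the suspect values
SELECT col1,
       LENGTH(col1)  AS char_len,
       LENGTHB(col1) AS byte_len,
       DUMP(col1)    AS raw_dump
FROM   source_table
WHERE  LENGTHB(col1) > 20;

-- if the byte length is the problem, character-based semantics on the target may be enough
-- ALTER TABLE target_table MODIFY (col1 VARCHAR2(20 CHAR));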

View 17 Replies View Related

Server Utilities :: Datapump API?

Jun 15, 2011

I am trying to export the database using the Data Pump API, DBMS_DATAPUMP. I want to export the entire database excluding some schemas.

My question is: how do I exclude some schemas using the Data Pump API?
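A hedged sketch of one way this is often done with DBMS_DATAPUMP: run a schema-mode job and give SCHEMA_EXPR a NOT IN list. The directory, file names and excluded schema names are placeholders, and the exact filters accepted can vary by version, so treat this as a starting point rather than a definitive answer:

DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'EXP_NO_APP_SCHEMAS');

  DBMS_DATAPUMP.ADD_FILE(h, 'no_app_schemas.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'no_app_schemas.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- export every schema except the ones listed here
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'NOT IN (''SCOTT'', ''HR'')');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
END;
/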

View 1 Replies View Related

Export Using Data Pump

Jul 2, 2010

I would like to ask if there is a possibility, using the Data Pump export utility, to export my full database plus some partitioned tables by selecting specific partitions. Can I have all these criteria in only one Data Pump export? If yes, is there any example?
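FULL and TABLES are mutually exclusive modes within a single expdp run, so this would normally be two jobs: one full export and one table-mode export naming the partitions. A hedged sketch of the partition-level part (owner, table and partition names are placeholders):

# partition-level export, run separately from the FULL=Y job
expdp system/password directory=DATA_PUMP_DIR dumpfile=sales_q1.dmp logfile=sales_q1.log tables=SH.SALES:SALES_Q1_2012,SH.SALES:SALES_Q2_2012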

View 2 Replies View Related

PL/SQL :: Export Table Data Into CSV

Jan 15, 2013

I'm stuck with a requirement where I have to export the data of all tables in a schema into separate, table-specific CSV files.

I tried using SQL*Plus SPOOL, but it echoes the executed queries on top of the generated CSV. Also, I'm not getting the headers (because I'm using pagesize 0).
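A hedged sketch of SQL*Plus settings that usually suppress the echoed statements while still producing one header line per file; the column lists and spool path are placeholders, and in a real script this would be generated per table from USER_TAB_COLUMNS:

SET ECHO OFF
SET FEEDBACK OFF
SET TERMOUT OFF
SET VERIFY OFF
SET TRIMSPOOL ON
SET PAGESIZE 0
SET LINESIZE 4000

SPOOL /tmp/employees.csv
-- emit the header row explicitly, then the data
SELECT 'EMPLOYEE_ID,LAST_NAME,SALARY' FROM dual;
SELECT employee_id || ',' || last_name || ',' || salary FROM employees;
SPOOL OFF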

View 4 Replies View Related

SQL & PL/SQL :: Export Table Data

Oct 9, 2010

I am new to Oracle and I have a table with images stored in it. Now I want to export only the data to another server without creating the table again, because the table is already available on the other server. Can anyone give me a good example of how to do this?
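A hedged sketch using Data Pump, which handles LOB columns such as images; the data-only dump is then loaded into the existing table by appending rows (credentials, schema and table names are placeholders):

# on the source server
expdp system/password directory=DATA_PUMP_DIR dumpfile=images.dmp tables=APP.IMAGE_TABLE content=DATA_ONLY

# copy images.dmp to the target server's DATA_PUMP_DIR, then
impdp system/password directory=DATA_PUMP_DIR dumpfile=images.dmp tables=APP.IMAGE_TABLE content=DATA_ONLY table_exists_action=APPEND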

View 4 Replies View Related

Ways Provided For Database Migration Except Datapump?

Jul 2, 2013

What are the ways provided by Oracle for database migration, apart from Data Pump? And which one is recommended?

View 1 Replies View Related

Datapump - Importing With Inconsistent Table Structure

Apr 30, 2013

I am receiving two large export files from a vendor, so I have no control over the contents. I need to import these into our database. The two export files are very similar, except that one has slightly different columns in it. So, export file 1 may have a table:

COLUMN_A
COLUMN_B
COLUMN_C

The second file may have:

COLUMN_A
COLUMN_B
COLUMN_D

At the destination, I have a table that has:

COLUMN_A
COLUMN_B
COLUMN_C
COLUMN_D

Is there a parameter that would let me interchangeably import either (or both) files into this destination table? This is my first attempt at Data Pump, but I know the original import has caused me issues with this. I am not sure if the same limitations exist. Will the missing columns cause it to fail?

View 3 Replies View Related

Oracle Datapump - Table Structure Change?

Oct 19, 2010

We have a daily partitioned table, and for backup we are using Data Pump (expdp). Our policy is to drop each partition after it has been backed up (archiving).

We have archived dump files for one year. A few days back a developer made a change to the table structure: they added one new column to the table.

Now we are unable to restore old partitions. Is there a way to restore a partition if a new column has been added to / dropped from the current table?
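One hedged workaround is to import the old partition into a staging table with its original shape and then copy the rows into the changed table, supplying a value for the new column. A sketch using REMAP_TABLE (available from 11.1; all names are placeholders, and depending on version a PARTITION_OPTIONS setting may also be needed):

# restore the archived partition into a separate staging table instead of the live one
impdp system/password directory=DATA_PUMP_DIR dumpfile=part_20120101.dmp tables=APP.SALES:P20120101 remap_table=APP.SALES:SALES_RESTORE

-- then copy the rows across, defaulting the column that did not exist back then
INSERT INTO app.sales (col1, col2, new_col)
SELECT col1, col2, NULL FROM app.sales_restore;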

View 4 Replies View Related






