Server Utilities :: Oracle Directory And Datapump
Mar 31, 2010
We have three instances on our RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on instances 2 and 3 it fails with an error relating to the directory.
Error -
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
Is my only option to use shared storage space?
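A Data Pump directory object is just a name-to-path mapping stored in the data dictionary, so it is visible to all three RAC instances; the failure on instances 2 and 3 is most likely because the operating-system path behind it only exists (or is only writable) on node 1. A minimal sketch, assuming a hypothetical shared mount point /u02/dp_shared visible from every node:
CREATE OR REPLACE DIRECTORY dp_shared AS '/u02/dp_shared';
GRANT READ, WRITE ON DIRECTORY dp_shared TO scott;
expdp scott/tiger DIRECTORY=dp_shared DUMPFILE=full.dmp LOGFILE=full.log SCHEMAS=scott
Creating the same local path on every node also works, but then the dump file lands on whichever node happens to run the job, so shared storage (NFS, ACFS, etc.) is the usual choice.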
View 6 Replies
Jun 15, 2011
I am trying to export the database using the Data Pump API, DBMS_DATAPUMP. I want to export the entire database, excluding some schemas.
My question is: how do I exclude some schemas using the Data Pump API?
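One commonly used pattern with the API is a schema-mode job whose SCHEMA_EXPR filter lists the schemas to leave out; this assumes the calling account has the EXP_FULL_DATABASE role, and the schema, file and directory names below are examples only:
DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- Schema-mode export; the filter selects every schema NOT in the list.
  l_handle := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.add_file(handle => l_handle, filename => 'all_but_hr_scott.dmp',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file);
  DBMS_DATAPUMP.add_file(handle => l_handle, filename => 'all_but_hr_scott.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);
  -- Example exclusion list; adjust to your own schemas.
  DBMS_DATAPUMP.metadata_filter(handle => l_handle,
                                name   => 'SCHEMA_EXPR',
                                value  => 'NOT IN (''HR'', ''SCOTT'')');
  DBMS_DATAPUMP.start_job(l_handle);
  DBMS_DATAPUMP.wait_for_job(l_handle, l_state);
END;
/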
View 1 Replies
View Related
May 11, 2011
I got an assignment to create an Oracle 11g database. I will be provided with the full Data Pump export dump of an Oracle 10g database on Linux, and I need to import it into an 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the database with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
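Since the contents of the dump are unknown, a hedged first step is to extract just the DDL with SQLFILE and read off the tablespaces and users it expects; file and directory names below are placeholders:
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=prod10g.dmp SQLFILE=prod10g_ddl.sql FULL=Y NOLOGFILE=Y
Then either pre-create the missing tablespaces or remap them during the real import, e.g. REMAP_TABLESPACE=old_ts:USERS.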
View 28 Replies
View Related
Jan 2, 2012
As part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump does not support XML objects. Is there any other way to export the full database, or any option to export XML objects using Data Pump?
View 2 Replies
View Related
Jan 13, 2012
Currently we are using the "exp" and "imp" utilities to unload from production and load into the Dev server. While importing, we follow the steps below:
(1) Load only the data [by specifying INDEXES=N in the par file]
(2) Unlock statistics
(3) Load indexes and other objects [by specifying ROWS=N]
After doing these steps, the data, indexes and other objects are all loaded. To verify the indexes, we check DBA_INDEXES.
DBA_INDEXES :
-------------
OWNER INDEX_NAME TABLE_NAME STATUS LAST_ANALYZED
----- ---------- ---------- ------ -------------
MYSCH CP_INDEX_1 CP_TABLE_1 VALID 14/JAN/12
Questions:
(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without building the indexes from scratch?
(2) I am trying to replace 'exp' and 'imp' with the Data Pump utilities, but I am confused about the parameters to use:
(a) Can I load both data and metadata at the same time (using CONTENT=ALL)?
(b) I am planning to implement this in two steps (sketched below):
first load only metadata using CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE,
then load the data with CONTENT=DATA_ONLY.
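A hedged sketch of those two passes (credentials, directory and dump file names are placeholders):
impdp system/password DIRECTORY=dp_dir DUMPFILE=prod.dmp CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE LOGFILE=imp_meta.log
impdp system/password DIRECTORY=dp_dir DUMPFILE=prod.dmp CONTENT=DATA_ONLY LOGFILE=imp_data.log
With CONTENT=DATA_ONLY the default action for existing tables is APPEND, so the second pass loads into the tables created by the first.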
View 1 Replies
View Related
Jan 3, 2012
I am trying to use the Data Pump tool to migrate a 10g database to 11g. Everything works fine except for the "nameless" (system-generated) check constraints.
View 7 Replies
View Related
Sep 8, 2013
Is it possible to export data to a directory other than the default Data Pump directory (DATA_PUMP_DIR) on Linux?
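Yes, any directory object you create can be used; a hedged sketch with a placeholder path:
CREATE OR REPLACE DIRECTORY exp_dir AS '/u01/exports';
GRANT READ, WRITE ON DIRECTORY exp_dir TO scott;
expdp scott/tiger DIRECTORY=exp_dir DUMPFILE=scott.dmp LOGFILE=scott.log SCHEMAS=scott
The path must exist on the database server and be writable by the oracle OS user.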
View 5 Replies
View Related
Oct 25, 2010
I have a requirement to load .dmp files into existing staging tables, and there is a package that loads the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
--Create the directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';
--grant Access to the User
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
--Script to import
DECLARE
l_dp_handle1 NUMBER;
BEGIN
l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
[code]...
Errors
ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
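ORA-31634 is typically raised because a Data Pump job with the same name already exists, often an orphaned master table left behind by an earlier failed run. A hedged sketch that avoids the collision by passing a unique job name (file and directory names are assumptions):
DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- Unique job name per run, so reruns never collide with a leftover master table.
  l_handle := DBMS_DATAPUMP.open(operation => 'IMPORT', job_mode => 'FULL',
                                 job_name  => 'STG_IMP_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));
  DBMS_DATAPUMP.add_file(handle => l_handle, filename => 'staging.dmp',
                         directory => 'TEST_DIR',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file);
  DBMS_DATAPUMP.set_parameter(l_handle, 'TABLE_EXISTS_ACTION', 'TRUNCATE');
  DBMS_DATAPUMP.start_job(l_handle);
  DBMS_DATAPUMP.wait_for_job(l_handle, l_state);
END;
/
Dropping the leftover master table (it shows up in DBA_DATAPUMP_JOBS with a NOT RUNNING state) also clears the error.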
View 6 Replies
View Related
Mar 18, 2013
I have a Data Pump export file which was created in schema mode.
I have to import the tables into a new database, where I have to use the REMAP_SCHEMA parameter.
Additionally, I would like to add a prefix to the table names.
For example:
original tablename: THE_TABLE
Name after import: IMP_THE_TABLE
Is there a way to add a prefix while using Datapump Import?
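Data Pump has no wildcard rename, but on 11g the REMAP_TABLE parameter can rename individual tables during import, and it can be combined with REMAP_SCHEMA. The table name comes from the example above; everything else is a placeholder:
impdp system/password DIRECTORY=dp_dir DUMPFILE=schema.dmp REMAP_SCHEMA=old_user:new_user REMAP_TABLE=old_user.THE_TABLE:IMP_THE_TABLE LOGFILE=imp.log
One REMAP_TABLE entry is needed per table, so for many tables it is usually easier to generate the parameter list (or rename the tables after import) with a small SQL script.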
View 5 Replies
View Related
Feb 7, 2011
I'm trying to deploy a schema using the Data Pump API. The user the schema is deployed from has the CREATE USER privilege granted directly (not through a role), but I got an insufficient-privileges error.
Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE".
But the user gets created successfully when I run the CREATE statement manually. I created the user manually and ran the deployment procedure again, and got the error below for role grants.
Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is:
GRANT "EXP_FULL_DATABASE" TO "SCOTT1"
The user has EXP_FULL_DATABASE with the ADMIN option and IMP_FULL_DATABASE with the ADMIN option as direct grants. Which privileges does the user need to deploy the schema successfully?
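A hedged way to check what the importing session actually sees, rather than what has been granted somewhere:
SELECT * FROM session_privs WHERE privilege = 'CREATE USER';
SELECT granted_role, admin_option FROM user_role_privs;
SELECT privilege, admin_option FROM user_sys_privs;
If the deployment runs inside a definer's-rights stored procedure, keep in mind that roles are disabled there, so anything that only arrives via EXP_FULL_DATABASE/IMP_FULL_DATABASE would have to be granted directly to the user.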
View 1 Replies
View Related
May 23, 2012
I am using Data Pump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints, import the data and re-enable the constraints, but in that case the temp tablespace keeps filling up and I run out of space. Is there any method to do a full import including constraints and indexes?
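For reference, a hedged sketch of the one-pass network-mode import, which brings constraints and indexes along by default (link, schema and directory names are placeholders; the directory is only needed for the log file):
impdp target_user/password NETWORK_LINK=source_db_link SCHEMAS=app_owner DIRECTORY=dp_dir LOGFILE=net_imp.log
If this still exhausts temp space, adding temp files to the temporary tablespace for the duration of the import is usually simpler than splitting the job into metadata and data passes.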
View 7 Replies
View Related
Apr 5, 2013
The Data Pump estimate is 9.902 GB, but when I check the size of the .dmp file it shows 1.44 GB.
Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;;
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB
Why does Data Pump estimate so much more than the actual size?
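The BLOCKS method estimates from the blocks allocated to the segments, so tables with lots of free space below the high-water mark inflate the figure well beyond what actually ends up in the dump. A hedged alternative, assuming statistics are reasonably current, is to estimate from optimizer statistics instead:
expdp system/password DIRECTORY=DP_DIR DUMPFILE=expdp_test.dmp FULL=Y EXCLUDE=statistics ESTIMATE=STATISTICS
ESTIMATE_ONLY=Y can also be used on its own when only the size figure is needed, without writing a dump file.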
View 8 Replies
View Related
Aug 23, 2012
Expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER
[code]...
Then my export hangs. I checked the alert log and found nothing, then killed the job and reran it, but the same thing happens; when I check the status it says EXECUTING.
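While it looks hung, a hedged way to see what the job is really doing (job names will differ on your system):
SELECT owner_name, job_name, state, attached_sessions FROM dba_datapump_jobs;
SELECT sid, serial#, sofar, totalwork, units, message
  FROM v$session_longops
 WHERE opname LIKE '%EXPORT%' AND sofar <> totalwork;
Attaching from another window (expdp system/password ATTACH=<job_name>) and typing STATUS at the prompt shows the object the worker is currently on.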
View 15 Replies
View Related
Jun 9, 2010
I have a problem with DBMS_DATAPUMP.metadata_filter. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i, ...). The list is dynamic, so I do NOT want to use:
DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (''a'', ''b'', ...)',
object_type => NULL);
In my_export_table there is the list:
CREATE TABLE my_export_table
(
EXPORT_OBJECT_NAME VARCHAR2(50 BYTE)
)
Now I'm trying to use this form:
DBMS_DATAPUMP.metadata_filter (handle => h1,
NAME => 'NAME_EXPR',
VALUE => 'IN (SELECT a.export_object_name FROM my_export_table a, user_objects b WHERE a.export_object_name = b.object_name AND b.object_type = ''TABLE'')',
object_type => NULL
);
but it results in an error:
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
[code]...
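One hedged workaround: the filter text is evaluated by the Data Pump worker, in whose context my_export_table may not resolve, so build the literal IN-list in your own session and pass it as plain text. This sketch assumes a table-mode job and the DATA_PUMP_DIR directory; for a very long list the CLOB overload of metadata_filter avoids the VARCHAR2 limit:
DECLARE
  h       NUMBER;
  l_list  VARCHAR2(32767);
  l_state VARCHAR2(30);
BEGIN
  -- Build 'A','B','C',... from the driving table in the calling session.
  FOR r IN (SELECT a.export_object_name
              FROM my_export_table a, user_objects b
             WHERE a.export_object_name = b.object_name
               AND b.object_type = 'TABLE') LOOP
    l_list := l_list || CASE WHEN l_list IS NULL THEN '' ELSE ',' END
                     || '''' || r.export_object_name || '''';
  END LOOP;

  h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'TABLE');
  DBMS_DATAPUMP.add_file(h, 'tables.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.add_file(h, 'tables.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
  DBMS_DATAPUMP.metadata_filter(handle => h, name => 'NAME_EXPR',
                                value  => 'IN (' || l_list || ')');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.wait_for_job(h, l_state);
END;
/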
View 1 Replies
View Related
Feb 12, 2013
Last week we migrated our Oracle database from 9i to 10g using the imp utility, but now I am facing one small issue. We still have our old live database, and when we run
"SELECT * FROM V$PARAMETER WHERE NAME LIKE 'utl%'"
to check the parameter name and value, the output shows the name utl_file_dir and the value E:\RAB. In our new, migrated database no value like E:\RAB is shown. I have recreated that directory in the new database, but the issue still persists.
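If the old database relied on the utl_file_dir initialization parameter (rather than a directory object), exp/imp does not carry it over; it has to be set again on the new instance and needs a restart. A hedged sketch, with the path taken from the old system:
ALTER SYSTEM SET utl_file_dir='E:\RAB' SCOPE=SPFILE;
SHUTDOWN IMMEDIATE
STARTUP
On 10g a directory object (CREATE DIRECTORY ... AS 'E:\RAB') plus the appropriate grants is usually the cleaner replacement for utl_file_dir.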
View 3 Replies
View Related
Sep 28, 2010
How do I set up an Oracle server and "import" my old databases just from the copied directory structure d:\oracle\product\... ("cold backup") into the new install? I was trying to set up a new server by shutting down the DB and just copying back my data/control/log/*.ora files, which doesn't seem to work.
I can see the old table structures in the local SQL*Plus console, but the databases are not reachable over the network.
Also, Enterprise Manager is not able to start up the DBs; it just shows a running listener.
View 3 Replies
View Related
May 1, 2013
I've Googled several times but the results are 50% null ^_^
How can I use Active Directory to route client requests to the Oracle database server, so that I can do basic operations on the Oracle DB remotely?
As shown here [URL]........
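If the goal is Oracle Net name resolution through Active Directory (clients look up the connect descriptor in AD instead of a local tnsnames.ora), the client-side pieces are roughly the sqlnet.ora/ldap.ora below; host and domain names are placeholders, and the AD schema must first be extended and the services registered with Oracle Net Configuration Assistant:
# sqlnet.ora
NAMES.DIRECTORY_PATH = (LDAP, TNSNAMES, EZCONNECT)
# ldap.ora
DIRECTORY_SERVERS = (adserver.example.com:389:636)
DEFAULT_ADMIN_CONTEXT = "DC=example,DC=com"
DIRECTORY_SERVER_TYPE = AD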
View 1 Replies
View Related
Oct 12, 2012
How do I access Active Directory (Microsoft Windows Server 2003) from Oracle Database 11g?
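If "access" means querying AD from inside the database, the DBMS_LDAP package can bind and search it; a hedged sketch with placeholder host, bind account and base DN (on 11g the database user also needs a network ACL allowing that host and port):
SET SERVEROUTPUT ON
DECLARE
  l_session DBMS_LDAP.session;
  l_attrs   DBMS_LDAP.string_collection;
  l_message DBMS_LDAP.message;
  l_retval  PLS_INTEGER;
BEGIN
  DBMS_LDAP.use_exception := TRUE;
  l_session := DBMS_LDAP.init(hostname => 'ad.example.com', portnum => 389);
  l_retval  := DBMS_LDAP.simple_bind_s(ld => l_session,
                                       dn => 'CN=svc_oracle,CN=Users,DC=example,DC=com',
                                       passwd => 'secret');
  l_attrs(1) := 'cn';
  l_retval := DBMS_LDAP.search_s(ld => l_session,
                                 base => 'DC=example,DC=com',
                                 scope => DBMS_LDAP.SCOPE_SUBTREE,
                                 filter => '(objectClass=user)',
                                 attrs => l_attrs,
                                 attronly => 0,
                                 res => l_message);
  DBMS_OUTPUT.put_line('Entries found: ' || DBMS_LDAP.count_entries(l_session, l_message));
  l_retval := DBMS_LDAP.unbind_s(l_session);
END;
/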
View 1 Replies
View Related
Mar 8, 2012
Which two Active Directory services are stopped when you install Active Directory before Oracle 10g? I know the error message and why it happened; I just need to know the two services so I can start them again. I think it happened because I installed Active Directory first, so when I installed Oracle afterwards it stopped two services. The error message is:
Active Directory is missing binaries, please restart and try again
View 2 Replies
View Related
Sep 27, 2011
I have 2 Oracle servers: one with SUSE Linux (Oracle 10.2.0.4.0) and one with Windows 2003 Server x64 (Oracle 11.2.0.1.0).
I made a mistake by installing the 32-bit Oracle 11g on the x64 server. Nevertheless it worked for about half a year. Then my backups
with Data Pump stopped working. I changed SGA_TARGET down to 1024M and Data Pump worked again. Now I want to replace the Windows server and import the Data Pump schemas into the Linux server.
Exp: expdp SCHEMA1/****@db1 DIRECTORY=dmpdir DUMPFILE=SCHEMA1_EXPDAT.DMP LOGFILE=SCHEMA1_EXP.LOG VERSION=10.2.0.3 REUSE_DUMPFILES=YES
Imp: impdp SCHEMA1/**** DIRECTORY=dmpdir DUMPFILE=SCHEMA1_EXPDAT.DMP LOGFILE=SCHEMA1_IMP.LOG
Log:
..importing SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
..importing SCHEMA_EXPORT/SEQUENCE/SEQUENCE
[code]...
View 1 Replies
View Related
Oct 19, 2010
We have a daily partitioned table, and for backup we use Data Pump (expdp). Our policy is to drop each partition after it has been backed up (archived).
We have archived dump files for a year. A few days back a developer changed the table structure: they added one new column to the table.
Now we are unable to restore old partitions. Is there a way to restore a partition if a new column has been added to / dropped from the current table?
View 4 Replies
View Related
Mar 7, 2011
I'm importing data from SQL Server to Oracle. I used BCP to export the data from SQL Server. Below is the 1st record of table trlc from the csv file.
trlc.CSV
11032|100|Wman| | |2008-02-08| |
Using SQL Loader to import into Oracle:
TRLC table in Oracle database:
est_no varchar2(10) default ' '
right_no number(4) default 0
maj_auth varchar2(15) default ' '
weight varchar2(10) default ' '
idm_ht varchar2(8) default ' '
c_date date
P_tkt varchar2(5) default ' '
sqlldr user/pwd@db02 control=trlc.ctl log=trlc.log
trlc.ctl:
load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no,right_no,maj_auth,weight,idm_ht,c_date,P_tkt)
The rows get inserted successfully, but the result sets are different. For example, when I run 'select len(weight) from trlc;' in SQL Server, I get the length as 0, but when I run the equivalent select in the Oracle database, I get the length as 1. Also, the result set varies for the query below:
select * from trlc where weight=' ';
(SQL Server returns 1 row but Oracle returns no rows)
Do I need to specify any conversion for the weight field so that it accepts a ' ' value?
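A hedged tweak to the control file: apply a SQL expression to the weight column so an empty or blank field is stored as exactly one space (to match the SQL Server value), and give c_date an explicit mask for the YYYY-MM-DD values in the file. The NVL/RTRIM expression is an assumption about the behaviour you want, not a required fix:
load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no,
right_no,
maj_auth,
weight CHAR(10) "NVL(RTRIM(:weight), ' ')",
idm_ht,
c_date DATE "YYYY-MM-DD",
P_tkt)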
View 12 Replies
View Related
Feb 4, 2011
My database is running Oracle 10g. I have more than 25 Oracle directories which are used for batch/reporting jobs. I found that the read and write privileges are missing for 7 directories. I checked the last DDL time; it shows Jan 26th.
I want to know how to check which user ID revoked them.
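Unless auditing (or a DDL trigger) was already in place, the database keeps no record of who issued the REVOKE, so this can really only be answered going forward. A hedged setup, assuming AUDIT_TRAIL=DB and hypothetical directory names:
AUDIT GRANT DIRECTORY;   -- audits GRANT and REVOKE on directory objects from now on
SELECT username, obj_name, action_name, timestamp
  FROM dba_audit_trail
 WHERE obj_name IN ('BATCH_DIR1', 'REPORT_DIR1')
 ORDER BY timestamp;
For the change that already happened, mining the archived redo from around Jan 26th with LogMiner is about the only option, and only if those archives still exist.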
View 4 Replies
View Related
Jul 14, 2011
I have manually created a database using these steps:
1) Set ORACLE_SID=ORA10G (as an environment variable)
2) Go to F:\oracle\product\10.2.0\admin and create a new folder (ORA10G)
3) In ORA10G create the folders adump, bdump, cdump, dpdump, pfile, udump
4) In pfile, take a copy of the existing init.ora from my orcl database
5) Change this init.ora
6) Change the parameters
7) db_name, instance_name, control_files, background_dump_dest, user_dump_dest
8) Save this file as "init.ora"
9) Create a new folder in oradata: "ORA10G"
10) Create the service:
oradim -NEW -SID ORA10G -SYSPWD ORACLE -STARTMODE AUTO -SPFILE
11) Then I checked that the service was started in Control Panel > Administrative Tools > Services: OracleServiceORA10G - started
12) I shut down the previously started database
13) Connect from the SQL prompt:
F:\oracle\product\10.2.0\db_1\bin> set ORACLE_SID=ora10g
F:\oracle\product\10.2.0\db_1\bin> sqlplus
SQL*Plus: Release 10.2.0.5.0 - Production on Thu Jul 14 16:46:59 2011
Copyright (c) 1982, 2010, Oracle. All Rights Reserved.
Enter user-name: SYS AS SYSDBA
Enter password:
Connected to an idle instance.
14) Then create the spfile:
SQL> CREATE SPFILE FROM PFILE = 'F:\ORACLE\PRODUCT\10.2.0\ADMIN\ORA10G\PFILE\INIT.ORA';
File created.
15) Then I ran this command:
SQL> startup nomount
ORA-02778: Name given for the log directory is invalid
I have attached my orcl init.ora file and my new init.ora file here.
My orcl init.ora file, which I used as the starting point for this database (initorcl.ora):
orcl.__db_cache_size=385875968
orcl.__java_pool_size=4194304
orcl.__large_pool_size=4194304
[Code].....
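ORA-02778 on STARTUP NOMOUNT usually means one of the dump-destination directories named in the pfile does not exist or the path is malformed. A hedged sketch of the handful of init.ora entries to double-check; the paths below are assumptions based on the folders created in the steps above, not the actual file:
db_name='ORA10G'
instance_name='ORA10G'
control_files='F:\oracle\product\10.2.0\oradata\ORA10G\control01.ctl'
background_dump_dest='F:\oracle\product\10.2.0\admin\ORA10G\bdump'
user_dump_dest='F:\oracle\product\10.2.0\admin\ORA10G\udump'
core_dump_dest='F:\oracle\product\10.2.0\admin\ORA10G\cdump'
Each directory must already exist on disk, with backslashes intact, before the instance will start.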
View 19 Replies
View Related
Oct 31, 2013
I have a dump file (exported with exp from Oracle 10.2) and I want to import it into Oracle 11.2.0.1 (Windows 7 32-bit). The import works fine until I get a crash message from Windows:
..............................................................................
Faulting application name: imp.exe, version: 0.0.0.0, time stamp: 0x4bb5f1da
Faulting module name: ntdll.dll, version: 6.1.7601.18247, time stamp: 0x521ea91c
Exception 0xc0000005
Fault offset: 0x0001f9c3
Faulting process id: 0x10b4
Faulting application start time: 0x01ced5c178550439
Faulting application path: D:\app\ionut_trasca\product\11.2.0\dbhome_1\bin\imp.exe
Faulting module path: C:\Windows\SYSTEM32\ntdll.dll
Report Id: fa06ee43-41b8-11e3-82a9-001fe1f2e49e
..................................
View 1 Replies
View Related
May 16, 2012
Where do I find the Oracle MetaLink (My Oracle Support) note to resolve the ORA-00600 and ORA-07445 errors on the Oracle website?
View 2 Replies
View Related
Oct 3, 2013
I write a file on the database server as follows:
CREATE OR REPLACE DIRECTORY TEST_DIR AS 'c:\temp';
GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser;
SQL> select * from all_directories;

OWNER   DIRECTORY_NAME   DIRECTORY_PATH
------  ---------------  --------------
SYS     TEST_DIR         c:\temp
SYS     PUBLIC_DIR       E:\PUBLIC_dir
and then
PROCEDURE run_query(p_sql IN VARCHAR2
,p_dir IN VARCHAR2
,p_header_file IN VARCHAR2
,p_data_file IN VARCHAR2 := NULL) IS
[code]...
but I get error wut_118.
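A hedged way to isolate whether the directory object and privileges are the problem, independent of the run_query procedure (the file name is arbitrary):
SET SERVEROUTPUT ON
DECLARE
  l_file UTL_FILE.file_type;
BEGIN
  l_file := UTL_FILE.fopen('TEST_DIR', 'utl_file_test.txt', 'w');
  UTL_FILE.put_line(l_file, 'hello from TEST_DIR');
  UTL_FILE.fclose(l_file);
  DBMS_OUTPUT.put_line('write OK');
END;
/
If this raises ORA-29283, the c:\temp path or its permissions on the database server are the issue rather than the PL/SQL code.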
View 1 Replies
View Related
Jun 14, 2006
Is there any way of reading from an Oracle dump file?
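For a classic exp dump, a hedged way to see what is inside without importing anything is the SHOW option (file name is a placeholder):
imp system/password FILE=export.dmp SHOW=Y FULL=Y LOG=dump_contents.log
The log file then contains the DDL and the object list. For a Data Pump dump the rough equivalent is impdp with the SQLFILE parameter.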
View 16 Replies
View Related
Dec 1, 2011
How do I migrate a table from a 10g XE database to an Oracle 9i database? Both databases are standalone and we cannot create a database link.
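Without a database link, the usual hedged route is to run the older client against the newer database: take the export with the 9i exp binary connecting over SQL*Net to the 10g XE instance, then imp the file into 9i. TNS alias, owner, table and file names below are placeholders:
exp system/password@xe_alias TABLES=app_owner.my_table FILE=my_table.dmp LOG=exp_my_table.log
imp system/password FROMUSER=app_owner TOUSER=app_owner TABLES=my_table FILE=my_table.dmp LOG=imp_my_table.log
A flat-file unload (spool to CSV) loaded with SQL*Loader on the 9i side also works and sidesteps the version-interoperability rules entirely.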
View 2 Replies
View Related
Feb 23, 2012
I have a problem with import in Oracle 8.1.7. The size of the import file is 29,600 KB and the tablespace size is 16 GB, yet when I try to run the import Oracle returns this message:
IMP-00003: ORACLE error 1659 encountered
ORA-01659: unable to allocate MINEXTENTS beyond 7 in tablespace DATA
The DATA tablespace is full. I think the import file contains information about the original tablespace from which the export was made, but I don't know how to resolve the problem.
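With the original exp, COMPRESS=Y is the default and it rolls everything the table occupied into a single large INITIAL extent, which the target DATA tablespace then has to provide as one contiguous chunk; that, rather than total free space, is what ORA-01659 usually complains about. Hedged options (file names and sizes are placeholders): re-export with COMPRESS=N, pre-create the offending table with modest storage and import with IGNORE=Y, or simply add space:
exp scott/tiger FILE=export.dmp COMPRESS=N TABLES=big_table
ALTER TABLESPACE DATA ADD DATAFILE '/u01/oradata/data02.dbf' SIZE 500M;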
View 10 Replies
View Related