I want to load lakhs of records into a table. My problem is that after loading about a quarter of the records, my process abends because of the size of my rollback segment area, and I don't have the option to increase it. Is there any way to perform intermediate commits when using the imp or sqlldr utilities, so the entire data set loads without abending?
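Both utilities can commit in batches, so the rollback segment only has to cover one batch at a time. A minimal sketch, assuming a hypothetical connect string and file names:

imp scott/tiger@orcl file=exp_full.dmp full=y commit=y buffer=1000000 log=imp_full.log
sqlldr scott/tiger@orcl control=load.ctl rows=5000 log=load.log

With imp, COMMIT=Y commits after each buffer-sized array insert (sized by BUFFER); with conventional-path sqlldr, ROWS=n commits every n rows.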
Can I have two versions of the Oracle database installed on the same server under one user, on AIX and also on Windows?
We currently have Oracle 10.2.0.4.0 installed on an AIX server; can we install Oracle 11gR2 on the same server under the same user, with both versions in use?
How should the profile of a user who has two Oracle versions be managed on AIX, and how is it handled on Windows?
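One common approach on AIX is to keep both ORACLE_HOMEs and switch the environment per session from the user's .profile; a hedged sketch, assuming hypothetical home paths and SIDs:

# in ~/.profile of the oracle user (AIX, ksh/sh syntax)
set10g () {
  export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
  export ORACLE_SID=DB10G
  export PATH=$ORACLE_HOME/bin:$PATH
  export LIBPATH=$ORACLE_HOME/lib:$LIBPATH
}
set11g () {
  export ORACLE_HOME=/u01/app/oracle/product/11.2.0/dbhome_1
  export ORACLE_SID=DB11G
  export PATH=$ORACLE_HOME/bin:$PATH
  export LIBPATH=$ORACLE_HOME/lib:$LIBPATH
}

On Windows the same idea applies by setting ORACLE_HOME and ORACLE_SID in a per-session command window (set ORACLE_HOME=..., set ORACLE_SID=...), since each install registers its own services and home.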
I am familiar with the Netca tool. However, there is another utility, netmgr, that exists for the same functionality. I checked with many DBAs for the exact difference but did not get a good answer, and a Google search did not turn up anything precise either. Please list the exact difference between these two tools (netca, netmgr).
I need to move database ORCL into our existing central database CNTR (both are on the same OS and Oracle version). I started exp-ing each schema from ORCL and imp-ing it into CNTR.
But there is one schema, EXMP, in database ORCL which also exists in the CNTR database with the same tables and indexes. The data under schema EXMP in ORCL should be added to schema EXMP in CNTR.
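One way to append the ORCL data into the existing CNTR tables is classic imp with IGNORE=Y, so the "table already exists" errors are skipped and the rows are still inserted. A hedged sketch, assuming hypothetical file names and a SYSTEM connection:

imp system/password@CNTR file=exp_exmp.dmp fromuser=EXMP touser=EXMP ignore=y log=imp_exmp.log

Note that rows violating existing unique or primary key constraints will be rejected, so check the import log for ORA-00001 errors afterwards.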
Last week we migrated our Oracle database from 9i to 10g using the imp utility, but now I am facing one small issue. We still have our old live database, and when we run
SELECT * FROM V$PARAMETER WHERE NAME LIKE 'utl%'
to check the directory name and value, the output shows name utl_file_dir and value E:\RAB. In our new, migrated database no value like E:\RAB is shown. I have recreated that directory in the new database, but the issue still persists.
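utl_file_dir is a static parameter, so it does not appear in the new database just because the directory was recreated at the OS level; it has to be set in the spfile and the instance restarted. A minimal sketch, assuming the same E:\RAB location:

ALTER SYSTEM SET utl_file_dir='E:\RAB' SCOPE=SPFILE;
SHUTDOWN IMMEDIATE
STARTUP

After the restart, the query against V$PARAMETER should show the value again.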
The process started, but after some time it gave 1200 errors. Is that due to the different database name, or is it because I did not create the tablespace in the destination database?
I am trying to run the imp command from the DOS prompt. Here is the error I got:

C:\Documents and Settings\sairamm\My Documents\artms\exp_AUDT2_04292012>imp sairamm/mypassword@aws fromuser=AUDT2 touser=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log BUFFER=10000000 GRANTS=y

Import: Release 9.2.0.1.0 - Production on Mon Apr 30 14:23:29 2012
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Data Mining, Oracle Database Vault and Real Application Testing options

IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
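A 9.2.0.1 imp client cannot read a dump written by a newer exp, and it can never read a Data Pump (expdp) file, which is the usual cause of IMP-00010 against an 11.2 database. A hedged sketch of the likely fix, assuming the dump was actually produced with expdp and that a directory object such as DATA_PUMP_DIR exists (credentials and names are illustrative):

impdp system/password@aws directory=DATA_PUMP_DIR dumpfile=exp_AUDT2_04292012.dmp remap_schema=AUDT2:SAIRAMM logfile=imp_AUDT2_04292012.log

If it really is a classic exp dump, run the imp binary from the 11.2 Oracle home instead of the 9.2 client, and make sure the file was transferred in binary mode; an ASCII FTP transfer corrupts the header and produces the same error.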
I have installed ASMLib, and the ASM instance has started successfully and works if I use
export ORACLE_SID=+ASM
But when I then use DBCA to create a database on the ASM disk group, I have a problem with the creation of the spfile. What steps should I follow to create a database with DBCA using the already created/mounted disk group?
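If DBCA keeps failing at the spfile step, one workaround is to create the spfile in the disk group manually and point the instance at it; a hedged sketch, assuming a disk group called +DATA and SID ORCL:

CREATE SPFILE='+DATA/ORCL/spfileORCL.ora' FROM PFILE='/tmp/initORCL.ora';
-- $ORACLE_HOME/dbs/initORCL.ora then contains only the pointer:
-- SPFILE='+DATA/ORCL/spfileORCL.ora'

Also confirm the database owner can actually see the disk group (ASM instance up, disk group mounted, correct OS groups) before re-running DBCA and selecting the +DATA disk group for storage.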
I want to create a script to clone a database user under a new name, just as we do with a normal export and import. I want to enter just the username of the existing user, the username of the new user to be created, and the password for it.
It should create the new user with all the roles, default roles and privileges of the old user.
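A hedged sketch of the SQL*Plus part that generates the grants from the old user's dictionary entries (OLDUSER and NEWUSER are placeholders the script would substitute; the object copy itself can then be done with exp/imp FROMUSER/TOUSER):

SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF
SELECT 'CREATE USER NEWUSER IDENTIFIED BY &newpass DEFAULT TABLESPACE '||default_tablespace||
       ' TEMPORARY TABLESPACE '||temporary_tablespace||';'
  FROM dba_users WHERE username = 'OLDUSER';
SELECT 'GRANT '||granted_role||' TO NEWUSER;' FROM dba_role_privs WHERE grantee = 'OLDUSER';
SELECT 'GRANT '||privilege||' TO NEWUSER;'    FROM dba_sys_privs  WHERE grantee = 'OLDUSER';
SELECT 'GRANT '||privilege||' ON '||owner||'.'||table_name||' TO NEWUSER;'
  FROM dba_tab_privs WHERE grantee = 'OLDUSER';
SELECT 'ALTER USER NEWUSER DEFAULT ROLE ALL;' FROM dual;

Spool the output to a file, review it, run it, and then import the old user's objects with fromuser/touser.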
I need to import one schema into 3 different DBs. Out of the 3, I have imported into 2 databases without any error; during the 3rd import I am getting the error below:

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP and Data Mining options
ORA-31626: job does not exist [code]....

I am using the same parfile, with no changes to the syntax. This is a schema export.
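ORA-31626 at the start of an impdp run often means the job's master table could not be created, or an old, orphaned job is in the way on that third database. A hedged check, assuming a privileged connection:

SELECT owner_name, job_name, state FROM dba_datapump_jobs;
-- if an old job shows NOT RUNNING, drop its leftover master table (name is illustrative):
DROP TABLE scott.SYS_IMPORT_SCHEMA_01 PURGE;

Also confirm the importing user has quota on its default tablespace on that database, since the master table is created there.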
I am trying to use RMAN duplicate database to bring an entire database from one Windows XP system to another Windows XP system, and so far without success.
I fail at "startup force nomount ...". The system always gives the error "ORA-12514: TNS:listener does not currently know of service requested in connect descriptor", followed by "TNSLSNR.exe has encountered a problem and needs to close ...".
I have tried re-installing Oracle multiple times and failed at the same point. So I am thinking of using imp and exp to do the work instead. Is that possible? If yes, how do I do it?
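ORA-12514 during duplicate usually means the auxiliary instance is not registered with the listener while it sits in NOMOUNT; a static entry in listener.ora on the destination machine normally fixes it. A hedged sketch, assuming the duplicate instance is called DUP and a hypothetical Oracle home path:

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (GLOBAL_DBNAME = DUP)
      (ORACLE_HOME = C:\oracle\product\10.2.0\db_1)
      (SID_NAME = DUP)
    )
  )

Reload the listener (lsnrctl reload), point the tnsnames.ora entry at service DUP, and the startup force nomount from RMAN should connect. Exp/imp is possible as a fallback, but it is only a logical copy: you would create the target database first, then run exp full=y on the source and imp full=y on the target.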
Import: Release 10.2.0.1.0 - Production on Wednesday, 17 March, 2010 11:07:02
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Starting "MUBA"."SYS_IMPORT_TABLE_01": muba/******** tables=FUNCTION_NO directory=testdump NETWORK_LINK=DBLINK1
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
I want to create two or three schemas on my production server that should be exact copies of schemas on my second production server. I access this second server through a VPN connection with TOAD 9.0.1, and I access my first production server through VNC Viewer and its database through TOAD.
How could I create the schemas on my first production server from the second server?
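Since both databases are reachable over the network, one option is a Data Pump import over a database link, so no dump file has to be moved; a hedged sketch with placeholder names, run against the first production server:

CREATE DATABASE LINK prod2_link CONNECT TO system IDENTIFIED BY password USING 'PROD2';

impdp system/password@PROD1 network_link=prod2_link schemas=SCHEMA1,SCHEMA2,SCHEMA3 directory=DATA_PUMP_DIR logfile=copy_schemas.log

The schemas are read straight from the second server and created on the first one.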
LOAD DATA INFILE * INTO TABLE image_table REPLACE FIELDS TERMINATED BY ',' (
[code].....
Step 3)
Then I ran this command and got this error:

F:\oracle\product\10.2.0\db_1\bin>sqlldr control=F:\practice\control.ctl
Username: system
Password:

SQL*Loader: Release 10.2.0.5.0 - Production on Wed Jun 8 13:47:27 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.

SQL*Loader-404: Column FILE_ID present more than once in IMAGE_TABLE's INTO TABLE block.
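SQL*Loader-404 means the same column name is listed twice inside the INTO TABLE ( ... ) list, so the fix is simply to make each column appear only once. A minimal sketch with a hypothetical two-column layout:

LOAD DATA
INFILE *
INTO TABLE image_table
REPLACE
FIELDS TERMINATED BY ','
( file_id,
  file_name )
BEGINDATA
1,photo1.jpg
2,photo2.jpg

Check the control file for a repeated FILE_ID entry and remove the duplicate.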
Is it possible to remap the database schema during export?
Our developers have their databases stored within individual schemas, and I want to provide a dump file that each developer can easily import into his own schema. But when I want to impdp the dump file, I have to know the schema name inside the dump file to remap it to the individual developer's schema -> so being able to provide a specific schema name for the dump file would be great.
At the moment I am getting ORA-39146, schema does not exist, when importing the database.
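impdp can rename the schema at import time with REMAP_SCHEMA, so each developer can pull the same dump into his own schema; a hedged sketch with placeholder names:

impdp system/password directory=DP_DIR dumpfile=app_schema.dmp remap_schema=APP_OWNER:DEV_USER logfile=imp_dev.log

The source schema name (APP_OWNER here) still has to be known once and documented alongside the dump file; the dump itself cannot be made schema-less, but a single remap per developer is enough.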
Is it possible to run impdp directly with a database link for a transportable tablespace, instead of first creating an export dump and then using impdp?
I tried the command below and it shows an error. I have gone through the Oracle docs but could not find the exact command with a network link for a transportable tablespace.
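Network-mode transportable tablespace import is supported: the tablespaces are made read-only on the source, the datafiles are copied to the target, and impdp is run with NETWORK_LINK plus the TRANSPORT_* parameters and no dump file. A hedged sketch with placeholder link, tablespace and datafile names:

impdp system/password directory=DATA_PUMP_DIR network_link=src_db_link transport_tablespaces=TBS_APP transport_full_check=y transport_datafiles='/u01/oradata/TARGET/tbs_app01.dbf' logfile=tts_imp.log

The directory object is only used for the log file, and the TRANSPORT_DATAFILES paths must point to the copies already sitting on the target host.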
But actually I want the file_name to be my actual employee name, like Ramesh Patel. What changes can I make to my control file so that I get the employee name as the file name, and so that I can easily upload 1500 images into the database with this tool?
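One way is to read the employee name and the image path as two fields per record, keep the path in a FILLER field, and load the image through LOBFILE; a hedged sketch, assuming the table has a BLOB column and hypothetical column names:

LOAD DATA
INFILE 'images.dat'
APPEND
INTO TABLE image_table
FIELDS TERMINATED BY ','
( file_name   CHAR(100),                            -- employee name, e.g. Ramesh Patel
  img_path    FILLER CHAR(250),                     -- path of the image on disk, not stored
  image_blob  LOBFILE(img_path) TERMINATED BY EOF ) -- BLOB column loaded from that file

With a data file of 1500 lines like "Ramesh Patel,F:\practice\photos\ramesh_patel.jpg", one sqlldr run loads all the images and stores the employee name in FILE_NAME.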
I want to import one .dmp file into an Oracle database. I don't know exactly what that .dmp file contains, e.g. I don't know the users inside the dump. While importing, I get the error below.
1. Do I need to create those users first and then import? If yes, how would I know how many users are inside that dump?
2. Currently the objects are created in the SYSTEM user by default. I want to import those objects into the MACL user which I created. How can I do that?

IMP-00003: ORACLE error 1917 encountered
ORA-01917: user or role 'MALCCOMAN' does not exist
IMP-00017: following statement failed with ORACLE error 1917:
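Two hedged steps with classic imp (file name and accounts are placeholders): first list what is inside the dump without loading it, then pull the objects into MACL.

imp system/password@orcl file=export.dmp show=y full=y log=dump_contents.log
imp system/password@orcl file=export.dmp fromuser=MALCCOMAN touser=MACL ignore=y log=imp_macl.log

SHOW=Y only writes the DDL contained in the dump to the log, so the owner names (such as MALCCOMAN here) can be read from it; FROMUSER/TOUSER then redirects those objects into the MACL user, which must already exist with enough quota.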
How can I import a whole schema from another database? I have a database WHO2 where I have a schema called STG_CUPID_RP2, and I want to import it into database WH0T, into STG_CUPID_RP2, using the DB link EXP_SAT.
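A hedged sketch with Data Pump over the existing link, assuming EXP_SAT is a database link in WH0T that points to WHO2 and is visible to the importing user:

impdp system/password@WH0T network_link=EXP_SAT schemas=STG_CUPID_RP2 directory=DATA_PUMP_DIR logfile=imp_stg_cupid_rp2.log

No dump file is written; the schema is read across the link and created directly in WH0T.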
I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).
Another 10 schemas I need empty, with the exception of a few tables in some of them which need to be exported/imported with all their data. Is it possible to do this with the Data Pump utilities (expdp, impdp)?
Afterwards I will be running some scripts to populate the DB instance with critical data/metadata.
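Data Pump can cover this with separate runs (or INCLUDE/EXCLUDE filters); a hedged sketch with placeholder schema and table names:

expdp system/password schemas=MAIN_APP directory=DP_DIR dumpfile=main_app.dmp logfile=main_app.log
expdp system/password schemas=S01,S02,S03,S04,S05,S06,S07,S08,S09,S10 content=metadata_only directory=DP_DIR dumpfile=schemas_meta.dmp logfile=schemas_meta.log
expdp system/password tables=S03.LOOKUP_CODES,S07.APP_CONFIG directory=DP_DIR dumpfile=ref_tables.dmp logfile=ref_tables.log

The first run carries the full schema with data, the second carries the ten schemas as structure only, and the third adds data for just the named tables; import them in the same order with impdp.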