I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).
For another 10 schemas I need them empty, with the exception of a table in some of them which needs to be exported/imported with all its data. Is it possible to do this with the Data Pump utilities (expdp, impdp)?
Afterwards I will run some scripts to populate the DB instance with critical data/metadata.
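A rough sketch of how this might be split across separate Data Pump runs, assuming a directory object DP_DIR, a full schema APPMAIN, metadata-only schemas SCH1 ... SCH10, and a data-bearing table SCH1.REF_CODES (all of these names and credentials are placeholders, not from the question):

# 1. full export of the one schema, rows included
expdp system/password SCHEMAS=APPMAIN DIRECTORY=DP_DIR DUMPFILE=appmain.dmp LOGFILE=appmain_exp.log

# 2. structure-only export of the other schemas (list all ten in practice)
expdp system/password SCHEMAS=SCH1,SCH2,SCH3 CONTENT=METADATA_ONLY DIRECTORY=DP_DIR DUMPFILE=schemas_meta.dmp LOGFILE=schemas_meta_exp.log

# 3. table-mode export just for the tables that must keep their rows
expdp system/password TABLES=SCH1.REF_CODES,SCH2.REF_CODES DIRECTORY=DP_DIR DUMPFILE=ref_data.dmp LOGFILE=ref_data_exp.log

On the target the three dumps would be imported in the same order, with the last impdp run using TABLE_EXISTS_ACTION=APPEND (or TRUNCATE), because the empty tables already exist from the metadata-only import.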
but it returns the error below. I only have access to user SHAN; our client cannot allow me to use SYSTEM or SYSDBA or grant any other required privileges. So is there any way to take a metadata backup of user SHAN from user SHAN itself?
EXP-00008: ORACLE error 942 encountered
ORA-00942: table or view does not exist
EXP-00024: Export views not installed, please notify your DBA
EXP-00000: Export terminated unsuccessfully
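If the goal is just the DDL of SHAN's own objects, a minimal sketch is a user-mode export run as SHAN itself with rows suppressed (the password and file names are placeholders):

exp shan/password OWNER=shan ROWS=N FILE=shan_meta.dmp LOG=shan_meta.log

That said, EXP-00024 suggests the export catalog views themselves are missing from the database (they are created by catexp.sql/catalog.sql), and that would have to be fixed by a DBA before exp will run for any user, SHAN included.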
I am exporting a table that is 3 GB in size; it is partitioned, with the NOCOMPRESS option specified.
Now, when I export it with the COMPRESS=N option of the exp utility, it should take 3 GB on the target server. But will exporting it with COMPRESS=Y save some storage during the import, or does the NOCOMPRESS option specified on the partition mean the exp COMPRESS=Y option has no impact, so it will take 3 GB of space in both cases?
Is it true that whether you specify COMPRESS=N or COMPRESS=Y during export does not matter, and the size will always be 3 GB after import?
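For reference, and as far as I understand it, exp's COMPRESS parameter only controls whether the storage clause written into the dump's CREATE TABLE DDL consolidates the existing extents into one large INITIAL extent; it is not data compression, so a side-by-side sketch like the one below (placeholder credentials, table, and file names) would be expected to load roughly the same 3 GB of row data either way:

exp scott/tiger TABLES=big_part_tab COMPRESS=N FILE=big_n.dmp LOG=big_n.log
exp scott/tiger TABLES=big_part_tab COMPRESS=Y FILE=big_y.dmp LOG=big_y.log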
I would like to export a few tables from the physical standby, which is in read-only mode.
I have tried both the exp and expdp methods. I could successfully export and import the tables from the physical standby using exp; unfortunately, expdp does not allow this from a read-only database.
Does this mean that we still have to use exp instead of expdp?
Note: I would expect a proper response from experts and no unwanted comments like "Contact Oracle Support", "Paste the entire command here", "Read the manuals", or "Why am I exporting from the standby and not from the primary?", etc.
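One workaround that is often described, assuming an auxiliary read-write database exists that can reach the standby over a database link (the link name STBY_LINK, the TNS alias, and DP_DIR below are all placeholders), is to run expdp against the read-write database with NETWORK_LINK, so the Data Pump master table is created there while the rows are only read from the standby:

-- on the read-write helper database
CREATE DATABASE LINK stby_link CONNECT TO scott IDENTIFIED BY tiger USING 'STBY_TNS';

# then from the OS on that same helper database
expdp scott/tiger NETWORK_LINK=stby_link TABLES=scott.emp,scott.dept DIRECTORY=DP_DIR DUMPFILE=from_stby.dmp LOGFILE=from_stby.log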
I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with fixed record length. There will be about 6 files, for a total of about 10 GB.
How can I export those tables in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
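A minimal SQL*Plus spool sketch for one of the views (the view name V_EXTRACT1 and its columns are assumptions; the RPAD/LPAD/TO_CHAR masks are what give every record the same fixed length):

SET PAGESIZE 0 LINESIZE 400 TRIMSPOOL ON FEEDBACK OFF HEADING OFF TERMOUT OFF
SPOOL extract1.txt
SELECT RPAD(cust_name, 40)
       || LPAD(TO_CHAR(cust_id), 10, '0')
       || TO_CHAR(created_dt, 'YYYYMMDD')
  FROM v_extract1;
SPOOL OFF

exp and expdp do indeed only write their own binary dump format, so for 10 GB the usual choices are several SQL*Plus sessions each spooling a slice of the data, or a PL/SQL loop writing through UTL_FILE.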
I am trying to export a partition of a table and import it into another database. I get the error below when I try to import.
ORA-14400: inserted partition key does not map to any partition
If I export the table (for that particular partition) and import it (after dropping the table) in the destination, the partitions and sub-partitions are created without any problem.
The table is range partitioned and sub-partitioned by list. So I had to perform the operations below if I want to retain the other data in the destination table.
1. Drop the existing partition
2. Create the partition and sub-partition, same as the source
3. Execute imp
In fact I had to perform step #2 because, even if I split the partition, the sub-partition gets replicated in the new partition, which again throws the same error. Is there a better way of managing the partitions and sub-partitions in the destination with the exp/imp utilities, so that I need not perform step #1 and step #2 manually?
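One pattern that avoids dropping the whole table, assuming the destination partition (and its list sub-partitions) already covers the incoming key values, is partition-level exp/imp with IGNORE=Y so the rest of the table is left untouched (user, table, and partition names are placeholders):

exp scott/tiger TABLES=sales:p_2012_q1 FILE=p1.dmp LOG=p1_exp.log
imp scott/tiger TABLES=sales:p_2012_q1 IGNORE=Y FILE=p1.dmp LOG=p1_imp.log

ORA-14400 itself means no partition/sub-partition in the destination maps to the inserted key, so when the partition layouts differ some DDL step is still unavoidable with exp/imp; Data Pump (impdp with TABLE_EXISTS_ACTION and, on 11g, PARTITION_OPTIONS) gives a bit more control here.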
I have taken a database backup using the exp command, and when I try to import it on another PC, the foreign keys are not imported. It gives an error message that there is no matching unique or primary key for the column.
How will I take a backup that includes the primary keys?
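With the classic utilities, primary and foreign keys are exported by default (CONSTRAINTS=Y); the error usually appears when a child table is imported into a database where the referenced parent table, or its primary key, is missing. A sketch of an explicit round trip, with placeholder names:

exp scott/tiger OWNER=scott CONSTRAINTS=Y FILE=scott.dmp LOG=scott_exp.log
imp scott/tiger FROMUSER=scott TOUSER=scott CONSTRAINTS=Y IGNORE=Y FILE=scott.dmp LOG=scott_imp.log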
Suddenly my export hangs at 'exporting cluster definitions'. I have been using this database for the last 4 years and it has never caused a problem or hung at this point. Here I'm pasting my screen details; it is my production DB.
[oracle1@wbh_as1 smbshare]$ exp wb/wb
Export: Release 9.2.0.1.0 - Production on Thu Dec 23 00:02:44 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Enter array fetch buffer size: 4096 >
Export file: expdat.dmp > wb
(2)U(sers), or (3)T(ables): (2)U >
Export grants (yes/no): yes >
Export table data (yes/no): yes >
Compress extents (yes/no): yes >
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user WB
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user WB
About to export WB's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
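Not an answer in itself, but one first diagnostic step would be to check, from another session, what the hung exp session is actually waiting on (assuming it connects as WB; on 9.2 this means joining V$SESSION to V$SESSION_WAIT):

SELECT s.sid, s.serial#, w.event, w.seconds_in_wait
  FROM v$session s, v$session_wait w
 WHERE w.sid = s.sid
   AND s.username = 'WB';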
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know if importing to tape is possible. If so, would the data be accessible if needed later?
Import: Release 10.2.0.1.0 - Production on Wednesday, 17 March, 2010 11:07:02
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Starting "MUBA"."SYS_IMPORT_TABLE_01": muba/******** tables=FUNCTION_NO directory=testdump NETWORK_LINK=DBLINK1
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
I need to move database ORCL into our existing central database CNTR (both are on the same OS and Oracle version). I started exporting each schema from ORCL with exp and importing it into CNTR with imp.
But there is one schema, EXMP, in database ORCL which also exists in the CNTR database with the same tables and indexes. The data under schema EXMP in ORCL should be added to schema EXMP in CNTR.
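Since the EXMP objects already exist in CNTR, the rows can be appended on top of them rather than recreated: with the original imp that is IGNORE=Y, with Data Pump it is TABLE_EXISTS_ACTION=APPEND (credentials, dump file, and directory names below are placeholders):

imp system/password FROMUSER=exmp TOUSER=exmp IGNORE=Y FILE=exmp.dmp LOG=exmp_imp.log

impdp system/password SCHEMAS=exmp TABLE_EXISTS_ACTION=APPEND DIRECTORY=DP_DIR DUMPFILE=exmp.dmp LOGFILE=exmp_imp.log

This only appends rows; if the same key values already exist on both sides, the unique/primary key constraints will still reject the duplicates.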
Last week we migrated our Oracle database from 9i to 10g through the imp utility, but now I am facing one small issue. We still have our old live database with us, and suppose we run
SELECT * FROM V$PARAMETER WHERE NAME LIKE 'utl%'
to check the directory name and value. On the old database the output shows name: utl_file_dir and value: E:\RAB, but the new migrated database does not show any value like E:\RAB. I have recreated that directory in the new database, but the issue still persists.
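utl_file_dir is an instance initialization parameter, so it is not something exp/imp carries across; it has to be set again on the new 10g instance. A sketch, assuming the instance uses an spfile and the old value really was E:\RAB:

-- as a privileged user on the new database
ALTER SYSTEM SET utl_file_dir='E:\RAB' SCOPE=SPFILE;
-- utl_file_dir is not dynamic, so the instance has to be restarted
SHUTDOWN IMMEDIATE
STARTUP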
The process started, but after some time it gave 1200 errors. Is it due to the different database name, or is it because I did not create the tablespace in the destination database?
I am trying to run the imp command from the DOS prompt.
Here is the error I got.
C:\Documents and Settings\sairamm\My Documents\artms\exp_AUDT2_04292012>imp sairamm/mypassword@aws fromuser=AUDT2 toUSER=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log BUFFER=10000000 GRANTS=y
Import: Release 9.2.0.1.0 - Production on Mon Apr 30 14:23:29 2012
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
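IMP-00010 generally means the dump header was not written by something the 9.2 imp client understands: either the file was produced by expdp (Data Pump dumps can only be read by impdp), it was produced by a newer exp release, or it was corrupted in transfer (for example an ASCII-mode FTP copy). If it is a newer exp dump, one thing to try is running the imp that matches the 11.2 target, roughly like this (the client path is a placeholder):

REM use the 11.2 Oracle home's imp instead of the 9.2 one
C:\oracle\product\11.2.0\client_1\BIN\imp sairamm/mypassword@aws fromuser=AUDT2 touser=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log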
I have installed ASMLib, and the ASM instance has started successfully and works if I use
export ORACLE_SID=+ASM
but if I then use DBCA to create a database on the ASM disk group, I have a problem with the creation of the spfile. What steps should I follow to create a database with DBCA using the already installed/mounted disk group?
I want to create a script to clone a database user under a new name. Just like we do a normal export and import, I want to enter just the username of the existing user, the username of the new user I want created, and the password for it.
It should create the new user with all the roles, default roles, and privileges of the old user.
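With Data Pump this is essentially a schema export plus REMAP_SCHEMA on the import, which a script could wrap by substituting the two usernames and the password (OLDUSR, NEWUSR, DP_DIR, and the credentials below are all placeholders):

expdp system/password SCHEMAS=oldusr DIRECTORY=DP_DIR DUMPFILE=oldusr.dmp LOGFILE=oldusr_exp.log
impdp system/password REMAP_SCHEMA=oldusr:newusr DIRECTORY=DP_DIR DUMPFILE=oldusr.dmp LOGFILE=newusr_imp.log

When the export is taken by a privileged user, the dump includes the CREATE USER statement and the grants made to OLDUSR, so impdp can create NEWUSR with them; otherwise NEWUSR has to be created and granted separately first (DBA_ROLE_PRIVS and DBA_SYS_PRIVS are the usual sources for that).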
I need to import one schema into 3 different DBs. Out of the 3, I have imported into 2 databases without any error. During the 3rd import I am getting the error below:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP and Data Mining options
ORA-31626: job does not exist
[code]....
I am using the same parfile, with no changes to the syntax. This is a schema export.
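ORA-31626 on just one of the three targets points at the Data Pump job itself on that database rather than at the parfile; one thing worth checking there (assuming the importing schema is MYUSER, a placeholder) is whether an old or orphaned job of the same name is lying around:

-- list current / orphaned Data Pump jobs on that instance
SELECT owner_name, job_name, state, attached_sessions
  FROM dba_datapump_jobs;

-- an orphaned job's leftover master table can be dropped so the name is reusable
-- (table name is hypothetical; only do this after confirming the job is genuinely dead)
DROP TABLE myuser.SYS_IMPORT_SCHEMA_01 PURGE;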
I am trying to use RMAN duplicate database to bring an entire database from one Windows XP system to another Windows XP system. I have been trying to use RMAN duplicate database without success.
I fail at "startup force nomount ...". The system always prompts me with the error "ORA-12514: TNS:listener does not currently know of service requested in connect descriptor", followed by "TNSLSNR.exe has encountered a problem and needs to close ...".
I have tried to re-install Oracle multiple times and failed at the same problem as above. So I am thinking of using imp and exp to do the work. Is it possible? If yes, how do I do it?
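A logical copy with exp/imp is possible as long as an empty database already exists on the target machine; a very rough full-export sketch, with placeholder credentials and file names:

REM on the source Windows XP machine
exp system/password FULL=Y FILE=fulldb.dmp LOG=fulldb_exp.log

REM copy fulldb.dmp to the target in binary mode, then on the target database
imp system/password FULL=Y IGNORE=Y FILE=fulldb.dmp LOG=fulldb_imp.log

Unlike RMAN duplicate, this carries over only the logical contents (users, objects, rows), not the physical files or SYS-owned objects, so the target database and, ideally, matching tablespaces have to be created first.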