I created a new 10g database and am trying to import an export of a 9i database (ABC). I have the questions below:
- I just installed 10g, so I only have the default users (SYSTEM, SYS, etc.) and databases. Do I need to create the same database I have in 9i (ABC) in 10g before importing the data?
- I tried to import the 9i export (ABC database, for the XYZ user only) as the 10g SYSTEM user, and the table was imported with 0 rows.
Log:
+++++++++++++++++++++++++
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Export file created by EXPORT:V09.02.00 via conventional path
Warning: the objects were exported by P2KTEST, not by you
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
. importing P2KTEST's objects into P2KTEST
. . importing table "TESTRATN" 0 rows imported
Import terminated successfully without warnings.
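For what it's worth, there is normally no need to recreate the whole 9i database in 10g; creating the owning user (and a tablespace for it) in the new instance and importing with FROMUSER/TOUSER is usually enough. A minimal sketch, using the P2KTEST name from the log above; the tablespace, datafile path and passwords are placeholders, not values from the original post:

-- run as a DBA in the 10g database
CREATE TABLESPACE p2k_data DATAFILE '/u01/oradata/ABC/p2k_data01.dbf' SIZE 500M AUTOEXTEND ON;
CREATE USER p2ktest IDENTIFIED BY p2ktest DEFAULT TABLESPACE p2k_data QUOTA UNLIMITED ON p2k_data;
GRANT CONNECT, RESOURCE TO p2ktest;

-- then import only that user's objects from the 9i export
imp system/****** FILE=abc_9i.dmp LOG=imp_abc.log FROMUSER=p2ktest TOUSER=p2ktest ROWS=Y

If the table still comes in with 0 rows, it is worth checking the original export log to confirm that the source table actually had rows and that the export was taken with ROWS=Y.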
I did an import from 9i to 11gR2: 1. I created the 11gR2 DB, 2. created a tablespace with an 8 KB block size, 3. imported the 9i dump into the 11gR2 DB.
Now I am getting some errors in the imp log:
1. ORA-29339: tablespace block size 4096 does not match configured block sizes -- for all the tablespaces (but I created the tablespaces with an 8 KB block size before the import).
2. ORA-23327: imported deferred rpc data does not match platform of importing db
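The ORA-29339 errors usually mean the import is replaying the CREATE TABLESPACE statements from the 9i source, which used a 4 KB block size, while the 11g instance only has a buffer cache configured for its default 8 KB block size; the 8 KB tablespace created beforehand does not help because it has a different name. A hedged sketch of one way around it (the cache size below is a placeholder); alternatively, pre-create tablespaces with the original 9i names so the CREATE statements from the dump are no longer needed:

SQL> ALTER SYSTEM SET db_4k_cache_size = 64M SCOPE=BOTH;
-- then re-run the import; the 4 KB tablespaces from the dump can now be created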
I am new to Oracle DBA work and am facing one problem. I exported and imported my data from one Oracle DB to another using exp/imp. In both the exp and imp log files, it shows the following messages at the end:
"Import terminated successfully with warnings." "Export terminated successfully with warnings."
but when I count the rows, some rows are missing in some tables...
What could be the cause? Is there any other way to cross-check whether the export/import was successful?
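One rough cross-check is to compare actual row counts per table on the source and target (the NUM_ROWS column in DBA_TABLES is only as fresh as the last statistics gathering, so counting directly is safer). A minimal sketch, run as the schema owner on both sides:

SET SERVEROUTPUT ON
DECLARE
  v_cnt NUMBER;
BEGIN
  FOR t IN (SELECT table_name FROM user_tables ORDER BY table_name) LOOP
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM "' || t.table_name || '"' INTO v_cnt;
    DBMS_OUTPUT.PUT_LINE(RPAD(t.table_name, 31) || v_cnt);
  END LOOP;
END;
/

It is also worth searching both log files for ORA-/EXP-/IMP- error codes; "terminated successfully with warnings" means at least one object or row had a problem, and the reason is printed earlier in the log.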
I have an Oracle 11.2.0.3 database on HP-UX. I have one table with around 500,000 (5 lakh) rows, and I want to copy this table to another Oracle 11.2.0.3 database on a different OS, Windows Server 2008 R2.
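Since both databases are 11.2.0.3 and Data Pump dump files are platform independent, a single-table export/import is one straightforward option. A hedged sketch with placeholder names (directory objects, credentials and the table name are assumptions):

-- on the HP-UX source
expdp scott/tiger DIRECTORY=dp_dir DUMPFILE=mytab.dmp LOGFILE=exp_mytab.log TABLES=MYTAB

-- copy mytab.dmp to the Windows server in binary mode, then on the target
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=mytab.dmp LOGFILE=imp_mytab.log TABLES=MYTAB

An impdp with NETWORK_LINK over a database link between the two databases would avoid the file copy entirely.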
I'm facing a really strange problem in Forms with an FMB that has a couple of canvases; the main canvas shows a JPG (like a wallpaper for the application; in fact this FMB would be the main FMB and it will also call the main menu). The problem is at run time (locally, running on Windows, and only on some laptops): at execution time there is a FRM-92102 error, but after simply deleting the image and trying again, the screen runs perfectly!
The worst part is that the same FMB runs perfectly on other laptops. I've tried almost everything: recreating the form from scratch with just one canvas, importing the image, and running it, but the same error appears again. Trying with another image gives the same result.
I was running impdp for my company's database. Everything was fine except for one problem: a big table (61.1 GB) couldn't be imported. The log shows "imported TBL_XXXX 61.1G 0 out of 147653981 rows". The alert log had no warning message, and I didn't know why.
I imported the HR schema from an export dump. I can find all the objects of the HR schema in the imported database, but I got an error for a PLAN_TABLE which is assigned to the USERS tablespace in the source database.
Here is the error I got during the import:
[oracle@localhost mom]$ imp file=exp_schema_ref.dmp log=imp.ref.log fromuser=hr touser=hr commit=y

Import: Release 10.2.0.1.0 - Production on Mon Jul 26 00:03:57 2010
Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Username: sys/sys as sysdba

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

Export file created by EXPORT:V10.02.01 via direct path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
. importing HR's objects into HR
. . importing table                    "COUNTRIES"         25 rows imported
. . importing table                  "DEPARTMENTS"         27 rows imported
. . importing table                    "EMPLOYEES"        107 rows imported
[code]....
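Assuming the error hidden behind the truncated part of the log is that the USERS tablespace does not exist in the target (or HR has no quota on it), one hedged fix is to create that tablespace (or grant the quota) and re-import just the failing table. The datafile path and size below are placeholders:

CREATE TABLESPACE users DATAFILE '/u01/oradata/orcl/users01.dbf' SIZE 100M AUTOEXTEND ON;
ALTER USER hr QUOTA UNLIMITED ON users;

[oracle@localhost mom]$ imp file=exp_schema_ref.dmp log=imp.ref2.log fromuser=hr touser=hr tables=plan_table commit=y

A PLAN_TABLE is only the scratch table used by EXPLAIN PLAN, so simply recreating it in the target with $ORACLE_HOME/rdbms/admin/utlxplan.sql would also do.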
I have a dump file of 17 GB which I want to import into my database db1 under the user amol, so I created a new user under db1 as below. Before this I created a tablespace so that I can import my data into that tablespace only. My steps are as below.

create user amol identified by amol default tablespace ptaxold1 temporary tablespace tem;

imp amol/amol

Then I pointed it at my dump file, but after the import it does not show any tables under the user amol.
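A couple of things commonly cause this: the new user has no privileges or quota, or the dump was exported from a different schema, in which case a plain "imp amol/amol" puts nothing into amol. A hedged sketch (the source schema name, dump file name and SYSTEM password are placeholders):

GRANT CREATE SESSION, CREATE TABLE TO amol;
ALTER USER amol QUOTA UNLIMITED ON ptaxold1;

imp system/****** FILE=mydump.dmp LOG=imp_amol.log FROMUSER=source_schema TOUSER=amol

The imp log also says whose objects it is importing ("importing X's objects into Y"), which shows whether the rows went to a different schema than expected.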
I am migrating an Oracle 9i database to 11g r3. I can only use imp. As the database is huge, I have split the exp dump by schema. In my recent test, I split the import into 4 separate threads into the new Oracle 11g database. The 4 imp threads consist of schemas of almost similar sizes (e.g. thread 1 - schemas 1, 2, 3; thread 2 - schemas 4, 5, 6; etc.).
All the dump files are on the same mount point. When I execute the 4 import threads together, the total import time for each thread is between 2.5 and 3.5 days.
Then I tried only 1 thread, and it took only 2 hours. So could this be an I/O issue or an Oracle memory problem?
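One way to narrow it down is to watch what the imp sessions are actually waiting on while all 4 threads run. A minimal sketch (the PROGRAM filter is an assumption; adjust it to however the sessions show up in V$SESSION):

SELECT sid, serial#, event, wait_class, seconds_in_wait
  FROM v$session
 WHERE program LIKE '%imp%'
   AND wait_class <> 'Idle';

If the waits are dominated by I/O-related events ("db file sequential read", "free buffer waits", "log file sync") against the same mount point, it points at I/O contention rather than memory; four concurrent conventional-path imports reading dumps from and writing datafiles to the same disks can easily serialize on I/O.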
A deadlock among DDL and parse locks is detected. This deadlock is usually due to user errors in the design of an application or from issuing a set of concurrent statements which can cause a deadlock. This should not be reported to Oracle Support. The following information may aid in finding the errors which cause the deadlock:
ORA-04020: deadlock detected while trying to lock object MV_LOTS
But the import is still running, even though it is not showing any rows being imported.
I had already recreated the tablespace that the table was in before it was dropped, but when I check the space in that tablespace, it is not being consumed either. One error I got previously while performing this task is:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "CDR"."SYS_IMPORT_TABLE_03" successfully loaded/unloaded
Starting "CDR"."SYS_IMPORT_TABLE_03":  cdr/********@tsiindia directory=TEST_DIR dumpfile=CAT_IN_DATA_042012.DMP tables=CAT_IN_DATA_042012 logfile=impdpCAT_IN_DATA_042012.log
[code]....
I checked streams_pool_size; it showed zero, so I set it to 48M, and after that:

SQL> show parameter streams_pool_size;

NAME                 TYPE        VALUE
-------------------- ----------- ------
streams_pool_size    big integer 48M
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE    10.2.0.1.0      Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production

OS version: Windows 7 64bit
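ORA-04020 on MV_LOTS during an import often means that something else (for instance a scheduled materialized view refresh) is taking DDL/parse locks on the same object while the import is trying to recreate it. A hedged check before re-running the import (the job number is whatever the query actually returns):

SELECT job, what FROM dba_jobs WHERE UPPER(what) LIKE '%MV_LOTS%' OR LOWER(what) LIKE '%dbms_refresh%';

-- if a refresh job shows up, mark it broken for the duration of the import
EXEC DBMS_JOB.BROKEN(<job_number>, TRUE);
-- and re-enable it afterwards
EXEC DBMS_JOB.BROKEN(<job_number>, FALSE);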
I have a schema (scott) exported with the schema-level option and imported under a different name (scott1). At regular intervals I need to import scott into scott1 without affecting existing records, i.e.:
1. Need to append newly created records.
2. Need to append updated records.
For the above requirement I did it the following way:

expdp xxxx/******** schemas=SCOTT directory=dumpdir dumpfile=SCOTT_28-SEP-2012.dmp logfile=exp_SCOTT_28-SEP-2012.log

and imported the following way:

impdp xxxx/******** AS SYSDBA REMAP_SCHEMA=SCOTT:SCOTT1 directory=DUMPDIR dumpfile=SCOTT_28-SEP-2012.dmp logfile=imp_SCOTT2_28-09-2012.log TRANSFORM=SEGMENT_ATTRIBUTES:n TABLE_EXISTS_ACTION=APPEND

The problem is I couldn't append the records to the existing tables; the log shows errors such as:

ORA-31684: Object type USER:"SCOTT1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
[code].....
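For what it's worth, the ORA-31684 messages are only informational (the SCOTT1 user and its grants already exist); with TABLE_EXISTS_ACTION=APPEND the rows are still attempted, but rows that already exist typically fail on unique/primary key constraints, and APPEND has no way to apply updates to rows that are already there (requirement 2). One hedged alternative for a periodic refresh is to pull changed rows over a database link with MERGE instead of Data Pump; the link name, table and key column below are placeholders for illustration only:

-- run in SCOTT1's schema, once per table that needs refreshing
MERGE INTO emp t
USING (SELECT empno, ename, sal FROM scott.emp@prod_link) s
ON (t.empno = s.empno)
WHEN MATCHED THEN UPDATE SET t.ename = s.ename, t.sal = s.sal
WHEN NOT MATCHED THEN INSERT (empno, ename, sal) VALUES (s.empno, s.ename, s.sal);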
Import: Release 11.2.0.3.0 - Production on Tue Apr 23 13:10:51 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "IMPDB"."SYS_IMPORT_TABLE_01":  abc/******** directory=DATA_PUMP_DIR network_link=TESTAR logfile=net_import_proddev.log TABLES=impdb.abc parallel=12 REMAP_SCHEMA=IMPDB:ABC
AIX 6.1, 11.2.0.3. I have an expdp dump from prod to be imported into our test database. I imported it using impdp, but to my surprise the tables were imported while lots of indexes were not created, even though I used TRANSFORM=SEGMENT_ATTRIBUTES:N just so everything would go to the default USERS tablespace. How do I import the indexes separately, skipping the tables and other objects?
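One hedged way to do a second pass just for the indexes is to re-run impdp against the same dump with INCLUDE=INDEX (existing tables are left alone), or to extract the index DDL into a script with SQLFILE and run it by hand. The file, directory and credential names below are placeholders:

-- second pass: only indexes, keeping the segment-attribute stripping
impdp test_user/****** DIRECTORY=dp_dir DUMPFILE=prod_schema.dmp LOGFILE=imp_indexes.log INCLUDE=INDEX TRANSFORM=SEGMENT_ATTRIBUTES:N

-- or just generate the index DDL without executing it
impdp test_user/****** DIRECTORY=dp_dir DUMPFILE=prod_schema.dmp INCLUDE=INDEX SQLFILE=indexes.sql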
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables. The table export works fine, but I get the following error when I import the file (I tried data only as well as data + DDL):
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...= ORA-39001: argument value invalid ORA-39000: .... ORA-31619: ...
The file is in the right place, the Data Pump folder of the new database. The user is the same on both databases, and the database versions are similar.
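ORA-31619 ("invalid dump file") usually points at the dump file itself (a corrupt or truncated copy, an ASCII-mode transfer, or a file the database account cannot read) rather than at the import parameters. One hedged sanity check is to bypass SQL Developer and run impdp from the command line against the same directory object; the names below are placeholders:

impdp target_user/****** DIRECTORY=DATA_PUMP_DIR DUMPFILE=mytables.dmp LOGFILE=imp_check.log

If that fails the same way, re-copy the dump file in binary mode and compare file sizes (or checksums) on both servers before trying again.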
I am trying to migrate a table to a new table that has the field order changed and also has a new field added. My main question is whether it is possible to have Data Pump add values to the new field in the target table. For example:

- original table has fields a, b, d, c
- new table has fields b, c, d, a, e

I want to load the new table and also include adding values for field e. In this case, field e is a year field, so it should be loaded with '2012'. Does Data Pump have the ability to do this? Is reordering the fields going to cause me any problems? We are on Oracle version 11.2.0.3.
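As far as I know, impdp matches columns by name, so the changed column order by itself should not be a problem, but Data Pump has no transform that can populate a column which is not present in the dump. A hedged sketch of one workaround, assuming the new table is pre-created in the same database under a different name (all names and the directory object are placeholders based on the example above):

-- load only the data from the old table into the new one
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=old_tab.dmp TABLES=OLD_TAB REMAP_TABLE=OLD_TAB:NEW_TAB CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND

-- then populate the extra column; e is the year field mentioned above
UPDATE new_tab SET e = 2012 WHERE e IS NULL;
COMMIT;

Whether the append into a table with an extra column goes through cleanly can depend on the access method Data Pump picks, so testing against a small copy of the table first would be sensible.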
When I import each succeeding dump, I drop the existing schema ("SQL> drop user username cascade;") and import the dump with "impdp system ....". Now I would like to import a dump into an existing instance but import the data only, leaving the current packages and other metadata on that instance untouched and unchanged.
1. Do I need to drop the user before the import, given the requirements above?
2. If I do need to drop the user, what should the script be?
3. For the import itself, what parameters should I use? (See the sketch after these questions.)
4. What do I need to consider before doing the import?
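A minimal sketch for requirements like these (parameter choices are assumptions, not a definitive recipe): there is normally no need to drop the user at all, because a data-only import leaves packages and other metadata alone.

impdp system/****** DIRECTORY=dp_dir DUMPFILE=mydump.dmp LOGFILE=imp_data_only.log SCHEMAS=MYSCHEMA CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE

CONTENT=DATA_ONLY imports only table rows, and TABLE_EXISTS_ACTION=TRUNCATE (or APPEND) decides what happens to the rows already in each table; since nothing is dropped, the existing packages, views and grants stay as they are. Things worth checking beforehand: enabled foreign keys on the target tables (TRUNCATE fails on referenced parent tables) and whether any triggers on those tables should be disabled for the load.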
Imagine you have 100 schemas backed up (expdp) in a dump file and you want to import just one schema from that dump file into a DB. You can specify the one schema you want using the SCHEMAS parameter of impdp. But things are not as straightforward when you want to use REMAP_SCHEMA.
Here is my scenario: ===================
I took the expdp dump of schemas A and B in one go, so the dump file has objects from both A and B. The dump file name is schemas_AandB.dmp. Now I want to create schema C from A using the REMAP_SCHEMA parameter:

-- Putting each parameter on a separate line for readability
impdp PSTREF/PSTREF_123
  DIRECTORY=ADET_EFX_DIR
  DUMPFILE=schemas_AandB.dmp
  LOGFILE=CreatingCfromA-Impdp.log
  REMAP_SCHEMA=A:C

Everything goes fine; schema C is created from schema A in the dump file. But impdp tries to create schema B as well, because schema B was present in the dump file. Since schema B and its objects are already in the DB, I get the following errors:
ORA-31684: Object type USER:"B" already exists
ORA-31684: Object type PROCEDURE:"B"."SP_CLEAREXPIREDSESSIONDATA" already exists
ORA-31684: Object type PROCEDURE:"B"."SP_DELETESESSIONDATA" already exists
ORA-31684: Object type PROCEDURE:"B"."SP_DELETESTATECONTEXTINFO" already exists
[code]...
I tried to avoid schema B in the dump file being imported by also specifying SCHEMAS, but I got the following error:

ORA-39065: unexpected master process exception in MAIN
ORA-12801: error signaled in parallel query server PZ99, instance oracth214:HEWRAC1 (1)
ORA-01460: unimplemented or unreasonable conversion requested

Maybe the REMAP_SCHEMA and SCHEMAS parameters won't work together. Is there any way to prevent impdp from importing user B and its objects?
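For what it's worth, SCHEMAS and REMAP_SCHEMA are not mutually exclusive in general, so the ORA-39065/ORA-01460 stack may be something specific to that environment; two hedged things to try (credentials and directory taken from the command above, log file names are placeholders):

-- restrict the import to schema A and remap it
impdp PSTREF/PSTREF_123 DIRECTORY=ADET_EFX_DIR DUMPFILE=schemas_AandB.dmp LOGFILE=CfromA_only.log SCHEMAS=A REMAP_SCHEMA=A:C

-- or keep the remap but exclude schema B explicitly; shown as a parameter file to avoid shell quoting issues
-- contents of cfroma_exclb.par:
DIRECTORY=ADET_EFX_DIR
DUMPFILE=schemas_AandB.dmp
LOGFILE=CfromA_exclB.log
REMAP_SCHEMA=A:C
EXCLUDE=SCHEMA:"='B'"

impdp PSTREF/PSTREF_123 PARFILE=cfroma_exclb.par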
The source server (9.2) has its datafiles under /oracle/data/xxx; on the destination they will be under /u02/oradata/xxx. How do I get imp to change the paths?
I know I can't use db_file_name_convert, as that only works in Data Guard with RMAN, and it looks like you can only set compatible down to 10.2. It's a very small DB (5 MB), but it's a live license-server DB and I need it intact.
I tried a straight import, but that just barfed with "IMP-00015: following statement failed because the object already exists:" on every statement.
EDIT: it's lying, the /oracle path doesn't exist, just FYI.
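imp itself has no path-remapping option; the usual workaround is to pre-create the tablespaces (and the owning users) in the destination database with datafiles at the new location, then run the import with IGNORE=Y. A minimal sketch with placeholder tablespace/user names and sizes:

CREATE TABLESPACE app_data DATAFILE '/u02/oradata/xxx/app_data01.dbf' SIZE 50M AUTOEXTEND ON;
CREATE USER licsrv IDENTIFIED BY licsrv DEFAULT TABLESPACE app_data QUOTA UNLIMITED ON app_data;
GRANT CONNECT, RESOURCE TO licsrv;

imp system/****** FULL=Y FILE=source92.dmp LOG=imp_lic.log IGNORE=Y

IGNORE=Y tells imp to keep loading data into objects that already exist instead of skipping them, which is what the IMP-00015 errors from the straight import were about.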
I am trying to export and import tables in Oracle. For this I am using the imp/exp utilities, but they have to be run from the command prompt. Is there any way I can run these from the SQL prompt, so that by establishing a JDBC connection through Java code I could run them?
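The classic exp/imp tools are operating-system executables, so they cannot be run from the SQL prompt; however, the server-side DBMS_DATAPUMP PL/SQL API does equivalent work, and an anonymous block calling it can be executed over any JDBC connection. A minimal, hedged sketch of a schema export (the directory object, file names and schema name are assumptions; an import job follows the same pattern with OPEN('IMPORT', ...)):

DECLARE
  h          NUMBER;
  job_state  VARCHAR2(30);
BEGIN
  -- create a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump file and log file in an existing directory object
  DBMS_DATAPUMP.ADD_FILE(h, 'myschema.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'myschema_exp.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- limit the job to one schema
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''MYSCHEMA'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || job_state);
END;
/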
My problem is that whenever I import a dump file (Oracle 10g), Oracle imports just 4 tables and then goes into a hung state (not responding). I am using the old imp method (not Data Pump).