Error While Doing Import Using Data Pump?
Dec 8, 2012
We have a requirement to export a full database dump from a source database and import it into a target database.
source database - Oracle 11.2.0.1
OS version - AIX 5.3
character set: UTF8
target database - Oracle 11.2.0.3
OS version - AIX 6.1
character set: AL32UTF8
I did the export from the source database and gave the dump files to the DBA of the target database.
When he tried to import from this dump, he got the following errors for 3 tables:
ORA-02374: conversion error loading table
ORA-12899: value too large for column
ORA-02372: data for row
The DBA says there is a character conversion issue and that I need to change the source database character set (NLS_CHARACTERSET) and then export these 3 tables separately.
But on analysis, I found that UTF8 is a subset of AL32UTF8, so Oracle should do this conversion implicitly.
My questions are:
1. For this issue, is the only solution to change the source database character set to match the target before exporting, or is there another way? (One possible workaround is sketched after this list.)
2. If I need to change the source database character set, would it affect the other data already stored there?
3. Is there any way to do the character set conversion on the fly while running expdp?
4. Does this issue come from the Oracle version difference (11.2.0.1 to 11.2.0.3) or from the OS difference (AIX 5.3 to 6.1)?
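For reference, a commonly suggested workaround (offered tentatively, not as the definitive fix) is to leave both character sets alone and instead widen the affected columns on the target, for example by switching them to character-length semantics, then reload only the three failing tables. All object, directory and file names below are placeholders:
-- on the target, widen or re-declare the overflowing columns (placeholder names)
ALTER TABLE app_owner.problem_table MODIFY (description VARCHAR2(4000 CHAR));
-- then re-import just the failing tables, appending data into the adjusted tables
impdp system DIRECTORY=dp_dir DUMPFILE=full_db.dmp TABLES=app_owner.problem_table CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND LOGFILE=retry_3_tables.log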
View 2 Replies
Feb 15, 2013
When I import each subsequent dump, I currently drop the existing schema ("SQL> drop user username cascade;") and import the dump with "impdp system ....". Now I would like to import a dump into an existing instance but load only the data, leaving the current packages and other metadata on that instance untouched and unchanged.
1. Do I need to drop the user before the import, given the requirements above?
2. If I do need to drop the user, what should the script be?
3. For the import itself, what parameters should I use? (A sketch follows this list.)
4. What do I need to consider before doing the import?
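Not an authoritative answer, but Data Pump can do a data-only load without dropping the user at all, which seems to match what is wanted here. A minimal sketch, with directory, dump file and schema names as placeholders:
impdp system DIRECTORY=dp_dir DUMPFILE=latest.dmp SCHEMAS=app_schema CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE LOGFILE=data_only_imp.log
CONTENT=DATA_ONLY loads rows only, so packages and other metadata already on the instance are left untouched; TABLE_EXISTS_ACTION=TRUNCATE replaces the old rows rather than appending to them.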
View 12 Replies
Nov 16, 2011
How do I import only the constraints from a dump file using Oracle Data Pump?
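One hedged possibility is the INCLUDE filter, put in a parfile so the quoting stays simple; directory and file names below are placeholders:
DIRECTORY=dp_dir
DUMPFILE=export.dmp
LOGFILE=constraints_only.log
INCLUDE=CONSTRAINT
INCLUDE=REF_CONSTRAINT
impdp system PARFILE=constraints_only.par
INCLUDE=CONSTRAINT covers the ordinary table constraints and INCLUDE=REF_CONSTRAINT the foreign keys.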
View 1 Replies
Jul 5, 2012
I'm trying to import schemas from a dump file that came from a different environment.
What I have is:
1. dump file
2. log file of the export
I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails with many ORA-00959: tablespace 'string' does not exist errors.
Now, I've read on OTN:
[URL]
that what you need to do in that case is use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.
I don't see the name of the tablespace I'm getting the error for in the export log, and I don't know whether there are more tablespaces I have to redirect with REMAP_TABLESPACE.
I don't want to run this 3 times, hit an error, discover the next tablespace needing redirection that way, and only then start over.
How can I know, from the dump file and the log file, which tablespace names I need to redirect to my own names? Or is the tablespace giving me the error the only one in the dump file?
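One way to see every tablespace the dump references, without loading anything, is a SQLFILE-only pass: Data Pump writes the DDL it would have run into a script, which can then be searched for TABLESPACE clauses. A rough sketch, with directory and file names as placeholders:
impdp system DIRECTORY=dp_dir DUMPFILE=export.dmp SQLFILE=ddl_preview.sql
grep -i tablespace ddl_preview.sql | sort -u
Each distinct tablespace name that does not exist on the target then gets its own REMAP_TABLESPACE=old_ts:new_ts entry on the real import, so a single run can cover all of them.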
View 3 Replies
Feb 19, 2012
Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken with the traditional exp utility, and vice versa?
View 1 Replies
Sep 17, 2012
I'm trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables. The table export works fine, but I encounter the following error when I import the file (I tried data only and data + DDL).
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...=
ORA-39001: argument value invalid
ORA-39000: ....
ORA-31619: ...
The file is in the right place, the data pump folder of the new database. The user is the same on both databases and the database versions are similar.
View 4 Replies
Oct 14, 2012
I have taken an expdp dump from an 11g database running in a Development environment. Now I want to import this dump into a 10g database running in a QA environment.
While taking the export from the 11g database I used this command and the backup was successful:
expdp system TABLES=sss_exp_test.EXP_SB_HEADER_TMP VERSION=10.2 DIRECTORY=RMSDEV_IMP_DIR DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_expdp.log
When I try to import this dump into 10g I get an error:
impdp system TABLES=sss_exp.EXP_SB_HEADER_TMP DIRECTORY=RMS_DATA_PUMP DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_impdp.log
Import: Release 10.2.0.5.0 - 64bit Production on Sunday, 14 October, 2012 19:58:53
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Password:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39040: Schema expression "SCHEMAS" must identify exactly one schema.
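Not a definitive diagnosis, but note that the table was exported from schema sss_exp_test while the import asks for sss_exp. A hedged sketch is to keep the original owner in TABLES and remap it instead:
impdp system TABLES=sss_exp_test.EXP_SB_HEADER_TMP REMAP_SCHEMA=sss_exp_test:sss_exp DIRECTORY=RMS_DATA_PUMP DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_impdp.log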
View 8 Replies
Jan 7, 2013
I have a new machine with Oracle 11g, and I have a dump exported from Oracle 10g. Now I need to import that dump into Oracle 11g.
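Importing an older-release dump into a newer database is the supported direction, so if the dump was taken with expdp a plain impdp on the 11g side should be enough (directory and file names below are placeholders); if it was taken with the old exp utility, the old imp utility must be used instead:
impdp system DIRECTORY=dp_dir DUMPFILE=from_10g.dmp LOGFILE=imp_from_10g.log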
View 3 Replies
Jan 17, 2013
I am setting up an import of a dump; the dump was exported from a user I don't know, into a tablespace I don't know.
How can I import the dump, from whatever user and tablespace are inside it, into my own customized user?
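A hedged two-step sketch: first generate a SQLFILE from the dump to see which user and tablespace it actually contains, then import with both remapped; all names below are placeholders:
impdp system DIRECTORY=dp_dir DUMPFILE=unknown.dmp SQLFILE=preview.sql
impdp system DIRECTORY=dp_dir DUMPFILE=unknown.dmp REMAP_SCHEMA=source_user:my_user REMAP_TABLESPACE=source_ts:users LOGFILE=imp_remap.log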
View 11 Replies
Jul 26, 2010
I am trying to import a database dump using the following command:
impdp system/xxxx@xxxx schemas=staging
remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaing.log
TRANSFORM=SEGMENT_ATTRIBUTES:n
It imports data fine up to a certain point, after which Oracle gives the following error:
Processing object type SCHEMA_EXPORT/JAVA_SOURCE/JAVA_SOURCE
ORA-39097: Data Pump job encountered unexpected error -1423
ORA-39065: unexpected master process exception in DISPATCH
ORA-01423: error encountered while checking for extra rows in exact fetch
ORA-04030: out of process memory when trying to allocate 123404 bytes (QERHJ has
h-joi,kllcqas:kllsltba)
ORA-39014: One or more workers have prematurely exited.
Job "SYSTEM"."SYS_IMPORT_SCHEMA_04" stopped due to fatal error at 11:42:03
I thought it was due to lack of memory, so I increased pga_aggregate_target from 512MB to 600MB, but I am still getting the same error. System configuration:
Microsoft windows Server 2003 R2,
standard Edition
Service Pack 2
Intel® Xeon CPU 3.00GHz,
4 GB RAM
SWAP MEM: 4106 MB
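Since the worker dies while loading JAVA_SOURCE and ORA-04030 is an operating-system process-memory failure that a higher pga_aggregate_target does not always cure, one hedged workaround is to import everything except the Java objects first and handle them in a separate, smaller run (or recreate them from source). A sketch reusing the original command, with only the log file name changed to keep the runs apart:
impdp system/xxxx@xxxx schemas=staging remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaging_nojava.log TRANSFORM=SEGMENT_ATTRIBUTES:n EXCLUDE=JAVA_SOURCE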
View 2 Replies
May 28, 2011
I imported a database dump file; that database has a username/password of, say, abc/xyz, while my database username/password is system/vinod. After importing, I was unable to log in with system/vinod (error: invalid username/password), and one strange thing happened: a user had been created with abc/xyz as its username/password. Only after altering my user was I able to log in as my original user again.
View 1 Replies
Mar 14, 2012
My problem is that whenever I import a dump file (Oracle 10g), the import loads just 4 tables and then goes into a hung state (not responding). I'm using the old imp utility, not Data Pump.
View 1 Replies
Sep 20, 2010
I am getting the error below while doing an import through Data Pump. It looks like the shared pool does not have enough memory allocated.
Can I ignore this error? What is it trying to do with the DELETE FROM "SYS"."IMPDP_STATS"; INSERT INTO "SYS"."IMPDP_STATS" statements?
ERROR
=====
. . imported "DBA1KSD"."TDF" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
[Code]....
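Those statements are Data Pump loading the exported optimizer statistics through its staging table. A common hedged workaround, if the statistics step itself keeps failing, is to skip statistics on import and regather them afterwards; directory and file names below are placeholders:
impdp system DIRECTORY=dp_dir DUMPFILE=export.dmp EXCLUDE=STATISTICS LOGFILE=imp_nostats.log
-- then, in SQL*Plus, regather statistics for the imported schema
EXEC DBMS_STATS.GATHER_SCHEMA_STATS('DBA1KSD')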
View 6 Replies
Jan 14, 2011
We have an old full export .dmp file from a 10g database, and there are 451 records in one specific table that we need. Is it possible to imp just that one table from the full dump? Or, as another option, can we extract the records of that one table from the .dmp file into an XML file?
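Since the dump was made with the old exp utility, the matching imp utility can pull a single table out of a full dump (as far as I know it cannot write XML directly). A sketch with owner and table names as placeholders:
imp system/password FILE=full_export.dmp FROMUSER=source_owner TOUSER=target_owner TABLES=needed_table LOG=one_table_imp.log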
View 14 Replies
Feb 7, 2011
I have exported a schema dump with schema name 'A'. I want to import that dump into schema 'B'. How?
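Assuming the dump was taken with expdp, a minimal sketch (directory and file names are placeholders):
impdp system DIRECTORY=dp_dir DUMPFILE=schema_a.dmp REMAP_SCHEMA=A:B LOGFILE=imp_a_to_b.log
With the old exp/imp utilities the equivalent would be FROMUSER=A TOUSER=B.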
View 5 Replies
Feb 24, 2012
I am facing a problem importing a DMP file into 11g. During the import it stops responding. I have attached a jpg file to show exactly what goes wrong during the import. My dump is from 9i and I want to import it into 11g R2.
View 4 Replies
Jun 18, 2012
We have the following scenario:
1) We get a daily Oracle 11g dump.
2) We only want to import the current changes (the delta) into a MySQL database.
For this we need:
a) Is there a tool, script, or best practice to get only the differences (delta)?
b) How do we import an Oracle dump, or only the differences, into a MySQL database?
View 2 Replies
Jan 3, 2012
I want to import a dump file while excluding 2 of its tables. The dump file contains 100 tables plus indexes and constraints, so out of the 100 tables I want to import 98.
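A hedged sketch using a parfile and the EXCLUDE filter; the two table names are placeholders:
DIRECTORY=dp_dir
DUMPFILE=export.dmp
LOGFILE=imp_98_tables.log
EXCLUDE=TABLE:"IN ('TABLE_A','TABLE_B')"
impdp system PARFILE=skip_two_tables.par
Putting the EXCLUDE clause in a parfile avoids the shell-quoting problems you would hit typing it directly on the command line.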
View 13 Replies
Mar 31, 2010
If I want to import a 10g export dump file into a 9i database, I connect to the 10g database from the 9i server using:
exp user/password@10gdb ....
However, is there an option like running 9i's catexp.sql on the 10g database and doing the export from the 10g database itself, to be imported into 9i?
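What you are already doing is, to my knowledge, the documented approach: to move data down a release, run the lower-release exp client against the higher-release database, then import with that same lower release. A sketch run from the 9i server with its own exp and imp executables; connect strings and names are placeholders:
exp user/password@10gdb FILE=for_9i.dmp OWNER=appuser LOG=exp_for_9i.log
imp user/password@9idb FILE=for_9i.dmp FROMUSER=appuser TOUSER=appuser LOG=imp_on_9i.log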
View 6 Replies
May 31, 2011
I tried to import a dump into 11g that was taken from Oracle 9i. The import starts but hangs after some time; to be exact, it only checks the character sets of the databases and then hangs. Are there any specific procedures for importing a dump from 9i into 11g directly?
View 8 Replies
Sep 26, 2012
I'm importing a dump using these parameters:
impdp system schemas=schemaname directory=DIR transform=segment_attributes:n:table dumpfile=FILE.DMP logfile=FILE.log
Upon import, I get this error:
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'NAME' does not exist
Failing sql is:
GRANT SELECT ON "schemaname"."tablename" TO "NAME"
I know that "NAME" was created on the previous instance either role or user where the dump came. My question is, how can i remove this error since this role/user is not needed to the new instance and what parameter should i include to my import script?
View 2 Replies
Sep 25, 2010
We have a QA database on a VM server running Windows 2003 with Oracle 10.2.0.1 installed, and limited disk space. We received an expdp file from a client that is large enough (40GB) that we had to copy it to a network drive. I created a new directory object called IMPDMP with the directory path (using UNC pathing) to \serversharefoldersubfolder (our network-mapped P drive; yes, I included the backslash, but I have tried without it also). I also included the parfile here. I checked the grants and they seem to be fine:
SQL> select * from session_roles where role like '%DATABASE' or role like 'DBA';
ROLE
------------------------------
DBA
EXP_FULL_DATABASE
IMP_FULL_DATABASE
SQL> select * from session_privs where privilege like '%DICT%';
PRIVILEGE
----------------------------------------
SELECT ANY DICTIONARY
ANALYZE ANY DICTIONARY
[code]....
My questions are these:
1) In interactive mode, does a dummy file expdat.dmp have to exist in the DATA_PUMP_DIR directory?
2) Does my export have to reside in the DATA_PUMP_DIR directory? (Again, there is no disk space for the DMP file; one of the hard drives is just big enough to hold it, but since it also holds datafiles, the import would crash when trying to extend.)
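For what it is worth, here is a hedged sketch of pointing a directory object straight at the share; the UNC path and file names below are placeholders. The usual catch on Windows is that the Oracle service account, not the logged-on user, must be able to reach the share, so a per-user mapped drive letter generally does not work.
CREATE OR REPLACE DIRECTORY impdmp AS '\\server\share\folder\subfolder';
GRANT READ, WRITE ON DIRECTORY impdmp TO system;
impdp system DIRECTORY=impdmp DUMPFILE=client_export.dmp PARFILE=import.par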
View 3 Replies
Apr 25, 2011
I am trying to export/import a schema whose size is around 60 GB.
The export parfile goes like this:
file=expdmp1.dmp, expdmp2.dmp, expdmp3.dmp, expdmp4.dmp, expdmp5.dmp, expdmp6.dmp, expdmp7.dmp
filesize=10240M
log=explog.log
owner=owner1
The import parfile goes like this:
file=impdmp1.dmp, impdmp2.dmp, impdmp3.dmp, impdmp4.dmp, impdmp5.dmp, impdmp6.dmp, impdmp7.dmp
filesize=10240M
log=implog.log
fromuser=owner1
touser=owner2
ignore=y
I am going to run this in production, so I want to check it first.
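One tentative observation before you run it: imp reads whatever files exp actually wrote, so the import parfile normally has to list the export's file names rather than a new set, along these lines:
file=expdmp1.dmp, expdmp2.dmp, expdmp3.dmp, expdmp4.dmp, expdmp5.dmp, expdmp6.dmp, expdmp7.dmp
filesize=10240M
log=implog.log
fromuser=owner1
touser=owner2
ignore=y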
View 2 Replies
Jan 17, 2013
I have 3 dump files: A.dmp, B.dmp, C.dmp. Can I use multiple REMAP_TABLESPACE entries in a parfile to remap the tablespaces for the above dump files?
The parfile would look something like this:
DIRECTORY=dpump
DUMPFILE=A.dmp,B.dmp,C.dmp
JOB_NAME=import_3_schemas
REMAP_TABLESPACE=A1:D1
REMAP_TABLESPACE=B1:E1
REMAP_TABLESPACE=C1:F1
The first remap entry is only relevant to the A.dmp file, the second only to the B.dmp file, and so on.
View 2 Replies
Aug 2, 2012
I have more than 100 dump files to import into my Oracle 11g database. I know how to impdp when the dumps share the same name, but here all the dump file names are totally different (e.g. aa.dmp, bb.dmp, ...).
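Assuming each of the 100+ files is a self-contained export rather than pieces of one multi-file export set, a hedged sketch is a small shell loop that runs one impdp per file; the directory object dp_dir and the path are placeholders, and dp_dir must point at the folder holding the dumps:
for f in /u01/dumps/*.dmp
do
  # one import per dump file, with a log named after the dump
  impdp system DIRECTORY=dp_dir DUMPFILE=$(basename "$f") LOGFILE=$(basename "$f" .dmp)_imp.log
done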
View 3 Replies
Feb 15, 2007
When I ran the Data Pump import command, I got the error below:
Import: Release 10.1.0.2.0 - Production on Thursday, 15 February, 2007 3:54
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SYSTEM"."SYS_SQL_FILE_FULL_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_SQL_FILE_FULL_02": system/******** directory=data_pump dumpfile=prashant_dp.
dmp SQLFILE=prashant_imp.sql logfile=prashant_imp.log
[code]...
View 8 Replies
Nov 12, 2010
I am trying to import a dump into my Oracle 10.2.0.3.0 database on my Windows 7 Professional laptop. The dump was exported from my Windows XP desktop PC running Oracle 10.2.0.1.0.
Below is the error I get:
Import: Release 10.2.0.3.0 - Production on Fri Nov 12 15:57:52 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: system/password@orcl
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
With the Partitioning, OLAP and Data Mining options
Import file: EXPDAT.DMP > F:PersonalDPISIMBA.dmp
Enter insert buffer size (minimum is 8192) 30720>
Export file created by EXPORT:V10.02.01 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
List contents of import file only (yes/no): no >
Ignore create error due to object existence (yes/no): no >
Import grants (yes/no): yes >
Import table data (yes/no): yes >
Import entire export file (yes/no): no >
Username:paymaster
Enter table<T> or partition<T:P> names. NULL list means all tables for user
Enter table<T> or partition<T:P> name or . if done:
When I press the Enter key, the console hangs, a window appears saying "Console Window Host has stopped working", and then the console closes prematurely.
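A hedged workaround for the Console Window Host crash is to avoid interactive mode altogether and pass everything on the command line; the file path below is a placeholder, while the user name is taken from the session above:
imp system/password@orcl FILE=C:\dumps\ISIMBA.dmp FROMUSER=paymaster TOUSER=paymaster LOG=isimba_imp.log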
View 1 Replies
Feb 25, 2011
What is the default path of the log file after importing a dump in Oracle 10g?
View 1 Replies
Jun 20, 2008
I have a full export dump file. From it, I need to import only one procedure belonging to schema IC_MIGR_DATA, and I need to import it into schema rep_user.
I am using this syntax:
impdp system/icg0ld@ICPRD directory=DUMPDIR dumpfile=IC_FULL_19062008.dmp logfile=imp_IC_FULL_190608.log schemas=rep_user parfile=imp_proc.par
parfile :
--------
INCLUDE=PROCEDURE:"LIKE 'IC_MIGR_DATA.JET_UPLIFT'"
While importing, I am getting the error below:
*****[oracle10@AIICDELL IC]$ impdp system/icg0ld@ICPRD directory=DUMPDIR dumpfile=IC_FULL_19062008.dmp logfile=imp_IC_FULL_190608.log schemas=rep_user parfile=imp_proc.par
Import: Release 10.2.0.2.0 - 64bit Production on Friday, 20 June, 2008 16:19:46
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-31694: master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" failed to load/unload
ORA-31644: unable to position to block number 30698425 in dump file "/AIIC_backup/expbkp/dumps/IC/IC_FULL_19062008.dmp"
******
How do I import this one procedure, JET_UPLIFT? It has to be imported into the REP_USER schema, and the owner of the procedure is IC_MIGR_DATA.
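Two hedged observations: ORA-31644 usually points at a dump file that is damaged or was not transferred completely, so the file itself is worth re-checking first; and the INCLUDE name clause filters on the object name only, with the owning schema selected separately and then remapped. A sketch of the parfile and command under those assumptions (log file name is a placeholder):
Parfile imp_proc.par:
SCHEMAS=IC_MIGR_DATA
INCLUDE=PROCEDURE:"= 'JET_UPLIFT'"
REMAP_SCHEMA=IC_MIGR_DATA:REP_USER
Command:
impdp system/icg0ld@ICPRD directory=DUMPDIR dumpfile=IC_FULL_19062008.dmp logfile=imp_jet_uplift.log parfile=imp_proc.par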
View 5 Replies