Server Utilities :: Error While Importing Dump File In Oracle 10g R1

Nov 1, 2012

While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - Import utility version can not be more recent than the Data Pump server. Following is the version information of the source and target databases and the utilities:

Source DB server : 10.1.0.2.0
Export utility : 10.1.0.2.0
Import utility : 10.1.0.2.0

Target DB server : 10.1.0.2.0
Export utility : 10.2.0.1.0
Import utility : 10.2.0.1.0
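
UDI-00018 generally means the impdp client being invoked is newer than the database it connects to (here a 10.2.0.1.0 client against a 10.1.0.2.0 server). A minimal sketch of the usual workaround, assuming the 10.1.0.2.0 Data Pump client is available (for example by running the import on the database server itself); directory object, file names, schema and credentials are placeholders:

# run the import with the 10.1.0.2.0 impdp binary so the client is not newer than the server
$ORACLE_HOME/bin/impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=source_schema.dmp LOGFILE=source_schema_imp.log SCHEMAS=SOURCE_SCHEMA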

View 5 Replies



Server Utilities :: Importing 9i Dump Into 10g?

Jun 22, 2011

How can I import an Oracle 9i dump file into a 10g database? While importing I get the following error: IMP-00002: failed to open dump file.

View 4 Replies View Related

Server Utilities :: Error / Bad Dump File Specification / Invalid Argument Value

Nov 21, 2012

I am restoring a dump file into Oracle 10g.

My command is:

impdp DUMPFILE=BAK910830.DMP NOLOGFILE=Y

The error messages are as follows:

"Invalid argument value"

"Bad dump file specification"

"Dump File may ba an original export Dump file "

I think the dump file is in Oracle 11g format.
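
If the last message is accurate and the file really is an original export (exp) dump, Data Pump impdp cannot read it; the classic imp utility would be the tool to try. A sketch with placeholder credentials and log name (if the file instead turns out to be an 11g Data Pump dump, it would need to be re-exported from 11g with a lower VERSION setting before a 10g import):

imp system/password FILE=BAK910830.DMP FULL=Y LOG=bak910830_imp.log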

View 8 Replies View Related

Server Utilities :: Importing Dump From Higher Database Version To Lower

Aug 3, 2010

Can we import a dump taken from a higher database version into a lower database version?
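
With original exp/imp this generally fails, but when both ends support Data Pump the usual approach is to export with the VERSION parameter set to the lower (target) release so the dump is written in a format that release can read. A sketch only, with placeholder names and a hypothetical 10.2 target:

expdp system/password SCHEMAS=APP_OWNER DIRECTORY=DATA_PUMP_DIR DUMPFILE=app_owner_v102.dmp LOGFILE=app_owner_v102.log VERSION=10.2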

View 26 Replies View Related

Server Utilities :: Overwrite Existing Dump File In Expdp In Oracle 10g?

Apr 13, 2012

How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the .dmp file already exists, we get the error below:

ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists

There is a feature in 11g, REUSE_DUMPFILES=Y, but it does not work in 10g. Is there something that can overwrite an existing dump file in 10g?
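
Since REUSE_DUMPFILES only exists from 11g onwards, the common 10g workaround is simply to delete (or rename) the old file before each run in the script that launches expdp. A Windows-flavoured sketch, assuming the directory object points at C:\ as the ORA-31641 message suggests; path, credentials and object names are placeholders:

REM remove any previous dump so expdp can recreate it (10g has no REUSE_DUMPFILES)
if exist C:\scott_emp.dmp del C:\scott_emp.dmp
expdp scott/tiger DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_emp.dmp TABLES=EMP LOGFILE=scott_emp_exp.log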

View 1 Replies View Related

Server Utilities :: Default Path Of Log File After Import Dump In Oracle 10g

Feb 25, 2011

What is the default path of the log file after importing a dump in Oracle 10g?
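
Assuming the import was done with Data Pump (impdp), the log file (import.log unless LOGFILE is specified) is written to the server-side directory object used for the job, which defaults to DATA_PUMP_DIR; with the original imp utility the log simply goes wherever the LOG= parameter points. A quick way to see the OS path behind the default directory object:

SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATA_PUMP_DIR';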

View 1 Replies View Related

Server Utilities :: Import Constraints Only From Dump File Using Oracle Data Pump?

Nov 16, 2011

How can I import only the constraints from a dump file using Oracle Data Pump?
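
A minimal sketch using the INCLUDE filter, assuming the dump was taken with Data Pump; credentials, file names and the directory object are placeholders (NOT NULL constraints travel with the table definitions, so they are not covered by this filter):

impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=schema.dmp LOGFILE=constraints_only.log INCLUDE=CONSTRAINT INCLUDE=REF_CONSTRAINT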

View 1 Replies View Related

Server Utilities :: Import A Dump File Using Impdp Data Pump Utility On Oracle 10g

Feb 19, 2012

Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken using the traditional exp utility, and vice versa?

View 1 Replies View Related

Server Utilities :: Importing From DMP File Taken From Different Edition

Mar 2, 2010

I have a full export .dmp file taken from Oracle 10.2.0.1 Enterprise Edition.

I need to reinstall 10.2.0.1 Standard Edition and want to create a database using the dump file taken from the Enterprise Edition.

View 3 Replies View Related

Server Utilities :: Character Set Error While Importing

Jun 9, 2011

I have Oracle 10.2.0.4.0 installed on Windows Server 2008 [Machine A] and Oracle 10.2.0.1.0 on Windows XP [Machine B]. I took an export of the database on Windows Server 2008 [Machine A] by putting its entry in the tnsnames.ora file of Windows XP [Machine B].

Now, when I import on the same machine, I get the below-mentioned error:

C:\Documents and Settings\dsharma>IMP FROMUSER=SYSTEM TOUSER=ESCDBO FILE='D:sharevcc53_0106.dmp' LOG='D:sharevcc53_0106_IMP.LOG' ignore=y

Import: Release 10.2.0.1.0 - Production on Thu Jun 9 20:08:11 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Username: system
Password:

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

Export file created by EXPORT:V10.02.01 via conventional path
import done in WE8MSWIN1252 character set and UTF8 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses US7ASCII character set (possible charset conversion)
Import terminated successfully without warnings.

View 14 Replies View Related

Circular Synonym Error While Importing Dump?

Mar 5, 2011

I have a database instance containing several schemas. I took a dump of a particular schema (user) using the Oracle exp command:

exp <username>/<passwd>@<dbname> FULL=y FILE=export.dmp LOG=exportdb.log CONSISTENT=y

Now I have another database where I wish to import the above dump. It is not an empty database; however, I have dropped the particular user for which I created the dump above, and then recreated the same user using 'create user...'. Now I am trying to import the above dump into it:

imp '<username>/<passwd>@<dbname>' FILE=C:\DUMP\Dump\export.dmp LOG=C:\DUMP\logs\imp_dmp.log FULL=Y

The import goes fine, but I get the circular synonym error for some of the types.

IMP-00003: ORACLE error 4055 encountered
ORA-04055: Aborted: "O_BULK_TARGET_SELECTOR" formed a non-REF mutually-dependent cycle with "T_GENERAL_IDLIST".
ORA-06550: line 5, column 25:
PLS-00421: circular synonym 'PUBLIC.T_GENERAL_IDLIST'

The database from which I created the dump has no errors for these objects, but the import raises them. I have even tried taking a dump of the whole database (not just the one user) and importing it into a completely empty database, and I still get the same error.

View 3 Replies View Related

Importing From A Dump File - Incompatible Version Number

Sep 3, 2012

I am trying to import from a dump (.dmp) file. I'm running Oracle Database 11g Enterprise Edition Release 11.1.0.6.0.

The import statement I'm using is:

impdp system/password@orcl full=Y DIRECTORY=data_pump_dir dumpfile=mydmpfile.dmp logfile=min.log

and the error I'm getting is "incompatible version number 3.1 in the dump file mydmpfile.dmp".

The dump file was exported using Oracle 11.2.0.2.0. I tried downloading/unzipping the 11.2.0.2 Instant Client, adding it to the PATH variable in Windows, and re-running the script, but it didn't work.

How should I go about importing this dump file from here without reinstalling the whole database?

View 3 Replies View Related

Server Utilities :: Importing From 9i To 11g - Getting ORA-29339 Error Because Of Block Size?

Oct 5, 2010

We are working on migrating from 9.2.0.4 to 11.2 and we've set up a test machine so that we could test the install and the import (as well as test additional 11g features that we want to begin using).

So we created the database and created all of the tablespaces beforehand.

Our import command is

$ORACLE_HOME/bin/imp system/manager FULL=Y BUFFER=140000 FILE=/dbexport/Lhtech.exp VOLSIZE=2000M GRANTS=Y INDEXES=Y COMMIT=Y IGNORE=Y

However, when we run the import, we get errors like the following:

Import: Release 11.2.0.1.0 - Production on Tue Oct 5 15:01:19 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V09.02.00 via conventional path

[code]....

First of all, the block size in our newly created tablespaces is 8192, yet the import is obviously trying to recreate the tablespaces with a block size of 2048.

1) Why is it not ignoring these create tablespace commands when those tablespaces already exist?

2) How in the world do we get around the block size issue? We've tried nearly everything we could find, but we've still not had any luck.
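
For question 2, the workaround usually cited for ORA-29339 is to give the 11g instance a buffer cache for the foreign block size so the embedded CREATE TABLESPACE statements can be processed at all; whether that is appropriate here depends on whether you actually want any 2K tablespaces. A sketch only, with an illustrative size and subject to available SGA:

ALTER SYSTEM SET db_2k_cache_size = 16M;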

View 8 Replies View Related

Server Utilities :: Dump File Determination

Mar 29, 2013

Is it possible to determine whether a dump file was created using Data Pump export or the conventional export method just by looking at the dump file? If yes, how?

The reason I ask is that conventional export and Data Pump export both create a dump file with the same extension, filename.dmp, so to avoid confusion during import I want to determine by what method the dump file was created.

This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know by what method the dump file was created).
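
One rough check, sketched for Linux with a placeholder file name: a conventional exp dump begins with a readable "EXPORT:Vxx.xx.xx" banner, while a Data Pump dump does not, so peeking at the first printable strings in the file usually settles it.

strings mydump.dmp | head -3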

View 23 Replies View Related

Server Utilities :: FTP Dump File Over Network

Apr 19, 2010

I have a daily hot backup taken with the expdp command on Oracle 10g R2 installed on a Linux server. I'm trying to move this dump file to a directory on a Windows Server 2003 machine over the network, using an FTP script that will run automatically after the export process finishes.
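
A minimal sketch of such a post-export step, assuming a plain ftp client on the Linux side; the host name, credentials and paths are placeholders. Transferring in binary mode matters, since ASCII-mode FTP corrupts dump files.

#!/bin/sh
# push today's dump to the Windows server after expdp has finished
ftp -n winserver2003 <<END_FTP
user ftpuser ftppassword
binary
cd /oracle_dumps
put /backup/daily_export.dmp
bye
END_FTP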

View 9 Replies View Related

Server Utilities :: Dump File Generation

Jan 18, 2012

I have a question on export dump file generation.

select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';

The above select query reports the schema size as 15 GB, but when I perform an export of the same schema, the generated dump file is 2 GB. Why is there such a difference between the two sizes?

View 6 Replies View Related

Server Utilities :: How To Reimport 9i Dump File To 10g

May 29, 2010

I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patch set, 10.2.0.4.

I am creating the production database and importing the 9i dump file into it. Then I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and import it into the new 10g DB.

Do I just need to import the latest 9i dump into the 10g DB, or do I need to do anything else?

View 3 Replies View Related

Server Utilities :: Import Dump File In 11g

Feb 24, 2012

I am facing a problem importing a DMP file into 11g. While importing, it gives me a "not responding" error. I have attached a JPG file to make clear what goes wrong during the import. My dump was taken on 9i and I want to import it into 11g R2.

View 4 Replies View Related

Server Utilities :: Export Dump File

Jul 29, 2011

Is it possible to identify the level of export by looking at the export dump file, i.e. whether it is a schema export, full export, table export, and so on?

If yes, how?

View 3 Replies View Related

Server Utilities :: Import Dump File Without 2 Tables

Jan 3, 2012

I want to import a dump file, excluding 2 tables. The dump file contains 100 tables, indexes and constraints; out of the 100 tables, I want to import 98 of them.
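
Assuming the dump was taken with Data Pump, the EXCLUDE filter handles this, and putting it in a parameter file avoids operating-system quoting problems. A sketch with hypothetical table and file names:

# exclude_two.par - import everything except two tables
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=full_schema.dmp
LOGFILE=imp_98_tables.log
EXCLUDE=TABLE:"IN ('TABLE_A','TABLE_B')"

impdp system/password PARFILE=exclude_two.par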

View 13 Replies View Related

Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)

May 29, 2012

I need to recreate/clone my database on a new machine. The two machines are not connected over the network.

Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

Step 2.
FTP dump files to Windows

Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y

I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"

Done in AIX:
create directory dp as '/bak'
grant read, write on directory dp to public;
grant exp_full_database to username;

Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;

View 8 Replies View Related

Server Utilities :: What Kind Of Backup By Looking At Export Dump File

Jan 2, 2012

Is there a way to know what kind of backup (table/tablespace/full/schema) a dump file represents by looking at the export dump file? If yes, what is the command?

View 1 Replies View Related

Server Utilities :: EXP Versus IMPDP Interrogating Dump File

Jul 16, 2010

I have a bit of an issue with Oracle Data Pump dump files.

Today, I manage the export and import of Oracle dump files. As part of the batch export process I have a script which essentially says:

For each schema related to my application in THIS instance, export the schema via the system user (the system user gives me privileges on all schemas).

On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the export client version, the date the schema was dumped, and what schema it was dumped from - all useful info presented in my UI.

Sample output: Begin

EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##

[code]....

Sample output: End

This matters because I allow production schemas to be imported into test schemas (contained in a different tablespace). Based on naming convention I can determine the schema type (production or test). Additionally, and probably most importantly, I am assured of where the data has come from.

Looking at "expdp" and its dump file with the same method as above, it appears the Data Pump dump DOES NOT carry similar headers. Because of this, I am able to extract very little useful info from the dump file.

I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time-consuming compared to a "head -20" on a dump file.
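
One alternative to a full SQLFILE pass, if the database release provides it, is DBMS_DATAPUMP.GET_DUMPFILE_INFO, which reads only the dump header and reports items such as the file version and creation date, plus a file-type code that distinguishes Data Pump dumps from original export dumps. A sketch only, with placeholder file and directory names:

SET SERVEROUTPUT ON
DECLARE
  v_info     ku$_dumpfile_info;
  v_filetype NUMBER;
BEGIN
  dbms_datapump.get_dumpfile_info(
    filename   => 'UserXXX.dmp',      -- placeholder dump file name
    directory  => 'DATA_PUMP_DIR',    -- placeholder directory object
    info_table => v_info,
    filetype   => v_filetype);
  dbms_output.put_line('File type code: ' || v_filetype);
  FOR i IN 1 .. v_info.COUNT LOOP
    dbms_output.put_line('Item ' || v_info(i).item_code || ': ' || v_info(i).value);
  END LOOP;
END;
/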

View 4 Replies View Related

Server Utilities :: How To Add Date / Time In Export Dump File In Linux

Jul 11, 2013

I want to know how to add the date/time to the export dump file name on Linux using a parfile. I keep getting an error, "contains an invalid substitution variable".

My parfile attempts are below (a shell-level workaround sketch follows them):

Dumpfile = Daily_Full_%U_`date "+%Y%m%d%H%S"`.dmp or
Dumpfile = Daily_Full_%U_`%date%`.dmp
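
Data Pump reads parameter files literally, so neither backticks nor %date% are expanded there. A common workaround, sketched below with placeholder credentials and directory object, is to resolve the timestamp in the calling shell script and pass DUMPFILE on the command line:

#!/bin/sh
# build the timestamp in the shell, where substitution actually happens
STAMP=$(date +%Y%m%d%H%M)
expdp system/password FULL=Y DIRECTORY=DATA_PUMP_DIR \
      DUMPFILE=Daily_Full_${STAMP}_%U.dmp \
      LOGFILE=Daily_Full_${STAMP}.log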

View 1 Replies View Related

Server Utilities :: Data Pump Import - How To Identify Dump File Tablespaces

Jul 5, 2012

I'm trying to import schemas from a dump file that came from a different environment.

What I have is:

1. dump file
2. log file of the export

I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails with a lot of ORA-00959: tablespace 'string' does not exist.

Now, I've read in OTN:

[URL]

that what you need to do in that case is to use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.

I don't see the name of the tablespace I'm getting the error for in the export log, and I don't know whether there are more tablespaces I have to redirect with REMAP_TABLESPACE.

I don't want to run this three times, hit an error, learn from it which tablespace needs redirection next, and only then start over...

How can I find out, from the dump file and the log file, which tablespace names I need to remap to my own? Or is the tablespace giving me the error the only one in the dump file?
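
One way to get the full list up front, assuming the dump is a Data Pump export, is to generate the DDL with SQLFILE (no data is loaded) and then pull out every tablespace reference; the file names and directory object below are placeholders:

impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=export.dmp SQLFILE=ddl_preview.sql
grep -oi 'TABLESPACE "[^"]*"' ddl_preview.sql | sort -u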

View 3 Replies View Related

Server Utilities :: Imported Dump File Does Not Show Any Table Under User Amol

Dec 20, 2011

I have a dump file of 17 GB which I want to import into my database db1 under user amol, so I created a new user in db1 as shown below. Before that, I created a tablespace so that my data would be imported into that tablespace only. My steps are as follows:

CREATE TABLESPACE ptaxold1 DATAFILE '/home/oracle/oracle/product/10.2.0/oradata/cvsdbm/ptaxold1.dbf' SIZE 6024M AUTOEXTEND ON;

create user amol identified by amol default tablespace ptaxold1 temporary tablespace tem;

Then I ran imp amol/amol and specified my dump file, but after importing it does not show any table under user amol.

View 12 Replies View Related

Server Utilities :: Importing Limited Data In Oracle

Jun 29, 2010

I have received a dump which I need to load into a newly created schema. There is a particular table with more than 4 million rows, while the other tables have hardly a few thousand rows each.

I want to import it in such a way that only 1000 rows get imported for this table and the other tables are not affected. Is there a way to do it?

Note: there are more than 200 tables in the dump.
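
If the dump is a Data Pump export, the QUERY parameter can cap the rows for just that one table while everything else loads in full; original imp has no equivalent, so the dump format matters. A sketch as a parameter file, with hypothetical schema, table and file names:

# limit_rows.par - import all tables, but only 1000 rows of the big one
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=export.dmp
SCHEMAS=APP_OWNER
QUERY=APP_OWNER.BIG_TABLE:"WHERE ROWNUM <= 1000"

impdp system/password PARFILE=limit_rows.par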

View 7 Replies View Related

Server Utilities :: Importing Data From SQL Server To Oracle?

Mar 7, 2011

I'm importing data from SQL Server into Oracle. I used BCP to export the data from SQL Server. Below is the first record of table trlc from the CSV file.

trlc.CSV

11032|100|Wman| | |2008-02-08| |

Using SQL Loader to import into Oracle:

TRLC table in Oracle database:

est_no varchar2(10) default ' '
right_no number(4) default 0
maj_auth varchar2(15) default ' '
weight varchar2(10) default ' '
idm_ht varchar2(8) default ' '
c_date date
P_tkt varchar2(5) default ' '

sqlldr user/pwd@db02 control=trlc.ctl log=trlc.log

trlc.ctl:

load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no,right_no,maj_auth,weight,idm_ht,c_date,P_tkt)

The rows get inserted successfully, but the result sets are different. For example, when I run 'select len(weight) from trlc;' in SQL Server, I get the length as 0, but when I run the equivalent select in the Oracle database, I get the length as 1. Also, the result set varies for the query below:

select * from trlc where weight=' ';

(SQL Server returns 1 row but Oracle returns no rows)

Do I need to specify any conversion code for the weight field so that it accepts the ' ' value?
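
One possible approach, sketched below and not necessarily the whole answer, is to normalise the field with an inline SQL expression in the control file so that an empty or blank-padded input always lands as a single space (matching the column default):

load data
infile 'trlc.csv'
replace
into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no, right_no, maj_auth,
 weight "NVL(RTRIM(:weight), ' ')",
 idm_ht, c_date, P_tkt)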

View 12 Replies View Related

Server Utilities :: Reading From Oracle Dump

Jun 14, 2006

Is there any way of reading from an Oracle dump file?

View 16 Replies View Related

Server Utilities :: Import Oracle 10g Dump Into 9i Database

Mar 31, 2010

If I want to import a 10g export dump file into the 9i database, I connect to the 10g database from the 9i side using
exp user/password@10gdb ....

However, is there any option such as executing the 9i catexp.dat on the 10g database and doing the export from the 10g database itself, to be imported into 9i?

View 6 Replies View Related






