Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)
May 29, 2012
I need to recreate/clone my database on a new machine. The two machines are not connected over the network.
Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp
Step 2.
FTP dump files to Windows
Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y
I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"
Done in AIX:
create directory dp as '/bak';
grant read, write on directory dp to public;
grant exp_full_database to username;
Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;
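Two hedged guesses from these symptoms. First, the FTP transfer must be done in binary mode; an ASCII-mode transfer corrupts the binary dump and produces exactly ORA-31619. Second, a dump written by a 10.2.0.5 expdp cannot be read by a 10.1.0.2 impdp unless the export is taken with a matching VERSION. A sketch of both fixes:
expdp username/password@db1 full=y version=10.1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log
ftp> binary
ftp> put fpac052912_dp01.dmp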
View 8 Replies
Jul 16, 2010
I have a bit of an issue with Oracle Data Pump dump files.
Today, I manage the export and import of Oracle dump files. As part of the batch export process I have a script which essentially says:
For each schema related to my application in THIS instance, export the schema via the system user (the system user gives me privileges on all schemas).
On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the "export client version", "date the schema was dumped", and "what schema it was dumped from". All useful info presented in my UI.
Sample output: Begin
EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##
[code]....
Sample output: End
The UI is flexible in that I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on naming convention I can determine the schema type (production or test). Additionally, and probably most importantly, I am assured of where the data has come from.
Looking at "expdp" and its dump file using the same method as above, it appears the Data Pump dump DOES NOT carry similar headers. Because of this, I can extract very little useful info from the dump file.
I realize I could run the impdp with the "sqlfile=myfile.sql" and then interrogate the sql file for the info. But on large dump files this would be fairly time consuming compared to a "head -20" on a dump file.
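One possibility, hedged: DBMS_DATAPUMP.GET_DUMPFILE_INFO reads the header of a Data Pump dump directly, which would give the UI version and creation-date style info without a full sqlfile run. A minimal sketch, assuming a 10.2+ database and that the dump is visible through a directory object named DP (hypothetical):
SET SERVEROUTPUT ON
DECLARE
  v_info     ku$_dumpfile_info;  -- collection of (item_code, value) pairs
  v_filetype NUMBER;             -- 1 = Data Pump dump, 2 = original export, as I recall
BEGIN
  DBMS_DATAPUMP.GET_DUMPFILE_INFO(
    filename   => 'UserXXX.dmp',
    directory  => 'DP',
    info_table => v_info,
    filetype   => v_filetype);
  DBMS_OUTPUT.PUT_LINE('filetype = ' || v_filetype);
  FOR i IN 1 .. v_info.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE(v_info(i).item_code || ': ' || v_info(i).value);
  END LOOP;
END;
/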
View 4 Replies
Feb 19, 2012
Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g where the export dump was taken using the traditional exp utility, and vice versa?
View 1 Replies
Nov 21, 2012
I am restoring a dump file in Oracle 10g.
My command is:
impdp DUMPFILE=BAK910830.DMP NOLOGFILE=Y
The error messages are as follows:
"Invalid argument value"
"Bad dump file specification"
"Dump file may be an original export dump file"
I think the dump file is in Oracle 11g format.
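The wording of the last message suggests the file may actually be an original exp dump, in which case it needs imp rather than impdp. If it really is an 11g Data Pump dump, a 10g impdp cannot read it; the fix is to re-export from the 11g source with a VERSION the 10g server understands. Sketches of both, with credentials and directory hypothetical:
imp system/password file=BAK910830.DMP full=y log=imp.log
expdp system/password directory=dp version=10.2 dumpfile=BAK910830_v10.DMP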
View 8 Replies
Aug 2, 2012
I have more than 100 dump files to import into my Oracle 11g database. I know how to import (impdp) dumps with uniform names, but here all the dump file names are totally different (ex: aa.dmp, bb.dmp, ...).
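If each of the 100 files is an independent dump set, each needs its own impdp run, which a shell loop handles; if they are all pieces of one export, list them comma-separated in a single DUMPFILE instead. A sketch of the loop, assuming the directory object dp points at /backup (hypothetical):
#!/bin/sh
for f in /backup/*.dmp; do
  base=$(basename "$f" .dmp)
  # one import per independent dump set
  impdp system/password directory=dp dumpfile="${base}.dmp" logfile="imp_${base}.log"
done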
View 3 Replies
Mar 29, 2013
Is it possible to determine whether a dump file was created using Data Pump export or the normal export method just by looking at the dump file? If yes, how?
The reason I am asking is that normal export and Data Pump export both create a dump file with the same extension, filename.dmp, so to avoid confusion during import I want to determine by what method the dump file was created.
This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know by what method the dump file was created).
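A quick check that follows from the head -20 trick discussed earlier in this digest: an original exp dump starts with an EXPORT:Vxx.xx.xx banner, while a Data Pump dump does not. A sketch:
head -c 80 mystery.dmp | strings
# output beginning with e.g. EXPORT:V10.02.01 indicates an original exp dump;
# no such banner suggests a Data Pump file
For a definitive answer, DBMS_DATAPUMP.GET_DUMPFILE_INFO returns a filetype code distinguishing the two.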
View 23 Replies
Apr 19, 2010
I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server. I'm trying to move this dump file to a directory on a Windows Server 2003 machine over the network, using an FTP script which will run automatically after the export process finishes.
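A minimal sketch of that automation, with host names, credentials and paths all hypothetical; the binary command matters, since an ASCII-mode transfer corrupts the dump:
#!/bin/sh
expdp system/password directory=dp dumpfile=daily.dmp logfile=daily.log schemas=app || exit 1
ftp -n win2003host <<'EOF'
user ftpuser ftppass
binary
lcd /bak
put daily.dmp
bye
EOF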
View 9 Replies
Jan 18, 2012
I have a question on export dump file generation.
select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';
The above query gives a schema size of 15 GB. When I export the same schema, the generated dump file size is 2 GB. What is the difference between the two scenarios; how can there be such a variation in file size?
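For what it's worth, dba_segments counts all allocated space, including indexes, LOB segments and free blocks below the segment high-water marks, while the dump file contains only the actual row data plus DDL, so a large gap is normal. A breakdown query may show where the 15 GB sits:
SELECT segment_type, ROUND(SUM(bytes)/1024/1024/1024, 2) AS gb
FROM   dba_segments
WHERE  owner = 'JACK'
GROUP  BY segment_type
ORDER  BY gb DESC;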
View 6 Replies
May 29, 2010
I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patchset, 10.2.0.4.
I am creating the production database and importing the 9i dump file into it. Now I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and import it into the new 10g DB.
Do I need to just import the latest 9i dump into the 10g DB, or do I need to do anything else?
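If the plan is just to refresh the 10g copy from a newer 9i dump, the usual route is the original imp utility (a 9i exp dump is not readable by impdp), dropping and recreating the application schemas first so no stale objects or duplicate rows survive. A sketch, with the schema name and tablespaces hypothetical:
DROP USER appuser CASCADE;
CREATE USER appuser IDENTIFIED BY pw DEFAULT TABLESPACE users TEMPORARY TABLESPACE temp;
imp system/password file=prod_9i.dmp log=refresh.log fromuser=appuser touser=appuser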
View 3 Replies
Feb 24, 2012
I am facing a problem importing a DMP file into 11g. While importing, it gives me a "not responding" error. I have attached a jpg file to show what goes wrong during the import. My dump is from 9i; I want to import it into 11g R2.
View 4 Replies
Jul 29, 2011
Is it possible to identify the level of an export by looking at the export dump file, i.e. whether it is a schema export, full export, table export, etc.?
If yes, how?
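Going by the head -20 sample earlier in this digest, an original exp header encodes the mode right after the EXPORT:V banner and the D<user> line: as far as I recall, an R record reading RUSERS indicates a schema (owner) export, RTABLES a table export, and RENTIRE a full export. A sketch:
head -5 mystery.dmp | strings
# EXPORT:V09.02.00
# DSYSTEM
# RUSERS      <-- schema-mode export (hypothetical output)
For Data Pump dumps the header is not readable this way; DBMS_DATAPUMP.GET_DUMPFILE_INFO is the closer tool.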
View 3 Replies
Jan 3, 2012
I want to import a dump file while skipping 2 tables. The dump file contains 100 tables plus indexes and constraints; out of the 100 tables I want to import only 98.
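A sketch using the EXCLUDE filter (the two table names are hypothetical placeholders); note that excluding a table also excludes its dependent indexes and constraints. The quoting is easiest from a parfile:
# skip2.par
DIRECTORY=dp
DUMPFILE=full.dmp
LOGFILE=imp.log
EXCLUDE=TABLE:"IN ('TAB_A','TAB_B')"
impdp user/password parfile=skip2.par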
View 13 Replies
Nov 1, 2012
While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - Import utility version cannot be more recent than the Data Pump server. Following is the version information of the source and target DBs and the utilities:
Source DB server : 10.1.0.2.0
Export utility : 10.1.0.2.0
Import utility : 10.1.0.2.0
Target DB server : 10.1.0.2.0
Export utility : 10.2.0.1.0
Import utility : 10.2.0.1.0
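UDI-00018 is the Data Pump client refusing to run because it is newer than the database it connects to: here a 10.2.0.1.0 impdp client is pointed at a 10.1.0.2.0 target server. A sketch of the usual fix, assuming the target server's own ORACLE_HOME is at the path shown (hypothetical):
# use the impdp that ships with the target 10.1 home, not the newer client
export ORACLE_HOME=/u01/app/oracle/product/10.1.0
$ORACLE_HOME/bin/impdp system/password directory=dp dumpfile=exp.dmp logfile=imp.log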
View 5 Replies
Jan 2, 2012
Is there a way to know what kind of backup (table/tablespace/full/schema) a dump file holds just by looking at the export dump file? If yes, what is the command?
View 1 Replies
Jul 11, 2013
I want to know how to add a date/time to the export dump file name in Linux using a parfile. I keep getting the error "contain an invalid substitution variables".
my parfile is:
Dumpfile = Daily_Full_%U_`date "+%Y%m%d%H%S"`.dmp or
Dumpfile = Daily_Full_%U_`%date%`.dmp
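The error comes from expdp itself: the only substitution it understands inside a DUMPFILE value is %U, and backticks in a parfile are never passed to the shell. A sketch of the usual workaround, moving DUMPFILE out of the parfile and computing the timestamp in the shell wrapper:
#!/bin/sh
STAMP=$(date +%Y%m%d%H%M%S)
expdp system/password parfile=daily.par dumpfile=Daily_Full_%U_${STAMP}.dmp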
View 1 Replies
Apr 13, 2012
How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the dump file exists, we get the error below:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists
There is a feature in 11g, reuse_dumpfiles=y, which doesn't work in 10g. I want something that can overwrite an existing dump file in 10g.
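In 10g the usual workaround is to delete the old file from the OS just before the export runs. A sketch for the Windows case shown above (a batch wrapper, paths hypothetical):
del C:\scott_emp.dmp
expdp scott/tiger directory=dp dumpfile=scott_emp.dmp logfile=scott_emp.log
(On Unix, rm -f plays the same role.)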
View 1 Replies
Feb 25, 2011
What is the default path of the log file after importing a dump in Oracle 10g?
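For what it's worth, impdp writes its log (default name import.log) into the directory object named by the DIRECTORY parameter; on 10.2, when no DIRECTORY is given, it falls back to the DATA_PUMP_DIR directory object. A sketch to resolve that object to an OS path:
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATA_PUMP_DIR';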
View 1 Replies
Nov 16, 2011
How can I import only constraints from a dump file using Oracle Data Pump?
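A sketch of one way to do this with the INCLUDE filter (directory, dump file and credentials are hypothetical). Note that referential constraints will only create successfully if the referenced tables already exist in the target:
impdp user/password directory=dp dumpfile=exp.dmp include=CONSTRAINT
# or write the DDL to a script first, without touching the database:
impdp user/password directory=dp dumpfile=exp.dmp include=CONSTRAINT sqlfile=constraints.sql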
View 1 Replies
Jul 5, 2012
I'm trying to import schemas from a dump file that came from a different environment.
What I have is:
1. dump file
2. log file of the export
I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails, giving a lot of ORA-00959: tablespace 'string' does not exist.
Now, I've read in OTN:
[URL]
that what you need to do in that case is to use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.
I don't see the name of the tablespace I'm getting the error for in the export log. I don't know if there are more tablespaces I have to redirect with REMAP_TABLESPACE.
I don't want to run this three times, hit an error, learn from it what the next tablespace needing redirection is, and only then start over...
How can I know, from the dump file and the log file, which tablespace names I need to redirect to my own names? Or is the tablespace giving me the error the only one in the dump file?
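One way to answer this from the dump file itself, without running the real import repeatedly: a SQLFILE-mode run writes the DDL to a script instead of executing it, and the tablespace names can then be pulled out of that script. A sketch (directory and file names hypothetical):
impdp system/password directory=dp dumpfile=exp.dmp full=y sqlfile=ddl.sql
grep -o 'TABLESPACE "[^"]*"' ddl.sql | sort -u
Every name that comes back and does not exist in the target needs its own REMAP_TABLESPACE clause.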
View 3 Replies
Dec 20, 2011
I have a dump file of 17 GB which I want to import into my database db1 under user amol, so I created a new user under db1 as below. Before this I created a tablespace so that I can import my data only into that tablespace. My steps are as below.
CREATE TABLESPACE ptaxold1 DATAFILE '/home/oracle/oracle/product/10.2.0/oradata/cvsdbm/ptaxold1.dbf' SIZE 6024M AUTOEXTEND ON;
create user amol identified by amol default tablespace ptaxold1 temporary tablespace tem;
Then I ran imp amol/amol and specified my dump file, but after importing it does not show any tables under user amol.
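Two hedged guesses about why no tables show up. If the dump was taken with expdp, original imp cannot read it at all and impdp must be used instead. If it is an original export taken from a different schema, a plain imp amol/amol will not map the objects into AMOL; FROMUSER/TOUSER does that. A sketch (source schema name hypothetical):
imp amol/amol file=mydump.dmp log=imp.log fromuser=SOURCE_OWNER touser=AMOL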
View 12 Replies
Oct 8, 2010
I am getting the below error while running impdp:
Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema ADAS is needed to import this object, but is unaccessible
ORA-28031: maximum of 148 enabled roles exceeded
[code]...
ORA-06512: at "SYS.KUPW$WORKER", line 1342
ORA-06512: at line 2
Job "SYS"."SYS_IMPORT_FULL_01" stopped due to fatal error at 17:13:38
View 2 Replies
Apr 30, 2009
When I export a schema that owns several jobs (DBA_JOBS) using expdp, and then import it into another database, the jobs change ownership to the schema I used for the import.
For example :
the log_user, priv_user and schema_user used to be schema 'A',
but after the impdp, assuming I imported as SYSTEM, the log_user, priv_user and schema_user are now SYSTEM!
Is this expected behavior? How can I avoid it? The job will fail executing as SYSTEM, and I need to manually recreate it as user A.
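I can't say whether this is documented as expected, but a commonly suggested workaround is to run the import connected as the schema owner itself, so the jobs are created with A as log_user/priv_user, rather than importing as SYSTEM. A sketch (password and directory hypothetical):
impdp A/password directory=dp dumpfile=exp.dmp schemas=A
Failing that, the fallback is what you describe: drop the imported jobs and resubmit them with DBMS_JOB while connected as A.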
View 5 Replies
Jul 9, 2013
I have a question regarding user creation during export and import.
Will the user get created along with roles and privileges by default when using the impdp command?
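When the dump was taken by a privileged user, a schema-mode dump carries the CREATE USER statement plus its grants, as the SCHEMA_EXPORT/USER, SYSTEM_GRANT, ROLE_GRANT and DEFAULT_ROLE steps visible in import logs elsewhere in this digest show. A sketch to see exactly what user-creation DDL a given dump holds, without importing anything (names hypothetical):
impdp system/password directory=dp dumpfile=exp.dmp sqlfile=user_ddl.sql include=USER include=SYSTEM_GRANT include=ROLE_GRANT include=DEFAULT_ROLE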
View 6 Replies
May 8, 2012
I did an export using the following parfile (see below). I want to import all the objects associated with this schema into another DB, but I don't want to overwrite any of the permissions such as grants.
Is there a way I can get the grants into a sql file before I do the import? If so, provide an example.
cat exp_par
DUMPFILE=exp.dmp
LOGFILE=exp.log
DIRECTORY=DBBACKUP
schemas=t1
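One way to get exactly this, sketched on top of the parfile above (dump file name hypothetical): a SQLFILE-mode run with an INCLUDE filter writes the grant DDL to a script without executing anything, and the real import can then skip grants with EXCLUDE:
impdp system/password directory=DBBACKUP dumpfile=exp.dmp schemas=t1 include=GRANT sqlfile=grants.sql
impdp system/password directory=DBBACKUP dumpfile=exp.dmp schemas=t1 exclude=GRANT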
View 1 Replies
May 26, 2010
I'm trying to do a network-mode Data Pump import between Oracle databases, and it seems to hang indefinitely when it gets to the point where it should be processing table data.
C:>impdp DP_USER/DP_USER parfile=sde_webmap_2.par
Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 May, 2010 17:42:03
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "DP_USER"."SYS_IMPORT_FULL_01": DP_USER/******** parfile=sde_webmap_2.par
Estimate in progress using BLOCKS method...
[code]...
It just sits at this point indefinitely. The parfile, for those interested:
directory=datapumps
logfile=sde_webmap_2.log
network_link=backup
full=y
INCLUDE=SCHEMA:"IN ('SDE_WEBMAP_BUSINESS','SDE_WEBMAP_BUSINESS_A','SDE_WEBMAP_BUSINESS_B')"
And the results from V$SESSION_LONGOPS
69 SYS_IMPORT_FULL_01 IMPORT 0 1031 MB 5/26/2010 5:50:37 PM 5/26/2010 6:03:29 PM
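When a network-mode job sits like this, one hedged diagnostic is to see what the Data Pump sessions are actually waiting on:
SELECT s.sid, s.event, s.state, s.seconds_in_wait
FROM   v$session s, dba_datapump_sessions d
WHERE  s.saddr = d.saddr;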
View 3 Replies
Sep 25, 2011
As we know, there is a parameter named indexes in the original imp that is used to generate CREATE INDEX DDL. Is there a parameter in impdp comparable to it?
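The closest impdp equivalent I know of to imp's INDEXFILE/INDEXES handling is SQLFILE combined with an INCLUDE filter, which writes the CREATE INDEX statements to a script without executing them. A sketch (names hypothetical):
impdp user/password directory=dp dumpfile=exp.dmp include=INDEX sqlfile=create_index.sql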
View 6 Replies
Oct 23, 2013
I was running impdp for my company's database. Everything is just fine except one problem: a big table (61.1G) couldn't be imported. The warning message shows "imported TBL_XXXX 61.1G 0 out of 147653981 rows". The alert log had no warning message. I didn't know...
View 12 Replies
Nov 9, 2012
I exported three databases, servicedesk, report and mostcmdb, on server A to dump files. I moved them to a similar dump location on server B. Now, I want to import them into B by using:
SQL> impdp system/OSS_MOS7100 dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb
The error was:
SP2-0734: unknown command beginning "impdp syst..." - rest of line ignored.
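SP2-0734 here simply means the command was typed at the SQL*Plus prompt: impdp is an operating-system executable, not a SQL*Plus command. Run it from the OS shell instead, along the lines of (directory object name hypothetical):
C:\> impdp system/OSS_MOS7100 directory=dp dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb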
View 7 Replies
Feb 13, 2012
1) My database dump size is about 4 GB, which was provided by the vendor.
2) In the dump there are 364949 objects in total, where:
Table : 121316
LOB object : 121315
(Normal+LOB) indexes : 122317
3) Now when I run the import using system or another user, it hangs at the below stage for 70+ hours:
impdp ntest/ntest directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Import: Release 11.2.0.1.0 - Production on Fri Feb 10 09:49:50 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "NTEST"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "NTEST"."SYS_IMPORT_FULL_01": ntest/******** directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"NTEST" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
----
In this situation I observed the worker status and saw that some tables and some LOB objects, including LOB indexes, were imported. The worker process does this in the background, but it does not show in the import log file (I don't understand why it is not shown there). It imports one table, one LOB, one LOB index... then again one table, one LOB, one LOB index... in this way.
My observation is that it first inserts data into the LOB segments and then into the normal table. Only when it starts inserting data into the normal table does that table appear in the import log file.
An example of our data types:
Objects:
===================================================
LOB_FD17_RGS_TSTCD2 LOB
FD17_RGS_VERSION TABLE
(here I see that one table has one LOB segment; in this way 121316 tables have 121316 LOBs)
SQL> desc FD17_RGS_VERSION
Name Null? Type
----------------------------------------- -------- ----------------------------
RECID VARCHAR2(255)
XMLRECORD BLOB
Our observation is that inserting the BLOBs is perhaps what mainly causes the slowness. Is there any patch, or any known bug, regarding BLOB/LOB objects in Oracle 11gR2?
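For what it's worth, you can watch what the workers are doing from the interactive prompt rather than from the log, which never shows per-row LOB progress. A sketch, using the job name from the log above:
impdp ntest/ntest attach=SYS_IMPORT_FULL_01
Import> status
Also, as far as I know, rows with LOB columns are loaded one at a time rather than via direct path, so a schema with 121316 LOB segments being slow is at least consistent with normal (if painful) behavior.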
View 6 Replies
Apr 15, 2012
I started a Data Pump import without using the parallel option.
I issued the following command:
impdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24
The total dump file size is 200 GB.
Now I want to add parallel processes to the job.
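Since the export was written as bef_cob_%U.dmp (multiple files), the job can use parallel workers. A sketch, assuming the running job's name is the hypothetical SYS_IMPORT_SCHEMA_01; look it up first, then attach and raise the parallelism from the interactive prompt:
SELECT owner_name, job_name, state FROM dba_datapump_jobs;
impdp t24/t24 attach=SYS_IMPORT_SCHEMA_01
Import> parallel=4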
View 9 Replies