Server Utilities :: Import Dump File Without 2 Tables
Jan 3, 2012
I want to import a dump file while leaving out 2 tables. The dump file contains 100 tables, plus indexes and constraints, so out of the 100 tables I want to import only 98.
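If the dump was taken with Data Pump, the cleanest route is the EXCLUDE parameter; shell quoting around it is fiddly, so a parameter file is used in this sketch (the table names T1 and T2 and the directory object DP_DIR are placeholders):

Contents of exclude2.par:
DIRECTORY=dp_dir
DUMPFILE=full.dmp
EXCLUDE=TABLE:"IN ('T1','T2')"

Then run: impdp system/password PARFILE=exclude2.par

With classic imp there is no exclude option; the workaround there is to list the 98 wanted tables in the TABLES parameter instead.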
I am facing a problem importing a DMP file into 11g. While importing, it gives me a "not responding" error. I have attached a jpg file to make clear what goes wrong during the import. My dump is from 9i and I want to import it into 11g R2.
I'm trying to import schemas from a dump file that came from a different environment.
What I have is:
1. the dump file
2. the log file of the export
I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails with a lot of ORA-00959: tablespace 'string' does not exist errors.
Now, I've read on OTN:
[URL]
that what you need to do in that case is to use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.
I don't see the name of the tablespace I'm getting the error for anywhere in the export log, and I don't know whether there are more tablespaces I will have to redirect with REMAP_TABLESPACE.
I don't want to perform this 3 times: run it, hit an error, find out from that which tablespace needs redirection next, and only then start over...
How can I know, from the dump file and the log file, which tablespace names I need to redirect to my own? Or is the tablespace giving me the error the only one in the dump file?
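One way to list every tablespace referenced in a Data Pump dump, without importing anything, is to have impdp write the DDL to a SQL file and then search that file (a sketch; DP_DIR and the file names are placeholders):

impdp system/password DIRECTORY=dp_dir DUMPFILE=exp.dmp FULL=Y SQLFILE=ddl_preview.sql
grep -io 'tablespace "[^"]*"' ddl_preview.sql | sort -u

Each distinct name in the output that does not exist in the target is a candidate for a REMAP_TABLESPACE clause, so one pass tells you all the remappings up front.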
Is it possible to import a dump file using the impdp Data Pump utility on Oracle 10g when the export dump was taken using the traditional exp utility, and vice versa?
I tried to import into 11g a dump that was taken on Oracle 9i. The import starts but hangs after some time; to be exact, it checks only the character sets of the databases and then hangs. Let me know if there are any specific procedures for importing a dump from 9i into 11g directly.
I have more than 100 dump files to import into my Oracle 11g database. I know how to import (impdp) dumps that share a name, but here all the dump file names are totally different (e.g. aa.dmp, bb.dmp, ...).
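Assuming each .dmp is an independent Data Pump export, a shell loop can feed them to impdp one by one (a sketch; /u01/dumps and DP_DIR are placeholders, and DP_DIR must point at /u01/dumps since DUMPFILE takes a bare file name):

for f in /u01/dumps/*.dmp
do
  base=$(basename "$f" .dmp)
  impdp system/password DIRECTORY=dp_dir DUMPFILE=${base}.dmp LOGFILE=imp_${base}.log
done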
I am trying to import a dump into my Oracle 10.2.0.3.0 database on my Windows 7 Professional laptop. The dump was exported from my Windows XP desktop PC running Oracle 10.2.0.1.0.
Below is the error I get:
Import: Release 10.2.0.3.0 - Production on Fri Nov 12 15:57:52 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: system/password@orcl
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production With the Partitioning, OLAP and Data Mining options
Import file: EXPDAT.DMP > F:PersonalDPISIMBA.dmp
Enter insert buffer size (minimum is 8192) 30720>
Export file created by EXPORT:V10.02.01 via conventional path
Import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
List contents of import file only (yes/no): no >
Ignore create error due to object existence (yes/no): no >
Import grants (yes/no): yes >
Import table data (yes/no): yes >
Import entire export file (yes/no): no >
Username: paymaster
Enter table(T) or partition(T:P) names. Null list means all tables for user
Enter table(T) or partition(T:P) name or . if done:
When I press the Enter key, the console hangs, a window appears saying "Console Window Host has stopped working", and then the console closes prematurely.
I have a full export dump file. From it, I need to import only one procedure, which belongs to schema IC_MIGR_DATA, and I need to import it into schema REP_USER.
Import: Release 10.2.0.2.0 - 64bit Production on Friday, 20 June, 2008 16:19:46
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-31694: master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" failed to load/unload
ORA-31644: unable to position to block number 30698425 in dump file "/AIIC_backup/expbkp/dumps/IC/IC_FULL_19062008.dmp"
How do I import this one procedure, JET_UPLIFT? It has to be imported into the REP_USER schema, and the owner of the procedure is IC_MIGR_DATA.
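If the dump is a Data Pump export, a single procedure can be extracted with INCLUDE plus REMAP_SCHEMA. Quoting the INCLUDE clause on a command line is error-prone, so a parameter file is used in this sketch (DP_DIR is a placeholder directory object):

Contents of one_proc.par:
DIRECTORY=dp_dir
DUMPFILE=IC_FULL_19062008.dmp
SCHEMAS=IC_MIGR_DATA
INCLUDE=PROCEDURE:"IN ('JET_UPLIFT')"
REMAP_SCHEMA=IC_MIGR_DATA:REP_USER

Then run: impdp system/password PARFILE=one_proc.par

That said, the ORA-31644 above points to a truncated or corrupt dump file; no parameter combination will get past that, so a clean copy of the file is needed first.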
Is it possible to determine whether a dump file was created using Data Pump export or the normal export method just by looking at the dump file? If yes, how?
The reason I ask is that normal export and Data Pump export both create a dump file with the same extension, filename.dmp. So, to avoid confusion during import, I would like to determine by what method the dump file was created.
This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know by what method the dump file was created).
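The headers quoted elsewhere in this thread suggest one quick test: a conventional exp file begins with a readable banner such as EXPORT:V10.02.01, while a Data Pump file does not. So a check along these lines should work (a sketch; the file name is a placeholder):

if head -c 100 mystery.dmp | strings | grep -q 'EXPORT:V'
then echo 'conventional exp dump - use imp'
else echo 'probably a Data Pump dump - use impdp'
fi

As a cross-check, impdp refuses a conventional export file outright with a bad-dump-file error, so pointing it at the file with SQLFILE set is a harmless way to confirm.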
I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server, and I'm trying to move the dump file to another directory on a Windows Server 2003 machine over the network, using an FTP script that runs automatically after the export process finishes.
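A common pattern is to chain the transfer onto the export in one shell script, so the FTP step only fires when expdp finishes cleanly (a sketch; the host, credentials, paths and names are all placeholders, and the .dmp must be sent in binary mode):

#!/bin/sh
STAMP=$(date +%Y%m%d)
expdp system/password DIRECTORY=dp_dir DUMPFILE=daily_${STAMP}.dmp LOGFILE=daily_${STAMP}.log FULL=Y
if [ $? -eq 0 ]; then
  ftp -n winhost <<EOF
user ftpuser ftppass
binary
cd /backups
put /u01/dumps/daily_${STAMP}.dmp
bye
EOF
fi

Grepping the export log for "successfully completed" is a stricter guard than the exit code alone.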
select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';
The above select query gives the schema size as 15 GB. When I export the same schema, the generated dump file is 2 GB. What is the difference between the two numbers, i.e. how can there be such a variation in size?
I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patchset, 10.2.0.4.
I am creating the production database and importing the 9i dump file into it. Next I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and load it into the new 10g DB.
Do I just need to import the latest 9i dump into the 10g DB, or do I need to do anything else?
I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"
Done in AIX:
create directory dp as '/bak';
grant read, write on directory dp to public;
grant exp_full_database to username;
Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;
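ORA-31619 generally means the server could not read the named file as a valid Data Pump dump. A sketch of the usual checklist: the file must physically sit in the OS path the directory object maps to, DUMPFILE must carry only the bare file name rather than a path, and a file copied from AIX to Windows must have been transferred in binary mode (an ASCII-mode FTP corrupts it):

impdp username/password DIRECTORY=dp DUMPFILE=fpac052912_dp01.dmp LOGFILE=imp.log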
While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - the Import utility version cannot be more recent than the Data Pump server. Following is the version information of the source and target DB servers and the utilities:
Source DB server : 10.1.0.2.0 Export utility : 10.1.0.2.0 Import utility : 10.1.0.2.0
Target DB server : 10.1.0.2.0 Export utility : 10.2.0.1.0 Import utility : 10.2.0.1.0
I have a bit of an issue with Oracle Data Pump dump files.
Today, I manage the export and import of Oracle dump files. As part of the batch export process I have a script which essentially says:
For each schema related to my application in THIS instance, export the schema via the SYSTEM user (the SYSTEM user gives me privileges on all schemas).
On the import UI side of things I am able to run a "head -20" command on the .dmp file and determine the export client version, the date the schema was dumped, and the schema it was dumped from: all useful info presented in my UI.
This matters in that I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on the naming convention I can determine the schema type (production or test). Additionally, and probably most importantly, I am assured of where the data has come from.
Looking at "expdp" and its dump file using the same method as above, it appears the Data Pump dump DOES NOT carry similar headers. Because of this, I can extract very little useful info from the dump file.
I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time-consuming compared to a "head -20" on the dump file.
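For what it's worth, Data Pump does expose header information through an API rather than plain text: DBMS_DATAPUMP.GET_DUMPFILE_INFO reads only the file header, so it is fast even on huge files. A sketch, assuming the file exp.dmp sits under the directory object DP_DIR; the item codes returned vary by release, so this just prints everything, and the filetype values (1 for Data Pump, 2 for original export, on my reading of the docs) should be treated as an assumption to verify:

set serveroutput on
declare
  info  sys.ku$_dumpfile_info;
  ftype number;
begin
  dbms_datapump.get_dumpfile_info(
    filename   => 'exp.dmp',
    directory  => 'DP_DIR',
    info_table => info,
    filetype   => ftype);
  dbms_output.put_line('filetype: ' || ftype);
  for i in 1 .. info.count loop
    dbms_output.put_line('item ' || info(i).item_code || ': ' || info(i).value);
  end loop;
end;
/

A wrapper around this could feed the same fields (version, creation date, source schema) into the import UI that the "head -20" trick used to populate.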
I want to know how to add the date/time to an export dump file name on Linux using a parfile script. I keep getting an "invalid substitution variables" error.
My parfile is:
Dumpfile = Daily_Full_%U_`date "+%Y%m%d%H%S"`.dmp
or
Dumpfile = Daily_Full_%U_`%date%`.dmp
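The backticks are shell command substitution, and expdp reads the parfile literally, so the substitution never runs; hence the invalid-substitution-variable error. One workaround is to keep DUMPFILE out of the parfile and pass it on the command line, where the shell does expand it (a sketch; the parfile name is a placeholder):

STAMP=$(date +%Y%m%d%H%M)
expdp system/password PARFILE=daily_full.par DUMPFILE=Daily_Full_%U_${STAMP}.dmp LOGFILE=Daily_Full_${STAMP}.log

Alternatively, a wrapper script can regenerate the parfile with the timestamp already expanded before calling expdp.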
I have a dump file of 17 GB which I want to import into my database db1 under user amol, so I created a new user under db1 as below. Before this I created a tablespace, so that I can import my data into that tablespace only. My steps are as below.
create user amol identified by amol default tablespace ptaxold1 temporary tablespace tem;
imp amol/amol
Then I specified my dump file, but after importing it does not show any tables under user amol.
How do I import a dump into a specific tablespace instead of the default tablespace USERS?
I want to import my dump file into a newly created tablespace; how can I do that? I have created a new user called cvm, and while creating it I set its default tablespace to the newly created one. But when I try to import my dump file, everything goes to the USERS tablespace.
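With classic imp, objects keep their original tablespace whenever the importing user has quota on it; the usual trick is to strip every other quota so everything falls back to the user's default tablespace (a sketch; cvm and the new tablespace name newts follow the posts above, the rest are assumptions):

revoke unlimited tablespace from cvm;
alter user cvm quota 0 on users;
alter user cvm quota unlimited on newts;
imp system/password file=mydump.dmp fromuser=olduser touser=cvm

With Data Pump the same thing is a one-liner: impdp ... REMAP_TABLESPACE=users:newts.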
I was told to move 8 tables, along with their constraints, indexes, grants, rows and triggers, from one database to another. I did an export and import for that. The command I used was:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
Server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified tables via Direct Path ...
. . exporting table tab1 12 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
[code]...
Here is the import output log:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V10.02.01 via direct path
Import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
Import server uses WE8ISO8859P1 character set (possible charset conversion)
. importing JAM's objects into JAM
[code]...
Everything got imported successfully. Still, I have a doubt about the export and import commands: was the command I used correct, or does anything need to be added to it?
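For reference, a typical table-mode pair for a job like this looks as follows (a sketch; names are placeholders, and STATISTICS=NONE suppresses the EXP-00091 warnings, which come from the client/server character set mismatch flagged in the log):

exp jam/password@srcdb tables=tab1,tab2,tab3 file=tabs.dmp log=exp_tabs.log statistics=none
imp jam/password@tgtdb file=tabs.dmp log=imp_tabs.log fromuser=jam touser=jam ignore=y

GRANTS, INDEXES, TRIGGERS, CONSTRAINTS and ROWS all default to Y in exp, so they come across without being listed explicitly.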
I'm trying to do an export/import process using the command prompt, and the idea is to export records based on a date condition, with the date as a parameter. My code is like this:
exp <username>/<password>@<database> file=<table_name>.dmp tables=<source_table> query="where <date> between &start_date AND &end_date";
Is it possible to do it like this, so that it prompts you to enter the start and end dates?
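exp itself never prompts: the &variable syntax belongs to SQL*Plus. A wrapper script that prompts and then writes a parfile sidesteps both that and the quote-escaping pain of passing QUERY on the command line (a sketch for a Unix shell; the table and column names are placeholders, and the same parfile trick works from a Windows batch file):

#!/bin/sh
printf 'Start date (DD-MON-YYYY): '; read START_DATE
printf 'End date (DD-MON-YYYY): '; read END_DATE
cat > exp_range.par <<EOF
tables=emp
file=emp_range.dmp
query="where hiredate between to_date('$START_DATE','DD-MON-YYYY') and to_date('$END_DATE','DD-MON-YYYY')"
EOF
exp scott/tiger@orcl parfile=exp_range.par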