Server Utilities :: FTP Dump File Over Network
Apr 19, 2010
I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server. I'm trying to move this dump file to a directory on a Windows Server 2003 machine over the network, using an FTP script that runs automatically after the export process finishes.
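A minimal sketch of such a post-export script, assuming the dump lands in /bak on the Linux server and the Windows box exposes an FTP account; the host, credentials, directory names and file names here are hypothetical:

#!/bin/bash
# Hypothetical values - adjust host, credentials, directories and file names.
DUMP_DIR=/bak
DUMP_FILE=daily_hot_$(date +%Y%m%d).dmp
WIN_HOST=192.168.1.50

# Take the daily export first; the transfer starts only after expdp has finished.
expdp system/manager@orcl schemas=APPUSER directory=DP dumpfile=${DUMP_FILE} logfile=daily_hot.log

# Push the dump to the Windows Server 2003 machine; binary mode keeps the file intact.
ftp -n ${WIN_HOST} <<EOF
user ftpuser ftppass
binary
lcd ${DUMP_DIR}
cd /oracle_dumps
put ${DUMP_FILE}
bye
EOF

Scheduling the whole script from cron (rather than scheduling expdp and the FTP step separately) is what guarantees the transfer only runs after the export completes.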
View 9 Replies
Mar 29, 2013
Is it possible to determine whether a dump file was created with Data Pump export or with the normal (original) export utility just by looking at the dump file? If yes, how?
The reason I ask is that the normal export and Data Pump export both create dump files with the same .dmp extension, so to avoid confusion during import I want to determine which method created the file.
This would also be useful when a customer sends me only the dump file and asks me to import it into the target database (the customer may not even know which method was used to create it).
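One hedged way to check from a Unix shell: the original exp utility writes a readable banner such as EXPORT:V10.02.01 at the start of the file, while a Data Pump dump begins with a binary header, so looking at the first bytes is usually enough (the file name below is hypothetical):

# Show the first printable strings; a classic exp dump normally starts with an EXPORT:Vnn.nn.nn banner.
head -c 200 mystery.dmp | strings | head -5

# Crude classifier built on that banner (assumption: the banner format is stable across releases).
if head -c 200 mystery.dmp | strings | grep -q 'EXPORT:V'; then
  echo "looks like an original exp dump"
else
  echo "no exp banner found - probably a Data Pump (expdp) dump"
fi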
View 23 Replies
View Related
Jan 18, 2012
I have a question on export dump file generation.
select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';
The above query reports the schema size as 15 GB, but when I export the same schema the generated dump file is only 2 GB. Why is there such a variation between the two figures?
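For what it is worth, dba_segments reports all allocated space, including indexes, LOB segments and unused blocks inside extents, while the export writes only table row data (indexes and constraints go into the dump as DDL text only). A hedged breakdown query, assuming SQL*Plus access as a DBA user with hypothetical credentials:

# Where the 15 GB actually sits; anything that is not table data adds little to the dump.
sqlplus -s system/manager@db1 <<'EOF'
select segment_type, round(sum(bytes)/1024/1024/1024,2) gb
from   dba_segments
where  owner = 'JACK'
group  by segment_type
order  by 2 desc;
EOF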
View 6 Replies
View Related
May 29, 2010
I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patchset, 10.2.0.4.
I am creating the production database and importing the 9i dump file into it. Then I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and bring it over to the new 10g DB.
Do I just need to import the latest 9i dump into the 10g DB, or do I need to do anything else?
View 3 Replies
View Related
Feb 24, 2012
I am facing a problem importing a DMP file into 11g. During the import it gives an error and stops responding. I have attached a JPG file to make clear what goes wrong during the import. My dump was taken on 9i and I want to import it into 11g R2.
View 4 Replies
View Related
Jul 29, 2011
Is it possible to identify the level of an export by looking at the dump file, i.e. whether it is a schema export, a full export, a table export, and so on?
If yes, how?
View 3 Replies
View Related
Jan 3, 2012
I want to import a dump file while leaving out 2 tables. The dump file contains 100 tables plus indexes and constraints, so out of the 100 tables I want to import 98 (everything except those 2 tables).
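Assuming the dump was taken with Data Pump, one hedged sketch is to exclude the two tables by name on import; putting EXCLUDE in a parfile avoids operating-system quoting problems. Schema, table, directory and file names below are hypothetical:

# Write the parameters to a parfile so the quotes survive untouched.
cat > imp_98_tables.par <<'EOF'
directory=DP
dumpfile=full_schema.dmp
logfile=imp_98_tables.log
schemas=JACK
exclude=TABLE:"IN ('TABLE_A','TABLE_B')"
EOF

impdp system/manager@db1 parfile=imp_98_tables.par

Note that excluding a table also skips its dependent objects, such as its indexes and constraints.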
View 13 Replies
View Related
May 29, 2012
I need to recreate/clone my database on a new machine. The two machines are not connected over the network.
Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp
Step 2.
FTP dump files to Windows
Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y
I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"
Done in AIX:
create directory dp as '/bak';
grant read, write on directory dp to public;
grant exp_full_database to username;
Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;
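A frequent cause of ORA-31619 on a dump that was moved by FTP is an ASCII-mode transfer, which silently alters the binary file. A hedged check and re-transfer in binary mode, with host, credentials and paths hypothetical:

# Compare the byte sizes on both sides first; any difference means the copy is corrupt.
ls -l /bak/fpac052912_dp*.dmp

# Re-send in binary mode; "prompt" turns off interactive confirmation for mput.
ftp -n windows_host <<EOF
user ftpuser ftppass
binary
prompt
cd /P7DB
mput /bak/fpac052912_dp*.dmp
bye
EOF

If the files turn out to be intact, note also that the target (10.1.0.2) is older than the source (10.2.0.5), so the export may additionally need to be taken with a VERSION parameter matching the target release.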
View 8 Replies
View Related
Nov 1, 2012
While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - Import utility version cannot be more recent than the Data Pump server. Following is the version information of the source and target DBs and the utilities:
Source DB server : 10.1.0.2.0
Export utility : 10.1.0.2.0
Import utility : 10.1.0.2.0
Target DB server : 10.1.0.2.0
Export utility : 10.2.0.1.0
Import utility : 10.2.0.1.0
View 5 Replies
View Related
Jan 2, 2012
Is there a way to know what kind of export (table/tablespace/full/schema) was taken just by looking at the export dump file? If yes, what is the command?
View 1 Replies
View Related
Jul 16, 2010
I have a bit of an issue with Oracle datapump dump files.
Today, I manage the export and import of oracle dump files. As part of the batch export process I have a script which essentially says:
For each schema related to my application in THIS instance, export the schema via the SYSTEM user (the SYSTEM user gives me privileges on all schemas).
On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the "export client version", "date the schema was dumped", and "what schema it was dumped from". All useful info presented in my UI.
Sample output: Begin
EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##
...
Sample output: End
This matters because I allow production schemas to be imported into test schemas (contained in a different tablespace). Based on the naming convention I can determine the schema type (production or test), and, probably most importantly, I am assured of where the data has come from.
Looking at an "expdp" dump file with the same method as above, it appears the Data Pump dump DOES NOT carry a similar header. Because of this, I can pull very little useful info out of the dump file.
I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time-consuming compared to a "head -20" on the dump file.
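A hedged alternative, assuming the dump file is visible to the database through a directory object (the directory, file and credential names below are hypothetical): DBMS_DATAPUMP.GET_DUMPFILE_INFO reads just the Data Pump file header and returns items such as the file version, job name, creation date and source instance without processing the whole file. It is available from 10.2 onward, with more header items in later releases, so treat the item codes as version-dependent.

# Ask the database itself to decode the Data Pump file header.
sqlplus -s system/manager@db1 <<'EOF'
set serveroutput on
declare
  info     ku$_dumpfile_info := ku$_dumpfile_info();
  filetype number;
begin
  dbms_datapump.get_dumpfile_info(
    filename   => 'UserXXX.dmp',
    directory  => 'DP',
    info_table => info,
    filetype   => filetype);
  dbms_output.put_line('file type code: ' || filetype);
  for i in 1 .. info.count loop
    dbms_output.put_line('item ' || info(i).item_code || ': ' || info(i).value);
  end loop;
end;
/
EOF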
View 4 Replies
View Related
Jul 11, 2013
I want to know how to add the date/time to the export dump file name on Linux using a parfile. I keep getting an error about an "invalid substitution variable".
My parfile is:
Dumpfile = Daily_Full_%U_`date "+%Y%m%d%H%S"`.dmp
or
Dumpfile = Daily_Full_%U_`%date%`.dmp
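Backtick command substitution is a shell feature; Data Pump never passes parfile values through a shell, so only its own %U-style substitution is recognised there, which is why both parfile attempts fail. A hedged workaround is to build the timestamped name in a wrapper script and pass DUMPFILE on the command line, keeping static options in the parfile; credentials and names below are hypothetical:

#!/bin/bash
# Build the timestamped name in the shell, where $(...) substitution does work.
STAMP=$(date "+%Y%m%d%H%M%S")
DUMPFILE="Daily_Full_%U_${STAMP}.dmp"
LOGFILE="Daily_Full_${STAMP}.log"

# Everything static stays in daily_full.par; the dynamic pieces go on the command line.
expdp system/manager@db1 parfile=daily_full.par dumpfile=${DUMPFILE} logfile=${LOGFILE}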
View 1 Replies
View Related
Apr 13, 2012
How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the .dmp file already exists, we get the error below:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists
11g has the REUSE_DUMPFILES=Y option, but it does not work in 10g. Is there something that can overwrite an existing dump file in 10g?
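10g Data Pump has no overwrite option, so one hedged workaround is to delete (or rename) the previous file at the operating-system level before starting the export. Directory, file and credential names below are hypothetical; on Windows the removal step would be a del in a batch file rather than rm:

#!/bin/bash
# 10g expdp has no reuse_dumpfiles, so clear out the previous dump first.
DUMP_DIR=/u01/dumps
DUMP_FILE=scott_emp.dmp

rm -f ${DUMP_DIR}/${DUMP_FILE}

expdp scott/tiger@db1 directory=DP dumpfile=${DUMP_FILE} logfile=scott_emp.log schemas=SCOTT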
View 1 Replies
View Related
Feb 25, 2011
What is the default path of the log file after importing a dump in Oracle 10g?
View 1 Replies
View Related
Nov 21, 2012
I am restoring a dump file into Oracle 10g.
My command is:
impdp DUMPFILE=BAK910830.DMP NOLOGFILE=Y
An error message is as follows:
"Invalid argument value"
"Bad dump file specification"
"Dump File may ba an original export Dump file "
I think the dump file is in Oracle 11g format.
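If the dump really was written by an 11g expdp, a 10g impdp cannot read it directly; the usual hedged fix is to re-export on the 11g source with a VERSION parameter matching the target. Connection strings, schema and file names below are hypothetical:

# On the 11g source: write the dump in a format a 10.2 Data Pump can read.
expdp system/manager@src11g schemas=APPUSER directory=DP dumpfile=BAK910830_v102.dmp logfile=BAK910830_v102.log version=10.2

# Then on the 10g target:
impdp system/manager@tgt10g directory=DP dumpfile=BAK910830_v102.dmp logfile=imp_BAK910830.log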
View 8 Replies
View Related
Nov 16, 2011
How do I import only the constraints from a dump file using Oracle Data Pump?
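A hedged sketch using the INCLUDE filter; schema, directory and file names are hypothetical, and the tables are assumed to already exist in the target so the constraints have something to attach to:

# Import only constraint definitions, skipping tables, rows and indexes.
impdp system/manager@db1 directory=DP dumpfile=app_schema.dmp logfile=imp_constraints.log schemas=APPUSER include=CONSTRAINT include=REF_CONSTRAINT

REF_CONSTRAINT covers the foreign keys, which Data Pump treats separately from other constraints.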
View 1 Replies
View Related
Jul 5, 2012
I'm trying to import schemas from a dump file that came from a different environment.
What I have is:
1. dump file
2. log file of the export
I'm trying to import the file (containing three schemas) with REMAP_SCHEMA, and it fails with a lot of ORA-00959: tablespace 'string' does not exist errors.
Now, I've read in OTN:
[URL]
that what you need to do in that case is use the REMAP_TABLESPACE option to redirect the objects to a different tablespace.
I don't see the name of the tablespace I'm getting the error for in the export log, and I don't know whether there are more tablespaces I will have to redirect with REMAP_TABLESPACE.
I don't want to run this three times, hit an error, learn from it which tablespace needs redirecting next, and only then start over...
How can I find out, from the dump file and the log file, which tablespace names I need to remap to my own? Or is the tablespace giving me the error the only one in the dump file?
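One hedged way to list every tablespace referenced in the dump, without running the real import, is to spool the DDL to a SQL file and grep it; directory and file names below are hypothetical:

# Write the DDL the import would execute into a file, without importing anything.
impdp system/manager@db1 directory=DP dumpfile=three_schemas.dmp sqlfile=three_schemas_ddl.sql

# List the distinct tablespaces in that DDL; each one must either exist in the
# target or be handled with a REMAP_TABLESPACE entry.
grep -io 'tablespace "[A-Z0-9_$#]*"' three_schemas_ddl.sql | sort -u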
View 3 Replies
View Related
Dec 20, 2011
I have a dump file of 17 GB which I want to import into my database db1 under the user amol, so I created a new user in db1 as shown below. Before that I created a tablespace so that my data is imported into that tablespace only. My steps are as follows.
CREATE TABLESPACE ptaxold1 DATAFILE '/home/oracle/oracle/product/10.2.0/oradata/cvsdbm/ptaxold1.dbf' SIZE 6024M AUTOEXTEND ON;
create user amol identified by amol default tablespace ptaxold1 temporary tablespace tem;
Then I ran imp amol/amol and specified my dump file, but after the import no tables show up under the user amol.
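If the dump was exported from a different schema, running imp as amol without mapping the owner will not create anything under amol. A hedged sketch using FROMUSER/TOUSER, where the original owner PTAX and the file names are hypothetical:

# Classic imp: recreate the objects owned by PTAX in the dump under the user AMOL.
imp system/manager@cvsdbm file=ptax.dmp log=imp_ptax.log fromuser=PTAX touser=AMOL

Checking the import log afterwards (rows imported per table, any IMP- errors) will show whether the objects simply went into a different schema.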
View 12 Replies
View Related
Feb 19, 2012
Is it possible, on Oracle 10g, to import a dump file with the impdp Data Pump utility when the export was taken with the traditional exp utility, and vice versa?
View 1 Replies
View Related
Sep 29, 2010
I want to create two or three schemas on my production server that are exact copies of schemas on my second production server. I access the second server through a VPN connection with Toad 9.0.1, and I access my production server through VNC Viewer, with the database accessed through Toad.
How could I create the schemas on my first production server from the second server?
View 11 Replies
View Related
Feb 23, 2012
I have an Oracle 9i database, and I want to know the easiest way to copy that database to another computer over the local network.
View 15 Replies
View Related
Jun 22, 2011
How can I import an Oracle 9i dump file into a 10g database? While importing I get the following error: IMP-00002 failed to open dump file.
View 4 Replies
View Related
Jun 14, 2006
Is there any way of reading from an Oracle dump file?
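For a classic exp dump, one hedged way to read its contents without importing anything is imp with SHOW=Y, which writes the DDL it finds to the log instead of executing it; file names and credentials are hypothetical:

# List the contents of a classic export dump without creating any objects.
imp system/manager@db1 file=mystery.dmp show=y full=y log=mystery_contents.log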
View 16 Replies
View Related
Mar 4, 2010
I need to copy a table from schema A to schema B with a different table name.
Suppose I have table A in schema "A"; I need to load that table, with both data and structure, into schema "B" with the table name B.
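A hedged sketch: take a table-mode export, remap the schema on import, then rename the table, since 10g impdp has no REMAP_TABLE (11g adds one). Credentials, directory and file names are hypothetical:

# Export table A from schema A.
expdp a/a_password@db1 tables=A directory=DP dumpfile=table_a.dmp logfile=table_a_exp.log

# Import it into schema B, structure plus data.
impdp system/manager@db1 directory=DP dumpfile=table_a.dmp logfile=table_a_imp.log remap_schema=A:B

# Rename the imported table inside schema B.
sqlplus -s b/b_password@db1 <<'EOF'
alter table A rename to B;
EOF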
View 3 Replies
View Related
Feb 7, 2011
I have exported a schema dump with the schema name 'A'. I want to import that dump into schema 'B'. How?
View 5 Replies
View Related
Jan 14, 2011
I am using the command below to import a schema over a network link. The command is:
impdp system directory = IMP_DIR schemas = XYZ network_link = PQR remap_schema = XYZ:XYZ exclude=view: "= 'XYZ.VW_ACCEPTDETAILS'"
This command gives the error below:
LRM-00116: syntax error at 'view:' following '='
When I tried LIKE in place of the '=' sign, i.e. EXCLUDE = VIEW:"LIKE '%VW_ACCEPTDETAILS%'", it gave me the error below:
UDI-00014: invalid value for parameter, 'exclude'
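The EXCLUDE value is almost certainly being mangled by operating-system quoting, and the name clause should reference the view name without the schema prefix. A hedged sketch that sidesteps both problems by putting the parameters in a parfile; only the parfile name is new here, everything else follows the post:

cat > exclude_view.par <<'EOF'
directory=IMP_DIR
schemas=XYZ
network_link=PQR
remap_schema=XYZ:XYZ
exclude=VIEW:"= 'VW_ACCEPTDETAILS'"
EOF

impdp system parfile=exclude_view.par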
View 4 Replies
View Related
Apr 9, 2010
We have two databases running 10.2.0.4 and 9.2.0.8. Both have the same unpartitioned table, about 80 GB in size. I am exporting the table on 10g using parallel=8 and a dumpfile specification with the %U option; that took around 4 hours.
On 9.2.0.8 I am exporting with the parameters below, which takes around 5 hours.
buffer=2000000
recordlength=64000
What options can I try to speed up the export in both versions?
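A hedged starting point, with table, file and connection names hypothetical: on the 9.2 side, direct path export bypasses the SQL evaluation layer and usually cuts the time for a large table noticeably (it cannot be combined with the QUERY parameter); on the 10.2 side, parallelism only helps if the %U dump files are spread over devices that can keep up with the workers.

# 9.2 classic export with direct path; the buffer parameter is ignored when direct=y.
exp system/manager@db9i tables=APP.BIG_TABLE direct=y recordlength=65535 file=big_table.dmp log=big_table_exp.log

# 10.2 Data Pump: spread the dump pieces over two directory objects on separate disks.
expdp system/manager@db10g tables=APP.BIG_TABLE parallel=8 dumpfile=dp1:big_a_%U.dmp,dp2:big_b_%U.dmp logfile=dp1:big_table_expdp.log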
View 2 Replies
View Related
Mar 31, 2010
If I want to import a 10g export dump file into a 9i database, I connect to the 10g database from the 9i server and run
exp user/password@10gdb ....
However, is there an option such as executing the 9i catexp.sql on the 10g database and doing the export from the 10g database itself, so that it can be imported into 9i?
View 6 Replies
View Related
Jan 11, 2012
I want to take a schema-level export. The schema size is 115 GB. Do we require the same amount of free space on the server side (where the dump is written) as the schema size, or is less or more space required?
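Usually the dump is smaller than the dba_segments figure, because indexes and free space inside the segments are written out only as DDL, but a hedged way to get a concrete number without writing any dump file is expdp's estimate-only mode; schema, directory and credential names are hypothetical:

# Estimate the dump size without producing a dump file.
# estimate=blocks is fast but pessimistic; estimate=statistics can be closer if stats are current.
expdp system/manager@db1 schemas=APPUSER directory=DP estimate_only=y estimate=blocks logfile=appuser_estimate.log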
View 6 Replies
View Related
May 31, 2011
I tried to import a dump into 11g that was taken on Oracle 9i. The import starts but hangs after some time; to be exact, it checks only the character sets of the databases and then hangs. Are there any specific procedures for importing a dump from 9i into 11g directly?
View 8 Replies
View Related