Server Utilities :: Export / Import Errors?
Apr 26, 2011
I was asked to do an export/import of some schemas from 10g (Linux) to 11g (AIX) using the original export/import method. I did not consider the character set and started doing the export and import. While exporting, I get "questionable statistics" warnings in the export log file. In the import log, I see an error like CREATE DATABASE LINK "xxxxxxxxxxxxx" CONNECT TO "xxxx" IDENTIFIED BY...
What can be done about these errors?
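The EXP-00091 "questionable statistics" warnings usually just mean the exporting client's character set (NLS_LANG) did not match the database character set, so the exported optimizer statistics are flagged as possibly unusable; they can normally be re-gathered after import. The CREATE DATABASE LINK failure is a common one when moving links between releases with original imp; one workaround is to recreate the affected links by hand in the target schema. A rough sketch, where every name below is a placeholder:
DROP DATABASE LINK remote_db;   -- only if a broken definition was left behind by the import
CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'REMOTE_TNS_ALIAS';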
View 4 Replies
Sep 6, 2010
While doing an import I got the errors below.
Failing sql is:
BEGIN DBMS_JOB.ISUBMIT( JOB=> 361, NEXT_DATE=> TO_DATE('2010-09-06 21:18:27', 'YYYY-MM-DD:HH24:MI:SS'), INTERVAL=> 'SYSDATE + 45/86400', WHAT=> 'PK_MONITORS.SP_OVERDUE_JOB;', NO_PARSE=> TRUE); END;
ORA-39083: Object type JOB failed to create with error:
[code]...
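Since the ORA-39083 detail is truncated here, one generic workaround is to let the rest of the import finish and then resubmit the job manually in the target schema, reusing the job number, interval, and WHAT text from the failing SQL above. A sketch, not a definitive fix for whatever the underlying error turns out to be:
BEGIN
  DBMS_JOB.ISUBMIT(
    job       => 361,
    what      => 'PK_MONITORS.SP_OVERDUE_JOB;',
    next_date => TO_DATE('2010-09-06 21:18:27', 'YYYY-MM-DD HH24:MI:SS'),
    interval  => 'SYSDATE + 45/86400',
    no_parse  => TRUE);
  COMMIT;
END;
/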
View 2 Replies
View Related
Feb 11, 2013
I am trying to use the NETWORK_LINK option in Data Pump to import a table from one server to another. I gave the command below:
C:>impdp example/example@db DIRECTORY=DATA_PUMP_DIR
NETWORK_LINK=db.legal.regn.net remap_schema=BI:example
tables=BI.BI_DIRECT dumpfile=BI.dmp logfile=BI.log
I got the following errors:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Is this error related to permissions at the OS level (Windows 7 in my case)? I manually created the folder 'DATA_PUMP_DIR' in the specified directory path. Although the folder I created (DATA_PUMP_DIR) shows as read-only on the General tab of its Properties dialog, I am able to create files inside it.
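This error stack (ORA-39070 plus ORA-29283 from UTL_FILE) usually means the DATA_PUMP_DIR directory object inside the database points to a path the database server cannot write to; DATA_PUMP_DIR is a server-side directory object, so a folder created on the client machine has no effect. A sketch of fixing it on the importing database, with an example path that must exist on the database server:
CREATE OR REPLACE DIRECTORY data_pump_dir AS 'C:\oracle\dpump';  -- run as a DBA; the path lives on the DB server
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO example;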
View 25 Replies
View Related
Jun 12, 2013
I wanted to migrate two schemas from:
Source Database machine OS: OEL 5.9 , 64-bit
Source Database version/edition: 11.2.0.1.0 SE
to:
Target Database machine OS: Red Hat Enterprise Linux Server release 5.4 , 64-bit
Target Database version/edition: 10.2.0.4.0 EE
Therefore, I did an export using Oracle Data Pump:
Export: Release 11.2.0.1.0 - Production on Tue Jun 11 13:13:28 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_04": system/******** directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 logfile=exp_PRODSCHEMAS_20130611.log VERSION=10.2.0.4.0
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 1.026 GB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
[code]....
Does it have to do with the VERSION parameter value? What can I check to investigate?
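A dump written with VERSION=10.2.0.4.0 should normally be readable by a 10.2.0.4 impdp, so the VERSION value itself looks right. Since the error text is cut off above, one low-risk check on the 10.2.0.4 target is a metadata-only pass with SQLFILE, which writes out the DDL the import would run without creating anything; a sketch reusing the file names from the export:
impdp system/******** directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 sqlfile=check_ddl.sql
If that fails with a version error, the dump itself is the problem; if it succeeds, the failure lies in a specific object type that the generated check_ddl.sql should reveal.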
View 14 Replies
View Related
Apr 22, 2013
I am trying to import a full database export using Data Pump, and I get many errors for objects that already exist. Attached is the log file. The steps I did so far:
1. Created the database.
2. Imported the full database export using:
impdp system/xxxxxxx full=yes directory=datapump dumpfile=palbe_full_20130322.dp log=palbe_full_22042013.log
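If the log is mostly ORA-31684 "object already exists" messages (from objects created when the database was built, or by an earlier import attempt), the TABLE_EXISTS_ACTION parameter controls what impdp does with tables that already exist. A hedged sketch based on the command above:
impdp system/xxxxxxx full=yes directory=datapump dumpfile=palbe_full_20130322.dp logfile=palbe_full_reload.log table_exists_action=replace
REPLACE drops and recreates each pre-existing table; SKIP (the default) or APPEND are gentler choices, and "already exists" errors for non-table objects are generally safe to ignore.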
View 5 Replies
View Related
May 11, 2010
What is the purpose of the log file in import/export? If I use the following command, it exports successfully:
exp scott/tiger file=check.dmp log=empc.log tables=emp
If I remove the log parameter, it also exports successfully. So why do we use a log file in import/export?
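The LOG parameter does not change what gets exported; it simply writes the same messages that scroll past on screen to a file, so warnings and errors can be reviewed afterwards, which matters most for scheduled or unattended runs. A small sketch on a Unix host (the grep is only an example of scanning the log):
exp scott/tiger file=check.dmp log=empc.log tables=emp
grep -Ei "EXP-|ORA-" empc.log   # list any warnings or errors recorded during the export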
View 4 Replies
View Related
Apr 28, 2012
How can I monitor an export or import job, and how can I increase its performance?
Can I monitor the job by checking the log and dump files created by export and import, and can its performance be increased by configuring parallelism? Am I right or not?
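If these are Data Pump jobs, they can be monitored from the dictionary rather than by watching the dump files grow, and PARALLEL does help as long as the dump is split into multiple pieces. A sketch with placeholder names:
SELECT owner_name, job_name, operation, state, degree FROM dba_datapump_jobs;   -- running jobs and their parallel degree
expdp system/pwd schemas=app directory=dp_dir dumpfile=app_%U.dmp parallel=4 logfile=app_exp.log
The %U in the dump file name lets each parallel worker write its own piece; original exp/imp has no parallel option, so there the log file is the main monitoring tool.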
View 2 Replies
View Related
Jun 24, 2008
I would like to run a daily job that will export the table data from a SQL Server table and import it back into an Oracle table. I might also need to run a query to update a flag in the SQL Server table once the job is done. How can I do this using either SQL Server or Oracle?
We have Oracle 9.2 and SQL Server 2005.
Normally I work from a flat file or CSV file generated by a developer or user from the source system (not me) and load it into Oracle using SQL*Loader, but this time I have to extract/export the data directly from MS SQL Server and load it into the Oracle table. It will mostly be a reload, so I might need to do some massaging of the data during the load.
Does SQL*Loader have any facility to use MS SQL Server as a data source, fetch the data, and insert it into Oracle? I have access to SQL Server, but I don't know how to do this from SQL Server, or from Oracle, as a daily job, since it has to be scheduled to run every day.
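SQL*Loader cannot connect to SQL Server; it only reads flat files. One common pattern is to let SQL Server's bcp utility write a delimited file on a schedule and then load that file with SQL*Loader (a Heterogeneous Services / gateway database link is the other usual route). A rough sketch in which every object, host, and file name is made up:
bcp "SELECT * FROM dbo.SourceTable" queryout source.dat -c -t"|" -S sqlhost -U sqluser -P sqlpwd
sqlldr scott/tiger control=load_source.ctl data=source.dat log=load_source.log
The flag update back in SQL Server could then be a final sqlcmd step in the same scheduled script.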
View 4 Replies
View Related
Sep 19, 2010
We are doing daily cold backups. Due to lack of disk space, we cannot do hot backups, but we want the database to stay up while backups run. Since only export/import is possible in our scenario, please clarify a few queries:
1) The export is done during an off-peak period on the live server.
2) We have a development server whose database we have to refresh daily. Can I overwrite the development server using IMPORT every day? Since this import might show lots of errors (object already exists), what parameters can I use for the import?
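For a daily refresh with original imp, the usual choices are either to import over the existing objects with IGNORE=Y (which suppresses the "already exists" errors and loads rows into the existing tables) or, more cleanly, to drop and recreate the target schema before each import. A sketch with placeholder names:
imp system/pwd@dev file=prod_daily.dmp log=imp_dev.log fromuser=prod_schema touser=dev_schema ignore=y
Note that IGNORE=Y appends rows to tables that already contain data, so truncating or dropping the dev schema first is often the safer routine.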
View 2 Replies
View Related
Feb 13, 2012
When I do a table export, it finishes in 30 minutes. When I do an import using the same dump file (that was created in 30 minutes), it takes more than 30 minutes.
Why does the import take more time than the export?
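Import is normally the slower half: it has to insert every row (generating undo and redo), rebuild the indexes, and validate the constraints, whereas a direct-path export mostly just reads blocks. One common way to shorten it with original imp, sketched with made-up file names, is to import the data without indexes and build the indexes afterwards:
imp scott/tiger file=tab.dmp log=imp_noidx.log indexes=n
imp scott/tiger file=tab.dmp indexfile=create_indexes.sql
The second command only writes the CREATE INDEX statements into create_indexes.sql; that script can then be edited (for example to add PARALLEL or NOLOGGING) and run in SQL*Plus.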
View 14 Replies
View Related
Sep 29, 2010
I want to create two or three schemas on my production server that should be exact copies of schemas on another (second) production server. I access this second server through a VPN connection using TOAD 9.0.1, and I access my first production server through VNC Viewer and its database through TOAD.
How could I create the schemas on my first production server from the second server?
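If both databases can reach each other over SQL*Net, one way that avoids moving dump files around is a Data Pump import over a database link, run on the first production server. A sketch in which every name is a placeholder:
CREATE DATABASE LINK second_prod CONNECT TO remote_user IDENTIFIED BY remote_pwd USING 'SECOND_PROD_TNS';
impdp system/pwd network_link=second_prod schemas=app1,app2 directory=dp_dir logfile=clone_schemas.log
The NETWORK_LINK import pulls the schemas straight from the second server, so no dump file is written on either side (only the log file needs the directory object).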
View 11 Replies
View Related
Jun 9, 2010
Is it possible to export and import data using a Unix pipe, with the two servers working in parallel (at the same time)?
E.g., on server A I export the data and import it into server B at the same time (300 GB of data).
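Yes, this is a classic use of named pipes with original exp/imp: the export writes into a pipe, the stream is shipped across the network, and the import reads from a pipe on the other side, so no 300 GB dump file ever lands on disk. A sketch with placeholder names and credentials:
# on server B (target), set up the pipe and start the import reading from it
mknod /tmp/imp_pipe p
imp system/pwd file=/tmp/imp_pipe fromuser=app touser=app log=imp_pipe.log &
# on server A (source), ship the export stream to server B's pipe and run the export
mknod /tmp/exp_pipe p
cat /tmp/exp_pipe | ssh serverB 'cat > /tmp/imp_pipe' &
exp system/pwd owner=app file=/tmp/exp_pipe log=exp_pipe.log
Compression (gzip on one end, gunzip on the other) is often added to the ssh stage to cut the network traffic.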
View 2 Replies
View Related
Oct 8, 2012
Is it also possible to use the export and import utilities from a client machine? I want to give these utilities to one of my developers without letting him sit in front of my Oracle server.
I am using Oracle 11.2.0.1 and Windows 7.
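Yes: exp and imp ship with the Oracle Client (choose an installation type that includes the utilities) and connect over a TNS alias, in which case the dump file is written on the client machine. A sketch with placeholder names:
exp scott/tiger@proddb owner=scott file=C:\dumps\scott.dmp log=C:\dumps\scott_exp.log
With Data Pump (expdp/impdp) the client can also run remotely, but the dump and log files always land in a directory object on the database server, so the developer would only ever see the files you expose there.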
View 18 Replies
View Related
Jul 31, 2011
I am using Oracle 11.1.0.7.0 on HP-UX.
Which export/import method should I use for table-level, tablespace-level, full-database-level, and schema-level export/import?
1. Data Pump, or
2. Traditional export/import?
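On 11.1 Data Pump is generally the preferred tool at all four levels (table, tablespace, schema, full); original exp/imp mainly remains useful when the dump has to be written on the client machine or loaded into a pre-10g database. For comparison, a schema-level export in each tool (all names are placeholders):
exp system/pwd@db owner=hr file=hr.dmp log=hr_exp.log
expdp system/pwd@db schemas=hr directory=dp_dir dumpfile=hr.dmp logfile=hr_exp.log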
View 2 Replies
View Related
Jun 9, 2011
I was told to move 8 tables, along with their constraints, indexes, grants, rows, and triggers, from one database to another. I did an export and import for that. The command I used was:
exp p5/chevuri@db3.SBC.COM file=C:alaexp.dmp log=C:alaexp.log
tables= ('tab1','tab2','tab3','tab4','tab5','tab6','tab7') rows=y indexes=y grants=y
constraints=y triggers=y direct=y
Below is the export output log.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified tables via Direct Path ...
. . exporting table tab1 12 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
[code]...
Here is the import output log
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V10.02.01 via direct path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
. importing JAM's objects into JAM
.
[code]...
Everything got imported successfully. Still, I have a doubt about the export and import commands: was the command I used correct, or does anything need to be added to it?
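The commands did what was asked (tables, rows, indexes, grants, constraints and triggers all moved), so nothing is strictly missing. The EXP-00091 warnings only say that the exporting client's character set (WE8MSWIN1252) differs from the server's (WE8ISO8859P1), so the exported optimizer statistics are flagged as questionable. If that matters, two common options, sketched with placeholder names, are to match NLS_LANG to the server before exporting, or to skip statistics and re-gather them after the import:
set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
exp p5/chevuri@db3.SBC.COM tables=(tab1,tab2) file=tabs.dmp log=tabs_exp.log statistics=none
EXEC DBMS_STATS.GATHER_SCHEMA_STATS('JAM');   -- run on the target after the import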
View 4 Replies
View Related
Oct 11, 2011
How do I import and export a database in Oracle, with a proper example and procedure?
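A minimal full-database round trip with Data Pump might look like the sketch below, where the connection details and file names are placeholders and the directory object must already exist and be writable on the database server:
expdp system/pwd@orcl full=y directory=dp_dir dumpfile=full_db.dmp logfile=full_exp.log
impdp system/pwd@orcl full=y directory=dp_dir dumpfile=full_db.dmp logfile=full_imp.log
The same idea works at schema level by replacing full=y with schemas=<schema_name>.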
View 3 Replies
View Related
Jan 12, 2011
1) I want to perform an export from a production schema and import the results into a test schema, BUT I do not want to export ALL objects from production (only a subset of tables). Is this possible? Is there documentation on how to do this (rather than a complete export and then a complete import)?
2) I have 2 test instances of Oracle on the same development server, UNIT and SIT. I am using the Oracle SQL Developer tool.
While in the UNIT instance, is there a way to select data from the SIT instance? An example of the syntax to use?
3) Can tables in the UNIT instance be compared to tables in the SIT instance through any existing Oracle utilities? (See the sketch below.)
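Sketches for all three, with every name a placeholder: a subset export simply lists the tables, while cross-instance queries and comparisons go through a database link.
exp prod_user/pwd@PROD tables=(EMP,DEPT) file=subset.dmp log=subset_exp.log
-- in the UNIT instance, reach SIT through a database link
CREATE DATABASE LINK sit_link CONNECT TO sit_user IDENTIFIED BY pwd USING 'SIT';
SELECT * FROM emp@sit_link;
-- quick data comparison of one table across the two instances
SELECT * FROM emp MINUS SELECT * FROM emp@sit_link;
On the import side, FROMUSER/TOUSER (or REMAP_SCHEMA with Data Pump) maps the production schema onto the test schema.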
View 2 Replies
View Related
Jan 28, 2012
In Oracle 11g R2, I face a problem when I export/import XML records in tables. Every time it takes a huge amount of time (like 2 or 3 days) to import or export the data, yet the dump size is very small (4 GB). The dump comes from a vendor, so I don't understand whether it is a data-structure problem or whether this is normal behaviour when importing XML records.
I have not worked with XML records in an Oracle database before. Here I am using the Data Pump feature. I should also mention that when I delete the schema (where I imported the XML data), it also takes 2-3 days. The import script hangs at the following stage:
=========================
bash-3.2$ impdp system/sys123 DIRECTORY=isb_dir DUMPFILE=jblbld_20110301_01.dmp,jblbld_20110301_02.dmp,jblbld_20110301_03.dmp logfile=JBLBLD_28Jan2012.log schemas=TBLD;date
Import: Release 11.2.0.1.0 - Production on Sun Jan 29 11:10:38 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/******** DIRECTORY=isb_dir DUMPFILE=jblbld_20110301_01.dmp,jblbld_20110301_02.dmp,jblbld_20110301_03.dmp logfile=JBLBLD_28Jan2012.log schemas=TBLD
Processing object type SCHEMA_EXPORT/USER
[code]...
The machine has 16 CPUs and 32 GB of RAM, and during the export/import most of the memory stays free.
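To see whether the job is genuinely hung or just grinding through the XML rows, one option is to attach to the running job and ask for its status; a sketch reusing the job name from the log above:
impdp system/sys123 attach=SYS_IMPORT_SCHEMA_02
Import> STATUS
STATUS shows the worker processes, the object currently being loaded, and the percentage complete; repeating it a few minutes apart shows whether anything is moving. XMLType columns are typically loaded row by row rather than via a direct path, which is one reason a physically small dump of XML data can take far longer than its size suggests.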
View 1 Replies
View Related
Apr 25, 2011
I am trying to export/import a schema whose size is around 60 GB.
The export parfile goes like this:
file=expdmp1.dmp, expdmp2.dmp, expdmp3.dmp, expdmp4.dmp, expdmp5.dmp, expdmp6.dmp, expdmp7.dmp
filesize=10240M
log=explog.log
owner=owner1
Import parfile goes like this..
file=impdmp1.dmp, impdmp2.dmp, impdmp3.dmp, impdmp4.dmp, impdmp5.dmp, impdmp6.dmp, impdmp7.dmp
filesize=10240M
log=implog.log
fromuser=owner1
touser=owner2
ignore=y
I am going to run this on production, so I want to check it first.
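One thing worth double-checking: with original imp, the file= list has to name the dump pieces that exp actually wrote. Assuming the export produces expdmp1.dmp through expdmp7.dmp as in the export parfile above, the import parfile would reference those same files, e.g.:
file=expdmp1.dmp, expdmp2.dmp, expdmp3.dmp, expdmp4.dmp, expdmp5.dmp, expdmp6.dmp, expdmp7.dmp
filesize=10240M
log=implog.log
fromuser=owner1
touser=owner2
ignore=y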
View 2 Replies
View Related
May 31, 2011
I am new to Oracle DBA work and I'm facing one problem. I exported and imported my data from one Oracle DB to another using exp/imp. In both the exp and imp log files, the following messages appear at the end:
"Import terminated successfully with warnings."
"Export terminated successfully with warnings."
But when I count, some rows are missing in some tables.
What could be the cause? Is there any other way to cross-check whether the export/import was successful?
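"Terminated successfully with warnings" means some EXP-/IMP-/ORA- messages were logged, and those can include rows or objects that were skipped, so the two log files are the first place to look. Another common cause of missing rows is exporting a live database without a read-consistent snapshot. Two hedged checks/sketches with placeholder names:
exp scott/tiger owner=scott file=scott.dmp log=scott_exp.log consistent=y   -- export a read-consistent image
SELECT COUNT(*) FROM scott.emp;   -- run the same count on source and target and compare
Searching both log files for EXP-, IMP-, and ORA- lines will show exactly which objects or rows were affected.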
View 2 Replies
View Related
May 6, 2012
Is it possible to import only one or two tables from a schema export file or from a full database export file?
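Yes, both tools can pull individual tables out of a schema-level or full dump; sketches with placeholder names:
imp system/pwd file=full_or_schema.dmp log=two_tabs_imp.log fromuser=app touser=app tables=(EMP,DEPT)
impdp system/pwd directory=dp_dir dumpfile=full_or_schema.dmp logfile=two_tabs_imp.log tables=APP.EMP,APP.DEPT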
View 2 Replies
View Related
Jul 26, 2012
I'm trying to do an export/import process using the command prompt, and the idea is to export records based on a date condition, with the date supplied as a parameter. My code is like this:
exp <username>/<password>@<database> file=<table_name>.dmp tables=<source_table> query="where <date> between &start_date AND &end_date";
Is it possible to do it like this, so that it prompts you to enter the start and end dates?
Then my import script:
imp <username>/<password>@<database> dumpfile=<table_name>.dmp tables=<target_table>;
The idea is to get only the records from ProdDB based on the date condition and append them to MISDB.
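exp itself will not prompt for &start_date and &end_date; substitution variables belong to SQL*Plus. A wrapper script can do the prompting and write a parameter file, which also avoids the painful quoting of QUERY on the command line. A rough Windows batch sketch in which the table, column, and file names are all made up:
@echo off
set /p start_date=Enter start date (YYYY-MM-DD): 
set /p end_date=Enter end date (YYYY-MM-DD): 
echo tables=orders> exp_orders.par
echo file=orders.dmp>> exp_orders.par
echo log=orders_exp.log>> exp_orders.par
echo query="where order_date between to_date('%start_date%','YYYY-MM-DD') and to_date('%end_date%','YYYY-MM-DD')">> exp_orders.par
exp scott/tiger@proddb parfile=exp_orders.par
The import side then stays a plain imp of the resulting dump (note that imp takes file=, not dumpfile=, which is an impdp parameter).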
View 12 Replies
View Related
Feb 20, 2012
Quote: "The EXP_FULL_DATABASE and IMP_FULL_DATABASE roles, respectively, are needed to perform a full export and import."
What privileges are required to perform only schema-level export and import?
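For your own schema, no special role is needed: CREATE SESSION (and, for Data Pump, read/write on the directory object) is enough. Exporting or importing schemas you do not own requires the full-database roles quoted above (Data Pump also has the equivalent DATAPUMP_EXP_FULL_DATABASE / DATAPUMP_IMP_FULL_DATABASE roles). A sketch of the grants, with placeholder user names:
GRANT CREATE SESSION TO app_user;
GRANT READ, WRITE ON DIRECTORY dp_dir TO app_user;            -- own-schema Data Pump jobs
GRANT EXP_FULL_DATABASE, IMP_FULL_DATABASE TO admin_user;     -- other users' schemas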
View 3 Replies
View Related
Feb 2, 2011
Are there any other utilities we can use to load data from our PROD server (10g) to our DEV server (9i)? I've read some related topics here saying that it's not possible to import from a HIGHER to a LOWER version of Oracle. We've tried (many times) EXPorting selected tables from the 10g DB and then IMPorting them into the 9i DB, and we haven't succeeded. PROD and DEV have different schemas/owners but the same table structures.
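With original exp/imp, the usual rule for going down a release is to run the export with the client of the lower (target) release: run the 9i exp binary (for example, from the DEV server) against the 10g database over SQL*Net, then import that dump into 9i. A dump produced by the 10g exp generally cannot be read by a 9i imp. A sketch with made-up names:
exp system/pwd@PROD10G owner=prod_schema file=prod_schema.dmp log=prod_schema_exp.log
imp system/pwd@DEV9I file=prod_schema.dmp log=prod_schema_imp.log fromuser=prod_schema touser=dev_schema ignore=y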
View 4 Replies
View Related
Aug 14, 2012
I am getting the following errors while importing data.
SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-06550: line 2, column 1:
PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
ORA-06550: line 2, column 1:
[code]......
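PLS-00201 for CTXSYS.DRIIMP normally means the dump contains Oracle Text (domain) indexes but the CTXSYS schema (Oracle Text) is not installed in the target database. A quick check and the usual remedy, offered as a sketch:
SELECT username FROM dba_users WHERE username = 'CTXSYS';   -- no row back means Oracle Text is missing
If it is missing, install Oracle Text in the target (the installation script is $ORACLE_HOME/ctx/admin/catctx.sql, run as SYS; see the script header for its arguments) and re-run the import, or recreate the affected indexes manually afterwards.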
View 6 Replies
View Related
Jan 30, 2013
I am getting the error below when I try to import the dump file into my Oracle 11g database.
impdp system/password@orcl dumpfile=aaa.dmp directory=datapump remap_schema=dev:user
ORA-31626: job does not exist
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_INTERNAL_
LOGSTDBY"
ORA-06512: at "SYS.KUPV$FT", line 949
ORA-04063: package body "SYS.DBMS_LOGREP_UTIL" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOGREP_UTIL"
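ORA-04063 on SYS.DBMS_INTERNAL_LOGSTDBY and SYS.DBMS_LOGREP_UTIL means Data Pump cannot even start because those SYS packages are invalid in the target database, which usually points at an incomplete installation, upgrade, or patch rather than at the dump file. A first step, sketched below, is to list the invalid SYS objects and recompile them:
sqlplus / as sysdba
SELECT object_name, object_type FROM dba_objects WHERE owner = 'SYS' AND status = 'INVALID';
@?/rdbms/admin/utlrp.sql
If the packages stay invalid after utlrp.sql, the data dictionary itself likely needs attention (re-running the failed upgrade/patch steps or involving Oracle Support) before the import will work.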
View 7 Replies
View Related
Aug 21, 2010
I get a flat file, and I have set up a Control-M job so that it runs at a particular time.
Initially my control file was as below:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
(A position(01:09) CHAR,
B position(11:12) CHAR,
C position(14:33) CHAR,
D position(37:50) char)
This was working fine; Control-M did not send a FAIL message. But later I had to change the control file due to a requirement: I had to add a WHEN clause.
My control file after the modification is:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
when A<>'10'
(A position(01:09) CHAR,
B position(11:12) CHAR,
C position(14:33) CHAR,
D position(37:50) char)
Now Control-M sends an error message after it runs the job. The error is just "Return 5"; that's all it gives.
I think it is due to errorlevel 1. In the log file it says that no records were rejected due to data errors. So what is causing Control-M to send a FAIL message?
SQL*Loader is loading all the required records correctly.
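On most platforms SQL*Loader exits with a warning code (typically 2) whenever any rows are discarded or rejected, and the new WHEN clause means the non-matching rows now count as discarded, so the scheduler treats the non-zero exit code as a failure even though the load did exactly what was asked. If that behaviour is acceptable, a thin wrapper can map the warning back to success for Control-M; a Unix-shell sketch with made-up names:
#!/bin/sh
sqlldr userid=scott/tiger control=abcd.ctl log=abcd.log
rc=$?
# 0 = success, 2 = success with warnings (e.g. rows discarded by the WHEN clause); anything else is a real failure
if [ "$rc" -eq 0 ] || [ "$rc" -eq 2 ]; then
  exit 0
fi
exit "$rc"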
View 10 Replies
View Related
Aug 7, 2012
1) Is there a way to skip database jobs while exporting (EXPDP)?
2) Is there a way to skip database jobs while importing (IMPDP)?
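With Data Pump, the EXCLUDE parameter should cover both directions; for classic DBMS_JOB jobs the object type is JOB. A sketch with placeholder names (DBMS_SCHEDULER jobs are stored as procedural objects and are not filtered the same way):
expdp system/pwd schemas=app directory=dp_dir dumpfile=app.dmp logfile=app_exp.log exclude=JOB
impdp system/pwd directory=dp_dir dumpfile=app.dmp logfile=app_imp.log exclude=JOB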
View 3 Replies
View Related
Aug 9, 2012
Why does export/import require a temporary tablespace? Since export/import behaves like DML, when does the Data Pump utility need temporary space?
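The bulk of the temporary-space demand during an import usually comes from sorting while indexes are (re)built, with some additional use around the Data Pump master table and metadata queries; the row inserts themselves mostly consume undo rather than temp. To see it while a job is running, the temp segment usage view can be sampled:
SELECT username, tablespace, segtype, blocks FROM v$tempseg_usage;   -- run while the export/import job is active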
View 2 Replies
View Related
Jul 24, 2012
I have installed Oracle Server and SQL Server on separate machines, which causes a delay of 21 seconds for each execution. Why is execution delayed? I have set RPC Out (true).
Note: my main concern is that whether the query is correct or incorrect, it executes for 21 seconds.
In the other case, when I have both servers on the same machine, it executes in milliseconds. I have tried the following methods in SQL Server:
*1, Using OPENQUERY:*
SELECT * FROM OPENQUERY(Linked Server Name, 'Select * from OracleTableName')
*2, Using Exec:*
DECLARE @sql NVARCHAR(MAX);
SET @sql = ('Select * from OracleTableName');
EXEC (@sql) AT Linked Server Name;
How can I reduce the time delay in execution?
View 0 Replies
View Related