I was asked to export/import some schemas from 10g (Linux) to 11g (AIX) using the original export/import method. I did not consider the character set and started the export and import. While exporting, I get a "questionable statistics" warning in the export log file. In the import log, I see an error like CREATE DATABASE LINK "xxxxxxxxxxxxx" CONNECT TO "xxxx" IDENTIFIED BY...
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Is this error related to permissions at the OS level (Windows 7 in my case)? I manually created the folder 'DATA_PUMP_DIR' in the specified directory path. Although the folder I created (DATA_PUMP_DIR) shows read-only on the General tab of its Properties, I am able to create files under the folder 'DATA_PUMP_DIR'.
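For what it's worth, ORA-39070/ORA-29283 is usually about the account the Oracle service runs under, not the logged-in user: creating the folder yourself is not enough if the service account cannot write there. A minimal sketch of the database-side checks, run as a DBA; the path below is only a placeholder for your actual folder:

SELECT directory_name, directory_path FROM dba_directories
WHERE directory_name = 'DATA_PUMP_DIR';

CREATE OR REPLACE DIRECTORY data_pump_dir AS 'C:\oracle\dpdump';
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO system;

The read-only flag on a Windows folder's General tab is largely cosmetic for directories; what matters is the Security tab permissions for the account the Oracle service runs under.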
Target database machine OS: Red Hat Enterprise Linux Server release 5.4, 64-bit
Target database version/edition: 10.2.0.4.0 EE
Therefore, I did an export using Oracle Data Pump:
Export: Release 11.2.0.1.0 - Production on Tue Jun 11 13:13:28 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_04": system/******** directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 logfile=exp_PRODSCHEMAS_20130611.log VERSION=10.2.0.4.0
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 1.026 GB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
...
Does it have to do with the VERSION parameter value? What can I check to investigate?
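A hedged sketch of the round trip I would expect to work for an 11.2 source and a 10.2.0.4 target; the file and schema names are taken from the log above, everything else is assumption:

expdp system directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 version=10.2 logfile=exp_PRODSCHEMAS_20130611.log

impdp system directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 logfile=imp_PRODSCHEMAS_20130611.log

VERSION=10.2.0.4.0 as used in the log should behave the same; the parameter only controls that the dump is written in a format the older release can read, so the import itself must still be run with the 10g impdp client against the 10g database.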
I am getting the following errors while importing data.
SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-06550: line 2, column 1:
PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
ORA-06550: line 2, column 1:
...
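PLS-00201 on CTXSYS.DRIIMP during index creation usually means the target database does not have Oracle Text (the CTXSYS schema) installed or valid, so domain indexes of type CONTEXT cannot be built there. A quick check, assuming you have access to the DBA views:

SELECT comp_id, version, status FROM dba_registry WHERE comp_id = 'CONTEXT';

If the component is missing or invalid, install or repair Oracle Text in the target (catctx.sql or DBCA, per your platform's documentation) and re-run the import for the failed objects.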
I am trying to import a full database export using Data Pump, and I get too many errors for objects that already exist. Attached is the log file and the steps I did so far.
ORA-31626: job does not exist
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_INTERNAL_LOGSTDBY"
ORA-06512: at "SYS.KUPV$FT", line 949
ORA-04063: package body "SYS.DBMS_LOGREP_UTIL" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOGREP_UTIL"
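ORA-04063/ORA-06508 on SYS packages means the data dictionary itself has invalid objects, so Data Pump cannot even create its job (hence ORA-31626). A hedged first step is to recompile invalid objects as SYSDBA and retry; the "already exists" errors are separately controlled by TABLE_EXISTS_ACTION. A sketch, with the dump file name as a placeholder:

sqlplus / as sysdba
SQL> @?/rdbms/admin/utlrp.sql
SQL> SELECT owner, object_name FROM dba_objects WHERE status = 'INVALID';

impdp system full=y directory=DATA_PUMP_DIR dumpfile=full.dmp table_exists_action=skip logfile=imp_full.log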
I get a flat file, and I have set up a Control-M job so that it runs at a particular time.
Initially my control file was as below:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE INTO TABLE abcd_table
(A position(01:09) CHAR,
 B position(11:12) CHAR,
 C position(14:33) CHAR,
 D position(37:50) CHAR)
This was working fine; Control-M did not send a FAIL message. But later I had to change the ctl file due to a requirement: I had to add a WHEN clause.
My control file after modification is:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE INTO TABLE abcd_table
WHEN A <> '10'
(A position(01:09) CHAR,
 B position(11:12) CHAR,
 C position(14:33) CHAR,
 D position(37:50) CHAR)
Now Control-M is sending an error message after it runs the job. The error is "Return 5"; that is all it gives.
I think it is due to error level 1. In the log file it says zero records were inserted due to a data error. Then what is causing Control-M to send a FAIL message?
SQL*Loader is loading all the required records correctly.
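One likely explanation, offered as an assumption since Control-M return codes vary by site: sqlldr's exit code is not just 0/1. On Unix it returns 0 (EX_SUCC) only when every record loads; as soon as any record is rejected or discarded, for example by the new WHEN clause sending rows to the discard file, it exits with EX_WARN (2), and a scheduler that treats any nonzero code as failure will flag the job even though the wanted rows loaded fine. A wrapper script sketch, with paths and credentials as placeholders:

#!/bin/sh
# treat "loaded with warnings" (discarded/rejected rows) as success for Control-M
sqlldr userid=scott/tiger control=abcd.ctl log=abcd.log
rc=$?
if [ $rc -eq 0 ] || [ $rc -eq 2 ]; then
    exit 0      # 0 = all rows loaded, 2 = warning such as discarded rows
fi
exit $rc        # 1 = command-line error, 3 = fatal error during the load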
I am working on Oracle 11g R2. Is there any way to go back to Oracle 9i with the use of the import/export utility? Should I take a downgraded export from Oracle 11g for Oracle 9i?
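As far as I know this cannot be done with Data Pump, because 9i has no Data Pump client to read the dump. The usual route, sketched below on the assumption that you have a 9i client that can reach the 11g database over SQL*Net (TNS aliases, schema and credentials are placeholders), is to run the 9i exp against the 11g database and import with the 9i imp:

exp system/manager@db11g owner=SCOTT file=scott_for_9i.dmp log=exp_scott.log

imp system/manager@db9i fromuser=SCOTT touser=SCOTT file=scott_for_9i.dmp log=imp_scott.log

The general rule for original exp/imp across versions is to export with the client of the lower (target) version.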
We have a table in an Oracle 9i database with around 14 million records, and we would like to import that table into a 10g database with a similar structure. We have exported the table from the 9i database and would like to import it into the 10g database within the same schema but with a different table name, as we already have a table with the same name in that schema in the 10g database. Is it possible to import a table with a different table name?
We have a workaround: import the table into another schema in the 10g database and then push the data into our main table. But we want to know whether the above requirement is possible.
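For the record, the original imp has no rename option; REMAP_TABLE only arrived with Data Pump in 11g. So for a 9i dump going into 10g, the staging-schema workaround you describe is the standard approach. A minimal sketch, with all schema, table and file names as placeholders:

imp system/manager fromuser=APP touser=STAGE file=emp_9i.dmp log=imp_stage.log

INSERT /*+ APPEND */ INTO app.emp_new SELECT * FROM stage.emp;
COMMIT;
DROP TABLE stage.emp;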
I have a live server with 11g R2 with Data Guard on Server 2008 64-bit. I am exporting the DMP from live and importing it into 10g on Server 2003 32-bit. Not all tables are imported.
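Two things worth checking, both offered as assumptions since the post does not say which export tool was used: first, a dump taken with 11g expdp without VERSION=10.2 cannot be read by the 10g impdp at all; second, a dump taken with the original exp on 11.2 silently skips tables that have never held rows, because deferred segment creation means they own no segment yet. A sketch of the check and fix for the second case, run in the source schema:

SELECT table_name FROM user_tables WHERE segment_created = 'NO';

ALTER TABLE some_empty_table ALLOCATE EXTENT;  -- placeholder name; repeat per table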
I have taken an export of a schema using expdp. The data of the schema is spread across different tablespaces; now I want to import the data into only one tablespace.
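impdp's REMAP_TABLESPACE should cover this; it can be repeated, once per source tablespace, all pointing at the same target. A minimal sketch with placeholder names:

impdp system directory=DATA_PUMP_DIR dumpfile=schema.dmp schemas=APP remap_tablespace=TS_DATA1:TS_TARGET remap_tablespace=TS_DATA2:TS_TARGET logfile=imp_schema.log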
I am getting the below error when I import a table from prod to dev. I understand this error occurs when the length of the datatype is too small. First I got the error on the PASSWORD column, whose length is 25. After I increased the length of this column to 45, it was imported successfully.
Why am I facing the error when the datatype and length for this table are the same in prod and dev? What are the possible ways to import the data without increasing the PASSWORD column length?
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "ANEES"."SALSA_WEB_ACCESS"."PASSWORD" (actual: 28, maximum: 25)
...
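When the declared lengths match but rows still overflow, the usual suspect is a character-set difference: if dev runs a multibyte character set (say AL32UTF8) and the column uses BYTE length semantics, a 25-character value from a single-byte prod database can expand to more than 25 bytes, which fits the "actual: 28, maximum: 25" above. A hedged check, plus a fix that keeps the visible length at 25:

SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';

ALTER TABLE anees.salsa_web_access MODIFY (password VARCHAR2(25 CHAR));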
Import: Release 11.2.0.1.0 - Production on Tue May 7 16:51:47 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Username: sys as sysdba Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
[oracle@oracledbserver product]$
This is Oracle 11g hosted on a Eucalyptus cloud instance.
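ORA-39088 is raised when the DUMPFILE or LOGFILE value itself contains a directory path; Data Pump wants a bare file name, with the location coming from a directory object. A minimal sketch, with the path and file names as placeholders:

CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;

impdp system directory=dp_dir dumpfile=mydump.dmp logfile=imp.log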
I am trying to run the imp command from the DOS prompt.
Here is the error I got:
C:\Documents and Settings\sairamm\My Documents\artms\exp_AUDT2_04292012>imp sairamm/mypassword@aws fromuser=AUDT2 toUSER=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log BUFFER=10000000 GRANTS=y
Import: Release 9.2.0.1.0 - Production on Mon Apr 30 14:23:29 2012
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
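IMP-00010 here is very likely a client/file mismatch, though that is an inference from the banner: the imp client is 9.2.0.1 while the database is 11.2, and a 9.2 imp cannot verify the header of a dump written by a newer exp, nor can any imp read a Data Pump (expdp) file. If the dump was made with expdp, the counterpart would be an impdp along these lines; the directory object is an assumption:

impdp sairamm/mypassword@aws directory=DATA_PUMP_DIR dumpfile=exp_AUDT2_04292012.dmp remap_schema=AUDT2:SAIRAMM logfile=imp_AUDT2_04292012.log

If it was made with the original exp from 11g, running the matching 11g imp client instead of the 9.2 one should get past the header check.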
I have a problem with import in Oracle 8.1.7. The size of the import file is 29600 KB and the tablespace size is 16 GB, and when I try to run the import, Oracle comes back with this message:
IMP-00003: ORACLE error 1659 encountered
ORA-01659: unable to allocate MINEXTENTS beyond 7 in tablespace DATA
The DATA tablespace is full. I think that the import file contains storage information from the original tablespace from which the export was made, but I don't know how to resolve the problem.
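You are likely right about the storage information: with exp's default COMPRESS=Y, the table's extents are rolled up into one INITIAL extent the size of the whole source table, and ORA-01659 means no contiguous chunk that big is available in DATA. Two hedged ways around it, with table and file names as placeholders:

# option 1: re-export without consolidating extents into one big INITIAL
exp scott/tiger tables=big_table compress=n file=big_table.dmp log=exp_bt.log

# option 2: pre-create the table with a sensible storage clause, then
imp scott/tiger tables=big_table ignore=y file=big_table.dmp log=imp_bt.log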
How can I monitor an export or import job, and how can I increase export and import job performance?
Can I monitor the export and import job by checking the log and dump files created by export and import, and can its performance be increased by configuring parallelism? Am I right or not?
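Yes on both counts, with the caveat that the log file only shows progress after the fact. For Data Pump there are dictionary views to watch a running job; a sketch, assuming DBA access:

SELECT owner_name, job_name, state FROM dba_datapump_jobs;

SELECT sid, serial#, sofar, totalwork, message
FROM v$session_longops
WHERE opname LIKE 'SYS_EXPORT%' AND sofar <> totalwork;

For performance, PARALLEL together with multiple dump files via the %U substitution is the main knob; schema and file names below are placeholders:

expdp system schemas=APP directory=DATA_PUMP_DIR parallel=4 dumpfile=app_%U.dmp logfile=exp_app.log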
I wanted to export a table "emp_production" from the production database and then import it as "emp_datawarehouse" into the data warehouse database. Both tables have the same structure. I have granted the IMPORT FULL DATABASE and EXPORT FULL DATABASE privileges to both schemas.
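If both databases are 11g, impdp's REMAP_TABLE does the rename in one step; on older releases you are back to importing into a staging schema and copying, as described earlier in this thread. A sketch using the table names from the post, with schema and file names as placeholders:

expdp system tables=APP.EMP_PRODUCTION directory=DATA_PUMP_DIR dumpfile=emp.dmp logfile=exp_emp.log

impdp system directory=DATA_PUMP_DIR dumpfile=emp.dmp remap_table=APP.EMP_PRODUCTION:EMP_DATAWAREHOUSE logfile=imp_emp.log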
I would like to run a daily job that will export table data from a SQL Server table and import it back into an Oracle table. I might need to run a query to update a flag back in the SQL Server table once the job is done. How can I do this using either SQL Server or Oracle?
We have Oracle 9.2 and SQL Server 2005.
Normally I work from a flat file or CSV file generated by a developer or user at the source (not me), and I load it into Oracle using SQL*Loader. But this time I have to directly extract/export the data from MS SQL Server and load it into an Oracle table; mostly it will be a reload, so I might not need to do any massaging of the data during the load.
Does SQL*Loader have any function that I can use to connect to the MS SQL Server data source, fetch the data, and insert it back into Oracle? I have access to SQL Server, but I don't know how to do this from SQL Server, or how to drive it from Oracle as a daily job, because I have to schedule it since it will run daily.
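SQL*Loader itself cannot connect to SQL Server; it only reads files. The two usual routes are a database link to SQL Server through Oracle Heterogeneous Services / a Transparent Gateway, or exporting from SQL Server with bcp and loading the resulting file with SQL*Loader, which fits your existing flat-file process. A sketch of the second route as a schedulable batch file; every server, path, login and table name below is a placeholder:

REM nightly_load.bat - extract from SQL Server, load into Oracle, flag the rows
bcp "SELECT col1, col2, col3 FROM mydb.dbo.src_table" queryout C:\feeds\src_table.dat -c -t"|" -S SQLHOST -U loader -P secret

sqlldr userid=scott/tiger control=C:\feeds\src_table.ctl log=C:\feeds\src_table.log

osql -S SQLHOST -U loader -P secret -Q "UPDATE mydb.dbo.src_table SET exported = 1"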
I am facing a problem importing a DMP file into 11g. While importing, it gives me a "not responding" error. I have attached a JPG file to show you what goes wrong during the import. My dump is from 9i; I want to import it into 11g R2.
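Two hedged thoughts: a 9i dump must be imported with the original imp client, since impdp cannot read it, and a large original-export import often looks frozen for long stretches while it builds indexes, so "not responding" in the client window does not necessarily mean the session is stuck. A sketch, with names as placeholders:

imp system/manager file=mydump_9i.dmp log=imp_9i.log fromuser=APP touser=APP

While it runs, a growing log file, or wait-event activity for the session in v$session_wait, tells you whether it is actually progressing.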
I have a text file of 175 GB with the delimiter '|*'. I want to import it using SQL*Loader. I know that SQL*Loader can load data in parallel. I can employ two schemes for importing it with SQL*Loader:
a) import the text file all at once, using the option parallel=true
b) break the single text file down into files of smaller size (say 5 GB each or more) and then import them in parallel using parallel=true
I can import it on a fairly powerful server with 128 GB of RAM, 16 Intel CPUs and abundant hard disk space. Now my question is: which option would perform better in terms of the time it takes to import the data? I do not have sufficient time to test both approaches.
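One relevant detail, stated as my understanding of how SQL*Loader works rather than as a benchmark: PARALLEL=TRUE does not make sqlldr split a single input file; it only permits several concurrent direct-path sessions into the same table. So option (a) with one session gains nothing from the flag, and option (b), several sessions each loading its own chunk, is what actually runs in parallel. A sketch, with control file name and credentials as placeholders; note that parallel direct loads want indexes dropped or skipped:

# split at line boundaries so no record is cut in half (GNU split)
split -C 5G bigfile.dat chunk_

# one direct-path session per chunk, all running concurrently
for f in chunk_*; do
    sqlldr userid=scott/tiger control=load.ctl data=$f direct=true parallel=true log=${f}.log bad=${f}.bad &
done
wait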