Server Utilities :: ORA-29913 / Error In Executing ODCIEXTTABLEFETCH Callout
Jul 6, 2012
While I'm importing schemas I came across:
ORA-31693: Table data object "STATICO"."PROD_REQ_LIS_TAG_GEN":"GEN_DS_LOAN" failed to load/unload and is being skipped due to error:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-02291: integrity constraint (STATICO.FK_PROD_REQ_LIS_TAG_GEN_2) violated - parent key not found
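ORA-02291 during a table-data load usually means the child table's rows arrived before the rows of its parent table. One hedged workaround, assuming you can disable the constraint for the duration of the load (constraint and table names are taken from the error message above):

ALTER TABLE statico.prod_req_lis_tag_gen DISABLE CONSTRAINT fk_prod_req_lis_tag_gen_2;

-- re-run the import for the skipped table's data, then re-validate
ALTER TABLE statico.prod_req_lis_tag_gen ENABLE VALIDATE CONSTRAINT fk_prod_req_lis_tag_gen_2;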
SQL> select * from ext;
select * from ext
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file empxt.log
OS error The system cannot find the file specified.
ORA-06512: at "SYS.ORACLE_LOADER", line 19
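KUP-04063 means the access driver could not create its log file in the external table's directory on the database server. A hedged check, assuming the directory object is called EXT_DIR and the table owner is SCOTT (neither name is shown in the post):

-- confirm the OS path the directory object points to actually exists on the server
SELECT directory_name, directory_path FROM all_directories WHERE directory_name = 'EXT_DIR';

-- make sure the owner of the external table can write the log there
GRANT READ, WRITE ON DIRECTORY ext_dir TO scott;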
I exported three databases (servicedesk, report and mostcmdb) on server A to a dump file, moved them to a similar dump location on server B, and now I want to import them into B by using:
I have created the directory pointing to 'C:\data' and loaded a comma-delimited CSV file in there...
- Checked the CSV permissions; they are set to 'everyone'
- Checked the privileges on the directory; they are set to 'READ/WRITE'
But when I issue a select statement against the external table I get an error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file input.csv in DUMP_TXT not found
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
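KUP-04040 means the access driver looked in the directory object's path on the database server and could not find the file. A hedged check, assuming the directory object is DUMP_TXT as shown in the error and the file should be C:\data\input.csv:

SELECT directory_path FROM all_directories WHERE directory_name = 'DUMP_TXT';

-- the path is resolved on the database server, not on the client machine,
-- so input.csv must sit in that folder on the server itself, and the file
-- name (including case and extension) must match the LOCATION clause exactly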
Procedure:

CREATE OR REPLACE PROCEDURE test (a NUMBER, b VARCHAR2) IS
BEGIN
  dbms_output.put_line(a || '->' || b);
END;
/
Anonymous Block:

begin
  exec test(1,'m');
end;
/

When I run this I get the error:
ORA-06550: line 2, column 7:
PLS-00103: Encountered the symbol "TEST" when expecting one of the following: [code]...
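EXEC is a SQL*Plus shorthand, not PL/SQL, so it cannot appear inside a BEGIN...END block; inside the block the procedure is simply called by name. A minimal sketch of both forms:

-- inside an anonymous block, call the procedure directly
begin
  test(1, 'm');
end;
/

-- or, at the SQL*Plus prompt only, use the shorthand
EXEC test(1, 'm');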
I am trying to execute an inline spatial query from C# using ODP.NET. This is a select query and I am passing two parameters to it: one is a string and the other is an SDO_GEOMETRY. I am able to send the SDO_GEOMETRY parameter from C# using Oracle UDT custom-designed classes.
The query works as intended but occasionally gives the following errors:
ORA-29902: error in executing ODCIIndexStart() routine
ORA-13033: Invalid data in the SDO_ELEM_INFO_ARRAY in SDO_GEOMETRY object
ORA-06512: at "MDSYS.SDO_INDEX_METHOD_10I", line 333
ORA-29902: error in executing ODCIIndexStart() routine
ORA-13031: Invalid Gtype in the SDO_GEOMETRY object for point object
ORA-06512: at "MDSYS.SDO_INDEX_METHOD_10I", line 333
I could not figure out the cause of this issue (as it is not failing frequently). Am I getting this issue from the actual query or from the Oracle UDT conversion?
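ORA-13031 and ORA-13033 point at the geometry being passed in, not at the query text. A hedged way to narrow it down is to validate the incoming geometry on the database side before it reaches the spatial operator (the 0.005 tolerance is just an example value):

SELECT sdo_geom.validate_geometry_with_context(:geom, 0.005) FROM dual;

-- 'TRUE' means the geometry is well formed; anything else returns the offending
-- error code, which would suggest the C# UDT conversion occasionally produces
-- bad SDO_GTYPE or SDO_ELEM_INFO values rather than the query being at fault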
I am using a script to export a full backup of my Oracle database, and for the last few days the script has been ending with warnings (Export terminated successfully with warnings). When I checked the log file and the audit file I found the error below:
EXP-00107: Feature (BINARY XML) of column XMLTYPE001 in table APEX_040000.WWV_FLOW_COLLECTION_MEMBERS$ is not supported. The table will not be exported.
As a crosscheck I tried to create a table with an XMLType column and back it up using exp, and that was successful.
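EXP-00107 is raised because classic exp does not support binary XML storage; the APEX collection table stores its XMLType column as binary XML, while the manually created test table probably does not, which would explain why the crosscheck succeeded. The usual way around it is Data Pump; a hedged sketch, assuming a directory object named DUMP_DIR already exists and credentials are placeholders:

expdp system/password full=y directory=DUMP_DIR dumpfile=full_%U.dmp logfile=full_exp.log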
I am having a problem when trying to export my DB. I can run an import fine, and I have run catalog.sql, catproc.sql, catexp.sql and utlrp.sql again. Is it because the client and DB versions are different? How can I solve this problem?
exp usr/pass file=exp_full.dmp log=exp_full.log full=y consistent=y

Export: Release 8.1.7.0.0 - Production on Wed Nov 28 13:40:04 2007
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Connected to: Oracle8i Enterprise Edition Release 8.1.6.0.0 - Production
With the Partitioning option
I have a live server running 11g R2 with Data Guard on Windows Server 2008 64-bit. I am exporting the DMP from live and importing it into 10g on Windows Server 2003 32-bit. Not all tables are imported.
Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema ADAS is needed to import this object, but is unaccessible
ORA-28031: maximum of 148 enabled roles exceeded
[code]...
ORA-06512: at "SYS.KUPW$WORKER", line 1342
ORA-06512: at line 2
Job "SYS"."SYS_IMPORT_FULL_01" stopped due to fatal error at 17:13:38
We are implementing a 2-node RAC configuration with ASM on VMware and Openfiler, on Linux RHEL 6.2. We started our installation with Grid Infrastructure. While executing root.sh on node 1 it gives an error: the diskgroup cannot be mounted and no alterations were performed, as below.
[main] [ 2012-10-04 05:38:33.150 PDT ] [UsmcaLogger.logException:173] SEVERE: method oracle.sysman.assistants.usmca.backend.USMDiskGroupManager:mountDiskGroups
[main] [ 2012-10-04 05:38:33.151 PDT ] [UsmcaLogger.logException:174] ORA-15032: not all alterations performed
ORA-15017: diskgroup "CRS" cannot be mounted
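ORA-15017 at this stage usually means the ASM instance on that node cannot see the disks that make up the CRS diskgroup. A hedged first check from the grid home, assuming the ASM instance is up and you can connect as SYSASM:

sqlplus / as sysasm

SELECT path, header_status, mount_status FROM v$asm_disk;
SHOW PARAMETER asm_diskstring

-- disks belonging to the CRS diskgroup should show header_status = 'MEMBER';
-- if they are missing entirely, the asm_diskstring setting or the Openfiler/iSCSI
-- presentation on this node is the first thing to fix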
I have Oracle 10.2.0.4.0 installed on Windows Server 2008 [machine A] and Oracle 10.2.0.1.0 on Windows XP [machine B]. I have taken the export of the database on Windows Server 2008 [machine A] by putting the entry in the tnsnames.ora file of Windows XP [machine B].
Now when I am importing on the same machine I am getting the error below:
C:\Documents and Settings\dsharma>IMP FROMUSER=SYSTEM TOUSER=ESCDBO FILE='D:\share\vcc53_0106.dmp' LOG='D:\share\vcc53_0106_IMP.LOG' ignore=y

Import: Release 10.2.0.1.0 - Production on Thu Jun 9 20:08:11 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Username: system
Password:

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

Export file created by EXPORT:V10.02.01 via conventional path
import done in WE8MSWIN1252 character set and UTF8 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses US7ASCII character set (possible charset conversion)
Import terminated successfully without warnings.
I have a problem with import in Oracle 8.1.7. The size of the import file is 29600 KB and the tablespace size is 16 GB, yet when I try to run the import Oracle returns this message:
IMP-00003: ORACLE error 1659 encountered
ORA-01659: unable to allocate MINEXTENTS beyond 7 in tablespace DATA
The DATA tablespace is full. I think the import file contains storage information from the original tablespace from which the export was made, but I don't know how to resolve the problem.
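ORA-01659 means the segment being created asks for more contiguous space than the DATA tablespace has free, typically because the extent sizes were inherited from the source database (exp with the default compress=y rolls everything into one large INITIAL extent). Two hedged options, with file names and sizes as placeholders:

-- give the tablespace more room
ALTER TABLESPACE data ADD DATAFILE '/u01/oradata/ORCL/data02.dbf' SIZE 500M;

-- or let an existing file grow on demand
ALTER DATABASE DATAFILE '/u01/oradata/ORCL/data01.dbf' AUTOEXTEND ON NEXT 50M MAXSIZE UNLIMITED;

Alternatively, re-take the export with compress=n, or pre-create the tables with smaller storage clauses and import with ignore=y, which is the classic 8i workaround.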
When I try to export a schema using expdp I get an error like:
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 6.25 MB
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
I did an export from 8.1.7 and imported it into 10g. All the tables were imported. I created all the users and gave them the necessary grants. When I try to compile synonyms for the schema I imported, I get the following error in Enterprise Manager:
Failed to compile: ORA-00980: synonym translation is no longer valid
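ORA-00980 means the synonym points at an object that no longer exists or is not visible, which often happens when only some of the schemas from the 8.1.7 export were brought across. A hedged check for synonyms whose targets are missing (run as a DBA):

SELECT s.owner, s.synonym_name, s.table_owner, s.table_name
FROM   dba_synonyms s
WHERE  s.db_link IS NULL
AND    NOT EXISTS (SELECT 1
                   FROM   dba_objects o
                   WHERE  o.owner = s.table_owner
                   AND    o.object_name = s.table_name);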
I cannot log in to my application with these users even though they have all the grants. I receive the following error:
My Oracle database sits on UNIX. I have a SQL*Loader script which loads data into Oracle every 10 minutes, and a bad file is written into a directory. Since the file names are the same, it overwrites the bad file whenever there are error records; I could devise code to write the bad file with a different name, but what I really want is to write the error records into an Oracle table. Is this possible, and how can I achieve it?
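SQL*Loader itself can only write rejected rows to the bad file, but if the load can be switched to an external table over the same flat file, rejected rows can be captured in a table with DML error logging instead. A hedged sketch, assuming a staging external table ext_stage already exists over the incoming file and the target is target_table (both names are placeholders):

-- one-time setup: creates the error logging table ERR$_TARGET_TABLE
BEGIN
  dbms_errlog.create_error_log(dml_table_name => 'TARGET_TABLE');
END;
/

-- each 10-minute run: rows that fail go into the error table, tagged per run
INSERT INTO target_table
SELECT * FROM ext_stage
LOG ERRORS INTO err$_target_table ('load_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'))
REJECT LIMIT UNLIMITED;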
But it gives the following error:
ORA-31693: Table data object "LOG"."DAILY_LOG" failed to load/unload and is being skipped due to error:
ORA-00904: "YYYYMMDD": invalid identifier
I tried a simple SQL with YYYYMMDD and it works fine; entry_date is a CHAR field. Where am I going wrong in the QUERY?
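ORA-00904 on the format mask usually means the single quotes inside the QUERY parameter were stripped by the operating-system shell before expdp saw them, so 'YYYYMMDD' arrived as a bare identifier. The usual workaround is to put QUERY in a parameter file, where no shell escaping is needed; a hedged sketch (the WHERE clause is only an illustration, since the original query isn't shown):

-- daily_log.par
directory=DUMP_DIR
dumpfile=daily_log.dmp
tables=log.daily_log
query=log.daily_log:"WHERE entry_date >= TO_CHAR(SYSDATE-1,'YYYYMMDD')"

expdp system/password parfile=daily_log.par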
SQLLDR userid=userid/passwd@vpl01 control=OtherType.ctl log=OtherType.log bad=OtherType.bad

SQL*Loader: Release 11.2.0.1.0 - Production on Sat Aug 25 13:32:42 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

SQL*Loader-704: Internal error: ulconnect: OCIEnvCreate [-1]
If I provide the full connection string, it gives a syntax error:

sqlldr userid/passwd@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=orasrv)(PORT=1526))(CONNECT_DATA=(SID=vpl01))) control=OtherType.ctl log=OtherType.log bad=OtherType.bad

LRM-00116: Message 116 not found; No message file for product=ORACORE, facility=LRM
[code]...
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL>
but it doesn't work if I supply the following command: sqlplus userid/passwd@vpl01

SQL*Plus: Release 11.2.0.3.0 Production on Sat Aug 25 13:32:14 2012
Copyright (c) 1982, 2011, Oracle. All rights reserved.

ERROR:
ORA-12154: TNS:could not resolve the connect identifier specified
Enter user-name:
Even tnsping vpl01 gives an error:

TNS Ping Utility for 64-bit Windows: Version 11.2.0.1.0 - Production on 25-AUG-2012 09:14:40
Copyright (c) 1997, 2010, Oracle. All rights reserved.
Used parameter files:
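The version banners give a clue: sqlplus is 11.2.0.3 and connects, while sqlldr and tnsping are 11.2.0.1 and fail, which suggests two Oracle homes are installed and the 11.2.0.1 client is resolving a different (or missing) tnsnames.ora, with LRM-00116 hinting that ORACLE_HOME is not set correctly for that binary. A hedged check from a Windows command prompt (the TNS_ADMIN path is a placeholder):

where sqlldr
where sqlplus
rem point both clients at the same tnsnames.ora for this session
set TNS_ADMIN=C:\app\oracle\product\11.2.0\client_1\network\admin
tnsping vpl01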
We have a QA database on a VM server with the Windows 2003 operating system and Oracle 10.2.0.1 installed, along with limited disk space. We received an expdp file from a client that is large enough (40 GB) that we had to copy it to a network drive. I created a new directory object called IMPDMP with the directory path (using UNC pathing) \\server\share\folder\subfolder (our network-mapped P drive; yes, I included the backslash, but I have tried without it also). I also included the parfile here. I checked the grants and they seem to be fine:
SQL> select * from session_roles where role like '%DATABASE' or role like 'DBA';

ROLE
------------------------------
DBA
EXP_FULL_DATABASE
IMP_FULL_DATABASE

SQL> select * from session_privs where privilege like '%DICT%';

PRIVILEGE
----------------------------------------
SELECT ANY DICTIONARY
ANALYZE ANY DICTIONARY
[code]....
My questions are:
1) In interactive mode, does a dummy file expdat.dmp have to exist in the DATA_PUMP_DIR directory?
2) Does my export have to reside in the DATA_PUMP_DIR directory (again, there is no disk space to handle the DMP file)? One of the hard drives is just big enough to hold it, but since it has datafiles on it also, the import would crash when the datafiles try to extend.
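In general the dump file does not have to live in DATA_PUMP_DIR, and no dummy expdat.dmp needs to pre-exist there; impdp reads from whatever directory object the DIRECTORY parameter names. The catch with a UNC path on Windows is that the Oracle service account (not your interactive login) is the one that must be able to read the share, so a service running as LocalSystem typically cannot see it. A hedged sketch with the share path and file name as placeholders:

CREATE OR REPLACE DIRECTORY impdmp AS '\\server\share\folder\subfolder';
-- only needed if the importing user is not already privileged
GRANT READ, WRITE ON DIRECTORY impdmp TO system;

impdp system/password directory=IMPDMP dumpfile=client_export.dmp logfile=client_import.log full=y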
I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
-- Create directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';

-- Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

-- Script to import
DECLARE
  l_dp_handle1 NUMBER;
BEGIN
  l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
[code]...
Errors
ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
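ORA-31634 usually means a Data Pump job with the same name already exists, either because a previous run died and left its master table behind or because the default job name is reused before the old job is cleaned up. A hedged way to handle it is to give each run its own job_name and clean up any orphans (the job name prefix is just an example):

-- see what Data Pump thinks is still around
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

-- for a NOT RUNNING orphan, dropping its master table removes the job
-- DROP TABLE scott.SYS_IMPORT_FULL_01 PURGE;

-- in the PL/SQL script, make the job name unique per run
l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
                                   job_mode  => 'TABLE',
                                   job_name  => 'IMP_STAGE_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));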
I have a table which is being loaded by SQL*Loader. When I load the table without the direct path option it works well, but when DIRECT is set to TRUE it fails with the following error:

SQL*Loader-702: Internal error - Unknown column for OCI_ATTR_COL_COUNT
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

The control file looks like below:

load data
BADFILE '/backup/temp/rajesh/RIO/BadFiles/FILENAME'
append into table TEMP_rio_RESP_TIME_LND
TRAILING NULLCOLS
(
  INSTALLATION_ID CHAR
Since there is an extra double quote (denoting inches) in the third column, I'm getting an error. Is there any way to avoid this error without modifying the CSV file?
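If the control file applies an enclosure clause to every field, the stray inch mark in the third column breaks the enclosure matching. SQL*Loader allows the delimiter specification to be overridden per field, so one hedged option is to treat only that column as terminated-but-not-enclosed. The table, column names and layout below are assumptions, since the actual control file isn't shown:

load data
infile 'sizes.csv'
append into table parts_stage
fields terminated by ',' optionally enclosed by '"'
(
  part_no    char,
  part_desc  char,
  -- the field-level spec overrides the FIELDS clause for this column only,
  -- so an unmatched " such as 15" is loaded as ordinary character data
  part_size  char terminated by ','
)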
The function below is used to transform data and is called in the SQL*Loader control file:

CREATE OR REPLACE FUNCTION return_domain (domain_name VARCHAR2)
  RETURN VARCHAR2
AS
  v_dmn VARCHAR2(100);
[code]...
The SQL*Loader control file is as below:

load data
BADFILE '/backup/temp/rajesh/CERNER/BadFiles/FILENAME'
append into table TEMP_CERNER_RESP_TIME_LND
WHEN CLINICAL_TRANSACTION_ID = 'USR:ERM PMSEARCH ENCOUNTER RESULTS DISPLAY'
TRAILING NULLCOLS
[code]...
The function takes the parameter 'DOMAIN50_LPAR5002_slainterval051712_rj35cmi102_08_45_00.csv'.
FILENAME in the control file is replaced by DOMAIN50_LPAR5002_slainterval051712_rj35cmi102_08_45_00.csv, and when I run the loader I get the error below:
Record 1: Rejected - Error on table TEMP_CERNER_RESP_TIME_LND.
ORA-00604: error occurred at recursive SQL level 1
ORA-01843: not a valid month
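ORA-01843 during a load usually means a character value is being implicitly converted to a DATE using the session's NLS_DATE_FORMAT, either inside the return_domain function or for a date column that has no mask in the control file. A hedged fix is to make every conversion explicit; the column name and format below are placeholders, since the full control file and function body aren't shown:

response_dt  DATE "MM/DD/YYYY HH24:MI:SS"

-- and inside the PL/SQL function, convert explicitly rather than relying on NLS settings:
-- TO_DATE(v_date_string, 'MM/DD/YYYY HH24:MI:SS')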