SQL & PL/SQL :: Receiving ORA-29913 And ORA-30653 Errors And Query Does Not Complete
Sep 23, 2010
Receiving ORA-29913 and ORA-30653 errors, and the query does not complete. The queries that receive the error are shown below; <> means substitute your filename. There are actually three different queries that must run to produce the expected output, and they run in the order in which they appear below:
Query1
select distinct HLQ "highlevel", RESOURCE "RES" from <filename1>
Query2
select distinct ENT.RESOURCE "RES", ENT.ID "PLEX",
case
when ENT.ACTION is null then nextkey
else ent.action
end "ACT",
case
when ENT.ACTION is null then ENT.CONDITION_NEXTKEY
end "NRULE",
'TESTSET' "ENTTYPE", replace(My.Owner, '"',' ') "Owner"
from <filename1> ENT, <filename2> ORS
where ENT.f1 = ORS.f2
and ENT.APPLICATION = '<entertypehere>'
Query3
select ENT.ID "AID", ENT.RESOURCE "RES", ENT.ID "PLEX", replace(My.Owner, '"',' ') "Owner", 'TESTSET' "ENTTYPE" from <filename1> ENT, <filename2> ORS
where APPLICATION = '<entertype>'
and ENT.<f1_column_name> = ORS.<f2_column_name>
The error returned is: "ORA-29913: error in executing ODCIEXTTABLEFETCH callout; ORA-30653: reject limit reached"
In the log file of the job that runs these queries, I can also see the following:
NULL if (CAS9 = ~) error processing column C5 in row 1 for datafile /home/oracle/DataDir/<filename>_metadata.data
I have checked, and it looks like the routine creates the ODCIEXTTABLEFETCH callout as it needs it and then throws it away, leaving me with ORA-12899: value too large for column C5 (actual: 13, maximum: 10).
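ORA-30653 means more rows were rejected than the external table's REJECT LIMIT allows, and the ORA-12899 in the log suggests column C5 is defined narrower than the incoming data (10 versus 13 characters). A sketch of the two usual adjustments; the table and column names here are placeholders, since the real external table definition is not shown:
-- Hypothetical fixes: widen the too-small column (or recreate the table with
-- the wider column) and/or relax the reject limit so the load can finish
-- while the bad rows are investigated.
ALTER TABLE my_ext_table MODIFY (c5 VARCHAR2(20));
ALTER TABLE my_ext_table REJECT LIMIT UNLIMITED;
-- Rejected rows are written to the BADFILE named in the access parameters.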
View 2 Replies
Jun 23, 2011
I'm joining two tables, event_types and tmp_acc.
event_types contains 2 billion records.
tmp_acc contains 20,000 records.
The result is about 300,000 rows. In the event_types table, the join columns end_t and account_obj_id0 are indexed;
there are no indexes on tmp_acc.
When I run the query below with a nested loop it takes 6 hours to complete, but with a hash join it was still running even after 4 days. What is wrong with the hash join here? Why does it take so long? I'm joining only 20,000 rows, so I think there should be a way to get the result rows quickly.
show parameters hash_area_size
NAME TYPE VALUE
------------------------------------ ----------- ------------------------------
hash_area_size integer 2097152
explain plan for
select --+ parallel(e,6)
[code]....
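For reference, here is a minimal sketch of how the hash-join variant might be written and checked; the original query text is elided above, so the join shown here (on account_obj_id0) is an assumption based on the description:
-- Hypothetical sketch, not the original query.
EXPLAIN PLAN FOR
SELECT /*+ USE_HASH(e a) PARALLEL(e 6) */ e.*
  FROM event_types e
  JOIN tmp_acc a ON a.account_obj_id0 = e.account_obj_id0;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);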
View 21 Replies
View Related
Mar 10, 2012
I ran the following code in Oracle 10g:
SQL> CREATE TABLE ext(
2 empno varchar2(100),
3 login Varchar2(100))
4 ORGANIZATION EXTERNAL
5 (TYPE ORACLE_LOADER
[code]....
Table created.
But I got these errors:
SQL> select * from ext;
select * from ext
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file empxt.log
OS error The system cannot find the file specified.
ORA-06512: at "SYS.ORACLE_LOADER", line 19
View 16 Replies
View Related
Aug 12, 2011
I have a problem here. I am using this script to create an external table:
CREATE TABLE SUPP_GIM.T_CSV_DATA
(
UPC VARCHAR2(30),
AGEIND VARCHAR2(30),
DEPT VARCHAR2(30),
CREG VARCHAR2(30),
SUPP VARCHAR2(30),
PRICE VARCHAR2(30),
[code]....
I have created the directory pointing to 'C:\data' and loaded a comma-delimited CSV file in there.
- Checked the CSV file permissions; they are set to 'everyone'
- Checked the privileges on the directory; they are set to READ/WRITE
But when I issue a SELECT statement against the external table I get this error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file input.csv in DUMP_TXT not found
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
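KUP-04040 means the file named in the LOCATION clause was not found in the directory object the table uses (DUMP_TXT here). A quick check, as a sketch; only the DUMP_TXT name comes from the error above:
-- Confirm where DUMP_TXT actually points, then compare that path and the
-- exact file name (input.csv) with what is on disk on the database server.
SELECT directory_name, directory_path
  FROM all_directories
 WHERE directory_name = 'DUMP_TXT';
-- The table definition should reference the same names, e.g.
--   DEFAULT DIRECTORY dump_txt LOCATION ('input.csv')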
View 3 Replies
View Related
Jul 6, 2012
While importing schemas I came across:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-02291: integrity constraint (STATICO.FK_PROD_REQ_LIS_TAG_GEN_2) violated - parent key not found
ORA-31693: Table data object "STATICO"."PROD_REQ_LIS_TAG_GEN":"GEN_DS_LOAN" failed to load/unload and is being skipped due to error:
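One common workaround, shown only as a sketch, is to disable the child table's foreign key for the duration of the load and re-validate it afterwards; the object names below come from the error messages above, and it is assumed the constraint belongs to that table:
ALTER TABLE statico.prod_req_lis_tag_gen
  DISABLE CONSTRAINT fk_prod_req_lis_tag_gen_2;
-- re-run the import for this table, then:
ALTER TABLE statico.prod_req_lis_tag_gen
  ENABLE VALIDATE CONSTRAINT fk_prod_req_lis_tag_gen_2;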
View 2 Replies
View Related
Aug 15, 2013
I have a problem with the code below. I would like to define a variable (exrate1) that I can use in another script. Also, the variable should be different for every month (that is why the timestamp is between some period).
The problem is that I receive some errors which I don't know how to fix.
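For context, one way this is often done in SQL*Plus is COLUMN ... NEW_VALUE, which captures a query result into a substitution variable that later scripts can reference. A minimal sketch; the table and column names are assumptions, since the original code is not shown:
-- Capture this month's rate into the substitution variable &exrate1.
COLUMN rate NEW_VALUE exrate1
SELECT rate
  FROM exchange_rates
 WHERE rate_date >= TRUNC(SYSDATE, 'MM')
   AND rate_date <  ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1)
   AND ROWNUM = 1;
-- Any script run afterwards in the same session can use &exrate1, e.g.
-- SELECT amount * &exrate1 FROM invoices;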
View 8 Replies
View Related
Feb 9, 2011
We are receiving data from our upstream through a text file; they extract that text file from their Oracle table. In the table, in the ename column, some names contain the special character ', for example:
Ch'ng
O'Reilly
We load the data from that text file into our Oracle tables and the names load as they are. When we then extract data as a text file from our tables, the special character ' changes to ¿ and the names change to Ch¿ng and O¿Reilly. We load the data from that text file into our tables and the names load as Ch¿ng and O¿Reilly. I tried to replace the special character ¿ with ' while loading, but it still loads as ¿. If I try to change the character ¿ to ' (apostrophe) while loading, it is changing.
The data type of the column is VARCHAR2(20)
NLS_CHARACTERSET is UTF8
Version Oracle 9i
View 6 Replies
View Related
May 11, 2010
I have a view in schemaA called viewA, for example. The view is quite complex, with multiple joins. I've granted ALL on the view, and on all the tables referenced within the view, to schemaB. However, when I query the view from schemaB, I get an ORA-00942: table or view does not exist.
SQL> select * from schemaA.viewA;
select * from schemaA.viewA
*
ERROR at line 1: ORA-00942: table or view does not exist
I suspect that Oracle is creating temporary tables/views internally (or using something that is already there), and that I need to grant access to these temporary objects.
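For clarity, the grants described above would look roughly like this sketch (names taken from the post; the underlying tables are whatever the view references):
-- Run as schemaA (or a DBA).
GRANT ALL ON schemaA.viewA TO schemaB;
-- plus the equivalent grant on each table the view references; then from schemaB:
SELECT * FROM schemaA.viewA;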
View 25 Replies
View Related
Feb 9, 2011
We are receiving data from our upstream through a text file; they extract that text file from their Oracle table. In the table, in the ename column, some names contain the special character ', for example:
Ch'ng
O'Reilly
We load the data from that text file into our Oracle tables and the names load as they are. When we then extract data as a text file from our tables, the special character ' changes to ¿ and the names change to Ch¿ng and O¿Reilly.
We load the data from that text file into our tables and the names load as Ch¿ng and O¿Reilly. I tried to replace the special character ¿ with ' while loading, but it still loads as ¿.
If I try to change the character ¿ to ' (apostrophe) while loading, it is changing.
The data type of the column is VARCHAR2(20)
NLS_CHARACTERSET is UTF8
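When an apostrophe turns into ¿ like this, the byte actually stored is usually a replacement character produced during character-set conversion (often a client NLS_LANG mismatch at load time). A small diagnostic sketch; the table name is an assumption:
-- Inspect the bytes really stored for the affected names; ¿ typically marks a
-- character that could not be converted into the database character set.
SELECT ename, DUMP(ename, 1016) AS stored_bytes
  FROM emp
 WHERE ename LIKE '%' || UNISTR('\00BF') || '%';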
View 27 Replies
View Related
Jul 7, 2011
I created the following procedure in my local database. When I execute the procedure SEND_MAIL, the recipient does not receive the mail. Do I need any setup in my database?
here is my procedure.
CREATE OR REPLACE PROCEDURE SEND_MAIL (
msg_to varchar2,
msg_subject varchar2,
msg_text varchar2 )
IS
c utl_smtp.connection;
rc integer;
msg_from varchar2(50) := 'Oracle11.2';
mailhost VARCHAR2(30) := '10.137.133.18'; -- local database host
[code]....
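For comparison, a minimal UTL_SMTP send usually has this shape. This is only a sketch: the host and addresses are placeholders, the host named must be a reachable SMTP server, and on 11g the database user also needs a network ACL that allows connecting to it:
DECLARE
  c utl_smtp.connection;
BEGIN
  c := utl_smtp.open_connection('smtp.example.com', 25);  -- placeholder SMTP host
  utl_smtp.helo(c, 'example.com');
  utl_smtp.mail(c, 'sender@example.com');
  utl_smtp.rcpt(c, 'recipient@example.com');
  utl_smtp.open_data(c);
  utl_smtp.write_data(c, 'Subject: test' || utl_tcp.crlf || utl_tcp.crlf);
  utl_smtp.write_data(c, 'Hello from the database.');
  utl_smtp.close_data(c);
  utl_smtp.quit(c);
END;
/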
View 3 Replies
View Related
Mar 12, 2009
Is there sample code in OCI in C for receiving records of a table into an array of structures, or for dynamically storing the result set in an array using an array of pointers to structures?
View 6 Replies
View Related
Jul 7, 2011
I need to take certain logs on my 3-node 10gR2 RAC with ASM on Solaris 10. I have been asked to shut down the database, nodeapps, listener, ASM and CRS, and then restart CRS. Below are the steps that I have prepared.
CODE
====Enable trace=========
ENABLE LEVEL 5 tracing-
./crsctl debug log crs allcomp:5
====Stop database, asm, listener, nodeapps finally crs===
1. STOP DATABASE
srvctl stop database -d DB_NAME
[code]....
View 2 Replies
View Related
Jan 16, 2013
I'm trying to abort some workflow activities which are not 'COMPLETE' and then purge them. I'm using "WF_ENGINE.AbortProcess" to accomplish this and then "WF_PURGE.Total" to purge the activities. During this process, do any notifications fire?
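For reference, the sequence described usually looks something like this sketch; the item type and key are placeholders:
BEGIN
  wf_engine.abortprocess(itemtype => 'MYITEM',   -- placeholder item type
                         itemkey  => '12345');   -- placeholder item key
  COMMIT;
  wf_purge.total(itemtype => 'MYITEM');
  COMMIT;
END;
/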
View 2 Replies
View Related
Aug 5, 2013
I have done 1Z0-007 in 9i. Now I want to complete the OCA in 10g for Oracle Administration. Which exam do I need to take to complete the 10g OCA?
View 1 Replies
View Related
Feb 5, 2013
The "private strand flush not complete" is a "noise" error, and can be disregarded because it relates to internal cache redo file management.
However this "private strand flush cannot complete" error is sometimes accompanies by a hung database.
These are quotes from MR Burleson himself .
But can this be really ignored as it's also mentioned "hung database"
View 1 Replies
View Related
Aug 31, 2010
How can I find out which queries took more than 15 seconds to execute?
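One common starting point, shown as a sketch, is the cursor statistics in V$SQL (elapsed_time is in microseconds and is cumulative across executions):
-- Statements whose average elapsed time per execution exceeds 15 seconds.
SELECT sql_id,
       executions,
       ROUND(elapsed_time / NULLIF(executions, 0) / 1e6, 1) AS avg_elapsed_sec,
       sql_text
  FROM v$sql
 WHERE elapsed_time > 15 * 1e6 * executions
   AND executions > 0
 ORDER BY avg_elapsed_sec DESC;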
View 4 Replies
View Related
Aug 27, 2010
I was trying to get the structure of a materialized view through ALL_MVIEWS, but I was not getting the complete definition; the ORDER BY clause was missing.
How can we get the complete mview structure?
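One approach often used to pull the stored DDL, shown as a sketch with placeholder owner and mview names:
-- DBMS_METADATA returns the full CREATE MATERIALIZED VIEW statement as stored.
SET LONG 100000 LONGCHUNKSIZE 100000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('MATERIALIZED_VIEW', 'MY_MVIEW', 'MY_OWNER')
  FROM dual;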
View 7 Replies
View Related
Oct 29, 2008
I am using basic replication in Oracle 10.2.0.3. My materialized view contains around 3 million records. Usually we use FAST refresh, but my materialized view has a problem, which is why I am going with a COMPLETE refresh.
If I start a complete refresh, it takes 8-12 hours to finish.
My question is: during a complete refresh, how can I monitor whether the refresh is running or not, other than using
V$MVREFRESH?
Is there any other way to monitor a complete refresh? For example, if the target database is not available, there is a network problem, or a deadlock occurs in my database, I am not able to track the refresh.
There is not even a notification in the alert log.
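As a sketch, V$MVREFRESH is often joined to the session views to see what the refresh session is currently doing; no object names from the post are assumed here:
-- Sessions performing a materialized view refresh, with any long-running
-- operation progress Oracle has recorded for them.
SELECT r.currmvowner, r.currmvname, s.sid, s.status,
       l.opname, l.sofar, l.totalwork, l.time_remaining
  FROM v$mvrefresh r
  JOIN v$session   s ON s.sid = r.sid
  LEFT JOIN v$session_longops l
         ON l.sid = s.sid
        AND l.serial# = s.serial#
        AND l.sofar < l.totalwork;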
View 1 Replies
View Related
Sep 15, 2005
I am running Oracle 9.2.0.1 on Solaris 9. On just about a daily basis we perform sqlldr loads that load on the order of 300000 rows. I frequently see in my alert log:
Checkpoint not complete
Current log# 5 seq# 176431
I have 5 redo logs each of 10M in size. If I check what is going on in v$log and correlate to the alert log when it throws the checkpoint error I always notice that I have one current log (which is good) and the rest are in a status of 'ACTIVE'. It seems that when this happens I get the checkpoint error.
What can I do to get rid of this checkpoint error? Should I increase the size of my redo logs? Is there a good way to go about estimating what size redo logs I should have?
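"Checkpoint not complete" means the database needs to reuse a redo log whose checkpoint has not yet finished, so larger (or more) redo logs are the usual remedy. A sketch of adding bigger groups; the sizes and file paths here are placeholders:
-- Check current groups, add larger ones, then drop the old 10M groups once
-- they become INACTIVE.
SELECT group#, bytes/1024/1024 AS size_mb, status FROM v$log;
ALTER DATABASE ADD LOGFILE GROUP 6 ('/u01/oradata/ORCL/redo06.log') SIZE 100M;
ALTER DATABASE ADD LOGFILE GROUP 7 ('/u01/oradata/ORCL/redo07.log') SIZE 100M;
-- ALTER SYSTEM SWITCH LOGFILE;   -- repeat until an old group goes INACTIVE
-- ALTER DATABASE DROP LOGFILE GROUP 1;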
View 6 Replies
View Related
Jun 14, 2010
While upgrading from 10.2.0.4 to 11.1.0.7, at the end we are facing this error:
ORA-39714: upgrade script utlmmig.sql failed
*Cause: A normal database open was attempted, but the upgrade script utlmmig.sql failed to complete.
*Action: Use the UPGRADE option when opening the database and then run utlmmig.sql.
I ran the utlmmig.sql script and am facing this issue:
SQL> -- we also need to update the statistic
SQL> begin
2 dbms_stats.delete_table_stats('SYS', 'OBJ$MIG');
3 dbms_stats.delete_table_stats('SYS', 'USER$MIG');
4 dbms_Stats.gather_table_stats('SYS', 'OBJ$MIG', estimate_percent => 100,
5 method_opt=>'FOR ALL COLUMNS SIZE SKEWONLY');
6 dbms_Stats.gather_table_stats('SYS', 'USER$MIG', estimate_percent => 100,
7 method_opt=>'FOR ALL COLUMNS SIZE SKEWONLY');
8 end;
9 /
begin
*
ERROR at line 1:
ORA-04063: package body "SYS.DBMS_STATS" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_STATS"
ORA-06512: at line 2
I compiled the invalid objects using utlirp.sql and utlrp.sql, but it was of no use.
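As a diagnostic sketch, the next step is usually to see exactly which SYS objects are still invalid and what the compilation errors for DBMS_STATS are:
-- List invalid SYS objects and the errors recorded for DBMS_STATS.
SELECT object_name, object_type, status
  FROM dba_objects
 WHERE owner = 'SYS'
   AND status = 'INVALID'
 ORDER BY object_name;
SELECT line, position, text
  FROM dba_errors
 WHERE owner = 'SYS'
   AND name  = 'DBMS_STATS';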
View 6 Replies
View Related
Apr 15, 2013
How can I create public synonyms for a complete schema, instead of creating a public synonym object by object?
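There is no single schema-level synonym, so this is normally scripted. A sketch that generates one CREATE PUBLIC SYNONYM per object; the schema name is a placeholder:
-- Create a public synonym for every table and view in the given schema.
BEGIN
  FOR o IN (SELECT object_name
              FROM all_objects
             WHERE owner = 'MYSCHEMA'
               AND object_type IN ('TABLE', 'VIEW')) LOOP
    EXECUTE IMMEDIATE 'CREATE PUBLIC SYNONYM ' || o.object_name ||
                      ' FOR MYSCHEMA.' || o.object_name;
  END LOOP;
END;
/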
View 2 Replies
View Related
Mar 27, 2012
I would like to pursue PL/SQL development as my career, so I would like to create a setup in the following manner on my Windows 7 (64-bit) laptop. I have heard that in real time the setup looks like this:
Windows 7 as the front end
Toad --> to practice PL/SQL (preferably used in Windows 7)
Solaris OS (to install Oracle 10g on Solaris and practice scripting)
Virtual Box (open-source tool to run both Windows and Solaris)
How can I create a setup so that I can accomplish the following (to gain knowledge in programming/scripting)?
1. Both OSes have to run simultaneously in Virtual Box, so that
Oracle is running in the backend (Solaris).
2. Practicing Oracle PL/SQL in Toad (Windows 7).
3.Practicing shell scripting in Solaris.
Is it possible to create the setup as I mentioned above?
View 4 Replies
View Related
Mar 19, 2012
I am loading data from an XML file into an Oracle table. This program works fine for small XML files, but if I try to load a large XML file with multiple pages, only the first ten records are loaded. Here is the procedure:
PROCEDURE Test_xml_read(p_tag varchar2,p_xml_file varchar2,p_path varchar2)
AS
BEGIN
INSERT INTO stg_xml_table(
Productid,productname,price)
select y1.productid, y1.productname, y1.price,
       y2.categoryid, y2.categoryname, y2.categorypath
FROM xmltable('ProductFeed/Products/Product'
passing xmltype(bfilename('TEST_DIR1', 'sample.xml' ), nls_charset_id('CHAR_CS'))
[code]...
What changes need to be made to load multiple pages of data into the table?
View 4 Replies
View Related
Jan 9, 2013
A few years back, most likely under Apex 4.0, I started using jQuery Autocomplete for looking up employee names, hostnames, etc. This used some JavaScript that someone had posted (I don't recall who), and it worked pretty well. I would add a stanza like this to the page definition (to be run after the page loads). The X02 passed some extra info to the on-demand procedure to work with the search.
$("#P28_REQUESTOR").autocomplete('APEX', {
apexProcess: 'PERSON_SEARCH_ACTIVE',
width: 600, multiple: false,
x02: 'Hostmaster:Owner Ok',
[code]...
END;
The g_x01 value is what the user is typing into the text box - once they hit three characters, it gets sent to Oracle for processing via the on-demand procedure. The g_x02 (and 03, 04, ...) are additional parameters - perhaps from a parent LOV. The general concept with the returned data is that the first element is the text to be searched, the second is the actual value (typically an integer - primary key), and the third value is what ends up in the text box once a value is selected (the searching value often has additional identifying info).
This approach worked pretty well, although it did not play well with cascading LOVs. But once I upgraded to Apex 4.2, I started hitting other issues with jQuery version differences, and some odd failures. To make matters worse, the original developer of the Javascript module, abandoned it - was not doing additional work with it.
Combobox
I did NOT want to re-invent the wheel, and it seemed to be a much better approach to move into PlugIn world - ideally using a plugin that someone ELSE wrote and was maintaining. I looked at the LOV Friendly Autocomplete plug in - and even installed it a few places - and discovered that it did not play well with my earlier autocomplete code.
My concern with using the LOV Friendly Autocomplete is that I didn't know how to provide any of the "Target Name" search info to the LOV (since that is what is being typed by the user at the time). I also started looking at the Combobox plugin - which seemed even closer with Lazy Loading=Yes (I also liked the ability to add new items on the fly) - but I was not sure how to provide the "Target Name" info to the pipelined function that generates the list of values.
I want to get rid of all of my old Autocomplete javascript and move to one (or more) plugins - (Which is why the details above don't matter, just trying to provide some background)Is this a good direction (Combobox or LOVFA)? How/Where do I connect my search (lov generation)?
View 0 Replies
View Related
May 21, 2013
I deleted all my RMAN backups with the DELETE BACKUP command, but when I run a query on v$rman_backup_job_details it still shows information. Where does v$rman_backup_job_details read its data from, and how can I clear it completely so that v$rman_backup_job_details doesn't show anything?
View 8 Replies
View Related
Jul 29, 2013
I would like to update the complete flag of all the rows of a table except the most recent two rows of each category.
The table has fields like:
category - string,
item - string,
creation_date - string ('YYYYMMDDHHMI'),
complete_flag - boolean,
etc.
Each category does not have the same number of records with the same creation_date, so I do not want to filter by creation_date. Is there a way to accomplish this?
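One common pattern, shown as a sketch (the table name and flag value are placeholders; the column names come from the list above), is to rank the rows within each category and update everything ranked third or later:
-- Mark every row as complete except the two most recent rows per category.
UPDATE my_table t
   SET t.complete_flag = 'Y'
 WHERE t.rowid IN (
         SELECT rid
           FROM (SELECT ROWID AS rid,
                        ROW_NUMBER() OVER (PARTITION BY category
                                           ORDER BY creation_date DESC) AS rn
                   FROM my_table)
          WHERE rn > 2);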
View 8 Replies
View Related
Oct 9, 2009
I am trying to modify an existing form in Oracle Apps. I copied all the necessary objects and modified the form. I can compile the object in Windows, but when I try to compile the object on the Linux server using SSH I get the following error:
FRM-91500: Unable to start/complete the build. I am using the following command to compile:
f60genm userid=apps/apps batch=yes module=FORMNAME.fmb module_type=form
View 3 Replies
View Related
Jan 4, 2008
I have two servers, A and B; B is the replica of A. I have to alter the materialized view of one of the tables on server B, whose master table exists on both servers. I do not have a materialized view log on either server, and I want to convert the refresh method from COMPLETE to FAST. One way to make this alteration is to create a materialized view log on the master table.
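A sketch of that approach; the object names are placeholders, the log is created on the master site and the materialized view on the replica is then switched to fast refresh:
-- On the master site: create the log that fast refresh requires.
CREATE MATERIALIZED VIEW LOG ON my_master_table WITH PRIMARY KEY;
-- On the replica: switch the refresh method.
ALTER MATERIALIZED VIEW my_mview REFRESH FAST;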
View 6 Replies
View Related
Dec 9, 2011
We have an SSRS front-end screen which sends multi-select column values as comma-separated strings to a back-end (Oracle 10g) procedure.
The procedure builds the string by inserting single quotes in the following manner.
P_BU_LST is the parameter which has the comma-separated values
'1234,3456,4577', i.e. the BU IDs selected by the user in the front end.
The procedure adds single quotes to this parameter value,
i.e. '1234','3456','4577'
v_bu_lst := '''' || REPLACE(v_selbu, ',', ''',''')|| '''';
This is used the where clause of the REF CURSOR SELECT query which send the data back to SSRS
i.e.,
SELECT BU.*
FROM BU_DETAIL BU
WHERE INSTR(V_BU_LST,BU_ID) <> 0;
INSTR has a chance to fail in this scenario: if the value sent from the front end is 123456,3456,4577,
then 123456 does not exist in the table, but INSTR will still match it, and the value 1234 from the table will be sent back to SSRS, which is wrong. Earlier I was using a function to convert the comma-separated values into multiple rows and treat them like a lookup table.
But the main table has around a million records, and each row has to be processed against each row of the lookup table, which makes it slower. To avoid this I used INSTR, which is faster but can give wrong results.
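One lightweight fix, as a sketch, is to compare with delimiters on both sides so that only whole IDs can match; this keeps INSTR but removes the partial-match problem (v_selbu is the raw comma-separated string already used above):
SELECT bu.*
  FROM bu_detail bu
 WHERE INSTR(',' || v_selbu || ',', ',' || bu.bu_id || ',') > 0;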
View 8 Replies
View Related
Aug 29, 2012
I have taken a full backup of the database plus the archive logs, and then copied them to a new server. I want the complete procedure for restoring the database in the new location.
View 3 Replies
View Related