SQL & PL/SQL :: How To Implement (score-proc) Using Bulk Collect
May 20, 2013
In the code below, the procedures "total_score_proc" and "CopyInternalScores" call the "score_proc" procedure 50 times
with different variable values.
Instead of calling "score_proc" 50 times, I want to hold the values in a collection defined in the package and call the procedure only once.
How do I implement "score_proc" using BULK COLLECT?
CREATE OR REPLACE PACKAGE total_score_pkg
IS
PROCEDURE total_score_proc(pBUID IN STAGING_ORDER_DATA.BUID%TYPE,
OrderNum IN STAGING_ORDER_DATA.ORDER_NUM%TYPE,
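One shape that would do this, as a rough sketch only (score_proc itself and the rest of the package are not shown above, so the collection types, the scoring DML and the total_score column are assumptions): declare collection types for the two key columns in the package spec, have the callers fill them in their existing loop, and call score_proc once, with the per-order work done in a single FORALL.

CREATE OR REPLACE PACKAGE total_score_pkg
IS
  TYPE t_buid_tab  IS TABLE OF STAGING_ORDER_DATA.BUID%TYPE      INDEX BY PLS_INTEGER;
  TYPE t_order_tab IS TABLE OF STAGING_ORDER_DATA.ORDER_NUM%TYPE INDEX BY PLS_INTEGER;

  PROCEDURE score_proc(pBUIDs IN t_buid_tab, pOrderNums IN t_order_tab);
END total_score_pkg;
/
CREATE OR REPLACE PACKAGE BODY total_score_pkg
IS
  PROCEDURE score_proc(pBUIDs IN t_buid_tab, pOrderNums IN t_order_tab) IS
  BEGIN
    -- one bulk DML round trip for all 50 value pairs instead of 50 separate calls;
    -- the SET clause and the total_score column are placeholders for the real scoring logic
    FORALL i IN 1 .. pBUIDs.COUNT   -- assumes the arrays are filled densely from 1
      UPDATE staging_order_data
         SET total_score = 0
       WHERE buid      = pBUIDs(i)
         AND order_num = pOrderNums(i);
  END score_proc;
END total_score_pkg;
/

Inside total_score_proc and CopyInternalScores, the existing 50-iteration loop would then only assign pBUIDs(i) and pOrderNums(i), and the single score_proc call would move after the loop.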
I am working on Oracle 11g. I have one normal insert procedure:

CREATE OR REPLACE PROCEDURE test2 AS
BEGIN
  INSERT INTO first_table (citiversion, financialcollectionid, dataitemid, dataitemvalue, [code]....

I am processing 100,000 (1 lakh) rows. Why is BULK COLLECT taking more time? As far as I know it should take less time. Do I need to check any parameter?
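One thing worth checking (a general observation, not a diagnosis of the posted procedure): an unbounded BULK COLLECT pulls all 100,000 rows into PGA memory in one go, which can easily end up slower than a plain cursor loop. A LIMIT keeps the arrays small. In the sketch below, source_table is a placeholder and the column list is borrowed from the INSERT above.

DECLARE
  TYPE t_ver  IS TABLE OF first_table.citiversion%TYPE;
  TYPE t_fcid IS TABLE OF first_table.financialcollectionid%TYPE;
  TYPE t_did  IS TABLE OF first_table.dataitemid%TYPE;
  TYPE t_dval IS TABLE OF first_table.dataitemvalue%TYPE;
  l_ver  t_ver;
  l_fcid t_fcid;
  l_did  t_did;
  l_dval t_dval;
  CURSOR c_src IS
    SELECT citiversion, financialcollectionid, dataitemid, dataitemvalue
      FROM source_table;   -- placeholder source
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_ver, l_fcid, l_did, l_dval LIMIT 1000;  -- bounded memory use
    EXIT WHEN l_ver.COUNT = 0;
    FORALL i IN 1 .. l_ver.COUNT
      INSERT INTO first_table (citiversion, financialcollectionid, dataitemid, dataitemvalue)
      VALUES (l_ver(i), l_fcid(i), l_did(i), l_dval(i));
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/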
I am finishing a procedure migration from an HP-UX server to an Oracle Linux server. I am currently testing the migration procedure by procedure, and there is a time limit, so I would like to use BULK COLLECT and other faster techniques; however, I have not been able to convert the normal procedure into a bulk procedure.
Here are my scripts for the old table, the new table, the normal procedure, and my new procedure. Which parts can be changed to use BULK COLLECT or a bulk insert?
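Without the scripts it is hard to be specific, but two general shapes are worth trying. If the row-by-row logic can be expressed in plain SQL, a single INSERT ... SELECT (optionally direct-path) usually beats any PL/SQL loop; otherwise the BULK COLLECT ... LIMIT plus FORALL pattern sketched a little earlier applies. The table and column names here are placeholders, not taken from the scripts.

-- set-based alternative: no PL/SQL looping at all
INSERT /*+ APPEND */ INTO new_table (col1, col2)
SELECT col1, col2
  FROM old_table;
COMMIT;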
I am getting the error

PLS-00386: type mismatch found at 'RECORD_VARRAY' between FETCH cursor and INTO variables

while executing the code below.

PROCEDURE MAIN_BULK_COLLECT(
  P_STARTDATE     IN  TIMESTAMP DEFAULT NULL,
  P_ENDDATE       IN  TIMESTAMP DEFAULT NULL,
  P_ROW_COUNT     IN  NUMBER    DEFAULT 1000,
  O_RECORD_VARRAY OUT NOCOPY SSAM_VARRAY_TYPE,
  P_ERROR         OUT VARCHAR2) AS
[code]....

I am able to run the program successfully using a FOR LOOP instead of BULK COLLECT, but I want to get it working with BULK COLLECT.
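PLS-00386 generally means the element type of the collection does not match the select list of the cursor. If SSAM_VARRAY_TYPE is a VARRAY of an object type, one common fix is to select the object constructor in the cursor so each fetched row is already an instance of the element type. The sketch below uses assumed names (ssam_obj_type, some_table, col1/col2); only the overall shape is the point.

-- assumed shape of the types; adjust to the real definitions
CREATE OR REPLACE TYPE ssam_obj_type AS OBJECT (col1 NUMBER, col2 VARCHAR2(30));
/
CREATE OR REPLACE TYPE ssam_varray_type AS VARRAY(10000) OF ssam_obj_type;
/

DECLARE
  l_rows ssam_varray_type;
BEGIN
  SELECT ssam_obj_type(t.col1, t.col2)   -- the constructor makes each row match the element type
    BULK COLLECT INTO l_rows
    FROM some_table t
   WHERE ROWNUM <= 1000;
  dbms_output.put_line('fetched ' || l_rows.COUNT || ' rows');
END;
/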
CREATE OR REPLACE PROCEDURE fast_proc IS
  TYPE ARRAY IS TABLE OF mkt_total_lvl_indx_dly_stg%ROWTYPE;
  l_data ARRAY;
  CURSOR C IS SELECT * [code]..........
Can you identify what type of error is present in the below procedure?
create or replace procedure test_bulk_load_type as
  type c1_owner is table of bulk_load_all_object.owner%type index by binary_interger;
  v_owner c1_owner;
For the following procedure, if I pass an employee number that exists in the emp table as input, the procedure executes successfully. But if I pass an employee number that does not exist in the emp table, the exception block does not handle the exception.
I am getting the following error:
ORA-06502: PL/SQL: numeric or value error
ORA-06512: at "RAKULA.SP_TEST_EXCEPTION_BULK", line 8
ORA-06512: at line 7
If I use a WHEN OTHERS exception, I am able to handle it. Why is it happening like this?
CREATE OR REPLACE PROCEDURE RAKULA.sp_test_exception_bulk(i_empno NUMBER) IS
  t type_test1;
BEGIN
  SELECT deptno BULK COLLECT INTO t FROM emp WHERE empno = i_empno;
[code].......
How do I handle that exception?
If I create the procedure without BULK COLLECT, I am able to handle it with WHEN NO_DATA_FOUND.
In the following procedure I am able to handle the exception:
CREATE OR REPLACE PROCEDURE RAKULA.sp_test_exception(i_empno NUMBER, v_dept_no OUT NUMBER) IS
BEGIN
  SELECT deptno INTO v_dept_no FROM emp WHERE empno = i_empno;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    dbms_output.put_line('employee number' || i_empno || 'does not exist');
END sp_test_exception;
/
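For the BULK COLLECT version, the behaviour is expected: an empty result does not raise NO_DATA_FOUND, the collection simply comes back with COUNT = 0, so that handler never fires (the error is raised later, when the code touches the collection). A common pattern is to test the count and report or raise explicitly. A sketch, reusing the names from the post and assuming type_test1 is a nested table of deptno values:

CREATE OR REPLACE PROCEDURE rakula.sp_test_exception_bulk(i_empno NUMBER) IS
  TYPE type_test1 IS TABLE OF emp.deptno%TYPE;   -- assumed definition of type_test1
  t type_test1;
BEGIN
  SELECT deptno BULK COLLECT INTO t
    FROM emp
   WHERE empno = i_empno;

  IF t.COUNT = 0 THEN
    dbms_output.put_line('employee number ' || i_empno || ' does not exist');
    RETURN;   -- or RAISE NO_DATA_FOUND if the caller expects the exception
  END IF;

  dbms_output.put_line('deptno of first match: ' || t(1));
END sp_test_exception_bulk;
/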
I have a SELECT ... BULK COLLECT INTO clause inside a FOR loop, because the query takes a parameter from the loop. How do I extend the collection with the fetched rows on each iteration?
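One way to accumulate rows across iterations (a sketch against the standard emp/dept tables as stand-ins for the real query and driving loop) is to bulk collect each batch into a scratch collection and append it to a running collection:

DECLARE
  TYPE t_num_tab IS TABLE OF NUMBER;
  l_batch t_num_tab;
  l_all   t_num_tab := t_num_tab();
BEGIN
  FOR r IN (SELECT deptno FROM dept) LOOP        -- driving loop supplying the parameter
    SELECT empno BULK COLLECT INTO l_batch
      FROM emp
     WHERE deptno = r.deptno;

    -- append this batch to the accumulated collection
    FOR i IN 1 .. l_batch.COUNT LOOP
      l_all.EXTEND;
      l_all(l_all.COUNT) := l_batch(i);
    END LOOP;
    -- for nested tables, l_all := l_all MULTISET UNION ALL l_batch; is a one-line alternative
  END LOOP;
  dbms_output.put_line('total rows collected: ' || l_all.COUNT);
END;
/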
The procedure uses BULK COLLECT to fetch from a normal cursor, and then FORALL to insert into the target table. The number of rows is 234,965,470.
Question: what should the LIMIT for my BULK COLLECT ideally be?
According to the discussion below, it should be in the hundreds: [URL]...
I tried a BULK COLLECT limit of 50000 and it took close to 2 hours; then I tried 10000, which was just 3 minutes shorter than that. But if you commit every 500 rows, isn't there also the theory that frequent commits are not good?
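Rather than guessing, one option is to time the fetch of a fixed slice of the data at a few candidate limits on the actual system. A rough harness (source_table is a placeholder; times are in hundredths of a second from DBMS_UTILITY.GET_TIME):

DECLARE
  TYPE t_limits IS TABLE OF PLS_INTEGER;
  l_limits t_limits := t_limits(100, 1000, 10000);
  l_start  PLS_INTEGER;
  l_total  PLS_INTEGER;
BEGIN
  FOR i IN 1 .. l_limits.COUNT LOOP
    DECLARE
      CURSOR c_src IS SELECT * FROM source_table WHERE ROWNUM <= 100000;
      TYPE t_rows IS TABLE OF c_src%ROWTYPE;
      l_rows t_rows;
    BEGIN
      l_start := DBMS_UTILITY.GET_TIME;
      l_total := 0;
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT l_limits(i);
        l_total := l_total + l_rows.COUNT;
        EXIT WHEN c_src%NOTFOUND;
      END LOOP;
      CLOSE c_src;
      dbms_output.put_line('limit ' || l_limits(i) || ': ' || l_total || ' rows in ' ||
                           (DBMS_UTILITY.GET_TIME - l_start) || ' hsecs');
    END;
  END LOOP;
END;
/

Committing once per fetched batch (every LIMIT rows) rather than every 500 rows also keeps the commit frequency tied to the array size instead of adding a separate counter.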
I used BULK COLLECT and FORALL statements to select the data and insert it into a temp table. The SELECT SQL returns one row, but that row is not inserted into the temp table, and no exception is thrown. I used a ref cursor because the select statement differs for every cursor.
Here I have modified the code and provided only one cursor.
Create Or Replace Procedure Sales_Hist_Update_Bkp Is
  Type Type_Name Is Record(
    Sku_Item_Key            Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Sku_Item_Key%Type,
    Locationno              Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Locationno%Type,
    Bsns_Unit_Key           Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Bsns_Unit_Key%Type,
    Act_Item_Cost_Amt       Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Item_Cost_Amt%Type,
    Act_Rglr_Unit_Price_Amt Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Rglr_Unit_Price_Amt%Type,
[code]...
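Since most of the procedure is elided, only a general debugging shape can be suggested. Printing the collection count before the FORALL and SQL%ROWCOUNT after it usually shows where the row disappears, and a missing COMMIT (or a global temporary table defined ON COMMIT DELETE ROWS) is the classic reason for "no rows, no error". All names below are placeholders:

DECLARE
  TYPE t_rows IS TABLE OF temp_table%ROWTYPE;   -- placeholder target temp table
  l_rows t_rows;
BEGIN
  SELECT * BULK COLLECT INTO l_rows
    FROM source_view;                           -- placeholder for the ref-cursor query;
                                                -- its columns must match temp_table
  dbms_output.put_line('fetched:  ' || l_rows.COUNT);

  FORALL i IN 1 .. l_rows.COUNT
    INSERT INTO temp_table VALUES l_rows(i);

  dbms_output.put_line('inserted: ' || SQL%ROWCOUNT);
  COMMIT;
END;
/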
The following record type and table type are created and used in the procedure below. The procedure will be called from another procedure to return the records, but the count is showing as 0. How do I get an OUT parameter containing the records, which I can then use in the calling procedure?
-- Record type
type t_ein_cmpl_rec IS RECORD(
  IN_req_param_id    t_IN_req_param_id,
  IN_call_request_id t_IN_call_request_id,
  IN_action          t_IN_action,
  IN_event_id        t_event_id,
  IN_ported_nbr      t_IN_ported_nbr,
[code]........
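A detail that often explains a zero count: the procedure has to bulk collect (or otherwise assign) directly into the OUT collection parameter itself, and the caller has to read that same parameter. A sketch, with the field and column names trimmed down to assumed scalars because the real definitions are elided:

CREATE OR REPLACE PACKAGE ein_cmpl_pkg IS
  TYPE t_ein_cmpl_rec IS RECORD (
    in_req_param_id    NUMBER,        -- assumed scalar types standing in for the real subtypes
    in_call_request_id NUMBER,
    in_action          VARCHAR2(30)
  );
  TYPE t_ein_cmpl_tab IS TABLE OF t_ein_cmpl_rec;

  PROCEDURE get_ein_cmpl(p_call_request_id IN NUMBER,
                         p_recs            OUT t_ein_cmpl_tab);
END ein_cmpl_pkg;
/
CREATE OR REPLACE PACKAGE BODY ein_cmpl_pkg IS
  PROCEDURE get_ein_cmpl(p_call_request_id IN NUMBER,
                         p_recs            OUT t_ein_cmpl_tab) IS
  BEGIN
    SELECT req_param_id, call_request_id, action   -- placeholder columns
      BULK COLLECT INTO p_recs                     -- fill the OUT parameter itself
      FROM ein_cmpl_table                          -- placeholder table
     WHERE call_request_id = p_call_request_id;
  END get_ein_cmpl;
END ein_cmpl_pkg;
/

-- calling block
DECLARE
  l_recs ein_cmpl_pkg.t_ein_cmpl_tab;
BEGIN
  ein_cmpl_pkg.get_ein_cmpl(101, l_recs);
  dbms_output.put_line('rows returned: ' || l_recs.COUNT);
END;
/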
declare
  TYPE id_collection is TABLE of number(6);
  TYPE ename_collection is TABLE of varchar2(20);
  id    ID_COLLECTION;
  ename ENAME_COLLECTION;
  cursor c is select empid, name from Nemp;
begin
  open c;
  loop
[code]....
Here sub_Nemp is my new table, into which I have to insert the values from the old table Nemp. Both tables have the same structure.
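A completed version of that block, as I would sketch it (assuming sub_Nemp really does have the same empid and name columns, as stated):

declare
  TYPE id_collection    is TABLE of number(6);
  TYPE ename_collection is TABLE of varchar2(20);
  id    ID_COLLECTION;
  ename ENAME_COLLECTION;
  cursor c is select empid, name from Nemp;
begin
  open c;
  loop
    fetch c bulk collect into id, ename limit 1000;
    exit when id.count = 0;
    forall i in 1 .. id.count
      insert into sub_Nemp (empid, name) values (id(i), ename(i));
  end loop;
  close c;
  commit;
end;
/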
Is it possible to assign a VARCHAR2(14) index to a PL/SQL table while fetching bulk data with BULK COLLECT? My requirement is to index a PL/SQL table by a VARCHAR2(14) key so that I can reach a record directly by its index and process it further.
I also need to run a loop over this collection (to process each record). My database version is 11.2.0.1.0.
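BULK COLLECT itself can only fill dense, integer-indexed collections, so the VARCHAR2(14) key has to be assigned in a follow-up loop. A sketch, with the table and column names as placeholders:

DECLARE
  CURSOR c_src IS SELECT key_col, payload_col FROM source_table;  -- placeholders
  TYPE t_staging IS TABLE OF c_src%ROWTYPE;                       -- integer-indexed, for the fetch
  TYPE t_by_key  IS TABLE OF c_src%ROWTYPE INDEX BY VARCHAR2(14); -- keyed access afterwards
  l_staging t_staging;
  l_by_key  t_by_key;
  l_key     VARCHAR2(14);
BEGIN
  OPEN c_src;
  FETCH c_src BULK COLLECT INTO l_staging;
  CLOSE c_src;

  FOR i IN 1 .. l_staging.COUNT LOOP
    l_by_key(l_staging(i).key_col) := l_staging(i);   -- re-key by the VARCHAR2(14) value
  END LOOP;

  -- direct access: l_by_key('SOME_KEY_VALUE')
  -- and a loop over all keys for the further processing:
  l_key := l_by_key.FIRST;
  WHILE l_key IS NOT NULL LOOP
    -- process l_by_key(l_key) here
    l_key := l_by_key.NEXT(l_key);
  END LOOP;
END;
/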
create or replace PROCEDURE CDR_PROC_ARCHIVE_ORDER_EXTRACT IS
/* Criteria to be followed to Order Archival
 * Order Status should be 'Cancelled' or 'Complete'
 * Order Closed date should be 6 months before
 */
-- main Cursor to spool the Orders to be archived based on criteria
CURSOR GET_ORD_DET IS (
  SELECT ORD.ROW_WID ORDER_ID
    FROM SRMW.W_ORDER_D@ARCHIVAL_TO_CRMSPA2 ORD
   WHERE (
How can I optimize this code? The scenario: about 40 million rows have to be updated to a static value, and I am committing 1 million rows per loop iteration. The first 1 million rows get updated very fast, probably within 2 minutes; after that the code just hangs and I don't see the committed row count increasing.

Declare
  cursor c1 is select rowid from t1 where c1 is not null;
I have more than 1,000,000 (10 lakh) records in a table, for which I am going to update two columns without any filtering. I have pasted my query; it is taking a long time to update. Is there any way to fine-tune this block?
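A batched shape that usually keeps large updates like these moving (a sketch: t1 and its c1 column come from the cursor above, while the static value and the 10,000-row batch size are placeholders; for very large tables a single SQL UPDATE, or a CREATE TABLE ... AS SELECT with the new values, is often faster still):

DECLARE
  CURSOR c_rows IS SELECT ROWID rid FROM t1 WHERE c1 IS NOT NULL;
  TYPE t_rid_tab IS TABLE OF ROWID;
  l_rids t_rid_tab;
BEGIN
  OPEN c_rows;
  LOOP
    FETCH c_rows BULK COLLECT INTO l_rids LIMIT 10000;
    EXIT WHEN l_rids.COUNT = 0;
    FORALL i IN 1 .. l_rids.COUNT
      UPDATE t1
         SET c1 = 'STATIC_VALUE'     -- placeholder for the real static value / two columns
       WHERE ROWID = l_rids(i);
    COMMIT;                          -- per-batch commit keeps undo small, but note that fetching
                                     -- across commits for hours risks ORA-01555 on the cursor
  END LOOP;
  CLOSE c_rows;
END;
/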
I have an app that reads records from a driver table in order to update another (account) table. The idea is that it runs quickly but does not impact other processes on the system. In total I have around 1M records to update.
Originally I wanted to BULK COLLECT with a LIMIT of 1000 and use a FORALL. The problem with this approach is that I make two updates: one to the target table, but I also need to update the driver table to show that I have processed the record. Therefore I cannot commit each time I reach the LIMIT by putting a COMMIT inside the loop.
So instead I have a FOR LOOP and test the count; when it reaches 1000 I want to commit. I thought this syntax would be fine, but I get a 'fetch out of sequence' error. Below is a copy of the code. Is this being caused by the double update? The format of the code looks correct to me (though it is late!).
PROCEDURE update_set IS
  CURSOR cur_get_recs IS
    SELECT account_num, ttw_active_flag, acct_rowid, rowid driver_rowid
      FROM driver_table
       FOR UPDATE OF processed;
[code]...
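The 'fetch out of sequence' error (ORA-01002) is almost certainly the COMMIT, not the double update: committing releases the locks taken by the FOR UPDATE cursor and invalidates it, so the next fetch fails. One workaround is to drop FOR UPDATE, carry the rowids in collections, and update both tables by rowid with FORALL, committing after each batch. A sketch, with the account-side table name, its update, and the 'Y' flag value assumed:

DECLARE
  CURSOR cur_get_recs IS
    SELECT account_num, ttw_active_flag, acct_rowid, ROWID driver_rowid
      FROM driver_table;                             -- no FOR UPDATE
  TYPE t_num_tab  IS TABLE OF driver_table.account_num%TYPE;
  TYPE t_flag_tab IS TABLE OF driver_table.ttw_active_flag%TYPE;
  TYPE t_acct_tab IS TABLE OF driver_table.acct_rowid%TYPE;
  TYPE t_rid_tab  IS TABLE OF ROWID;
  l_acct_num  t_num_tab;
  l_flags     t_flag_tab;
  l_acct_rids t_acct_tab;
  l_drv_rids  t_rid_tab;
BEGIN
  OPEN cur_get_recs;
  LOOP
    FETCH cur_get_recs BULK COLLECT
      INTO l_acct_num, l_flags, l_acct_rids, l_drv_rids LIMIT 1000;
    EXIT WHEN l_acct_num.COUNT = 0;

    FORALL i IN 1 .. l_acct_num.COUNT
      UPDATE account_table                           -- placeholder name for the target table
         SET ttw_active_flag = l_flags(i)            -- placeholder for the real account update
       WHERE ROWID = l_acct_rids(i);

    FORALL i IN 1 .. l_acct_num.COUNT
      UPDATE driver_table
         SET processed = 'Y'                         -- column from FOR UPDATE OF; value assumed
       WHERE ROWID = l_drv_rids(i);

    COMMIT;                                          -- safe: no FOR UPDATE cursor is held open
  END LOOP;
  CLOSE cur_get_recs;
END;
/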
I am trying to update a table's column values only when a change has actually occurred, using BULK COLLECT and FORALL ... UPDATE, but I am not able to work out how. Below is the procedure I am working from; it does the insert and update using cursors.
CREATE OR REPLACE PROCEDURE PRC_INS(P_ID IN NUMBER, P_STAT OUT NUMBER) IS
  TYPE T_TEST_TAB IS TABLE OF T_DTLS%ROWTYPE;
  V_PARAM  T_TEST_TAB;
  V_STATUS NUMBER;
  V_BUS    VARCHAR2(20);
  V_UP     VARCHAR2(1);
  V_Q      VARCHAR2(50);
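One way to apply updates only where something actually changed is to compare the old and new values in the WHERE clause of the FORALL update, so unchanged rows are simply skipped. A sketch; the key column, the refreshed column, and the staging source are all assumptions since T_DTLS is not shown:

DECLARE
  TYPE t_id_tab     IS TABLE OF NUMBER;          -- hypothetical key column type
  TYPE t_status_tab IS TABLE OF VARCHAR2(20);    -- hypothetical column being refreshed
  l_ids      t_id_tab;
  l_statuses t_status_tab;
BEGIN
  SELECT id, status                              -- hypothetical columns
    BULK COLLECT INTO l_ids, l_statuses
    FROM t_dtls_stage;                           -- placeholder source of the new values

  FORALL i IN 1 .. l_ids.COUNT
    UPDATE t_dtls d
       SET d.status = l_statuses(i)
     WHERE d.id = l_ids(i)
       AND (d.status IS NULL OR d.status <> l_statuses(i));  -- touch only rows that changed
END;
/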
Create a table with 100 records. Then write a BULK COLLECT cursor that reduces the number of context switches by using a bulk fetch to query the records in sets, or all at once.
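A self-contained version of that exercise might look like this (all names are invented for the demo):

CREATE TABLE demo_src AS
  SELECT LEVEL AS id, 'row ' || LEVEL AS val
    FROM dual
  CONNECT BY LEVEL <= 100;

DECLARE
  TYPE t_src_tab IS TABLE OF demo_src%ROWTYPE;
  l_rows t_src_tab;
BEGIN
  -- all at once: a single context switch instead of 100 row-by-row fetches
  SELECT * BULK COLLECT INTO l_rows FROM demo_src;
  dbms_output.put_line('fetched ' || l_rows.COUNT || ' rows in one round trip');

  -- in sets: the shape you would use if the table were large
  DECLARE
    CURSOR c IS SELECT * FROM demo_src;
  BEGIN
    OPEN c;
    LOOP
      FETCH c BULK COLLECT INTO l_rows LIMIT 25;
      EXIT WHEN l_rows.COUNT = 0;
      dbms_output.put_line('batch of ' || l_rows.COUNT);
    END LOOP;
    CLOSE c;
  END;
END;
/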
I need a procedure that gets the records from the table for every SKU in the XML message and takes one record per SKU based on an order of precedence, then sends the resulting set of records to the .NET code. The order of precedence is:
If MODEL is not null, that record gets the higher precedence and the maximum LEAD_TIME is taken. If MODEL and VK_UNIT are NULL, then precedence goes to STATE_ID.
The parameters of the procedure are:

CREATE OR REPLACE PROCEDURE sku_proc (
  p_bu_id      IN number,
  p_model      IN varchar2,
  p_vk_unit    IN XMLTYPE,
  p_country_id IN varchar2,
[code].......
Here is the query that gets the data from the table based on the parameters.
Assume that, based on that query, we got the following records:
BU_ID  MODEL  VK_UNIT    COUNTRY_ID  STATE_ID  LEAD_TIME  FC
0      M123   210-39348  AB          0         20         A
0      M123   210-39348  AB          0         30         B
0      NULL   210-39348  AB          0         10         C
0      M123   405-12132  AB          0         10         A
0      NULL   340-30904  AB          0         30         C
0      M123   340-30904  AB          0         20         B
0      M123   340-30904  AB          0         10         A
0      NULL   403-10890  AB          0         10         B
0      M123   403-10890  AB          0         20         A
0      M123   709-10007  AB          0         10         B
0      NULL   NULL       AB          0         20         B
0      NULL   NULL       AB          1         30         A
The final query has to return the following result to the OUT parameters p_Lead_Time and p_FC of the procedure:
LEAD_TIME  FC
30         B
10         A
20         B
20         A
10         B
30         A
How can I implement the above requirement using BULK COLLECT?
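A rough sketch of one way to do it in a single pass: rank the rows per SKU with ROW_NUMBER and bulk collect the top row of each partition into the OUT collections. The precedence expression, the tie-break for the rows where both MODEL and VK_UNIT are NULL, the collection types for the OUT parameters, the table name, and the handling of the XML SKU list are all assumptions here, not taken from the elided code:

CREATE OR REPLACE PROCEDURE sku_proc (
  p_bu_id      IN  NUMBER,
  p_country_id IN  VARCHAR2,
  p_lead_time  OUT SYS.ODCINUMBERLIST,      -- assumed collection types for the OUT params
  p_fc         OUT SYS.ODCIVARCHAR2LIST)
IS
BEGIN
  SELECT lead_time, fc
    BULK COLLECT INTO p_lead_time, p_fc
    FROM (SELECT lead_time, fc,
                 ROW_NUMBER() OVER (
                   PARTITION BY vk_unit
                   ORDER BY CASE WHEN model IS NOT NULL THEN 0 ELSE 1 END,  -- MODEL rows win
                            lead_time DESC,                                 -- then max lead time
                            state_id  DESC) rn                              -- guessed tie-break
            FROM sku_table                                                  -- placeholder table
           WHERE bu_id      = p_bu_id
             AND country_id = p_country_id)
   WHERE rn = 1;
END sku_proc;
/

Against the sample rows above, this ordering picks the six expected (LEAD_TIME, FC) pairs, but the rule for the rows where MODEL and VK_UNIT are both NULL should be double-checked against the real requirement.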
A table contains 10k records and we are inserting the data into another table with BULK COLLECT / FORALL and a limit of 1000. If I use a limit of 10000 it completes faster than with a limit of 1000. Can you tell me which limit is better?
PROCEDURE Return_Summary(Pi_WX IN dbms_sql.varchar2_table, WX OUT SYS_REFCURSOR) IS
Begin
  FOR i IN 1 .. Pi_WX.count LOOP
    /* I need to put these results into a temp table or a table object. Can I use a temp table
       for this, or is there any other recommended method? The loop might execute a maximum of
       10 times, and each run might return 100-200 records. */
    select WX_NM, WX_NUM
      from TAB A, TAB B, TAB C
     where A.KEY1 = B.KEY1
       and B.KEY1 = C.KEY1
       and C.WX = Pi_WX(i);
  End Loop;
End;
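If the goal is simply to hand all of the results back at once, one option is to skip the loop entirely: switch the input collection to a SQL-level nested table type so it can be queried with TABLE(), and open the ref cursor over a single query that joins to it. A sketch; the type name, the join columns, and which table owns WX_NM/WX_NUM are assumptions:

CREATE OR REPLACE TYPE wx_list IS TABLE OF VARCHAR2(100);
/
CREATE OR REPLACE PROCEDURE return_summary (
  pi_wx     IN  wx_list,
  po_result OUT SYS_REFCURSOR)
IS
BEGIN
  OPEN po_result FOR
    SELECT a.wx_nm, b.wx_num
      FROM tab_a a, tab_b b, tab_c c, TABLE(pi_wx) w
     WHERE a.key1 = b.key1
       AND b.key1 = c.key1
       AND c.wx   = w.column_value;   -- all input values handled in one pass
END return_summary;
/

If the rows really do need to be staged first, a global temporary table (ON COMMIT DELETE ROWS) populated inside the loop and then read into the ref cursor is the other common choice for 1000-2000 rows.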
I'm calculating a Z score based on some simple numerical data thus:
create table t (id number, val number);
insert into t values(1, 1795);
insert into t values(2, 1753);
insert into t values(3, 1743);
insert into t values(4, 1876);
insert into t values(5, 1848);
[Code] .....
The logic is quite simple: calculate a moving average over the previous 12 rows, and a stdev over the same window, then subtract the prior row's moving average from the current value and divide by the prior row's stdev.
The issue is that I want to expose this logic in a BI tool (OBI EE 10g), which means I can't use nested analytic functions. How can I achieve this logic in a single analytic pass? The SQL above took about 2 minutes to write this morning, but I've then spent all day looking at user-defined aggregate functions and haven't even been able to get the first step, the moving average, working. I understand that I can probably create a UDAF to replicate the avg(val) over (order by id ROWS BETWEEN 11 PRECEDING AND 0 FOLLOWING) functionality, but I can't see how to bundle the logic for the other three steps of the calculation into it.
From what I've read, ODCIAggregateMerge should allow me to combine different threads that can return the different values I need for the current-row calculation. Is this correct?
The only example UDAFs I can find are either not relevant (STRAGG) or very simple (i.e. they don't appear to invoke multiple passes). I've also had a look at the COLLECT function, but again I can't see a way to use it.
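One way to avoid the nesting entirely, so the whole thing stays a single analytic pass: the prior row's 12-row moving average and stdev are just the same functions computed over a window that ends one row earlier, i.e. ROWS BETWEEN 12 PRECEDING AND 1 PRECEDING relative to the current row. A sketch against the t table above (the NULLIF only guards against a zero stdev on the earliest rows):

SELECT id,
       val,
       (val - AVG(val)      OVER (ORDER BY id ROWS BETWEEN 12 PRECEDING AND 1 PRECEDING))
       / NULLIF(STDDEV(val) OVER (ORDER BY id ROWS BETWEEN 12 PRECEDING AND 1 PRECEDING), 0)
         AS z_score
  FROM t
 ORDER BY id;

Because there is no nesting, the expression can usually be pasted into the BI tool as-is (or wrapped in a view) without needing a user-defined aggregate at all.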