SQL & PL/SQL :: Bulk Insert In Index Table
Jul 8, 2013

Why is bulk insert not possible on a table which has an index?
I have a table emp_test as shown below:
SELECT * FROM EMP_TEST;

ENO  EFIRSTNAME  ESECONDNAME  DEPTNO
---  ----------  -----------  ------
  1  JOHN        PAI              10
  2  ABC         DEF              20
  3  EFG         GHI              30
The primary key on this table is pk_emp_test_eno, on eno.
I have a requirement to load a large volume of dummy data (600,000,000 rows) into emp_test based on the existing data, without disabling the constraints (the unique constraint must hold for every record), and I want to commit after every 1,000 insertions.
I need help with bulk inserting this dummy data, because the inserts are taking a very long time.
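For what it's worth, one common shape for this is a BULK COLLECT loop with LIMIT 1000 feeding a FORALL insert, generating keys by offsetting from the current maximum eno. This is only a sketch: the row-generator query and the dummy column values are assumptions for illustration.

DECLARE
  CURSOR c_gen IS
    SELECT ROWNUM AS rn FROM dual CONNECT BY LEVEL <= 600000000;  -- row generator
  TYPE t_num_tab IS TABLE OF NUMBER;
  l_rns     t_num_tab;
  l_max_eno NUMBER;
BEGIN
  SELECT NVL(MAX(eno), 0) INTO l_max_eno FROM emp_test;
  OPEN c_gen;
  LOOP
    FETCH c_gen BULK COLLECT INTO l_rns LIMIT 1000;    -- 1,000 rows per batch
    EXIT WHEN l_rns.COUNT = 0;
    FORALL i IN 1 .. l_rns.COUNT
      INSERT INTO emp_test (eno, efirstname, esecondname, deptno)
      VALUES (l_max_eno + l_rns(i),                    -- offset keeps the PK unique
              'FN_' || l_rns(i),                       -- dummy names (assumption)
              'SN_' || l_rns(i),
              MOD(l_rns(i), 3) * 10 + 10);             -- cycles through 10, 20, 30
    COMMIT;                                            -- commit every 1,000 rows
  END LOOP;
  CLOSE c_gen;
END;
/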
When using BULK COLLECT for the insert into the table, it raises the error below.
ORA-00600: internal error code, arguments: [25027], [130], [1], [], [], [], [], [], [], [], [], []
The table contains 10k records, and we are inserting them into another table using FORALL after a BULK COLLECT with LIMIT 1000. If I use a limit of 10000, it completes faster than with a limit of 1000. Which LIMIT value is better?
Is it possible to use a VARCHAR2(14) index for a PL/SQL table while fetching bulk data with BULK COLLECT? My requirement is to index a PL/SQL table by VARCHAR2(14) so that I can reach a record directly by its key and process it further.
I also need to run a loop over it (to process each record). My database version is 11.2.0.1.0.
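As far as I know, BULK COLLECT cannot fetch directly into an associative array indexed by VARCHAR2; the usual workaround is to bulk-fetch into a nested table and then re-key it. A minimal sketch, assuming the standard emp table and keying on ename:

DECLARE
  TYPE t_emp_list IS TABLE OF emp%ROWTYPE;                        -- fetch target
  TYPE t_emp_map  IS TABLE OF emp%ROWTYPE INDEX BY VARCHAR2(14);  -- keyed store
  l_list t_emp_list;
  l_map  t_emp_map;
  l_key  VARCHAR2(14);
BEGIN
  SELECT * BULK COLLECT INTO l_list FROM emp;    -- bulk fetch into nested table
  FOR i IN 1 .. l_list.COUNT LOOP
    l_map(l_list(i).ename) := l_list(i);         -- re-key by a VARCHAR2 column
  END LOOP;
  -- direct access by key: l_map('KING')
  -- loop over every record by key:
  l_key := l_map.FIRST;
  WHILE l_key IS NOT NULL LOOP
    -- process l_map(l_key) here
    l_key := l_map.NEXT(l_key);
  END LOOP;
END;
/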
I made two runs of the bulk insertion.
In the first run, 1,636,897 rows were inserted successfully in around 38 seconds. In the second run, 5,462,952 records had to be inserted, but the process failed after 708 seconds with the following error:
Error report:
ORA-04030: out of process memory when trying to allocate 980248 bytes (PLS non-lib hp,DARWIN)
ORA-06512: at line 21
04030. 00000 - "out of process memory when trying to allocate %s bytes (%s,%s)"
*Cause: Operating system process private memory has been exhausted
*Action:
Here is my code snippet:
.......
FORALL i in products_tab.first .. products_tab.last
INSERT INTO tab1 VALUES products_tab(i);
COMMIT;
.........
I would have expected this to complete without any problem.
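ORA-04030 in this pattern usually means the entire source set was BULK COLLECTed into products_tab in one go, exhausting process (PGA) memory on the larger run; fetching in LIMIT-sized chunks keeps memory bounded. A sketch under that assumption (products_src is a hypothetical source table):

DECLARE
  CURSOR c_src IS SELECT * FROM products_src;    -- hypothetical source table
  TYPE t_tab IS TABLE OF c_src%ROWTYPE;
  products_tab t_tab;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO products_tab LIMIT 10000;  -- bounded memory
    EXIT WHEN products_tab.COUNT = 0;
    FORALL i IN 1 .. products_tab.COUNT
      INSERT INTO tab1 VALUES products_tab(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/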
We have a requirement to create an INSERT script from a table holding thousands of records and write it to a flat file. Are there better options than the UTL_FILE package, which processes record by record?
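One lighter-weight option is to let SQL*Plus do the writing: build each INSERT statement as a string in a query and spool the output, so no PL/SQL loop is needed. A sketch, assuming a simple two-column table:

SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 1000 TRIMSPOOL ON
SPOOL emp_inserts.sql
SELECT 'INSERT INTO emp (empid, name) VALUES ('
       || empid || ', ''' || REPLACE(name, '''', '''''') || ''');'
  FROM emp;
SPOOL OFF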
We are doing a bulk select and insert (10,000 rows processed in each transaction). If one record fails, the entire transaction is rolled back. We then fix the error and re-run, and the process is repeated until all errors are fixed.
How can we capture all the errors in a single run?
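FORALL ... SAVE EXCEPTIONS is built for this: it continues past failing rows and reports every failure at the end of the statement via SQL%BULK_EXCEPTIONS. A minimal sketch (source_tab and target_tab are hypothetical names, assumed to share the same structure):

DECLARE
  CURSOR c IS SELECT * FROM source_tab;          -- hypothetical source
  TYPE t_rows IS TABLE OF c%ROWTYPE;
  l_rows t_rows;
  bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errors, -24381);    -- ORA-24381: array DML errors
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rows LIMIT 10000;
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO target_tab VALUES l_rows(i);
    EXCEPTION
      WHEN bulk_errors THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE('Failed row '
            || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ': '
            || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
    COMMIT;    -- good rows are kept; only the failing rows are skipped
  END LOOP;
  CLOSE c;
END;
/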
I want to know which is the most efficient insert method among the following:
1) using the APPEND hint, or
2) BULK COLLECT.
Preferably, I'd like to know which insert method is best for, say, 2-5 million rows.
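For a set-based load of millions of rows, a single direct-path INSERT ... SELECT generally beats a PL/SQL BULK COLLECT loop, because no data has to pass through PL/SQL at all. A sketch with hypothetical table names:

-- Direct-path insert: writes above the high-water mark and bypasses
-- the buffer cache; the table cannot be read again until you COMMIT.
INSERT /*+ APPEND */ INTO target_tab
SELECT * FROM source_tab;
COMMIT;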
I used BULK COLLECT and FORALL statements to select the data and insert it into a temp table. The SELECT SQL returns one row, but that row is not inserted into the temp table, and no exception is thrown. I used a ref cursor because the select statement varies for each cursor.
Here I have modified the code and provided only one cursor.
Create Or Replace Procedure Sales_Hist_Update_Bkp Is
Type Type_Name Is Record(
Sku_Item_Key Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Sku_Item_Key%Type,
Locationno Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Locationno%Type,
Bsns_Unit_Key Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Bsns_Unit_Key%Type,
Act_Item_Cost_Amt Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Item_Cost_Amt%Type,
Act_Rglr_Unit_Price_Amt Ordm_Int.Dwi_Rtl_Sls_Retrn_Line_Bkp.Rglr_Unit_Price_Amt%Type,
[code]...
Is there any recommended record-count range for choosing between the following ways of bulk insert:
INSERT INTO ABCTEMP SELECT * FROM DEFTEMP;
or
a cursor with bulk fetch and bulk insert inside a loop?
SQL> declare
       TYPE id_collection is TABLE of number(6);
       TYPE ename_collection is TABLE of varchar2(20);
       id ID_COLLECTION;
       ename ENAME_COLLECTION;
       cursor c is select empid,name from Nemp;
     begin
       open c;
       loop
[code]....
Here sub_Nemp is my new table, into which I have to insert the values from the old Nemp table. Both tables have the same structure, as below:
SQL> create table sub_Nemp(empid number(6),name varchar2(20));
I'm unable to find the cause of this error...
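Since the body of the loop is elided above, this is only an assumption, but the classic bug in this pattern is testing c%NOTFOUND before processing the last, partially filled batch. A sketch of the usual correct shape:

declare
  TYPE id_collection is TABLE of number(6);
  TYPE ename_collection is TABLE of varchar2(20);
  id    ID_COLLECTION;
  ename ENAME_COLLECTION;
  cursor c is select empid, name from Nemp;
begin
  open c;
  loop
    fetch c bulk collect into id, ename limit 1000;
    exit when id.count = 0;      -- not c%notfound: a short final fetch
                                 -- still returns rows that must be processed
    forall i in 1 .. id.count
      insert into sub_Nemp (empid, name) values (id(i), ename(i));
  end loop;
  close c;
  commit;
end;
/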
I have an array of C structs, say:
struct S
{
    int a;
    long b;
    double c;
};
struct S s[100];
Further suppose I have an Oracle table T like this:
create table T
(
a number(10),
b number(10),
c float
);
I want to bulk insert all 100 instances of S from a client application into T. I've seen code that does this for *one* field or column. The code defines a stored procedure which accepts a single argument which is a TABLE and then does a FORALL ... insert. The client application passes in the array of data.
What I need is N columns. In my example above struct S has N=3 fields which conform to the N=3 columns in T. In reality my N will be 50+. I am trying to avoid creating stored procedures which will take the 50 or so arguments it will eventually need.
So does my stored procedure need to accept N TABLE arguments? Or can I cajole OCI/OTL/ODBC and PL/SQL so that the stored procedure can take an array of rows whose row type conforms to T, by defining a record or something? That is, do I need:
Option 1: // declares one type and one argument each for N cols
create or replace procedure insert_S(
a_array IN A_TABLE, -- type A_TABLE is TABLE of number;
b_array IN B_TABLE, -- type B_TABLE is TABLE of number;
c_array IN C_TABLE) -- type C_TABLE is ...
begin ... end
Option 2: // this somehow accepts an array compatible with T
// if I could get a OCI/OCCI/OTL/ODBC application
// to send this data, this procedure would have
// only one argument
create or replace procedure insert_S(
row_array IN ?????????? type -- some sort of array of rows
)
begin ... end
Or should I pass the whole memory chunk of data in as an image or varchar array -- basically an opaque block of data -- and then decipher/decode the memory block inside the stored procedure, as discussed at [URL].
In short: what is the best way to pass an array of N C structs of M fields to a stored procedure for insertion into a table with M compatible columns? One TABLE argument per column? An array of a custom type compatible with a row in T? A glob of data? Another option is to populate host variables... but, again, I'd need N host variables.
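Option 2 is workable: define a SQL object type matching T plus a collection of it, and the procedure then needs only one argument; client APIs that can bind Oracle collections (OCI/OCCI, ODP.NET) send the whole array in one round trip. A sketch, where the type names are my own assumptions:

CREATE OR REPLACE TYPE t_row AS OBJECT (a NUMBER(10), b NUMBER(10), c FLOAT);
/
CREATE OR REPLACE TYPE t_row_tab AS TABLE OF t_row;
/
CREATE OR REPLACE PROCEDURE insert_S (row_array IN t_row_tab) IS
BEGIN
  -- one collection argument covers all N columns
  FORALL i IN 1 .. row_array.COUNT
    INSERT INTO T (a, b, c)
    VALUES (row_array(i).a, row_array(i).b, row_array(i).c);
END insert_S;
/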
I am facing a problem with a bulk insert using a SELECT statement. My SQL statement is like the one below.
strQuery :='INSERT INTO TAB3
(SELECT t1.c1,t2.c2
FROM TAB1 t1, TAB2 t2
WHERE t1.c1 = t2.c1
AND t1.c3 between 10 and 15 AND)' ....... some other conditions.
EXECUTE IMMEDIATE strQuery ... These SQL statements are inside a procedure, and the procedure is called from C#. The number of rows returned by the SELECT query is 70.
On the very first call of this procedure, the number of rows inserted by strQuery is 70. But on the next call (in the same transaction), the number of rows inserted is only 50. If we keep calling this procedure, it sometimes inserts 70 rows and sometimes 50; the behavior is inconsistent. On initial analysis I found that the default optimizer mode is ALL_ROWS; when I changed the optimizer mode to RULE, the issue did not occur. I am using Oracle 10g R2.
I am working on Oracle 11g. I have one normal insert procedure:
CREATE OR REPLACE PROCEDURE test2
AS
BEGIN
INSERT INTO first_table
(citiversion, financialcollectionid,
dataitemid, dataitemvalue,
[code]....
I am processing 100,000 (1 lakh) rows. Why is BULK COLLECT taking more time? To my knowledge it should take less time. Do I need to check any parameter?
Let's consider a table whose rows all fit into a single block:
SQL> create table test as select rownum id, '$'||rownum name from dual connect by level <= 530;
Table created.
SQL> create index i_test on test(id);
Index created.
SQL>
SQL> begin
[code].....
Why does the full-scan approach take longer even though the table occupies only one data block? PS: 11gR2.
I have a huge table (about 60 GB) partitioned by range. The index on this table is a global index created on 4 columns together. I have a query which runs very slowly, and the explain plan shows the use of this global index. The explain plan does not show Pstart and Pstop because the index is global.
I am using a temporary table.
PROCEDURE Return_Summary(Pi_WX IN dbms_sql.varchar2_table,
                         Po_WX OUT SYS_REFCURSOR) IS
Begin
  FOR i IN 1 .. Pi_WX.count LOOP
  /* I need to put these results in a temp table or table object. Can I use a temp table for this, or is there another recommended method? The loop might execute at most 10 times, and each run might return 100-200 records. */
select WX_NM,
WX_NUM
from TAB A, TAB B, TAB C
where A.KEY1 = B.KEY1
and B.KEY1 = C.KEY1
and C.WX = Pi_WX(i);
End Loop;
End;
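A global temporary table fits this well: insert into it inside the loop, then open the ref cursor over it; the rows are private to the session and disappear when it ends. A sketch with hypothetical object names based on the snippet above:

CREATE GLOBAL TEMPORARY TABLE wx_results (
  wx_nm  VARCHAR2(100),
  wx_num NUMBER
) ON COMMIT PRESERVE ROWS;    -- rows live for the session, not the transaction

CREATE OR REPLACE PROCEDURE Return_Summary (
  Pi_WX  IN  dbms_sql.varchar2_table,
  Po_WX  OUT SYS_REFCURSOR
) IS
BEGIN
  DELETE FROM wx_results;     -- clear leftovers from earlier calls this session
  FOR i IN 1 .. Pi_WX.COUNT LOOP
    INSERT INTO wx_results (wx_nm, wx_num)
      SELECT a.wx_nm, a.wx_num
        FROM tab_a a
        JOIN tab_b b ON a.key1 = b.key1
        JOIN tab_c c ON b.key1 = c.key1
       WHERE c.wx = Pi_WX(i);
  END LOOP;
  OPEN Po_WX FOR SELECT wx_nm, wx_num FROM wx_results;
END;
/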
How can I force an index to be used if the table is not using it?
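The usual tool is the INDEX optimizer hint, naming the table alias and the index. A sketch with hypothetical names:

SELECT /*+ INDEX(e emp_name_idx) */ *
  FROM emp e
 WHERE e.name = 'JOHN';   -- emp_name_idx is a hypothetical index on emp(name)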
I want to insert bulk records into the table: date rows for the next 50 years (from year 2001 to year 2050). I have the following columns in my table:
YYYYMMDD, MM/DD/YYYY, Day of the week (Monday, Tuesday, etc.), JulianDate
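A CONNECT BY row generator fills such a calendar table in a single statement. A sketch, assuming the table is named calendar_dim with those four columns:

INSERT INTO calendar_dim (yyyymmdd, date_mdy, day_of_week, julian_date)
SELECT TO_CHAR(d, 'YYYYMMDD'),
       TO_CHAR(d, 'MM/DD/YYYY'),
       TO_CHAR(d, 'fmDay'),       -- day name without trailing blanks
       TO_CHAR(d, 'J')            -- Julian day number
  FROM (SELECT DATE '2001-01-01' + LEVEL - 1 AS d
          FROM dual
        CONNECT BY DATE '2001-01-01' + LEVEL - 1 <= DATE '2050-12-31');
COMMIT;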
I am trying to delete 3 million records from a huge table which already holds 3 billion records.
This is hurting DB performance and halting other activities of my users. Is there any easy way to delete such data quickly? I have tried a FORALL delete, but even that takes a lot of time.
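Deleting in limited batches with a commit between them keeps undo small and gives other sessions room to breathe. A sketch, where the table name and the date predicate identifying the doomed rows are hypothetical:

BEGIN
  LOOP
    DELETE FROM big_tab
     WHERE created_dt < DATE '2010-01-01'   -- hypothetical predicate
       AND ROWNUM <= 50000;                 -- batch size
    EXIT WHEN SQL%ROWCOUNT = 0;
    COMMIT;                                 -- release undo between batches
  END LOOP;
  COMMIT;
END;
/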
1. How can I impose a restriction on a table so that its data can be updated only during a specific period of time, say 9 a.m. to 10 p.m.?
2. Can I use BULK COLLECT in dynamic SQL? If yes, how? (See the sketch below.)
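On point 2: yes, both EXECUTE IMMEDIATE and dynamically opened ref cursors support BULK COLLECT. A minimal sketch against the standard emp table:

DECLARE
  TYPE t_name_tab IS TABLE OF VARCHAR2(30);
  l_names t_name_tab;
BEGIN
  EXECUTE IMMEDIATE
    'SELECT ename FROM emp WHERE deptno = :d'
    BULK COLLECT INTO l_names
    USING 10;                                 -- bind variable for deptno
  DBMS_OUTPUT.PUT_LINE(l_names.COUNT || ' rows fetched');
END;
/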
We deleted millions of records from a table.
1. Is it necessary to reorganize the table and index after the deletion? I ask because I see some change in table size after reorganizing the table and index.
2. Will reorganizing the table and index improve database performance?
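Reclaiming space after a mass delete is typically done with an online segment shrink, or with a move plus index rebuild during a maintenance window. A sketch (big_tab and big_tab_idx are hypothetical names):

ALTER TABLE big_tab ENABLE ROW MOVEMENT;
ALTER TABLE big_tab SHRINK SPACE CASCADE;  -- also shrinks dependent indexes
-- or, in a maintenance window:
-- ALTER TABLE big_tab MOVE;
-- ALTER INDEX big_tab_idx REBUILD;        -- indexes go UNUSABLE after a MOVE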
I have 1M records coming from an external data source as a flat file (using ETL). I need to load only yesterday's data into my database table.
I believe this can be done using a bulk load and a filter.
How do I write the code?
Second part:
Hint: I need to update only the records that have actually been updated. Say the Address1 field was changed; then that record needs to be updated in my master customer table.
If the table has many fields, how do I identify which incoming records (coming from the external data source as a flat file) have been modified, and update those records in my master customer table?
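Change detection against a master table is commonly done with MERGE, comparing the columns that may change using a null-safe test so only genuinely modified rows are touched. A sketch with hypothetical staging and master tables:

MERGE INTO master_customer m
USING stage_customer s
   ON (m.customer_id = s.customer_id)
WHEN MATCHED THEN UPDATE
   SET m.address1 = s.address1,
       m.address2 = s.address2
 WHERE DECODE(m.address1, s.address1, 0, 1) = 1   -- null-safe "changed?" test
    OR DECODE(m.address2, s.address2, 0, 1) = 1
WHEN NOT MATCHED THEN
  INSERT (customer_id, address1, address2)
  VALUES (s.customer_id, s.address1, s.address2);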
When I execute the following statement:
SELECT DISTINCT EXPOSURE_REF FROM KBNAS.VW_EXPOSUREDETS_FOR_CCYREVAL
WHERE EXPOSURE_CURRENCY='THB' AND BASE_TXN_CCY='USD' AND BRANCH_CODE='7000'
AND (REVAL_STATUS='O') AND CONV_RATE<>'62' AND (EXPOSURE_AMOUNT<>0)
UNION
SELECT DISTINCT ED.EXPOSURE_REF FROM KBNAS.EXPOSURE_DETAILS ED,
[code].....
I have attached the DDL for the partitioned table EXPOSURE_DETAIL, for LEDGERCARD and LEDGERCARDDETAILS, the DDL for the indexes on those tables, and the DDL for the views.
Issue: we have created the indexes, but when we check the explain plan, a full table scan is still happening. I have attached the explain plan.
We need to load data from an index-by table into a table. The code below is working fine.
declare
query varchar2(200);
Type l_emp is TABLE OF emp%rowtype INDEX BY Binary_Integer;
rec_1 l_emp;
begin
[Code]....
But the source and target tables are dynamic. In the code above, the source table emp and the target table emp_b are static; in our scenario the target depends on the source, as below: if the source is emp, the target is emp_b; if the source is emp1, the target is emp_b1; and so on.
create or replace procedure p(source in varchar2, target in varchar2)
as
query varchar2(200);
source varchar2(200);
Type l_emp is TABLE OF emp%rowtype INDEX BY Binary_Integer;
rec_1 l_emp;
[Code]....
It's throwing an error. How can I implement this scenario?
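Because %ROWTYPE is resolved at compile time, the whole block has to become dynamic once the table names arrive as parameters. One sketch, assuming the source and target always have identical structure (DBMS_ASSERT guards the concatenated names against injection):

CREATE OR REPLACE PROCEDURE copy_rows (source IN VARCHAR2, target IN VARCHAR2)
AS
  plsql_block VARCHAR2(1000);
BEGIN
  plsql_block :=
    'DECLARE
       TYPE l_tab IS TABLE OF ' || DBMS_ASSERT.simple_sql_name(source)
                                || '%ROWTYPE INDEX BY BINARY_INTEGER;
       rec_1 l_tab;
     BEGIN
       SELECT * BULK COLLECT INTO rec_1
         FROM ' || DBMS_ASSERT.simple_sql_name(source) || ';
       IF rec_1.COUNT > 0 THEN
         FORALL i IN 1 .. rec_1.COUNT
           INSERT INTO ' || DBMS_ASSERT.simple_sql_name(target)
                         || ' VALUES rec_1(i);
       END IF;
     END;';
  EXECUTE IMMEDIATE plsql_block;
END copy_rows;
/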
A primary key constraint violation on transaction_dtl_bk is blocking the insertion of the subsequent correct rows.
CREATE OR REPLACE PROCEDURE NP_DB.san_po_nt_wnpg_1 (
dt DATE
)
IS
v_sql_error VARCHAR2 (100); -- added by sanjiv
v_sqlcode VARCHAR2 (100); ---- added by sanjiv added by sanjiv
[code]...
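DML error logging lets the valid rows go in while the duplicates are diverted to an error table instead of killing the statement. A sketch (transaction_dtl as the source table is an assumption):

-- one-time setup: create the error log table (defaults to ERR$_<table>)
BEGIN
  DBMS_ERRLOG.create_error_log(dml_table_name => 'TRANSACTION_DTL_BK');
END;
/

INSERT INTO transaction_dtl_bk
SELECT * FROM transaction_dtl            -- hypothetical source
LOG ERRORS INTO err$_transaction_dtl_bk ('nightly load')
REJECT LIMIT UNLIMITED;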
Oracle 11g. I have a large table of 125 million records, t3_universe. This table never gets updated or altered once loaded; it holds data that we receive from a lead company. I need to select records from this large table that fit certain demographic criteria and insert them into a smaller table, T3_Leads, which will then be updated with when each lead was mailed and other relevant information. How should I select records from this 125-million-record table for insertion into the smaller table?
I have tried a variety of things - views, materialized views, direct insert into the smaller table... I think I am probably missing other approaches. My current attempt has been to create a view using the query that selects the records, as shown below, and then a second query that inserts into T3_Leads from this view V_Market. This is very slow. Can I just use an INSERT INTO T3_Leads with this query? It did not seem to work with the WITH clause. My index on the large table is t3_universe_composite and includes zip_code, address_key, household_key.
CREATE VIEW V_Market AS
WITH got_pairs AS (
  SELECT /*+ INDEX_FFS(t3_universe t3_universe_composite) */
         l.zip_code, l.zip_plus_4, l.p1_givenname, l.surname, l.address,
         l.city, l.state, l.household_key, l.hh_type AS l_hh_type,
         l.address_key, l.narrowband_income, l.p1_ms, l.p1_gender,
         l.p1_exact_age, l.p1_personkey, e.hh_type AS filler_data,
         l.p1_seq_no, l.p2_seq_no,
         ROW_NUMBER() OVER (PARTITION BY l.address_key
                            ORDER BY l.hh_verification_date DESC) AS r_num
    FROM t3_universe e
    JOIN t3_universe l
      ON l.address_key = e.address_key
     AND l.zip_code = e.zip_code
     AND l.p1_gender != e.p1_gender
[code]....
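For what it's worth, INSERT ... SELECT does accept a WITH clause directly, so the view is not needed. A trimmed sketch that carries only three of the columns (extend the column lists to match the real ones):

INSERT /*+ APPEND */ INTO T3_Leads (zip_code, address_key, household_key)
WITH got_pairs AS (
  SELECT l.zip_code, l.address_key, l.household_key,
         ROW_NUMBER() OVER (PARTITION BY l.address_key
                            ORDER BY l.hh_verification_date DESC) AS r_num
    FROM t3_universe e
    JOIN t3_universe l
      ON l.address_key = e.address_key
     AND l.zip_code   = e.zip_code
     AND l.p1_gender != e.p1_gender
)
SELECT zip_code, address_key, household_key
  FROM got_pairs
 WHERE r_num = 1;      -- keep the most recently verified row per address
COMMIT;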
I have a table with a BLOB column.
I want to read data from that table and insert it into another table using a cursor.
My code is:
procedure read_data is
cursor get_data is
select id,image from picture1;
id1 number;
pic blob;
begin
open get_data;
[code]....
When I run the form, error FRM-40734 occurs.
The error is at the line with "fetch ....".
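Client-side Forms PL/SQL has long had trouble fetching LOB columns into local variables; a common workaround is to move the copy into a server-side stored procedure and have the form simply call it. A sketch of the server side, assuming a target table picture2 with the same shape as picture1:

CREATE OR REPLACE PROCEDURE copy_pictures IS
BEGIN
  -- set-based copy on the server; no explicit cursor or LOB handling needed
  INSERT INTO picture2 (id, image)
    SELECT id, image FROM picture1;
  COMMIT;
END copy_pictures;
/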
My scenario: I need to insert a record into a history table whenever it is updated in a tabular form - that is, insert the updated record along with the additional columns Action_By, Action_Type (Update or Delete), and Action_Date into the history table. The history table has all the columns of the main table shown in the tabular form, plus these additional columns: Action_By, Action_Type, and Action_Date.
I don't want to create a before/after update trigger on the base table; rather, I would like to create a generic procedure which inserts the updated record into the history table, taking the page alias and page ID as parameters (a generic procedure meaning one that applies to all the tabular forms (tables) contained in the application).
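A generic version has to build its INSERT dynamically, since the table name arrives as a parameter. A sketch that copies the affected row into a history table named <base>_HIST, with the three audit columns assumed to sit at the end of that table (all names here are assumptions):

CREATE OR REPLACE PROCEDURE log_history (
  p_table  IN VARCHAR2,   -- base table name, e.g. 'EMP'
  p_pk_col IN VARCHAR2,   -- primary key column, e.g. 'EMPNO'
  p_pk_val IN VARCHAR2,   -- key value of the row being changed
  p_action IN VARCHAR2,   -- 'UPDATE' or 'DELETE'
  p_user   IN VARCHAR2    -- who did it
) IS
BEGIN
  EXECUTE IMMEDIATE
    'INSERT INTO ' || DBMS_ASSERT.simple_sql_name(p_table) || '_HIST'
    || ' SELECT t.*, :who, :what, SYSDATE FROM '
    || DBMS_ASSERT.simple_sql_name(p_table) || ' t WHERE '
    || DBMS_ASSERT.simple_sql_name(p_pk_col) || ' = :pk'
    USING p_user, p_action, p_pk_val;
END log_history;
/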