SQL & PL/SQL :: Insertion / Data Loading To A Table
Apr 10, 2012
We are frequently getting the error below from the application while inserting/loading data into a table. The error is on the primary key index.
Error: 'ORA-01502: index 'INDEX_NAME' or partition of such index is in unusable state'.
I set SKIP_UNUSABLE_INDEXES = TRUE using the command 'ALTER SYSTEM SET SKIP_UNUSABLE_INDEXES = TRUE' to avoid this, but we keep getting the same error, and every time I am rebuilding the index ('alter index INDEX_NAME rebuild') and then doing the DML operation.
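For reference, a minimal sketch of the two commands involved (INDEX_NAME is a placeholder). Note that, per the Oracle documentation, SKIP_UNUSABLE_INDEXES does not suppress errors for unique indexes such as one enforcing a primary key, which would explain why the setting has no effect here:

-- takes effect for subsequent DML, but is ignored for unique (e.g. primary key) indexes
ALTER SYSTEM SET SKIP_UNUSABLE_INDEXES = TRUE;

-- the unusable primary key index still has to be rebuilt before DML can succeed
ALTER INDEX index_name REBUILD;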
Would using a blob-based schema load noticeably faster than a binary_double-based schema?
Blob scenario:
Load 1 row: 5 columns (1 integer column, 4 blob columns) of size X

versus

Double scenario:
Load 10,000 rows: 5 columns (1 integer column, 4 binary_double columns) of size X
While the benefit of the rows approach is obviously the ability to query the values, I'd like a quick answer concerning loading/insertion performance. Associative array binding is used for loading from a .NET client. Also, would the answer hold true for 200 columns instead of just 5?
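For clarity, a hedged sketch of the two table shapes being compared (table and column names are hypothetical):

-- Blob scenario: one wide row carrying the whole data set
CREATE TABLE load_blob (
  id NUMBER(10),
  c1 BLOB, c2 BLOB, c3 BLOB, c4 BLOB
);

-- Double scenario: 10,000 narrow rows
CREATE TABLE load_double (
  id NUMBER(10),
  c1 BINARY_DOUBLE, c2 BINARY_DOUBLE, c3 BINARY_DOUBLE, c4 BINARY_DOUBLE
);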
I need to insert data from a table in one schema into a table in another schema in the same database. The problem is that the column counts are not equal, so when I try to use an insert statement it throws the error 'not enough values'. The situation is explained clearly below. The insert statement is run in the second schema, whose table is named b.
Table created.

SQL> ALTER TABLE B ADD
  2  CONSTRAINT B_PK1
  3  PRIMARY KEY
  4  (ID);

Table altered.

SQL> create sequence b_seg start with 1;

Sequence created.

SQL> insert into b select b_seg.nextval, lexcom.a.* from lexcom.a, dual;
insert into b select b_seg.nextval, lexcom.a.* from lexcom.a, dual
*
ERROR at line 1:
ORA-00947: not enough values
So, for table b, the sequence needs to go into the ID column and the other columns need to be taken from table a. I can understand the error arises because the two tables do not have the same number of columns, so the insert statement fails.

I can write the insert statement manually, listing the columns of a and b as follows, but this is a tedious process.
SQL> insert into b (ID, Name, rollno, address)
  2  select b_seg.nextval, lexcom.a.Name, lexcom.a.rollno, lexcom.a.address
  3  from lexcom.a, dual;

3 rows created.
But this is time-consuming, and I have tables with many more columns to insert. Is there any other way to solve this and write the insert statement?
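One possible shortcut, as a sketch: generate the column list from the data dictionary instead of typing it. This assumes Oracle 11g Release 2 or later for LISTAGG, and that b's remaining columns carry the same names as a's columns:

-- Build the full insert statement from ALL_TAB_COLUMNS, then run the generated text.
SELECT 'insert into b (id, '
       || LISTAGG(column_name, ', ') WITHIN GROUP (ORDER BY column_id)
       || ') select b_seg.nextval, '
       || LISTAGG('lexcom.a.' || column_name, ', ') WITHIN GROUP (ORDER BY column_id)
       || ' from lexcom.a' AS generated_sql
FROM all_tab_columns
WHERE owner = 'LEXCOM'
  AND table_name = 'A';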
I have 1M records coming from an external data source as a flat file (via ETL). I need to load only yesterday's data into my database table.
This can be done using a bulk load and a filter. How should the code be written?
Second part:

Suppose I need to update only those records that have actually been updated; say the Address1 field has changed. Those records need to be updated in my Master Customer table.

If the table has many fields, and some of the records coming to me from the external data source as a flat file have been modified, how do I identify those records and update them in my Master Customer table?
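A minimal sketch of one common approach, assuming the flat file is first bulk-loaded into a staging table; STG_CUSTOMER, CUSTOMER_MASTER, and all column names here are hypothetical:

MERGE INTO customer_master m
USING (
        -- filter: only yesterday's rows from the staged flat-file data
        SELECT customer_id, address1
        FROM   stg_customer
        WHERE  load_date >= TRUNC(SYSDATE) - 1
        AND    load_date <  TRUNC(SYSDATE)
      ) s
ON (m.customer_id = s.customer_id)
WHEN MATCHED THEN UPDATE
  SET m.address1 = s.address1
  -- update only rows whose value really changed; DECODE treats two NULLs as equal
  WHERE DECODE(m.address1, s.address1, 0, 1) = 1
WHEN NOT MATCHED THEN
  INSERT (customer_id, address1)
  VALUES (s.customer_id, s.address1);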
I was trying to load data from XML files into an Oracle database table. I followed the steps below to load the file data into a table. I created XML_DIR1 as an Oracle directory where I have kept all the XML files.
CREATE TABLE import_rpt_xml OF XMLTYPE
  XMLTYPE STORE AS BINARY XML;

INSERT INTO import_rpt_xml
VALUES (XMLTYPE(BFILENAME('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'),
                NLS_CHARSET_ID('AL32UTF8')));
This insert fails with the error below:

Error starting at line 80 in command:
insert into import_rpt_xml values (xmltype(bfilename('XML_DIR1',
'I-Yamanouchi-20040525-501.SGM'), nls_charset_id('AL32UTF8')))

Error report:
SQL Error: ORA-31061: XDB error: XML event error
ORA-19202: Error occurred in XML processing
In line 69 of orastream:
LPX-00217: invalid character 142 (U+008E)

I looked into my XML and found that it contains some Japanese characters.
How do I deal with the Japanese characters in the XML? I don't want to lose those characters. My database NLS_CHARACTERSET is 'AL32UTF8'.
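One thing worth checking, as a hedged sketch: LPX-00217 usually means the bytes in the file do not match the character set named in NLS_CHARSET_ID. If the file is actually encoded in Shift-JIS (an assumption; verify the file's real encoding), naming that character set lets XMLType convert the content to the database character set:

INSERT INTO import_rpt_xml
VALUES (XMLTYPE(BFILENAME('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'),
                NLS_CHARSET_ID('JA16SJIS')));  -- assumption: source file is Shift-JIS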
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "badfile": expecting one of: "column, enclosed,
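This KUP-01005 wording suggests the BADFILE clause sits after the FIELDS clause in the access parameters. A hedged sketch of the order ORACLE_LOADER expects; the table, directory, and file names here are hypothetical:

CREATE TABLE ext_demo (
  col1 VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    BADFILE data_dir:'ext_demo.bad'  -- BADFILE must appear before the FIELDS clause
    FIELDS TERMINATED BY ','
  )
  LOCATION ('demo.dat')
);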
I'm not sure if this is so much a SQL*Loader problem as a database-understanding problem, but here it is. I am having trouble loading data into a table (using SQL*Loader) because I am trying to load data row by row into corresponding columns.
TestFile.csv
testvalue1, 123445
testvalue2, test
testvalue3, 455321
testvalue4, 65742
testvalue5, 5719
So, using the above data, I am trying to load the value for 'testvalue1' into a column named 'testvalue1', the value for 'testvalue2' into a column named 'testvalue2', and so on. From my understanding, SQL*Loader loads by column, not by row, so I am not even sure this is possible.
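A minimal sketch of one workaround: load the file as-is into a two-column staging table with SQL*Loader, then pivot it into a single row. The staging table stg_keyvalue with columns key_name/key_value, and target_table, are hypothetical names:

INSERT INTO target_table (testvalue1, testvalue2, testvalue3, testvalue4, testvalue5)
SELECT MAX(DECODE(key_name, 'testvalue1', key_value)),
       MAX(DECODE(key_name, 'testvalue2', key_value)),
       MAX(DECODE(key_name, 'testvalue3', key_value)),
       MAX(DECODE(key_name, 'testvalue4', key_value)),
       MAX(DECODE(key_name, 'testvalue5', key_value))
FROM stg_keyvalue;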
How do I load CLOB data into a table? In the attached file, column 18 holds CLOB data that appears to contain newlines. How can I load it using an external table? I tried, but it isn't working.
When I load data into the PERSONS table through SQL*Loader, it runs successfully without errors, but when I check the PERSONS table it shows zero records. Following are the details of what I did.
Here are the details of the data file:

1   Ahmed Baraka   1000   1.87   1-1-2000
2   John Rice      5000   2.4    10-5-1998
3   Emme Rak       2500   2.34
4   King Size      2700
5   Small Size     3000          31-3-2001
And the control file:

OPTIONS (ERRORS=0)
LOAD DATA
INFILE '/oraeng/app/oracle/product/10.2.0/dbs/persons.dat'
BADFILE '/oraeng/app/oracle/product/10.2.0/dbs/persons.bad'
DISCARDFILE '/oraeng/app/oracle/product/10.2.0/dbs/persons.dsc'
INTO TABLE "KAILAS"."PERSONS"
REPLACE
FIELDS TERMINATED BY X'9'
TRAILING NULLCOLS
I have to upload data into a table from a CSV file. The table's primary key I have to fill myself; the rest comes from the user's uploaded file. Is it possible to load only the required columns of the table from the file and fill the other columns from the backend? Or is there any other way to do this?
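A hedged control-file sketch; the table, sequence, and column names are hypothetical. SQL*Loader itself can fill columns that are absent from the CSV, using a SQL expression, SYSDATE, or a constant:

LOAD DATA
INFILE 'upload.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
( id         "my_seq.NEXTVAL",  -- primary key filled from a sequence, not the file
  name,                         -- taken from the file
  created_on SYSDATE,           -- filled at load time
  status     CONSTANT 'NEW'     -- fixed backend value
)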
While importing a dump into the new database, errors occurred. Below are the errors:
ORA-02374: conversion error loading table "INS"."GENMST_FINANCIER_BRANCH"
ORA-12899: value too large for column TXT_IFSC_CODE (actual: 19, maximum: 15)
ORA-02372: data for row: TXT_IFSC_CODE : 0X'4644524C30303031353739A0A0A0A0'
...
I would like to know why such errors occurred during the import.
The issue is slow insertion into one particular table (table A). Insertion into all the other tables (B, C, D) in the same schema works fine; only inserts into table A take a long time to complete. Daily insertion volume is 6,000 rows.

I have checked all the usual details: tablespace size, table statistics, index statistics, and so on. There are no errors in the alert log file.
I am using Oracle Forms Builder 6i and Oracle Database 10g.
1: I have a table named 'info' with a column 'InfoId' and some others, and another table named 'Handing' with columns HtId, Value1, and Value2.

2: I made a form that consists of three data blocks: the first block takes criteria and the second block displays the records from table 'info' matching that criteria.

3: I want checkboxes against the displayed records, so that when I select some checkboxes against 'InfoId', those selected 'InfoIds' are saved into the 'Handing' table's 'HtId' column, and in the same table the Value1 and Value2 columns are filled from text boxes in the third data block of the same form. (A sketch of one way to do this follows.)
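A hedged sketch of a WHEN-BUTTON-PRESSED trigger that could do the save; the block, item, and checkbox names are assumptions:

-- Walk the display block and insert every checked row into HANDING.
GO_BLOCK('info_block');
FIRST_RECORD;
LOOP
  IF :info_block.chk_selected = 'Y' THEN  -- checkbox item with checked value 'Y'
    INSERT INTO handing (htid, value1, value2)
    VALUES (:info_block.infoid, :entry_block.value1, :entry_block.value2);
  END IF;
  EXIT WHEN :SYSTEM.LAST_RECORD = 'TRUE';
  NEXT_RECORD;
END LOOP;
COMMIT_FORM;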
How do I avoid junk character insertion into an Oracle table? I have prepared scripts containing text like, say:
customer - info
After insertion, the data appears like this in production:
Customer ¿ info
We use the command prompt for script execution in the production environment. I use PL/SQL Developer and SQL Developer for development. I cannot see the junk data in PL/SQL Developer or the latest SQL Developer, but it is caught by an old version of SQL Developer, and in the application I can also see the junk data.
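A hedged guess worth testing: the command-prompt client's NLS_LANG may not match the encoding the script file was saved in, so the non-ASCII character is mangled on the way in. A sketch of setting it before running the script; the value shown is an assumption and must match the script file's actual encoding:

REM NLS_LANG must match the encoding the script file was saved in (assumed UTF-8 here)
set NLS_LANG=AMERICAN_AMERICA.AL32UTF8
sqlplus user/password@db @insert_script.sql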
We have a requirement that whenever a stored procedure is executed, its resultant records have to be stored in an Excel file (just like a report). No third-party tools or reporting tools may be used.

Is there any option in Oracle (stored procedure or built-in packages) which can create an Excel file with the resultant records?
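As far as I know the standard packages cannot write a native .xls file, but a CSV file opens directly in Excel. A minimal sketch using UTL_FILE; the directory object REPORT_DIR and the EMP query are assumptions:

CREATE OR REPLACE PROCEDURE dump_emp_csv IS
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  -- REPORT_DIR must be an Oracle directory object the procedure owner can write to
  v_file := UTL_FILE.FOPEN('REPORT_DIR', 'report.csv', 'W');
  UTL_FILE.PUT_LINE(v_file, 'EMPNO,ENAME,SAL');  -- header row
  FOR r IN (SELECT empno, ename, sal FROM emp) LOOP
    UTL_FILE.PUT_LINE(v_file, r.empno || ',' || r.ename || ',' || r.sal);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
END;
/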
I have a requirement like this: .dat files are received and placed in my local directory, and there needs to be a process/method by which the data in those .dat files can be read through Oracle external tables. Any link to an example with clear steps?
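A hedged sketch; the directory path, table, and column names are hypothetical. An external table simply points at the .dat file through a directory object, so no separate insert step is needed to query the data:

CREATE OR REPLACE DIRECTORY dat_dir AS '/path/to/local/dir';

CREATE TABLE ext_dat (
  col1 VARCHAR2(50),
  col2 NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dat_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('input.dat')
);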
Is it possible to do a SQL*Loader load into a VARRAY table having two inner objects?

Find below the structure of the table and its types for reference.
CREATE OR REPLACE TYPE OB_TEST_INFO AS OBJECT (
  AGE  NUMBER(3),
  NAME VARCHAR2(14)
);
/

CREATE OR REPLACE TYPE OB_TEST_INFO_VARRAY AS VARRAY(400) OF OB_TEST_INFO;
/

CREATE OR REPLACE TYPE OB_TEST_MAINTENANCE AS OBJECT (
  BREAKOUT_TYPE        NUMBER(1),
  EXISTING_STRIPS_FLAG NUMBER(1),
  OB_TEST_INFO         OB_TEST_INFO_VARRAY
);
/
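SQL*Loader can load VARRAY columns of objects. A hedged control-file sketch modeled on the documented VARRAY examples, assuming TEST_MAINT is an object table of OB_TEST_MAINTENANCE and that the element and array terminators match the data file (all assumptions):

LOAD DATA
INFILE 'test_maint.dat'
INTO TABLE test_maint
REPLACE
FIELDS TERMINATED BY ','
(
  breakout_type,
  existing_strips_flag,
  -- each VARRAY element is itself an object with AGE and NAME attributes
  ob_test_info VARRAY TERMINATED BY ':'
  (
    ob_test_info COLUMN OBJECT
    (
      age  INTEGER EXTERNAL(3),
      name CHAR(14)
    )
  )
)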
I am trying to load data into a table in a remote database schema, and my files reside on another remote server. The server holding the files does not have a database installed. I just need to know whether this is possible or not.
Source and target DB version: 10.2.0.4.0. Source OS: CentOS 5.4; target OS: SunOS 5.10.
We are loading data from the source DB to the target DB using a dblink. The source DB has 15.4 crore (154 million) records of NUMBER and VARCHAR2 data types.

We are using logic like the following:

insert into table1 select * from table1@dblinkname;

The DB link is working: if I run "select * from scott.REPORT@DBLINK" in the target DB on its own, I can retrieve records.
Actual query:
Create or replace procedure test_abcd as
begin
  insert into test select * from scott.REPORT@DBLINK;
  dbms_output.put_line('Hello world');
end;
/
When I run the above in SQL*Plus, it hangs. When I look at the wait events, I find:
"Wait Event: SQL*Net more data from dblink"
How do I get this working? When I checked with the network team, they said there is no issue in the network. Do we need to modify any database- or network-level parameter settings? There is no firewall between the source and target DB.
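A hedged diagnostic sketch: 'SQL*Net more data from dblink' only says the session is waiting on rows arriving over the link, so with 154 million rows a long-running transfer can look like a hang. Pulling a bounded slice first helps separate a throughput problem from a plain large transfer; the row limit is arbitrary:

-- Time how long a known row count takes, then extrapolate to the full table.
INSERT /*+ APPEND */ INTO test
SELECT * FROM scott.REPORT@DBLINK
WHERE ROWNUM <= 100000;
COMMIT;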
We completed creating a replica of dB_01 as dB_02 (housed on a single DEV server), but dB_02 has only the table structures (no records). What would be our fastest option (tools, commands, etc.) to load more than a thousand new records into each of the 20 tables of dB_02?
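A hedged sketch of one straightforward option: since the structures already match, a database link from dB_02 back to dB_01 reduces each table to a single statement. The link name, credentials, and TNS alias below are assumptions:

CREATE DATABASE LINK db01_link
  CONNECT TO source_user IDENTIFIED BY source_password
  USING 'DB01';  -- assumed TNS alias for dB_01

-- repeat once per table; a few thousand rows per table copies quickly this way
INSERT INTO some_table SELECT * FROM some_table@db01_link;
COMMIT;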
I am using Oracle XE (11g) with APEX 4.1.1.0023 and Glassfish for the APEX Listener.I created a Data Upload set of pages and things worked great. I then exported the whole application and imported it into a new environment that was the same except the schema name was different. It was a different owner. I then tested the data upload in the new schema/environment and could not get the data loading to recognize the table. Upon comparison of the Shared Components between the two environments I discovered that the imported application in the new environment was still looking for the original schema name. The name is not editable via the Shared Components page. I had to recreate the pages and have it create a new Data Loading object before things worked again.
I have a requirement to load past and future data from different source tables into one target table. Say we are now in July:
Past data: Apr, May, Jun
Future data: Aug, Sep, Oct
I have actual sales for the past months and the present month in table A, and expected sales in table B. Every month I have to load 7 months of data (3 past + current + 3 future) into the target table.

In future the requirement may change to 6 months; in that case the procedure has to load the past 6 months + the current month + 6 future months, so 13 months in total should be loaded. (A parameterized sketch follows.)
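A minimal sketch driven by a window-size parameter, so that moving from 3 to 6 months needs no code change. The table and column names, and the full-refresh strategy, are assumptions:

CREATE OR REPLACE PROCEDURE load_sales_window (p_months IN PLS_INTEGER DEFAULT 3) AS
BEGIN
  DELETE FROM target_sales;  -- assumption: the target is fully refreshed each run

  -- past p_months plus the current month come from actuals (table A)
  INSERT INTO target_sales (sales_month, amount)
  SELECT sales_month, amount
  FROM   table_a
  WHERE  sales_month >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -p_months)
  AND    sales_month <  ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1);

  -- the next p_months come from expected sales (table B)
  INSERT INTO target_sales (sales_month, amount)
  SELECT sales_month, amount
  FROM   table_b
  WHERE  sales_month >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1)
  AND    sales_month <  ADD_MONTHS(TRUNC(SYSDATE, 'MM'), p_months + 1);

  COMMIT;
END;
/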
SQL> desc stg_query_overflow
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 HOSTNAME                                           VARCHAR2(50)
 NPSID                                              NUMBER
 NPSINSTANCEID                                      NUMBER
 OPID                                               NUMBER
...
Here's my control file:
load data
infile '/u01/tony/server_name/query_overflow.dat'
badfile '/opt/oracle/tony/sql_dir/bad/server_name_query_overflow.bad'
discardfile '/opt/oracle/tony/sql_dir/discard/server_name_query_overflow.dsc'
append
into table stg_query_overflow
...
Here's a sample of data that I can't load into the table via sqlldr:
Record 272: Rejected - Error on table STG_QUERY_OVERFLOW, column NPSID.
ORA-01722: invalid number
Record 273: Rejected - Error on table STG_QUERY_OVERFLOW, column NPSID.
ORA-01722: invalid number
As you can see, sqlldr is interpreting this vertical (multi-line) SQL text as the NPSID column, when in fact it belongs to the QUERYTEXT column. How can I load each record when some of my data is in this vertical format?
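A hedged sketch of one fix: if each logical record ends with a known terminator string (the pipe character here is an assumption about the extract format), the "str" clause on INFILE makes sqlldr treat that string, rather than the newline, as the record separator, so QUERYTEXT can span physical lines. The column list is abbreviated to the columns shown above plus QUERYTEXT:

load data
infile '/u01/tony/server_name/query_overflow.dat' "str '|\n'"
append
into table stg_query_overflow
fields terminated by ','
trailing nullcols
( hostname,
  npsid,
  npsinstanceid,
  opid,
  querytext char(4000)  -- the multi-line SQL text lands here
)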