SQL & PL/SQL :: Insert From Large String Into CLOB
Jun 7, 2012
I'm looking for a way to insert strings larger than 40,000 characters into a CLOB field without getting "ORA-01461: can bind a LONG value only for insert into a LONG column".
Something like this:
insert into MyClobTable(ID,Data) values ('101','A string containing more than 40000 characters...')
The problem is that a Java application concatenates the string from an MSSQL DB, so the string is not stored in my Oracle DB first. As far as I'm aware, this means I can't chop my string into pieces and use DECLARE to put the pieces in variables, right?
Below is an example I found, but I don't think I can apply it to my case, correct?
SQL> CREATE TABLE myClob
2 (id NUMBER PRIMARY KEY,
3 clob_data CLOB);
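The usual way around ORA-01461 is to not pass the text as one giant string literal or VARCHAR2 bind at all, but to bind a CLOB. From Java, PreparedStatement.setCharacterStream (or setClob) does exactly that and avoids the error entirely. Below is a minimal PL/SQL sketch of the same idea against the myClob table above; the repeated RPAD chunks just stand in for the real 40,000+ character data:

DECLARE
  l_clob CLOB;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  -- build the value piece by piece; each piece may be up to 32767 characters
  FOR i IN 1 .. 10 LOOP
    DBMS_LOB.WRITEAPPEND(l_clob, 8000, RPAD('x', 8000, 'x'));
  END LOOP;
  INSERT INTO myClob (id, clob_data) VALUES (101, l_clob);
  COMMIT;
  DBMS_LOB.FREETEMPORARY(l_clob);
END;
/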
I am facing a problem with utl_http.write_text in my PL/SQL application. My requirement is to write data larger than 32K, so I used a CLOB variable in write_text. But it still raises a numeric or value error when the data is larger than 8K.
I have read that chunked transfer encoding will work, but I couldn't find out how this is done.
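The chunked approach that usually gets past these limits looks roughly like the sketch below: send a Transfer-Encoding: chunked header (so no Content-Length is needed up front) and write the CLOB in pieces small enough for WRITE_TEXT. The URL and content type here are placeholders, and l_clob is assumed to already hold the payload:

DECLARE
  l_req  UTL_HTTP.REQ;
  l_resp UTL_HTTP.RESP;
  l_clob CLOB;                    -- assume this holds the >32K payload
  l_amt  PLS_INTEGER := 8000;     -- chunk size well under the VARCHAR2 limit
  l_off  PLS_INTEGER := 1;
  l_len  PLS_INTEGER;
BEGIN
  l_req := UTL_HTTP.BEGIN_REQUEST('http://example.com/service', 'POST');
  UTL_HTTP.SET_HEADER(l_req, 'Content-Type', 'text/xml');
  UTL_HTTP.SET_HEADER(l_req, 'Transfer-Encoding', 'chunked');
  l_len := DBMS_LOB.GETLENGTH(l_clob);
  -- write the CLOB one chunk at a time instead of in a single call
  WHILE l_off <= l_len LOOP
    UTL_HTTP.WRITE_TEXT(l_req, DBMS_LOB.SUBSTR(l_clob, l_amt, l_off));
    l_off := l_off + l_amt;
  END LOOP;
  l_resp := UTL_HTTP.GET_RESPONSE(l_req);
  UTL_HTTP.END_RESPONSE(l_resp);
END;
/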
I have created a function that splits a comma-separated string and gives the output in tabular form. Here is the function.
I have used CLOB here because my input string will be huge (greater than the maximum VARCHAR2 limit).
CREATE OR REPLACE TYPE SPLIT_TBL_CLOB AS TABLE OF CLOB;

CREATE OR REPLACE FUNCTION CSVTOSTRING_CLOB
(
  P_LIST CLOB,
  P_DEL  VARCHAR2 := ','
) RETURN SPLIT_TBL_CLOB PIPELINED
[code]....
But here I am facing 2 problems.
1. The function is not accepting a large string, and I am getting an error.
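For what it's worth, one plausible cause: if the function is called with a quoted literal, the literal itself is capped at 4000 characters in SQL (ORA-01704), no matter how the function is written; binding the CLOB avoids that. Below is a minimal sketch of a splitter that stays in CLOB territory end to end, using the names from the post. It relies on the built-in INSTR and SUBSTR accepting CLOBs, with SUBSTR then returning a CLOB, so no element is capped at 32K:

CREATE OR REPLACE FUNCTION CSVTOSTRING_CLOB
(
  P_LIST CLOB,
  P_DEL  VARCHAR2 := ','
) RETURN SPLIT_TBL_CLOB PIPELINED
IS
  l_start PLS_INTEGER := 1;
  l_pos   PLS_INTEGER;
BEGIN
  LOOP
    l_pos := INSTR(p_list, p_del, l_start);
    EXIT WHEN NVL(l_pos, 0) = 0;
    -- pipe everything between the previous delimiter and this one
    PIPE ROW (SUBSTR(p_list, l_start, l_pos - l_start));
    l_start := l_pos + LENGTH(p_del);
  END LOOP;
  PIPE ROW (SUBSTR(p_list, l_start));   -- the last (or only) element
  RETURN;
END;
/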
I have encountered some problems in SQL. I want to create a table with a bunch of prepared data. For ease of use, I chose to generate a SQL file which contains all the SQL statements used to create the table and insert the data, so all the data can only be inserted into the table via SQL statements.
My questions: 1) If a column's data is large (for example, 1 MB of text), how do I insert it using SQL? Is there a piecewise method? 2) How can I insert BLOB data using a SQL statement?
What I want is to enclose all the operations in a single SQL file, and when the table is needed, just execute this SQL file.
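For question 1, the standard piecewise pattern in a script is to insert an empty LOB and then grow it with DBMS_LOB in one or more follow-up blocks; for question 2, binary chunks can be written the same way via HEXTORAW. A minimal sketch, with table t and its columns as placeholder names:

INSERT INTO t (id, txt, bin) VALUES (1, EMPTY_CLOB(), EMPTY_BLOB());

DECLARE
  l_c CLOB;
  l_b BLOB;
BEGIN
  -- lock the row and grab valid LOB locators to append to
  SELECT txt, bin INTO l_c, l_b FROM t WHERE id = 1 FOR UPDATE;
  -- repeat these appends as many times as the script needs, one chunk each
  DBMS_LOB.WRITEAPPEND(l_c, 10, '1234567890');
  DBMS_LOB.WRITEAPPEND(l_b, 3, HEXTORAW('DEADBE'));
  COMMIT;
END;
/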
How can I insert XML into a CLOB? I'm taking a CLOB from the database and turning it into XMLTYPE, and when I try to insert the XMLTYPE into a CLOB field it raises an error:
Error(316,43): PL/SQL: ORA-00932: inconsistent datatypes: expected NUMBER got -
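ORA-00932 here usually just means the XMLTYPE is being handed to the CLOB column as-is. Serializing it first with getClobVal() lines the types up; a one-line sketch, assuming l_xml is the XMLTYPE variable and my_table/clob_col are placeholder names:

INSERT INTO my_table (clob_col) VALUES (l_xml.getClobVal());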
I am trying to insert complex HTML into a CLOB value in an Oracle table and keep running into various issues. The errors I get include 0734, 0042 and 0552 (by those numbers, likely SQL*Plus SP2- errors rather than ORA- errors). The HTML I am inserting includes characters such as &, ', and *.
Here is an example of the SQL I am trying to run:
set DEFINE off'
update table set message = '<HTML>TEST!@#$%</HTML>';
commit;
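Two things stand out in that script: the stray quote after off (which by itself will confuse SQL*Plus), and the & in the HTML, which SQL*Plus treats as a substitution variable unless SET DEFINE OFF is in effect. Embedded single quotes are easiest to handle with the q-quote syntax. A cleaned-up sketch, with mytable as a placeholder name:

SET DEFINE OFF
UPDATE mytable SET message = q'[<HTML>TEST!@#$% & it's fine</HTML>]';
COMMIT;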
I need to pass large data into one of the tables where the column is declared as CLOB. Before that, I was testing with the sample code below, which throws an error.
I was trying to insert the CLOB data into the table as below.
create table t1 ( x number, y clob );

insert into t1(x) values (2);

declare
  l_clob t1.y%type;
[code].....
The error I am getting is:
ORA-06502: PL/SQL: numeric or value error: invalid LOB locator specified: ORA-22275
ORA-06512: at "SYS.DBMS_LOB", line 833
ORA-06512: at line 161
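ORA-22275 generally means the CLOB variable was never given a valid locator: declaring l_clob and then calling DBMS_LOB against it isn't enough, and the row inserted with only x populated has a NULL y. One common fix, sketched against the t1 table above, is to insert an empty LOB and capture its locator with RETURNING before writing to it:

DECLARE
  l_clob t1.y%TYPE;
BEGIN
  INSERT INTO t1 (x, y) VALUES (2, EMPTY_CLOB())
  RETURNING y INTO l_clob;              -- l_clob is now a valid locator
  DBMS_LOB.WRITEAPPEND(l_clob, 5, 'hello');
  COMMIT;
END;
/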
I have a problem when I try to insert a large character string of nearly 100,000 characters (HTML code) into a CLOB column of my test table: I get the error "ORA-01704: string literal too long". I did not understand why the CLOB column is not storing this data.
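ORA-01704 is the 4000-character cap on any single string literal in SQL, not a CLOB limitation. One common workaround (a sketch, with placeholder names and pieces) is to start the concatenation with TO_CLOB() so that each individual literal stays under 4000 characters while the combined value can run to 100,000 and beyond:

INSERT INTO myclobtable (id, data)
VALUES (101, TO_CLOB('first piece, up to 4000 chars')
             || 'second piece'
             || 'third piece, and so on');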
I have a table with no primary key constraints, with some rows containing NULL values/duplicates. I then decided to alter the table to add a composite primary key constraint on four columns (a, b, c, and d). I did this by using the same script that was used to create the original table, but this time adding the NOT NULL constraints.
I then took an export of the original table. I now want to import the data into the newly created table, but I am getting the error: ORA-01400: cannot insert NULL into (string).
I would like to perform the import without the NULLs. Is there a parameter in impdp that I can use? I tried DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS but it didn't work.
Besides the impdp options, is there a way to do an insert statement like this: insert into table a (select * from table) excluding NULL?
Basically, I need to load the data into the newly created table without the NULLs.
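If impdp doesn't cooperate, a plain INSERT ... SELECT with explicit NOT NULL filters does the same job. A sketch, assuming the four key columns really are a, b, c and d, and that old_table/new_table stand in for the real names (duplicates across those columns would still need de-duplicating separately before the PK will accept them):

INSERT INTO new_table
SELECT *
  FROM old_table
 WHERE a IS NOT NULL
   AND b IS NOT NULL
   AND c IS NOT NULL
   AND d IS NOT NULL;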
I am trying to insert a concatenated string into my table. For every trap whose TrapNum is 12 characters long, the new TrapNum should be built like this:
For example: 26001005CC45 = 260001005CC0045, 08060027RF05 = 080600027RF0005, and so on.
-- the expression can be applied directly, row by row; wrapping it in a
-- scalar subquery on Trap fails with ORA-01427 once the table has more
-- than one row
update trap
   set TrapNum = trim(both from to_char(substr(TrapNum,1,4),'0000')) ||
                 trim(both from to_char(substr(TrapNum,5,1),'00'))   ||
                 trim(both from to_char(substr(TrapNum,6,3),'000'))  ||
                 substr(TrapNum,9,2)                                 ||
                 trim(both from to_char(substr(TrapNum,11,2),'0000'))
 where length(TrapNum) = 12;
I have three tables, ot_cut_head, ot_cut_det and om_mc_master, based on which a fourth table, ot_cut_opr, and a fifth table, ot_cut_mc, must be populated. The conditions are as follows.
First, the selection is filtered on job_no in ot_cut_head: if the job number is like '%M' then type MISC is chosen, and if it is like '%G' then type GRAT is picked from om_mc_master (the machine master), and the operations and machines are filtered on that basis.
Second, each cd_ps_desc is taken from ot_cut_det and compared with om_mc_master to get its corresponding operation codes and machine codes; there can be two operations or one.
Finally, if a match is found, records are inserted into ot_cut_opr and ot_cut_mc based on these criteria. I want the search criteria to be flexible: if there are two operations, two rows are inserted, and if only one operation is defined in om_mc_master, only one record is inserted.
We also have to make sure the stage is populated based on the operation number: if it is the first operation the stage is 1, and if it is the second operation the stage is 2. The previous operation depends on this as well: the second operation has the first operation as its previous operation, and so on (see the sketch after the sample data below).
CREATE TABLE om_mc_master (
  mc_type     VARCHAR2(12),
  mc_prof     VARCHAR2(30),
  mc_prep_cd1 VARCHAR2(30),
  mc_mach_cd1 VARCHAR2(30),
  mc_prep_cd2 VARCHAR2(30),
  mc_mach_cd2 VARCHAR2(30)
);

INSERT INTO om_mc_master VALUES ('MISC','TEE SCH','IR','HO','RE','HO');
INSERT INTO om_mc_master VALUES ('MISC','Vertical Brace','R','HM','I','HO');
INSERT INTO om_mc_master VALUES ('MISC','Pipe','IR','HO',NULL,NULL);
INSERT INTO om_mc_master VALUES ('GRAT','PL','RE','HO',NULL,NULL);

SQL> SELECT * FROM OM_MC_MASTER;
[code]....
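Since the columns of ot_cut_opr and ot_cut_mc aren't shown in the post, here is only a sketch of the unpivot step: one row per defined operation, with the stage derived from the operation slot and the first operation carried along as the previous operation of the second. The join between ot_cut_head and ot_cut_det on job_no is a guess, and the INSERT targets are left out:

SELECT h.job_no, d.cd_ps_desc, 1 AS stage,
       m.mc_prep_cd1 AS op_cd, m.mc_mach_cd1 AS mc_cd,
       NULL AS prev_op
  FROM ot_cut_head h
  JOIN ot_cut_det  d ON d.job_no = h.job_no   -- hypothetical join key
  JOIN om_mc_master m
    ON m.mc_prof = d.cd_ps_desc
   AND m.mc_type = CASE WHEN h.job_no LIKE '%M' THEN 'MISC'
                        WHEN h.job_no LIKE '%G' THEN 'GRAT' END
UNION ALL
SELECT h.job_no, d.cd_ps_desc, 2,
       m.mc_prep_cd2, m.mc_mach_cd2,
       m.mc_prep_cd1                           -- first operation becomes the previous one
  FROM ot_cut_head h
  JOIN ot_cut_det  d ON d.job_no = h.job_no
  JOIN om_mc_master m
    ON m.mc_prof = d.cd_ps_desc
   AND m.mc_type = CASE WHEN h.job_no LIKE '%M' THEN 'MISC'
                        WHEN h.job_no LIKE '%G' THEN 'GRAT' END
 WHERE m.mc_prep_cd2 IS NOT NULL;              -- second row only when a second operation exists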
I'm facing a problem even after using the INSTR function in Oracle. I have written logic in a PL/SQL block that appends each value fetched in a loop, based on whether the string is already present or not.
For example:
The first value fetched from the select query is ABCDEFG, which gets appended to a variable. The next value fetched is AB; this also has to be appended to the variable, since it doesn't exactly match ABCDEFG. The next value fetched is BCDEF; this too has to be appended, since it doesn't exactly match ABCDEFG. The next value fetched is ABCDEFG; this will not get appended, which is correct according to the present logic.
I need help writing the piece of code that appends a fetched value only when it doesn't exactly match the existing string (a sketch follows).
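One way to make the match exact rather than substring-based is to wrap both the accumulated list and the candidate in delimiters before calling INSTR, so AB can never match inside ABCDEFG. A sketch, where l_list is the accumulating variable, l_val the value fetched in the loop, and ':' an arbitrary delimiter assumed not to occur in the data:

IF l_list IS NULL THEN
  l_list := l_val;                      -- first value always goes in
ELSIF INSTR(':' || l_list || ':', ':' || l_val || ':') = 0 THEN
  l_list := l_list || ':' || l_val;     -- append only genuinely new values
END IF;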
How can I insert data into an Oracle table without writing an INSERT statement, in Oracle 9i or above? I am not going to use INSERT ALL, MERGE, SQL*Loader, or import.
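With those options ruled out, the usual remaining answer is CREATE TABLE ... AS SELECT, which copies rows at table-creation time without any INSERT statement; the old SQL*Plus COPY command is another candidate, if it counts. A sketch with placeholder table names:

CREATE TABLE emp_copy AS SELECT * FROM emp;

-- or, in SQL*Plus (deprecated but still around), creating and filling
-- the table from another connection in one command:
-- COPY FROM scott/tiger@db1 CREATE emp_copy USING SELECT * FROM emp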
1. When querying the alert_log table I created from the alert log using the script below, two new files were created: ALERT_LOG_30499.bad and ALERT_LOG_30499.log.
ALERT_LOG_30499.log contains this error message:
error processing column MSG in row 2910 for datafile /u02/damistst/admin/bdump/alert_damistst.log
ORA-12899: value too large for column MSG (actual: 82, maximum: 80)
ALERT_LOG_30499.bad, so far, only contains datafile resize information. The datafiles have plenty of space, and there is plenty of space on the SAN slice where the datafiles reside.
2. Each time I recreate the table with an increased VARCHAR2 size, the "actual" size in the log file increases as well:
error processing column MSG in row 2910 for datafile /u02/damistst/admin/bdump/alert_damistst.log
ORA-12899: value too large for column MSG (actual: 92, maximum: 90)
3. When I increased the VARCHAR2 size to 120+, it gave me this error message:
[oracle@tds_dw bdump]$ cat ALERT_LOG_30715.log
LOG file opened at 03/09/11 14:46:20
Field Definitions for table ALERT_LOG
Record format DELIMITED BY NEWLINE
Data in file has same endianness as the platform
Rows with all null fields are accepted
Fields in Data Source:
MSG CHAR (255)
  Terminated by ","
  Trim whitespace same as SQL Loader
TABLE DDL:
create table alert_log (
  msg varchar2(80)
)
organization external (
  type oracle_loader
  default directory BDUMP
  access parameters (
    records delimited by newline
  )
  location ('alert_damistst.log')
)
reject limit 1000;
**** QUESTION: I can still query the alert_log table in SQL*Plus, but those log and bad files are generated. Is this an issue?
Example of a piece of the results from select * from alert_log:

MSG
--------------------------------------------------------------------------------
Thread 1 advanced to log sequence 5254 (LGWR switch)
Current log# 1 seq# 5254 mem# 0: /tds_oradata/redo01a.log
Current log# 1 seq# 5254 mem# 1: /u02/damistst/REDO_LOGS/redo01b.log
Thread 1 cannot allocate new log
Checkpoint not complete
Current log# 1 seq# 5254 mem# 0: /tds_oradata/redo01a.log
Current log# 1 seq# 5254 mem# 1: /u02/damistst/REDO_LOGS/redo01b.log
Wed Mar 9 14:33:09 2011
Thread 1 advanced to log sequence 5255 (LGWR switch)
Current log# 2 seq# 5255 mem# 0: /tds_oradata/redo02a.log
Current log# 2 seq# 5255 mem# 1: /u02/damistst/REDO_LOGS/redo02b.log
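The .log and .bad files are normal external-table by-products rather than a fault in themselves; the ORA-12899 rejects happen simply because some alert-log lines are longer than the column, and the field defaults to CHAR(255) as the generated log shows. A hedged variant of the DDL that widens both the column and the field and suppresses the by-product files, assuming no single alert-log line tops 1000 characters:

create table alert_log (
  msg varchar2(1000)
)
organization external (
  type oracle_loader
  default directory BDUMP
  access parameters (
    records delimited by newline
    nobadfile nologfile
    fields missing field values are null
    ( msg char(1000) )      -- raise the field width past the 255 default
  )
  location ('alert_damistst.log')
)
reject limit unlimited;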
I keep getting "ORA-01401: inserted value too large for column". No biggie - I've dealt with this multiple times before (but obviously not enough in this instance).
The data being entered is a SINGLE digit number - a number like 1, 2 or 3 - nothing fancy, just a plain straight everyday single-digit number. The field in question was set as type INTEGER, and there is no settable field size for integers - not in Oracle, anyway. Since it wasn't happy, I tried field types of NUMBER and also VARCHAR2 set to 10 bytes, and I have deleted the column from the table and re-created it as well.
Here's the even more puzzling bit: I can INSERT data into this field, BUT I can not UPDATE the field with the exact same data. The data is being inserted from a csv file. The exact same csv file works for the insert, but the same data in the same file will not update only that particular column.
If I delete the specific column data from the csv file, all goes through fine. If I hard code the update for the field (eg SET field2 = '1' or even SET field2 = ' ') it still doesn't work. So I know it is not the csv file that is causing problems. I deleted all data from the csv file except the field in question - still no luck.
So after eliminating:
1. The field type
2. The field length
3. The data being inserted
4. The external source of the data
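One cheap check before eliminating anything else: dump the stored bytes. A csv produced on Windows often carries a trailing carriage return on its last field, so the "1" arriving may really be '1' plus CHR(13), two characters, invisible in normal output. A sketch using the field2 name from the post (the table name is a placeholder):

SELECT field2, DUMP(field2) AS bytes
  FROM my_table
 WHERE ROWNUM <= 5;

-- a result like Typ=1 Len=2: 49,13 would confirm a hidden carriage return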
I am using the TRIM function in my SELECT query, but I still get white space in my output. Because of this, I get the error "value too large for column ..." when I load the data into a table through SQL*Loader.
define APPName="&1"
set heading off;
set verify off;
set newpage 0
set feedback off;
set rtrimspool on;
set termout off;
set pagesize 40000;
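Worth noting: TRIM only removes spaces (or one explicitly specified character), so tabs, carriage returns and line feeds sail straight through it and can then trip the length check in SQL*Loader. A hedged fix is to strip those characters explicitly before trimming; col and my_table are placeholder names:

SELECT TRIM(REPLACE(REPLACE(REPLACE(col, CHR(9)), CHR(10)), CHR(13)))
  FROM my_table;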