SQL & PL/SQL :: Data Of Column Is Large - How To Insert

Oct 9, 2013

I have encountered some problems in SQL. I want to create a table with a bunch of prepared data. For ease of use, I chose to generate a SQL file which contains all the SQL statements used to create the table and insert the data, so all the data can only be inserted into the table using SQL statements.

My questions:
1) If the data for a column is large (for example, 1 MB of text), how can I insert it using SQL? Is there a piecewise method?
2) How can I insert BLOB data using a SQL statement?

What I want is to enclose all the operations in a single SQL file, and when the table is needed, just execute this SQL file.
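
A minimal sketch of the piecewise idea, assuming a hypothetical table big_data (the real table and chunk contents would come from your data): seed the row with empty LOB locators, then append the payload in chunks with DBMS_LOB.WRITEAPPEND so every literal in the SQL file stays under the 4000-byte SQL limit. BLOB chunks are supplied as hex through HEXTORAW.

CREATE TABLE big_data (
  id  NUMBER PRIMARY KEY,
  doc CLOB,
  img BLOB
);

-- Seed with empty (not NULL) locators so they can be appended to.
INSERT INTO big_data (id, doc, img) VALUES (1, EMPTY_CLOB(), EMPTY_BLOB());

DECLARE
  v_doc CLOB;
  v_img BLOB;
BEGIN
  -- Lock the row and fetch the locators for writing.
  SELECT doc, img INTO v_doc, v_img FROM big_data WHERE id = 1 FOR UPDATE;
  -- One WRITEAPPEND per chunk; repeat as many times as needed.
  DBMS_LOB.WRITEAPPEND(v_doc, 18, 'first text chunk..');
  DBMS_LOB.WRITEAPPEND(v_doc, 19, 'second text chunk..');
  -- Binary data goes in as hex pairs.
  DBMS_LOB.WRITEAPPEND(v_img, 4, HEXTORAW('DEADBEEF'));
  COMMIT;
END;
/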


SQL & PL/SQL :: Insert From Large String Into CLOB

Jun 7, 2012

I'm looking for a way to insert strings larger than 40,000 characters into a CLOB field without getting "ORA-01461: can bind a LONG value only for insert into a LONG column".

Something like this:

insert into MyClobTable(ID,Data) values ('101','A string containing more than 40000 characters...')

The problem is that a Java application concatenates the string from an MSSQL DB, so I don't store the string in my Oracle DB. As far as I'm aware, this means I can't chop my string into pieces and use DECLARE to put the pieces in variables, right?

Below is an example I found, but I don't think I can apply it to my case, correct?

SQL> CREATE TABLE myClob
2 (id NUMBER PRIMARY KEY,
3 clob_data CLOB);

Table created.

SQL>
SQL> INSERT INTO myClob VALUES (101,null);

1 row created.

SQL>
SQL> declare
2 clob_pointer CLOB;
3 v_buf VARCHAR2(1000);
4 Amount BINARY_INTEGER :=1000;

[Code]...

PL/SQL procedure successfully completed.

SQL>
SQL> drop table myClob;

Table dropped.

SQL>
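
One hedged alternative that avoids both the literal limit and the VARCHAR bind limit: wrap the insert in a stored procedure with a CLOB parameter (the procedure name below is hypothetical) and have the Java side bind the concatenated string as a CLOB or character stream instead of a plain string; ORA-01461 typically appears when more than 4000 characters are bound as a VARCHAR.

CREATE OR REPLACE PROCEDURE insert_my_clob (
  p_id   IN NUMBER,
  p_data IN CLOB   -- bound from JDBC via setClob/setCharacterStream
) IS
BEGIN
  INSERT INTO MyClobTable (ID, Data) VALUES (p_id, p_data);
END;
/

From JDBC this would be called as {call insert_my_clob(?, ?)} with the second parameter set through setCharacterStream, so the 40,000-character string never has to appear as a SQL literal.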


SQL & PL/SQL :: Insert Data In Clob Column?

Jun 13, 2012

I have a problem when I try to insert a large character string of nearly 100,000 characters (HTML code) into a CLOB column of my test table: I get the error "ORA-01704: string literal too long". I do not understand why the CLOB column is not storing this data.
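
The limit being hit here is the SQL string-literal limit (4000 bytes before 12c), not anything about the CLOB column itself. A hedged workaround, assuming a hypothetical table test_tab with a CLOB column html: assemble the value in a PL/SQL variable, which is bound rather than parsed as a literal, keeping each appended piece under the 32k PL/SQL literal limit.

DECLARE
  v_html CLOB;
BEGIN
  v_html := 'first slice of the HTML ...';            -- each literal < 32k
  v_html := v_html || 'second slice of the HTML ...';
  INSERT INTO test_tab (html) VALUES (v_html);
  COMMIT;
END;
/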


XML DB :: Insert Element Into Data Stored In CLOB Column?

Dec 14, 2012

My Oracle DB version is:

Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
PL/SQL Release 11.2.0.2.0 - Production
CORE 11.2.0.2.0 Production
TNS for Linux: Version 11.2.0.2.0 - Production
NLSRTL Version 11.2.0.2.0 - Production

I have this XML data stored in a CLOB column:

<Activity>
<Changes>     
</Changes>
<Inserts>     
</Inserts>
<Definition>     

[code]....

I need to create an UPDATE that will insert another <Activity>SomeGUID</Activity> element into the <Spawned> parent.
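
A sketch of one approach, assuming the XML sits in a CLOB column xml_data of a hypothetical table activity_log, and that the document contains an /Activity/Spawned element as in the truncated snippet: INSERTCHILDXML (available in 11.2) appends the child, and getClobVal() converts the result back for storage.

UPDATE activity_log
   SET xml_data = INSERTCHILDXML(
                    XMLTYPE(xml_data),      -- parse the stored CLOB
                    '/Activity/Spawned',    -- parent element to extend
                    'Activity',             -- name of the child to insert
                    XMLTYPE('<Activity>SomeGUID</Activity>')
                  ).getClobVal()
 WHERE id = :target_id;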


SQL & PL/SQL :: Dynamic Way To Insert Data Into Correct Column In Table?

Dec 26, 2011

I used SQL*Loader to import data from a CSV file into my DB, but every time the column positions change, so I need a dynamic way to insert the data into the correct columns of the table.

The CSV file contains the column names, and I insert this data into a temp table. After that I want to read the data by column name; I also read the column names from ALL_TAB_COLUMNS so I can match column names between the temp table and ALL_TAB_COLUMNS and insert the data into the right place.
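
A rough sketch of that matching idea, with hypothetical names (csv_stage for the temp table loaded from the CSV, target_tab for the destination): build the column list from ALL_TAB_COLUMNS and assemble the INSERT dynamically. Mapping the list to the actual CSV header order would still have to come from the header row you stored.

DECLARE
  v_cols VARCHAR2(4000);
BEGIN
  -- Comma-separated list of the destination columns (table order here;
  -- reorder from the stored CSV header row as needed).
  SELECT LISTAGG(column_name, ',') WITHIN GROUP (ORDER BY column_id)
    INTO v_cols
    FROM all_tab_columns
   WHERE table_name = 'TARGET_TAB';

  -- Assemble and run the insert dynamically.
  EXECUTE IMMEDIATE
    'INSERT INTO target_tab (' || v_cols || ') ' ||
    'SELECT ' || v_cols || ' FROM csv_stage';
END;
/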


Value Too Large For Column MSG

Mar 9, 2011

1. When querying the alert_log table I created from the alert log using the script below, 2 new files were created: ALERT_LOG_30499.bad and ALERT_LOG_30499.log.

The ALERT_LOG_30499.log contains this error message:

error processing column MSG in row 2910 for datafile /u02/damistst/admin/bdump/alert_damistst.log
ORA-12899: value too large for column MSG (actual: 82, maximum: 80)

The ALERT_LOG_30499.bad file, so far, only contains datafile resize information. The datafiles have plenty of space, and there is plenty of space on the SAN slice where the datafiles reside.

2. Each time I recreate the table and increase the VARCHAR2 size, the "actual" size in the log file also increases:

error processing column MSG in row 2910 for datafile /u02/damistst/admin/bdump/alert_damistst.log ORA-12899: value too large for column MSG (actual: 92, maximum: 90)

3. When I increased the VARCHAR2 size to 120+, it gave me this message:

[oracle@tds_dw bdump]$ cat ALERT_LOG_30715.log

LOG file opened at 03/09/11 14:46:20

Field Definitions for table ALERT_LOG
Record format DELIMITED BY NEWLINE
Data in file has same endianness as the platform
Rows with all null fields are accepted

Fields in Data Source:

MSG CHAR (255)
Terminated by ","
Trim whitespace same as SQL Loader

TABLE DDL:

create table
alert_log ( msg varchar2(80) )
organization external (
type oracle_loader
default directory BDUMP
access parameters (
records delimited by newline
)
location('alert_damistst.log')
)
reject limit 1000;

**** QUESTION
I can still query the alert_log table in SQL*Plus, but those log and bad files are generated. Is this an issue? (A sketch of a more forgiving table definition follows the sample output below.)

Example of a piece of the results from "select * from alert_log;":

MSG
--------------------------------------------------------------------------------
Thread 1 advanced to log sequence 5254 (LGWR switch)
Current log# 1 seq# 5254 mem# 0: /tds_oradata/redo01a.log
Current log# 1 seq# 5254 mem# 1: /u02/damistst/REDO_LOGS/redo01b.log
Thread 1 cannot allocate new log
Checkpoint not complete
Current log# 1 seq# 5254 mem# 0: /tds_oradata/redo01a.log
Current log# 1 seq# 5254 mem# 1: /u02/damistst/REDO_LOGS/redo01b.log
Wed Mar 9 14:33:09 2011
Thread 1 advanced to log sequence 5255 (LGWR switch)
Current log# 2 seq# 5255 mem# 0: /tds_oradata/redo02a.log
Current log# 2 seq# 5255 mem# 1: /u02/damistst/REDO_LOGS/redo02b.log

13076 rows selected.
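
For what it's worth, the .log file is written on every scan of an external table, and over-length lines are diverted to the .bad file, so the files by themselves are normal. A hedged variant of the definition that widens the column, names the files explicitly, and stops rejections from aborting the query (same BDUMP directory assumed):

create table alert_log ( msg varchar2(256) )
organization external (
  type oracle_loader
  default directory BDUMP
  access parameters (
    records delimited by newline
    badfile BDUMP:'alert_log.bad'    -- rejected (over-length) lines
    logfile BDUMP:'alert_log.log'    -- written on every query; not an error
    fields ( msg char(512) )         -- raw field wider than the column
  )
  location ('alert_damistst.log')
)
reject limit unlimited;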


SQL & PL/SQL :: Getting Error / Value Too Large For Column

Jul 29, 2011

I am using the TRIM function in my SELECT query, but I still get whitespace in my output. Because of this I get the error "value too large for column ..." when I load the data into a table through SQL*Loader.

define APPName="&1"
set heading off;
set verify off;
set newpage 0
set feedback off;
set rtrimspool on;
set termout off;
set pagesize 40000;

[code].....
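
TRIM only removes spaces, so tabs, carriage returns, and non-breaking spaces survive it and still overflow the target column. A hedged sketch (my_table and col1 are placeholders): TRANSLATE deletes characters that have no counterpart in its third argument, stripping the control characters before the TRIM.

SELECT TRIM(TRANSLATE(col1,
                      ' ' || CHR(9) || CHR(10) || CHR(13) || CHR(160),
                      ' '))           -- keep spaces, drop tab/LF/CR/NBSP
  FROM my_table;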


Adding Column To Large Table

Aug 12, 2013

I want to add a column to a table which holds a huge amount of data, and fill it with data from another table. What is the best way to do it? Is it faster to use CTAS instead of ALTER TABLE ... ADD COLUMN?
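
Both patterns, sketched with hypothetical tables big_tab and src_tab; for very large tables the CTAS route is often faster because it avoids a row-by-row UPDATE and can run NOLOGGING and PARALLEL, at the cost of recreating indexes, constraints, and grants and swapping the table names afterwards.

-- Pattern 1: ALTER then backfill (simple, but the UPDATE touches
-- every row and generates a lot of undo/redo).
ALTER TABLE big_tab ADD (new_col NUMBER);
UPDATE big_tab b
   SET new_col = (SELECT s.val FROM src_tab s WHERE s.id = b.id);

-- Pattern 2: CTAS with the column joined in, then rename.
CREATE TABLE big_tab_new NOLOGGING PARALLEL AS
SELECT b.*, s.val AS new_col
  FROM big_tab b LEFT JOIN src_tab s ON s.id = b.id;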


SQL & PL/SQL :: ORA-12899 / Value Too Large For Column (actual / 2 - Maximum / 1)

Feb 14, 2013

While creating the external table, I am getting this error:

ORA-12899: value too large for column PRC_REC_TYPE (actual: 2, maximum: 1)

But when checking the CSV file, I found that prc_rec_type holds values of length 1. I am uploading the CSV file as well.

-- Create table

create table ET_PGIT_POL_RISK_COVER
(
prc_pol_no VARCHAR2(60),
prc_end_no_idx VARCHAR2(22),
prc_sec_code VARCHAR2(12),
prc_risk_id VARCHAR2(12),
prc_smi_code VARCHAR2(12),

[code].....
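
One hedged guess, given that the file shows a 1-character value while the error reports 2: if the CSV was produced on Windows and PRC_REC_TYPE is the last field on the line, the trailing carriage return can be counted into the field. Declaring the record delimiter explicitly in the access parameters avoids that:

access parameters (
  records delimited by '\r\n'   -- consume the CR+LF pair explicitly
  fields terminated by ','
  missing field values are null
)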


Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know if importing to tape is possible, and if so, whether the data would be accessible if needed later.


Storing Large Volume File In CLOB Column As A String?

Oct 6, 2009

I am having a problem inserting a long file (80,000 bytes) into a CLOB column in an Oracle Database 10g.

ERROR at line 23:
ORA-06550: line 23, column 1:
PLS-00172: string literal too long

I am calling the Oracle stored procedure through a Unix shell script as follows:

#!/bin/ksh
cd /usr/home/dfusr/backup
i=`cat pe_proxy_master_2160.Thu.log | sed "s/'/''/g"`
sqlplus username/password@db 1> temp.log <<!
DECLARE
BEGIN
Write_Text_To_CLOB(1,'$i');
COMMIT;
END;
/
!

The file pe_proxy_master_2160.Thu.log is about 50,000 bytes.

My stored procedure is as follows:

CREATE OR REPLACE PROCEDURE Write_Text_To_CLOB (
p_id IN NUMBER
, p_clob IN VARCHAR2
)
IS

[code]....

I also tried with ojdbc5.jar and ojdbc14.jar files in class path.
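
The shell expansion pastes the whole file into one PL/SQL literal, which is what trips PLS-00172 (literals are capped at 32,767 bytes). A hedged server-side alternative, assuming the log file is visible to the database server and a hypothetical table clob_store(id, data): point a directory object at it and load with DBMS_LOB.LOADCLOBFROMFILE, so no literal is involved at all.

CREATE OR REPLACE DIRECTORY log_dir AS '/usr/home/dfusr/backup';

DECLARE
  v_bfile BFILE := BFILENAME('LOG_DIR', 'pe_proxy_master_2160.Thu.log');
  v_clob  CLOB;
  v_dest  INTEGER := 1;
  v_src   INTEGER := 1;
  v_lang  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  v_warn  INTEGER;
BEGIN
  INSERT INTO clob_store (id, data) VALUES (1, EMPTY_CLOB())
  RETURNING data INTO v_clob;
  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(v_clob, v_bfile, DBMS_LOB.LOBMAXSIZE,
                            v_dest, v_src,
                            DBMS_LOB.DEFAULT_CSID, v_lang, v_warn);
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END;
/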


Export/Import/SQL Loader :: ORA-12899 / Value Too Large For Column XXXX (actual / 51 Maximum / 50)

Apr 10, 2013

I am exporting/importing from 10.2.0.4.0 to 11.2.0.3.0, but during the import some rows are rejected:

IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column XXXX (actual: 51, maximum: 50)
Column 1 264
Column 2 123432
Column 3
Column 4 7

[code]....

Having looked at the data in the source system, I cannot see the character "Â" in the column; I think this is what is causing the issue. Why is Oracle adding this character to this field, and how can I fix it without having to modify the table in the new system to allow more characters?
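
This pattern is typical of character-set expansion: a single-byte character in the source (often a non-breaking space, byte 0xA0, which renders as "Â " in UTF-8 viewers) becomes two bytes in an AL32UTF8 target, so a 50-character value overflows a 50-byte column. A hedged diagnostic against the source system (XXXX stands in for the real column):

-- Values containing non-ASCII bytes will expand on conversion to
-- AL32UTF8; DUMP shows exactly which bytes a suspect value holds.
SELECT xxxx, DUMP(xxxx, 1016)
  FROM source_table
 WHERE xxxx <> ASCIISTR(xxxx);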


Large Amount Of Data

Aug 6, 2013

I have an Oracle 11gR2 database on Linux. Its total SGA size is only 500 MB. Now, if a user wants to read 1 GB of data from the database, there is not enough room in the buffer cache, so how does that work: will the transaction succeed or fail? And I have another doubt: can Oracle read data only from memory, or can it also read directly from disk?


SQL & PL/SQL :: FOR UPDATE Large Data

Jan 23, 2012

How do I handle large data with FOR UPDATE in a table column?

claimClob clob:=claim; -- claim large data
v_buf varchar2(1000);
amount binary_integer:=1000;
position binary_integer:=1;

[Code]....

Why does the FOR UPDATE not do anything?
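
The usual pattern, sketched with hypothetical names (claims table, claim CLOB column): the SELECT ... FOR UPDATE must fetch the locator and lock the row in the same transaction as the write; a locator fetched without FOR UPDATE raises ORA-22920 when written to, and a COMMIT invalidates the lock.

DECLARE
  v_lob CLOB;
  v_buf VARCHAR2(1000) := RPAD('x', 1000, 'x');   -- sample chunk
BEGIN
  -- Lock the row; the locator is only writable while the lock is held.
  SELECT claim INTO v_lob FROM claims WHERE id = 1 FOR UPDATE;
  DBMS_LOB.WRITE(v_lob, LENGTH(v_buf), 1, v_buf); -- write at offset 1
  COMMIT;                                         -- releases the lock
END;
/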


SQL & PL/SQL :: Store Large XML Data In Blob?

May 8, 2010

Since XML files only contain character data, we could/should store them in a CLOB rather than a BLOB.

But a friend of mine has a table where a column is defined as BLOB, and I came to know that XML data is being stored in it. I searched for an article with the keyword 'How to insert large XML data in BLOB', but it did not work. How do I store large XML content in a BLOB, and how do I extract it?
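
If the column really must stay a BLOB, one hedged route (hypothetical table xml_store) is to build the XML as a CLOB and convert it with DBMS_LOB.CONVERTTOBLOB; reading it back is the mirror image with DBMS_LOB.CONVERTTOCLOB, or XMLTYPE(blob, csid) if it needs to be parsed as XML.

DECLARE
  v_clob CLOB := '<root><doc>large XML here</doc></root>';
  v_blob BLOB;
  v_dest INTEGER := 1;
  v_src  INTEGER := 1;
  v_lang INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  v_warn INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(v_blob, TRUE);
  -- Character-to-binary conversion in the database character set.
  DBMS_LOB.CONVERTTOBLOB(v_blob, v_clob, DBMS_LOB.LOBMAXSIZE,
                         v_dest, v_src,
                         DBMS_LOB.DEFAULT_CSID, v_lang, v_warn);
  INSERT INTO xml_store (id, xml_blob) VALUES (1, v_blob);
  COMMIT;
END;
/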


Server Utilities :: Insert Data Without Writing Insert Statement In Oracle?

May 15, 2010

How can I insert data into an Oracle table without writing an INSERT statement, in Oracle 9i or above? I am not going to use INSERT ALL, MERGE, SQL*Loader, or Import.


Need Clob Variable To Write Large Volume Of Data?

Oct 6, 2010

I am facing a problem with UTL_HTTP.WRITE_TEXT in my PL/SQL application. My requirement is to write data of size greater than 32 KB, so I used a CLOB variable in WRITE_TEXT, but it still raises a numeric or value error when the data size is above 8 KB.

I have read that chunked transfer encoding will work, but I could not find out how it is done.
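
A hedged sketch of the chunked pattern: declaring Transfer-Encoding: chunked means no Content-Length is needed, so the CLOB can be streamed through WRITE_TEXT in pieces that each stay well under the 32k VARCHAR2 ceiling. The URL is a placeholder.

DECLARE
  v_req  UTL_HTTP.REQ;
  v_resp UTL_HTTP.RESP;
  v_data CLOB := TO_CLOB(RPAD('x', 32000, 'x'));  -- stand-in payload
  v_amt  BINARY_INTEGER := 8000;
  v_off  INTEGER := 1;
  v_buf  VARCHAR2(8000);
BEGIN
  v_req := UTL_HTTP.BEGIN_REQUEST('http://example.com/endpoint', 'POST');
  UTL_HTTP.SET_HEADER(v_req, 'Transfer-Encoding', 'chunked');
  LOOP
    BEGIN
      DBMS_LOB.READ(v_data, v_amt, v_off, v_buf);  -- v_amt: actual read
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;                -- end of CLOB
    END;
    UTL_HTTP.WRITE_TEXT(v_req, v_buf);             -- each write < 32k
    v_off := v_off + v_amt;
  END LOOP;
  v_resp := UTL_HTTP.GET_RESPONSE(v_req);
  UTL_HTTP.END_RESPONSE(v_resp);
END;
/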


SQL & PL/SQL :: Data Types To Store Large Integer Values?

Aug 15, 2012

What could be an effective data type to store large integer values like 50,000 or 10,000,000?
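
For scale: 10,000,000 has only 8 digits, and Oracle's NUMBER holds up to 38 significant digits, so a plain NUMBER (or NUMBER(38,0) to enforce integers) covers values vastly larger than these; in PL/SQL, PLS_INTEGER is faster but tops out near 2.1 billion. A one-line sketch with a hypothetical table:

CREATE TABLE amounts ( big_val NUMBER(38,0) );  -- integers up to 38 digits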


SQL & PL/SQL :: View Sample Data From Very Table Which Is Large In Size?

Apr 26, 2010

I have a question on how to view sample data from a very large table (more than 10 million rows).

I just need to see some sample data from a large table (to see what kind of application-related data it holds).

My question is :

Select *
from Sample_table
where rownum < 10

Is this a good way to view the sample data?

My understanding is that ROWNUM will be assigned to the rows only once all the rows are retrieved.

So what is the best way to view it? I am not sure what condition to put in when first querying the table.
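
Two hedged options against a placeholder table: a ROWNUM predicate is actually cheap, because ROWNUM is assigned as rows are produced and the scan stops once the count is reached (it does not read all rows first); the SAMPLE clause instead pulls a random spread from across the table.

SELECT * FROM sample_table WHERE ROWNUM < 10;   -- stops after 9 rows

SELECT * FROM sample_table SAMPLE (0.01);       -- ~0.01% of rows, random spread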


RAC & Failsafe :: Inserting Large Data Locks The Destination Table In RAC

Oct 18, 2010

Scenario:

Our application uses two instances: one for the live active data and the other for the reporting data. We have a process which moves the data from the live instance to the reports instance every night. In a single-DB environment the process works without any issues; however, when we move to the RAC environment, the inserts into the large table in the reports DB get locked and we are unable to insert data into the reports DB.

What we are performing is:

Insert into my_table_rpt select * from my_table_live@db_link_to_livedb;

Issues:

my_table_rpt gets locked.

We found a workaround: disable table locking on the destination, and re-enable it after the insert.

ALTER TABLE my_table_rpt DISABLE TABLE LOCK;

Insert the data to the reports database table

Then

ALTER TABLE my_table_rpt ENABLE TABLE LOCK

Question:

Why does the large destination table (my_table_rpt) get locked in the RAC environment?


Forms :: When Insert Detail The Data Insert Twice Time

Feb 27, 2012

When I try to insert the details from Oracle Forms, the data is inserted twice into the DB.

my table structure:

create table app_sri
(a_id integer primary key,
p_first_name varchar2(30),
p_last_name varchar2(20),
p_age number(3)
);

Here a_id can be generated through a simple sequence (pid_seq).
Trigger on app_sri:
create or replace trigger pid_trg

[Code]....

Form insertion code:

Begin
insert into app_sri values (null, 'robo', 'Big', 100);
commit;
End;

The data is inserted... but twice.

What is the reason behind the double insertion?
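
Since the trigger body is truncated, one hedged guess: if pid_trg itself performs an INSERT INTO app_sri (instead of just filling in the key), every statement fires the trigger and a second row appears. A trigger that only assigns the sequence value does not re-insert:

create or replace trigger pid_trg
  before insert on app_sri
  for each row
begin
  -- Populate the key only; an INSERT here would duplicate the row.
  select pid_seq.nextval into :new.a_id from dual;
end;
/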


Data Archive Script Is Taking Too Long To Delete A Large Table

Aug 8, 2013

We have data archive scripts that move data for a date range to a different table. Each script has two parts: first copy data from the original table to the archive table, then delete the copied rows from the original table. The first part executes very fast, but the deletion takes too long, around 2-3 hours. The customer analysed the delete query and says the script is not using an index and is doing a full table scan, but the predicate itself is the primary key. More info below.

CREATE TABLE "APP"."MON_TXNS"    (    "ID_TXN" NUMBER(12,0) NOT NULL ENABLE,     "BOL_IS_CANCELLED" VARCHAR2(1 BYTE) DEFAULT 'N' NOT NULL ENABLE,     "ID_PAYER" NUMBER(12,0),     "ID_PAYER_PI" NUMBER(12,0),     "ID_PAYEE" NUMBER(12,0),     "ID_PAYEE_PI" NUMBER(12,0),     "ID_CURRENCY" CHAR(3 BYTE) NOT NULL ENABLE,     "STR_TEXT" VARCHAR2(60 CHAR),     "DAT_MERCHANT_TIMESTAMP" DATE,     "STR_MERCHANT_ORDER_ID" VARCHAR2(30 BYTE),     "DAT_EXPIRATION" DATE,     "DAT_CREATION" DATE,     "STR_USER_CREATION" VARCHAR2(30 CHAR),     "DAT_LAST_UPDATE"

[Code]...

Data is first moved to the table schema3.OTW, and then we delete all the copied rows in OTW from the original table. Below is the explain plan for the delete.

SQL> explain plan for
  2  delete from schema1.mon_txns where id_txn in (select id_txn from schema3.OTW);

Explained.

SQL> select * from table(dbms_xplan.display);

PLAN_TABLE_OUTPUT
--------------------------------------------------------------------------------------------------
Plan hash value: 2798378986

-------------------------------------------------------------------------------------
| Id  | Operation              | Name       | Rows  | Bytes | Cost (%CPU)| Time     |
-------------------------------------------------------------------------------------
|   0 | DELETE STATEMENT       |            |  2520 |   233K|    87   (2)| 00:00:02 |
|   1 |  DELETE                | MON_TXNS   |       |       |            |          |
|*  2 |   HASH JOIN RIGHT SEMI |            |  2520 |   233K|    87   (2)| 00:00:02 |
|   3 |    INDEX FAST FULL SCAN| OTW_ID_TXN |  2520 | 15120 |     3   (0)| 00:00:01 |
|   4 |    TABLE ACCESS FULL   | MON_TXNS   | 14260 |  1239K|    83   (0)| 00:00:02 |
-------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------


Application Express :: How To Show Data From A Table Having Large Number Of Columns

Oct 8, 2013

I have a report with a single row having a large number of columns, so I have to use a scroll bar to see them all. Is it possible to design the report in the format below (half the columns on one side of the page, half on the other)?

Column1  Data    Column11  Data
Column2  Data    Column12  Data
Column3  Data    Column13  Data
Column4  Data    Column14  Data
Column5  Data    Column15  Data
Column6  Data    Column16  Data
Column7  Data    Column17  Data
Column8  Data    Column18  Data
Column9  Data    Column19  Data
Column10 Data    Column20  Data

I am using Apex 4.2.3 on Oracle 11g XE.


How To Insert A BLOB Column

Nov 9, 2008

How do I insert into a BLOB column? I have a stored procedure with an input parameter of type BLOB; now how do I insert this BLOB into a table?
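
A minimal sketch, assuming a hypothetical target table images(id NUMBER, img BLOB): a BLOB parameter can be bound directly in the INSERT, with no staging or directory object involved.

CREATE OR REPLACE PROCEDURE save_image (
  p_id   IN NUMBER,
  p_blob IN BLOB
) IS
BEGIN
  INSERT INTO images (id, img) VALUES (p_id, p_blob);
END;
/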


SQL & PL/SQL :: Insert Into BLOB Column

Jul 21, 2011

I have a table with a column of type BLOB. Now I want to create a procedure which will insert into that table, but I don't want to create a directory. How can I solve this?

I want to insert a row like this:

insert into x (image)
values ('d:\photo\test.jpg');
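
Inserting the path as a string only stores the text, not the file contents; for the server to read the bytes itself there is, as far as I know, no way around a directory object (the alternative is to have the client program stream the file). A hedged sketch, assuming the file is visible to the database server:

CREATE OR REPLACE DIRECTORY img_dir AS 'd:\photo';

DECLARE
  v_bfile BFILE := BFILENAME('IMG_DIR', 'test.jpg');
  v_blob  BLOB;
  v_dest  INTEGER := 1;
  v_src   INTEGER := 1;
BEGIN
  INSERT INTO x (image) VALUES (EMPTY_BLOB())
  RETURNING image INTO v_blob;
  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADBLOBFROMFILE(v_blob, v_bfile, DBMS_LOB.LOBMAXSIZE,
                            v_dest, v_src);
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END;
/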


SQL & PL/SQL :: Insert Using Column Name To Indicate Inserted Value

Apr 16, 2012

I have the following tables:

CREATE TABLE test_abc(id NUMBER PRIMARY KEY, col1 NUMBER(10), col2 NUMBER(10), col3 NUMBER(10));

INSERT ALL
INTO test_abc VALUES (1,12345,34567,87654)
INTO test_abc VALUES (2,17345,37567,87754)
INTO test_abc VALUES (3,12745,34767,87674)
INTO test_abc VALUES (4,17045,30567,80754)
INTO test_abc VALUES (5,12740,34067,87604)
SELECT * FROM dual;

CREATE TABLE test_b (id NUMBER, col VARCHAR2(10), coltype_id NUMBER);

What I need to do is convert the columns col1, col2, col3 in test_abc into rows in test_b, but in such a way that the column name determines the value of coltype_id. So the values in test_b will look like:

id  col    coltype_id
1   12345  1
1   34567  2
1   87654  3

and so on.
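
A hedged sketch using UNPIVOT (11g+), which turns each colN into a row and exposes the source column as a label that maps straight to coltype_id:

INSERT INTO test_b (id, col, coltype_id)
SELECT id, col, coltype_id
  FROM test_abc
UNPIVOT (col FOR coltype_id IN (col1 AS 1, col2 AS 2, col3 AS 3));

On older versions the same effect falls out of three SELECTs glued together with UNION ALL, one per source column.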


Maintain Large Tables / Cleanup Data From Our Tables

May 18, 2011

I have to clean up data from our tables (production environment) that contain millions of rows. The question is: apart from the partitioned-table solution, what alternative does Oracle recommend?

Should we delete from these tables with a PL/SQL cursor block, or export the whole database with Data Pump, using its QUERY option to filter out the old rows from the tables we want to clean?

I have used both ways, and I have to admit that the Data Pump solution is much, much faster than the deletion, which suffers from disk I/O. The question again is which of these two methods is more reliable and less risky for the health of the database.


SQL & PL/SQL :: Insert PDF File Into BLOB Column?

Jan 20, 2011

I want to store a PDF file in a database column of BLOB type. The PDF file is on the client system, not on the database server. Is there any way I can achieve this?


SQL & PL/SQL :: Insert Value In Column Which Has Date Datatype?

Oct 12, 2012

I have a column named pdate with the DATE datatype.

I am updating the value like this:

update table1 set pdate='15-10-2012' where id=1;

but it shows the error: not a valid month.

How do I update this?
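
The string is being parsed with the session's NLS_DATE_FORMAT, which evidently is not DD-MM-YYYY. An explicit format mask removes that dependency:

update table1
   set pdate = to_date('15-10-2012', 'DD-MM-YYYY')
 where id = 1;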


SQL & PL/SQL :: Insert Query Having Strings As A Column

Dec 13, 2012

I am trying to run an INSERT statement where one column value is itself a query string; the single quotes I have to embed in it are causing the problem.

insert into abc(x,y) values (1,'select abc,bbc from T_AB A,select fgh,hij from T_AB where fgh='self' group by fgh,hij having count(fgh)>1) B) where A.hij=B.hij')

I am getting "missing comma" with the above query. When I tried it as:

insert into abc(x,y) values (1,'select abc,bbc from T_AB A,select fgh,hij from T_AB where fgh=''self'' group by fgh,hij having count(fgh)>1) B) where A.hij=B.hij')

the insert happens, but the value is saved as "select abc,bbc from T_AB A,select fgh,hij from T_AB where fgh=''self'' group by fgh,hij having count(fgh)>1) B) where A.hij=B.hij"

How do I avoid this and get the select query stored as:

select abc,bbc from T_AB A,select fgh,hij from T_AB where fgh=''self'' group by fgh,hij having count(fgh)>1) B) where A.hij=B.hij
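
The alternative quoting mechanism (q'[...]') keeps embedded single quotes as-is, so nothing needs doubling; sketched against the same table:

insert into abc (x, y)
values (1, q'[select abc,bbc from T_AB A,select fgh,hij from T_AB where fgh='self' group by fgh,hij having count(fgh)>1) B) where A.hij=B.hij]');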
