Since XML files only contain character data, we could/should store them in a CLOB rather than a BLOB.
However, a friend of mine has a table where a column is defined as BLOB, and it turns out XML data is being stored in it. I searched for articles with the keyword 'How to insert large XML data in BLOB' but did not find anything useful. How can I store large XML content in a BLOB, and how can I extract it again?
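A minimal sketch of one way to read XML back out of a BLOB, assuming a hypothetical table xml_store(id, xml_blob) and assuming the XML was written into the BLOB as AL32UTF8 bytes:

DECLARE
  l_blob  BLOB;
  l_clob  CLOB;
  l_dest  INTEGER := 1;
  l_src   INTEGER := 1;
  l_warn  INTEGER;
  l_lang  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
BEGIN
  SELECT xml_blob INTO l_blob FROM xml_store WHERE id = 1;

  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  -- convert the raw bytes back to character data; the charset id must match
  -- whatever was used when the XML was written into the BLOB
  DBMS_LOB.CONVERTTOCLOB(l_clob, l_blob, DBMS_LOB.LOBMAXSIZE,
                         l_dest, l_src,
                         NLS_CHARSET_ID('AL32UTF8'), l_lang, l_warn);

  DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(l_clob, 200, 1));
  DBMS_LOB.FREETEMPORARY(l_clob);
END;
/

The reverse direction (storing) is the same idea with DBMS_LOB.CONVERTTOBLOB, as sketched further down for the CLOB-to-BLOB question.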
I am getting the file name using CLIENT_GET_FILE_NAME. I need to read the data from the .xsl file and convert it into a BLOB. The file should not be stored in the database.
I want to store XML in the database. I have the following questions:
1) In which column type should I keep the XML? 2) Right now I am keeping it in a BLOB column; how can I insert or update a record in the BLOB column from a query that can be run from a worksheet?
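A minimal sketch of inserting and updating a BLOB column from a plain worksheet script, assuming a hypothetical table xml_store(id NUMBER, xml_blob BLOB). UTL_RAW.CAST_TO_RAW is enough for small payloads (roughly 2000 bytes when called from plain SQL, 32767 inside PL/SQL); larger XML has to be appended in pieces or converted from a CLOB:

-- insert a row with a small XML payload
INSERT INTO xml_store (id, xml_blob)
VALUES (1, TO_BLOB(UTL_RAW.CAST_TO_RAW('<doc><item>1</item></doc>')));

-- update the same BLOB column
UPDATE xml_store
SET    xml_blob = TO_BLOB(UTL_RAW.CAST_TO_RAW('<doc><item>2</item></doc>'))
WHERE  id = 1;

COMMIT;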
I have a table where users can store images in CLOB format. I need to convert the same image to a BLOB and store it in a dummy table.
I tried using this:

-- function creation
CREATE OR REPLACE FUNCTION FN_CLOB_TO_BLOB (CLOB_IN IN CLOB) RETURN BLOB IS
  POS    PLS_INTEGER := 1;
  BUFFER RAW(32767);
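A minimal sketch of a complete CLOB-to-BLOB conversion, assuming the database character set is acceptable for the byte conversion; it uses DBMS_LOB.CONVERTTOBLOB instead of a manual RAW buffer loop:

CREATE OR REPLACE FUNCTION fn_clob_to_blob (clob_in IN CLOB) RETURN BLOB IS
  l_blob  BLOB;
  l_dest  INTEGER := 1;
  l_src   INTEGER := 1;
  l_lang  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn  INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_blob, TRUE);
  -- convert the whole CLOB to bytes in the database character set
  DBMS_LOB.CONVERTTOBLOB(l_blob, clob_in, DBMS_LOB.LOBMAXSIZE,
                         l_dest, l_src,
                         DBMS_LOB.DEFAULT_CSID, l_lang, l_warn);
  RETURN l_blob;
END;
/

It can then be used in an INSERT ... SELECT into the dummy table, e.g. INSERT INTO dummy_tab SELECT id, fn_clob_to_blob(image_clob) FROM source_tab; (table and column names are hypothetical).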
My client system is Windows XP. I want to store a Word/Excel document in an Oracle BLOB column, and I chose a command named "piecewise.exe" (it seems to be a free tool from the web) to load the document into Oracle, but it occasionally fails. So I need an alternative which is stable and available.
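One alternative, sketched under the assumption that the document file can be copied to a directory visible to the database server and that a directory object (here hypothetically named DOC_DIR) exists with read privileges granted, is to load the file with DBMS_LOB.LOADBLOBFROMFILE:

DECLARE
  l_bfile  BFILE := BFILENAME('DOC_DIR', 'report.xlsx');  -- hypothetical directory object and file
  l_blob   BLOB;
  l_dest   INTEGER := 1;
  l_src    INTEGER := 1;
BEGIN
  -- assumes a hypothetical table documents(id NUMBER, doc BLOB)
  INSERT INTO documents (id, doc)
  VALUES (1, EMPTY_BLOB())
  RETURNING doc INTO l_blob;

  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADBLOBFROMFILE(l_blob, l_bfile, DBMS_LOB.LOBMAXSIZE, l_dest, l_src);
  DBMS_LOB.CLOSE(l_bfile);
  COMMIT;
END;
/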
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether exporting to tape is possible. If so, would the data be accessible later if needed?
I have an Oracle 11gR2 database on Linux. Its total SGA size is only 500 MB. Now, if a user wants to read 1 GB of data from the database, there is not enough memory in the buffer cache, so how does this work? Will the transaction succeed or will it fail? And I have another doubt: can Oracle read data only from memory, or can it also read directly from disk?
I have encountered some problems in SQL. I want to create a table with a bunch of prepared data. For ease of use, I chose to generate a SQL file which contains all the SQL statements used to create the table and insert the data, so all the data can only be inserted into the table using SQL statements.
My questions: 1) If the data for a column is large (for example, 1 MB of text), how do I insert it using SQL? Is there a piecewise method? 2) And how can I insert BLOB data using a SQL statement?
What I want is to enclose all the operations in a single SQL file, and when the table is needed, just execute this SQL file.
begin
  for i in 1..5 loop
    insert into easy values (i, '777'||i, i||89, 'Ris'||i||i);
[code]...
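A minimal sketch of the piecewise approach inside a plain SQL script, assuming a hypothetical table docs(id NUMBER, txt CLOB, bin BLOB): initialise the LOBs with EMPTY_CLOB()/EMPTY_BLOB() and then append the large value in chunks from an anonymous block, so everything still lives in one .sql file:

INSERT INTO docs (id, txt, bin) VALUES (1, EMPTY_CLOB(), EMPTY_BLOB());

DECLARE
  l_txt  CLOB;
  l_bin  BLOB;
BEGIN
  SELECT txt, bin INTO l_txt, l_bin FROM docs WHERE id = 1 FOR UPDATE;

  -- append the text in chunks of up to 32767 characters; the script generator
  -- repeats this call until the 1 MB value is complete
  DBMS_LOB.WRITEAPPEND(l_txt, 11, 'first chunk');
  DBMS_LOB.WRITEAPPEND(l_txt, 12, 'second chunk');

  -- binary data is appended as hex chunks converted to RAW
  DBMS_LOB.WRITEAPPEND(l_bin, 4, HEXTORAW('DEADBEEF'));
  COMMIT;
END;
/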
I was able to fetch the primary keys and their values for a table and store them in a stage table, but only for 1 record. I am stuck when there is more than 1 record. Here is what I tried:
SQL> SET serveroutput ON
SQL> DECLARE
       V_prim_key VARCHAR2(2000);
       l_tab      DBMS_UTILITY.uncl_array;
[code]...
If you look at the above code, in the EXECUTE IMMEDIATE where-condition I have used id = 1, which matches 1 record only. But if I change it to id < 3 then there will be 2 records and I get the error below: ORA-01422: exact fetch returns more than requested number of rows.
The other problem: if there are 3 primary keys, as in the case above, I am fetching their values one by one. Is there any way I can fetch those values all at once, with the data stored in the stage table in the format below?
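A minimal sketch of how the multi-row case is usually handled: instead of EXECUTE IMMEDIATE ... INTO a scalar (which raises ORA-01422 for more than one row), bulk-collect the dynamic query into a collection and insert from it. Table and column names here are hypothetical:

DECLARE
  TYPE t_vals IS TABLE OF VARCHAR2(4000);
  l_vals t_vals;
BEGIN
  -- fetch every matching row at once instead of expecting exactly one
  EXECUTE IMMEDIATE
    'SELECT id || ''|'' || col1 || ''|'' || col2 FROM source_tab WHERE id < :1'
    BULK COLLECT INTO l_vals
    USING 3;

  FORALL i IN 1 .. l_vals.COUNT
    INSERT INTO stage_tab (pk_values) VALUES (l_vals(i));
  COMMIT;
END;
/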
How do I tune a query for data saved column-wise (key/value)? Because we have to join the same table n number of times. For reference, go through the following scenario.
Suppose there is one table T1 and it has two columns, KEY and VALUE. If we write a query to retrieve the desired result in row manner, we have to join the same table a number of times.
KEY                         VALUE
agreesWith                  true
id                          1
assessment                  False
basisForDateOfProgression   1
bestOverallResponse         2
bestOverallUnknownComments  data is ok
Query:
select * from ( select t.agreesWith from t1 t ) a
[Code]...
In this manner we would have to join all the way up to bestOverallUnknownComments. So which method should we follow to reduce the execution time while keeping performance good?
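A minimal sketch of the usual alternative to repeated self-joins on a key/value table: a single pass with conditional aggregation (or PIVOT on 11g+), assuming each logical record is identified by some grouping column such as record_id (hypothetical):

SELECT record_id,
       MAX(CASE WHEN key = 'agreesWith'                 THEN value END) AS agrees_with,
       MAX(CASE WHEN key = 'id'                         THEN value END) AS id,
       MAX(CASE WHEN key = 'assessment'                 THEN value END) AS assessment,
       MAX(CASE WHEN key = 'basisForDateOfProgression'  THEN value END) AS basis_for_dop,
       MAX(CASE WHEN key = 'bestOverallResponse'        THEN value END) AS best_overall_response,
       MAX(CASE WHEN key = 'bestOverallUnknownComments' THEN value END) AS best_overall_comments
FROM   t1
GROUP BY record_id;

This reads T1 once instead of joining it once per attribute.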
I am facing a problem with UTL_HTTP.WRITE_TEXT in my PL/SQL application. My requirement is to write data of size > 32K, so I used a CLOB variable in WRITE_TEXT. But it still raises a numeric or value error when the data size is above 8K.
I have read that chunked transfer encoding will work, but I couldn't find out how this is done.
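A minimal sketch of the chunked-transfer approach, assuming l_clob already holds the payload and that the URL and content type below are placeholders: set Transfer-Encoding to chunked instead of Content-Length, then stream the CLOB through WRITE_TEXT in pieces that stay well under the 32767-byte limit:

DECLARE
  l_clob   CLOB;               -- assumed to be filled elsewhere
  l_req    UTL_HTTP.REQ;
  l_resp   UTL_HTTP.RESP;
  l_offset PLS_INTEGER := 1;
  l_amount PLS_INTEGER := 8000;  -- conservative chunk size
  l_len    PLS_INTEGER;
BEGIN
  l_len := DBMS_LOB.GETLENGTH(l_clob);

  l_req := UTL_HTTP.BEGIN_REQUEST('http://example.com/service', 'POST', 'HTTP/1.1');
  UTL_HTTP.SET_HEADER(l_req, 'Content-Type', 'text/xml');
  -- chunked encoding means no Content-Length header is needed
  UTL_HTTP.SET_HEADER(l_req, 'Transfer-Encoding', 'chunked');

  WHILE l_offset <= l_len LOOP
    UTL_HTTP.WRITE_TEXT(l_req, DBMS_LOB.SUBSTR(l_clob, l_amount, l_offset));
    l_offset := l_offset + l_amount;
  END LOOP;

  l_resp := UTL_HTTP.GET_RESPONSE(l_req);
  UTL_HTTP.END_RESPONSE(l_resp);
END;
/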
I have two different database servers where I need to migrate table data from one schema to another schema in batches, for example 100 rows at a time. I used BULK COLLECT with LIMIT, but when accessing the BLOB data from the table I get errors. What could be another approach to do the same?
Here pc_work is a table containing BLOB data in the source schema. I am fetching data from this table into table t1_test_work over a database link, but it is not working:
declare
  type array is table of test_work%ROWTYPE;
  L_DATA array;
  cursor C is select * from pc_work@prpctrg;
begin
  open C;
  LOOP
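Fetching BLOB locators across a database link into local PL/SQL collections is usually what fails here. A minimal sketch of the common workaround, assuming the batch can be expressed in the WHERE clause (the batching key work_id is hypothetical): push the rows with a plain INSERT ... SELECT over the link, which is allowed to copy LOB columns, and repeat per batch:

-- copy one batch of rows, including the BLOB column, without fetching
-- LOB locators into PL/SQL collections
INSERT INTO t1_test_work
SELECT *
FROM   pc_work@prpctrg s
WHERE  s.work_id BETWEEN :batch_start AND :batch_end;   -- hypothetical batching key

COMMIT;

Repeating this per key range gives the same batch-at-a-time behaviour without touching the remote LOB locators from PL/SQL.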
I have a table containing a BLOB column which stores scanned images. Due to an application error, a few extra bytes were padded into the BLOB data and now we want to remove them.
The table row count will be near 10 million rows.
We need to remove the data from byte 161 to byte 167 of the BLOB data. I tried to do it with DBMS_LOB.ERASE, but that only overwrites the range with filler bytes instead of shrinking the LOB. Here we need to actually reduce the size of the BLOB by 6 bytes by removing the data from byte 161 to byte 167.
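ERASE never shrinks a LOB, so the usual approach is to rebuild each value: copy the bytes before the bad range and the bytes after it into a temporary BLOB and write that back. A minimal sketch, assuming a hypothetical table scanned_docs(doc_id, img) and assuming the six bytes to drop sit at positions 161-166 (adjust the offsets to the exact range); on SecureFiles, DBMS_LOB.FRAGMENT_DELETE is a simpler alternative:

DECLARE
  l_new BLOB;
  l_len INTEGER;
BEGIN
  FOR r IN (SELECT doc_id, img FROM scanned_docs) LOOP
    l_len := DBMS_LOB.GETLENGTH(r.img);

    DBMS_LOB.CREATETEMPORARY(l_new, TRUE);
    -- bytes 1..160 stay as they are
    DBMS_LOB.COPY(l_new, r.img, 160, 1, 1);
    -- skip the 6 padded bytes and append the rest, starting at source byte 167
    DBMS_LOB.COPY(l_new, r.img, l_len - 166, 161, 167);

    UPDATE scanned_docs SET img = l_new WHERE doc_id = r.doc_id;
    DBMS_LOB.FREETEMPORARY(l_new);
  END LOOP;
  COMMIT;
END;
/

For 10 million rows this would normally be run in batches with periodic commits rather than in one pass.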
I am very new to Oracle and SQL. I am trying to create a stored procedure that will copy 14 days of data into a table and then truncate the original table. When I compile the following code...
CREATE OR REPLACE PROCEDURE STOPROC_TRUNCATE ( dateNum IN NUMBER )
IS
BEGIN
  create table AUDIT_14Days as
    select * from AUDIT where TIMESTAMP >= (SYSDATE - dateNum);
  truncate table AUDIT drop storage;
[code]....
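DDL such as CREATE TABLE and TRUNCATE cannot be written as static statements inside PL/SQL, which is why this does not compile; it has to go through dynamic SQL. A minimal sketch of the same idea with EXECUTE IMMEDIATE, keeping the names from the post (note that AUDIT is a reserved word and may need quoting in a real schema):

CREATE OR REPLACE PROCEDURE stoproc_truncate (dateNum IN NUMBER) IS
BEGIN
  -- DDL inside PL/SQL must be issued dynamically
  EXECUTE IMMEDIATE
    'CREATE TABLE audit_14days AS
       SELECT * FROM audit WHERE timestamp >= (SYSDATE - ' || dateNum || ')';

  EXECUTE IMMEDIATE 'TRUNCATE TABLE audit DROP STORAGE';
END;
/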
Our application uses two instances: one for the live active data and the other for the reports data. We have a process which moves the data from the live instance to the reports instance every night. In a single-database environment the process works without any issues. However, when we move to the RAC environment, the large table in the reports database gets locked during the insert and we are unable to insert data into the reports database.
What we are performing is:
Insert into my_table_rpt select * from may_table_live@db_link_to_livedb;
Issues:
my_table_rpt gets locked
We have found a workaround: disable locking on the destination table and, after the insert, enable locking again.
ALTER TABLE my_table_rpt DISABLE TABLE LOCK;
Insert the data into the reports database table
Then
ALTER TABLE my_table_rpt ENABLE TABLE LOCK
Question:
Why does the large destination table (my_table_rpt) get locked in the RAC environment?
I have a PL/SQL procedure which gathers data from multiple places as well as calculates some data. I want to store all of this in a materialized view. So, I created an object type (I've shortened the definitions):
CREATE OR REPLACE TYPE mf_record_type AS OBJECT (identifier VARCHAR2(6), name VARCHAR2(100));
Then created the table type of the object:
CREATE OR REPLACE TYPE mf_table_type IS TABLE OF mf_record_type;
Then, in the stored procedure, I defined a variable of the table type:
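The post breaks off here; a minimal sketch of what such a declaration and use might look like, with placeholder values (everything beyond the two type names is hypothetical):

DECLARE
  l_data mf_table_type := mf_table_type();
  l_cnt  PLS_INTEGER;
BEGIN
  -- gather/calculate rows (placeholder values)
  l_data.EXTEND;
  l_data(l_data.COUNT) := mf_record_type('ABC123', 'Example name');

  -- because mf_table_type is a schema-level type, the collection can be
  -- queried from SQL, e.g. to load a staging table behind the materialized view
  SELECT COUNT(*) INTO l_cnt FROM TABLE(l_data);
  DBMS_OUTPUT.PUT_LINE('rows gathered: ' || l_cnt);
END;
/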
I have a data entry form which is a multi-record block.
Question: For example, the form has 10 to 25 fields or columns. Once all the data has been entered, but before committing or saving the form, I need to cross-check the data with a SELECT query to see whether the entered data is correct. Before committing, the data should be posted to the table; if I find that some data was entered wrongly, I will modify it, cross-check again, and only then save the transaction permanently into the database table. How can I do this?
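A minimal sketch of how this is often done in Forms, assuming a KEY-COMMIT trigger is acceptable: POST the pending records so a validation query in the same session can see them, run the check, and only COMMIT_FORM when the check passes (the block, table, and validation query below are hypothetical):

-- KEY-COMMIT trigger (hypothetical validation)
DECLARE
  l_bad_rows NUMBER;
BEGIN
  POST;  -- writes the form's records to the table without committing

  SELECT COUNT(*)
  INTO   l_bad_rows
  FROM   orders_tab
  WHERE  order_qty <= 0;     -- example cross-check query

  IF l_bad_rows = 0 THEN
    COMMIT_FORM;
  ELSE
    MESSAGE('Validation failed: correct the data and save again.');
    -- leave the transaction open; the user can fix the data and re-save
  END IF;
END;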
We have data archive scripts; these scripts move data for a date range to a different table. The script has two parts: first, copy data from the original table to the archive table; second, delete the copied rows from the original table. The first part executes very fast, but the deletion is taking too long, i.e. around 2-3 hours. The customer analysed the delete query and says the script is not using an index and is doing a full table scan, but the predicate itself is the primary key. More info below.
Plan hash value: 2798378986

-------------------------------------------------------------------------------------
| Id  | Operation               | Name       | Rows  | Bytes | Cost (%CPU)| Time     |
-------------------------------------------------------------------------------------
|   0 | DELETE STATEMENT        |            |  2520 |  233K |    87   (2)| 00:00:02 |
|   1 |  DELETE                 | MON_TXNS   |       |       |            |          |
|*  2 |   HASH JOIN RIGHT SEMI  |            |  2520 |  233K |    87   (2)| 00:00:02 |
|   3 |    INDEX FAST FULL SCAN | OTW_ID_TXN |  2520 | 15120 |     3   (0)| 00:00:01 |
|   4 |    TABLE ACCESS FULL    | MON_TXNS   | 14260 | 1239K |    83   (0)| 00:00:02 |
-------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------
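For context, a hedged sketch of how index use could be compared here, assuming the delete looks roughly like the one below and the primary-key index is named mon_txns_pk (the statement, index name, and subquery table are all hypothetical); whether the full scan is really the problem should be confirmed first, since for a few thousand rows the hash semi-join is often the cheaper plan:

-- hypothetical shape of the archive delete, with an index hint to compare plans
DELETE /*+ INDEX(t mon_txns_pk) */
FROM   mon_txns t
WHERE  t.txn_id IN (SELECT w.txn_id FROM archived_txn_ids w);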
I have a report with a single row having a large number of columns. I have to use a scroll bar to see all the columns. Is it possible to design the report in the format below (half of the columns on one side of the page, half on the other side of the page)?
Column1   Data    Column11  Data
Column2   Data    Column12  Data
Column3   Data    Column13  Data
Column4   Data    Column14  Data
Column5   Data    Column15  Data
Column6   Data    Column16  Data
Column7   Data    Column17  Data
Column8   Data    Column18  Data
Column9   Data    Column19  Data
Column10  Data    Column20  Data

I am using APEX 4.2.3 on Oracle 11g XE.