I have a table with a CLOB field. The data in this field is a string containing transaction record data. I want to read the data from this CLOB field and insert separate records into other Oracle tables.
Example: the data in the CLOB field will be:

H|12|1233|fff|sss
L|1234|gggg|tttt|yyyyyy|rrrrr
L|1094|gggg|tttt|yyyyyy|rrrrr
L|1344|gggg|tttt|yyyyyy|rrrrr
L|1666|gggg|tttt|yyyyyy|rrrrr
L|188|gggg|tttt|yyyyyy|rrrrr
I have one master table and one detail table. I want to insert the record H|12|1233|fff|sss into the master table and the five L|... records into the detail table.
The end result of the exercise: 1 record in the master (header) table and 5 records in the detail table.
While reading data from a collection variable using a ref cursor, I am getting the two errors below.
PLS-00382: expression is of wrong type
ORA-22905: cannot access rows from a non-nested table item
CREATE OR REPLACE PACKAGE APPS_GLOBAL.GIIOMEGAORDERLIST AS
  TYPE BU_LIST_TYPE IS TABLE OF VARCHAR2(50);
  TYPE OFFER_DETAIL IS RECORD (
    GII_BU    VARCHAR2(50),
    GII_OFFER NUMBER,
[code]........
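For what it's worth, PLS-00382/ORA-22905 on 10g commonly means SQL's TABLE() operator is being applied to a collection type declared inside a PL/SQL package; SQL can only see types created at schema level (CREATE TYPE bu_list_type AS TABLE OF VARCHAR2(50);). For the CLOB split itself, here is a minimal sketch, assuming the records are newline-separated and using hypothetical names (src_tab.clob_col, master_tab, detail_tab):

DECLARE
  l_clob CLOB;
  l_line VARCHAR2(4000);
  l_pos  PLS_INTEGER := 1;
  l_next PLS_INTEGER;
BEGIN
  SELECT clob_col INTO l_clob FROM src_tab WHERE src_id = 1;  -- hypothetical source

  LOOP
    EXIT WHEN l_pos > DBMS_LOB.GETLENGTH(l_clob);
    l_next := DBMS_LOB.INSTR(l_clob, CHR(10), l_pos);
    IF l_next = 0 THEN  -- last line, no trailing newline
      l_line := DBMS_LOB.SUBSTR(l_clob, 4000, l_pos);
      l_pos  := DBMS_LOB.GETLENGTH(l_clob) + 1;
    ELSE
      l_line := DBMS_LOB.SUBSTR(l_clob, l_next - l_pos, l_pos);
      l_pos  := l_next + 1;
    END IF;

    IF l_line LIKE 'H|%' THEN  -- header record goes to the master table
      INSERT INTO master_tab (f1, f2, f3, f4)
      VALUES (REGEXP_SUBSTR(l_line, '[^|]+', 1, 2),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 3),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 4),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 5));
    ELSIF l_line LIKE 'L|%' THEN  -- line records go to the detail table
      INSERT INTO detail_tab (f1, f2, f3, f4, f5)
      VALUES (REGEXP_SUBSTR(l_line, '[^|]+', 1, 2),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 3),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 4),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 5),
              REGEXP_SUBSTR(l_line, '[^|]+', 1, 6));
    END IF;
  END LOOP;
  COMMIT;
END;
/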
Using Oracle 10gR2, I'm trying to create a context index using CTX_DDL attributes.
begin
  CTX_DDL.CREATE_PREFERENCE('LEXER_SINTILDES', 'BASIC_LEXER');
  CTX_DDL.SET_ATTRIBUTE('LEXER_SINTILDES', 'BASE_LETTER', 'YES');
end;
/
drop index se.INDEX_PRONUMJNE_CTX;
CREATE INDEX INDEX_PRONUMJNE_CTX on TBL_PRONUM (MyBlobColumn)
  INDEXTYPE IS CTXSYS.CONTEXT
  parameters ('sync (on commit) LEXER LEXER_SINTILDES');

However, I got these errors:

ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-20000: Oracle Text error:
DRG-10700: preference does not exist: LEXER_SINTILDES
ORA-06512: at "CTXSYS.DRUE", line 160
ORA-06512: at "CTXSYS.TEXTINDEXMETHODS", line 364
Is it possible to create this kind of index on BLOB fields, or just on VARCHAR2 fields?
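For what it's worth, DRG-10700 usually means the index is being created by a different user (or in a different session after a failed block) than the one that owns the preference, since CTX preferences are per owner. A minimal sketch of the checks, assuming the preference was created by user SE; either recreate it as the index owner or owner-qualify it in the parameters string. On the BLOB question: as far as I know, CONTEXT indexes do work on BLOB columns, provided a filter (e.g. the default INSO/AUTO filter) can extract text from the binary content.

-- Check who actually owns the preference:
SELECT pre_owner, pre_name
FROM   ctx_preferences
WHERE  pre_name = 'LEXER_SINTILDES';

-- Then either recreate it as the index owner, or qualify it:
CREATE INDEX INDEX_PRONUMJNE_CTX ON TBL_PRONUM (MyBlobColumn)
  INDEXTYPE IS CTXSYS.CONTEXT
  PARAMETERS ('LEXER SE.LEXER_SINTILDES SYNC (ON COMMIT)');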
I have a DB field "image" of type BLOB, and I want to save its contents to a file in a local folder, e.g. C:\example.bmp.
I found numerous examples
BEGIN
  -- get the LOB locator
  SELECT image INTO l_blob
  FROM pc_immagini_blob
  WHERE code = wci;
  -- open the destination file ('wb' for binary content)
  l_file := UTL_FILE.fopen('C:\Temp', 'EXAMPLE.BMP', 'wb', 32767);  -----> error
The DB is not local but on an application server; the image must be saved to C:\ on the client.
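Note that UTL_FILE can only write on the database server's file system, never on the client's C:\, which is likely the root of the error: 'C:\Temp' would have to exist on the server and be exposed via a DIRECTORY object (or utl_file_dir). Here is a minimal server-side sketch, assuming a DIRECTORY object IMG_DIR created by a DBA; to land the file on the client you need a client-side tool instead (SQL*Plus, Java, etc.):

-- Assumes: CREATE DIRECTORY img_dir AS '/u01/app/oracle/export'; (a server path)
DECLARE
  l_blob   BLOB;
  l_file   UTL_FILE.FILE_TYPE;
  l_pos    PLS_INTEGER := 1;
  l_amount PLS_INTEGER;
  l_buf    RAW(32767);
  l_len    PLS_INTEGER;
BEGIN
  SELECT image INTO l_blob FROM pc_immagini_blob WHERE code = wci;  -- hypothetical filter
  l_len  := DBMS_LOB.GETLENGTH(l_blob);
  l_file := UTL_FILE.FOPEN('IMG_DIR', 'example.bmp', 'wb', 32767);
  WHILE l_pos <= l_len LOOP
    l_amount := LEAST(32767, l_len - l_pos + 1);
    DBMS_LOB.READ(l_blob, l_amount, l_pos, l_buf);   -- read a chunk of the BLOB
    UTL_FILE.PUT_RAW(l_file, l_buf, TRUE);           -- write it, autoflush
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/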
The SQL below successfully inserts a row into my PDF_TEMPLT table and reads the referenced pdf file into the TEMPLT field. However, the pdf stored in the blob is incomplete or somehow corrupted. It's 438655 bytes long and causes the application that uses it from the database to crash. If I load the same file into the blob field using Quest Software's Toad GUI, it's 438667 bytes (12 bytes longer), and the consuming application works fine. I have the same problem with other pdfs, too, though the difference in length varies from 2 to 17 bytes, with the SQL-loaded blob always being shorter.
Why would a blob loaded by this SQL differ from one loaded via Toad, and what changes would I need to make to this SQL to get it to work properly?
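I can't be sure without seeing the SQL, but a byte count that shrinks by a handful of bytes is the classic signature of character-set or newline conversion, i.e. the file went through a CLOB or a text-mode read somewhere along the way. Loading the BLOB from a BFILE with DBMS_LOB.LOADBLOBFROMFILE copies the bytes verbatim; a minimal sketch, assuming a DIRECTORY object PDF_DIR and the file name template.pdf:

DECLARE
  l_bfile BFILE := BFILENAME('PDF_DIR', 'template.pdf');  -- assumed directory/file
  l_blob  BLOB;
  l_dest  INTEGER := 1;
  l_src   INTEGER := 1;
BEGIN
  INSERT INTO pdf_templt (templt) VALUES (EMPTY_BLOB())
  RETURNING templt INTO l_blob;

  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADBLOBFROMFILE(
    dest_lob    => l_blob,
    src_bfile   => l_bfile,
    amount      => DBMS_LOB.LOBMAXSIZE,   -- copy everything, byte for byte
    dest_offset => l_dest,
    src_offset  => l_src);
  DBMS_LOB.CLOSE(l_bfile);
  COMMIT;
END;
/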
I have a table in my schema with a BLOB field in which I store each employee's picture. The fields in this table are emp_id (NUMBER), emp_name (VARCHAR2) and emp_photo (BLOB). Is there a way in PL/SQL to empty this BLOB field (set it to NULL or reset it) so that the user can replace the saved photograph with another one?
What I am looking for is something like:
alter table employee set emp_photo = empty_blob()
or
alter table employee set emp_photo = null
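For what it's worth, resetting a LOB column is a plain UPDATE rather than an ALTER TABLE; a minimal sketch (emp_id 123 is just an example):

UPDATE employee SET emp_photo = NULL         WHERE emp_id = 123;  -- clear to NULL
UPDATE employee SET emp_photo = EMPTY_BLOB() WHERE emp_id = 123;  -- or keep a zero-length locator
COMMIT;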
I'm trying to find out how to export data from a table to Excel file format and save the result to a BLOB field in another table. I know how to download a report from a page on submit, but I need to process the data and, instead of returning the result to the user as an Excel file, save it in a BLOB.
I also found a Java implementation for this, but what I actually want to work out is: can this be solved with PL/SQL and APEX API methods alone?
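One PL/SQL-only route is to build the export as CSV (which Excel opens natively) in a CLOB and convert it to a BLOB for storage; a minimal sketch, assuming a hypothetical source table emp and target table report_files(file_data BLOB). True .xlsx output would need either a third-party package or the export APIs shipped in newer APEX releases:

DECLARE
  l_csv  CLOB;
  l_blob BLOB;
  l_line VARCHAR2(4000);
  l_dest INTEGER := 1;
  l_src  INTEGER := 1;
  l_lang INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_csv, TRUE);
  l_line := 'EMPNO,ENAME' || CHR(10);
  DBMS_LOB.WRITEAPPEND(l_csv, LENGTH(l_line), l_line);

  FOR r IN (SELECT empno, ename FROM emp) LOOP
    l_line := r.empno || ',' || r.ename || CHR(10);
    DBMS_LOB.WRITEAPPEND(l_csv, LENGTH(l_line), l_line);
  END LOOP;

  -- character CLOB -> binary BLOB in the database character set
  DBMS_LOB.CREATETEMPORARY(l_blob, TRUE);
  DBMS_LOB.CONVERTTOBLOB(l_blob, l_csv, DBMS_LOB.LOBMAXSIZE,
                         l_dest, l_src,
                         DBMS_LOB.DEFAULT_CSID, l_lang, l_warn);

  INSERT INTO report_files (file_data) VALUES (l_blob);
  COMMIT;
END;
/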
We are facing slow performance in the production system and getting lots of complaints from clients about "resource busy" or system-hang issues. We find locked objects in the system, mostly held by inactive sessions, so we kill them manually and the problem gets resolved.
We have some observations from the AWR reports, as below:
1) Instance Efficiency Percentages (Target 100%)
   Execute to Parse %:          -26.96
   Parse CPU to Parse Elapsd %:   4.83

2) Total time in database user-calls (DB Time): 55700.6s

3) TCP_RECEIVE_SIZE_DEFAULT  16,384
   TCP_RECEIVE_SIZE_MAX      9,223,372,036,854,775,807
   TCP_RECEIVE_SIZE_MIN      4,096
   TCP_SEND_SIZE_DEFAULT     16,384
   TCP_SEND_SIZE_MAX         9,223,372,036,854,775,807
   TCP_SEND_SIZE_MIN         4,096

4) Wait Class  Waits       %Time-outs  Total Wait Time (s)  Avg wait (ms)  %DB time
   Network     45,423,119  0           2,122                0              3.81

5) Event                      Waits       %Time-outs  Total Wait Time (s)  Avg wait (ms)  Waits/txn  %DB time
   SQL*Net message to client  45,030,511  0           55                   0              127.77     0.10
The top five events are:
db file sequential read
enq: TX - row lock contention
DB CPU
virtual circuit wait
read by other session
Is there any Oracle doc ID or other document that can be referred to for a better understanding of AWR reports, and that gives the resolution steps to be taken for the various top timed events?
I'm dealing with an ORA-1000 error in a Pro*C application where all the cursors are correctly closed (or so it seems to me).
Here is the code for a simple program which reproduces the problem:
Each cursor is opened in a PL/SQL package:
CREATE OR REPLACE PACKAGE emp_demo_pkg AS
  TYPE emp_cur_type IS REF CURSOR;
  PROCEDURE open_cur(curs IN OUT emp_cur_type, dept_num IN NUMBER);
END emp_demo_pkg;
[Code]....
While testing, the initialization parameter OPEN_CURSORS is set to 50.
It's my understanding that Oracle doesn't close the cursors until it needs the space for another cursor, which in my test case seems to happen when I enter a value of 50 or bigger for "number of loops". To see how oracle is reusing the cursors, while the test program is running I run SQL*Plus and query v$sesstat for the session that's running the test with the following sentence:
select name, value from v$sesstat s, v$statname n where s.statistic# = n.statistic# and sid = 7 and name like '%cursor%';
Even before I enter a value for number of loops I can see that the session opened 4 cursors and closed 2 of them:
NAME                          VALUE
----------------------------- -----
opened cursors cumulative         4
opened cursors current            2
Entering a value of 5 for number of loops yields
NAME                          VALUE
----------------------------- -----
opened cursors cumulative        11   <----- 7+
opened cursors current            8   <----- 6+
With a value of 30
NAME                          VALUE
----------------------------- -----
opened cursors cumulative        36   <----- 25+ (apparently, Oracle reused at least 5 cursors)
opened cursors current           33   <----- 25+
With a value of 47
NAME                          VALUE
----------------------------- -----
opened cursors cumulative        53   <----- 17+
opened cursors current           50   <----- 17+
Now I reached the upper limit set by the initialization parameter open_cursors.
Entering a value of 48, I get the ORA-1000 error.
ORA-01000: maximum open cursors exceeded ORA-06512: at "SCOTT.EMP_DEMO
Since I open and close the cursor in the same loop iteration, I expect to find in every iteration 1 explicit cursor and a number of implicit cursors (the PL/SQL call along with the so-called recursive cursors), but I don't expect the sum of all of them to be greater than 50. If my understanding is correct, Oracle should be reusing the 50 cursors previously marked as "closeable", not raising the ORA-1000 error.
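One way to see what the session is actually holding open, rather than the aggregate counters, is v$open_cursor; a hedged diagnostic, using the same SID 7 as above. The statements that pile up there (for example, the PL/SQL call itself, held by the PL/SQL cursor cache) usually point to who is really consuming the slots:

SELECT sql_text, COUNT(*) AS occurrences
FROM   v$open_cursor
WHERE  sid = 7                 -- the test session's SID
GROUP  BY sql_text
ORDER  BY occurrences DESC;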
I am looking for a code/script to read values from an Excel file and run a PL/SQL script with them.
I have a PL/SQL script that generates a report and takes two values, which I have to change every time to generate a new report. All I want to do is create a script that reads those values from an Excel file and runs the other script.
I have been searching for a long time and only found the UTL_FILE package, which requires CREATE OR REPLACE FUNCTION and creating some virtual table.
The problem is I don't have CREATE authorization in the database, so I am not able to use the UTL_FILE approach. Is there any simple way to read values from an Excel file?
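One workaround that needs no CREATE privilege at all: save the sheet as CSV on the client and drive the existing script from SQL*Plus with positional substitution variables. A minimal sketch, assuming your report script is report.sql and takes the two values as &1 and &2 (the table and column names are placeholders):

-- report.sql: your existing report, parameterized with &1 and &2
SELECT *
FROM   sales_report
WHERE  region = '&1'
AND    period = '&2';

-- invoked once per CSV row from the client, e.g.:
-- sqlplus -s user/pwd@db @report.sql NORTH 2024Q1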
I'm trying to use UTL_FILE to read a txt file and import the data into a table in Oracle. I've read various forums and researched a lot on the Oracle documentation site and on the internet, but cannot find the answer to the problem.
The source follows:
Set serveroutput on

DECLARE
  arquivo_ler UTL_FILE.FILE_TYPE;
  linha       VARCHAR2(1000);
BEGIN
  arquivo_ler := UTL_FILE.FOPEN('INTRANET_LOAD', 'carga_intranet.txt', 'R', 32767);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(arquivo_ler, linha);
      DBMS_OUTPUT.PUT_LINE(linha);
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;  -- end of file
  END;
  UTL_FILE.FCLOSE(arquivo_ler);
  DBMS_OUTPUT.PUT_LINE('File processed with success.');
END;
/
The errors:
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 633
ORA-29283: invalid file operation
ORA-06512: at line 5
What has been done:
Created the DIRECTORY (INTRANET_LOAD) and granted READ, WRITE on it to the user.
On the Linux server where Oracle is installed, the oracle user was given full access to the folder /u01/app/oracle/product/11.2.0/db_1/adp.
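ORA-29283 on FOPEN almost always comes down to the path or OS permissions as seen by the oracle OS user on the server. Two quick checks worth running (remember the directory name must be passed to FOPEN in uppercase, and the path itself is case-sensitive on Linux):

-- Does the directory object point where you think it does?
SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'INTRANET_LOAD';

-- Were the grants actually made to this user?
SELECT privilege, table_name
FROM   user_tab_privs
WHERE  table_name = 'INTRANET_LOAD';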
I'm planning to do single sign-on in one of my applications. To get to my application, the user has to log in to a portal application first. From this application I receive 2 HTTP headers to verify the person has authenticated successfully. But how can I read these 2 headers? It doesn't work with the UTL_HTTP package, because I don't make an active request to a URL.
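If the application runs behind mod_plsql or the embedded PL/SQL gateway, the headers of the incoming request (as opposed to one you initiate with UTL_HTTP) are exposed as CGI environment variables and can be read with OWA_UTIL.GET_CGI_ENV. A minimal sketch, assuming two hypothetical header names X-Remote-User and X-Auth-Token:

DECLARE
  l_user  VARCHAR2(200);
  l_token VARCHAR2(200);
BEGIN
  -- a header X-Foo-Bar arrives as the CGI variable HTTP_X_FOO_BAR
  l_user  := OWA_UTIL.GET_CGI_ENV('HTTP_X_REMOTE_USER');
  l_token := OWA_UTIL.GET_CGI_ENV('HTTP_X_AUTH_TOKEN');
  HTP.P('user='  || l_user);
  HTP.P('token=' || l_token);
END;
/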
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
PL/SQL Release 11.1.0.6.0 - Production
CORE 11.1.0.6.0 Production
I have an .xls file which has a few columns with Korean data, and I have a problem converting it into .csv. I converted it to Unicode text, which was tab-separated, and replaced the tabs with commas, but still saved it as a .txt file. Now I am using the UTL_FILE package to open and read it.
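A likely snag: Excel's "Unicode Text" export is UTF-16, while plain UTL_FILE.FOPEN/GET_LINE read in the database character set, which would mangle the Korean columns. One hedged route is to save the file as UTF-8 instead and read it with UTL_FILE's NCHAR variants (my understanding is that FOPEN_NCHAR expects UTF-8 content); the directory and file names below are assumptions:

DECLARE
  f UTL_FILE.FILE_TYPE;
  l NVARCHAR2(2000);
BEGIN
  f := UTL_FILE.FOPEN_NCHAR('DATA_DIR', 'korean_data.txt', 'R', 32767);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE_NCHAR(f, l);   -- reads the line as Unicode
      DBMS_OUTPUT.PUT_LINE(TO_CHAR(l));
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;      -- end of file
  END;
  UTL_FILE.FCLOSE(f);
END;
/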
A .doc file is stored in the database with data type BLOB. To read this file I have written the following procedure:
create or replace procedure XX_read_blobfile1 as
  b BLOB;
  c CLOB;
  n NUMBER;
begin
  SELECT file_data INTO b FROM fnd_lobs WHERE file_id = 322420;
  if (b is null) then
[code]......
The procedure works, but the data displays with some boxes before and after the text.
I am working on loading data from a flat file into a table; the validation conditions given are listed below (a sketch follows the list). I checked the UTL_FILE built-in package but am not able to figure out how to identify the column header in the flat file.
1. Skip the header, if any. The header is the first record, and starts with '000'.
2. Skip the trailer, if any. The trailer is the last record, and starts with '999'.
3. Log an error, but continue, if a line exceeds 512 characters.
4. Log an error, but continue, if a line is blank.
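A minimal sketch of the read loop, assuming a DIRECTORY object DATA_DIR, a file data.txt and a hypothetical log_error procedure; the header is recognizable purely by position (first record) plus the '000' prefix:

DECLARE
  f      UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(4000);
  l_no   PLS_INTEGER := 0;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'data.txt', 'R', 4000);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(f, l_line);
      l_no := l_no + 1;
      IF l_no = 1 AND l_line LIKE '000%' THEN
        NULL;                                             -- rule 1: skip header
      ELSIF l_line LIKE '999%' THEN
        NULL;                                             -- rule 2: skip trailer
      ELSIF LENGTH(l_line) > 512 THEN
        log_error(l_no, 'line exceeds 512 characters');   -- rule 3
      ELSIF l_line IS NULL THEN
        log_error(l_no, 'blank line');                    -- rule 4
      ELSE
        NULL;  -- parse the record and INSERT here
      END IF;
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;  -- end of file
  END;
  UTL_FILE.FCLOSE(f);
END;
/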
I have a table where I need to update one field's value based on another field of the same table, as simply as that. I do this using one "select all" checkbox: on clicking it, all checkboxes of the ITEM_TRANS block get selected; I then unselect some of them, and with a button I update the values of only the records that remain checked.
I have included the sample code, but when I update, it takes a long time and hangs. I am also attaching the form based on the test case provided.
insert into item_trans (TRANS_ITEM, TRANS_QTY, TRANS_ACT_QTY) values ('TREE1', 40, NULL);
insert into item_trans (TRANS_ITEM, TRANS_QTY, TRANS_ACT_QTY) values ('TREE2', 20, NULL);
insert into item_trans (TRANS_ITEM, TRANS_QTY, TRANS_ACT_QTY) values ('TREE3', 20, NULL);
-- I want to set the value of TRANS_ACT_QTY to TRANS_QTY.
-- I created one dummy/test block to hold the "select all" checkbox; the table script for it is:
CREATE TABLE TEST ( C VARCHAR2(2000 BYTE), B NUMBER, A NUMBER );
insert into test (C,B,A) values ('A',1,1);
-- code written in the "select all" checkbox, which is created on the TEST block:
BEGIN
  GO_BLOCK('item_trans');
  FIRST_RECORD;
  LOOP
    :M_END_YN := :M_END_ALL;
[code].......
-- code written in M_END_YN (the actual checkboxes that I will uncheck):

IF :M_END_YN = 'N' THEN
  :M_END_ALL := 'N';
END IF;
-- code written on the button to update the values that are checked:

BEGIN
  GO_BLOCK('item_trans');
  FIRST_RECORD;
  LOOP
    IF :M_END_YN = 'Y' THEN
[code]......
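The hang is most likely an infinite loop: the block loops over records, but the visible code never calls NEXT_RECORD or exits on the last record, so it keeps processing the same row forever. A hedged rewrite of the button code, assuming the checkbox item is M_END_YN in the ITEM_TRANS block:

BEGIN
  GO_BLOCK('item_trans');
  FIRST_RECORD;
  LOOP
    IF :item_trans.m_end_yn = 'Y' THEN
      :item_trans.trans_act_qty := :item_trans.trans_qty;
    END IF;
    EXIT WHEN :SYSTEM.LAST_RECORD = 'TRUE';  -- prevents the endless loop
    NEXT_RECORD;
  END LOOP;
  FIRST_RECORD;
  COMMIT_FORM;
END;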
What I want to do:
- define a cursor with bind variables
- get a cursor record from this cursor
- pass the bind variable in the OPEN clause

I didn't succeed, as shown in the example.
SET SERVEROUTPUT ON SIZE 900000;

DECLARE
  -- works fine
  CURSOR c1 IS SELECT * FROM USER_TABLES WHERE rownum < 3;
  -- doesn't work
  -- CURSOR c1 IS SELECT * FROM USER_TABLES WHERE rownum < :1;
  crec c1%rowtype;
BEGIN
  -- works fine
  OPEN c1;
  -- isn't possible?
  -- OPEN c1 USING 3;
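Static cursors don't take bind placeholders; they take cursor parameters, and the USING clause only applies to ref cursors opened with dynamic SQL. Both variants in a minimal, self-contained sketch:

DECLARE
  -- Variant 1: a parameterized explicit cursor (the static-SQL way)
  CURSOR c1 (p_max IN NUMBER) IS
    SELECT * FROM user_tables WHERE ROWNUM < p_max;
  crec c1%ROWTYPE;

  -- Variant 2: a ref cursor opened with dynamic SQL and a real bind
  rc   SYS_REFCURSOR;
  rrec user_tables%ROWTYPE;
BEGIN
  OPEN c1(3);
  FETCH c1 INTO crec;
  DBMS_OUTPUT.PUT_LINE('c1: ' || crec.table_name);
  CLOSE c1;

  OPEN rc FOR 'SELECT * FROM user_tables WHERE ROWNUM < :1' USING 3;
  FETCH rc INTO rrec;
  DBMS_OUTPUT.PUT_LINE('rc: ' || rrec.table_name);
  CLOSE rc;
END;
/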
LPAD is not behaving as expected. The main thing I'm trying to accomplish is reading values from one table and inserting them into another, but in the "other" table they need to be inserted as 11 characters long, with leading zeros. In its most basic form, this is the cursor I'm using:
DECLARE
  CURSOR update_mpi_cur IS
    select distinct A.epn_nbr, A.mrn_nbr, B.mpi_nbr
    from table1 A, table2 B
    where B.external_id = A.epn_nbr
    and B.identifier_type = 'EPN';
[code]....
I should have mentioned that this is Oracle 10.2.0.3 on HP-UX. Not sure if that matters for this issue or not, but I wanted to throw that out there.
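One LPAD gotcha that matches these symptoms: LPAD returns a character value, so if the target column is a NUMBER the leading zeros are stripped again on insert; the padding only survives in a VARCHAR2 column. For illustration (target_tab and mpi_padded are hypothetical names):

SELECT LPAD(1234, 11, '0') FROM dual;   -- '00000001234'

-- the receiving column must be character, e.g. mpi_padded VARCHAR2(11):
-- INSERT INTO target_tab (mpi_padded)
-- VALUES (LPAD(TO_CHAR(b.mpi_nbr), 11, '0'));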
For example, we have a table ACCOUNT (a snowflake dimension containing other dimension keys), and many fact tables are based on this dimension. Normally the data warehouse load happens with dimensions loaded first and then facts. Our load frequency is 30 minutes.
To make data available in the facts sooner (as it's a financial application), I am considering two batches, one for the dimension and another for the fact (I came to this conclusion as there is no hard dependency requiring dimensions to be loaded before facts; only an occasional update might get missed). But if I do that, while the dimension is being loaded it will be read by the fact load in another session. Will this affect performance?
In short: loading (insert/update) and selecting data from a table at the same time. Will it affect performance in any way?