PL/SQL :: Read Long Raw data And Write To A File
Sep 17, 2012
So we need a mechanism to read data from a LONG RAW column and convert it into an actual file.
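One way to approach this, as a hedged sketch (the table doc_table, column longraw_col, staging table doc_stage and directory object EXPORT_DIR are all assumptions for illustration): copy the LONG RAW into a BLOB with TO_LOB, then write the BLOB to disk with UTL_FILE in binary mode.

-- step 1: TO_LOB can only be used in INSERT ... SELECT or CTAS, so stage the data as BLOB
CREATE TABLE doc_stage AS
  SELECT id, TO_LOB(longraw_col) AS blob_col FROM doc_table;

-- step 2: write one BLOB to a file, 32 KB at a time, in binary ('wb') mode
DECLARE
  v_blob   BLOB;
  v_file   UTL_FILE.FILE_TYPE;
  v_len    PLS_INTEGER;
  v_pos    PLS_INTEGER := 1;
  v_chunk  PLS_INTEGER := 32767;
  v_buffer RAW(32767);
BEGIN
  SELECT blob_col INTO v_blob FROM doc_stage WHERE id = 1;
  v_len  := DBMS_LOB.GETLENGTH(v_blob);
  v_file := UTL_FILE.FOPEN('EXPORT_DIR', 'doc_1.bin', 'wb', 32767);
  WHILE v_pos <= v_len LOOP
    DBMS_LOB.READ(v_blob, v_chunk, v_pos, v_buffer);  -- v_chunk comes back as bytes actually read
    UTL_FILE.PUT_RAW(v_file, v_buffer, TRUE);         -- TRUE = flush after each chunk
    v_pos   := v_pos + v_chunk;
    v_chunk := 32767;
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
END;
/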
I have a table revenue
create table revenue
(
person varchar2(23),
month varchar2(3),
rev_amt number
)
and I have data in a file like below:
Person Jan Feb Mar Apr Mai Jun Jul Aug Sep Oct Nov Dez
--------------------------------------------------------
Schnyder,345,223,122,345,324,244,123,123,345,121,345,197
Weber,234,234,123,457,456,287,234,123,678,656,341,567
Keller,596,276,347,134,743,545,216,456,124,753,346,456
Meyer,987,345,645,567,834,567,789,234,678,973,456,125
Holzer,509,154,876,347,146,788,174,986,568,246,324,987
Müller,456,125,678,235,878,237,567,237,788,237,324,778
Binggeli,487,347,458,347,235,864,689,235,764,964,624,347
Stoller,596,237,976,876,346,567,126,879,125,568,124,753
Marty,094,234,235,763,054,567,237,457,325,753,577,346
Studer,784,567,235,753,124,575,864,235,753,864,634,678
I want to load it into the table in the following way:
Person Month Revenue
-------------------------
Schnyder Jan 345
Schnyder Feb 223
Schnyder Mar 122
Schnyder Apr 345
Schnyder Mai 324
Schnyder Jun 244
Schnyder Jul 123
Schnyder Aug 123
Schnyder Sep 345
Schnyder Oct 121
Schnyder Nov 345
Schnyder Dez 197
........ ... ...
How do I write a SQL*Loader control file to load this data into the revenue table above?
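One hedged approach (the staging table revenue_stage, the input file name revenue.dat and the field sizes are assumptions, not from the original post): load the file as-is into a 13-column staging table with SQL*Loader, then unpivot it into revenue with plain SQL.

-- staging table, one column per month
CREATE TABLE revenue_stage
(person VARCHAR2(23),
 jan NUMBER, feb NUMBER, mar NUMBER, apr NUMBER, mai NUMBER, jun NUMBER,
 jul NUMBER, aug NUMBER, sep NUMBER, oct NUMBER, nov NUMBER, dez NUMBER);

-- revenue.ctl (SKIP=2 jumps over the two header lines shown in the data file)
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'revenue.dat'
APPEND
INTO TABLE revenue_stage
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(person, jan, feb, mar, apr, mai, jun, jul, aug, sep, oct, nov, dez)

-- after the load, unpivot into the target table
INSERT INTO revenue (person, month, rev_amt)
SELECT person, 'Jan', jan FROM revenue_stage UNION ALL
SELECT person, 'Feb', feb FROM revenue_stage UNION ALL
SELECT person, 'Mar', mar FROM revenue_stage UNION ALL
SELECT person, 'Apr', apr FROM revenue_stage UNION ALL
SELECT person, 'Mai', mai FROM revenue_stage UNION ALL
SELECT person, 'Jun', jun FROM revenue_stage UNION ALL
SELECT person, 'Jul', jul FROM revenue_stage UNION ALL
SELECT person, 'Aug', aug FROM revenue_stage UNION ALL
SELECT person, 'Sep', sep FROM revenue_stage UNION ALL
SELECT person, 'Oct', oct FROM revenue_stage UNION ALL
SELECT person, 'Nov', nov FROM revenue_stage UNION ALL
SELECT person, 'Dez', dez FROM revenue_stage;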
How do I write a procedure to load data into a table using XML as an input parameter to the procedure? The XML input I receive is shown below.
<?xml version="1.0"?><DiseaseCodes><Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity></DiseaseCodes>
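A hedged sketch of such a procedure (the target table DISEASE_CODES and its columns are assumptions; adjust them to your real table): pass the XML as a CLOB and shred it with XMLTABLE.

CREATE OR REPLACE PROCEDURE load_disease_codes (p_xml IN CLOB) IS
BEGIN
  INSERT INTO disease_codes (dcode, ddesc, claimid, reauthflag)
  SELECT x.dcode, x.ddesc, x.claimid, x.reauthflag
    FROM XMLTABLE('/DiseaseCodes/Entity'
           PASSING XMLTYPE(p_xml)
           COLUMNS dcode      NUMBER        PATH 'dcode',
                   ddesc      VARCHAR2(200) PATH 'ddesc',
                   claimid    VARCHAR2(20)  PATH 'claimid',   -- VARCHAR2 because the sample value carries a stray '>'
                   reauthflag NUMBER        PATH 'reauthflag') x;
END;
/
-- example call:
-- BEGIN load_disease_codes('<DiseaseCodes>...</DiseaseCodes>'); END;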
I am receiving the REP-50127 error ("cannot write to file") when trying to save a report to a file on the network via report parameters.
I am guessing the rwserver does not have permissions to the network drive.
Will the SERVER_IN_PROCESS=NO run the rwserver process as the user executing the report?
I have read the following article: [URL] 11070. I want to know whether there is a possibility of writing a CLOB or an XML document to a file on disk if we do not have the CREATE ANY DIRECTORY privilege. Many functions, like UTL_FILE.FOPEN, dbms_xslprocessor.clob2file or dbms_xmldom.writetofile, need an Oracle directory to have been created (with CREATE OR REPLACE DIRECTORY ...). But if we don't have this privilege, is there a possibility to export a CLOB into a file as XML (the CLOB contains 100% XML, but CLOB is the column data type)? The CLOB data contains 48,200 characters.
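You do not necessarily need CREATE ANY DIRECTORY yourself; it is enough if a DBA has already created a directory object and granted you READ/WRITE on it. A minimal sketch under that assumption (the directory name DATA_DIR and the source table/column are hypothetical):

-- as a DBA, once: CREATE DIRECTORY data_dir AS '/u01/export'; GRANT READ, WRITE ON DIRECTORY data_dir TO scott;
DECLARE
  v_xml CLOB;
BEGIN
  SELECT xml_col INTO v_xml FROM my_xml_table WHERE id = 1;      -- hypothetical source
  DBMS_XSLPROCESSOR.CLOB2FILE(v_xml, 'DATA_DIR', 'export.xml');  -- needs only WRITE on the directory
END;
/

Without any directory object at all, the usual fallback is to fetch the CLOB from a client tool (SQL*Plus spool, a small JDBC/OCI program) and write the file on the client side.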
In a PL/SQL program, I am writing information from one table to files. In my current architecture, I am writing the information to approximately 1,000 files.
If I put the database write operation in another package and another procedure, and call that procedure from my PL/SQL program asynchronously, can that increase performance?
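It can, but only if the file writing really is the bottleneck and can run independently of the main program. A minimal sketch of handing the work to a background job with DBMS_SCHEDULER (the package/procedure name FILE_WRITER_PKG.WRITE_ALL_FILES is an assumption):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'WRITE_FILES_JOB',
    job_type   => 'STORED_PROCEDURE',
    job_action => 'FILE_WRITER_PKG.WRITE_ALL_FILES',  -- hypothetical procedure that writes the files
    enabled    => TRUE,                                -- runs as soon as a job slot is free
    auto_drop  => TRUE);
END;
/

The calling program returns immediately; the files are written in a separate session.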
We have a database that is accessed by ArcSDE, a product to modify maps. It uses BLOBs to store those maps.
We ran a load on the server and the response time was slow. By running the following query:
select event, total_waits, time_waited, avg_ms, round(ratio_to_report(time_waited) over () * 100) percent
from (select substr(event, 1, 30) event, total_waits, time_waited, round(time_waited_micro / total_waits / 1000, 2) avg_ms
from v$system_event
where wait_class in ('System I/O') union
select 'CPU' event, NULL, value, NULL
from v$sysstat
where statistic# = 12
order by 3 desc)
where rownum <= 10;
I get
EVENT TOTAL_WAITS TIME_WAITED AVG_MS PERCENT
--------------------------- -------------- ------------- -------- ---------
control file parallel write 127187 6354909 499.65 70
CPU 988274 11
db file parallel write 20461 886442 433.23 10
log file parallel write 14987 870672 580.95 10
log archive I/O 1557 18094 116.21 0
control file single write 149 10590 710.71 0
control file sequential read 136502 5219 .38 0
log file single write 56 2511 448.41 0
log file sequential read 489 492 10.05 0
BUG: 733426 says to change the event="10359 trace name context forever, level 1"
Suppose I have a file named chk.txt and I have written 10 lines in that file, and I just want to write some text at line 5 or at the first line. How can we do this?
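UTL_FILE cannot write into the middle of an existing file, so the usual workaround is to read all the lines, then rewrite the file with the new text in the desired position. A minimal sketch, assuming a directory object FILE_DIR points at the folder holding chk.txt:

DECLARE
  f_in    UTL_FILE.FILE_TYPE;
  f_out   UTL_FILE.FILE_TYPE;
  v_line  VARCHAR2(32767);
  TYPE t_lines IS TABLE OF VARCHAR2(32767);
  v_lines t_lines := t_lines();
BEGIN
  -- read the whole file into a collection
  f_in := UTL_FILE.FOPEN('FILE_DIR', 'chk.txt', 'R');
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(f_in, v_line);
      v_lines.EXTEND;
      v_lines(v_lines.COUNT) := v_line;
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;   -- end of file reached
  END;
  UTL_FILE.FCLOSE(f_in);

  -- rewrite the file, inserting the new text before what was line 5
  f_out := UTL_FILE.FOPEN('FILE_DIR', 'chk.txt', 'W');
  FOR i IN 1 .. v_lines.COUNT LOOP
    IF i = 5 THEN
      UTL_FILE.PUT_LINE(f_out, 'text inserted at line 5');
    END IF;
    UTL_FILE.PUT_LINE(f_out, v_lines(i));
  END LOOP;
  UTL_FILE.FCLOSE(f_out);
END;
/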
How do I write the CTL file for this kind of situation?
a.txt
id name subject
12aaaHistory
23bbbScience
45cccZoology
b.txt
idlayerLayerNo
12xxx121
23yyy232
23lll233
45xxx451
45yyy452
45lll453
I have a file a.txt, which is the parent file, and a child file called b.txt. Both files are linked by a common field called "id". The interesting part is that the child file has multiple layer names associated with each id (we only know that in b.txt there can be at most 3 layers per id).
They need to be loaded into a table called PARENT_TBL, which looks like:
ID NAME SUBJECT LAYER LAYERNO
How am I going to achieve this?
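One hedged way (the staging table names are assumptions): load each file into its own staging table with SQL*Loader or external tables, then join on id and insert into PARENT_TBL.

-- assumes A_STAGE(id, name, subject) and B_STAGE(id, layer, layerno) have already been loaded
INSERT INTO parent_tbl (id, name, subject, layer, layerno)
SELECT a.id, a.name, a.subject, b.layer, b.layerno
  FROM a_stage a
  JOIN b_stage b ON b.id = a.id;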
How do I write to a text file using Oracle? And how do I handle spaces and new lines? (I was trying to use spaces via chr(32); however, they just get converted into squares in the text file.)
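A minimal UTL_FILE sketch (the directory object OUT_DIR is an assumption): PUT_LINE ends a line, NEW_LINE writes an empty line, and CHR(32) is an ordinary space. If spaces appear as squares, the character set of the viewer or editor is usually the culprit rather than UTL_FILE itself.

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('OUT_DIR', 'demo.txt', 'W');
  UTL_FILE.PUT_LINE(f, 'first' || CHR(32) || 'line');  -- a space between the two words
  UTL_FILE.NEW_LINE(f);                                -- a blank line
  UTL_FILE.PUT_LINE(f, 'third line');
  UTL_FILE.FCLOSE(f);
END;
/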
I have a file studnet.txt in the following format:
JAMs|1231|PHYSICS
SAM|1232|PHYSICS
ALI|1233|CHEMISTRY
I want to read and write data from a file in pipe-separated (|) format, doing the file READ and WRITE with Oracle Forms 6i. I know file handling in C++ but have not used it in Oracle Forms before.
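In Forms 6i the client-side file API is the TEXT_IO package. A minimal sketch (the file path and the three-field layout are assumptions based on the sample above):

DECLARE
  in_file  TEXT_IO.FILE_TYPE;
  out_file TEXT_IO.FILE_TYPE;
  v_line   VARCHAR2(2000);
  v_name   VARCHAR2(100);
  v_roll   VARCHAR2(20);
  v_subj   VARCHAR2(100);
BEGIN
  in_file  := TEXT_IO.FOPEN('c:\data\studnet.txt', 'r');
  out_file := TEXT_IO.FOPEN('c:\data\student_out.txt', 'w');
  LOOP
    TEXT_IO.GET_LINE(in_file, v_line);                    -- raises NO_DATA_FOUND at end of file
    v_name := SUBSTR(v_line, 1, INSTR(v_line, '|') - 1);
    v_roll := SUBSTR(v_line, INSTR(v_line, '|') + 1,
                     INSTR(v_line, '|', 1, 2) - INSTR(v_line, '|') - 1);
    v_subj := SUBSTR(v_line, INSTR(v_line, '|', 1, 2) + 1);
    TEXT_IO.PUT_LINE(out_file, v_name || '|' || v_roll || '|' || v_subj);
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    TEXT_IO.FCLOSE(in_file);
    TEXT_IO.FCLOSE(out_file);
END;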
I have a web service which gives me the response as a byte array. How do I write this byte array into a binary file?
I would like to connect to an Oracle database, execute a series of SQL statements, and write those SQL results into an Excel file (sheet1).
my requirement is simple :
------------------------
db_user_id scott
db_user_pwd tiger
db_sid orcl
select count(*) from emp;
select count(*) from dept;
Print the same SQL statements in column A, and the result should be in column B of the Excel file:
column A column B
select count(*) from emp; 14
select count(*) from dept; 4
That's it.
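The simplest approximation (a sketch; the file name and path are hypothetical) is to spool a CSV from SQL*Plus and open it in Excel; a real .xls with named sheets needs an external tool.

set heading off feedback off pagesize 0 trimspool on
spool c:\temp\counts.csv
select 'select count(*) from emp;'  || ',' || count(*) from emp;
select 'select count(*) from dept;' || ',' || count(*) from dept;
spool off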
I'm working on a shell script to parse a parameter file, but at the same time I want to be able to override the parameter file settings with other command-line settings. For instance, if my par file had export/import settings for the username, password, schema, etc.,
and I wanted to run the same export/import with those settings for a different schema, I want to be able to put schema=<different_than_par_file> after parfile=<parfile.par> and have the parfile be read and applied for everything except the different schema.
Right now I'm storing the cmd line and parsing it again looking for other parameters besides the parfile.
I am trying to upload big files into an individual table with a BLOB column. During the upload process, after a long time (approx. 2 h), I get the following error message:
[#|2012-08-01T19:03:01.667+0200|WARNING|sun-appserver2.1|java.lang.Class|_ThreadID=27;_ThreadName=httpSSLWorkerThread-8082-2;_Reques
tID=4cec5fc8-b9e1-4017-a859-8759ec1f5d37;|oracle.jdbc.driver.OracleBlobOutputStream.flushBuffer(OracleBlobOutputStream.java:236)
java.io.IOException: ORA-01013: user requested cancel of current operation
[code]....
I am using GlassFish Server v2.1.1 with APEX Listener v1.1.3.243.11.40. The timeout parameters for the JDBC settings in the APEX Listener are at their defaults, so would the abort be an issue with the JDBC connection?
I have come across a requirement where I need to extract data from a LONG RAW column.
SQL> desc res;
Name Null? Type
----------------------------------------- -------- ----------------------------
RESULT PUBLIC.XMLTYPE
SQL> select * from res;
RESULT
--------------------------------------------------------------------------------
<fine No="2"><stdNo>2</stdNo><value>300</value><reason>breaks keyboard</reason><
date>2011-10-03</date></fine>
How can I write the results of a SELECT statement into an XML file instead of showing them on screen?
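One hedged option (the directory object OUT_DIR is an assumption): let DBMS_XMLGEN turn the query result into an XML CLOB and write it to disk with DBMS_XSLPROCESSOR.CLOB2FILE.

DECLARE
  v_xml CLOB;
BEGIN
  v_xml := DBMS_XMLGEN.GETXML('select * from res');
  DBMS_XSLPROCESSOR.CLOB2FILE(v_xml, 'OUT_DIR', 'res.xml');
END;
/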
I tried to write to a dynamically changing file name and file type so that I can dump some data. But the code I wrote is not producing any file at all, even though it compiles with no errors. I run this code on an XE database, which is available free from the Oracle website. I did all the directory settings; there is no problem with them, because I can produce files with other code. The code is as follows:
CREATE OR REPLACE PROCEDURE dump_csv_file IS
TYPE number_array IS VARRAY(10000) OF NUMBER;
TYPE string_array IS VARRAY(10000) OF VARCHAR2(100);
TYPE date_array IS VARRAY(10000) OF DATE;
TYPE v_file_array IS VARRAY(10000) OF UTL_FILE.FILE_TYPE;
[code]...
I got an exception when I was using the Sesame adapter to dump a Turtle file, which contains long texts as objects, into an Oracle semantic database. The exception information is:
org.openrdf.repository.RepositoryException: org.openrdf.sail.SailException: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
ORA-06512: in "SF.ORACLE_ORARDF_ADDHELPER", line 1
ORA-06512: in line 1
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802) ...
How do I resolve a problem with moving LOB objects? I move a table partition and LOB (BLOB) from one tablespace to another:
alter table EBIF.APO_T_VER_DISP_ACC_RESP MOVE PARTITION P1M20120901 LOB(SIGNATURE_PATTERN) STORE AS (TABLESPACE tmp);
After the MOVE PARTITION of table EBIF.APO_T_VER_DISP_ACC_RESP I have:
pbeb_ap1.SYS>select partition_name, tablespace_name from dba_lob_partitions where table_name='APO_T_VER_DISP_ACC_RESP';
PARTITION_NAME |TABLESPACE_NAME
------------------------------|------------------------------
P1M20110901 |TD1M20110901
P1M20111001 |TMP
P1M20111101 |TMP
P1M20111201 |TMP
P1M20120101 |TD1M20120101
[code]....
I used a script to generate the MOVE statements:
select 'alter table '||table_owner||'.'||table_name||' MOVE PARTITION '||partition_name||' LOB('||COLUMN_NAME||') STORE AS (TABLESPACE TD_PART_RW) PARALLEL 4;'
from dba_lob_partitions where tablespace_name='TMP';
When I started loading into this table I got: ORA-01461: can bind a LONG value only for insert into a LONG column.
When I recreate this table, everything works OK, but the new table is not partitioned.
I am trying to insert data from one DB into another DB's table. One field's data type is LONG in the first DB's table; the same field's data type in the other DB is CLOB.
I used the TO_LOB function to convert from the LONG to the CLOB data type.
My problem is that when I used the TO_LOB function, I got an illegal-operation-on-a-LONG-data-type error.
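TO_LOB has two notable restrictions: it can only appear in the SELECT list of an INSERT ... SELECT or CREATE TABLE ... AS SELECT, and it cannot be applied to a LONG fetched over a database link. A hedged sketch of the usual workaround (table and link names are assumptions): convert on the source database first, then pull the CLOB across.

-- on the source database: LONG -> CLOB locally
CREATE TABLE src_stage AS
  SELECT id, TO_LOB(long_col) AS clob_col FROM src_t;

-- on the target database: CLOBs can be copied over a link with INSERT ... SELECT
INSERT INTO target_t (id, clob_col)
  SELECT id, clob_col FROM src_stage@src_link;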
I have a task to update one of the rows in a table (having only 2 columns, a NUMBER and a LONG), and the column to be updated is the LONG one. We are on Oracle 10g. I am not sure how to use UPDATE on a LONG data type column.
I have tried using dbms_metadata_util.long2varchar, but I am still not getting what I want.
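From PL/SQL you can update a LONG column by binding an ordinary VARCHAR2 value of up to roughly 32 KB; only longer values need OCI piecewise operations or a migration of the column to CLOB. A minimal sketch (table and column names are assumptions):

DECLARE
  v_text VARCHAR2(32767) := 'replacement text for the LONG column ...';
BEGIN
  UPDATE my_long_table
     SET long_col = v_text
   WHERE id = 1;
  COMMIT;
END;
/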
Converting BLOB data into VARCHAR2 or LONG:
We have a function which converts LONG data and returns it as VARCHAR2. But as part of an Apps upgrade, the column has been converted into a BLOB column.
Now we need the same function to read the data from the BLOB and return it as LONG or VARCHAR2.
Somewhere I am making a mistake:
CREATE OR REPLACE function GDS.test_alert_msg(v_rowid rowid) return varchar2 is
vblob blob;
i2 number;
amt number :=32767;
len number;
pos raw(32767);
position INTEGER := 10000;
my_vr raw(32767);
[Code]....
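For the BLOB-to-string part specifically, a minimal sketch (it assumes the BLOB actually holds plain text in the database character set): DBMS_LOB.SUBSTR on a BLOB returns RAW, which UTL_RAW.CAST_TO_VARCHAR2 turns into a string.

CREATE OR REPLACE FUNCTION blob_to_varchar2 (p_blob IN BLOB) RETURN VARCHAR2 IS
BEGIN
  IF p_blob IS NULL OR DBMS_LOB.GETLENGTH(p_blob) = 0 THEN
    RETURN NULL;
  END IF;
  -- keep the amount at 4000 or less if the function will be called from SQL
  RETURN UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(p_blob, 32767, 1));
END;
/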
I am trying to write a simple SQL statement which deletes data from the last n months but keeps the rows for the first day of each of those months, counting back from the current sysdate (see the sketch after the example).
e.g
Jan 1 - 30 deletes 2 - 30 keeps data for 1st
Feb 1 - 28 deletes 2 - 28 keeps data for 1st
Mar
Apr Current sysdate
May
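A hedged sketch of one way to write it (the table sales_data, column trans_date and the 3-month window are all assumptions; whether the current month should be touched is left open):

DELETE FROM sales_data
 WHERE trans_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -3)   -- last 3 full months
   AND trans_date <  TRUNC(SYSDATE, 'MM')                   -- not the current month
   AND TO_CHAR(trans_date, 'DD') <> '01';                   -- keep rows dated the 1st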
In my production DB, I have 3 mount points, /vol1/orcl, /vol2/orcl and /vol3/orcl, which contain the datafiles.
I have to create a standby with the same layout. How do I write the db_file_name_convert parameter?
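If the standby really uses the same mount points, the parameter can usually be omitted; otherwise it is just pairs of (primary path, standby path). A hedged sketch for the standby's parameter file:

# pairs of primary_path, standby_path; with identical paths on both servers the
# parameter is effectively a no-op and can be left out
db_file_name_convert=('/vol1/orcl','/vol1/orcl','/vol2/orcl','/vol2/orcl','/vol3/orcl','/vol3/orcl')
log_file_name_convert=('/vol1/orcl','/vol1/orcl')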
I have upgraded an Oracle database from 9i to 11g using the export and import utilities. After migration we are facing a performance issue in report generation: the first execution of a report takes a very long time, and when we generate the same report 2-3 times the execution time improves considerably compared with the first execution.
Two days back I restarted the database and found the same issue. There are around 300 reports, and it is not possible to generate all the reports 2-3 times every time we restart the database.
When I try the command create table a as select * from b where 1=2; it says "illegal datatype long". I am bemused; what sin has the LONG datatype committed?
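CREATE TABLE ... AS SELECT simply cannot copy a LONG column; a hedged sketch of the usual workaround (the column names are assumptions) is to list the columns and convert the LONG with TO_LOB:

CREATE TABLE a AS
  SELECT id, col1, TO_LOB(long_col) AS long_col   -- the LONG becomes a CLOB in the new table
    FROM b
   WHERE 1 = 2;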
View 4 Replies View RelatedI have a form with an item with "datatype=long". when I am typing in "Arabic" or "Persian" it break the text before line ends.but when I type in English "text" go to end line(in other words line fills to the end) and then breake to next line.It show that the problem is LANGUAGE.
nls_lang=AMERICAN_AMERICA.ar8mswin1256
Regional and language/regional option=Arabic(U.A.E)
in the block/item property : wrap_style= word ----multi_line=yes
I have a LONG datatype field in which text longer than 4,000 characters is stored. I want to use an editor for entering and displaying the data, but it shows these errors:
Case 1: when the editor is invoked directly - FRM-40735: WHEN-BUTTON-PRESSED trigger raised unhandled exception ORA-06502.
Case 2: when I write something in the field - FRM-40209: Field must be of form.
We are working on a performance tuning issue, where a table has a LONG datatype column and the number of round trips increases with the number of rows fetched, i.e. one round trip for every row.
Background.
1. created a table with LONG data type.
2. inserted bulk load of data.
3. set autotrace on and executed the tests below:
SQL> select count(*) from test_long;
COUNT(*)
----------
6110
SQL> desc test_long
Name Null? Type
----------------------------------------------------------------------------------------------------- -------- ----------------------------------------------------------------------------
ID NUMBER
TESTLONG LONG
[code].....
But on the LONG column, irrespective of the array size, the number of round trips does not reduce.