SQL & PL/SQL :: Two Columns Can Have Long Datatype In Table?
Apr 27, 2010
I have a question related to the LONG datatype. From Google I learned that a table can have only one LONG column. When I searched for the reason, I found these restrictions:
With 9i (I believe) and later versions, Oracle deprecates the LONG datatype in favor of the LOB (CLOB, NCLOB and BLOB) datatypes. It is only supported for backward compatibility.
Restrictions:
It cannot be used in CREATE TYPE as an attribute of the defined type.
It cannot be used in WHERE conditions.
There can be no indexes on LONG columns.
Regular expressions are not possible.
A LONG cannot be returned from a stored function.
SQL cannot call functions that have a parameter of type LONG.
And even more restrictions.
So I want to know: are those restrictions the only reason Oracle doesn't allow us to create two LONG columns, or is there a stronger reason that makes it more logical, such as how the data is stored in the row blocks or something else?
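For reference, the restriction itself can be seen directly by trying to declare a second LONG column; Oracle rejects the DDL. A minimal sketch (the table and column names are made up, output abbreviated):

SQL> create table two_longs_demo (col_a long, col_b long);
ORA-01754: a table may contain only one column of type LONG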
<ORACLE VERSION : 11.2.0.2.0> I have created a table with CLOB as the datatype for one of the columns. I am trying to store a string (I am not sure about the length of the string). When I query the CLOB column of my table, instead of the actual string "(HUGECLOB)" is shown. How do I get the actual string if the problem is with the size?
I am not sure, but maybe I need to set the LONG size before running the above query. But when I try to set the LONG size I get the error below. "The output from DBMS_METADATA.GET_DDL is a LONG datatype. When using SQL*Plus, your output may be truncated by default. Issue the following SQL*Plus command before issuing the DBMS_METADATA.GET_DDL statement to ensure that your output is not truncated:"
SQL> SET LONG 9999
error: Unhandled SET statement: "SET LONG 9999"
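The "Unhandled SET statement" message suggests the query is being run from a client that does not support SQL*Plus SET commands. Two hedged ways to see the actual CLOB content (the table and column names below are placeholders, not from the original post):

-- In SQL*Plus: raise the display limit for LONG/LOB output, then query normally
SET LONG 100000
SELECT clob_col FROM my_clob_table;

-- From any client: return the first part of the CLOB as VARCHAR2
SELECT DBMS_LOB.SUBSTR(clob_col, 4000, 1) AS clob_text
FROM   my_clob_table;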
We also faced the same issue in the Oracle 9i version. We tried to reorganize some tables for a performance issue, but our tables have LONG and LONG RAW datatypes, so we took the traditional approach: 1. Export the tables. 2. Truncate the tables. 3. Import the tables (use ignore=y). 4. Check the index validation. 5. Gather stats. The above was done successfully in our production environment, but some application downtime is required.
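For steps 4 and 5, a minimal sketch of the post-import checks (the schema and table names here are placeholders, not from the original post):

-- 4. Check that indexes on the reorganized table are still VALID/USABLE
SELECT index_name, status
FROM   dba_indexes
WHERE  owner = 'APP_OWNER'
AND    table_name = 'MY_LONG_TABLE';

-- 5. Re-gather optimizer statistics
BEGIN
   DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP_OWNER',
                                 tabname => 'MY_LONG_TABLE',
                                 cascade => TRUE);
END;
/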
I've got a dblink between two Oracle databases. There is one view in which I specify a join between four remote tables.
One of the tables has a column defined as a LONG RAW. I do not need that column; the query doesn't reference it. However, when I run the query, I get this error: ORA-00997: illegal use of LONG datatype.
How can I accomplish this query, over a dblink, given that one table has a LONG RAW that is not part of my query?
I am trying to copy the structure of a table through a database link but I get an error while running the command:
SQL> create table TOAD_PLAN as select * from TOAD_PLAN_@db_link where 1=2;
create table TOAD_PLAN_TABLE as select * from TOAD_PLAN_TABLE@to_paceview where 1=2
*
ERROR at line 1:
ORA-00997: illegal use of LONG datatype
How can I create it through the database link, or through any other utility?
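One option sometimes used is the SQL*Plus COPY command, which supports LONG columns (unlike CREATE TABLE ... AS SELECT over a dblink). This is a rough sketch only; the connect string and credentials below are placeholders:

-- Run from SQL*Plus while connected to the target database;
-- WHERE 1=2 copies just the structure, as in the original CTAS.
SET LONG 2000000000
COPY FROM scott/tiger@remote_db -
  CREATE toad_plan_table -
  USING SELECT * FROM toad_plan_table WHERE 1=2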
I'm creating a package function that would return the image from the table HR.PER_IMAGES.
Here's the table description of HR.PER_IMAGES:
IMAGE_ID     NUMBER(15)
IMAGE        LONG RAW
PARENT_ID    NUMBER(15)
TABLE_NAME   VARCHAR2(30)
I have also created a simple PL/SQL block that is supposed to extract the value of the IMAGE column, store it in a variable, and then return it from a function.
DECLARE
   x LONG RAW;
BEGIN
   SELECT image INTO x FROM per_images WHERE parent_id = :p_parent_id;
END;
However, I'm getting the "ORA-06502: PL/SQL: numeric or value error" message. I have tried retrieving a specific image using Oracle Reports Developer, and when I queried it, the report page was able to render the image.
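ORA-06502 here is commonly caused by the 32760-byte limit on LONG RAW variables in PL/SQL, so any image larger than that cannot be fetched into the variable. A sketch of one workaround, assuming a staging copy of the data is acceptable (the staging table and function names are made up): convert the LONG RAW to a BLOB with TO_LOB, which is only allowed in a CTAS or INSERT ... SELECT, and return the BLOB instead.

-- Stage the images as BLOBs (TO_LOB converts LONG RAW -> BLOB)
CREATE TABLE per_images_blob AS
SELECT image_id, parent_id, table_name, TO_LOB(image) AS image_blob
FROM   hr.per_images;

CREATE OR REPLACE FUNCTION get_image (p_parent_id IN NUMBER) RETURN BLOB IS
   v_img BLOB;
BEGIN
   SELECT image_blob INTO v_img
   FROM   per_images_blob
   WHERE  parent_id = p_parent_id;
   RETURN v_img;
END get_image;
/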
I need to convert a column's datatype from BLOB to CLOB. Currently the BLOB column in the table contains data; the requirement is to convert this column from BLOB to CLOB.
How can I convert from the BLOB datatype to the CLOB datatype?
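A direct ALTER TABLE from BLOB to CLOB is not possible; one common approach is to add a CLOB column, convert each row with DBMS_LOB.CONVERTTOCLOB, and then drop or rename the columns. This is a rough sketch only; the table and column names are placeholders:

ALTER TABLE my_table ADD (doc_clob CLOB);

DECLARE
   v_clob         CLOB;
   v_dest_offset  INTEGER;
   v_src_offset   INTEGER;
   v_lang_context INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
   v_warning      INTEGER;
BEGIN
   FOR r IN (SELECT rowid AS rid, doc_blob FROM my_table WHERE doc_blob IS NOT NULL) LOOP
      UPDATE my_table
      SET    doc_clob = EMPTY_CLOB()
      WHERE  rowid = r.rid
      RETURNING doc_clob INTO v_clob;                 -- writable CLOB locator
      v_dest_offset := 1;
      v_src_offset  := 1;
      DBMS_LOB.CONVERTTOCLOB(
         dest_lob     => v_clob,
         src_blob     => r.doc_blob,
         amount       => DBMS_LOB.LOBMAXSIZE,
         dest_offset  => v_dest_offset,
         src_offset   => v_src_offset,
         blob_csid    => DBMS_LOB.DEFAULT_CSID,
         lang_context => v_lang_context,
         warning      => v_warning);
   END LOOP;
   COMMIT;
END;
/

-- Afterwards: ALTER TABLE my_table DROP COLUMN doc_blob;
--             and rename doc_clob if the old column name must stay.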
I got an exception when I was using the Sesame adapter to dump a Turtle file which contains long texts as objects into an Oracle semantic database. The exception information is:
org.openrdf.repository.RepositoryException: org.openrdf.sail.SailException: java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
ORA-06512: in "SF.ORACLE_ORARDF_ADDHELPER", line 1
ORA-06512: in line 1
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802) ...
How can I resolve a problem with moving LOB objects? I moved a table partition and its LOB (BLOB) from one tablespace to another:
alter table EBIF.APO_T_VER_DISP_ACC_RESP MOVE PARTITION P1M20120901 LOB(SIGNATURE_PATTERN) STORE AS (TABLESPACE tmp);

After the move I have:

pbeb_ap1.SYS> select partition_name, tablespace_name from dba_lob_partitions where table_name='APO_T_VER_DISP_ACC_RESP';
I am updating a table column which is of XMLType datatype and I am getting the above error. Below is the process I followed: since the XML is too large, I split it into small chunks.
Need to change the precision of a column in an existing table. Statistics about the table:
* Has over 130 columns
* More than 300 million records
* Column to modify is #121, which has data
* No primary key defined
Since the column has data, it is not possible to modify it with a simple ALTER.
Second option - create a temp column in the same table, update it from the original column, set the original to NULL, ALTER the original column, update it back from the temp column, and drop the temp column. This approach is very expensive and time consuming.
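An alternative sometimes used for tables this size is online redefinition: the data is copied once into an interim table that already has the new precision, while the original stays available. A rough sketch only; the schema, table and column names below are placeholders:

-- Interim table with the new precision for the column being changed
CREATE TABLE app_owner.big_table_int AS
SELECT * FROM app_owner.big_table WHERE 1 = 0;

ALTER TABLE app_owner.big_table_int MODIFY (col121 NUMBER(12,4));

DECLARE
   num_errs PLS_INTEGER;
BEGIN
   -- No primary key, so redefine by ROWID
   DBMS_REDEFINITION.CAN_REDEF_TABLE('APP_OWNER', 'BIG_TABLE',
                                     DBMS_REDEFINITION.CONS_USE_ROWID);
   DBMS_REDEFINITION.START_REDEF_TABLE('APP_OWNER', 'BIG_TABLE', 'BIG_TABLE_INT',
                                       options_flag => DBMS_REDEFINITION.CONS_USE_ROWID);
   DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('APP_OWNER', 'BIG_TABLE', 'BIG_TABLE_INT',
                                           num_errors => num_errs);
   DBMS_REDEFINITION.FINISH_REDEF_TABLE('APP_OWNER', 'BIG_TABLE', 'BIG_TABLE_INT');
END;
/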
I'm trying to load an XML file into a table having an XMLTYPE column, but it is throwing the error given below. I even tried to load the data after changing '&' into '&amp;', but I still get the same error.
Error at line 6
ORA-06512: at "SYS.XMLTYPE", line 296
ORA-06512: at line 1
31011. 00000 - "XML parsing failed"
*Cause: XML parser returned an error while trying to parse the document.
*Action: Check if the document to be parsed is valid.
Version : ORACLE 11g, Windows 7
CREATE TABLE xml_test (
   id      NUMBER(5),
   name    VARCHAR2(50),
   xmldata XMLTYPE
);

INSERT INTO xml_test (id, name, xmldata)
VALUES (1, 'file1',
        XMLTYPE(bfilename('SCOTTDIR', 'TEST_XML.XML'), nls_charset_id('AL32UTF8')));
I have a requirement to list out all the table names which use the TIMESTAMP datatype in a specified schema. Is there any way to find those table names by using the system tables (data dictionary views)?
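A minimal sketch against the dictionary views (the schema name is a placeholder); DATA_TYPE is stored with the precision, e.g. TIMESTAMP(6), so a LIKE match also catches the WITH TIME ZONE variants:

SELECT DISTINCT table_name
FROM   all_tab_columns
WHERE  owner = 'HR'
AND    data_type LIKE 'TIMESTAMP%'
ORDER  BY table_name;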
I am getting an ORA-00902: invalid datatype error when I try to call the below function from a SELECT statement. Here I am trying to get a table out of a function.
create or replace package pkg10 as
   type tabletype1 is table of table1%rowtype index by binary_integer;
   function func1 return tabletype1;
end pkg10;

create or replace package body pkg10 as
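The package body above is cut off, but ORA-00902 itself is expected here: a SQL SELECT cannot use a PL/SQL index-by (associative array) type. A sketch of the usual alternative, assuming TABLE1 has ID and NAME columns (those names are made up): declare schema-level object and nested table types and expose the rows through a pipelined function.

CREATE OR REPLACE TYPE table1_obj AS OBJECT (id NUMBER, name VARCHAR2(50));
/
CREATE OR REPLACE TYPE table1_tab AS TABLE OF table1_obj;
/
CREATE OR REPLACE FUNCTION func1 RETURN table1_tab PIPELINED IS
BEGIN
   FOR r IN (SELECT id, name FROM table1) LOOP
      PIPE ROW (table1_obj(r.id, r.name));
   END LOOP;
   RETURN;
END func1;
/

-- Now the function can be queried from SQL
SELECT * FROM TABLE(func1);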
Database Version:
DB  : Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
OS  : HP-UX nduhi18 B.11.31 U ia64 1022072414 unlimited-user license
APP : SAP - ERP
I have to RANGE partition a table on either UPDATED_ON or PROFILE. The table has the structure below:
Name               Null?    Type
MANDT              NOT NULL VARCHAR2(9)
MR_ID              NOT NULL VARCHAR2(60)
PROFILE            NOT NULL VARCHAR2(54)
REGISTER_ID        NOT NULL VARCHAR2(30)
INTERVAL_DATE      NOT NULL VARCHAR2(24)
AGGR_CONSUMPTION   NOT NULL NUMBER(21,6)
MDM_VERS_NO        NOT NULL VARCHAR2(9)
MDP_UPDATE_DATE    NOT NULL VARCHAR2(24)
MDP_UPDATE_TIME    NOT NULL VARCHAR2(18)
NMI_CONFIG         NOT NULL VARCHAR2(120)
NMI_CONFIG_FLAG    NOT NULL VARCHAR2(3)
MDM_DATA_STRM_ID   NOT NULL VARCHAR2(6)
NSRD               NOT NULL VARCHAR2
[Code]....
As per my knowledge, RANGE partitioning is better suited for DATE or NUMBER, and INTERVAL partitioning is possible only on DATE or NUMBER. The PROFILE column is of VARCHAR2 datatype. I know I can still RANGE partition on it, since Oracle compares the VARCHAR2 values when inserting data, but INTERVAL is not possible. How do I RANGE partition on PROFILE? The CREATED_ON column is a NUMBER with decimals.
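RANGE partitioning on a VARCHAR2 column is allowed; the bounds are simply compared as strings (only INTERVAL partitioning requires NUMBER or DATE). A minimal sketch with made-up partition bounds:

CREATE TABLE mr_profile_data (
   mandt    VARCHAR2(9)  NOT NULL,
   mr_id    VARCHAR2(60) NOT NULL,
   profile  VARCHAR2(54) NOT NULL
   -- ... remaining columns ...
)
PARTITION BY RANGE (profile) (
   PARTITION p_prof_a_l VALUES LESS THAN ('M'),
   PARTITION p_prof_m_z VALUES LESS THAN ('ZZ'),
   PARTITION p_prof_max VALUES LESS THAN (MAXVALUE)
);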
I use SQL*Plus in Oracle (Linux). I have a table, and a string cell holds a long string.
Like below :
column A Column B
A BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB .....................................BBBBBBBBBBB
So I need to edit/update row A and the value in Column B. But the string in Column B is very long, and I only need to edit one character. If I use an UPDATE command, I have to type a very long string and it is easy to make a mistake.
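One way to avoid retyping the whole value is to let the UPDATE rebuild only the part that changes, for example with REPLACE or with SUBSTR concatenation. A sketch with placeholder table and column names (here the 10th character is changed to 'X'):

-- Swap one known substring for another
UPDATE my_table
SET    column_b = REPLACE(column_b, 'BBBCB', 'BBBBB')
WHERE  column_a = 'A';

-- Or overwrite a single character by position: characters 1-9, new 10th character, rest unchanged
UPDATE my_table
SET    column_b = SUBSTR(column_b, 1, 9) || 'X' || SUBSTR(column_b, 11)
WHERE  column_a = 'A';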
We are firing a normal DROP command on our database; the database version is 10.2.0.4 and the database is running on AIX v5. The command is taking more time than usual.
When I monitor the session I can see that a call is being made to the procedure "aw_drop_proc". Could this be what is taking more time than usual?
We do not have any partitions on the nested tables. We have a pack of tables, and we drop this pack through a procedure. The pack comprises nested tables and normal tables. Dropping a nested table (with no rows) takes around 6 seconds, while a normal table (with no rows) takes 17 milliseconds. We have a partition on the normal table.
The same operation on Windows takes much less time compared to AIX.