Column Length Versus Size

Oct 6, 2011

If one of the columns is given as

ABC varchar2(10)

can we tell anything about the size of the data in bytes that this column is going to hold?
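A short sketch of the distinction this usually hinges on: with the default BYTE length semantics, VARCHAR2(10) limits the column to 10 bytes, which in a multi-byte character set can be fewer than 10 characters; CHAR semantics limits it to 10 characters instead (table and column names below are assumed).

create table demo_semantics (
  abc_bytes varchar2(10 byte),   -- at most 10 bytes of data
  abc_chars varchar2(10 char)    -- at most 10 characters, possibly more than 10 bytes
);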

View 5 Replies



SQL & PL/SQL :: Max Length For A Column

Jan 8, 2013

I want to find the maximum length of the data stored in a column in Oracle, without querying each of the columns. Are these stats stored somewhere?

I have several fields defined as VARCHAR2(4000). Not all of them use the full 4000. Instead of querying each one with MAX(LENGTH(column_name)), I want to find out whether the maximum length used so far is recorded somewhere. DBA_TAB_COLS gives the defined size of the field; is there any view that records the maximum length actually used?
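As far as I know, the data dictionary does not keep the maximum length actually used (only AVG_COL_LEN after statistics gathering), so the usual workaround is dynamic SQL. A minimal sketch, assuming access to ALL_TAB_COLUMNS and a hypothetical table name MY_TABLE:

-- set serveroutput on to see the results
DECLARE
  v_max NUMBER;
BEGIN
  FOR c IN (SELECT table_name, column_name
              FROM all_tab_columns
             WHERE table_name = 'MY_TABLE'   -- hypothetical table name
               AND data_type = 'VARCHAR2')
  LOOP
    -- one MAX(LENGTH(...)) query per VARCHAR2 column
    EXECUTE IMMEDIATE
      'SELECT NVL(MAX(LENGTH(' || c.column_name || ')), 0) FROM ' || c.table_name
      INTO v_max;
    DBMS_OUTPUT.PUT_LINE(c.column_name || ': ' || v_max);
  END LOOP;
END;
/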

View 1 Replies View Related

SQL & PL/SQL :: Get MAX Length Of Column Value Along With Field Value

Oct 4, 2012

create table test_schema (
  col1 varchar2(50)
);

insert all
  into test_schema values ('this_is_a_test')
  into test_schema values ('this_is_a_test_test')
  into test_schema values ('this_is_a_test_test_xxxx')
  into test_schema values ('this_is_a_test_test_aaaaaaaa')
select * from dual;

I want to get the col1 value that has the maximum length of characters, along with the length of that value.

i.e. the output should be:

this_is_a_test_test_aaaaaaaa length 28

How can I do it?
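A minimal sketch that works on older releases as well: order by LENGTH(col1) descending and keep the first row.

select col1, len
from  (select col1, length(col1) as len
       from   test_schema
       order  by length(col1) desc)
where rownum = 1;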

View 3 Replies View Related

SQL & PL/SQL :: Table Column Name - Maximum Length

Apr 8, 2011

What is the maximum character length that can be given for a column name in a table?

View 3 Replies View Related

PL/SQL :: Finding Length Of CLOB Column

Jul 11, 2012

I had a VARCHAR2 variable which was storing some data, and I could use the LENGTH function to get the length of the data. However, if I change it to a CLOB, what is the best way to get the length?

v_clob_size:= (DBMS_LOB.getlength(v_clob)) / 1024 / 1024;
DBMS_OUTPUT.put_line('CLOB Size ' || v_clob_size);

Another one is:

LENGTHB(TO_CHAR(SUBSTR(<clob-column>, 1, 4000)))

But this seems restricted to the first 4000 characters only.
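A small sketch, assuming a hypothetical table T with CLOB column C: DBMS_LOB.GETLENGTH can be applied to the column directly and is not limited to 4000 characters (for a CLOB it returns the length in characters).

select dbms_lob.getlength(c) as clob_length_chars
from   t;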

View 3 Replies View Related

SQL & PL/SQL :: Sum Of Length Of Column Data Of All Tables

Jul 24, 2012

I want to sum up the length of the data in every NUMBER column across all the tables in a database at once.

I am trying with ALL_TAB_COLUMNS, but I can only get the column information (whether it is NUMBER, VARCHAR2, or another data type). I am unable to retrieve the column name together with the sum of the lengths of the data present in all the rows of that column.
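A sketch of one way to do it with dynamic SQL, assuming the current schema (USER_TAB_COLUMNS); it runs one SUM(LENGTH(...)) query per NUMBER column:

-- set serveroutput on to see the results
DECLARE
  v_sum NUMBER;
BEGIN
  FOR c IN (SELECT table_name, column_name
              FROM user_tab_columns
             WHERE data_type = 'NUMBER')
  LOOP
    EXECUTE IMMEDIATE
      'SELECT NVL(SUM(LENGTH(' || c.column_name || ')), 0) FROM ' || c.table_name
      INTO v_sum;
    DBMS_OUTPUT.PUT_LINE(c.table_name || '.' || c.column_name || ' = ' || v_sum);
  END LOOP;
END;
/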

View 39 Replies View Related

Exchanging Partition After Doing Alter Table Modify Column Length?

Mar 11, 2011

We have some tables in our database for which the data-loading setup performs an exchange partition after the data is loaded into staging. Today we changed the column length in one pair of main and staging tables; after that, the exchange partition stopped working.
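EXCHANGE PARTITION requires the two tables to have matching column definitions, so changing the length on only one side typically fails with ORA-14097. A sketch of the fix, with assumed table, column, and partition names:

-- bring both tables back to identical definitions
alter table main_tab modify (abc varchar2(100));
alter table stg_tab  modify (abc varchar2(100));

-- then the exchange should work again
alter table main_tab
  exchange partition p_latest with table stg_tab
  including indexes without validation;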

View 1 Replies View Related

Direct Path Exported Dump File Contains Illegal Column Length

Apr 28, 2011

I am trying to import the database and I see the following error:

IMP-00051: Direct path exported dump file contains illegal column length

After this, imp stops abruptly. My source and destination databases are as follows:

Source: 9.2.0.8
Destination: 11g R2

I used exp with the following options:

direct=y
buffer=899989898
recordlength=64000
File=1.dmp,2.dmp,3.dmp,4.dmp,5.dmp,6.dmp,7.dmp,8.dmp,9.dmp,10.dmp,11.dmp,12.dmp,
13.dmp,14.dmp,15.dmp,16.dmp,17.dmp,18.dmp,19.dmp,20.dmp,21.dmp,22.dmp,23.dmp,24.
dmp,25.dmp,26.dmp,27.dmp,28.dmp,29.dmp,30.dmp
grants=y
compress=y
FILESIZE=25g
owner=BRM

Using imp with the following option:

indexes=n
buffer=99989898
recordlength=64000
File=1.dmp,2.dmp,3.dmp,4.dmp,5.dmp,6.dmp,7.dmp,8.dmp,9.dmp,10.dmp,11.dmp,12.dmp,
13.dmp,14.dmp,15.dmp,16.dmp,17.dmp,18.dmp,19.dmp,
20.dmp,21.dmp,22.dmp,23.dmp,24.dmp,25.dmp,26.dmp,27.dmp,28.dmp,29.dmp,30.dmp
analyze=n
grants=y
fromuser=brm
touser=brm
statistics=none

Why is my imp failing?

View 4 Replies View Related

Server Utilities :: Extra Space - Length Of Column Showing 4 Char

Feb 28, 2010

I have an issue while loading a value through SQL*Loader: the last column's data is SG1, but after loading, the length of this column shows as 4 characters. I am unable to work out where this extra character is coming from. I have used TRIM, but it does not work.
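A common cause is a trailing carriage return (CHR(13)) from a Windows-format data file, which trimming spaces does not remove. A sketch of a control-file entry for the last field, assuming it is called LAST_COL:

LAST_COL "RTRIM(REPLACE(:LAST_COL, CHR(13)))"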

View 8 Replies View Related

Export/Import/SQL Loader :: Field In Data File Exceeds Maximum Length For CLOB Column

Jun 18, 2012

I'm loading data from a text file separated by TABs, and I get the error below for some lines, even though the column is of CLOB data type. Is there a limitation on the size of a CLOB data type? The error is:

Record 74: Rejected - Error on table _TEMP, column DEST.
Field in data file exceeds maximum length

I'm using SQL*Loader and the database is Oracle 11g R2 on Linux Red Hat 5. Here are the lines causing the error from my data file and my table description for the test:

create table TEMP
(
CODE VARCHAR2(100),
DESC VARCHAR2(500),
RATE     FLOAT,
INCREASE VARCHAR2(20),
COUNTRY VARCHAR2(500),
DEST     CLOB,
[code]........
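This error usually comes from SQL*Loader itself rather than from the CLOB: without an explicit length, each character field in the control file defaults to CHAR(255). A sketch of a control file under assumed file and field details (following the columns shown in the posted definition):

LOAD DATA
INFILE 'data.txt'
APPEND INTO TABLE temp
FIELDS TERMINATED BY X'9' TRAILING NULLCOLS
(
  code     CHAR(100),
  "DESC"   CHAR(500),
  rate     FLOAT EXTERNAL,
  increase CHAR(20),
  country  CHAR(500),
  dest     CHAR(1000000)   -- raise the default 255-byte field limit for the CLOB column
)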

View 3 Replies View Related

Performance Tuning :: How Length Of Column Width Effects Index Performance

Sep 30, 2010

How does the length of a column affect index performance?

For example if i had IOT table emp_iot with columns:
(id number,
job varchar2(20),
time date,
plan number)

Table key consist of(id, job, time)

Column JOB has fixed list of distinct values ('ANALYST', 'NIGHT_WORKED', etc...).

What performance increase could I expect if, in the "job" column, I stored numeric codes identifying the job names rather than the names themselves?
For example, I would store 1 instead of 'ANALYST' and 2 instead of 'NIGHT_WORKED'.

View 24 Replies View Related

Calculate Size Of Table With BLOB Column

Feb 5, 2007

When trying to calculate the space occupied by a table, I'm using DBA_SEGMENTS, which works fine as long as the table does not have a BLOB column.

As far as I can tell, the size of the BLOB data is stored in segments with SEGMENT_TYPE = 'LOBSEGMENT', but I cannot find a view that tells me which DBA_SEGMENTS row belongs to the BLOB column of the table I'm checking.

To give you an example:

select sum(BYTES)
from DBA_SEGMENTS
where owner = user
and segment_name = 'MY_TABLE'
group by SEGMENT_NAME

returns 262144

running:

SELECT sum(length(blob_column))
FROM my_table

returns 821333

There are entries in DBA_SEGMENTS for my user with the type LOBSEGMENT, but I cannot find a way to map the correct DBA_SEGMENTS row to the table I am checking.

I'm using Oracle 10.2.0.3.0 on Redhat
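A sketch of the mapping, assuming access to DBA_LOBS: it links each LOB segment (and its LOB index) back to the owning table and column, so the LOB bytes can be added to the table segment.

select l.column_name, sum(s.bytes) as lob_bytes
from   dba_lobs     l,
       dba_segments s
where  s.owner = l.owner
and    s.segment_name in (l.segment_name, l.index_name)
and    l.owner = user
and    l.table_name = 'MY_TABLE'          -- assumed table name
group  by l.column_name;

Note that SUM(LENGTH(blob_column)) measures the raw data bytes, while DBA_SEGMENTS reports allocated space, so the two numbers will rarely match exactly.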

View 2 Replies View Related

SQL & PL/SQL :: Decrease Size Of Number Datatype Column

May 24, 2012

I want to decrease the size of the TESTID column (NUMBER data type) in my table named TEST from 20 to 15. The largest value stored is only 10 digits, but Oracle throws the error "ORA-01440: column to be modified must be empty to decrease precision or scale". I can't understand why this is happening.

What is the reason behind it, given that the new size is larger than the largest existing value? When I decrease the size of a VARCHAR2 column, Oracle does not throw any error.
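Oracle only allows lowering a NUMBER column's precision or scale when the column is empty, regardless of the values it actually holds. A sketch of the usual workaround, using the table and column names from the post:

alter table test add (testid_new number(15));
update test set testid_new = testid;
alter table test drop column testid;
alter table test rename column testid_new to testid;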

View 3 Replies View Related

Server Administration :: Get Size Of CLOB Column But Not Getting Any Output

Sep 10, 2012

I'm trying to get the size of a CLOB column, but I am not getting any output.

SQL> desc TABLE_STEP_INST234
 Name                     Null?    Type
 ------------------------ -------- ------------
 NUM_PENDING_PREREQS      NOT NULL NUMBER(10)
 OBJID                    NOT NULL VARCHAR2(31)
 OUTFLOW_BITS                      NUMBER(19)
 PARAMS                            CLOB
 PARENT2PROC_INST         NOT NULL VARCHAR2(31)
 ROOT2PROC_INST           NOT NULL VARCHAR2(31)
 START_TIME                        DATE
 STATUS                   NOT NULL NUMBER(2)
[code]...
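A sketch of getting the size, assuming the PARAMS column from the describe above; DBMS_LOB.GETLENGTH returns the length of each CLOB in characters, so the MB figure is approximate:

select count(*)                                            as rows_with_params,
       sum(dbms_lob.getlength(params))                     as total_chars,
       round(sum(dbms_lob.getlength(params))/1024/1024, 2) as approx_mb
from   table_step_inst234
where  params is not null;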

View 2 Replies View Related

XML DB :: Find Size Of Table With XML Type Column Store As Binary XML

Dec 21, 2012

I have a table with structure as:

CREATE TABLE XML_TABLE_1
(
ID NUMBER NOT NULL,
SOURCE VARCHAR2(255 CHAR) NOT NULL,
XML_TEXT SYS.XMLTYPE,
CREATION_DATE TIMESTAMP(6) NOT NULL
[code].....

- So how do I find the total size occupied by this table? Does binary XML storage work like LOB storage, i.e. do I need to consider USER_LOBS as well, or will the following work?

select segment_name as tablename, sum(bytes/ (1024 * 1024 * 1024 )) as tablesize_in_GB
From dba_segments
where segment_name = 'XML_TABLE_1'
and OWNER = 'SCHEMANAME'
group by segment_name ;

- Also, if I am copying it to another table of the same structure with:

Insert /*+ append */ into XML_TABLE_2 select * from XML_TABLE_1;

then how much rollback (undo) segment space do I need? Is it equal to the size of table XML_TABLE_1?
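A sketch of a size query, assuming the binary XML is held in a LOB segment: add to the table segment any LOB segments and LOB indexes that DBA_LOBS maps to XML_TABLE_1.

select sum(s.bytes)/1024/1024/1024 as total_gb
from   dba_segments s
where  s.owner = 'SCHEMANAME'
and   (s.segment_name = 'XML_TABLE_1'
       or s.segment_name in (select l.segment_name from dba_lobs l
                             where  l.owner = 'SCHEMANAME'
                             and    l.table_name = 'XML_TABLE_1')
       or s.segment_name in (select l.index_name from dba_lobs l
                             where  l.owner = 'SCHEMANAME'
                             and    l.table_name = 'XML_TABLE_1'));

As for the copy, a direct-path (APPEND) insert generates little undo for the inserted rows themselves, although undo is still needed for index and dictionary maintenance.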

View 2 Replies View Related

SQL & PL/SQL :: Exporting Oracle Data Into Excel File With Auto Column Size

Nov 7, 2007

I want to export Oracle data into an Excel sheet. I have written the code using the UTL_FILE package, but I am getting the output as shown in the screenshot (without the column width being formatted to the width of the data). I want the output column width to be set automatically according to the size of the data.
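A plain text file written with UTL_FILE cannot carry Excel column-width settings, so one approximation is to pad each field to the widest value in its column. A sketch under assumed names (directory object EXPORT_DIR, a hypothetical EMP table):

DECLARE
  f utl_file.file_type;
BEGIN
  f := utl_file.fopen('EXPORT_DIR', 'emp.txt', 'w');
  FOR r IN (SELECT rpad(ename, max(length(ename)) over ())               AS padded_name,
                   rpad(to_char(sal), max(length(to_char(sal))) over ()) AS padded_sal
            FROM   emp)
  LOOP
    utl_file.put_line(f, r.padded_name || '  ' || r.padded_sal);
  END LOOP;
  utl_file.fclose(f);
END;
/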

View 5 Replies View Related

Server Utilities :: Estimate Size Of FlatFile Based On Table Size?

May 8, 2013

We are planning to export the table data to a pipe-delimited flat file. How do I estimate the size of the flat file based on the table size or the average row length?
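A rough sketch, assuming statistics are reasonably current: multiply the row count by the average row length. It is only an estimate, since AVG_ROW_LEN reflects internal storage while the flat file adds delimiters and expands numbers and dates into text.

select num_rows,
       avg_row_len,
       round(num_rows * avg_row_len / 1024 / 1024) as approx_mb
from   dba_tables
where  owner = 'SCHEMANAME'     -- assumed owner and table names
and    table_name = 'MY_TABLE';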

View 3 Replies View Related

Server Administration :: How To Reduce Size Of TEMP DBF File Size

Apr 13, 2011

I am using an Oracle 8.1.5 database, and my temp01.dbf file size has grown to 19.8 GB. Now I want to reduce its size.
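On a release this old, the usual approach is to create a new, smaller temporary tablespace, point users at it, and drop the old one. A sketch with assumed names and paths; on 8i the temporary tablespace is assigned per user, and the old datafile is removed at the operating-system level after the drop:

create temporary tablespace temp2
  tempfile '/u01/oradata/db/temp02.dbf' size 500m;

alter user scott temporary tablespace temp2;   -- repeat for each user

drop tablespace temp including contents;
-- then delete temp01.dbf at the OS level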

View 13 Replies View Related

SQL & PL/SQL :: Procedure Versus Function?

Dec 27, 2011

What is the exact difference between a procedure and a function, and when should we use one rather than the other?
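A minimal sketch of the usual distinction: a function must return a value and can be called from SQL, while a procedure performs an action and is invoked as a statement (hypothetical names throughout):

create or replace function net_price(p_price number, p_tax number) return number is
begin
  return p_price * (1 + p_tax);
end;
/

create or replace procedure log_price(p_price number) is
begin
  dbms_output.put_line('Price: ' || p_price);
end;
/

select net_price(100, 0.2) from dual;   -- function usable inside a query
exec log_price(100)                     -- procedure called as a statement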

View 3 Replies View Related

Database Versus System Statistics

Aug 26, 2011

In the article regarding gathering CBO statistics, it states: "When an Oracle database is created, a job will be scheduled that will generate the database statistics for you. You will still need to collect system statistics however, as these are not collected by the automatic statistics gathering mechanism."

What is the difference between "database statistics" and "system statistics"? In other words, do I need to run this script for each schema owner in my 10g/11g instance?

variable whoami varchar2(20);
begin
  select user into :whoami from dual;
end;
/
exec dbms_stats.gather_schema_stats( -
  ownname => :whoami, -
  options => 'GATHER AUTO', -
  estimate_percent => 15, -
  cascade => true);
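For comparison, a sketch of gathering system statistics (the CPU speed and I/O characteristics of the host), which is done once per database with DBMS_STATS.GATHER_SYSTEM_STATS rather than per schema:

-- capture workload statistics around a representative period
exec dbms_stats.gather_system_stats('START');
-- ... let a typical workload run ...
exec dbms_stats.gather_system_stats('STOP');

-- or gather over a fixed interval, in minutes
exec dbms_stats.gather_system_stats('INTERVAL', interval => 60);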

View 2 Replies View Related

SQL & PL/SQL :: Unique Constraint Versus Distinct?

Apr 30, 2013

I have a question about the functionality of a unique constraint versus the DISTINCT clause. Below is the example that is confusing me a lot.

--Below statement will create table and unique constraint
Create Table A (A Varchar2 (10) Unique);
Insert Into A Values (Null);
Insert Into A Values (1);
Insert Into A Values (2);

[code]...

If we are saying that each NULL is treated as distinct from every other NULL (which is why the unique constraint accepts several NULL rows), then why does Oracle's DISTINCT return those rows as a single record?
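A sketch illustrating the two behaviours on the table above: a single-column unique constraint does not compare NULLs (NULL is never equal to NULL, so repeated NULLs are allowed), while DISTINCT and GROUP BY treat all NULLs as one group.

Insert Into A Values (Null);              -- succeeds even though NULL rows already exist
Select Count(*) From A Where A Is Null;   -- shows every NULL row
Select Distinct A From A;                 -- NULL appears only once in the result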

View 3 Replies View Related

Dirty Versus Redo Buffer?

Mar 3, 2010

What's the difference between a dirty buffer and a redo buffer?

My understanding is that a dirty buffer is a changed buffer: whenever data changes in the buffer cache, the buffer is marked as dirty. A redo buffer also keeps track of changes that were made to the data, so it too refers to changed data. DBWn writes dirty buffers to disk and LGWR writes redo data to the redo log files. How can we differentiate between the two?

View 2 Replies View Related

Incremental Versus Differential Backup?

Jan 18, 2013

What is the difference between an incremental and a differential backup?
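In RMAN terms, a level 1 incremental is differential by default (it backs up blocks changed since the most recent level 0 or level 1 backup), while a cumulative incremental backs up blocks changed since the last level 0. A sketch:

-- differential (default): changes since the most recent level 0 or level 1
BACKUP INCREMENTAL LEVEL 1 DATABASE;

-- cumulative: changes since the most recent level 0 only
BACKUP INCREMENTAL LEVEL 1 CUMULATIVE DATABASE;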

View 5 Replies View Related

SQL & PL/SQL :: TRIM To A Specified Length?

Aug 9, 2013

I have a field "Email". Its length is restricted to 30, but I may receive more than 30 characters. How do I trim the email address so that its maximum length is 30 characters?
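A minimal sketch: SUBSTR caps the value at 30 characters (TRIM only strips leading or trailing characters, so it is not the right tool here). The table and column names are assumed:

select substr(email, 1, 30) as email_30
from   contacts;

-- or when updating the stored values:
update contacts set email = substr(email, 1, 30);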

View 5 Replies View Related

Cursor Versus Global Temp Table

Jan 16, 2013

We had an issue with a PL/SQL package taking hours to run as a concurrent program. Database version is 10.2.0.4.0, running on Linux x86 64-bit. A tkprof'd trace file revealed the problem SQL statement to be a cursor. This one SQL statement would run for 3+ hours. I copied the SQL statement and ran it in TOAD and it completed in seconds, returning the exact same result set. To resolve the issue in the PL/SQL package I created a global temp table and ran the exact same SQL statement as an INSERT into the global temp table.

Again, instead of hours, the SQL statement completes in seconds. If I revert the change, it goes back to taking hours. I've attached the relevant sections of the tkprof showing the two SQL statements (identical other than the insert in front of one) and the resulting explain plans and performance data. I've always been under the impression that a cursor was a better option than a temp table and I've never run into a situation where the same SQL statement runs so much longer when executed as a cursor.

Attached File(s)

SQL_As_Cursor.jpg ( 274.02K )
Number of downloads: 7

Explain_for_SQL_As_Cursor.jpg ( 189.43K )
Number of downloads: 4

SQL_as_Insert.jpg ( 277.38K )
Number of downloads: 4

Explain_for_SQL_As_Insert.jpg ( 180.66K )
Number of downloads: 2

View 2 Replies View Related

Performance Comparison TDE Versus Plain Tablespace

Dec 9, 2008

Environment Setup

Oracle Server 11g on HP-UX
Oracle Client on Windows

I am using the Swingbench tool to generate load on the DB, and with an OLTP-like benchmark I am comparing the performance of plain data and encrypted data.

I have created two different databases, one for TDE and the other for plain data, and populated the same number of rows in both. Then I run the benchmark and use sar to collect disk I/O and VSAR for CPU usage.

From the sar report it seems that:

the plain Oracle database has faster transactions and uses less CPU, but when I look into the reads/writes, TDE has fewer than plain.

If TDE needs to encrypt the data to store it on disk, it should occupy more space than the plain data, so the I/O should be higher with TDE.

Note: the DB parameters are the same, the number of rows in the tables is the same, and the file system and its block size are the same. I run Swingbench separately for each database.

I am attaching the Excel sheet with the sar results. Let me know if you need more information.

View 7 Replies View Related

PUSH Versus PULL Tables Between Two Instances?

Oct 19, 2010

I want to move data between two instances. It was recommended that we create a local database link to PULL data from the remote database located here (supplier on site), but they want to PUSH data to us. I thought you could only PULL data over a database link, but then I read the link [URL] where PUSH is considered. I was going to use a standard CTAS like create table A as select * from A@<remote_db_link>, which works well and fast (tried and tested), but some are saying they think PUSH is quicker/better.

We do have a data "PUSH" already, but it does not use a db link; it effectively calls a local procedure here and passes one row of data at a time, and it is slow: for a 1000-row table to be pushed to us, our local procedure is called 1000 times.

I have always suggested a PULL over a db link as the fastest method. Is there any proof or information on a fast PUSH method (one that is quicker than PULL)? Can you really push?
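You can push over a database link; it is simply DML issued from the side that holds the data. A sketch of both directions, with assumed link and table names; a set-based push should perform far better than the row-at-a-time procedure calls described above:

-- PUSH: run on the supplier's database, over a link pointing at our instance
insert into table_a@our_db_link
select * from table_a;
commit;

-- PULL: run locally, over a link pointing at the supplier's instance
create table table_a as
select * from table_a@remote_db_link;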

View 2 Replies View Related

Performance Of CHAR Versus VARCHAR2 In VLDB DW

Jul 20, 2010

With a very large database (VLDB) for a data warehouse (DW) using a primarily STAR-based schema, in an environment in which time (both human and CPU) is orders of magnitude more valuable than storage capacity, is there any significant difference in query performance when tables have all fixed-length (CHAR) columns compared to tables with variable-length (VARCHAR2) columns?

I realize this is one of those "in general" questions so considering "a given VLDB DW environment" with all other things being equal, what, if any, is the time based performance difference between a database of tables with all fixed sized columns versus one of tables with variable length columns ?

View 2 Replies View Related

SQL & PL/SQL :: Different Special Character Display Oracle 10 Versus 11g?

Sep 17, 2012

A database containing inventory data has been migrated from Oracle 10g to Oracle 11g. I have access to both the Oracle 10g and Oracle 11g databases on different client computers. Both databases use the same character set, WE8MSWIN1252 (query shown below). However, the results of the SQL SELECT show incorrectly displayed characters. I would like the "1/2" character and the degree character to show in the text. The ASCIISTR function shows that the underlying ASCII is the same in the two copies of the database.

Is there a setting that needs to be changed in Oracle 11g so that the saved special characters in the database display correctly (as they did in Oracle 10g)?

Query of database character set

SQL> Select value from SYS.NLS_DATABASE_PARAMETERS where PARAMETER = 'NLS_CHARACTERSET'
WE8MSWIN1252

Under Oracle 11g, this is a query on DSI using SQLPLUS 11.2.0.1.0.

SQL> select description from part where id = '57234';

DESCRIPTION
----------------------------------------
KL BRKT PLN 22╜░ ANGLE (AMER BOT RAIL)
SQL> select asciistr(description) from part where id='57234';
ASCIISTR(DESCRIPTION)
--------------------------------------------------------------------------------

[code].....
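How the characters display usually depends on the client side rather than on the database version: NLS_LANG and the terminal code page must match the data's character set. A sketch for a Windows SQL*Plus client (values assumed):

C:\> chcp 1252
C:\> set NLS_LANG=AMERICAN_AMERICA.WE8MSWIN1252
C:\> sqlplus user/password@db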

View 6 Replies View Related






