Server Utilities :: Exporting A Table That Is 3 GB In Size

Mar 22, 2011

I am exporting a table that is 3 GB in size; it is partitioned and was created with the NOCOMPRESS option.

When I export it with the COMPRESS=N option of the exp utility, it should take 3 GB on the target server. But will exporting it with COMPRESS=Y save some storage during import, or does the NOCOMPRESS option specified on the partitions mean the exp COMPRESS=Y option has no impact, so that it takes 3 GB of space in both cases?

Is it true that, whether you specify COMPRESS=N or COMPRESS=Y during export, it does not matter and the size will always be 3 GB after import?
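
For context, a minimal sketch of the two export commands being compared (user, table and file names are placeholders, not from the original post):

exp scott/tiger TABLES=big_part_tab FILE=big_n.dmp COMPRESS=N LOG=big_n.log
exp scott/tiger TABLES=big_part_tab FILE=big_y.dmp COMPRESS=Y LOG=big_y.log

Note that exp's COMPRESS parameter only controls whether the generated CREATE TABLE DDL consolidates the existing extents into one large INITIAL extent; it does not compress the row data written to the dump file.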

View 6 Replies



Server Utilities :: Exporting Table Structure From A Particular DB?

Nov 9, 2010

I want to know the process of exporting only the table structure of a database, without the actual content.

Note: I don't know how many tables are present in the DB.
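
A hedged sketch of the usual approaches (credentials, directory and file names are placeholders): the classic exp utility can skip row data with ROWS=N, and Data Pump can export metadata only:

exp system/manager FULL=Y ROWS=N FILE=structure.dmp LOG=structure.log
expdp system/manager FULL=Y CONTENT=METADATA_ONLY DIRECTORY=dpump_dir DUMPFILE=structure.dmp LOGFILE=structure.log

Both forms pick up every table in the database, so knowing the table count in advance is not required.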

View 1 Replies View Related

Server Utilities :: Exporting / Importing Partitioned Table

Sep 2, 2013

I am trying to export a partition of a table and import it into another database. I get the error below when I try to import:

ORA-14400: inserted partition key does not map to any partition

If I export the table (for that particular partition) and import it into the destination (after dropping the table there), the partitions and subpartitions are created without any problem.

The table is range partitioned and subpartitioned by list, so I had to perform the operations below if I want to retain the other data in the destination table:

1. Drop the existing partition
2. Create the partition and sub partition, same as source
3. Execute imp

In fact I had to perform step 2 because, even if I split an existing partition instead, the subpartitions get replicated into the new partition, which again throws the same error. Is there a better way of managing the partitions and subpartitions in the destination with the exp/imp utility, so that I need not perform steps 1 and 2 manually?
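
For reference, a hedged sketch of the partition-level export/import syntax being used here (table and partition names are placeholders); exp and imp address a single partition with the table:partition notation:

exp scott/tiger TABLES=sales:p_2013_q1 FILE=p_2013_q1.dmp LOG=exp_p_2013_q1.log
imp scott/tiger TABLES=sales:p_2013_q1 FILE=p_2013_q1.dmp IGNORE=Y LOG=imp_p_2013_q1.log

ORA-14400 is raised because no partition boundary in the destination table covers the incoming keys, which is why step 2 (recreating the partition and subpartitions to match the source) is needed before imp.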

View 11 Replies View Related

Server Utilities :: Estimate Size Of FlatFile Based On Table Size?

May 8, 2013

We are planning to export the table data to a pipe-delimited flat file. How do I estimate the size of the flat file based on the table size or the average row length?
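
A rough estimate can be read from the optimizer statistics, assuming they are current; a hedged sketch (the table name is a placeholder, and AVG_ROW_LEN only approximates the text representation, since numbers and dates occupy different widths as text than as stored data):

SELECT ROUND((t.num_rows * (t.avg_row_len + c.col_cnt + 1)) / 1024 / 1024) AS est_mb
FROM   user_tables t,
       (SELECT COUNT(*) AS col_cnt FROM user_tab_columns WHERE table_name = 'MY_TABLE') c
WHERE  t.table_name = 'MY_TABLE';

The "+ c.col_cnt + 1" term allows one byte per pipe delimiter plus a newline per row.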

View 3 Replies View Related

Server Utilities :: Exporting Database?

May 31, 2010

I am trying to export my database, and whenever I try to log in it gives ORA-01017: invalid username/password.

If I log in as system/manager it accepts the login, but I am not able to export my whole database.
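
For reference, a minimal sketch of a full export under an account holding the EXP_FULL_DATABASE role (SYSTEM has it by default; credentials and file names here are placeholders only):

exp system/manager FULL=Y FILE=fulldb.dmp LOG=fulldb.log

GRANT EXP_FULL_DATABASE TO myuser;   -- run as a DBA if the export must be taken under another account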

View 9 Replies View Related

Server Utilities :: Exporting Metadata Backup

Jul 19, 2011

I am using a schema for which I need to take a backup of the metadata only, using the exp utility:

exp shan/shan@shan file=/backup_dump/shan.dmp log=/backup_dump/shan.log owner=shan rows=n

but it returns the error below. I only have access to the user shan; our client cannot allow me to use the SYSTEM or SYS schema or any other required grants or privileges. So is there any way to take a metadata backup of user shan from user shan itself?

EXP-00008: ORACLE error 942 encountered
ORA-00942: table or view does not exist
EXP-00024: Export views not installed, please notify your DBA
EXP-00000: Export terminated unsuccessfully
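
EXP-00024 indicates that the export data dictionary views were never installed in this database; creating them is a one-time SYS-level task, so a DBA would typically have to run something like the following (a hedged sketch, and not something the shan account itself can do):

sqlplus / as sysdba
SQL> @?/rdbms/admin/catexp.sql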

View 3 Replies View Related

Server Utilities :: Exporting From Physical Standby

Aug 3, 2012

I would like to export a few tables from the physical standby, which is in read-only mode.

I have tried both the exp and expdp methods. I could successfully export and import the tables from the physical standby using exp; unfortunately, expdp does not allow this from a read-only database.

Does this mean that we still have to use exp instead of expdp?

Note: I would expect a proper response from experts and no unwanted comments like "Contact Oracle support", "Paste the entire command here", "Read the manuals", or "Why are you exporting from the standby and not from the primary", etc.
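
One hedged workaround for this situation is to run expdp against a read-write database and pull the tables from the standby over a database link, so the Data Pump master table lives on the read-write side; a minimal sketch (link, directory, schema and table names are placeholders):

SQL> CREATE DATABASE LINK stby_link CONNECT TO scott IDENTIFIED BY tiger USING 'STBY';

expdp scott/tiger NETWORK_LINK=stby_link TABLES=emp,dept DIRECTORY=dpump_dir DUMPFILE=stby_tabs.dmp LOGFILE=stby_tabs.log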

View 1 Replies View Related

Server Utilities :: Exporting Huge Amount Of Data?

Jul 25, 2011

I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with a fixed record length. There will be about 6 files, for a total of about 10 GB.

How can I export those tables in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
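
If SQL*Plus spooling turns out to be acceptable, a hedged sketch of producing fixed-length records is to RPAD/LPAD every column to its agreed width inside the query (view name, column names and widths are invented):

set pages 0 lines 400 trimspool on feedback off
spool extract1.txt
SELECT RPAD(cust_name, 40) || LPAD(TO_CHAR(balance, 'FM9999999990'), 12) || TO_CHAR(open_date, 'YYYYMMDD')
FROM   my_view;
spool off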

View 1 Replies View Related

Server Utilities :: Exporting Schema Using Consistent Parameter

Aug 5, 2012

I am taking an export using the CONSISTENT parameter. Theoretically I can understand it; practically I couldn't understand how it works.

for ex

I am updating the tab1 table under the sams user; the table has one lakh (100,000) records.
While the update is running, I export with consistent=y and consistent=n, i.e.:

exp sams/sams file=cons.dmp owner=sams consistent=y
exp sams/sams file=cons2.dmp owner=sams consistent=n

Then I imported both files into separate users (sam, san).
The updated info is not visible in either the sam or the san user.

I want to know practically how it works; I need a clear example of the difference between consistent=y and consistent=n.
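
A hedged way to see the difference in practice (the UPDATE statement is invented; the user and table names come from the post): run a change that commits while the exports are still in flight, then compare the two imports.

-- session 1, while the exports below are running:
UPDATE tab1 SET some_col = some_col + 1;
COMMIT;

-- session 2, from the shell:
exp sams/sams FILE=cons.dmp OWNER=sams CONSISTENT=Y
exp sams/sams FILE=cons2.dmp OWNER=sams CONSISTENT=N

With CONSISTENT=Y the whole dump is read as of the moment the export started, so a change committed mid-export never appears in cons.dmp; with CONSISTENT=N each table is only self-consistent, so tables exported after the commit contain the new values while tables exported before it do not.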

View 2 Replies View Related

Server Utilities :: Exporting Schema Using Filesize Parameter?

Aug 6, 2012

Export /Import
==============

While exporting schemas,

I couldn't get the dump files exported to the exact location I specified; see the following:

QUERY
=====

exp file=ackupfile1.dmp,ackupfile2.dmp,ackupfile3.dmp owner=(order,purchase) filesize=5m

Then, at the OS level, I found those dump files in the home directory:

-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp
[oracle@localhost ~]$ pwd
/home/oracle

When I list the home directory:

rw-r--r-- 1 oracle oinstall 72 Jun 20 21:17 afiedt.buf
drwxr-xr-x 3 oracle oinstall 4096 Jun 17 10:07 Desktop
-rw-r--r-- 1 oracle oinstall 71 Jun 19 20:42 ed.hup
drwxr-xr-x 2 oracle oinstall 4096 Aug 6 19:38 backup
-rw-r--r-- 1 oracle oinstall 2826240 Aug 6 19:39 expdat.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp

The dump files go to the home path even though I mentioned what I thought was the appropriate location.
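
The usual explanation, sketched here with placeholder credentials and paths, is that exp writes exactly where FILE= points, and a bare file name resolves against the current working directory (here /home/oracle); absolute paths send the pieces to the intended directory:

exp system/manager FILE=/home/oracle/backup/expfile1.dmp,/home/oracle/backup/expfile2.dmp,/home/oracle/backup/expfile3.dmp OWNER=(order,purchase) FILESIZE=5m LOG=/home/oracle/backup/exp.log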

View 7 Replies View Related

Server Utilities :: Syntax For Exporting Only Procedures Of Particular User

Mar 7, 2011

What is the syntax for exporting only the procedures of a particular user?
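
A hedged sketch with Data Pump, which has an INCLUDE filter for object types (schema, directory and file names are placeholders); the classic exp utility has no filter this fine-grained:

expdp system/manager SCHEMAS=scott INCLUDE=PROCEDURE DIRECTORY=dpump_dir DUMPFILE=scott_procs.dmp LOGFILE=scott_procs.log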

View 1 Replies View Related

Server Utilities :: Exporting Database Schema / Tables Without Data

Sep 22, 2010

I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).

For another 10 schemas I need them empty, with the exception of a table in some of them which needs to be exported/imported with all its data. Is it possible to do this with the Data Pump utilities (expdp, impdp)?

Afterwards I will be running some scripts to populate the DB instance with critical data / metadata.
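
A hedged sketch of one way to split it with Data Pump (schema and table names are placeholders): a full schema-mode run for the schema that needs its data, a metadata-only run for the schemas that must arrive empty, and a small table-mode run for the exceptions:

expdp system/manager SCHEMAS=app_main DIRECTORY=dpump_dir DUMPFILE=app_main.dmp LOGFILE=app_main.log
expdp system/manager SCHEMAS=app1,app2,app3 CONTENT=METADATA_ONLY DIRECTORY=dpump_dir DUMPFILE=app_meta.dmp LOGFILE=app_meta.log
expdp system/manager TABLES=app1.ref_codes,app2.lookup_vals DIRECTORY=dpump_dir DUMPFILE=app_tabs.dmp LOGFILE=app_tabs.log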

View 1 Replies View Related

Server Utilities :: Primary Keys Are Not Exporting When Export Using EXP Command

Dec 27, 2011

I have taken a database backup using the exp command, and when I try to import it on another PC the foreign keys are not imported. It gives an error message saying there is no matching unique or primary key for the column.

How do I take a backup that includes the primary keys?
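
For reference, a hedged sketch of the relevant parameters (file names are placeholders); CONSTRAINTS=Y is already the exp default, and the parent tables carrying the primary keys must be present in the import before (or together with) the child tables, otherwise the foreign keys fail with exactly this error:

exp system/manager FULL=Y CONSTRAINTS=Y FILE=full.dmp LOG=full.log
imp system/manager FULL=Y IGNORE=Y FILE=full.dmp LOG=imp_full.log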

View 7 Replies View Related

Server Utilities :: Export Hangs At Exporting Cluster Definitions?

Dec 22, 2010

Suddenly my export hangs at 'exporting cluster definitions'. I have been using this database for the last 4 years and it never caused a problem or hung at this stage. I'm pasting my screen output below; it is my production DB.

[oracle1@wbh_as1 smbshare]$ exp wb/wb

Export: Release 9.2.0.1.0 - Production on Thu Dec 23 00:02:44 2010

Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Enter array fetch buffer size: 4096 >

Export file: expdat.dmp > wb

(2)U(sers), or (3)T(ables): (2)U >

Export grants (yes/no): yes >

Export table data (yes/no): yes >

Compress extents (yes/no): yes >

Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user WB
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user WB
About to export WB's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
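
A hedged first check when exp appears to hang (the SID value is a placeholder to be taken from v$session for the export session) is to look, from another session, at what the export is actually waiting on:

SELECT sid, event, seconds_in_wait
FROM   v$session_wait
WHERE  sid = 123;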

View 11 Replies View Related

SQL & PL/SQL :: Exporting Oracle Data Into Excel File With Auto Column Size

Nov 7, 2007

I want to export Oracle data into an Excel sheet. I have written code using the UTL_FILE package, but I am getting the output as shown in the screenshot, without the column widths being formatted to the width of the data. I want the output column widths to be set automatically according to the size of the data.

View 5 Replies View Related

Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether writing the dump files to tape is possible. If so, would the data be accessible if needed later?

View 4 Replies View Related

Server Utilities :: Difference In Size Of Exports

May 7, 2011

Recently I migrated from Oracle 9i SE (9.2.0.1.0) to Oracle 11gR2 EE (11.2.0.1.0). Previously I was taking an export of some of my schemas and the file size was around 1 GB (with the exp utility of Oracle 9i). Following the same practice, I am now taking an export of the same schema, with the same number of objects and the same data volume, and the size of the export file on the Oracle 11gR2 database has gone down significantly, to around 825 MB (with the expdp utility of Oracle 11g).

So I would like to know why there is a difference in the size of the export (.dmp) files between the two Oracle versions. I have cross-checked the objects and the rows of the data tables; they are exactly the same.

Command line parameter for export on Oracle 9i

exp test/test FILE=test.dmp OWNER=test GRANTS=y ROWS=y COMPRESS=y LOG=test.log

Command line parameter for export on Oracle 11g

expdp test/test DIRECTORY=dpump_dir DUMPFILE=test.dmp LOGFILE=test.log

View 3 Replies View Related

Server Utilities :: Imp / Exp - Max Size Assigned To Datafile?

Jul 22, 2010

I have an exp dump of size 1 GB, but when I tried to imp it, it showed a space error, asking for 4 GB of space. I have only 1 GB free on the C: drive and 32 GB on D:. Can I add a datafile in the D: location, and what is the maximum size I can assign to that datafile?
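
A hedged sketch of adding a datafile on the D: drive (tablespace name, path and sizes are placeholders); with the common 8 KB block size a single smallfile datafile can grow to roughly 32 GB (4194303 blocks):

ALTER TABLESPACE users
  ADD DATAFILE 'D:\ORADATA\ORCL\USERS02.DBF' SIZE 4096M
  AUTOEXTEND ON NEXT 256M MAXSIZE 20480M;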

View 4 Replies View Related

Server Utilities :: Evaluate Export Dump Size

Jan 11, 2012

I want to take a schema-level export. The schema size is 115 GB. Do we require the same amount of free space to be available on the server side (where we are writing the dump) as the schema size, or is less or more space required?
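
Data Pump can report the expected dump size without writing anything, which answers most of the sizing question; a hedged sketch (credentials and schema name are placeholders):

expdp system/manager SCHEMAS=app_owner ESTIMATE_ONLY=Y ESTIMATE=STATISTICS NOLOGFILE=Y

The dump is usually somewhat smaller than the 115 GB reported for the segments, because indexes are exported as DDL only, but it is safest to have free space close to the segment size.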

View 6 Replies View Related

Server Utilities :: Importing From 9i To 11g - Getting ORA-29339 Error Because Of Block Size?

Oct 5, 2010

We are working on migrating from 9.2.0.4 to 11.2 and we've set up a test machine so that we could test the install and the import (as well as test additional 11g features that we want to begin using).

So we created the database and created all of the tablespaces beforehand.

Our import command is

$ORACLE_HOME/bin/imp system/manager FULL=Y BUFFER=140000 FILE=/dbexport/Lhtech.exp VOLSIZE=2000M GRANTS=Y INDEXES=Y COMMIT=Y IGNORE=Y

However, when we run the import, we get the errors like so:

Import: Release 11.2.0.1.0 - Production on Tue Oct 5 15:01:19 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V09.02.00 via conventional path

[code]....

First of all, the block size in our newly created tablespaces is 8192, and the import is obviously trying to recreate the tablespaces with a block size of 2048.

1) Why is it not ignoring these CREATE TABLESPACE commands when those tablespaces already exist?

2) How in the world do we get around the block size issue? We've tried nearly everything we could find, but we've still not had any luck.
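
One hedged way around the block-size part of the problem (the cache size is illustrative) is to give the 11g instance a 2 KB buffer cache, so tablespaces with a 2048-byte block size can exist at all:

ALTER SYSTEM SET db_2k_cache_size = 32M SCOPE=BOTH;

Alternatively, the CREATE TABLESPACE failures are often harmless when the tablespaces were pre-created, since the subsequent CREATE TABLE statements still find the existing (8 KB) tablespaces by name.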

View 8 Replies View Related

Server Utilities :: Taking Export Dump Using Expdp Of Some Schema's Of Total Size Is 300GB

Mar 30, 2007

I'm taking an export dump using expdp of some schemas whose total size is 300 GB. This is the par file:

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema01.dmp,expdp_schema02.dmp,expdp_schema03.dmp,expdp_schema04.dmp,expdp_schema05.dmp,expdp_schema06.dmp,expdp_schema07.dmp,expdp_schema08.dmp,expdp_schema09.dmp,expdp_schema10.dmp,expdp_schema11.dmp,expdp_schema12.dmp,expdp_schema13.d
[code]....

Here one of the schemas is 250 GB, and the total size of all the schemas is 300 GB. The filesystem where I am writing the dump has 350 GB of space, but even then the expdp failed, saying:

ORA-39095: Dump file space has been exhausted: Unable to allocate 8192 bytes

Why did it fail, and how do I restart it and make sure it runs successfully without error?
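
A hedged revision of the par file (only the changed line is shown) is to let Data Pump generate as many 30 GB pieces as it needs by using the %U substitution variable instead of a fixed list of dump files:

DUMPFILE=expdp_schema%U.dmp

A job stopped with ORA-39095 can also be reattached (expdp system/manager ATTACH=<job_name>) and given more files with ADD_FILE before resuming it with CONTINUE_CLIENT.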

View 4 Replies View Related

Server Administration :: Move Partitioned Table Between Table Spaces Of Different Block Size?

Apr 4, 2011

I was about to move some tables from one tablespace to another, but it seems it is not possible to move partitioned tables between tablespaces of different block sizes.

So far the only option I have is to export and then import the data back.

Is there any way to move a partitioned table between tablespaces of different block sizes?
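
If export/import does remain the only route, a hedged sketch of scripting it with Data Pump (all names are placeholders) keeps it to one export and one import with a tablespace remap rather than manual recreation:

expdp scott/tiger TABLES=sales DIRECTORY=dpump_dir DUMPFILE=sales.dmp LOGFILE=sales_exp.log
impdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=sales.dmp REMAP_TABLESPACE=ts_8k:ts_16k TABLE_EXISTS_ACTION=REPLACE LOGFILE=sales_imp.log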

View 14 Replies View Related

Server Administration :: How To Shrink Table Size

Jan 4, 2012

I have a table: desc STG_XML

Name Null Type
------------------------------ -------- ------------------------
ENTITY_ID NOT NULL VARCHAR2(100 CHAR)
ENTITY_TYPE_ID NOT NULL NUMBER
SOURCE_ID NOT NULL VARCHAR2(512 CHAR)
XML_SCHEMA_ID NOT NULL NUMBER
JOB_ID NOT NULL NUMBER
FINGERPRINT NOT NULL VARCHAR2(100 CHAR)
ENTITY_XML_DATA CLOB()
ARCHIVED NUMBER(1)
CREATION_DATE TIMESTAMP(6)
MODIFICATION_DATE TIMESTAMP(6)
ARCHIVING_DATE TIMESTAMP(6)
CREATED_BY VARCHAR2(50 CHAR)
MODIFIED_BY VARCHAR2(50 CHAR)

The problem is that the data in the table is 40 GB, while in the DB the table occupies 400 GB! How can I shrink and reuse that space other than by drop/recreate or drop/import?

The table starts with no initial data, so I can play with the INITIAL parameter. Data is inserted, updated and deleted all the time. I have run DBMS_ADVISOR, which recommended a table SHRINK. I have performed the shrink:

alter table STG_XML shrink space COMPACT;

but I haven't gained any space.
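
A hedged follow-up (the table and column names come from the post, everything else is generic): SHRINK SPACE COMPACT only reorganises rows below the high-water mark and never releases space, and for this table much of the 400 GB is probably in the CLOB segment rather than the table segment, so the LOB has to be shrunk as well (the LOB syntax applies to BasicFile LOBs in an ASSM tablespace):

ALTER TABLE stg_xml ENABLE ROW MOVEMENT;
ALTER TABLE stg_xml SHRINK SPACE CASCADE;
ALTER TABLE stg_xml MODIFY LOB (entity_xml_data) (SHRINK SPACE);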

View 12 Replies View Related

Server Administration :: How To Reduce Size Of A Table

Oct 2, 2012

One of our Solaris machines is running Oracle 8.0.3.

A table reached 2 GB in size and Oracle failed due to the operating system's file size limitation.

The information in the table is not relevant and can be deleted, but the table has a lot of indexes.

I would like to know the best procedure to delete the information and reduce the size of the file.
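
A hedged sketch for this release (the table name and file path are placeholders): TRUNCATE deallocates the table's extents and those of its indexes in one statement, after which each datafile can be resized down, but only as far as its highest remaining used block:

TRUNCATE TABLE big_audit_tab DROP STORAGE;
ALTER DATABASE DATAFILE '/u01/oradata/PROD/users01.dbf' RESIZE 500M;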

View 3 Replies View Related

Server Administration :: Size Of Data Of Table

Oct 20, 2011

ops$tkyte@DEV8I.WORLD> select blocks, empty_blocks,
2 avg_space, num_freelist_blocks
3 from user_tables
4 where table_name = 'T'
5 /

BLOCKS EMPTY_BLOCKS AVG_SPACE NUM_FREELIST_BLOCKS
---------- ------------ ---------- -------------------
19 35 2810 3

Ok, the above shows us:

- we have 55 blocks allocated to the table (still)
- 35 blocks are totally empty (above the HWM)
- 19 blocks contains data (the other block is used by the system)
- we have an average of about 2.8k free on each block used.

Therefore, our table

- consumes 19 blocks of storage in total.
- of which 19 blocks * 8k blocksize - 19 block * 2.8k free = 98k is used for our data.

I'm not too sure this calculation is accurate for getting the size of the data in the table.

View 32 Replies View Related

Server Administration :: How To Reduce Size Of TEMP DBF File Size

Apr 13, 2011

I am using an Oracle 8.1.5 database and my temp01.dbf file has grown to 19.8 GB; now I want to reduce its size.
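
The usual hedged approach for this release (names, path and size are placeholders) is to build a fresh temporary tablespace, point the users at it, and drop the old one; 8.1.5 has no database-wide default temporary tablespace, so each user is switched individually:

CREATE TEMPORARY TABLESPACE temp2 TEMPFILE '/u01/oradata/PROD/temp02.dbf' SIZE 2048M;
ALTER USER scott TEMPORARY TABLESPACE temp2;
DROP TABLESPACE temp INCLUDING CONTENTS;
-- then remove the old temp01.dbf at the operating-system level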

View 13 Replies View Related

Server Administration :: Calculation Of Initial Extent Size Of Table

Apr 21, 2010

I need to create table A, which is going to have more than 8 lakh (800,000) records. Daily, this table A will be truncated and all 800,000 records reinserted. Also, the number of records will increase by 50K per month. What should the storage clause parameters be, mainly the INITIAL and NEXT extents?
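
A hedged sizing sketch (the column list and figures are invented): a rough INITIAL is expected rows x estimated average row length, e.g. 800,000 rows x ~200 bytes is about 160 MB plus block overhead, and since the table is truncated and reloaded daily, PCTINCREASE 0 with a modest NEXT keeps extent growth predictable:

CREATE TABLE a
( id      NUMBER,
  payload VARCHAR2(200)
)
STORAGE (INITIAL 200M NEXT 16M PCTINCREASE 0);

In a locally managed tablespace with uniform or system-allocated extents these values matter far less, since extent sizes come from the tablespace itself.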

View 3 Replies View Related

SQL & PL/SQL :: How To Escape Comma While Exporting Data From Table Into CSV File

Apr 9, 2012

How can we escape commas while exporting data from a table into a CSV file?

CREATE TABLE emp
(
EMPNO NUMBER(4) NOT NULL,
ENAME VARCHAR2(10 BYTE),
JOB VARCHAR2(9 BYTE),
MGR NUMBER(4),
HIREDATE DATE,
address varchar2(100),
[code].......

I have to export data from the emp table, which has an address column, and the address column contains commas. When I run the script below, the part of the address after the comma moves into the next cell in the CSV file. Is there any way we can avoid shifting into the next cell and keep the complete address in one cell?

set echo off
set verify off
set termout on
set heading off
set pages 50000
[code]....
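
One hedged way to keep a comma-bearing address in a single cell (only a few of the table's columns are shown) is to wrap every field in double quotes, which spreadsheet CSV readers honour, doubling any embedded quote:

set pages 0 feedback off trimspool on lines 1000
spool emp.csv
SELECT '"' || empno || '","' || ename || '","' || REPLACE(address, '"', '""') || '"'
FROM   emp;
spool off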

View 9 Replies View Related

Server Utilities :: Geometric Data From Text To Table And Wrong CTL Upload Into Table

Jul 11, 2013

I have a requirement to import text files which are generated by the 3D modelling software Xsteel, where it records all the geometric information, and I want to import this information into an Oracle table.

CREATE TABLE dstv_head ( wo_no VARCHAR2(12),struct VARCHAR2(12),rev_no NUMBER,
mark VARCHAR2(12),pos VARCHAR2(12),grade VARCHAR2(12),qty NUMBER,PROFILE VARCHAR2(24),TYPE VARCHAR2(12),
len NUMBER,width_web NUMBER,width_bottom NUMBER,flange_thk NUMBER,web_thk NUMBER,radius NUMBER,kgm NUMBER,
kgm1 NUMBER,kgm2 NUMBER,bevel_plus NUMBER,bevel_minus NUMBER,holes_yn VARCHAR2(1),holes_v_yn VARCHAR2(1),
hole_x_dim NUMBER,hole_y_dim NUMBER,hole_dia NUMBER,no_of_holes NUMBER)

-- All of the data has to go into a specific field; for example, ** 9005.nc1 will go into the wo_no field and 1239401A will go under struct.

ST
** 9005.nc1   -- WO_NO
1239401A      -- STRUCT
1             -- REV_NO
9005          -- MARK
9005          -- POS
S275JR        -- GRADE
2             -- QTY
[code]....
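
No loader definition appears in the post, so the following is only a hedged sketch of a SQL*Loader control file, assuming the Xsteel output is first reshaped into one pipe-delimited line per member (that reshaping step is not covered here, and the data file name is invented):

LOAD DATA
INFILE 'dstv_head.dat'
APPEND
INTO TABLE dstv_head
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
( wo_no, struct, rev_no, mark, pos, grade, qty, "PROFILE", "TYPE", len,
  width_web, width_bottom, flange_thk, web_thk, radius, kgm, kgm1, kgm2,
  bevel_plus, bevel_minus, holes_yn, holes_v_yn, hole_x_dim, hole_y_dim,
  hole_dia, no_of_holes )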

View 24 Replies View Related

Server Utilities :: Import A Table With Table Already Present With New Columns

Mar 31, 2010

I want to import a table from my old dump file. The same table is already there in the development box, but a few more columns were added to it during testing, so those columns are not available in the dump.

TABLE_EXISTS_ACTION=TRUNCATE
The new table
SQL> desc "TESTINVENTORY"."TTRANSACTION"
Name Null? Type
----------------------------------------------------------------------------------- -------- --------------------------------------------------------
TRANSACTIONID NOT NULL CHAR(26)
BRANCHCODE NOT NULL CHAR(3)
EXTERNALSYSTEM NOT NULL CHAR(3)
EXTRACTSYSTEM NOT NULL CHAR(3)
OWNERBRANCHCODE NOT NULL CHAR(3)
TRADEREFERENCE NOT NULL CHAR(20)
[code]...

It gives an error while doing the import.
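
A hedged sketch of the Data Pump command (credentials, directory and file names are placeholders); when the target table already exists, impdp loads the dump's columns into it by name, so the extra development columns only need to be nullable or have defaults:

impdp system/manager DIRECTORY=dpump_dir DUMPFILE=old.dmp TABLES=TESTINVENTORY.TTRANSACTION TABLE_EXISTS_ACTION=TRUNCATE LOGFILE=imp_ttransaction.log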

View 4 Replies View Related






