Server Utilities :: Slow Exports Using Exp Command Due To Open Sessions
Nov 11, 2011
I have had the following problem open with Oracle support since March 2011 (8 months), and still no resolution.
When I export all our schemas on Sunday night it takes about 1 hour 50 minutes. When I export the same schemas on any other night it takes 7 hours. The only difference is that on Sunday at 4:00am we drop all connections in the connection pools and re-establish new connections. Then, 19 hours later, at 23:00 on Sunday, we perform the exports, which take only 2 hours to complete.
I have also tried recreating the connections in the connection pools during the week, and the exports have then only taken 2 hours to complete. But the following night after the connections have been used during the day, the exports again take 7 hours. So it appears the export speed gets significantly slower when there are many open connections that have been used and not closed.
From the Statspack report I found two SQL statements internal to the export command with an order-of-magnitude difference in elapsed execution time between the fast export and the slow export (see below).
How can I speed up the exports without having to drop and recreate the database connections in the connection pools each night?
FAST:
elapsed_time: 430.90
executions: 161,388
Module: exp@Oracle1 (TNS V1-V3)
SELECT COLNAME, COLNO, PROPERTY, NOLOG FROM SYS.EXU10CCL
WHERE CNO = :1 ORDER BY COLNO
elapsed_time: 264.29
executions: 50,349
Module: exp@Oracle1 (TNS V1-V3)
SELECT TOWNER, TNAME, NAME, LENGTH, PRECISION, SCALE, TYPE, ISNULL, CONNAME, COLID, INTCOLID, SEGCOLID, COMMENT$, DEFAULT$, DFLTLEN, ENABLED, DEFER, FLAGS, COLPROP, ADTNAME, ADTOWNER, CHARSETID, CHARSETFORM, FSPRECISION, LFPRECISION, CHARLEN, TFLAGS, 100 FROM SYS.EXU8COL
WHERE TOBJID = :1 ORDER BY INTCOLID
SLOW:
elapsed_time: 8264.16
executions: 124,662
Module: exp@Oracle1 (TNS V1-V3)
SELECT COLNAME, COLNO, PROPERTY, NOLOG FROM SYS.EXU10CCL
WHERE CNO = :1 ORDER BY COLNO
elapsed_time: 3877.78
executions: 38,813
Module: exp@Oracle1 (TNS V1-V3)
SELECT TOWNER, TNAME, NAME, LENGTH, PRECISION, SCALE, TYPE, ISNULL, CONNAME, COLID, INTCOLID, SEGCOLID, COMMENT$, DEFAULT$, DFLTLEN, ENABLED, DEFER, FLAGS, COLPROP, ADTNAME, ADTOWNER, CHARSETID, CHARSETFORM, FSPRECISION, LFPRECISION, CHARLEN, TFLAGS, 100 FROM SYS.EXU8COL
WHERE TOBJID = :1 ORDER BY INTCOLID
I use the following export command for each schema:
$ORACLE_HOME/bin/exp user/pass file=somefile.dmp owner=$SCHEMA log=somelog.log buffer=9000000
I have an Oracle Standard Edition 11.1.0.7 database on 64-bit Linux with a 7 GB SGA. I currently export approximately 200 schemas each night (I use exp rather than Data Pump because Data Pump is a lot slower and we can't use its parallel processing features on a Standard Edition database). The export normally takes 1 hour 50 minutes, which is approximately 2 schemas exported every minute. When the exports run slowly, each export takes almost 2 minutes to complete.
The database has about 20 GB of data and 50 GB of indexes. The database also has approximately 500 connections via TopLink connection pools from 8 application servers.
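For anyone comparing a fast and a slow run outside of Statspack, one way to watch those same two dictionary queries is to pull their per-execution elapsed time straight from V$SQL while the export is running. This is only a diagnostic sketch; the MODULE filter assumes the export client still shows up as exp@... as in the report above.
-- Per-execution elapsed time (seconds) of the two exp dictionary queries
SELECT sql_id,
       executions,
       ROUND(elapsed_time / 1e6, 2) AS elapsed_sec,
       ROUND(elapsed_time / GREATEST(executions, 1) / 1e6, 4) AS sec_per_exec,
       SUBSTR(sql_text, 1, 60) AS sql_text
FROM   v$sql
WHERE  module LIKE 'exp@%'
AND    (sql_text LIKE '%EXU10CCL%' OR sql_text LIKE '%EXU8COL%')
ORDER  BY elapsed_time DESC;
If the slow nights really are spent in these dictionary queries, it may also be worth testing whether ALTER SYSTEM FLUSH SHARED_POOL before the export (instead of recycling the connection pools) has the same effect; that is a guess to test off-hours, not a documented fix.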
View 2 Replies
May 7, 2011
Recently I migrated from Oracle 9i SE (9.2.0.1.0) to Oracle 11gR2 EE (11.2.0.1.0). Previously I took an export of some of my schemas, and the file size was around 1 GB (with the exp utility of Oracle 9i). Following the same practice, I now take an export of the same schema, with the same number of objects and the same data volume, and the export file on the Oracle 11gR2 database is significantly smaller, around 825 MB (with the expdp utility of Oracle 11g).
I would like to know why there is a difference in the size of the export files (.dmp) between the two Oracle versions. I have cross-checked the objects and rows of the data tables; they are exactly the same.
Command line parameter for export on Oracle 9i
exp test/test FILE=test.dmp OWNER=test GRANTS=y ROWS=y COMPRESS=y LOG=test.log
Command line parameter for export on Oracle 11g
expdp test/test DIRECTORY=dpump_dir DUMPFILE=test.dmp LOGFILE=test.log
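If you want to rule out an actual difference in data volume (as opposed to a difference in dump-file format), a rough cross-check is to compare segment sizes on both databases. This is only a sketch and assumes the schema is TEST as in the commands above; note that the exp and expdp dump formats differ, so a smaller expdp file does not by itself indicate missing data.
-- Rough size of the schema's table data (run on both the 9i and 11gR2 databases)
SELECT owner, ROUND(SUM(bytes)/1024/1024) AS mb
FROM   dba_segments
WHERE  owner = 'TEST'
AND    segment_type LIKE 'TABLE%'
GROUP  BY owner;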
View 3 Replies
Dec 17, 2012
Which command is the right way to open an alert log file of the size below?
-rw-r----- 1 oracle asmadmin 28070129 Dec 17 08:06 alert_DEMO.log
I usually open the alert log file using the vi editor. Is that the right way to open and view an alert log?
View 5 Replies
Oct 1, 2012
Server:
Windows 2008 R2 x64
Oracle Database 11g Release 11.2.0.1.0 Standard Edition
I recently applied the Oracle Database Server Version 11.2.0.1 Patch 16. I have several databases running on this box with the same application. Since then I've had lots of customers complaining about slow systems. The server has plenty of RAM available. I also have several other servers, all on the same versions, same patch, running the same application, with no issues.
Looking at v$session, there are often lots of active sessions that have a sql_address of "00". I've never seen this before, and I regularly look at v$session; as you can see below, these have been running for a while. The sessions do drop off, but I am at a loss as to what is happening.
select username, last_call_et, sid, serial#, user#, status, sql_address, sql_hash_value
from v$session
where username is not null and status = 'ACTIVE'
order by last_call_et desc, sid
[code]...
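I am not sure what a sql_address of "00" indicates here, but one way to see what those active sessions are actually doing is to check their current wait event rather than the SQL address. A hedged sketch against v$session (the wait columns are available in 10g and later):
-- What the 'ACTIVE' sessions with no SQL address are waiting on
SELECT sid, serial#, username, last_call_et, event, wait_class, state, seconds_in_wait
FROM   v$session
WHERE  username IS NOT NULL
AND    status = 'ACTIVE'
ORDER  BY last_call_et DESC;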
View 8 Replies
Aug 10, 2013
I just did a 112 GB file migration of production data using ORACLE_DATAPUMP, so I know this works in principle. When I tried it on my test instance I saw things like this:
[oracle@aggs00.test for_test]$ ls -l aggs_day_conversion_agg_2419
-rw-r----- 1 oracle oracle 15917056 Aug 10 09:06 aggs_day_conversion_agg_2419
CREATE TABLE IMP_3251198_2419(
PARTITION_DATE DATE,
USER_ID NUMBER,
SID NUMBER,
[code]....
Executed in 1800.642 seconds
Why could it be taking 1,800 seconds to select one record from a not very big table? File corruption? Disk fragmentation? Oracle instance configuration?
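One way to narrow down where the 1,800 seconds go (I/O against the dump file versus something else) is to trace the session running the SELECT and then read the wait events in the resulting trace file. A minimal sketch using DBMS_MONITOR; the SID and serial# values are hypothetical:
-- Enable a wait-event trace for the session running the slow SELECT
BEGIN
  DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => 123,  -- hypothetical SID
                                    serial_num => 45,   -- hypothetical serial#
                                    waits      => TRUE,
                                    binds      => FALSE);
END;
/
-- ...run the slow SELECT from that session, then turn tracing off:
BEGIN
  DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 45);
END;
/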
View 29 Replies
Oct 1, 2012
We had AIX on a 570 machine and database 10.2.0.4. We took an expdp every night and it took 2 hours to complete.
Now we have upgraded to 10.2.0.5 and a 770 machine, and the same command takes 6 hours to complete, even though both the database and the hardware have been upgraded.
The command is:
expdp T24SILK/oracle directory=backup dumpfile=exp_beod_T24_%U_$dt.dmp logfile=exp_T24_$dt.log EXCLUDE=TABLE:"LIKE '%TRACE'" parallel=6
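Not a definitive fix, but one thing worth checking after a database upgrade is whether dictionary and fixed-object statistics were regathered, since expdp leans heavily on dictionary queries. A hedged sketch (test off-hours first):
-- Refresh dictionary and fixed-object statistics after the upgrade
BEGIN
  DBMS_STATS.GATHER_DICTIONARY_STATS;
  DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
END;
/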
View 1 Replies
Aug 23, 2010
lsnrctl RELOAD [listener_name]. I hope the above command won't terminate the current sessions to the database.
View 2 Replies
Jan 6, 2011
I am trying to get a readable version of a .trc file generated by Oracle 10.2.0.4.0 with tkprof, but I have still been unable to get this done. This is what I have already tried:
tkprof /u00/app/oracle/admin/DB/udump/*BITN1234.trc /batch/salidas/student/BITNTRACE.txt
And whether I run this from my PL/SQL process (PRO*C) or from a command line, it returns:
TKPROF: Release 10.2.0.4.0 - Production on Thu Jan 6 11:04:38 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
could not open trace file /u00/app/oracle/admin/DB/udump/BITN1234.trc
I have already (from my PRO*C code and from command line)...
- made sure that the file is in the directory.
- run this from the udump directory, where the .trc file is...didn't work.
- run this from the udump directory, explicitly specifying the complete path in the tkprof line anyway (redundant, I know)... didn't work.
- tried to copy the file to another directory in order to run tkprof, and it returns: "cp: BITN1234.trc: The file access permissions do not allow the specified action."
- tried changing privileges with chmod a+r BITN1234.trc, and it returns: "chmod: BITN1234.trc: Operation not permitted."
- tried to run tkprof01 instead of tkprof, and it returns:
We trust you have received the usual lecture from the local System Administrator. It usually boils down to these two things:
#1) Respect the privacy of others.
#2) Think before you type.
View 3 Replies
May 10, 2010
When I am creating a service for Oracle using the oradim utility, the following message appears:
F:\Users>oradim -new -sid stby -startmode manual
DIM-00014: Cannot open the Windows NT Service Control Manager.
O/S-Error: (OS 5) Access is denied.
View 9 Replies
Mar 9, 2012
I am receiving errors when trying to load with the control file. The errors are as follows:
SQL*Loader-500 Unable to open file (homework.ctl)
SQL*Loader-553 file not found
SQL*Loader-559 System error: The system cannot find the file specified.
My control file is located directly in the C drive (C:\homework.ctl). The control file contains the following:
LOAD DATA
INFILE 'c:\country.dat'
APPEND INTO TABLE homework
fields terminated by ',' optionally enclosed by '"'
(country, month, day)
WHEN (month='April')
The command I am entering is:
sqlldr system/password control=homework.ctl
I've tried c:\homework.ctl, 'c:\homework.ctl', and placing the file in the BIN folder of Oracle.
View 7 Replies
May 31, 2004
What are the parameters of the exp and imp commands?
View 4 Replies
Nov 1, 2006
I'm getting an error when trying to use the new Data Pump Export/Import utility.
I am able to create a directory using SQL*Plus, and I get the "Directory created" message, but no directory actually gets created on the server.
SQL> CREATE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
Directory created. But I don't see the directory created on the server.
Then on the server:
C:\Documents and Settings\Administrator>expdp ******/****** FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log
Export: Release 10.2.0.1.0 - Production on Wednesday, 01 November, 2006 1:51:55
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
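As far as I know, CREATE DIRECTORY only registers a path string in the data dictionary; it does not create the folder on the file system, and the account running the Oracle service must be able to write to it. A quick check of what was actually stored:
-- The directory object only stores the path; the OS folder must already exist
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATAPUMP';
If the path shown there does not exist on the server, create it manually and make sure the Oracle service account has read/write access; otherwise expdp fails with ORA-39070/ORA-29283 as above.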
View 5 Replies
Jan 11, 2013
I am new to SQL*Loader, and I would like to know the maximum number of rows that can be loaded in a conventional-path bind array when specifying the ROWS command line parameter.
View 2 Replies
Aug 27, 2007
I am trying to run DBCA from the Unix command line, but it gives me an error like this:
dbca[158]: /home/oracle/product/10.2.0/jdk/jre/bin/java: not found.
I also tried to open some other utilities, like netca, and it gives me the same error.
View 5 Replies
Jul 25, 2012
I ran the exp command to take a user-based backup of an Oracle DB and later imported the dump (using the imp command) into another DB. It appears that some of the tables were not exported during the exp run. Can I use the exp command on Oracle 11.2?
Or should I always be using the expdp command?
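The exp utility still runs against 11.2, but one common reason for tables missing from an exp dump on 11.2 is deferred segment creation: empty tables that have never had a segment allocated are skipped by classic exp. A hedged check, assuming the missing tables are the empty ones:
-- Empty tables with no segment yet are skipped by classic exp on 11.2
SELECT table_name
FROM   user_tables
WHERE  segment_created = 'NO';
-- Allocating a segment makes such a table exportable by exp, e.g.
-- ALTER TABLE some_table ALLOCATE EXTENT;   -- some_table is a hypothetical name
Using expdp avoids the issue altogether, since Data Pump does not skip these tables.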
View 3 Replies
Apr 5, 2010
How can I load/unload a CSV file using a SQL command?
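One SQL-only way to load a CSV is an external table over the file (a directory object is required); unloading can then be done by spooling a SELECT from SQL*Plus or by writing an ORACLE_DATAPUMP external table. A minimal sketch with hypothetical names (directory CSV_DIR, file data.csv, three columns):
-- Load: external table over the CSV file, then INSERT ... SELECT into the real table
CREATE TABLE csv_ext (
  id    NUMBER,
  name  VARCHAR2(50),
  amt   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('data.csv')
);
INSERT INTO target_table SELECT * FROM csv_ext;   -- target_table is hypothetical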
View 1 Replies
Dec 27, 2011
I have taken a database backup using the exp command, and when I try to import it on another PC the foreign keys are not imported. It gives an error message saying there is no matching unique or primary key for this column list.
How can I take a backup that includes the primary keys?
View 7 Replies
Jun 26, 2013
Below is my import command for importing a specific function from an export file, but I am getting the errors below:
impdp system/PASSWORD schemas=TNC6 directory=dumpdir dumpfile=FULL01-02-2011.dmp logfile=IMP.log include=FUNCTION:"IN ('TNC_IS_NUMBER')"
ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression
View 4 Replies
Jan 6, 2012
I want to take an export of schema JACK, which is 700 MB in size and contains the following objects:
SQL> select count(*),object_type from dba_objects where owner='JACK' group by object_type;
COUNT(*) OBJECT_TYPE
---------- -------------------
207 INDEX
4 PROCEDURE
190 TABLE
80 VIEW
3 SYNONYM
67 SEQUENCE
6 rows selected.
The export command I am going to use is below:
exp system/oracle@ORCL1 file=schemaexp.dmp log=schemaexp.log owner=JACK rows=y direct=y grants=N constraints=y COMPRESS=N buffer=100000000 RECORDLENGTH=64000
Is it possible to take this schema export from the Windows command prompt, and is there any way to estimate how long the export would take to complete? Based on the time it takes, I will decide whether to perform the export from the Windows command prompt.
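The elapsed time depends mostly on the data volume and disk throughput rather than on whether the export is started from a Windows command prompt, so a rough way to set expectations is to check the schema's segment sizes first. Only a sketch:
-- Rough data volume to be exported for schema JACK
SELECT segment_type, ROUND(SUM(bytes)/1024/1024) AS mb
FROM   dba_segments
WHERE  owner = 'JACK'
GROUP  BY segment_type
ORDER  BY mb DESC;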
View 3 Replies
Dec 30, 2010
When I try to start the CSS service to create a new ASM instance on my own PC for testing purposes, I get the error below: "'localconfig' is not recognized as an internal or external command, operable program or batch file."
View 1 Replies
Apr 5, 2011
I'm working with sqlldr and am trying to insert data from a CSV file using a CTL file. One field of my table holds 5 characters, but one row has 6 characters in this field, so it is rejected by Oracle. (Logical: you can't insert 6 characters into a 5-character field.)
An error is visibly returned, so I wondered how you can catch the value of this error. Is it a code? A message?
I'd like to add a condition to my script so that the rest of the script continues even if this error code is returned for that CTL execution.
View 11 Replies
Sep 11, 2013
I am trying to run PL/SQL Developer from the command line, so that it opens with a SQL Window by default, with a script (which I specify in the command) already loaded in it.
I've tried:
"C:\Program Files (x86)\PLSQL Developer\plsqldev.exe" userid=andrey/andrey@mydb
but it just opens the application with a connection, no file.
*CORRECTION* While I was writing this post, I tried something not so sophisticated, and it appeared to work:
"C:\Program Files (x86)\PLSQL Developer\plsqldev.exe" userid=andrey/andrey@mydb "C:\Program Files (x86)\PLSQL Developer\andrey.sql"
View 1 Replies
Mar 30, 2008
I am trying to run sqlldr against a file:
C:\oratest\20080318
Is it possible that I am getting SQL*Loader-500 (unable to open file) and SQL*Loader-503 (file not found) just because the file name does not have an extension?
I can see that for any file name I try, it looks for whateverFileName.dat! Is there a way to have sqlldr work with files that do not have extensions?
View 22 Replies
Jun 23, 2010
I am facing a problem: we used to copy and open documents in Windows with the following commands:
Host('CMD /C COPY '||'"'||:Old_Doc||'"'||' X:�3�3'||:New_Doc);
Client_Host('Cmd /C Start '||'X:�3�3'||:New_Doc);
But these commands do not run on Linux.
View 12 Replies
Jul 16, 2013
Is there a way to automate or schedule exports? If not, is there another program that can be used?
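On the database side, exports can be scheduled without any external program by combining DBMS_SCHEDULER with the DBMS_DATAPUMP API (or simply by scheduling an expdp/exp script through cron or Windows Task Scheduler). A minimal, hedged PL/SQL sketch of a nightly schema export; the schema, directory, file and job names are all assumptions:
-- Nightly Data Pump export of schema SCOTT driven by the scheduler (names are hypothetical)
CREATE OR REPLACE PROCEDURE nightly_schema_export AS
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(h, 'scott_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.dmp', 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'scott_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_EXPORT_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'NIGHTLY_SCHEMA_EXPORT',
    repeat_interval => 'FREQ=DAILY;BYHOUR=23',
    enabled         => TRUE);
END;
/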
View 1 Replies
Mar 9, 2012
My application is opening a lot of sessions on my DB server. I set resource_limit=true and idle_time=15 minutes, and assigned this profile to all application users.
Now I am seeing a lot of sessions with a status of SNIPED in v$session.
I want to understand what these sniped sessions mean and how to clean them up.
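As I understand it, SNIPED means the session exceeded a resource limit in its profile (here, IDLE_TIME): Oracle marks it as killed, but the row stays in v$session and the server process hangs around until the client next tries to use the connection. A hedged sketch for listing them and, if needed, removing one explicitly:
-- Sessions left in SNIPED state by the idle_time limit
SELECT sid, serial#, username, status, last_call_et
FROM   v$session
WHERE  status = 'SNIPED';
-- Removing one explicitly (the sid,serial# values are hypothetical)
-- ALTER SYSTEM KILL SESSION '123,45' IMMEDIATE;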
View 19 Replies
Aug 19, 2010
I am using Oracle 10g as a server in my lab. I faced some problems initially, but after increasing the USERS tablespace it is working fine.
But there is still one problem: during execution, some queries get blocked, and that prevents subsequent queries from the same user from executing.
The blocked sessions are displayed on the admin page under the blocking sessions link. There is an option to kill the session, but when I do that it affects all the users and the connection is lost for all of them, and I have to start up the database again from the beginning.
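Rather than killing sessions from the admin page, it may be enough to identify the single blocking session and deal with only that one. A hedged sketch against v$session (10g exposes BLOCKING_SESSION directly):
-- Who is blocked, and which session is doing the blocking
SELECT sid, serial#, username, event, blocking_session
FROM   v$session
WHERE  blocking_session IS NOT NULL;
Killing only the blocking session's sid/serial# should not normally disconnect the other users; if killing any one session drops everyone, the application is probably funnelling all users through a single connection.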
View 1 Replies
Jul 13, 2010
We had an issue last week where a session with a very basic SQL query locked up the database, spiking the CPU at 100%. When we killed the session, the lock would just jump to another session, and so on. We finally had to restart the database since our clients were being kicked out. After the restart of the database, the LGWR ended up locking and held the CPU between 85-95%. The archive logs were switching every 5 minutes, when normally it would be every 45 minutes. We spoke with Oracle Support, but they just brushed the issue off, saying it was a hardware issue, and were not able to provide any kind of backing for that.
View 4 Replies
Mar 26, 2013
I have a simple question about database sessions. The value of the "sessions" parameter is set to 500, and users connect to the database through an application server (JBoss). More than 500 users connect to the database through the application.
My question is: how can more than 500 users connect to the database without any issue if we set the value of the "sessions" parameter to 500?
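Part of the answer is usually that application users are not database sessions: JBoss keeps a pool with a limited number of physical connections and multiplexes the application users over them, so the SESSIONS parameter only has to cover the pooled connections plus background and recursive sessions. To see how close the instance actually gets to the limit, a sketch:
-- Compare the configured limits with the high-water marks actually reached
SELECT resource_name, current_utilization, max_utilization, limit_value
FROM   v$resource_limit
WHERE  resource_name IN ('sessions', 'processes');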
View 2 Replies
Feb 5, 2011
I have a table with a counter value which will be incremented or decremented by several application servers.
SQL> select * from test;
COUNTER
----------
10
Application servers (multiple servers) will be running updates against this row to increase or decrease the counter value.
update test set counter=counter+1;
update test set counter=counter-1;
update test set counter=counter+1;
update test set counter=counter+1;
So when updates happen concurrently on this table, will the counter value get messed up?
I did a small test by opening multiple sessions and running the updates, and the results I got for the above update statements were 11, 10, 11, 12.
But our developer is a bit skeptical about this approach, and he is using SELECT FOR UPDATE and then updating the row.
Which approach will be better?
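For what it's worth, both statements end up serialized on the same row lock: a plain UPDATE ... SET counter = counter + 1 reads and writes the current value under that lock, so concurrent increments simply queue and none are lost. SELECT FOR UPDATE adds an extra round trip and mainly matters when the new value depends on logic performed between the read and the write. A hedged sketch of both patterns against the test table above:
-- Pattern 1: atomic increment; concurrent sessions queue on the row lock, no lost updates
UPDATE test SET counter = counter + 1;
COMMIT;
-- Pattern 2: read the value first, then update it, holding the row lock in between
DECLARE
  v_counter test.counter%TYPE;
BEGIN
  SELECT counter INTO v_counter FROM test FOR UPDATE;
  -- ...application logic that needs v_counter goes here...
  UPDATE test SET counter = v_counter + 1;
  COMMIT;
END;
/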
View 9 Replies