Server Administration :: Core Dumps Under $ORACLE_HOME/dbs

Oct 9, 2012

I see that large .core files are being generated in the $ORACLE_HOME/dbs folder, even though no dump destination parameter points to the dbs folder. How can I check what is causing these files to be generated?

PROD1 /oravl01/oracle/11.1.0.7/dbs > ls
1 core.11528 core.15973 core.22231 core.26792 core.29496 core.5262 initCCLCLMP.ora
controlfile.binary.PROD1.Fri core.12504 core.15981 core.2262 core.27129 core.29802 core.5847 initdw.ora
controlfile.binary.PROD1.Mon core.12520 core.1617 core.23115 core.27196 core.30793 core.6176 init.ora
controlfile.binary.PROD1.Sat core.1256 core.1626 core.23276 core.27531 core.30877 core.6221 initTESTCRM.ora
controlfile.binary.PROD1.Sun core.12594 core.16364 core.23681 core.27560 core.30885 core.6535 orapwPROD1

[code]....
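
One way to identify the source of these files (a sketch, not verified on this system; it assumes common Unix tools and uses one of the core file names from the listing above):

$ cd $ORACLE_HOME/dbs
$ file core.11528                              # names the executable that dumped core
$ strings core.11528 | grep -i 'ORA-' | head   # look for Oracle error stamps inside the dump
$ gdb $ORACLE_HOME/bin/oracle core.11528       # if gdb is available, 'bt' shows the crash stack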


Server Administration :: Difference Between ORACLE_HOME And ORACLE_BASE?

Oct 3, 2011

What are ORACLE_HOME and ORACLE_BASE, and what is the difference between them?
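
For illustration, a typical OFA-style layout (the paths are examples, not mandated values): ORACLE_BASE is the root of the Oracle directory tree for an installation owner, while ORACLE_HOME is the directory beneath it where one specific software release is installed.

$ echo $ORACLE_BASE
/u01/app/oracle
$ echo $ORACLE_HOME
/u01/app/oracle/product/11.2.0/dbhome_1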


Server Administration :: How To Move Listener Log From ORACLE_HOME To DIAG_HOME

Jun 29, 2012

On one of my servers the listener logs are in the normal location, i.e. $ORACLE_HOME/network/log, while on another server the listener log is in the DIAG home. How do I move the listener log from ORACLE_HOME to DIAG_HOME?

oracle > lsnrctl status LISTENER
LSNRCTL for HPUX: Version 11.2.0.2.0 - Production on 29-JUN-2012 10:20:14
Copyright (c) 1991, 2010, Oracle. All rights reserved.
Connecting to (ADDRESS=(PROTOCOL=tcp)(HOST=)(PORT=1521))
STATUS of the LISTENER
------------------------
Alias LISTENER
Version TNSLSNR for HPUX: Version 11.2.0.2.0 - Production
Start Date 03-MAY-2012 08:09:57
Uptime 57 days 0 hr. 10 min. 16 sec
[code]...
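
A sketch of the usual 11g mechanism, assuming the default listener name LISTENER (the parameters are the documented ADR controls; the ADR_BASE path is an example). When ADR is enabled for the listener, its log moves under <ADR base>/diag/tnslsnr automatically, so the server still logging under ORACLE_HOME most likely has ADR disabled in its listener.ora:

# $ORACLE_HOME/network/admin/listener.ora
DIAG_ADR_ENABLED_LISTENER = ON
ADR_BASE_LISTENER = /u01/app/oracle    # log appears under .../diag/tnslsnr/<host>/listener

$ lsnrctl reload LISTENER              # re-read listener.ora without a full restart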


Server Administration :: Information Regarding Latest Patch Installed For ORACLE_HOME

Feb 23, 2010

How do I find out information about the latest patch installed in an ORACLE_HOME?

OR

If given a patch number, how do I find out whether it has been applied to the ORACLE_HOME?
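
A sketch using OPatch, which ships in every home (the patch number below is only a placeholder):

$ $ORACLE_HOME/OPatch/opatch lsinventory                   # lists all interim patches in this home
$ $ORACLE_HOME/OPatch/opatch lsinventory | grep 1234567    # check whether a given patch number is applied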


Server Administration :: How To Read Dump Contents

Jan 4, 2012

How do I read the treedump contents of the index IDX_TB_TEST_N1?

SQL> select object_id,object_name from dba_objects where owner='HXL';

OBJECT_ID OBJECT_NAME
---------- ---------------------------------------------
51786 IDX_TB_TEST_N1

alter session set events 'immediate trace name treedump level 51786'

/u01/app/oracle/admin/oracl/udump/oracl_ora_2679.trc
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORACLE_HOME = /u01/app/oracle/product/10.2.0/db_1

[code]...
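
For orientation, a treedump lists one line per index block, top-down; the excerpt below is illustrative only (the block addresses are made up). level is the height above the leaf blocks, nrow the number of entries in a block, and rrow the entries not marked deleted:

----- begin tree dump
branch: 0x100025a 16777818 (0: nrow: 2, level: 1)
   leaf: 0x100025b 16777819 (-1: nrow: 222 rrow: 222)
   leaf: 0x100025c 16777820 (0: nrow: 218 rrow: 218)
----- end tree dump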


Server Administration :: Cannot Audit Entry In Dump

Jul 26, 2012

I am trying to enable auditing as follows:

SQL> alter system set audit_trail=OS SCOPE=SPFILE;

System altered.

SQL> STARTUP FORCE
ORACLE instance started.

Total System Global Area 171966464 bytes
Fixed Size 2019320 bytes
Variable Size 113246216 bytes
Database Buffers 50331648 bytes
Redo Buffers 6369280 bytes
Database mounted.
Database opened.

SQL> show parameter audit

NAME TYPE VALUE
------------------------------------ ----------- ------------------------------
audit_file_dest string /u01/app/oracle/admin/orcl/adump
audit_sys_operations boolean FALSE
audit_syslog_level string
audit_trail string OS
SQL>

SQL> create user apexos identified by abc1;

User created.

SQL> grant connect, resource to apexos;

Grant succeeded.

SQL> audit select table, insert table by apexos by access;

Audit succeeded.

SQL> audit table by apexos by access;

SQL> SELECT audit_option, failure, success, user_name
FROM dba_stmt_audit_opts;

AUDIT_OPTION FAILURE SUCCESS USER_NAME
---------------------------------------- ---------- ---------- ------------------------------
TABLE BY ACCESS BY ACCESS APEXOS
SELECT TABLE BY ACCESS BY ACCESS APEXOS
INSERT TABLE BY ACCESS BY ACCESS APEXOS

SQL> conn apexos/abc1

SQL> CREATE TABLE TAB1 (ID NUMBER, NAME VARCHAR2(20));

Table created.

SQL> insert into tab1 values (10, 'Michel');

1 row created.

SQL> insert into tab1 values (30, 'Andrew');

1 row created.

SQL> select * from tab1;

ID NAME
---------- --------------------
10 Michel
30 Andrew

SQL> /

ID NAME
---------- --------------------
10 Michel
30 Andrew

SQL>

SQL> select username, timestamp, action_name, action, SES_ACTIONs, sql_text
2 from USER_audit_trail where username='APEXOS';

no rows selected

SQL>

I also did not find any file containing the above statements as audit records in /u01/app/oracle/admin/orcl/adump.

There are numerous old files in the /u01/app/oracle/admin/orcl/adump location, but when I executed the SQL statements, no new audit file was generated there.

How can I find the audit records?
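
One thing worth checking (a hedged observation rather than a certain diagnosis): with AUDIT_TRAIL=OS the records go to files under AUDIT_FILE_DEST and never into the AUD$ base table, so DBA_AUDIT_TRAIL and USER_AUDIT_TRAIL stay empty by design. To see rows in the dictionary views, route the trail to the database instead:

SQL> alter system set audit_trail=DB scope=spfile;
SQL> startup force
-- re-run the audited statements as APEXOS, then:
SQL> select username, action_name, timestamp from dba_audit_trail where username = 'APEXOS';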


Server Administration :: User Dump / Trace Files With Huge Size In MBs

Mar 22, 2012

I am facing a problem with the user_dump_dest directory: I have noticed a lot of trace files of huge size (in MBs). I cleaned it out, and after 4 days there are 40 GB of files again.
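
Two common containment steps, sketched with example values (the path and the 3-day retention are placeholders):

SQL> alter system set max_dump_file_size = '100M';  -- cap the size of any single trace file
$ find /u01/app/oracle/admin/ORCL/udump -name '*.trc' -mtime +3 -delete   # purge traces older than 3 days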


DB 11g On Win2K3 Takes 100% Of One Core 2 To 4 Minutes After Startup

Feb 23, 2013

I have a Windows 2003 VirtualBox instance to which I assigned 3 of the 4 cores my laptop has. This is a demonstration environment for an Oracle vertical product; I got it from my colleagues. The OS boots without starting the DB services; I set it up that way deliberately while trying to figure out what is happening.

About two and a half to three and a half minutes after the service is started, oracle.exe "latches onto a core and does not let go" (as best as I can describe what I see). With 3 cores I see 33%-34% processor use in Task Manager, with oracle.exe doing all the using. Nothing else is started; there is no process of which I am aware that actually uses the DB. I only start the TNS listener and the database service.

Once I start it, the demonstration software uses the database extensively for complex queries. With one of the 3 cores 100% used by oracle.exe, I am running short on CPU at times, which makes the demonstration seem sluggish, and queries take longer than is really acceptable (not surprising, seeing oracle.exe is very busy doing I know not what).
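
A diagnostic sketch for finding which session the CPU time is going to (standard v$ views; the statistic is reported in centiseconds):

SQL> select se.sid, se.program, ss.value as cpu_centisec
       from v$sesstat ss, v$statname sn, v$session se
      where sn.name = 'CPU used by this session'
        and ss.statistic# = sn.statistic#
        and ss.sid = se.sid
      order by ss.value desc;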


Possible To Have Oracle RAC Standard Edition With 3 Nodes Of Intel Quad Core

Feb 5, 2013

I read that Oracle RAC, which is bundled with Standard Edition, can support at most 4 sockets. One of my clients proposes using RAC on 3 nodes, each with a single quad-core Intel processor. As far as I understood, an Intel quad core is a multi-chip module, actually a combination of 2 dual-core modules, so each Intel quad core may be counted as 2 sockets. That would mean my client's proposal fails, as the total number of sockets would be 2*3 = 6, which exceeds the supported maximum of 4.


Enterprise Manager :: Core.xxx Files Generated By Grid Control Agent?

Aug 2, 2010

I have a server with Red Hat EL 5.5 running an Oracle database 10g R2 and an Oracle Agent 10.2.0.5. The disk ran out of space, and I realized that there are a lot of files starting with core.xxx, such as:

-rw------- 1 gridagent oinstall 17M Aug 2 13:10 core.10348
-rw------- 1 gridagent oinstall 17M Aug 2 13:15 core.10827
-rw------- 1 gridagent oinstall 17M Aug 2 12:30 core.4129
-rw------- 1 gridagent oinstall 17M Aug 2 12:35 core.4772

What are these files? Why are they generated in $AGENT_HOME/<HOST>_<SID>/sysman/log/?

There are more than 24 GB of these files.

In nmc.log I can see lines such as:
NMC-00000 2010-08-02 13:25:39 [11760, nmcdbg.c,0440]TRC: Debug context enabled.
NMC-20020 2010-07-30 01:31:58 [23083, nmccole.c,0725]ERR: Could not collect using DSGA Collection method
NMC-20014 2010-07-29 01:31:56 [11258, nmccole.c,0822]ERR: Could not attach to the SGA
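
A hedged cleanup-and-containment sketch (the paths and 7-day retention are examples; the underlying NMC errors still need a root-cause fix, e.g. via an SR):

$ find $AGENT_HOME/*/sysman/log -name 'core.*' -mtime +7 -exec rm {} \;   # purge old dumps
$ ulimit -c 0    # in the agent owner's profile: stop new core files from being written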


Oracle_Home Directory Recovery

Apr 21, 2011

If the ORACLE_HOME directory crashes or is lost, what then? Is there any option for recovery? If yes, how?
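
A hedged precaution sketch: with the software shut down, an ORACLE_HOME can be backed up and restored as a plain file tree (paths and names are examples):

$ tar -czf /backup/orahome_11107.tar.gz -C /oravl01/oracle 11.1.0.7   # take the backup
$ tar -xzf /backup/orahome_11107.tar.gz -C /oravl01/oracle            # restore after a loss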


Oracle_home / Bin Directory Deleted

Nov 4, 2011

I have a major problem: my oracle_home/bin directory was mistakenly deleted by one of our users, so I cannot use any of the Oracle utilities (sqlplus, dbca, netca, netmgr). How do I recover from this problem?
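
One possible recovery sketch, assuming another server has the identical release and patch level (host and paths are examples; restoring from a backup of the home is the safer route):

$ scp -r otherhost:$ORACLE_HOME/bin $ORACLE_HOME/   # copy the bin tree from a twin install
$ $ORACLE_HOME/bin/relink all                       # relink the executables for this machine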


ODA Database Appliance Oakcli Create ORACLE_HOME

May 15, 2013

When using oakcli to create an Oracle home on an ODA, are there any parameters or configuration files that determine the directory where the database software is installed? I want to create 3 separate 11.2.0.3 ORACLE_HOMEs in the following directories, all owned by the 'oracle' user:

/u01/app/db1/product/11.2.0.3/dbhome_1
/u01/app/db2/product/11.2.0.3/dbhome_1
/u01/app/db3/product/11.2.0.3/dbhome_1

But oakcli seems to accept only the version parameter:

oakcli create dbhome -h
Usage:
oakcli create dbhome [-version <version>]

where:
version - Version information for creating the database home.

That installs Oracle 11.2.0.3 into /u01/app/oracle/product/11.2.0.3/dbhome_1 only. The reason for the separate homes is the flexibility to patch them independently of each other (i.e. dependent only on the requirements of the products they support).


Server Utilities :: Importing 9i Dump Into 10g?

Jun 22, 2011

How can I import an Oracle 9i dump file into a 10g database? While importing, I get the following error: IMP-00002 (failed to open the dump file).
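
IMP-00002 usually means the file path is wrong or unreadable rather than a version problem; a sketch of the classic import (names and paths are examples):

$ ls -l /backup/exp9i.dmp    # first confirm the file exists and is readable
$ imp system/password@db10g file=/backup/exp9i.dmp log=imp9i.log fromuser=scott touser=scott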


Server Utilities :: Reading From Oracle Dump

Jun 14, 2006

Is there any way of reading the contents of an Oracle dump file?
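
For a classic export dump, imp can list the contents without loading anything; a sketch (file names are examples):

$ imp system/password file=exp.dmp show=y full=y log=contents.log
# show=y writes the DDL contained in the dump to the log instead of executing the import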


Server Utilities :: Dump Table From One To Other Schema

Mar 4, 2010

I need to dump a table from schema A to schema B with a different table name.

Suppose I have table A in schema "A"; I need to dump that table, with data + structure, into schema "B" with the table name B.
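
Classic exp/imp cannot rename a table during import, so one workable sketch is import-then-rename (10g-era classic utilities; credentials are placeholders):

$ exp A/password tables=A file=a.dmp
$ imp system/password file=a.dmp fromuser=A touser=B
SQL> -- connected as B:
SQL> alter table A rename to B;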


Server Utilities :: Dump File Determination

Mar 29, 2013

Is it possible to determine whether a dump file was created using data pump export or the normal export method just by looking at the dump file? If yes, how?

Why I am asking: normal export and data pump export both create dump files with the same extension (filename.dmp), so to avoid confusion during import I want to determine by what method a dump file was created.

This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know by what method the dump file was created).
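
One quick check, offered as a heuristic rather than a documented interface: classic export files begin with a readable EXPORT:V banner, while data pump files do not.

$ head -c 100 file.dmp | strings
# classic export shows something like:  EXPORT:V10.02.01
# no such banner suggests a data pump file, so use impdp rather than imp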


Server Utilities :: How To Import Dump In Schema

Feb 7, 2011

I have exported a schema dump with the schema name 'A'. I want to import that dump into schema 'B'. How?
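
Both utilities can do this; a sketch with placeholder names (pick the line matching how the dump was taken):

$ imp system/password file=a.dmp fromuser=A touser=B log=imp.log                  # classic export
$ impdp system/password directory=DATA_PUMP_DIR dumpfile=a.dmp remap_schema=A:B  # data pump export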


Server Utilities :: FTP Dump File Over Network

Apr 19, 2010

I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server, and I'm trying to move the dump file to a directory on a Windows Server 2003 machine over the network, using an FTP script that will run automatically after the export process finishes.
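
A minimal post-export FTP sketch (host, credentials, and paths are placeholders). The essential point is transferring in binary mode, since ASCII mode corrupts a dump file:

ftp -n winserver <<EOF
user ftpuser ftppassword
binary
cd backup
put /u01/exp/daily.dmp daily.dmp
bye
EOF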


Server Utilities :: Dump File Generation

Jan 18, 2012

I have a question on export dump file generation.

select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';

The above select query gives a schema size of 15 GB. When I export the same schema, the generated dump file size is 2 GB. What is the difference between the two figures, and how can there be such a variation in file size?
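
The gap is usually because dba_segments counts allocated space (indexes, free blocks below the high-water mark, unused extents), while the export writes only row data plus DDL. A sketch for seeing where the 15 GB actually sits:

SQL> select segment_type, round(sum(bytes)/1024/1024/1024, 2) as gb
       from dba_segments
      where owner = 'JACK'
      group by segment_type
      order by gb desc;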


Server Utilities :: How To Reimport 9i Dump File To 10g

May 29, 2010

I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patchset, 10.2.0.4.

I am creating the production database and importing the 9i dump file into it. Now I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and load it into the new 10g DB.

Do I need to just import the latest 9i dump into the 10g DB, or do I need to do anything else?


Server Utilities :: Import Dump File In 11g

Feb 24, 2012

I am facing a problem importing a DMP file into 11g: while importing, the session stops responding. I have attached a JPG file to show exactly what goes wrong during the import. My dump is from 9i; I want to import it into 11g R2.
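
One way to tell a genuine hang from a slow table load is to run the classic import with progress feedback and a log (file names are examples):

$ imp system/password file=old9i.dmp full=y log=imp11g.log feedback=10000
# feedback=10000 prints a dot per 10,000 rows, so stalled output distinguishes a hang from a long load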


Server Utilities :: Export Dump File

Jul 29, 2011

Is it possible to identify what level of export was taken by looking at an export dump file, i.e. whether it is a schema export, full export, table export, etc.? If yes, how?
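
A sketch using show=y: the import log echoes the export banner and then lists the users and tables the dump contains, which reveals whether it was a full, schema, or table export (file names are examples):

$ imp system/password file=exp.dmp show=y full=y log=show.log
$ head show.log    # e.g. "Export file created by EXPORT:V09.02.00 via conventional path"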


Client Tools :: SP2-0750 / Set ORACLE_HOME To Oracle Software Directory

Jan 29, 2010

bash: sqlplus: command not found
Error 6 initializing SQL*Plus
Message file sp1<lang>.msb not found
SP2-0750: You may need to set ORACLE_HOME to your Oracle software directory

I installed Oracle 10g and the process was successful; I could do everything. The problem started when I installed Zend Core for Oracle: the Zend install was successful, but now I can't start sqlplus.
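
A likely repair sketch, assuming the Zend Core install clobbered the Oracle variables in the shell profile (the home path is an example):

export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
export PATH=$ORACLE_HOME/bin:$PATH
export LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
sqlplus /nolog    # should now start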


Server Utilities :: Export Dump Of Large Table

Apr 9, 2010

We have two databases running on 10.2.0.4 and 9.2.0.8, both with the same unpartitioned table, 80 GB in size. I am exporting the table on 10g using parallel=8 and a dumpfile with the %U option; that took around 4 hours.

On 9.2.0.8, I am exporting using the parameters below, which takes around 5 hours.

buffer=2000000
recordlength=64000

What options can I try to speed up the export in both versions?
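
For the 9.2.0.8 classic export, a commonly tried combination is direct path with a maximal record length (standard exp parameters; names and values are examples):

exp user/password tables=BIGTAB file=bigtab.dmp direct=y recordlength=65535 statistics=none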


Server Utilities :: Import Dump File Without 2 Tables

Jan 3, 2012

I want to import a dump file while skipping 2 tables. The dump file contains 100 tables plus indexes and constraints, and out of those 100 tables I want to import only 98.
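
If the dump was taken with data pump, impdp can exclude tables directly; a sketch with placeholder names (classic imp has no exclude option, so there you would have to list the 98 wanted tables with tables= instead):

impdp system/password directory=DP dumpfile=full.dmp exclude=TABLE:"IN ('TAB_A','TAB_B')"
# shell quoting can be awkward; putting the EXCLUDE clause in a parfile avoids escaping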


Server Utilities :: Import Oracle 10g Dump Into 9i Database

Mar 31, 2010

If I want to import a 10g export dump file into a 9i database, I connect to the 10g database from the 9i server using

exp user/password@10gdb ....

However, is there any option like executing the 9i catexp.sql against the 10g database and doing the export from the 10g database itself, so the dump can be imported into 9i?
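
The usual direction, sketched with placeholder names: run the older 9i exp client against the 10g database over SQL*Net, so the dump is written in a format 9i imp can read:

$ # from the 9i server:
$ exp system/password@db10g owner=scott file=for9i.dmp
$ imp system/password@db9i fromuser=scott touser=scott file=for9i.dmp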


Server Utilities :: Evaluate Export Dump Size

Jan 11, 2012

I want to take a schema-level export. The schema is 115 GB in size. On the server side (where we are taking the dump), do we need the same amount of free space as the schema size, or is less or more space required?
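
With data pump the server can estimate the export size before writing anything; a sketch (the schema name is a placeholder). Classic export has no estimator, but the dump is normally smaller than the segment size, since indexes are exported as DDL only:

expdp system/password schemas=APP estimate_only=y estimate=blocks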


Server Utilities :: Procedures To Import A Dump From 9i To 11g Directly?

May 31, 2011

I tried to import a dump into 11g that was taken on Oracle 9i. The import starts but hangs after some time; to be exact, it checks only the character sets of the DBs and then hangs. Are there any specific procedures for importing a dump from 9i into 11g directly?
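
A general note rather than a diagnosis of this exact hang: a 9i dump is a classic export file, so it must be loaded with imp, not impdp; running with a log and feedback shows whether it is actually progressing (file names are examples):

$ imp system/password@db11g file=exp9i.dmp full=y log=imp.log feedback=100000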


Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)

May 29, 2012

I need to recreate/clone my database on a new machine. The two machines are not connected over the network.

Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

Step 2.
FTP dump files to Windows

Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y

I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"

Done in AIX:
create directory dp as '/bak'
grant read, write on directory dp to public;
grant exp_full_database to username;

Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;







