Unable To Delete Archive Log File On Windows Server?
Sep 20, 2010
version: 10.2.0.4
OS: windows server 2003
I am not able to manually delete a one-month-old archive log file on Windows. The v$archived_log view on the primary database doesn't have information about the standby, but the sequences were already applied to the standby database, and the view shows the status as DELETED. While deleting the file manually, it shows an error that another program or person is using it.
ORA-00257: archiver error. Connect internal only, until freed.
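A minimal diagnostic sketch, assuming nothing beyond the standard views: before removing anything at the OS level, confirm what the primary has recorded for those old sequences (the 30-day window below is only an example, and the standby should appear under one of the dest_id values):

-- Check how the primary has marked the old archived logs
SELECT sequence#, dest_id, applied, deleted, status
FROM   v$archived_log
WHERE  completion_time < SYSDATE - 30
ORDER  BY sequence#, dest_id;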
When we tried to remove the unwanted arc files through ASMCMD, we got the error below:
ASMCMD> rm -ef 2011_04_05/
Unknown option: e
usage: rm [-rf] <name1 name2 . . .>
ASMCMD> rm -rf 2011_04_05/
ORA-15032: not all alterations performed
ORA-15028: ASM file '+XCOM_BACKUP_DG/TXCOM/ARCHIVELOG/2011_04_05/thread_2_seq_27215.1143.747641143' not dropped; currently being accessed (DBD ERROR: OCIStmtExecute)
ORA-15032: not all alterations performed
ORA-15028: ASM file '+XCOM_BACKUP_DG/TXCOM/ARCHIVELOG/2011_04_05/thread_3_seq_21762.826.747641143' not dropped; currently being accessed (DBD ERROR: OCIStmtExecute)
ORA-15032: not all alterations performed
ORA-15177: cannot operate on system aliases (DBD ERROR: OCIStmtExecute)
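Before forcing anything through ASMCMD, a hedged check from the database instance (not the ASM instance) is whether those files are still tracked as available archive logs; the path below is copied from the ASMCMD output above:

-- Archive logs still listed with STATUS = 'A' (available) are typically the
-- ones ASM refuses to drop because an instance still references them.
SELECT name, status, applied, deleted
FROM   v$archived_log
WHERE  name LIKE '+XCOM_BACKUP_DG/TXCOM/ARCHIVELOG/2011_04_05/%';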
We are getting the error below. One of our packages calls the orawpcom.dll library file.
ERROR at line 1:
ORA-06520: PL/SQL: Error loading external library
ORA-06522: Unable to find library '/oracle9i/app/product/11.1.0.7.0\bin\orawpcom.dll'.
ORA-06512: at "GFSAM.OAINVOKEDOUBLE", line 1
ORA-06512: at "GFSAM.ORDCOM", line 229
ORA-06512: at "GFSAM.ORDEXCELSB", line 450
ORA-06512: at line 43
ORA-06520: PL/SQL: Error loading external library
ORA-06522: Unable to find library
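If the DLL actually exists but at a different location, one hedged fix is to recreate the library object so it points at a valid path on the database server; the library name and path below are purely illustrative, not taken from the failing package:

-- ORA-06522 means the path stored in the LIBRARY object does not resolve on
-- the server; recreating it with the correct location is the usual remedy.
CREATE OR REPLACE LIBRARY GFSAM.ORAWPCOM_LIB AS
  'C:\oracle\product\11.1.0.7.0\bin\orawpcom.dll';
/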
In my database the SMON trace file was huge and now we want to delete it. I also tried an oradebug operation, but with no luck. How can I delete the trace file which SMON generates?
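Rather than deleting the file while SMON still holds it open, a hedged approach is to find SMON's OS process id and ask it to close its current trace file with oradebug (SQL*Plus as SYSDBA; the oradebug commands are shown as comments because they are SQL*Plus directives, not SQL):

-- Find SMON's OS process id
SELECT p.spid
FROM   v$process p, v$bgprocess b
WHERE  b.paddr = p.addr
AND    b.name  = 'SMON';

-- Then, still as SYSDBA, substituting the spid returned above:
--   oradebug setospid <spid>
--   oradebug close_trace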
I'm facing a problem with archive log file size. Archive logs are generated at only 90m, 92m, or 94m (variable sizes of less than 100m), although I had set 100m for each of my redo log files. I'm providing my create database script for your reference. I want to know why the log switches before it reaches 100m. Is there any connection with the initial 10m for my .dbf files?
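A short diagnostic sketch, assuming only the standard views: compare the configured redo log size, any forced-switch setting, and the size of recent archives. An archive can legitimately be smaller than the online log when something (ARCHIVE_LAG_TARGET, a manual or RMAN-triggered switch, or unused blocks at switch time) closes the log early:

SHOW PARAMETER archive_lag_target

-- Configured online log size
SELECT group#, bytes/1024/1024 AS size_mb FROM v$log;

-- Size of the ten most recent archives
SELECT sequence#, archived_mb
FROM  (SELECT sequence#, blocks * block_size / 1024 / 1024 AS archived_mb
       FROM   v$archived_log
       ORDER  BY sequence# DESC)
WHERE  ROWNUM <= 10;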
We are facing a different issue in our database. Since yesterday night, the archive logs have been generated with 5-digit sequence names, but they are supposed to be 6-digit. Hence we are not able to apply the logs at the DR location.
On normal days the size of archives generated in a day is 14-15 GB. But since yesterday morning, almost 150 GB of archives have been generated and they are still being generated (200 MB every 1-2 minutes).
There was a sudden reboot of the server yesterday morning. At that time there was a heavy load of transactions on the database. Could the reason be that SMON is still doing recovery? (I am not sure about this.) Also, the undo tablespace has grown from 18 GB to 50 GB since yesterday (autoextend on).
Now we are running out of space on the archive file system (we also can't delete the archives until they are transferred to DR). The size of each redo log is 200 MB. This database supports around 2500 users.
Performance-wise I don't see any hit, and wait events are normal (only a few db file sequential read). How can I find the query/session that is causing this huge amount of archives?
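One hedged way to narrow this down is to rank the currently connected sessions by the 'redo size' statistic; it only covers sessions that are still logged in, so treat it as a starting point rather than a definitive answer:

-- Sessions generating the most redo since they logged in
SELECT s.sid, s.serial#, s.username, s.program, st.value AS redo_bytes
FROM   v$sesstat st, v$statname sn, v$session s
WHERE  st.statistic# = sn.statistic#
AND    sn.name       = 'redo size'
AND    s.sid         = st.sid
AND    st.value      > 0
ORDER  BY st.value DESC;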
I have an 11g database installed on a Windows 2003 server. When I created the database I could not assign more than 2GB for SGA_TARGET, whereas I have 16GB of RAM available on the server. I created the database with SGA_TARGET as 1.5GB and MEMORY_TARGET as 2GB, and it was created successfully. Later, when I tried to increase SGA_TARGET to 6GB and MEMORY_TARGET to 8GB, I could not start the database. I got the error below:
ORA-27102: out of memory
OSD-00022: additional error information
O/S-Error: (OS 8) Not enough storage is available to process this command
Quote: I added /PAE in the boot.ini
Added the AWE_WINDOW_SIZE registry key as 2000000000
Set use_indirect_data_buffers=true
Added db_block_buffers=131072 (i.e. 2GB with DB_BLOCK_SIZE=16KB)
java_pool_size=1000M
large_pool_size=1000M
shared_pool_size=2000M
I got the same error again.
I could not use SGA_TARGET. So is AMM not allowed with AWE?
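Before tuning AWE any further, it may be worth confirming whether the Oracle executable itself is 32-bit, since a 32-bit binary caps the directly addressable SGA at roughly 2-3 GB regardless of the RAM in the server; a minimal check, assuming nothing else about this system:

SELECT banner FROM v$version WHERE ROWNUM = 1;  -- a 64-bit install reports "64bit Production"
SHOW PARAMETER sga_max_size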
I am working on setting up a connection between a Windows 2008 server and a pair of Oracle 11g DBs in a RAC cluster. One database (let's say DatabaseA) is in one data center, and the other (DatabaseB) is a secondary, backup database. The RAC cluster is all set up and working fine. However, I need to set up the machine.config file on my Windows server to connect only to DatabaseA unless it fails, in which case we want it to connect to DatabaseB. I think we could do this if the host app server were Linux/Unix, but it is Windows, and I just don't have the background on which parameters to set up in the machine.config file. They are similar, but different, and we want a very specific behavior (use DatabaseA, unless it fails, then DatabaseB). The application is a .NET 4.0 app.
We have a .NET application on Windows Server 2008 32-bit using the stored procedures in an Oracle 10g environment. We are trying to deploy the .NET application onto a 64-bit (x64) Windows Server 2008.
We get an error trying to install the x64 version of the Oracle 10g client on Windows 2008. The error we get is:
Problem signature:
Problem Event Name: APPCRASH
Application Name: javaw.exe
Is the x64 version of the Oracle 10g client certified for Windows 2008? Has anyone successfully installed the 64-bit version of the Oracle client on Windows 2008?
We have a workaround in place with the 32-bit version of the client running with the 32-bit version of the .NET application on x64 Windows 2008. However, to maximize the infrastructure we need to use the 64-bit version, or we would have to turn to MS SQL Server.
Import: Release 11.2.0.1.0 - Production on Tue May 7 16:51:47 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Username: sys as sysdba Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
[oracle@oracledbserver product]$
This is Oracle 11g hosted on a Eucalyptus cloud instance.
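ORA-39088 is raised when DUMPFILE or LOGFILE includes a path; Data Pump expects a DIRECTORY object plus bare file names. A minimal sketch, where the directory name and OS path are illustrative only:

CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/app/oracle/dumps';
GRANT READ, WRITE ON DIRECTORY dpump_dir TO system;

-- Then pass only file names on the command line:
--   impdp system DIRECTORY=dpump_dir DUMPFILE=expdata.dmp LOGFILE=impdata.log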
When I try to install Oracle 10g (10.1.0.2) 64-bit on Windows Server 2003 Enterprise Edition 64-bit Service Pack 1, and also on Windows Server 2003 Standard Edition 64-bit on a PC, I get the error below.
The image file E: is valid, but it is for a machine type other than the current machine.
I have tried a lot to load data into a table from Excel (.csv) using SQL*Loader, but SQL*Loader doesn't seem to accept the control file created with Notepad (.ctl). Though I gave the file a .ctl extension, it appears as a .txt file. Is there an alternate way to create it?
I am receiving errors when trying to load the control file. The errors are as follows:
SQL*Loader-500: Unable to open file (homework.ctl)
SQL*Loader-553: file not found
SQL*Loader-559: System error: The system cannot find the file specified.
My control file is located directly on the C drive (C:\homework.ctl). The control file contains the following:
LOAD DATA
INFILE 'c:\country.dat'
APPEND
INTO TABLE homework
WHEN (month = 'April')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(country, month, day)
The command I am entering is:
sqlldr system/password control=homework.ctl
I've tried c:\homework.ctl, 'c:\homework.ctl', and placing the file in the BIN folder of Oracle.
I need to copy a .CSV file from a Windows server shared path (\\hostname\output) to another server, which I believe is on Unix. The other server name is abc.hcl.com. On this server I need to put the file in the root directory. I will have to use SFTP and not FTP.
We have the task of migrating a legacy system to a spanking new version. The legacy system is currently running Oracle RDBMS 7.3.4. In order to verify and test the migration we are building a migration VM. This requires us to install Oracle 7.3.4. We don't have the media anymore, Metalink does not seem to have it either, and there does not seem to be an Oracle download archive in existence.
Where can we obtain or download the media from?
I am trying to move my archives from Linux to Windows as the log switches occur. For this I mounted my Windows directory in Linux, but the files aren't moving. Dest 1 is on Linux and dest 2 is for Windows.
In the alert log, "permission denied" appears. The Windows directory mounted in Linux is owned by root, and the ownership is not changing to oracle.
Upgrading from 10.1.0.2 to 10.1.0.5. Enterprise Manager requires the 'newest' version of the Oracle JDBC driver. I downloaded what I believe to be the correct file (classes12.jar). I'm unclear what to do with it; my reading has pointed me in the following direction:
1) copy to c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin
2) extract
Here is the problem. I tried:
1) just clicking on it (nothing)
2) c:\program files\java\jre1.6.0_03\bin\javaw -jar classes12.jar
Error: Failed to load Main-Class manifest attribute from c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin\classes12.jar
Is my location correct? I've been hunting everywhere and making no progress.
I have an Oracle DB version 11.2 running on Oracle Enterprise Linux 5.9. How do I transfer data from the Oracle DB to a flat file on a Windows server? What I have done so far is use utl_file to create a CSV file on the Oracle server, and I am now attempting to transfer this file.
I was going to use scp or rcp but am unable to get this to work (I was looking at FileZilla). Another option I can use is FTP, as I have a Unix script which I can run to do this. All of this is done through an Oracle package which is run hourly through dbms_scheduler. I have been using sp_host_command to run Unix commands directly from PL/SQL, so I can use this to run a Unix script as a last resort if I can't find an easier way to automate this.
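One hedged alternative to wrapping Unix commands in sp_host_command is a DBMS_SCHEDULER external job that runs the transfer script directly on the OS; the job name, script path and schedule below are purely illustrative:

BEGIN
  DBMS_SCHEDULER.create_job(
    job_name        => 'TRANSFER_CSV_JOB',                   -- hypothetical name
    job_type        => 'EXECUTABLE',
    job_action      => '/home/oracle/scripts/push_csv.sh',   -- hypothetical script doing the scp/sftp
    repeat_interval => 'FREQ=HOURLY',
    enabled         => TRUE);
END;
/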
I'm getting an error when trying to use the new Data Pump Export/Import utility.
I am able to create a directory using SQL*Plus, and I get the "Directory created" message, but no directory actually gets created on the server.
SQL> CREATE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
Directory created. But I don't see the directory created on the server.
Then on the server:
C:\Documents and Settings\Administrator>expdp ******/****** FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log
Export: Release 10.2.0.1.0 - Production on Wednesday, 01 November, 2006 1:51:55
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
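For what it's worth, CREATE DIRECTORY only records metadata in the data dictionary; it never creates the folder on disk, and expdp fails with ORA-39070/ORA-29283 when the OS path does not exist or is not writable by the account running the Oracle service. A minimal sketch, assuming the folder has first been created manually on the server and that the export runs as SYSTEM:

-- The OS folder must already exist and be writable by the Oracle service account.
CREATE OR REPLACE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
GRANT READ, WRITE ON DIRECTORY datapump TO system;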
I am trying to generate a dynamic control file, as the files I want to upload come from different sources and their names change constantly, but follow a fixed pattern and naming convention.
I am able to generate the dynamic control file through SQL, but while calling it from a batch file, I am unable to send the file name as a parameter.
All the examples I have found are for Unix; how do I do it with a batch file on Windows?
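A hedged sketch of the SQL*Plus side only: the generating script can take the file name as positional parameter &1, and the Windows batch file simply passes its own %1 through. The table and column names below are placeholders:

-- gen_ctl.sql (illustrative): spools <name>.ctl for the file name passed as &1
SET VERIFY OFF
SET HEADING OFF
SET FEEDBACK OFF
SET PAGESIZE 0
SET TRIMSPOOL ON
SPOOL &1..ctl
SELECT 'LOAD DATA'                        FROM dual UNION ALL
SELECT 'INFILE ''' || '&1' || '.csv'''    FROM dual UNION ALL
SELECT 'APPEND INTO TABLE my_stage_table' FROM dual UNION ALL
SELECT 'FIELDS TERMINATED BY '','''       FROM dual UNION ALL
SELECT '(col1, col2, col3)'               FROM dual;
SPOOL OFF
EXIT

-- Called from the batch file, for example:
--   sqlplus -s user/password @gen_ctl.sql %1
--   sqlldr user/password control=%1.ctl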
### Changes made ### One week ago we changed the tablespace segment space management from MANUAL to AUTO using the following method:
1. Create the INVD2, INVX2 and LOBD tablespaces.
2. Move tables from INVD to INVD2.
3. Rebuild indexes from INVX to INVX2.
4. Move LOBSEGMENTs from INVD to the LOBD tablespace.
5. After confirming no segments exist in the old tablespaces, offline and drop INVD and INVX.
6. Change the default tablespace for the INV user to INVD2.
7. RENAME TABLESPACE INVD2 to INVD, and INVX2 to INVX.
8. Change the default tablespace for the INV user back.
9. Run gather schema stats for INV using the Unix scheduler, which usually works. However, it ended with errors ORA-03113 and ORA-03114.
10. Manually executing the same statement the following day, the procedure completed successfully.
One week later, the inventory forms raised error FRM-40735 in all forms. We checked that the gather schema stats job had run in the morning, before the user feedback.
After referring to notes on Metalink, I understand this is a bug where a RENAME of the tablespace back to the previous name fails because the deleted entry still exists in sys.ts$?
No segments exist in the deleted tablespace, and no user's default tablespace is assigned to the deleted tablespace.
My question: How can we delete the stale entry from sys.ts$? And should we rename the tablespace from INVD to INVD3 (or can we reuse INVD2) to avoid any unforeseen error again?
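A hedged note rather than a fix: sys.ts$ is a dictionary base table and should not be edited manually, so renaming to a name that has never been used (INVD3, as suggested in the question) is the safer way to side-step the stale entry. The query below only lists the leftover rows so the conflicting name can be confirmed:

-- Rows for dropped tablespaces typically remain in ts$ (ONLINE$ = 3).
SELECT ts#, name, online$
FROM   sys.ts$
ORDER  BY ts#;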
Is there any way I can find out what caused the database to crash, for example a history of commands executed within the database? I lost my bdump directory before the scheduled backup ran, and the only logs available are from after I re-created the directory.
SQL> startup
ORA-00444: background process "PMON" failed while starting
ORA-07446: sdnfy: bad value '' for parameter .
SQL>
I am not able to delete a one-month-old archive log file on the production database. It has already been applied on the standby database. While deleting the files, it throws an error that another program is using it.
1.2.0.2 on RHL. 3 log groups with 1 member each; db_recovery_file_dest is /oracle/oraarch. For the purpose of increasing the log file size, if I use ALTER DATABASE ADD LOGFILE GROUP 1 SIZE 300M; it creates the log group with 2 members: one at the /oracle/oraarch location and the other at /oracle/oradata (db_create_file_dest).
We are using Oracle Managed Files. I want only 1 member at /oracle/oraarch (to keep the previous setting intact, just increasing the size from 100M to 300M). If I manually give the path where the logfile member should be created, I get this error:

ALTER DATABASE ADD LOGFILE GROUP 1 '/oracle/oraarch/DB/onlinelog/' SIZE 300M;
ALTER DATABASE ADD LOGFILE GROUP 1 '/oracle/oraarch/DB/onlinelog/' SIZE 300M
*
ERROR at line 1:
ORA-00301: error in adding log file '/oracle/oraarch/DB/onlinelog/' - file cannot be created
ORA-27038: created file already exists
Additional information: 1
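ORA-27038 here appears to be because only a directory, not a file name, was supplied, and that path already exists on disk. A hedged sketch under OMF: point DB_CREATE_ONLINE_LOG_DEST_1 at the desired location and let Oracle name the single member itself (group 4 is used below only because groups 1-3 already exist and an existing group cannot be re-added):

-- With DB_CREATE_ONLINE_LOG_DEST_n set, OMF creates one member per listed
-- destination instead of one in db_create_file_dest plus one in the FRA.
ALTER SYSTEM SET db_create_online_log_dest_1 = '/oracle/oraarch';
ALTER DATABASE ADD LOGFILE GROUP 4 SIZE 300M;
-- Then switch logs and drop/re-add the remaining 100M groups one at a time.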