I am using Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production installed on IBM AIX 6.1.0.0.
I have run into a peculiar situation where the alert log is not being updated with the proper timestamp. For example, I see an entry right now in my alert log of this fashion:
Tue Oct 23 08:38:46 2012
ARC2 started with pid=37, OS id=50
Tue Oct 23 08:38:46 2012
ARC3 started with pid=38, OS id=178
Tue Oct 23 08:38:46 2012
[Code]...
But the current system time is
Tue Oct 23 18:40:20 GMT+05:30 2012
My memory_target parameter value is 2GB
SQL> SELECT TZ_OFFSET(SESSIONTIMEZONE), TZ_OFFSET(DBTIMEZONE) FROM DUAL;
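For reference, this is the fuller check I have been running to compare the server clock with the session and database time zone offsets (nothing here is specific to my schema); as far as I understand, the alert log timestamps come from the OS time zone (TZ) that was in effect when the instance was started, so a stale TZ at startup could explain the gap:
SELECT SYSDATE                    AS server_time,
       SYSTIMESTAMP               AS server_time_with_tz,
       TZ_OFFSET(SESSIONTIMEZONE) AS session_tz_offset,
       TZ_OFFSET(DBTIMEZONE)      AS db_tz_offset
  FROM dual;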
I would like to exit from a cursor loop based on certain conditional checks. I am checking a lot of different parameters, and if they fail I want to bypass that record and fetch the next one in the cursor. I tried just putting an 'Exit' statement in the logic, but it fails. An example of my code is below:
For Row1 in cursor1 Loop
  If amount < 0 then
    balance := 0;
  Else
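To show what I am trying to achieve (only a sketch with made-up table and column names, not my real code), I believe 11g has a CONTINUE statement that skips to the next record instead of leaving the loop the way EXIT does:
DECLARE
  CURSOR cursor1 IS
    SELECT amount FROM transactions;    -- hypothetical table for illustration
  balance NUMBER := 0;
BEGIN
  FOR row1 IN cursor1 LOOP
    IF row1.amount < 0 THEN
      CONTINUE;                         -- skip this record and fetch the next one
    END IF;
    balance := balance + row1.amount;   -- normal processing for records that pass the checks
  END LOOP;
END;
/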
The thing is, I am not getting any output when I run your procedure. Do I need to set any NLS language? I even tried to run the query below from the other thread,
column french format a20
with t as (
  select 'fish' txt from dual union all
  select 'dog' txt from dual union all
[code]...
My output is blank for the French values. I am sure that I am missing something, but I don't know what. My character set is currently WE8MSWIN1252.
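In case it is relevant, this is how I have been checking the character set and language settings on my side (standard dictionary view, nothing specific to my schema):
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET', 'NLS_LANGUAGE');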
I have a block based on a view. The view is a join on 2 tables, the first table always brings back 4 records for each parameter passed to it in the where clause. The second table is outer joined to the first table and may contain no matched records or some matched records. In some cases there will be a 1:1 match to the first table.
The problem is how to create the table handler procedure correctly. I need to update 2 tables in the table handler procedure.
The block is only enabled for update (to preserve the 4 rows); however, some values on the block correspond to values from the second table. When you update a row in the block, how do you know whether you are actually inserting a row into the second table or updating an already existing record in the second table?
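To make the question concrete, the shape of handler I have in mind is below (table, column and parameter names are invented for illustration); it probes the second table to decide between INSERT and UPDATE, which is exactly the part I am unsure is the right approach:
PROCEDURE handle_block_update(p_key  IN NUMBER,
                              p_val1 IN VARCHAR2,   -- column owned by the first table
                              p_val2 IN VARCHAR2)   -- column owned by the second table
IS
  l_exists NUMBER;
BEGIN
  -- the first table always has its 4 rows, so this is a plain update
  UPDATE table1 SET col1 = p_val1 WHERE key_col = p_key;
  -- the second table may or may not have a matching row because of the outer join
  SELECT COUNT(*) INTO l_exists FROM table2 WHERE key_col = p_key;
  IF l_exists = 0 THEN
    INSERT INTO table2 (key_col, col2) VALUES (p_key, p_val2);
  ELSE
    UPDATE table2 SET col2 = p_val2 WHERE key_col = p_key;
  END IF;
END handle_block_update;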
I have a page process that calls back-end PL/SQL. I have a page item, P35_PROCESSING_MSG, that is populated by said PL/SQL procedure. As you know, the Process allows me to specify a message for Success and another for Failure. My desire is that, if there is a back-end failure, the error will display in the dialog area using the built-in "Process Error Message" functionality. Instead, after execution, the page item still has its default value and the following ugly error displays in the tabular report region:
report error: ORA-20001: Error fetching column value: ORA-01403: no data found ORA-06510: PL/SQL: unhandled user-defined exception
I just don't get why it displays here instead of in the usual failure area. Moreover, I don't understand why the error is an "unhandled user-defined exception" when, as you will see, I have handled it.
I have a page process to perform custom MRD (multi-row delete) for a tabular form. (I don't think this is really germane to the more generic issue, but I bring it up as an explanation for the following code sample.) I am deliberately causing a DIVIDE BY ZERO error to test the error handling from the back end.
PROCEDURE PROCESS_MARGIN_CALL_DELETES( as_StatusMsg OUT VARCHAR2 )
IS
  lb_InnerErrorOccured BOOLEAN := FALSE;
  ln_DeleteTargetCnt   NUMBER  := 0;
  ln_DeleteTargetRow   NUMBER  := 0;
BEGIN
[code].......
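The exception handling inside that procedure looks roughly like the fragment below (simplified, and the message text is made up); the intent is to catch the error, put a message into the OUT parameter, and re-raise it with RAISE_APPLICATION_ERROR so that APEX should treat the process as failed:
BEGIN
  ln_DeleteTargetRow := ln_DeleteTargetCnt / 0;   -- deliberate divide-by-zero for testing
EXCEPTION
  WHEN ZERO_DIVIDE THEN
    as_StatusMsg := 'Delete failed: ' || SQLERRM;
    -- re-raise as an application error so the page process reports a failure
    RAISE_APPLICATION_ERROR(-20001, as_StatusMsg);
END;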
I am using Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production.
I have the data in the following table:
drop table stud_fact;
create table stud_fact(stud_NM, LVL_CD, ST_DT_DIM_KEY, OVRNK) as
select 'ABG Sundal', 'H', '20110630', '175' from dual union all
select
I have a problem with my DB agent. It doesn't upload data correctly, so I can't see updated information on the first page of EM (it doesn't show CPU usage, Active Sessions, Listener properties, ...). When I checked the logs, I found that emagent.trc shows this error: "exceed max amount of upload data: 77 files, 52M, disabling collections warn : collector : disable collector"
I have used a HUAWEI dongle directly connected to my PC. First, the operating system and Oracle start up, then I connect to the internet using the Huawei Mobile Partner utility. Any application can connect to Oracle and use it correctly. All OK. Now I have added a 3G router which is connected to the PC by ethernet cable and has the Huawei dongle connected to a USB port on the router. All works correctly if I follow the sequence:
1) Start the PC with the router switched off, and wait until Oracle has fully started.
2) Power the router up and wait 2 minutes until Windows says the 100 Mb ethernet link is open. Oracle.exe in Task Manager is about 210 Mb, and sometimes more, depending on activity.
However, if I switch on the router first and then start the PC, Oracle does not start correctly. It rejects connection requests from then on, indicating the Network Adapter could not establish the connection: Code 17002. In Task Manager, Oracle.exe occupies only 145 Mb. What can I do to get it to start correctly? I can dig into Listener.ORA, TNSNAMES.ORA and HOSTS without any problems, and have often modified them manually (not the case here), but logs and traces escape me!
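In case it matters, the change I am considering is binding the listener and the local alias to the loopback address so that Oracle no longer depends on the address the router hands out (just a sketch, assuming the default ORCL service name and port 1521, which may not match my setup):
# listener.ora
LISTENER =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 127.0.0.1)(PORT = 1521))
  )
# tnsnames.ora
ORCL =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 127.0.0.1)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = ORCL))
  )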
I am running my Oracle APEX application on 4.2.2 on apex.oracle.com. Exporting the application seems to work, but when I import the application as a copy, the regions within a particular page are not working correctly. I want to be able to export the application, import a new copy as a DEV/TEST application, and have it function the same way. In the original application the region displays to the right when the edit button is hovered over, but in the new one it goes to the left and does not format correctly. What I would like is simply for the copied application to behave the same way the original does. I selected to install all dependent objects and don't know what steps I have missed that leave the regions not working like they do in the original application. Could this be related to CSS not exporting/importing correctly?
I have an application item that receives a web service result. This result is like 'MARIA','JOSE','JESUS'. I'm using this string in the parameters of the interactive report, but it is not recognized. I'm showing the content of the application item in a PL/SQL region and the content is 'MARIA','JOSE','JESUS'; when I include this application item in the query, the IR just shows me 'MARIA','JOSE','JESUS'.
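For illustration only (my real report query and item names are different), the pattern I am trying to make work is a comma-separated list in a bind variable used as an IN-list; the workaround I have seen suggested is an INSTR comparison after stripping the quotes, something like:
-- :AI_NAMES is assumed to hold the web service result, e.g. 'MARIA','JOSE','JESUS'
SELECT emp_name, emp_dept
  FROM employees                                            -- hypothetical table
 WHERE INSTR(',' || REPLACE(:AI_NAMES, '''') || ',',
             ',' || emp_name || ',') > 0;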
We have Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit RAC on Linux, and the syslog server is also on Linux.
The syslog server is a centralized server used to monitor all system and database logs.
Is there any way to create multiple alert log files, so that we can keep one alert log file in the default location and another on the centralized syslog server for monitoring purposes?
I am working on a table that has duplicate ids with different data sets. The problem is that this table does not have any timestamp column, and it was created long ago. Now I need to remove the records with the duplicate ids, keeping the oldest one. How can I retrieve the timestamps of the old data?
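I also looked at ORA_ROWSCN with SCN_TO_TIMESTAMP, but as far as I can tell that only maps fairly recent SCNs, so it will not date rows inserted long ago. What I have been experimenting with instead (just a sketch; my_table and id_col are stand-ins, and I realise ROWID is not guaranteed to reflect insertion order) is keeping one row per id and deleting the rest:
DELETE FROM my_table t
 WHERE t.ROWID NOT IN (SELECT MIN(t2.ROWID)
                         FROM my_table t2
                        GROUP BY t2.id_col);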
How can I find out which alert log file the database is currently using? Is there a command at the database level to find the current alert log file the database is writing to?
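The closest I have found so far (and I am not sure it is the intended way) is querying the diagnostic views/parameters for the directory the alert log is written to:
-- 11g and later: the text alert log lives under the 'Diag Trace' path
SELECT name, value FROM v$diag_info WHERE name IN ('Diag Trace', 'Diag Alert');
-- 10g and earlier: alert_<SID>.log is written to the background dump destination
SELECT value FROM v$parameter WHERE name = 'background_dump_dest';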
I am using the .NET framework to show a pop-up on the desktop, like a reminder. Can I do that using any Oracle tools? I am using Oracle 10g along with Forms 6i, and I am using one query for this.
From ETL to Oracle, I have stored the timestamp in a varchar2 column as '30-MAY-11 06.30.00.000000 PM'. Now I need to convert this varchar2 (in timestamp format) to a date. I used:
select TRUNC('30-MAY-11 06.30.00.000000 PM','dd/mm/yyyy hh24:mi') from dual
But it doesn't work. How do I get the date format right?
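The direction I think I need (please correct me if this is wrong) is TO_TIMESTAMP with a format mask that matches the stored string, and then CAST to DATE, along these lines:
SELECT CAST(TO_TIMESTAMP('30-MAY-11 06.30.00.000000 PM',
                         'DD-MON-RR HH.MI.SS.FF AM') AS DATE) AS converted_dt
  FROM dual;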
How can I create a trigger or procedure to send an alert mail from the database for all ORA- errors? If any ORA- error appears in ALERT_SID.log, the trigger should then send the mail to my mail id.
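The mail-sending half is the part I can picture; assuming UTL_MAIL is installed and smtp_out_server is configured (and with the addresses below made up), a scheduled job could send something like this once an ORA- line is found. Reading ALERT_SID.log itself (an external table over the file, or X$DBGALERTEXT on 11g) is the piece I am unsure how to do cleanly:
BEGIN
  UTL_MAIL.SEND(
    sender     => 'oracle@mycompany.example',   -- made-up addresses
    recipients => 'dba@mycompany.example',
    subject    => 'ORA- error found in alert log',
    message    => 'ORA- error reported at ' || TO_CHAR(SYSDATE, 'DD-MON-YYYY HH24:MI:SS'));
END;
/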
I am running Oracle XE. This database collects information about server statistics in my organization (memory usage, threads running, etc.). Now I need to send alert/notification emails if certain thresholds are crossed by a server (e.g., if the number of threads increases beyond 100, send an alert email).
One way to do this is to write stored procedures and schedule them to run every hour and send alert email if conditions are met.
But what I want to do is use some sort of rule engine where I can specify the rules/conditions which have to be true for alerts to be sent. The reason I want this approach is that every time there is a new rule, I do not have to write a new stored procedure; I can just specify the condition in the rule engine. So are there any rule engines out there that I can use (preferably open source and free to use)?
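To show the shape of what I mean by keeping rules as data (table, column and procedure names below are purely illustrative), I could imagine a rules table whose conditions are evaluated dynamically, with a mail sent whenever one is breached:
-- each rule stores a SQL condition that returns rows when the threshold is breached
CREATE TABLE alert_rules (
  rule_id   NUMBER PRIMARY KEY,
  rule_name VARCHAR2(100),
  rule_sql  VARCHAR2(4000),   -- e.g. 'SELECT 1 FROM server_stats WHERE thread_count > 100'
  recipient VARCHAR2(200)
);
DECLARE
  l_hits NUMBER;
BEGIN
  FOR r IN (SELECT * FROM alert_rules) LOOP
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM (' || r.rule_sql || ')' INTO l_hits;
    IF l_hits > 0 THEN
      -- send_alert_mail is a stand-in for whatever mail routine is available (UTL_MAIL, etc.)
      send_alert_mail(r.recipient, 'Rule breached: ' || r.rule_name);
    END IF;
  END LOOP;
END;
/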