I need to check whether the current date is greater than the 15th of the current month. If it is greater than the 15th of the current month I need to do one action; otherwise, if it is less than the 15th of the current month, I need to do another operation.
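A minimal PL/SQL sketch of that branch, where the two DBMS_OUTPUT calls stand in for the real actions:

DECLARE
  v_day NUMBER := EXTRACT(DAY FROM SYSDATE);  -- day of month of the current date
BEGIN
  IF v_day > 15 THEN
    DBMS_OUTPUT.PUT_LINE('after the 15th: do the first action');    -- placeholder action
  ELSE
    DBMS_OUTPUT.PUT_LINE('on or before the 15th: do the other');    -- placeholder action
  END IF;
END;
/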
How do I select the transactions out of the database that occurred within 70 seconds of each other? The toll_date field is a TIMESTAMP field.
The problem is, I seem to only get transactions that occurred within 70 minutes of each other. On the timestamp field I break the math down into the seconds in a day and add 70; I then subtract that value from the timestamp and add that value to it, so I should get anything between those values, right?
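In day-based date arithmetic one second is 1/86400 of a day, so a 70-second window is 70/86400; if the fraction was built from the 1440 minutes in a day instead (70/1440), that would explain getting a 70-minute window. Interval arithmetic avoids the fraction entirely; a hedged sketch with a hypothetical tolls table self-joined on toll_date:

SELECT t1.toll_date, t2.toll_date
FROM   tolls t1
JOIN   tolls t2
  ON   t2.toll_date BETWEEN t1.toll_date - INTERVAL '70' SECOND
                        AND t1.toll_date + INTERVAL '70' SECOND
WHERE  t1.rowid <> t2.rowid;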
I ran the following query and somehow I feel the results are wrong.
SQL> select to_char(starttime,'dd-mm-YYYY hh24:mi:ss') from report where dateofmonth between to_timestamp_tz('22-Apr-2013 12:00:00','dd-mm-YYYY hh24:mi:ss') and to_timestamp_tz ('23-Apr-2013 14:00:00','dd-mm-YYYY hh24:mi:ss');
SQL> select to_timestamp_tz(starttime,'dd-mm-YYYY hh24:mi:ss') from report where dateofmonth between to_timestamp_tz('22-Apr-2013 12:00:00','dd-mm-YYYY hh24:mi:ss') and to_timestamp_tz ('23-Apr-2013 14:00:00','dd-mm-YYYY hh24:mi:ss');
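One thing worth noting: the literals spell the month ('22-Apr-2013') while the mask says 'dd-mm-YYYY'. Oracle usually tolerates the MM/MON interchange, but a mask that matches the literal exactly removes the doubt. A hedged rewrite with numeric months:

SELECT TO_CHAR(starttime, 'dd-mm-yyyy hh24:mi:ss')
FROM   report
WHERE  dateofmonth BETWEEN TO_TIMESTAMP_TZ('22-04-2013 12:00:00', 'dd-mm-yyyy hh24:mi:ss')
                       AND TO_TIMESTAMP_TZ('23-04-2013 14:00:00', 'dd-mm-yyyy hh24:mi:ss');

Also, the second query applies TO_TIMESTAMP_TZ to the starttime column itself; that conversion only makes sense if starttime is stored as a string, otherwise TO_CHAR (as in the first query) is the right direction.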
View the select statement below. Why is it adding extra zeros?

select to_timestamp('2001-05-22 12:00:18.600','YYYY-MM-DD HH:MI:SS.ff3AM') from dual

Output: 5/22/2001 12:00:18.600000000 PM

Why is it adding extra zeros? My output should be "5/22/2001 12:00:18.600 PM".
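TO_TIMESTAMP returns a TIMESTAMP with the maximum fractional precision of 9 digits, and the client displays all of them; the FF3 in the mask only controls parsing, not display. Wrapping the result in TO_CHAR pins the output format:

SELECT TO_CHAR(
         TO_TIMESTAMP('2001-05-22 12:00:18.600', 'YYYY-MM-DD HH:MI:SS.ff3AM'),
         'MM/DD/YYYY HH:MI:SS.FF3 AM') AS ts
FROM   dual;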
I have a table with 3 columns, id, name and time, where time is of datatype TIMESTAMP and stores the time when the row was inserted. I need a query which accepts 2 parameters as input, e.g. Start_Time and End_Time, and deletes all the rows between those two timestamps.
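A minimal sketch as a stored procedure; the table name is illustrative, the column names follow the post:

CREATE OR REPLACE PROCEDURE delete_between (
  p_start_time IN TIMESTAMP,
  p_end_time   IN TIMESTAMP
) AS
BEGIN
  DELETE FROM my_table                              -- hypothetical table name
  WHERE  time BETWEEN p_start_time AND p_end_time;  -- "time" as named in the post
END;
/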
I have to create the following table. The fields Trend_Date, Price and Trend are already given. I have to calculate the field Permanently and insert its value into this permanent table.
Fields:
The field Price holds the value of a product during the trade. The field Trend_Date marks the moment of the trade. The field Trend describes the future behavior of the price: the price at the present moment is compared to the following price (possible values: 'UP', 'DOWN', 'STABLE'). The field Permanently holds the time (in seconds) for which the trend of the row (depending on the price) remains true.
For example:
Row 1: The trend in row 1 is 'UP' and it has a price of 11. This remains true until row 3 (the price stays greater than or equal to 11). In this case, the difference between row 1 and row 3 is 9801 (rounded) seconds.
Row 2: The trend in row 2 is 'DOWN' and it has a price of 12. This remains true until the end (the price is never greater than 12). In this case, the difference between row 2 and row 11 is 97346 (rounded) seconds. To arrive at the 97346 seconds, the calculation has to account for the two days that lie between row 2 and row 11: there is no trade between 18:00 and 07:00, which is 13 hours per day, or 46800 seconds each, so 2*46800 = 93600 seconds. -> 190945-93600 = 97346 s
Row 6: The trend in row 6 is 'UP' and it has a price of 5. This remains true until the end (the price is never smaller than 5). In this case, the difference between row 6 and row 11 is 65729 (rounded) seconds. To arrive at the 65729 seconds, the calculation has to account for the one day that lies between row 6 and row 11: no trade between 18:00 and 07:00 means 13 hours, so 1*46800 = 46800 seconds. -> 112528-46800 = 65729 s
Row 9: The trend in row 9 is 'STABLE' and it has a price of 8. This remains true until row 10 (the price stays equal to 8). In this case, the difference between row 9 and row 10 is 14418 (rounded) seconds.
Row 11: Is empty because there are no later values to compare.
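A hedged sketch of the first half of the calculation: for each row, find the timestamp of the first later row that breaks the trend. The table name trends is illustrative; the columns follow the post:

SELECT t.trend_date, t.price, t.trend,
       (SELECT MIN(x.trend_date)
          FROM trends x
         WHERE x.trend_date > t.trend_date
           AND (   (t.trend = 'UP'     AND x.price <  t.price)
                OR (t.trend = 'DOWN'   AND x.price >  t.price)
                OR (t.trend = 'STABLE' AND x.price <> t.price))
       ) AS break_date
FROM   trends t;

Extracting the seconds of break_date - trend_date (an INTERVAL DAY TO SECOND when both columns are timestamps) and subtracting 46800 for each night between the two rows would then give the Permanently value.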
I am creating a dynamic table every month to maintain that month's data separately. When records are inserted into the main table, a trigger automatically inserts the records into the dynamic table. But only the date (without the time) is inserted into the dynamic table from staging, so by default 00:00:00 is appended to the date instead of the actual time. I tried

select to_date(to_char(:new.ACTN_DATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS') INTO v_temp_actn_date from dual;

but I am still getting only the date. In both my table and the dynamic table the datatype of the date column is DATE.
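A DATE column always stores a time of day, so a stored 00:00:00 means the value was truncated before the insert, or the session's NLS_DATE_FORMAT is simply hiding the time on display. Two hedged checks, with an illustrative table name:

-- in the trigger body, no TO_CHAR/TO_DATE round trip is needed:
v_temp_actn_date := :new.ACTN_DATE;

-- query with an explicit mask to see what is really stored:
SELECT TO_CHAR(actn_date, 'YYYY-MM-DD HH24:MI:SS') FROM dynamic_month_table;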
I'm trying to generate a count of the number of entries in a table for each day. The problem is that the date column is of datatype TIMESTAMP and looks like this: "2006-12-30 18:42:03.0".
How would I generate a report of the number of entries in the table for each date (I'm not interested in the "time", only the "date", i.e. YYYY-MM-DD)?
SELECT COUNT(*) FROM my_table_name WHERE my_date_column LIKE '2006-12-30%' GO
It returns zero rows (and I know there are rows in the table). I'm using Oracle 10g.
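LIKE on a date column forces an implicit TO_CHAR using the session's NLS_DATE_FORMAT, which rarely matches 'YYYY-MM-DD'; hence the zero rows (and GO is a SQL Server batch separator, not Oracle syntax). Grouping on the truncated column is the usual approach:

SELECT TRUNC(my_date_column) AS entry_date, COUNT(*) AS entries
FROM   my_table_name
GROUP  BY TRUNC(my_date_column)
ORDER  BY entry_date;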
How do I calculate the days between two dates of a single timestamp field, with this query:

Select * from m_activity_transaction
where actn_opp_id in (Select actn_opp_id
                      from m_activity_transaction
                      where ACTN_ACTV_ID = 218
                      Group by actn_opp_id
[code]...
and I need to calculate the number of days between two dates like 27-JAN-12 11.06.20.000000 AM and 08-FEB-12 05.32.54.000000 PM, where actn_id is unique and ACTN_OPP_ID is not unique.
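A hedged sketch: since ACTN_OPP_ID repeats, the span per opportunity can come from MIN/MAX of the timestamp column (the name actn_date is a guess here). Casting to DATE makes plain subtraction return days:

SELECT actn_opp_id,
       CAST(MAX(actn_date) AS DATE) - CAST(MIN(actn_date) AS DATE) AS days_between
FROM   m_activity_transaction
GROUP  BY actn_opp_id;

For the two example values, 08-FEB-12 05.32.54 PM minus 27-JAN-12 11.06.20 AM comes to roughly 12.27 days.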
Is there a way to query an Oracle database in an automated fashion on a timestamp field, based on the current timestamp, like: 04/29/08 00:00:00 - 72 hours?
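A minimal sketch with illustrative table and column names; subtracting an interval from SYSTIMESTAMP keeps the whole comparison in timestamp arithmetic:

SELECT *
FROM   my_table
WHERE  my_ts_column >= SYSTIMESTAMP - INTERVAL '72' HOUR;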
(Both these fields, a_std and a_time, come as VARCHAR from the parent table in a cursor; basically they are the time period and the actual arrival time, respectively.
I was juggling with the attempt to convert the VARCHAR to a timestamp or date, but got caught up in the round up/round down.)
Formula ->
A = round down [A_TIME - A_STD]
B = round up [A_TIME - 10 minutes + A_STD]
where
A_TIME VARCHAR2(8) N, time (format "HH:MM AM/PM"), e.g. "3:50 PM"
A_STD VARCHAR2(5) N, standard time (format "HH:MM"), e.g. "1:00"
Allowed values for A and B after rounding up/down are multiples of 10 minutes (11:00, 11:10, 11:20, etc.).
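A hedged sketch under those definitions: parse A_TIME as a DATE, turn A_STD into a minute interval, then snap to the 10-minute grid with FLOOR for A and CEIL for B. The literals stand in for the cursor variables:

WITH parsed AS (
  SELECT TO_DATE('3:50 PM', 'HH:MI AM') AS a_time,
         NUMTODSINTERVAL(
             TO_NUMBER(SUBSTR('1:00', 1, INSTR('1:00', ':') - 1)) * 60
           + TO_NUMBER(SUBSTR('1:00', INSTR('1:00', ':') + 1)), 'MINUTE') AS a_std
  FROM   dual
), diffs AS (
  SELECT a_time - a_std           AS raw_a,   -- A before rounding
         a_time - 10/1440 + a_std AS raw_b    -- B before rounding (10/1440 day = 10 min)
  FROM   parsed
)
SELECT TO_CHAR(TRUNC(raw_a, 'HH')
         + FLOOR(TO_NUMBER(TO_CHAR(raw_a, 'MI')) / 10) * 10 / 1440, 'HH:MI AM') AS a_rounded,
       TO_CHAR(TRUNC(raw_b, 'HH')
         + CEIL(TO_NUMBER(TO_CHAR(raw_b, 'MI')) / 10) * 10 / 1440, 'HH:MI AM') AS b_rounded
FROM   diffs;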
I have 3 tables: user_login_event, person and resource_viewed_event. What I want is a report that shows, for each month, the users who logged in to our application, how many records were created in table person, and how many resource-view events were logged in resource_viewed_event.
Let's only worry about the timestamp fields in these tables for now, as I want to use them to join the tables together, or at least to build correlated subqueries along the months. I have tried several options, none of which leads to the desired result:
Left outer join. Works, but it's incredibly slow:
SELECT distinct to_char(ule.TIMESTAMP,'YYYY-MM') as "YYYY-MM",
       count(distinct ule.id) as "User Logins",
       count(distinct ule.user_id) as "Users logged on",
       count(distinct p2.id) as "Existing Users",
       count(distinct p1.id) as "New Users",
       count(distinct r1.id) as "Resources created"
[code]....
Tried the same with left outer joins of temporary tables created through select statements:
select distinct ule.month as "Month",
       count(distinct p1.user_id) as "Users created",
       count(ule.id) as "Logins",
       count(distinct ule.user_id) as "Users logged in",
       count(rv.id) as "Resource Views",
       count(distinct rv.resource_id) as "Resources Viewed"
[code]....
Another approach is to create my own temporary tables using select statements and create fixed month values which I can use to directly link the sets together:
select distinct ule.loginday as "Month",
       count(distinct ule.id) as "Logins",
       count(distinct ule.user_id) as "Users logged in",
       count(distinct p1.user_id) as "Users created",
       count(distinct p2.user_id) as "Existing users1"
[code]....
Performance is OK with 2 tables, but the example above takes forever to execute.
I also tried an approach with UNION, but this creates new rows for each table:
SELECT DISTINCT p1.MONTH AS "Month",
       COUNT(DISTINCT p1.user_id) AS "Users created",
       NULL AS "Logins",
       NULL AS "Users Logged in",
       NULL AS "Resource views",
       NULL AS "Resources viewed"
FROM (SELECT To_char(person.created_on_date, 'YYYY-MM') AS MONTH,
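A hedged sketch of a month-spine approach that usually stays fast: build the list of months once, pre-aggregate each table on its own, and left join the three small aggregates to the spine, so every base table is scanned exactly once. Column names follow the post where given; the 12-month window is an assumption:

WITH months AS (
  SELECT TO_CHAR(ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1 - LEVEL), 'YYYY-MM') AS month
  FROM   dual CONNECT BY LEVEL <= 12
), logins AS (
  SELECT TO_CHAR(timestamp, 'YYYY-MM') AS month,
         COUNT(*) AS logins,
         COUNT(DISTINCT user_id) AS users_logged_on
  FROM   user_login_event
  GROUP  BY TO_CHAR(timestamp, 'YYYY-MM')
), persons AS (
  SELECT TO_CHAR(created_on_date, 'YYYY-MM') AS month,
         COUNT(DISTINCT user_id) AS users_created
  FROM   person
  GROUP  BY TO_CHAR(created_on_date, 'YYYY-MM')
), rviews AS (
  SELECT TO_CHAR(timestamp, 'YYYY-MM') AS month,
         COUNT(*) AS resource_views,
         COUNT(DISTINCT resource_id) AS resources_viewed
  FROM   resource_viewed_event
  GROUP  BY TO_CHAR(timestamp, 'YYYY-MM')
)
SELECT m.month, l.logins, l.users_logged_on, p.users_created, v.resource_views, v.resources_viewed
FROM   months m
LEFT JOIN logins  l ON l.month = m.month
LEFT JOIN persons p ON p.month = m.month
LEFT JOIN rviews  v ON v.month = m.month
ORDER  BY m.month;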
and I want to check if there is an event in my EVENTS table that occurs on the same dd/mm/yyyy as the input, one that can disturb the input event's times. That means:
input.event_start_time is between EVENTS.event_date and EVENTS.event_end_date and input.event_end_time is between EVENTS.event_date and EVENTS.event_end_date
but comparing only the hours here (HH24:MI), because the date (dd/mm/yyyy) is checked beforehand.
I don't know how to cut only the hours out of the date and compare them, and I don't know how to write the whole function.
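A hedged sketch of the time-of-day comparison: TO_CHAR with 'HH24:MI' yields zero-padded strings that compare correctly as text, and the classic overlap test (start1 < end2 and end1 > start2) does the disturbance check. The bind names are illustrative; the EVENTS columns follow the post:

SELECT COUNT(*)
FROM   events e
WHERE  TRUNC(e.event_date) = TRUNC(:input_start)   -- same dd/mm/yyyy, checked before
AND    TO_CHAR(:input_start, 'HH24:MI') < TO_CHAR(e.event_end_date, 'HH24:MI')
AND    TO_CHAR(:input_end,   'HH24:MI') > TO_CHAR(e.event_date,     'HH24:MI');

A nonzero count means the input window collides with an existing event on that day.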
Here you can see that we have data for 27th Dec 2010 for hours 02, 07, 09 and 12 only. I want a query which will show the full 24 hours' data even if an hour has no records, like the following:
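A hedged sketch using an hour spine: CONNECT BY generates the 24 hours, and a left join keeps the empty ones at zero. The table and column names are hypothetical:

WITH hours AS (
  SELECT LEVEL - 1 AS hr FROM dual CONNECT BY LEVEL <= 24
)
SELECT h.hr, COUNT(t.sample_ts) AS entries
FROM   hours h
LEFT JOIN samples t
  ON   EXTRACT(HOUR FROM t.sample_ts) = h.hr      -- sample_ts assumed TIMESTAMP
 AND   TRUNC(t.sample_ts) = DATE '2010-12-27'
GROUP  BY h.hr
ORDER  BY h.hr;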
I am using one query but not getting the correct minutes. Here is my query:
v_Interval := to_timestamp(v_temphrs,'HH24:MI:SS') - to_timestamp(v_outpunch1,'HH24:MI:SS');
v_TotalHrsMin1 := extract(hour from v_interval) * 60 + extract(minute from v_interval);
Here v_interval's datatype is INTERVAL DAY TO SECOND; v_temphrs is VARCHAR2 with the value 12:00:00; v_outpunch1 is VARCHAR2 with the value 06:10:00; and v_totalhrsmin1 is NUMBER.
I should get the value 370 here, but I am getting the value 350.
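For what it's worth, 12:00:00 minus 06:10:00 is 5 hours 50 minutes, which is exactly 350 minutes, so the extraction agrees with those inputs; 370 minutes would require a gap of 6 hours 10 minutes, suggesting the inputs rather than the extraction are worth checking. A self-contained re-run of the math:

DECLARE
  v_interval INTERVAL DAY TO SECOND;
  v_total    NUMBER;
BEGIN
  v_interval := TO_TIMESTAMP('12:00:00', 'HH24:MI:SS')
              - TO_TIMESTAMP('06:10:00', 'HH24:MI:SS');
  v_total := EXTRACT(HOUR FROM v_interval) * 60 + EXTRACT(MINUTE FROM v_interval);
  DBMS_OUTPUT.PUT_LINE(v_total);  -- prints 350
END;
/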
I am using the SQL query below to calculate working hours. The problem I am facing is that the query takes a lot of time to calculate the working hours. How can I reduce the execution time of this query, or is there any other way to calculate working hours?
The following query takes 63.499 sec:
SELECT sql_calc_found_rows gstime, MAX(stoptime) AS mx, MIN(starttime) AS mn,
I want to add the TIMEZONE_HOUR to my timestamp. I tried:

SELECT audittimestamp + interval (SELECT EXTRACT(TIMEZONE_HOUR FROM systimestamp) FROM DUAL) hour FROM tab1;
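An INTERVAL literal only accepts a constant, which is why a subquery cannot appear inside it. NUMTODSINTERVAL takes any numeric expression, so the extracted offset can be added directly:

SELECT audittimestamp
       + NUMTODSINTERVAL(EXTRACT(TIMEZONE_HOUR FROM SYSTIMESTAMP), 'HOUR')
FROM   tab1;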
Recently we upgraded from 11.1 to 11.2, but after the upgrade, SQL statements that ran fine in 11.1 run for hours in 11.2. Statistics were gathered at 100%...
Format of dtActivityStartDate/dtActivityFinishDate: 2010-09-17 14:50:51.150 (both fields use this format)
vcActivityName = Process Request
usdFuncTimeCalc(vcActivityName, dtActivityStartDate, dtActivityFinishDate)
I need to calculate the time elapsed for that type of activity. The following are the rules:
(If Process Request is the activity) Working days: Monday through Saturday. Hours of operation: 9 AM to 5 PM.
Only the working hours of each day should be counted. For example, if dtActivityStartDate is Sep 15 at 11 AM and dtActivityFinishDate is Sep 17 at 10 AM, then the elapsed time is 11 AM to 5 PM on Sep 15, 9 to 5 on Sep 16, and 9 to 10 AM on Sep 17, so the total should be 6 + 8 + 1 = 15 hours, plus minutes. Format of date time: 2010-09-17 14:50:51.150. vcActivityName = Process Request. Don't worry about Process Request...
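A hedged sketch of such a function under the Process Request rules (Monday through Saturday, 9 AM to 5 PM); it drops the vcActivityName dispatch and walks the span day by day, clipping each day to the working window. For the Sep 15 11 AM to Sep 17 10 AM example it returns 15:

CREATE OR REPLACE FUNCTION working_hours (      -- illustrative name
  p_start IN DATE,
  p_end   IN DATE
) RETURN NUMBER IS
  v_day   DATE := TRUNC(p_start);
  v_from  DATE;
  v_to    DATE;
  v_total NUMBER := 0;
BEGIN
  WHILE v_day <= TRUNC(p_end) LOOP
    IF TO_CHAR(v_day, 'DY', 'NLS_DATE_LANGUAGE=ENGLISH') <> 'SUN' THEN
      v_from := GREATEST(p_start, v_day + 9/24);    -- not before 9 AM
      v_to   := LEAST(p_end, v_day + 17/24);        -- not after 5 PM
      IF v_to > v_from THEN
        v_total := v_total + (v_to - v_from) * 24;  -- day fraction to hours
      END IF;
    END IF;
    v_day := v_day + 1;
  END LOOP;
  RETURN v_total;
END;
/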