SQL & PL/SQL :: Sum Daily Total To Weekly In Oracle?
Feb 17, 2010
I need to sum the daily totals below into weekly totals, with weeks starting on Saturday, Jan 3 2009, for the 52 weeks of 2009. The query below gives me the weekly totals, but with the week starting on MONDAY; I need the week to start on SATURDAY instead.
select to_char(report_date, 'YYYYIW'), sum(total)
from report_table
where to_number(to_char(report_date,'YYYYIW')) >=
to_number(to_char(to_date( '&one_year_ago'),'YYYYIW'))
group by to_char(report_date, 'YYYYIW')
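One way to get weeks that run Saturday through Friday is to group on a shifted TRUNC instead of the IW format string. A sketch against the same report_table columns (the fixed start-date filter stands in for the &one_year_ago substitution): TRUNC(report_date + 2, 'IW') returns the Monday of the ISO week containing report_date + 2, so subtracting 2 days gives the Saturday on or before report_date.
select trunc(report_date + 2, 'IW') - 2 as week_start,
       sum(total)                       as weekly_total
from   report_table
where  report_date >= to_date('2009-01-03', 'YYYY-MM-DD')   -- first Saturday of 2009
group by trunc(report_date + 2, 'IW') - 2
order by week_start;
Grouping on the actual Saturday date also avoids the year-boundary quirks of 'YYYYIW' (ISO week 1 can start in the previous calendar year).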
Here is a query I used to generate the Daily Sample Data:
SELECT DISTINCT A.PRODUCT,
       TO_CHAR(B.BEGIN_DT,'YYYY-MM-DD') AS post_date,
       'RCSL',
       A.DEPTID,
       A.ACCOUNT,
       SUM(A.POSTED_TOTAL_AMT) AS Amount_Posted
FROM   A, B, C
WHERE  ...   -- join conditions omitted from the original post
AND    B.BEGIN_DT BETWEEN TO_DATE('&StartDate','YYYY-MM-DD') AND TO_DATE('&EndDate','YYYY-MM-DD')
GROUP BY A.PRODUCT, TO_CHAR(B.BEGIN_DT,'YYYY-MM-DD'), A.DEPTID, A.ACCOUNT
For the sample data below, I would need to add the data from 2010-01-02 through 2010-01-08 to get my weekly total.
I want to run the salary posting procedure automatically at midnight every day. I created a job in Toad and defined the required information, but it is not working.
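If the Toad-created job uses the older DBMS_JOB interface, it is worth checking that JOB_QUEUE_PROCESSES is greater than 0 and that the job is not flagged as broken in DBA_JOBS. Alternatively, a DBMS_SCHEDULER job can be created directly; a minimal sketch, assuming the procedure is called SALARY_POSTING (substitute your own procedure name):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_SALARY_POSTING',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'SALARY_POSTING',   -- your procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=0;BYMINUTE=0;BYSECOND=0',   -- midnight every day
    enabled         => TRUE,
    comments        => 'Run salary posting at midnight daily');
END;
/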
Oracle machine A (11gR1) is located in San Jose. Oracle machine B (10gR2) is located in Sydney.
I need to build a daily replica from A to B. I don't need the entire database, only a single schema.
The entire DB is about 1 TB. The gzipped Data Pump export of the schema is 5+ GB. Tnsping from A to B takes 500+ ms. The usual route (expdp on A -> ftp -> impdp on B) will be too slow over this poor network. Which way would be the best to implement it?
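Whatever transport is chosen, note that a dump written by 11gR1 expdp cannot be read by 10gR2 impdp unless the export is run with VERSION=10.2. A sketch of the export side (credentials, schema name and directory are placeholders):
expdp system/password SCHEMAS=app_schema VERSION=10.2 DIRECTORY=dp_dir DUMPFILE=app_schema.dmp LOGFILE=app_schema_exp.log
Gzipping the dump before transfer (or using rsync with compression) is usually the practical answer to the slow link. A network-mode import (impdp ... NETWORK_LINK=...) would avoid the file transfer entirely, but it pulls the rows uncompressed over SQL*Net, which is likely to be worse on a 500 ms link.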
How do I get a page-wise total on every page and a grand total on the last page in an Oracle tabular report? I have tried placing a summary column at the report level, with its reset property set to page-wise and printing it on every page but the last, but it did not work.
We want to know the number of instances, the number of RAC databases, and the total disk space used by Oracle (not the file-system size).
1. Is there a script that can be run from OEM Grid Control against all instances/databases? Or
2. We have a repository Unix server that holds the tnsnames entries for all the databases; is there a script we can run from there?
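For the disk-space part, each database can report the storage it has allocated to itself, so a small query run through the tnsnames loop (or as an OEM job against all targets) covers it. A sketch; it counts datafiles, tempfiles and online redo, but not archived logs, control files or flashback logs:
select round(( (select sum(bytes) from dba_data_files)
             + (select sum(bytes) from dba_temp_files)
             + (select sum(bytes * members) from v$log) ) / 1024 / 1024 / 1024, 1) as total_gb
from   dual;

select count(*) as instance_count from gv$instance;   -- instances of this (RAC) database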
I am looking to get the maximum value for every 24-hour period over a month. So for example my date range can be defined by...
select to_date('&date','mm yyyy')-1 + level as DateRange from dual connect by level <= '&days'
...where I can provide the first date of the month and the number of days in the month, or a lesser value if less time is required. So, the results of the above query plus 24 hours for the range. I thought some Googling would provide what I needed, but my search came up empty.
I was hoping to do something like this...
select utctime, max(value) from table where utctime between.
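If a "24-hour period" simply means a calendar day, grouping on the truncated timestamp is usually enough and the generated date range is not strictly needed. A sketch, assuming the table is called readings with columns utctime and value (the substitution variables are the same ones used above):
select trunc(utctime) as day_start,
       max(value)     as max_value
from   readings
where  utctime >= to_date('&date', 'mm yyyy')
and    utctime <  to_date('&date', 'mm yyyy') + to_number('&days')
group by trunc(utctime)
order by day_start;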
How do I run three scripts one after another automatically on a daily basis? Say I have three scripts A, B and C, and I want to run A followed by B followed by C.
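If the work done by A, B and C can be wrapped in stored procedures, a single DBMS_SCHEDULER job running one PL/SQL block guarantees the order; for OS-level .sql scripts, a wrapper script or a scheduler chain serves the same purpose. A minimal sketch (proc_a, proc_b and proc_c are hypothetical wrappers for the three scripts):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_A_B_C',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN proc_a; proc_b; proc_c; END;',   -- runs strictly in this order
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=1',   -- 1 AM every day
    enabled         => TRUE);
END;
/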
I'm working on my test db, and what I thought would be an easy task is turning out to be very difficult. I am trying to set up a cron job to email me the alert log daily so I can check for errors. I tested the script and it is not working. This is what I have to email me the alert log:
mailx -s "Alert Log for Testdb" oracleexample@gmail.com < /testapp/oracle/diag/rdbms/testdb/testdb/trace/alert_testdb.log
The redirection has to read the log as the message body ('<'); using '>' instead sends mailx's output into the alert log, truncating it, and gives mailx nothing to send.
There are two databases, A and B. Database A is Oracle 11.2.0.2 on Linux and database B is Oracle 11.2.0.2 on a Windows XP machine. In database A there are hundreds of tables that are updated every 10 or 15 minutes. The developer wants to run reports against these tables, but since database A is constantly being updated, generating the reports takes 15 to 20 minutes. So the reports should be generated in database B instead. Once a day database B should pick up the updated data from database A, so that the reports can be generated in database B in less time. What would be the best way for database B to get the updated data from database A on a daily basis?
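One low-maintenance option is a set of materialized views in database B that refresh over a database link once a day. A sketch (the link name, credentials, TNS alias and table are placeholders; a FAST refresh would additionally need materialized view logs on A):
-- on database B
CREATE DATABASE LINK link_to_a
  CONNECT TO report_reader IDENTIFIED BY password
  USING 'TNS_ALIAS_FOR_A';

CREATE MATERIALIZED VIEW orders_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH TRUNC(SYSDATE) + 1 + 2/24   -- first refresh at 02:00 tomorrow
  NEXT       TRUNC(SYSDATE) + 1 + 2/24   -- then daily at 02:00
AS
  SELECT * FROM orders@link_to_a;
Heavier-weight alternatives (GoldenGate, or a nightly Data Pump refresh of the reporting schema) trade setup effort for completeness.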
However, I need to refresh the group manually sometimes, so I cannot set the interval to sysdate+1. I have tried setting the interval as follows, but it is not correct:
trunc(sysdate+1) + 4/24 -- the next refresh date then shows 9:25:20 pm.
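A fixed-time interval expression is normally the answer here, because it re-evaluates to the same wall-clock time regardless of when a manual refresh runs. A sketch, assuming the refresh group is called MY_GROUP and the target time is 04:00 (adjust the name and the 4/24 to suit):
BEGIN
  DBMS_REFRESH.CHANGE(
    name      => 'MY_GROUP',
    next_date => TRUNC(SYSDATE) + 1 + 4/24,    -- next run: tomorrow at 04:00
    interval  => 'TRUNC(SYSDATE) + 1 + 4/24'   -- re-evaluated after every refresh, manual or scheduled
  );
END;
/
After the change, DBA_REFRESH shows the NEXT_DATE and INTERVAL actually stored for the group.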
We have a two-node RAC 11gR2 (11.2.0.3) installed on Linux 5.5.
My problem is that the private interconnect gets disconnected daily at 12:00 PM and 3:00 PM ONLY, and comes back to life within two minutes. I am using a cross-over cable to connect the private interfaces. The ocssd.log states:
2012-09-18 15:06:43.735: [ CSSD][1096161600]clssnmPollingThread: node racmain1 (1) at 50% heartbeat fatal, removal in 14.340 seconds
2012-09-18 15:06:50.752: [ CSSD][1096161600]clssnmPollingThread: node racmain1 (1) at 75% heartbeat fatal, removal in 7.330 seconds
2012-09-18 15:07:19.821: [ CSSD][1096161600]clssnmPollingThread: node racmain1 (1) at 90% heartbeat fatal, removal in 2.480 seconds, seedhbimpd 1
I need a good suggestion. Basically I have two different locations:
1- Factory 2- Head office
I have developed the PRODUCTION module at the factory and it is working fine. Now I want to send data to head office on a daily basis, so I developed a form that creates a backup in export format (.dmp); the user then sends it to head office via email.
The backup file is saved in a pre-defined location, and the user then runs a form I developed for loading the data at head office. There are two buttons on this form:
The first is used to load the data: it first loads it into a temporary user, which is created whenever the user presses the button. The second is used to copy the data into the application user, but it first checks whether the data already exists; if so it updates, otherwise it inserts.
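The "check if data exists then update otherwise insert" step of the second button is exactly what a single MERGE statement does. A sketch, assuming a hypothetical PRODUCTION_DATA table keyed on PROD_ID in both the temporary user and the application user:
MERGE INTO app_user.production_data tgt
USING temp_user.production_data src
ON    (tgt.prod_id = src.prod_id)
WHEN MATCHED THEN
  UPDATE SET tgt.qty        = src.qty,
             tgt.entry_date = src.entry_date
WHEN NOT MATCHED THEN
  INSERT (prod_id, qty, entry_date)
  VALUES (src.prod_id, src.qty, src.entry_date);
One MERGE per table replaces the row-by-row exists/update/insert logic and is usually much faster.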
I have to create a function. I need to find the latest logout time for each agent, per day. For example, if an agent logged in for the first time at 9:00, logged out at 12:00, logged in again at 14:00 and logged out at 15:00, the time I need my report to show is 15:00. How can I do that? To make it easier to understand, here is the query I am using:
select a.login as login2,
       To_Char(max(s.endtime), 'dd/MM/yyyy, HH24:MI:SS') as lastLogout
from   cti.agent a
inner join cti.agentsessionlog s
        on s.agentid = a.agentid
       and To_Char(s.endtime) != '31-DEC-99 11.59.59.000000 PM'
group by a.login;
This query returns the agent's login and the agent's last logout time. It works fine if I add a BETWEEN on the date, but I cannot do that here. If I use this query as it is and export a report for 31/05, it shows as lastLogout the logout from 01/06 or 02/06. Is there a function I can use? I have a deadline.
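One way to keep the report to a single day without hard-coding a BETWEEN is to bind the report date and use a half-open range on the end time. A sketch built on the query above (:report_date is a bind or report parameter holding the day at midnight):
select a.login as login2,
       to_char(max(s.endtime), 'dd/MM/yyyy, HH24:MI:SS') as lastLogout
from   cti.agent a
join   cti.agentsessionlog s
       on s.agentid = a.agentid
where  s.endtime >= :report_date          -- e.g. 31/05 00:00
and    s.endtime <  :report_date + 1      -- strictly before 01/06 00:00
and    to_char(s.endtime) != '31-DEC-99 11.59.59.000000 PM'
group by a.login;
Adding trunc(s.endtime) to the GROUP BY instead would give one row per agent per day if the whole month has to come out of a single run.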
I made a small inventory application for a medical store. Now I want the daily data in a DMP file. How do I get only the current date's data into the DMP file? I don't need everything.
I mean I have 30 tables in Oracle. They are updated daily with new entries; some tables have a date column and some do not. I want to send the daily data via mail.
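For the tables that do have a date column, Data Pump's QUERY parameter can restrict the export to today's rows; tables without one have no way to identify "today's" data and would have to be exported in full or tracked some other way. A sketch of a parameter file, say daily_exp.par (sales and entry_date are placeholder names; the file keeps the quotes away from the shell):
DIRECTORY=dp_dir
DUMPFILE=daily_%U.dmp
LOGFILE=daily_exp.log
TABLES=sales
QUERY=sales:"WHERE entry_date >= TRUNC(SYSDATE)"
The export then runs as expdp inv_user/password PARFILE=daily_exp.par, and the resulting dump can be attached to the daily mail.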
Oracle 11gR2 on HP-UX 11.23. I need to develop a shell script that will monitor the alert log daily.
It should produce an audit trail (checking only for warnings/errors) covering the last day up to the collection time and write the result to a file. If there are no errors it should output "No Errors/Warnings"; otherwise it should output the relevant lines.
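If the script ends up calling sqlplus anyway, one 11g option is to read the alert log through the undocumented X$DBGALERTEXT fixed table (as SYS) rather than parsing the file's timestamps by hand; a sketch of the query side:
-- run as SYS: alert-log entries from the last 24 hours that look like errors or warnings
select originating_timestamp, message_text
from   x$dbgalertext
where  originating_timestamp > systimestamp - interval '1' day
and    (message_text like '%ORA-%' or lower(message_text) like '%warn%')
order by originating_timestamp;
The shell wrapper then only has to check whether the spooled output is empty and print "No Errors/Warnings" if it is.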
create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read, write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log
SUCCESS MESSAGE
. . exported "ORAASFS"."TM_SFS_CUST_01"    387.2 MB    817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
  +RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59
I would like to run this daily and delete dumps older than 14 days, but my script shows an error. What could be the solution to get this script running?
#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History
My example: I'm given an allowance throughout the week. It happens to be 10 dollars, but it can vary from day to day. I can create a running total with SUM(Amt) OVER (...); this is the CUMUL column in the example below.
On certain days I've spent different percentages of the allowance (the SPENT column, which is a field in the database). What I can't manage to create is the AMTLEFT column in the example below. AMTLEFT seems to be a kind of running total that "refers to itself", and this is where I'm stumped.
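When each row's value depends on the previous row's computed value rather than only on the raw data, SUM(...) OVER is not enough; recursive subquery factoring (11gR2 and later) or the MODEL clause can carry the value forward. A sketch under the assumption that AMTLEFT = (previous AMTLEFT + today's Amt) minus the SPENT percentage of that available amount, with a hypothetical allowance(day_no, amt, spent_pct) table where spent_pct is a fraction such as 0.25:
with ordered as (
  select day_no, amt, spent_pct,
         row_number() over (order by day_no) as rn
  from   allowance
),
running (rn, day_no, amt, spent_pct, amt_left) as (
  -- first day: spend the percentage of what is available
  select rn, day_no, amt, spent_pct, amt * (1 - spent_pct)
  from   ordered
  where  rn = 1
  union all
  -- each later day: add the day's allowance to what was left,
  -- then take off the percentage spent of that available amount
  select o.rn, o.day_no, o.amt, o.spent_pct,
         (r.amt_left + o.amt) * (1 - o.spent_pct)
  from   running r
  join   ordered o on o.rn = r.rn + 1
)
select day_no, amt, spent_pct, amt_left
from   running
order by day_no;
If AMTLEFT is defined differently in the real example, only the two amt_left expressions (anchor and recursive branch) change; the recursive shape stays the same.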
I'm trying to rank USERNAME based on the total sum of the amount waived, but I want to avoid ranking the overall TOTAL row at the bottom, and I don't want the rows in ranking order; I want the order to stay the same as it currently is.
SELECT DECODE(GROUPING(USERNAME), 1, 'TOTAL', 0, UPPER(USERNAME)) AS "USERNAME",
       SUM(CASE WHEN TO_CHAR(DATE_PROCESSED,'MON') = 'JAN' THEN AMOUNT_WAIVED ELSE 0 END) AS JAN,
       SUM(CASE WHEN TO_CHAR(DATE_PROCESSED,'MON') = 'FEB' THEN AMOUNT_WAIVED ELSE 0 END) AS FEB,
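Since the query already uses GROUPING(USERNAME) (so presumably GROUP BY ROLLUP(USERNAME)), an analytic RANK partitioned on that same flag keeps the TOTAL row out of the real ranking while leaving the existing row order alone. A sketch of the extra column to add alongside the month columns above:
CASE WHEN GROUPING(USERNAME) = 1 THEN NULL            -- no rank on the TOTAL row
     ELSE RANK() OVER (PARTITION BY GROUPING(USERNAME)
                       ORDER BY SUM(AMOUNT_WAIVED) DESC)
END AS WAIVED_RANK
Because the rank is just another column, the ORDER BY of the query can stay exactly as it is now.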
I would like to give back to our application user a page of results for a given query along with the total result count, something like: "Showing 1-25 of 650 total results".
Currently I am doing this by submitting a second query:
select count(*) from (<previous query criteria>)
Is there a better performing approach I could be using?
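One common alternative is to let the paging query return the total itself through an analytic count, so the second round trip disappears. A sketch (orders and order_date are placeholders; :lo and :hi are the page boundaries):
select *
from  (select o.*,
              count(*) over ()                          as total_results,   -- count of all matching rows
              row_number() over (order by o.order_date) as rn
       from   orders o
       where  o.order_date >= trunc(sysdate) - 30)
where rn between :lo and :hi;     -- e.g. 1 and 25 for the first page
Whether this beats the separate COUNT(*) depends on the plan: the analytic form still has to visit every matching row before the first page can be returned, so it is worth timing both against the real predicates.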
I am writing a report that breaks on the first 4 fields. That part is working fine. I also want a count for each group (the 4 fields) and a grand total. Since I want to break on all 4 fields as if they were one combined field, I made a concatenated column (called BREAK_KEY) and had the report total on that.
I was surprised when the count appeared at the top of each group rather than at the bottom. The grand total is at the very bottom of the report, as I would have expected. How can I get the sub-totals at the bottom rather than the top?
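If it is easier to fix in the query than in the report layout, GROUP BY ROLLUP on the combined key puts each group's count after its data is aggregated and adds a grand-total row at the very end. A sketch with hypothetical columns f1..f4 standing in for the four break fields (this produces only the counts; the detail rows would still come from the report itself or from GROUPING SETS):
select decode(grouping_id(f1, f2, f3, f4),
              0, f1 || '-' || f2 || '-' || f3 || '-' || f4,
              'GRAND TOTAL')  as break_key,
       count(*)               as row_count
from   report_source
group by rollup((f1, f2, f3, f4))
order by grouping_id(f1, f2, f3, f4), f1, f2, f3, f4;
In most report tools, though, top versus bottom is decided by where the summary field sits (group header versus group footer), so moving the count into the group footer is the layout-side fix.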