SQL & PL/SQL :: Archive Records In A Table
Mar 16, 2012
We have a table which gets updated on a daily basis. We need to retain records for 30 days and move the remaining (older) records to another table.
I understand that we can use an INSERT ... SELECT statement followed by a DELETE statement to achieve this. Is there any SQL to move the records directly to another table rather than inserting and then deleting?
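There is no single SQL statement that moves rows between two ordinary heap tables; the usual pattern is INSERT ... SELECT plus DELETE in one transaction. A minimal sketch, assuming hypothetical names DAILY_TXN, DAILY_TXN_ARCH and a CREATED_DATE column:

INSERT INTO daily_txn_arch
SELECT *
FROM   daily_txn
WHERE  created_date < TRUNC(SYSDATE) - 30;

DELETE FROM daily_txn
WHERE  created_date < TRUNC(SYSDATE) - 30;

COMMIT;

If the table can be range-partitioned on the date column, ALTER TABLE ... EXCHANGE PARTITION (or DROP PARTITION) is the closest thing to a direct move, since it swaps or removes a whole partition as a metadata operation.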
View 1 Replies
Jun 1, 2010
I am trying to update records in the target table based on the records coming in from the source. If an incoming record is already present in the target I update it; otherwise I insert it. I have over one million records in my source, while my target has 46 million records. The target table is partitioned on the calendar key. I implement this whole logic using Informatica. Looking at the Informatica session log I find that the mapping itself is fine, but the update part takes a very long time (more than 5 days to update one million records). The TARGET TABLE definition and the UPDATE query are below.
TARGET TABLE:
CREATE TABLE OPERATIONS.DENIAL_REGRET_FACT
(
CALENDAR_KEY INTEGER NOT NULL,
DAY_TIME_KEY INTEGER NOT NULL,
SITE_KEY NUMBER NOT NULL,
RESERVATION_AGENT_KEY INTEGER NOT NULL,
LOSS_CODE VARCHAR2(30) NOT NULL,
PROP_ID VARCHAR2(5) NOT NULL,
[code].....
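If the row-by-row update path in Informatica is the bottleneck, a single set-based MERGE pushed to the database may be worth testing. A hedged sketch, assuming a hypothetical staging table STG_DENIAL_REGRET that holds the incoming million rows and that the first four key columns identify a row (adjust the ON clause to the real unique key):

MERGE INTO operations.denial_regret_fact t
USING stg_denial_regret s
   ON (    t.calendar_key          = s.calendar_key
       AND t.day_time_key          = s.day_time_key
       AND t.site_key              = s.site_key
       AND t.reservation_agent_key = s.reservation_agent_key)
WHEN MATCHED THEN
  UPDATE SET t.loss_code = s.loss_code,
             t.prop_id   = s.prop_id
WHEN NOT MATCHED THEN
  INSERT (calendar_key, day_time_key, site_key, reservation_agent_key, loss_code, prop_id)
  VALUES (s.calendar_key, s.day_time_key, s.site_key, s.reservation_agent_key, s.loss_code, s.prop_id);

Because CALENDAR_KEY is part of the join, partition pruning can limit how much of the 46-million-row target is touched.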
View 9 Replies
View Related
Jul 7, 2011
I have a partitioned table that is streamed to another database. I need to archive data on that table. That is I need to add a partition and remove a partition.
If I make those changes to the source table, will it stream over to the destination table?
If not, can I ...
pause streaming, make the changes to the source table, make the same changes to the destination table, then re-enable streaming. I know making data changes to the destination table can break Streams, but I'm not sure whether that holds for DDL.
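For reference, a sketch of the partition maintenance itself, assuming a hypothetical range-partitioned table SALES_HIST partitioned by a date column. Whether the DDL propagates depends on whether DDL capture was included in the Streams capture/apply rules, so checking those rule sets before relying on it seems prudent:

ALTER TABLE sales_hist
  ADD PARTITION p_2011_08 VALUES LESS THAN (DATE '2011-09-01');

ALTER TABLE sales_hist
  DROP PARTITION p_2010_07;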
View 1 Replies
View Related
Aug 8, 2013
We have data archive scripts that move data for a date range to a different table. The script has two parts: first it copies data from the original table to the archive table, then it deletes the copied rows from the original table. The first part executes very fast, but the deletion takes too long, around 2-3 hours. The customer analysed the delete query and says the script is not using an index and is doing a full table scan, even though the predicate is the primary key. More info below.
CREATE TABLE "APP"."MON_TXNS" ( "ID_TXN" NUMBER(12,0) NOT NULL ENABLE, "BOL_IS_CANCELLED" VARCHAR2(1 BYTE) DEFAULT 'N' NOT NULL ENABLE, "ID_PAYER" NUMBER(12,0), "ID_PAYER_PI" NUMBER(12,0), "ID_PAYEE" NUMBER(12,0), "ID_PAYEE_PI" NUMBER(12,0), "ID_CURRENCY" CHAR(3 BYTE) NOT NULL ENABLE, "STR_TEXT" VARCHAR2(60 CHAR), "DAT_MERCHANT_TIMESTAMP" DATE, "STR_MERCHANT_ORDER_ID" VARCHAR2(30 BYTE), "DAT_EXPIRATION" DATE, "DAT_CREATION" DATE, "STR_USER_CREATION" VARCHAR2(30 CHAR), "DAT_LAST_UPDATE"
[Code]...
Data is first moved to the table schema3.OTW, and then we delete all the rows that are in OTW from the original table. Below is the explain plan for the delete:
SQL> explain plan for
  2  delete from schema1.mon_txns where id_txn in (select id_txn from schema3.OTW);

Explained.

SQL> select * from table(dbms_xplan.display);
PLAN_TABLE_OUTPUT--------------------------------------------------------------------------------------------------------------------------------------------
Plan hash value: 2798378986
-------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
-------------------------------------------------------------------------------------
|   0 | DELETE STATEMENT        |            |  2520 |  233K |    87   (2)| 00:00:02 |
|   1 |  DELETE                 | MON_TXNS   |       |       |            |          |
|*  2 |   HASH JOIN RIGHT SEMI  |            |  2520 |  233K |    87   (2)| 00:00:02 |
|   3 |    INDEX FAST FULL SCAN | OTW_ID_TXN |  2520 | 15120 |     3   (0)| 00:00:01 |
|   4 |    TABLE ACCESS FULL    | MON_TXNS   | 14260 | 1239K |    83   (0)| 00:00:02 |
-------------------------------------------------------------------------------------
PLAN_TABLE_OUTPUT
--------------------------------------------------------------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
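If the optimizer keeps choosing the hash join with a full scan of MON_TXNS, one workaround to test is driving the delete from the archive table's keys in batches, so each DELETE is a single-row primary key access. A minimal sketch, assuming ID_TXN is the primary key:

DECLARE
  TYPE t_ids IS TABLE OF schema1.mon_txns.id_txn%TYPE;
  l_ids t_ids;
  CURSOR c IS SELECT id_txn FROM schema3.otw;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_ids LIMIT 10000;
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT
      DELETE FROM schema1.mon_txns
      WHERE  id_txn = l_ids(i);   -- single-row primary key access
    COMMIT;                       -- commit per batch to keep undo small
  END LOOP;
  CLOSE c;
END;
/

That said, the plan above prices the full scan of ~14,000 rows as cheap, so the 2-3 hour runtime may also be coming from row counts far larger than the statistics suggest, or from triggers or unindexed foreign keys referencing MON_TXNS.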
View 6 Replies
View Related
Feb 7, 2012
We deleted millions of records from a table.
1. Is it necessary to reorganize the table and indexes after deleting records from the table? I see some change in table size after table and index reorganization.
2. Will reorganizing the table and indexes improve database performance?
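If you do decide to reclaim the space after a mass delete, a sketch of the usual steps (table and index names are placeholders; a MOVE leaves the indexes UNUSABLE until they are rebuilt, and the table is locked while it runs):

ALTER TABLE my_schema.big_table MOVE;            -- rebuilds the segment below a new high-water mark
ALTER INDEX my_schema.big_table_pk REBUILD;      -- repeat for every index on the table
EXEC DBMS_STATS.GATHER_TABLE_STATS('MY_SCHEMA', 'BIG_TABLE');

Whether this improves performance depends on the access paths: full scans benefit from the lower high-water mark, while single-row index lookups usually see little difference.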
View 7 Replies
View Related
Jul 17, 2013
Oracle 11g. I have a large table of 125 million records - t3_universe. This table never gets updated or altered once loaded, but holds data that we receive from a lead company. I need to select records from this large table that fit certain demographic criteria and insert them into a smaller table - T3_Leads - which will then be updated with when the lead is mailed and other relevant information.
I have tried a variety of things - views, materialized views, direct insert into the smaller table... I think I am probably missing other approaches. My current attempt has been to create a view using the query that selects the records, as shown below, and then a second query that inserts into T3_Leads from this view V_Market. This is very slow. Can I just use an INSERT INTO T3_Leads with this query? It did not seem to work with the WITH clause. My index on the large table is t3_universe_composite and includes zip_code, address_key, household_key.
CREATE VIEW V_Market AS
WITH got_pairs AS
(
  SELECT /*+ INDEX_FFS(t3_universe t3_universe_composite) */
         l.zip_code, l.zip_plus_4, l.p1_givenname, l.surname, l.address, l.city, l.state,
         l.household_key, l.hh_type AS l_hh_type, l.address_key, l.narrowband_income,
         l.p1_ms, l.p1_gender, l.p1_exact_age, l.p1_personkey, e.hh_type AS filler_data,
         l.p1_seq_no, l.p2_seq_no,
         ROW_NUMBER () OVER ( PARTITION BY l.address_key
                              ORDER BY l.hh_verification_date DESC ) AS r_num
  FROM   t3_universe e
  JOIN   t3_universe l
    ON   l.address_key = e.address_key
   AND   l.zip_code = e.zip_code
   AND   l.p1_gender != e.p1_gender
[code]....
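For what it's worth, an INSERT ... SELECT can carry the WITH clause directly, so the intermediate view may not be needed, and the APPEND hint allows a direct-path load. A minimal sketch of the shape (column lists are abbreviated and therefore partly hypothetical, and the r_num = 1 filter is a guess at how the view was meant to be consumed):

INSERT /*+ APPEND */ INTO t3_leads (zip_code, address_key, household_key /* , ... */)
WITH got_pairs AS (
  SELECT l.zip_code, l.address_key, l.household_key,
         ROW_NUMBER() OVER (PARTITION BY l.address_key
                            ORDER BY l.hh_verification_date DESC) AS r_num
  FROM   t3_universe e
  JOIN   t3_universe l
    ON   l.address_key = e.address_key
   AND   l.zip_code    = e.zip_code
   AND   l.p1_gender  != e.p1_gender
)
SELECT zip_code, address_key, household_key /* , ... */
FROM   got_pairs
WHERE  r_num = 1;

After a direct-path insert a COMMIT is needed before the loaded rows can be queried in the same session.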
View 2 Replies
View Related
Mar 25, 2013
Following is the requirement:
External Table
WKSHT_FILE_EXT
wksht_line
Export Table
Wksht_export
global_id    varchar2(10)
wksht_line   varchar2(250)
[code]....
Step 1. Insert all records from the external table into the export table. Truncate the export table first.
Step 2. Read in a record from the export map table.
Step 3. Search through the export table records looking for the key words BRANCH =. Compare the branch code with the branch code from the map table.
Step 4. If a match is found, mark all records in the export table for that worksheet with the global ID from the export map table, as follows. The first line of a worksheet is marked by the word WKSHTS. The last line of the worksheet is marked by the words COMPANY CONFIDENTIAL. We will need to capture the line break, so also mark the next line after the COMPANY CONFIDENTIAL line.
Step 5. Continue with Steps 2 - 4 until all records have been processed from the export map table.
First I have to create a procedure to insert data from the external table into the export table. Global_id will be blank; it will be updated with the mapping table's global_id when the EB column's data (i.e. 8P, 2B etc.) matches the BRANCH = NA, 2B etc. in the datasheet loaded from the external table (a rough sketch follows the sample datasheet below). Following is the sample datasheet:
WKSHTS AAAAA BBBBBBBBBBB ELECTRONICS INC. TIME REPORT-DATE PAGE
SORT - BR, SLSREP AEC FIELD SALES REPRESENTATIVE 16:14 09/21/12 1
BRANCH = 2B
EMPLOYEE NAME SALVAAG, GREGG Days in the Month 28
[code]....
There are 2 pages, so I have to split this long report, stored in the WKSHT_LINE column of the export table, into 2 records; likewise, if there are 500 pages there should be 500 records. Then find BRANCH = and the word that follows it (i.e. NA, 2B etc.); if it matches the mapping table's EB column data, the mapping table's global_id is used to update the export table's global_id, which is blank.
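A rough sketch of the first part only (Step 1 plus stamping global_id on the BRANCH = lines); the mapping table name EXPORT_MAP and its columns are assumptions taken from the description, and spreading the global_id across each whole worksheet block (WKSHTS through the line after COMPANY CONFIDENTIAL) would still need an ordering column such as a line sequence:

BEGIN
  EXECUTE IMMEDIATE 'TRUNCATE TABLE wksht_export';

  INSERT INTO wksht_export (global_id, wksht_line)
  SELECT NULL, wksht_line
  FROM   wksht_file_ext;                         -- the external table

  -- stamp global_id on lines containing BRANCH = when the code matches the map table's EB value
  UPDATE wksht_export w
  SET    w.global_id = (SELECT m.global_id
                        FROM   export_map m      -- assumed mapping table name
                        WHERE  m.eb = TRIM(SUBSTR(w.wksht_line,
                                                  INSTR(w.wksht_line, 'BRANCH =') + 8)))
  WHERE  INSTR(w.wksht_line, 'BRANCH =') > 0;

  COMMIT;
END;
/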
View 1 Replies
View Related
Mar 9, 2004
Trying to auto-insert the newest records from one table into another table. I have a vendor-provided table that is part of my database (running Oracle 9i), so I can't change its underlying structure or their process breaks. However, I can add a trigger to it. What I want to do is this:
When the vendor's software inserts a new row (through their own automated process) I want to insert data from that same new record into another table of my own. (where of course I can re-format it, etc., and make the data my own)
The original vendor table does not have an insertion timestamp field to work off of. What is the best way to trigger an insert off the latest inserted record? Replacing all the records from the entire vendor table works, but I only want to insert one record at a time.
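A minimal sketch of a row-level trigger that copies each newly inserted vendor row as it arrives; VENDOR_TBL, MY_COPY_TBL and the column names are placeholders for illustration only:

CREATE OR REPLACE TRIGGER trg_vendor_copy
AFTER INSERT ON vendor_tbl
FOR EACH ROW
BEGIN
  INSERT INTO my_copy_tbl (key_col, data_col, loaded_at)
  VALUES (:NEW.key_col, :NEW.data_col, SYSDATE);   -- reshape the data here as needed
END;
/

Because the trigger fires per inserted row, there is no need to work out the "latest" record afterwards; each new row is handed over in :NEW.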
View 2 Replies
View Related
Jan 23, 2013
I have a procedure that successfully creates an Oracle external table and populates it with the contents of a file. This works fine until one of the fields is a VARCHAR2(2) and I try to insert, say, a 5-character value. When this happens the record in question does not get populated in the external table (and rightly so), but I need to work out whether there is a discrepancy between the number of records in the file and the number of records that actually make it into the table, so I can inform the user that there is a problem.
I have attached the code that creates the external table and populates it.
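One approach, assuming the external table is created with REJECT LIMIT UNLIMITED and a BADFILE, is to compare the typed table's row count with a raw line count. A hedged sketch of the relevant access-parameter fragment (directory and file names are placeholders):

...
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    BADFILE ext_dir:'load.bad'        -- rejected records are written here
    LOGFILE ext_dir:'load.log'
    FIELDS TERMINATED BY ','
  )
  LOCATION ('load.csv')
)
REJECT LIMIT UNLIMITED;

A simple way to get the raw line count from SQL is a second external table over the same file with a single wide VARCHAR2 column; if its COUNT(*) differs from the typed table's COUNT(*), some records were rejected and the user can be pointed at the bad file.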
View 5 Replies
View Related
Apr 21, 2011
I have a requirement to send records from one table to another in batches. For example, if I have 4 records in table A, first I need to send only 2 records to table B, then the remaining 2 records to table B.
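A hedged sketch of batching with BULK COLLECT ... LIMIT, assuming table B has the same column layout as table A (the batch size of 2 matches the example; in practice it would be much larger):

DECLARE
  CURSOR c IS SELECT * FROM table_a;
  TYPE t_rows IS TABLE OF c%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_rows LIMIT 2;    -- 2 records per batch
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO table_b VALUES l_rows(i);
    COMMIT;                                      -- one commit per batch
  END LOOP;
  CLOSE c;
END;
/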
View 10 Replies
View Related
Aug 10, 2011
I have a personname table which contains millions of person-name records. My application has a requirement to return "any" 200 names that match the first name and last name entered by the user. Note this is NOT "top-N" but "any-N": the user wants "any" 200 names, not in any specific order.
Which of the following is the best option for the most efficient search? (A sketch follows the list.)
1) rownum < 201
2) row_number()
3) rank/denserank etc
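For "any N" with no ordering requirement, the plain ROWNUM filter lets Oracle stop as soon as 200 matching rows have been produced, whereas ROW_NUMBER()/RANK() generally have to look at all matches before the outer filter is applied. A minimal sketch (column names assumed):

SELECT *
FROM   personname
WHERE  last_name  = :p_last
AND    first_name = :p_first
AND    ROWNUM    <= 200;

An index on (last_name, first_name) would let this return after reading roughly the first 200 matching entries.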
View 3 Replies
View Related
Jun 23, 2010
I am dealing with a table containing millions of records. I have a table loans_list and the data looks similar to this:
LOAN_ID  SEQUENCE_NUM  COMPLETE_DATE
123      7000
123      7005
123      7010
523      7010          6/23/2010 10:07:02 AM
523      7000          6/23/2010 10:07:02 AM
1223     7000
I am trying to select only those loan_ids from this table which contain all three sequence_num values (7000, 7005, 7010) with a null complete_date. I have tried different ways to write the query but can't think of an efficient way yet.
Since this table contains millions of records, I would prefer not to reference it more than once in the query; I am just trying to avoid a long execution time.
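One single-pass option is to aggregate per loan_id and check that all three sequence numbers are present and that none of those rows has a complete_date. A hedged sketch:

SELECT loan_id
FROM   loans_list
WHERE  sequence_num IN (7000, 7005, 7010)
GROUP  BY loan_id
HAVING COUNT(DISTINCT sequence_num) = 3     -- all three sequences present
   AND COUNT(complete_date) = 0;            -- and no non-null complete_date among them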
View 7 Replies
View Related
Jul 18, 2012
I want to insert 10 records from table A into table B. With a statement-level trigger, how many records get inserted and how many times does the trigger fire? And with a row-level trigger?
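All 10 rows are inserted in both cases; the difference is how often the trigger body runs: a statement-level trigger fires once for the whole INSERT, a row-level trigger fires once per row, i.e. 10 times here. A minimal sketch of the two forms:

CREATE OR REPLACE TRIGGER trg_b_stmt
AFTER INSERT ON table_b
BEGIN
  NULL;   -- runs once per INSERT statement
END;
/

CREATE OR REPLACE TRIGGER trg_b_row
AFTER INSERT ON table_b
FOR EACH ROW
BEGIN
  NULL;   -- runs once for each inserted row (10 times here)
END;
/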
View 2 Replies
View Related
Oct 3, 2013
How do I display all the records in a table when passing the table name as an IN parameter to a procedure/function? For example, if I pass the emp table name it should display 14 records; if I pass dept it should display 4 records.
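Since the table name is only known at run time, this needs dynamic SQL; a common pattern is a function returning a REF CURSOR opened over the (sanity-checked) table name. A minimal sketch:

CREATE OR REPLACE FUNCTION get_all_rows (p_table IN VARCHAR2)
  RETURN SYS_REFCURSOR
IS
  l_rc SYS_REFCURSOR;
BEGIN
  OPEN l_rc FOR
    'SELECT * FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table);   -- guards against injection
  RETURN l_rc;
END;
/

In SQL*Plus it can be tested with VARIABLE rc REFCURSOR, EXEC :rc := get_all_rows('EMP') and PRINT rc, which would list the 14 rows.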
View 3 Replies
View Related
Aug 19, 2010
I'm selecting a set of records from one table, for example ID, description and date. From this I only want the latest inserted row per ID. I've used the MAX function on the date, which is fine; however, some records have had their description changed, so I get two values for one ID: the max date for the original description and the max date for the changed description.
I'm getting:
ID |Description |Date
1 ABC 01/01/2010
2 XYZ 02/03/2010
2 XYZ1 03/05/2010
When I want:
ID |Description |Date
1 ABC 01/01/2010
2 XYZ1 03/05/2010
As ID 2 with XYZ1 Description is the very latest row for that ID.
This is an audit table so the ID appears on numerous rows as it a composite key with date.
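One way that keeps only the most recent row per ID, regardless of description changes, is to rank by date within each ID instead of grouping by description. A hedged sketch (the table and date column names are placeholders, since DATE itself is a reserved word):

SELECT id, description, change_date
FROM  (SELECT id, description, change_date,
              ROW_NUMBER() OVER (PARTITION BY id ORDER BY change_date DESC) AS rn
       FROM   audit_t)
WHERE  rn = 1;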
View 1 Replies
View Related
Mar 11, 2013
We have a table which maintains log records of a GL table. I don't know how much data exists in that table, but the problem is that counting all the records is taking too much time.
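If an approximate figure is enough, the optimizer statistics already hold a row count, which avoids scanning the log table at all; it is only as fresh as the last statistics gathering. A sketch (table name assumed):

SELECT num_rows, last_analyzed
FROM   user_tables
WHERE  table_name = 'GL_LOG';

An exact COUNT(*) can often be satisfied from a small index on a NOT NULL column rather than the full table, which is usually noticeably faster.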
View 11 Replies
View Related
Aug 24, 2012
Oracle Database Version is 10g ( 10.2.0.4.0)
I have two tables, emp_perm & employee.
1) Table emp_perm is a lookup table with 2 columns, temp VARCHAR2 & perm VARCHAR2.
Table emp_perm has the below data in it:
temp perm
1064885 0349034
0099982 7399982
6476456 9170346
5252525 5252525
2) Now table employee might have the following combinations of rows.
If records for both the temp file number (1064885) and the perm file number (0349034) exist in the employee table, delete the temp file number records (1064885) and keep the permanent file number (0349034) records as they are.
If records for only the temp file number (0099982) exist and the corresponding perm file (7399982) does not exist in the employee table, convert these records, i.e. update 0099982 to its perm file number 7399982 using the lookup table emp_perm.
If records for only the perm file number (9170346) exist and the corresponding temp file number (6476456) does not, do not amend the employee table, as we are interested in perm file numbers.
If the file number exists in the lookup table emp_perm and both values are equal (5252525), do not amend the employee table, as the temp and perm file numbers are the same.
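A hedged sketch of the two data changes described, assuming the employee table holds the value in a single column called file_number (an assumed name); the last two cases need no action, and running the delete before the update keeps the two cases from interfering:

-- Case 1: both temp and perm records exist -> remove the temp ones
DELETE FROM employee e
WHERE  EXISTS (SELECT 1
               FROM   emp_perm m
               WHERE  m.temp = e.file_number
               AND    m.temp <> m.perm
               AND    EXISTS (SELECT 1
                              FROM   employee e2
                              WHERE  e2.file_number = m.perm));

-- Case 2: only the temp record remains -> convert it to the perm number
UPDATE employee e
SET    e.file_number = (SELECT m.perm
                        FROM   emp_perm m
                        WHERE  m.temp = e.file_number)
WHERE  EXISTS (SELECT 1
               FROM   emp_perm m
               WHERE  m.temp = e.file_number
               AND    m.temp <> m.perm);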
View 5 Replies
View Related
Nov 15, 2010
I need to get the timestamp for all the existing records in my table. I have one user-defined field; is this possible?
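If the table has no timestamp column of its own, one option is the ORA_ROWSCN pseudocolumn converted to a time; note it is approximate (block-level unless the table was created with ROWDEPENDENCIES) and SCN_TO_TIMESTAMP only works for reasonably recent changes. A sketch:

SELECT t.*,
       SCN_TO_TIMESTAMP(ORA_ROWSCN) AS approx_last_change
FROM   my_table t;    -- table name assumed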
View 4 Replies
View Related
Feb 2, 2012
I'm working with Oracle 10g.
I have a table like this;
ID Amount Date
123 5000 Oct-07-2011
123 null Oct-09-2011
124 7000 Oct-14-2011
124 null Oct-17-2011
124 null Oct-24-2011
What I'm trying to do here is loop through the records and update each null amount with the previous amount for the same ID.
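A set-based alternative to looping is to compute the last non-null amount per ID with an analytic function and merge it back by rowid. A hedged sketch, assuming the table is called payments and the date column is named pay_date:

MERGE INTO payments p
USING (SELECT rowid AS rid,
              LAST_VALUE(amount IGNORE NULLS)
                OVER (PARTITION BY id ORDER BY pay_date) AS carried_amount
       FROM   payments) s
ON (p.rowid = s.rid)
WHEN MATCHED THEN
  UPDATE SET p.amount = s.carried_amount
  WHERE p.amount IS NULL;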
View 3 Replies
View Related
Apr 9, 2008
How do I validate that the sum of some records in a table equals an exact value?
I want to guarantee that SUM(val_column) = 100 at the database level. Check constraints are out of the question. A before-update trigger would block any update to that column that increments one row and decrements another (unless done with a FORALL).
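One declarative way to enforce a cross-row rule like this is a fast-refresh-on-commit materialized view holding the sum, with a check constraint on the MV: the rule is then verified at commit time, so individual statements inside a transaction may move the amounts around freely. A hedged sketch, assuming the table is t(grp, val_column) and the rule applies per grp:

CREATE MATERIALIZED VIEW LOG ON t
  WITH ROWID, SEQUENCE (grp, val_column) INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW mv_t_sum
  REFRESH FAST ON COMMIT
AS
SELECT grp,
       SUM(val_column)   AS total,
       COUNT(val_column) AS cnt_val,   -- needed for fast refresh of the SUM
       COUNT(*)          AS cnt
FROM   t
GROUP  BY grp;

ALTER TABLE mv_t_sum
  ADD CONSTRAINT chk_total_100 CHECK (total = 100);

Any transaction whose net effect leaves a group's total different from 100 then fails at COMMIT with the constraint violation.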
View 3 Replies
View Related
Apr 29, 2013
Consider tables A, B, C, D, E, F, all having 100000+ records. Tables B, C, D are dependent on table A (with foreign key constraints). When I delete records from all the tables, tables B, C, D take at most 30-40 seconds while table A takes 30-40 minutes. All tables have indexes.
Method I have used:
1. Created Temp table
2. Then deleted all records from B, C, D, E, F matching the records in the temp table, in batches of 500:
delete from B where exists (select 1 from temp where b.col1=temp.col1);
3. Why is it taking so much time to delete records from table A? (A possible cause is sketched below.)
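A common cause of exactly this pattern is unindexed foreign key columns on the child tables: for every parent row deleted from A, Oracle has to probe each child to confirm no referencing rows remain, and without an index on the FK column that probe is a full scan. A hedged sketch of a (simplified, single-column) check and the fix:

-- list single-column foreign keys whose column is not the leading column of any index
SELECT c.table_name, cc.column_name
FROM   user_constraints  c
JOIN   user_cons_columns cc ON cc.constraint_name = c.constraint_name
WHERE  c.constraint_type = 'R'
AND    NOT EXISTS (SELECT 1
                   FROM   user_ind_columns ic
                   WHERE  ic.table_name      = cc.table_name
                   AND    ic.column_name     = cc.column_name
                   AND    ic.column_position = 1);

-- then index the reported columns, for example:
CREATE INDEX b_fk_a_idx ON b (col1);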
View 5 Replies
View Related
Sep 28, 2011
Below is my requirement,
Source Table: SRC
COL1 DATE_CREATED CREATED_BY
1 27-SEP-2011 GURU
1 28-SEP-2011 SANKAR
Target Table:TGT
COl1 DATE_CREATED CREATED_BY
1 28-SEP-2011 SANKAR
I need to take the MAX of date_created record and store it in target.
I tried,
select max(date_created), col1, created_by from src
group by col1, created_by,date_created
This gives me 2 records, which I don't want. How do I get only one record out of that source table?
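To collapse to one row per col1 while still carrying the created_by of the latest row, one option is the KEEP (DENSE_RANK LAST) aggregate, which picks the non-grouped column from the row with the greatest date_created. A sketch:

SELECT col1,
       MAX(date_created) AS date_created,
       MAX(created_by) KEEP (DENSE_RANK LAST ORDER BY date_created) AS created_by
FROM   src
GROUP  BY col1;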
View 2 Replies
View Related
Jun 13, 2011
We have a table into which we insert 100 records daily. How can we find the records inserted recently, or today, from that table?
View 3 Replies
View Related
Nov 4, 2009
CREATE TABLE "SCOTT"."SEATALLOTMENT"
("YEAR" NUMBER(4,0),
"COLLEGECODE" CHAR(4 BYTE),
"COURSECODE" CHAR(3 BYTE),
[Code].....
Now I want to UPDATE the COURSESEATS table, reducing the AVAILABLE column by 1, based on the common columns collegecode and coursecode, for a row inserted into the SEATALLOTMENT table. I am confused about which approach to follow, whether it should be a procedure or a trigger.
CASE:
Here, as I insert a row with 'krcl' and 'cse' as the college code and course code respectively into the SEATALLOTMENT table, the AVAILABLE column in the COURSESEATS table for the row with the matching common columns must go from 60 to 59.
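A row-level trigger fits this, since the decrement has to happen for each inserted allotment row; a procedure would also work, but the caller would have to remember to invoke it. A minimal sketch built from the columns described above:

CREATE OR REPLACE TRIGGER trg_seatallotment_ai
AFTER INSERT ON seatallotment
FOR EACH ROW
BEGIN
  UPDATE courseseats c
  SET    c.available   = c.available - 1
  WHERE  c.collegecode = :NEW.collegecode
  AND    c.coursecode  = :NEW.coursecode;
END;
/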
View 5 Replies
View Related
Jul 29, 2013
I need to update more than 1 million records in a table. I am using Oracle Warehouse Builder to do this job; it has been running for more than 3 hours. I don't have any indexes on this table.
View 11 Replies
View Related
Feb 1, 2013
I need to insert 999999 records into one table. I have already written the query below:
DECLARE
TYPE tt_type IS TABLE OF table_name.no%TYPE
INDEX BY BINARY_INTEGER;
tt_type_no tt_type;
rec_count NUMBER;
BEGIN
[code]...
But it took 5 minutes to execute... Is there any other way to insert the rows faster?
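For generated data, a single set-based INSERT is usually much faster than filling and looping over a collection; a hedged sketch that generates the 999999 values on the fly (table and column names are taken from the snippet above, and the APPEND hint for a direct-path load is optional):

INSERT /*+ APPEND */ INTO table_name (no)
SELECT LEVEL
FROM   dual
CONNECT BY LEVEL <= 999999;

COMMIT;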
View 7 Replies
View Related
Nov 21, 2012
I am writing a query to give me all CustomerIDs in the report. Below are the tables I am trying to build the query from.
1. Customer table:
Custid (pk)
memberid
fname
lname
bname
2. Cust_cokeid table:
Cokeid (pk, fk)
Custid (pk, fk)
3. Cokeid table:
Cokeid (pk)
[code]....
Here is the query I have wrote:
SELECT customers.custid AS CUSTID,
customers.memberid AS MEMBERID,
customers.bname AS BNAME,
drpepper_rebate.drpepid AS DRPEPID,
drpepper_rebate.totalcarb AS TOTALCARB,
drpepper_rebate.totalncb AS TOTALNCB,
[code]....
I have more than 700 customer records in the Customers table, but so far I can only pull 500 records.
The reason I am trying to pull all the records from the customers table is that I want to find out who is missing a cokeid and a drpepid.
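Keeping every customer, including those with no coke or Dr Pepper rows, needs outer joins with CUSTOMERS as the driving table; a plain inner join silently drops the unmatched customers, which is the usual reason for getting 500 rows instead of 700. A hedged sketch (the DRPEPPER_REBATE join column is an assumption):

SELECT c.custid, c.memberid, c.bname,
       k.cokeid,
       d.drpepid, d.totalcarb, d.totalncb
FROM   customers c
LEFT   JOIN cust_cokeid     k ON k.custid = c.custid
LEFT   JOIN drpepper_rebate d ON d.custid = c.custid    -- assumed join column
-- WHERE  k.cokeid IS NULL OR d.drpepid IS NULL         -- uncomment to list only the gaps
;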
View 7 Replies
View Related
Nov 8, 2010
I have deleted all the records from the table and I have committed. Now I want to get all the records back.
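Provided the undo for that delete is still available, a flashback operation can bring the rows back without a restore; FLASHBACK TABLE needs row movement enabled. A hedged sketch with a placeholder table name and time window:

ALTER TABLE my_table ENABLE ROW MOVEMENT;

FLASHBACK TABLE my_table TO TIMESTAMP (SYSTIMESTAMP - INTERVAL '30' MINUTE);

-- alternatively, re-insert the rows from a flashback query:
INSERT INTO my_table
SELECT * FROM my_table AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '30' MINUTE);

If too much time has passed and the undo has been overwritten, the remaining options are recovery-based (e.g. point-in-time recovery) rather than plain SQL.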
View 16 Replies
View Related
Jun 14, 2013
I want to bulk insert records into the table: date rows for 50 years (from year 2001 to year 2050). I have the following columns in my table:
YYYYMMDD MM/DD/YYYY Day of the week ( Monday, Tuesday etc) JulianDate
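One way to generate the rows without a loop is a row generator over the date range, deriving all four columns from a single running date (the Julian date via the 'J' format model). A hedged sketch with assumed table and column names:

INSERT INTO date_dim (yyyymmdd, date_us, day_name, julian_date)
SELECT TO_CHAR(d, 'YYYYMMDD'),
       TO_CHAR(d, 'MM/DD/YYYY'),
       TO_CHAR(d, 'fmDay'),
       TO_CHAR(d, 'J')
FROM  (SELECT DATE '2001-01-01' + LEVEL - 1 AS d
       FROM   dual
       CONNECT BY DATE '2001-01-01' + LEVEL - 1 <= DATE '2050-12-31');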
View 5 Replies
View Related
Apr 13, 2011
I need to delete duplicate records from a table (indeed there are multiple duplicates).
Table Data
ID  Group  Qty
1   KK     30
1   KK     0
1   KK     19
2   AA     0
2   AA     30
3   AA     0
3   AA     30
3   AA     30
3   AA     9
My aim is to delete the duplicates from the above data, with the conditions below.
1) First the record with value 30 and then the record with value 0.
2) If there are 3 duplicate records, e.g. ID 1 and group KK, then I have to delete both the 30 and 0 qty records.
3) If there are more than 3 duplicate records, e.g. ID 3 and group AA, then I have to delete all the records with qty value 30 or 0.
I have written a query like below.
SELECT id,
unit,
RANK ()
OVER (PARTITION BY id, unit
ORDER BY id, unit)
num
FROM temp;
With the above query, I am unable to mark these duplicates correctly.
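Under one reading of the rules (within each ID/Group combination, the rows with Qty 30 or 0 are disposable whenever the combination has three or more rows), a sketch of the delete; the Group column is assumed to be named grp since GROUP is a reserved word, and the two-row case from condition 1 is left untouched until its intended outcome is clearer:

DELETE FROM temp t
WHERE  t.qty IN (0, 30)
AND    (t.id, t.grp) IN (SELECT id, grp
                         FROM   temp
                         GROUP  BY id, grp
                         HAVING COUNT(*) >= 3);

Against the sample data this removes the 30 and 0 rows for ID 1/KK (keeping 19) and the 0/30/30 rows for ID 3/AA (keeping 9).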
View 10 Replies
View Related