SQL & PL/SQL :: Global Temporary Table And Autonomous Transaction?
Jul 15, 2013
The current flow works this way:
Procedure A extracts and filters some data from the DW; this data is stored in the Global Temporary Table. Another procedure, named B, uses the data from the Global Temporary Table and stores it in a normal table by calling a third procedure, named X, that merges the data from the Global Temporary Table into the normal table (inserting rows that don't exist and updating some fields of rows that do).
(X isn't important in the new flow.)
Now I need to add some steps to the normal flow:
Procedure A extracts and filters some data from the DW; this data is stored in the Global Temporary Table. Another procedure, named B, uses the data from the Global Temporary Table and stores it in a normal table. Using the data from the Global Temporary Table, I must store some fields in another normal table; for this I use a procedure named C that merges the data from the Global Temporary Table into that normal table, and I must commit at this point. X then merges the data from the Global Temporary Table and the data from the normal table populated by procedure C into yet another normal table (inserting rows that don't exist and updating rows that do).
The issue I'm facing is that I can't use C to merge and commit, because the commit truncates the data in the Global Temporary Table. I can't change the ON COMMIT DELETE ROWS option, because other procedures use this Global Temporary Table in production.
Before you ask: I tried using AUTONOMOUS_TRANSACTION on C and it didn't work, because C can't find any data in the Global Temporary Table (I ran a SELECT COUNT from the Global Temporary Table inside C). I think this is because of the autonomous transaction: with ON COMMIT DELETE ROWS the rows belong to the inserting transaction, and an autonomous transaction is a separate transaction, so it sees none of them. So what can I do? I traced the session number in C and in A, and with the autonomous transaction it is the same, so it isn't a session-change problem.
I need: a commit in procedure C without truncating the Global Temporary Table, because I need that data for X.
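For what it's worth, the behaviour is easy to reproduce. A minimal sketch (the table and procedure names are made up): with ON COMMIT DELETE ROWS the rows are bound to the inserting transaction, so an autonomous transaction, even in the same session, sees none of them.

CREATE GLOBAL TEMPORARY TABLE gtt_demo ( n NUMBER ) ON COMMIT DELETE ROWS;

CREATE OR REPLACE PROCEDURE c_demo AS
  PRAGMA AUTONOMOUS_TRANSACTION;
  l_cnt NUMBER;
BEGIN
  SELECT COUNT(*) INTO l_cnt FROM gtt_demo;  -- sees no rows: they belong to the parent transaction
  dbms_output.put_line('rows visible in C: ' || l_cnt);
  COMMIT;  -- commits only the autonomous transaction
END;
/

BEGIN
  INSERT INTO gtt_demo VALUES (1);  -- parent transaction owns this row
  c_demo;                           -- prints 0, even though it is the same session
END;
/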
What is the best option for a GLOBAL TEMPORARY TABLE?
1) Create the GLOBAL TEMPORARY TABLE with ON COMMIT DELETE ROWS, and wherever it is used for a calculation, commit at the end of the procedure.
CREATE GLOBAL TEMPORARY TABLE gtt_test ( A NUMBER ) ON COMMIT DELETE ROWS;
CREATE OR REPLACE PROCEDURE my_proc ( p_in in number) as begin
[Code]....
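Since the body was elided, here is a minimal sketch of what option 1 might look like (the body is illustrative only): populate the GTT, do the calculation, and commit at the end, which empties the ON COMMIT DELETE ROWS table.

CREATE OR REPLACE PROCEDURE my_proc ( p_in IN NUMBER ) AS
  l_cnt NUMBER;
BEGIN
  INSERT INTO gtt_test ( a ) VALUES ( p_in );
  SELECT COUNT(*) INTO l_cnt FROM gtt_test;  -- the calculation goes here
  COMMIT;  -- this commit deletes the rows in gtt_test
END;
/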
2) Create the GLOBAL TEMPORARY TABLE without ON COMMIT DELETE ROWS (i.e., with ON COMMIT PRESERVE ROWS; note that omitting the clause entirely defaults to ON COMMIT DELETE ROWS), and wherever it is used, first DELETE FROM / TRUNCATE the temp table and then use it.
CREATE GLOBAL TEMPORARY TABLE gtt_test ( A NUMBER ) ON COMMIT PRESERVE ROWS;
CREATE OR REPLACE PROCEDURE my_proc ( p_in in number)
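Again the body was elided; a minimal sketch of option 2 (illustrative only): clear whatever a previous call in this session left behind, then use the table.

CREATE OR REPLACE PROCEDURE my_proc ( p_in IN NUMBER ) AS
BEGIN
  DELETE FROM gtt_test;  -- or EXECUTE IMMEDIATE 'TRUNCATE TABLE gtt_test';
  INSERT INTO gtt_test ( a ) VALUES ( p_in );
  -- ... use the data; with ON COMMIT PRESERVE ROWS it survives commits
END;
/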
I have created global temporary tables to be used in my stored procedure, in order to view reports which I created in JASPER. Since global temporary tables are session-based, when multiple users try to generate the report, every user gets inconsistent data.
To make it clear: if user A tries to view a report with some filter criteria and, simultaneously, user B tries to generate the same report with other filter criteria, user A gets user B's report data and user B gets user A's. How can we avoid this problem?
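One common cause worth checking (an assumption, since the middle tier isn't shown): GTT rows are private to a database session, so the usual way user A ends up seeing user B's rows is connection pooling, where two requests reuse the same session with leftover rows. A defensive sketch (the table and procedure names are made up) is to clear the GTT at the start of every request:

CREATE GLOBAL TEMPORARY TABLE report_gtt ( val VARCHAR2(100) )
ON COMMIT PRESERVE ROWS;

CREATE OR REPLACE PROCEDURE p_report ( p_filter IN VARCHAR2 ) AS
BEGIN
  DELETE FROM report_gtt;  -- discard rows left by a previous user of this pooled session
  INSERT INTO report_gtt ( val )
    SELECT object_name FROM user_objects WHERE object_name LIKE p_filter;
END;
/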
create or replace procedure p_populate_gtt as
begin
  insert into gtt
    select last_name, first_name, null
      from funcdemo
     where rownum < 51;
  update gtt set vote = 100 where ln = 'Tim';
end;
/
gtt is my global temp table. I am updating the vote column, which is null, to 100, but I am not able to update it.
create or replace procedure test as
  stmt varchar2(2000);
begin
  execute immediate 'CREATE GLOBAL TEMPORARY TABLE tt_Local (ID VARCHAR2(38)) ON COMMIT PRESERVE ROWS';
  stmt := 'INSERT INTO tt_Local SELECT cardnumber FROM cards';
  execute immediate stmt;
end;
/
When I'm trying to execute this:
begin test; end;
it fails with ORA-01031: insufficient privileges.
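A standard cause of ORA-01031 here: inside a definer's-rights stored procedure, privileges granted through roles are disabled, so CREATE TABLE must be granted directly to the owning user. A sketch (the user name is a placeholder):

GRANT CREATE TABLE TO your_user;  -- run as a DBA; a grant via a role is not enough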
1. Insert all data from the global table into the base table. 2. Update all data (that means: retrieve all data from the base table into the global table and update that data). Question: how do I insert and update from a global temporary table?
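A MERGE statement does both in one pass. A minimal sketch (table and column names are illustrative):

MERGE INTO base_table b
USING global_tmp g
ON ( b.id = g.id )
WHEN MATCHED THEN
  UPDATE SET b.val = g.val
WHEN NOT MATCHED THEN
  INSERT ( id, val ) VALUES ( g.id, g.val );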
I have created a package named pkg_pur_order which consists of a function and a procedure. I have declared the procedure as an AUTONOMOUS_TRANSACTION, but whenever I try to execute it, it fails with this error message:
exec pkg_pur_order.prc_orders
ORA-06519: active autonomous transaction detected and rolled back
ORA-06512: at "DBO.PRC_WRITE_LOG", line 13
ORA-06512: at "DBO.PKG_PUR_ORDER", line 36
ORA-00001: unique constraint (DBO.SYS_C00138632) violated
ORA-06512: at line 1
[code]....
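The root error is the ORA-00001 constraint violation; ORA-06519 then appears because the autonomous procedure (PRC_WRITE_LOG) exited without committing or rolling back. An autonomous transaction must end with COMMIT or ROLLBACK on every path, including the error path. A sketch of the pattern (the body is illustrative, not the actual DBO code):

CREATE OR REPLACE PROCEDURE prc_write_log ( p_msg IN VARCHAR2 ) AS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO order_log ( msg ) VALUES ( p_msg );  -- hypothetical log table
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;  -- e.g. when the insert hits ORA-00001
    RAISE;
END;
/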
I have a complex SQL query that fetches 88k records. The query uses a global temporary table which is a replica of one of our permanent tables. When I do CREATE TABLE ... AS SELECT using this query, it inserts only a fraction of the records. But when I point the query at the permanent table, it inserts all 88k records.
1. I tried running the SELECT query separately against the temp and the permanent table. Both retrieve 88k records.
2. From debugging I found that the problem occurred when we performed a left outer join on an inline view.
However, the problem was resolved when I used the /*+ FIRST_ROWS */ hint.
From my limited Oracle knowledge, I assume it is a problem with the query and how it is processed in memory.
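Another thing worth ruling out (an assumption on my part, not a diagnosis): a GTT carries no optimizer statistics by default, so the plan against it can differ sharply from the plan against the permanent table. Giving the optimizer cardinality information is cheap to try:

BEGIN
  dbms_stats.gather_table_stats(user, 'MY_GTT');  -- 'MY_GTT' is a placeholder name
END;
/
-- or let the optimizer sample the GTT at parse time with a hint such as:
-- SELECT /*+ dynamic_sampling(g 2) */ ... FROM my_gtt g ...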
I have two databases: DB1 for the EBS database and DB2 for the Portal database. DB2 is always up.
DB1 uses some global temporary tables to write and store session-level information.
I also have a secondary (standby) database for DB1. Whenever DB1 is down and its secondary database is up, my requirement is to enable write operations on these global temporary tables. Since we open the secondary database in read-only mode, I can't write to these GTTs.
DB2 is always up, so I want to create copies of these GTTs in the DB2 Portal database. Is there any harm in doing this?
Is there any harm in storing session-level information of the DB1 database in the DB2 database through a DB link?
There's an Application Express application which is based on a schema named TRAFOGLED. In order to let users test new features, there's a test application (Apex has export/import capabilities; no problem there) which is based on another schema named TRAFOTEST.
I'd like to export TRAFOGLED and import it into TRAFOTEST. I'm using the 10gR2 EXPDP utility with a parameter file. Everything seems to be OK, except that I'm unable to export global temporary tables (GTTs). How can I tell? I didn't see them after the import!
These are my GTTs:
SQL> show user
USER is "TRAFOGLED"
SQL> select table_name from user_tables where temporary = 'Y';
[code]...
C:\TEMP>
No tables were exported. Certainly, I don't expect any data to be exported, but I'd be happy with the CREATE TABLE statements so that I don't have to create these tables separately.
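One workaround sketch: pull the DDL yourself with DBMS_METADATA while connected as TRAFOGLED, then run the statements in TRAFOTEST:

SET LONG 100000 PAGESIZE 0
SELECT dbms_metadata.get_ddl('TABLE', table_name)
  FROM user_tables
 WHERE temporary = 'Y';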
Let's say we have Table A and we would like to replicate specific row transactions to Table B.
Here are the rows in *Table A* (time: let's say 15:00):
A1 - just updated @15:00
A2 - just inserted @15:01
A3
B1 - daily delete row, i.e. just deleted a while back - non-scheduled process, executed by the application @15:02
B2 -
B3 - daily delete row, i.e. just deleted a while back - non-scheduled process, executed by the application @15:05
B4 - just recently purged (as part of the 180-day purge) - scheduled process, executed by the operations team @15:10
B5 - just recently purged (as part of the 180-day purge) - scheduled process, executed by the operations team @15:10
B6 - just recently purged (as part of the 180-day purge) - scheduled process, executed by the operations team @15:10
Current data in Table B (before replication) @15:00:
A1 (without updates), A3, B1, B2, B3, B4, B5, B6
Expected rows in Table B (via replication / snapshot / materialized view / any other method):
*Replication at 15:30* - Table B is read-only.
Expected rows after replication:
A1 -- newly updated details
A2 -- newly inserted row
A3
B1 - daily delete row is expected to be replicated
B2
B3 - daily delete row is expected to be replicated
***Note: row B4 is not expected to be replicated to Table B.
Questions:
1) How can we get updates, inserts and daily deletes replicated while ignoring large purges? (One sketch for this follows below.)
2) How can large purge changes be reflected in the replicated tables as well, without deleting the daily deletes?
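One technique sketch for question 1 (every name here is made up, and this is not the only way): have the purge job raise a session flag that a replication trigger on Table A checks, so scheduled purges are skipped while application deletes are propagated.

CREATE OR REPLACE PACKAGE repl_flags AS
  bulk_purge BOOLEAN := FALSE;  -- the purge job sets this to TRUE for its session
END;
/

CREATE OR REPLACE TRIGGER trg_replicate_delete
AFTER DELETE ON table_a
FOR EACH ROW
BEGIN
  IF NOT repl_flags.bulk_purge THEN
    DELETE FROM table_b WHERE id = :old.id;  -- replicate the daily delete
  END IF;                                    -- purge deletes are ignored
END;
/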
PROCEDURE Return_Summary(Pi_WX IN dbms_sql.varchar2_table,
                         Po_WX OUT SYS_REFCURSOR) IS
BEGIN
  FOR i IN 1 .. Pi_WX.count LOOP
    /* I need to put these results into a temp table or a table object.
       Can I use a temp table for this, or is there another recommended
       method? The loop might execute at most 10 times, and each run
       might return 100-200 records. */
    SELECT WX_NM, WX_NUM
      FROM TAB A, TAB B, TAB C
     WHERE A.KEY1 = B.KEY1
       AND B.KEY1 = C.KEY1
       AND C.WX = Pi_WX(i);
  END LOOP;
END;
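One approach sketch (table names are illustrative, since the original TAB aliases were unclear): accumulate each iteration's 100-200 rows into a GTT, then open the OUT ref cursor on it. At these volumes a PL/SQL collection with the TABLE() operator would also work.

CREATE GLOBAL TEMPORARY TABLE summary_gtt ( wx_nm VARCHAR2(100), wx_num NUMBER )
ON COMMIT DELETE ROWS;

CREATE OR REPLACE PROCEDURE return_summary (
  pi_wx IN  dbms_sql.varchar2_table,
  po_wx OUT SYS_REFCURSOR ) IS
BEGIN
  FOR i IN 1 .. pi_wx.COUNT LOOP
    INSERT INTO summary_gtt ( wx_nm, wx_num )
      SELECT a.wx_nm, b.wx_num
        FROM tab_a a, tab_b b, tab_c c
       WHERE a.key1 = b.key1
         AND b.key1 = c.key1
         AND c.wx = pi_wx(i);
  END LOOP;
  OPEN po_wx FOR SELECT wx_nm, wx_num FROM summary_gtt;
END;
/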
This creates a table webpen with around 54107 rows. What I want is that every time I run "select * from webpen", it should run the above query and give the result as per the current values in the main tables M_PENSIONER and M_PEN_DCRG_WITHHELD.
In other words, it should truncate the existing values and insert the values produced by running the above-mentioned query.
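"Run the above query every time webpen is selected" is exactly what a view does. A sketch (the original query was elided, so a trivial stand-in is shown; the existing webpen table would have to be dropped first):

CREATE OR REPLACE VIEW webpen AS
  SELECT p.*               -- put the original query against
    FROM m_pensioner p;    -- M_PENSIONER and M_PEN_DCRG_WITHHELD here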
I am migrating procedures from MS SQL Server to Oracle. In MS SQL Server, an INSERT statement may use the result set of a stored procedure, or of a dynamic EXECUTE statement, in place of the VALUES clause. The construction is similar to INSERT ... SELECT, except that we deal with EXEC instead of SELECT. The EXEC part must return exactly one result set, with column types equivalent to the types of the table's columns. With a stored procedure we can pass in parameters, use the EXEC('string') form, and even call extended procedures or remote procedures on other servers. By calling a remote procedure on another server that places data in a temporary table, and then joining with the available data, we can build distributed joins. For example, I want to insert the results of the stored procedures sp_configure and proc_obj into a temporary table:
1) INSERT #konfig EXEC sp_configure
2) CREATE PROCEDURE proc_test @Object_ID int
AS
SET XACT_ABORT ON
BEGIN TRAN
CREATE TABLE #testObjects ( Object_ID int NOT NULL )
INSERT #testObjects EXEC proc_obj @Object_ID, 3, 1
COMMIT TRAN
RETURN(0)
go
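In Oracle, the closest equivalent to INSERT ... EXEC is a pipelined table function queried with INSERT ... SELECT. A minimal sketch (all names are illustrative, not a migration of the actual proc_obj):

CREATE OR REPLACE TYPE t_obj_row AS OBJECT ( object_id NUMBER );
/
CREATE OR REPLACE TYPE t_obj_tab AS TABLE OF t_obj_row;
/
CREATE OR REPLACE FUNCTION f_obj ( p_object_id IN NUMBER )
  RETURN t_obj_tab PIPELINED AS
BEGIN
  PIPE ROW ( t_obj_row(p_object_id) );  -- produce rows the way proc_obj would
  RETURN;
END;
/

-- the INSERT ... EXEC equivalent:
INSERT INTO test_objects ( object_id )
SELECT object_id FROM TABLE( f_obj(42) );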
I have approximately 1200 transactions to be updated into a master table. There are other columns in the master table, but only one column is being updated. I would like to use SQL*Loader if possible, or any other efficient means. The 1200 records are stored in an Excel spreadsheet; col1 of the spreadsheet has to match col1 of the master table in order to update col2 from the spreadsheet. Here is an example of the data. My operating system is HP-UX and the database is Oracle 10g.
Master table:
col1   col2    col3     col4
4238           susan    56e
5879   h698c   rich     12g
7091           joyce    34b
0876           mike     25n
7501   k956b   robert   87c
9498           angela   67r
3645           doris    92y
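One way that fits HP-UX/10g (a sketch; the file, table, and column names are assumptions): save the spreadsheet as CSV, load it into a staging table with SQL*Loader, then MERGE into the master on col1.

-- stage_updates.ctl
LOAD DATA
INFILE 'updates.csv'
INTO TABLE stage_updates
FIELDS TERMINATED BY ','
( col1, col2 )

-- then, in SQL*Plus:
MERGE INTO master_table m
USING stage_updates s
ON ( m.col1 = s.col1 )
WHEN MATCHED THEN
  UPDATE SET m.col2 = s.col2;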
create table my_rows (
  my_env varchar2(100),
  a      number(2),
  b      number(2)
)
/
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
[code]....
The first row means that the value 10 represents 40% within the couple (10, 20). Meaning: if I have 100 rows with the couple (10, 20), 40 rows will be marked with the value 10 and 60 will be marked with the value 20. To do this, I used to create a temporary table with the same structure as the my_rows table plus a new column, "the_value", and I used to update this new column with a PL/SQL FOR loop. But I think it is doable in a single SQL statement.
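It is indeed doable in one SQL statement with analytic functions. A sketch (it assumes the 40/60 split is a known fraction, hard-coded here as 0.4): number the rows within each (my_env, a, b) group and give the first 40% the value of a, the rest the value of b.

SELECT my_env, a, b,
       CASE
         WHEN ROW_NUMBER() OVER (PARTITION BY my_env, a, b ORDER BY ROWNUM)
              <= 0.4 * COUNT(*) OVER (PARTITION BY my_env, a, b)
         THEN a
         ELSE b
       END AS the_value
  FROM my_rows;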