Streams :: Synchronizing Tables In Two Different DB
Jan 28, 2013
We have a core banking database that includes our customer-related tables, and these tables are really huge.
We also have another database for some applications, and this database needs fresh customer data that lives in the core banking database. Our current method is materialized views, but we have performance problems with it.
Which option would give better performance for synchronizing the tables between the databases?
- Oracle Streams?
- Triggers?
- DBMS_COMPARISON?
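If Streams ends up being the choice, a single call to DBMS_STREAMS_ADM.MAINTAIN_TABLES can generate a complete one-way setup (capture, propagation, apply and instantiation). The sketch below is only illustrative: the table, directory-object and database names are made up, and it assumes a configured Streams administrator plus a database link from the source to the destination.
{code}
-- Minimal sketch (hypothetical names) of one-way table-level Streams
-- replication, run as the Streams administrator. Directory objects are used
-- for the Data Pump instantiation that MAINTAIN_TABLES performs by default.
BEGIN
  DBMS_STREAMS_ADM.MAINTAIN_TABLES(
    table_names                  => 'BANKCORE.CUSTOMERS',
    source_directory_object      => 'SRC_DPUMP_DIR',
    destination_directory_object => 'DST_DPUMP_DIR',
    source_database              => 'COREDB',
    destination_database         => 'APPDB');
END;
/
{code}
Whether this beats a fast-refresh materialized view depends mostly on change volume, so it is worth testing both against the real workload.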
View 3 Replies
Jul 20, 2009
We are using Oracle Streams for replication.
Column datatypes in some of the tables differ between the source and destination (NUMBER on the source, VARCHAR2 on the destination).
Do we have to create a rule or DML handler to handle this, or will Streams take care of it automatically? We are on Oracle 10g.
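I am not certain the apply process will convert NUMBER to VARCHAR2 on its own; if it does not, a custom rule-based transformation (or a DML handler) that rewrites the column in the row LCR is a common workaround. A rough sketch, with a made-up function name, rule name and column name:
{code}
-- Hypothetical sketch: convert a NUMBER column in the row LCR to VARCHAR2
-- before it reaches the destination table.
CREATE OR REPLACE FUNCTION strmadmin.num_to_char_transform(in_any IN ANYDATA)
  RETURN ANYDATA
IS
  lcr     SYS.LCR$_ROW_RECORD;
  rc      PLS_INTEGER;
  old_val ANYDATA;
  num_val NUMBER;
BEGIN
  IF in_any.GETTYPENAME = 'SYS.LCR$_ROW_RECORD' THEN
    rc := in_any.GETOBJECT(lcr);
    old_val := lcr.GET_VALUE('NEW', 'AMOUNT');   -- column name is an assumption
    IF old_val IS NOT NULL THEN
      rc := old_val.GETNUMBER(num_val);
      lcr.SET_VALUE('NEW', 'AMOUNT', ANYDATA.CONVERTVARCHAR2(TO_CHAR(num_val)));
    END IF;
    RETURN ANYDATA.CONVERTOBJECT(lcr);
  END IF;
  RETURN in_any;
END;
/

-- Attach the transformation to the relevant capture (or apply) rule.
BEGIN
  DBMS_STREAMS_ADM.SET_RULE_TRANSFORM_FUNCTION(
    rule_name          => 'strmadmin.customers_rule',          -- assumed rule name
    transform_function => 'strmadmin.num_to_char_transform');
END;
/
{code}
The same conversion could equally be done in a DML handler on the apply side.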
View 1 Replies
Nov 1, 2013
A Streams apply process which applies to a SQL Server database keeps increasing its PGA use until I stop the process and restart it. I need to stop it once every week or it will use too much PGA and the database will hang, causing paging, etc.
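To confirm it really is the apply server sessions that are growing, something along these lines can be polled between restarts (the apply name is a placeholder):
{code}
-- Sketch: PGA used by the apply server process(es); APPLY_NAME is assumed.
SELECT a.apply_name, s.sid, s.serial#,
       p.pga_used_mem, p.pga_alloc_mem, p.pga_max_mem
  FROM v$streams_apply_server a
  JOIN v$session s ON s.sid = a.sid AND s.serial# = a.serial#
  JOIN v$process p ON p.addr = s.paddr
 WHERE a.apply_name = 'MY_SQLSERVER_APPLY';
{code}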
View 1 Replies
Jun 26, 2013
I encountered the following error while trying to set up Streams replication at the database level using dbms_streams_adm.maintain_global:

ERROR at line 1:
ORA-23616: Failure in executing block 6 for script E00C49DDDB27C899E040A8C04C0119DA with
ORA-06550: line 21, column 3:
PL/SQL: ORA-00942: table or view does not exist
ORA-06550: line 21, column 3:
PL/SQL: SQL Statement ignored
ORA-06550: line 23, column 3:
PL/SQL: ORA-00942: table or view does not exist
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 659
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 682
ORA-06512: at "SYS.DBMS_STREAMS_MT", line 2427
ORA-06512: at "SYS.DBMS_STREAMS_ADM", line 3004
ORA-06512: at line 2

SQL> select forward_block from dba_recoverable_script_blocks where script_id = '
[code]....
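For this kind of MAINTAIN_GLOBAL failure, the recoverable-script views usually show which generated block failed and why; a sketch, assuming the dictionary views are laid out the way I remember them:
{code}
-- Sketch: inspect the recorded error and the failing block for the script id
-- reported in the ORA-23616 message.
SELECT * FROM dba_recoverable_script_errors
 WHERE script_id = 'E00C49DDDB27C899E040A8C04C0119DA';

SELECT forward_block
  FROM dba_recoverable_script_blocks
 WHERE script_id = 'E00C49DDDB27C899E040A8C04C0119DA'
   AND block_num = 6;
{code}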
View 1 Replies
Feb 3, 2012
Source DB is logical standby 10.2.0.5
Destination DB is 11.2.0.3
Is it possible to set up streams on the logical DB to replicate some schemas to the 11gR2 DB?
View 10 Replies
Apr 18, 2013
I am trying to implement Oracle Streams replication (using Metalink note 733691.1). I have configured the steps, but in my alert log I am getting the error below:
Check that the primary and standby are using a password file and remote_login_passwordfile is set to SHARED or EXCLUSIVE, and that the SYS password is same in the password files. returning error ORA-16191
I am able to connect to both databases from each server:
From the 1st server:
sqlplus sys@2nddb as sysdba
From the 2nd server:
sqlplus sys@1stdb as sysdba
While connecting, it prompts for the password and the connection succeeds.
Both DBs were created with the same SYS password. I have also disabled password case sensitivity, but the error still persists.
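ORA-16191 usually comes down to the password files themselves rather than the password typed at the prompt. One hedged approach is to rebuild the password file the same way on both servers (SID, path and password below are placeholders; the ignorecase option exists from 11g onward) and then re-check the parameter:
{code}
$ orapwd file=$ORACLE_HOME/dbs/orapw<SID> password=<sys_password> entries=10 ignorecase=y

SQL> show parameter remote_login_passwordfile
{code}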
View 1 Replies
Oct 4, 2010
Platform: Windows 2003
Streams Set up: One way Streaming at table level
The error: ORA-26786: A row with key ("REPT_NUM", "STATE_CODE", "SURVEY_ID") = (067305669, 49, J) exists but has conflicting column(s) "DATE_TIME", "PREV_PARENT_ID" in table TOPCATI_JOLTS.UNIT ORA-01403: no data found
We are consistently getting this error every other day, some weeks more often, with different records of course. On the capture site the application runs a process called "split cases". In this process the application takes an old PK case number and inserts a new PK case number with all of the data of the old case (parent level). At the unit (child) level, the application then changes all of the units to this new FK case number. This means the old case (parent) is left with no child units. This is all one transaction.
Is it possible that Streams is applying the LCRs out of order, especially since that whole process is one transaction?
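One thing worth checking is how the apply process serializes commits; depending on the version and the commit_serialization setting, transactions that are not seen as dependent can be committed in a different order than on the source. A sketch (the apply name is a placeholder):
{code}
-- Sketch: check the current commit serialization of the apply process.
SELECT parameter, value
  FROM dba_apply_parameters
 WHERE apply_name = 'MY_APPLY'
   AND parameter = 'COMMIT_SERIALIZATION';

-- Forcing full commit serialization (at some cost in apply throughput):
BEGIN
  DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'MY_APPLY',
    parameter  => 'commit_serialization',
    value      => 'FULL');
END;
/
{code}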
View 2 Replies
Mar 26, 2013
I am trying to set up a one-way queue that will be consumed by an in-house ESB. We are currently able to create a multi-consumer queue using:
CODE
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'strmadmin.streams_queue_table',
    queue_name  => 'strmadmin.streams_queue');
END;
Unfortunately the ESB we are using relies on a JMS component, which seems to be able to consume only from a mono-consumer (single-consumer) queue. So we have created our queue using the following:
CODE
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(
    Queue_table        => '"STRMADMIN"."STREAMS_QUEUE_TABLE"',
    Queue_payload_type => 'SYS.ANYDATA',
    storage_clause     => 'PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 TABLESPACE
[code]...
This therefore gives us a queue/queue table which pushes data to a single consumer. The problem comes when we try to add a table rule using the following command:
CODE
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name   => 'source.t1',
    streams_type => 'apply',
    streams_name => 'streams_apply',
    queue_name   => 'strmadmin.streams_queue',
[code]...
We then get the following error:
CODE
ORA-06512: at line 2
ORA-24039: Queue %s not created in queue table for multiple consumers
*Cause: Either an ADD_SUBSCRIBER, ALTER_SUBSCRIBER, or REMOVE_SUBSCRIBER procedure, or an ENQUEUE with a non-empty recipient list, was issued on a queue that was not created for multiple consumers.
*Action: Create the queue in a queue table that was created for multiple consumers and retry the call.
We are able to create the capture rule without any problem, but without the apply rule nothing seems to end up in the queue table. Plain AQ is not a viable solution for us since it is troublesome when it comes to deletes and mass updates.
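The Streams apply side itself expects an ANYDATA queue created for multiple consumers, which is what ORA-24039 is complaining about. One hedged option is to keep the Streams queue multi-consumer, as sketched below, and hand messages to the JMS/ESB side through a separate single-consumer queue (for example from a message handler or subscriber), rather than pointing ADD_TABLE_RULES at the mono-consumer queue:
{code}
-- Sketch: recreate the Streams queue as a multi-consumer ANYDATA queue.
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(
    queue_table        => 'strmadmin.streams_queue_table',
    queue_payload_type => 'SYS.ANYDATA',
    multiple_consumers => TRUE);
  DBMS_AQADM.CREATE_QUEUE(
    queue_name  => 'strmadmin.streams_queue',
    queue_table => 'strmadmin.streams_queue_table');
  DBMS_AQADM.START_QUEUE(queue_name => 'strmadmin.streams_queue');
END;
/
{code}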
View 1 Replies
Mar 2, 2012
I'm trying to write a PL/SQL block to find all missing archived logs that are needed for Streams replication.
There is already an Oracle Metalink note for this, but it only gives the archived log name that contains my dba_capture.start_scn, and we still need to check whether the files exist on disk.
The problem here is that, when using ASM, the dba_registered_archived_log view is truncating the file name, and it is really difficult to pinpoint the logs. So is it fine to join this view with V$ARCHIVED_LOG? Would the DELETED and STATUS columns do the trick? I modified the PL/SQL as below. Is this accurate?
CODE
dbms_output.put_line('Capture will restart from SCN ' || lScn || ' in the following file:');
for cr in
  (select decode(a.name, NULL, 'NOT FOUND', a.name) name,
          to_char(a.completion_time, 'hh24:mi:ss') completion_time
     from v$archived_log a, dba_registered_archived_log b
    where lscn between b.first_scn and b.next_scn
      and a.deleted = 'YES' and a.status != 'A')
loop
  f_rec := 1;
[code]......
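A variation that outer-joins from the registered-log view, so that files missing from V$ARCHIVED_LOG are still reported, might look like this (a sketch; bind the SCN you computed as :start_scn):
{code}
-- Sketch: list registered log ranges around the capture start SCN and
-- outer-join to V$ARCHIVED_LOG so missing/deleted files still show up.
SELECT b.thread#, b.sequence#, b.first_scn, b.next_scn,
       NVL(a.name, 'NOT FOUND ON DISK') AS file_name,
       a.deleted, a.status
  FROM dba_registered_archived_log b
  LEFT JOIN v$archived_log a
    ON a.sequence# = b.sequence#
   AND a.thread#   = b.thread#
 WHERE :start_scn BETWEEN b.first_scn AND b.next_scn;
{code}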
View 1 Replies
Mar 21, 2011
Can we use materialized views in Streams the same way we use tables?
View 11 Replies
Mar 27, 2013
My setup is following:
CODE
host22 (Oracle Linux 5.9)              |                    | host30 (Windows XP) with SQL Server Express 2008
DB22                                   |  Gateway to MSSQL  | DB30
Local       Local                      |                    |
Capture --> Apply -------------------> |  Gateway --------> | DUMMYUSR.DUMMYTBL
Process     Process                    |                    |
The capture process on DB22 captures changes made to the local DUMMYUSR.DUMMYTBL table, enqueues them to the capture queue and propagates them to the apply queue for the local apply process, which, using the database link DB30, is supposed to apply these changes to the DUMMYUSR.DUMMYTBL table in the remote SQL Server database DB30.
The problem is that when the apply process executes a transaction, it aborts with the error:
QUOTE ORA-26713: remote object does not exist or is inaccessible.
ORA-26712: remote object is "DUMMYUSR"."DUMMYTBL"@"DB30".
In the trace log I can see the following message:
QUOTE kniNoPrimaryKeyError: ORA-26713, remote obj doesn't exist,
DUMMYUSR.DUMMYTBL@DB30
[code]...
What can cause the ORA-26713 error from the apply process when applying changes to the SQL Server database through Oracle Database Gateway, if the object being modified exists and is accessible from the Oracle database for the strmadmin user?
By the way, the database link and tables were created with the following commands:
CODE
DB22 (Oracle): (connected as strmadmin)
create database link DB30 connect to "dummyusr" identified by "dummypwd" using 'DB30';
DB22(Oracle): (connected as dummyusr)
create table DUMMYTBL (no number, name char(6), ddate date, payload varchar2(2013));
DB30(MSSQL): (connected as dummyusr)
create schema DUMMYUSR authorization dummyusr;
create table DUMMYUSR.DUMMYTBL (no float(53), name char(6), ddate datetime2(0), payload varchar(2013));
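The kniNoPrimaryKeyError line suggests the apply process cannot work out key columns for the remote table. For a heterogeneous apply, a hedged fix is to declare substitute key columns explicitly; the choice of NO as the key column below is only an assumption:
{code}
-- Sketch: tell the apply process which column(s) identify a row in the
-- remote table (column choice is an assumption).
BEGIN
  DBMS_APPLY_ADM.SET_KEY_COLUMNS(
    object_name         => 'DUMMYUSR.DUMMYTBL',
    column_list         => 'NO',
    apply_database_link => 'DB30');
END;
/
{code}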
View 1 Replies
Jun 2, 2010
This topic was already discussed in "Physical standby on oracle 11g - Reporting needs" but was never finished.
View 5 Replies
May 27, 2011
We have three Unix servers with four databases (10gR2) containing "HP Operation Management Unix" (OMU) server messages for monitoring purposes, and we now want to transfer these data to one new database on a new server for reporting purposes.
The message table in each OMU database keeps the message row until it is "acknowledged", or for a maximum of fourteen days, then it is moved to a historic table where it stays for another three days. Keeping data for only seventeen days is a performance issue. The new "reporting database" is intended to hold message data for the last 90 days.
I wonder which method to use to move/replicate the data from these databases. Materialized views using database links, with views on top of the MVs? How would we keep rows longer than in the master (source) table, avoiding deletion when the master row is deleted?
Oracle Streams, with local capture and remote apply? How would this influence performance on the master databases? There are about 10,000 new messages in each OMU database every day. Is it possible to have four Streams connections against the reporting database?
Or should I simply use database triggers which fire after insert and update and apply the changes to the reporting database over database links?
View 2 Replies
Aug 31, 2011
Until now my company has used SharePlex to replicate 6 DBs from different locations to one server (used for BI operations).
1) Replication is in real time.
2) All DBs are Oracle, but different versions (9 and 10).
3) We have horizontal and vertical replication: we replicate just some tables, for some tables just some columns, and for some tables just the records that meet certain conditions.
4) We have different charsets.
5) SharePlex has a compare/repair tool to check whether the DBs are in sync.
I started to look at Oracle Streams to use it instead of SharePlex. Do you know if it can handle these types of replication? Do you know of any Oracle Streams limitation which would rule it out?
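For the horizontal part, Streams has subset rules, and for the vertical part declarative rule-based transformations such as DBMS_STREAMS_ADM.DELETE_COLUMN (or KEEP_COLUMNS on 11g) can drop columns from the captured LCRs. A sketch of a subset rule with made-up table, condition and queue names:
{code}
-- Sketch: capture only rows matching a condition (horizontal subsetting).
BEGIN
  DBMS_STREAMS_ADM.ADD_SUBSET_RULES(
    table_name    => 'sales.orders',
    dml_condition => 'region = ''EMEA''',
    streams_type  => 'capture',
    streams_name  => 'bi_capture',
    queue_name    => 'strmadmin.bi_queue');
END;
/
{code}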
View 5 Replies
Jan 21, 2013
Due to a firewall restriction, I have to configure Streams with queue forwarding. The intermediate database will hold two different IPs (one facing the source and another facing the target), and the source and target will not talk to each other directly.
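On the intermediate database, a propagation from the queue that receives the source changes to the target's queue forwards the LCRs without applying them locally. A sketch with placeholder names and database link:
{code}
-- Sketch: forward the contents of the intermediate queue to the target queue.
BEGIN
  DBMS_PROPAGATION_ADM.CREATE_PROPAGATION(
    propagation_name   => 'fwd_to_target',
    source_queue       => 'strmadmin.from_source_q',
    destination_queue  => 'strmadmin.streams_q',
    destination_dblink => 'TARGETDB',
    queue_to_queue     => TRUE);
END;
/
{code}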
View 1 Replies
Aug 15, 2012
I created the following update conflict handlers, one after the other. They work correctly; my problem is that when I create the 2nd it replaces the first one, and when I create the 3rd it replaces the 2nd. I want to put all three into action simultaneously. How can I do this?
{code}
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'MODIFIED_DATE';
  cols(2) := 'MODIFIED_BY';
  cols(3) := 'RATE_CHANGE_DATE';
  cols(4) := 'PENSION_AMOUNT';
  cols(5) := 'EASYPAISA_WORK_ON';
  cols(6) := 'PPO_NO';
  cols(7) := 'NEW_NIC_NO';
  cols(8) := 'OLD_NIC_NO';
  cols(9) := 'CLAIM_NO';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'CORE_BUSINESS.CB_PENSIONER',
    method_name       => 'OVERWRITE',
    resolution_column => 'CLAIM_NO',
    column_list       => cols);
END;
{code}

{code}
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'MODIFIED_DATE';
  cols(2) := 'MODIFIED_BY';
  cols(3) := 'RATE_CHANGE_DATE';
  cols(4) := 'PENSION_AMOUNT';
  cols(5) := 'CLAIM_NO';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(object_name =>
[code]....
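If I remember the DBMS_APPLY_ADM documentation correctly, several update conflict handlers can coexist on the same table only when their column lists do not overlap, and each handler has its own resolution column; reissuing the call for the same object and resolution column replaces the earlier handler. A sketch with illustrative, disjoint column groups (check DBA_APPLY_CONFLICT_COLUMNS afterwards to see what is registered):
{code}
-- Sketch: two handlers on the same table with disjoint column lists; the
-- column groupings and resolution columns are illustrative only.
DECLARE
  cols1 DBMS_UTILITY.NAME_ARRAY;
  cols2 DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols1(1) := 'MODIFIED_DATE';
  cols1(2) := 'MODIFIED_BY';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'CORE_BUSINESS.CB_PENSIONER',
    method_name       => 'OVERWRITE',
    resolution_column => 'MODIFIED_DATE',
    column_list       => cols1);

  cols2(1) := 'PENSION_AMOUNT';
  cols2(2) := 'RATE_CHANGE_DATE';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'CORE_BUSINESS.CB_PENSIONER',
    method_name       => 'OVERWRITE',
    resolution_column => 'PENSION_AMOUNT',
    column_list       => cols2);
END;
/
{code}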
View 3 Replies
Oct 10, 2013
I am trying to set up Streams on 11.2.0.3 on a Windows 2008 R2 server. Due to an error in running propagation, I am trying to delete both the capture and apply queues. I have deleted the apply queues but am unable to drop the capture queue.
SQL> EXEC DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'LAO_NLPG73_BLPU_APPLY');

SQL> select * from dba_apply;
LAO_NLPG73_BLPU_APPLY  NLPG73_BLPU_APPLY_Q  STRMADMIN  YES  RULESET$_732  STRMADMIN  STRMADMIN  301355
ABORTED  09/10/2013 17:34:21  1013  ORA-01013: user requested cancel of current operation  CAPTURED  STREAMS APPLY

SQL> select * from dba_queues;
STRMADMIN  NLPG73_BLPU_APPLY_Q
[code]...
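Assuming the capture process and its queue follow the same naming pattern as the apply side above (the names below are guesses), the usual order is to stop and drop the capture process first and only then remove its queue:
{code}
-- Sketch: capture and queue names are assumptions based on the naming above.
BEGIN
  DBMS_CAPTURE_ADM.STOP_CAPTURE(capture_name => 'LAO_NLPG73_BLPU_CAPTURE');
  DBMS_CAPTURE_ADM.DROP_CAPTURE(capture_name => 'LAO_NLPG73_BLPU_CAPTURE');
  DBMS_STREAMS_ADM.REMOVE_QUEUE(
    queue_name              => 'STRMADMIN.NLPG73_BLPU_CAPTURE_Q',
    cascade                 => FALSE,
    drop_unused_queue_table => TRUE);
END;
/
{code}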
View 5 Replies
Nov 30, 2011
I have a test database with tables that occasionally have to be updated to the state of the production database tables. These databases both stream several tables to separate databases (i.e. database A streams tables to database X, and B to Y).
I export the needed schema daily from the production database "A". To update the test database "B", I import the needed tables from this export file. I then export the tables to be streamed from database B with the object_consistent=y parameter. I then import those tables to database Y using the streams_instantiation=y parameter.
So this is the process ...
1. Turn off capture on B
2. Turn off apply on Y
3. Drop the tables in B
4. Import tables from A to B
5. Prepare instantiation on tables in B (DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION)
6. Export from B with object_consistent=y
7. Drop tables in Y
8. Import from B to Y with streams_instantiation=y
9. Start apply on Y
10. Start capture on B
I ran this overnight through an OEM job and it appeared to work, just not how I expected. When I came in there was a lot of activity on both databases. All sessions were from the Streams admin account. This continued for some time (long after the job had finished). During this time the "captured scn", "applied scn" and "last enqueued scn" values were quite different on the source side (I didn't jot those down). I compared these to the "last applied message number" on the target side and the values were quite a bit off.
After the activity subsided, the SCN values were all in the same ballpark. So I figure all the activity was changes getting applied from database B to database Y. I tested out the streaming and it's working fine.
I thought the parameters in the export and import statements were supposed to sync up the SCNs. That is I thought they would start off in the same state and there would be no need for applying changes. No one changed anything on the source database overnight.
I must not be understanding how the streams export/import parameters work.
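An alternative to relying on the export/import flags is to record the instantiation SCN explicitly on the destination right after the import, so apply skips everything at or below that SCN. A sketch (object, database and link names are placeholders):
{code}
-- Sketch: grab the source SCN at instantiation time and register it on the
-- destination over a database link (YDB and BDB are made-up names).
DECLARE
  iscn NUMBER;
BEGIN
  iscn := DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER();
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN@YDB(
    source_object_name   => 'SCOTT.EMP',
    source_database_name => 'BDB',
    instantiation_scn    => iscn);
END;
/
{code}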
View 5 Replies
Oct 8, 2013
I have a requirement where replication should be done between two 11gR2 RACs on Standard Edition. I have the following queries:
1. Does Standard Edition support DDL capture?
I'm asking this because on
[URL]......
it says "SE1/SE: no capture from redo". What does that really mean?
2. Is it possible to configure capture at the schema level and skip only some of the tables / triggers?
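On a configuration that does allow capture, question 2 is usually handled by combining schema rules with negative table rules, roughly like this (all names are placeholders):
{code}
-- Sketch: capture the whole schema, then exclude one table via the capture
-- process's negative rule set.
BEGIN
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name  => 'APPUSER',
    streams_type => 'capture',
    streams_name => 'app_capture',
    queue_name   => 'strmadmin.streams_queue',
    include_dml  => TRUE,
    include_ddl  => FALSE);

  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name     => 'APPUSER.AUDIT_LOG',
    streams_type   => 'capture',
    streams_name   => 'app_capture',
    queue_name     => 'strmadmin.streams_queue',
    include_dml    => TRUE,
    include_ddl    => FALSE,
    inclusion_rule => FALSE);   -- FALSE places the rule in the negative rule set
END;
/
{code}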
View 2 Replies
Nov 29, 2012
I am trying to replicate an SDO_GEOMETRY datatype object using Oracle 11gR2 Streams, but the capture process didn't capture the changes made to the table.
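Streams has data type restrictions, so it is worth checking whether the table is being filtered out as unsupported; the schema and table names below are placeholders:
{code}
-- Check whether the table is unsupported by Streams capture and why.
SELECT owner, table_name, reason, auto_filtered
  FROM dba_streams_unsupported
 WHERE owner = 'MYSCHEMA'
   AND table_name = 'MY_GEOM_TABLE';
{code}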
View 2 Replies
Feb 22, 2012
I am trying to replicate two DB servers, one in Hong Kong and another in China. When I try to establish the replication, I get the error 'ORA-04052: error occurred when looking up remote object', like this...
But when I try the same thing in my local network, it works fine. I have tried schema replication through Enterprise Manager Grid Control.
View 6 Replies
Feb 9, 2013
Database Version - 10.2.0.4.0
OS - Red Hat Linux
We have configured a one-way Oracle stream between two databases. The source database is capturing the changes (no downstream capture configured). The configuration was working fine, but the destination database was lagging very far behind, i.e. about 15 days behind the source database. We are OK with this, but the problem now is that, as per a client request, we restored a previous backup and opened the source database with the resetlogs option. After resetlogs, the archivelog sequence has changed and the stream is not working.
Can I still apply the previous archivelogs (the ones from before the resetlogs) on the destination database? The source database is a production database.
View 1 Replies
Nov 9, 2010
I have a problem on Oracle 11gR2 where I have to import data from a source database into an existing table, without truncating or dropping the target table in the target database.
We have found something called table_exists_action=append in impdp.
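The parameter is spelled TABLE_EXISTS_ACTION; a data-only append into the existing table might look like this (connection string, directory, dump file and table names are placeholders):
{code}
impdp system/password@targetdb \
  directory=DATA_PUMP_DIR \
  dumpfile=source_tables.dmp \
  tables=APPUSER.CUSTOMERS \
  content=DATA_ONLY \
  table_exists_action=APPEND
{code}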
View 2 Replies
May 18, 2011
I have to clean up data from our tables (production environment) that contain millions of rows. The question is: apart from partitioning the tables, what alternative solution does Oracle recommend?
Delete from these tables using a cursor in a PL/SQL block, or export/import the whole database and use the QUERY option of the Data Pump utility on the tables from which we want to remove the old rows?
I have used both ways and I have to admit that the Data Pump solution is much, much faster than the deletion, which suffers from disk I/O. The question again is which of these two methods is more reliable and less risky for the health of the database.
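If the delete route is chosen, a batched delete that commits in chunks keeps undo usage and lock duration bounded; a sketch with placeholder table, predicate and batch size:
{code}
-- Sketch: delete old rows in batches of 10,000 until none remain.
DECLARE
  rows_deleted PLS_INTEGER;
BEGIN
  LOOP
    DELETE FROM appuser.big_table
     WHERE created_date < ADD_MONTHS(SYSDATE, -36)
       AND ROWNUM <= 10000;
    rows_deleted := SQL%ROWCOUNT;   -- capture before the commit
    COMMIT;
    EXIT WHEN rows_deleted = 0;
  END LOOP;
END;
/
{code}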
View 5 Replies
Sep 3, 2012
I came across an implementation where data from DB2 tables is moved to Oracle tables, for a BI solution, using some Oracle procedures called from MS SQL DTS packages which are scheduled jobs. Just being curious: can this be done using OWB or ODI rather than the above detour? I suppose there are some changes being made in those procedures before the data is loaded into the Oracle tables; can't this be done using OWB/ODI as well? Can it also be scheduled as jobs using OWB/ODI?
View 1 Replies
Jul 25, 2012
We are using Oracle as the database in our application. For high availability we have a cluster of multiple nodes and the data is replicated using Oracle Streams. All the nodes in the cluster are active; we do not have any concept of a standby.
Now we are planning to use Oracle RMAN for backup and recovery. The RMAN user's guide doesn't recommend any strategy for such deployments; it mainly focuses on primary/standby deployments.
View 1 Replies
May 29, 2013
In our schema we have corresponding audit tables for most of the production tables, for example:
Production table    Audit table
EMP                 EMP_AU
DEPT                DEPT_AU
The audit tables have all the columns of the production table along with the audit columns AUDIT_DATE and AUDIT_OPERATION. There are a few production tables which do not have audit tables. I need to write a script to identify:
1) Production tables where corresponding audit table is missing
2) Where there is column difference (In case any column missing in audit table) between Production table and Audit table
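A sketch of both checks against the data dictionary, assuming the audit tables always use the _AU suffix described above and live in the same schema:
{code}
-- 1) Production tables with no corresponding _AU audit table.
SELECT t.table_name
  FROM user_tables t
 WHERE t.table_name NOT LIKE '%\_AU' ESCAPE '\'
   AND NOT EXISTS (SELECT 1
                     FROM user_tables a
                    WHERE a.table_name = t.table_name || '_AU');

-- 2) Columns present in a production table but missing from its audit table.
SELECT t.table_name, c.column_name
  FROM user_tables t
  JOIN user_tab_columns c ON c.table_name = t.table_name
 WHERE EXISTS (SELECT 1 FROM user_tables a
                WHERE a.table_name = t.table_name || '_AU')
   AND NOT EXISTS (SELECT 1 FROM user_tab_columns ac
                    WHERE ac.table_name  = t.table_name || '_AU'
                      AND ac.column_name = c.column_name);
{code}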
View 11 Replies
Sep 16, 2010
In Oracle Database PL/SQL 11.2: describe the differences between:
records,
tables, and
tables of records.
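A compact illustration of the three constructs (made-up types and values):
{code}
DECLARE
  -- A record: one row-like composite with named fields.
  TYPE emp_rec_t IS RECORD (
    empno NUMBER,
    ename VARCHAR2(30));

  -- A (collection) table: an ordered set of scalar values.
  TYPE name_tab_t IS TABLE OF VARCHAR2(30) INDEX BY PLS_INTEGER;

  -- A table of records: a collection whose elements are records.
  TYPE emp_tab_t IS TABLE OF emp_rec_t INDEX BY PLS_INTEGER;

  one_emp  emp_rec_t;
  names    name_tab_t;
  all_emps emp_tab_t;
BEGIN
  one_emp.empno := 7839;
  one_emp.ename := 'KING';

  names(1) := 'KING';
  names(2) := 'SCOTT';

  all_emps(1) := one_emp;
  DBMS_OUTPUT.PUT_LINE(all_emps(1).ename || ' / ' || names(2));
END;
/
{code}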
View 11 Replies
Aug 25, 2011
I've got table tab1 with sequence id column.
Also I've got tables tab2 and tab3 with indicatorid column in each table.
I need to prevent inserting values into the indicatorid column of tables tab2 and tab3 that are absent from the id column of table tab1.
For example, at this moment the id column in table tab1 has two values, 1 and 2, yet I can insert the value 3 into the indicatorid column of tables tab2 and tab3. I need to prevent inserting the value 3 into the indicatorid column of tab2 and tab3 while it is absent from the id column of tab1.
How do I perform it?
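This is exactly what a foreign key constraint enforces, provided tab1.id is (or is made) a primary or unique key; a sketch:
{code}
-- Sketch: assumes tab1.id can be declared the primary key.
ALTER TABLE tab1 ADD CONSTRAINT tab1_pk PRIMARY KEY (id);

ALTER TABLE tab2 ADD CONSTRAINT tab2_ind_fk
  FOREIGN KEY (indicatorid) REFERENCES tab1 (id);

ALTER TABLE tab3 ADD CONSTRAINT tab3_ind_fk
  FOREIGN KEY (indicatorid) REFERENCES tab1 (id);
{code}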
View 9 Replies
Jan 22, 2013
I have a table named Location and a synonym named Location in my DB. I am trying to create a procedure in which I query that table, but it's not working.
Example:
CREATE PROCEDURE TESTPROC
AS
BEGIN
DBMS_OUTPUT.PUT_LINE('testing');
select count(*) from LOCATION;
END;
Compile error:
Error(5,22): PL/SQL: ORA-00942: table or view does not exist
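Two things usually matter here: inside PL/SQL a query needs an INTO clause (or a cursor), and when LOCATION is a synonym for another schema's table, the procedure owner needs SELECT granted directly, not through a role, for it to resolve at compile time. A sketch of a version that should compile once the direct grant is in place:
{code}
-- Sketch: SELECT needs an INTO inside PL/SQL, and the owner of TESTPROC needs
-- a direct GRANT SELECT on the underlying LOCATION table (not via a role).
CREATE OR REPLACE PROCEDURE TESTPROC
AS
  l_count NUMBER;
BEGIN
  DBMS_OUTPUT.PUT_LINE('testing');
  SELECT COUNT(*) INTO l_count FROM LOCATION;
  DBMS_OUTPUT.PUT_LINE('rows: ' || l_count);
END;
/
{code}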
View 2 Replies