Platform: Windows 2003
Streams setup: one-way streaming at table level
The error:
ORA-26786: A row with key ("REPT_NUM", "STATE_CODE", "SURVEY_ID") = (067305669, 49, J) exists but has conflicting column(s) "DATE_TIME", "PREV_PARENT_ID" in table TOPCATI_JOLTS.UNIT
ORA-01403: no data found
We are consistently getting this error every other day, and some weeks more often, on different records of course. On the capture site the application runs a process called "split cases". In this process the application takes an old PK case number and inserts a new PK case number with all of the data of the old case (parent level). At the unit (child level) the application changes all of the units to this new FK case number, so the old case (parent) is left with no child units. This is all one transaction.
Is it possible that Streams may be applying the LCRs out of order, especially since that whole process is one transaction?
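Not an answer, but a hedged diagnostic sketch that may help confirm or rule that out: it walks the LCRs of each errored transaction in the order the apply process received them, so you can see whether the parent insert really arrives before the child updates. The apply name below is a placeholder.

SET SERVEROUTPUT ON
DECLARE
  l_msg ANYDATA;
  l_lcr SYS.LCR$_ROW_RECORD;
  l_res PLS_INTEGER;
BEGIN
  FOR e IN (SELECT local_transaction_id, message_count
              FROM dba_apply_error
             WHERE apply_name = 'MY_APPLY') LOOP        -- placeholder apply name
    DBMS_OUTPUT.PUT_LINE('Transaction ' || e.local_transaction_id || ':');
    FOR i IN 1 .. e.message_count LOOP
      l_msg := DBMS_APPLY_ADM.GET_ERROR_MESSAGE(i, e.local_transaction_id);
      IF l_msg.GETTYPENAME = 'SYS.LCR$_ROW_RECORD' THEN     -- skip DDL LCRs
        l_res := l_msg.GETOBJECT(l_lcr);
        DBMS_OUTPUT.PUT_LINE('  ' || i || ': ' || l_lcr.GET_COMMAND_TYPE || ' on ' ||
                             l_lcr.GET_OBJECT_OWNER || '.' || l_lcr.GET_OBJECT_NAME);
      END IF;
    END LOOP;
  END LOOP;
END;
/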
A Streams apply process which applies to a SQL Server database is increasing its PGA use continually until I stop the process and restart it. I need to stop it once every week or it will use too much of the PGA and the database will hang, causing paging etc.
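A hedged monitoring sketch to confirm that it really is the apply servers holding the PGA (the views V$STREAMS_APPLY_SERVER, V$SESSION and V$PROCESS are standard; the apply name is a placeholder). Trending pga_max_mem between restarts shows how fast the growth is and which server process it belongs to.

SELECT s.apply_name, p.spid, p.pga_used_mem, p.pga_alloc_mem, p.pga_max_mem
  FROM v$streams_apply_server s, v$session ss, v$process p
 WHERE s.sid = ss.sid
   AND s.serial# = ss.serial#
   AND ss.paddr = p.addr
   AND s.apply_name = 'MY_SQLSERVER_APPLY';   -- placeholder apply name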
host22 (Oracle Linux 5.9)                      |  host30 (Windows XP) with SQL Server Express 2008
DB22                      |  Gateway to MSSQL  |  DB30
Local        Local        |                    |
Capture -->  Apply  ------>  Gateway  -----------------> DUMMYUSR.DUMMYTBL
Process      Process      |                    |
The capture process on DB22 captures changes made to the local DUMMYUSR.DUMMYTBL table, enqueues them to the capture queue and propagates them to the apply queue for the local apply process, which, using database link DB30, is supposed to apply these changes to table DUMMYUSR.DUMMYTBL on the remote SQL Server database DB30.
The problem is that when the apply process executes a transaction, it aborts with the error:
ORA-26713: remote object does not exist or is inaccessible
ORA-26712: remote object is "DUMMYUSR"."DUMMYTBL"@"DB30"
In the trace log I can see the following message:
kniNoPrimaryKeyError: ORA-26713, remote obj doesn't exist, DUMMYUSR.DUMMYTBL@DB30 ...
What can cause an ORA-26713 error in an apply process applying changes to a SQL Server database through Oracle Database Gateway, if the object to be modified exists and is accessible from the Oracle database for the strmadmin user?
By the way, the database link and tables were created with the following commands:

DB22 (Oracle, connected as strmadmin):
  create database link DB30 connect to "dummyusr" identified by "dummypwd" using 'DB30';

DB22 (Oracle, connected as dummyusr):
  create table DUMMYTBL (no number, name char(6), ddate date, payload varchar2(2013));

DB30 (MSSQL, connected as dummyusr):
  create schema DUMMYUSR authorization dummyusr;
  create table DUMMYUSR.DUMMYTBL (no float(53), name char(6), ddate datetime2(0), payload varchar(2013));
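Since the trace message is kniNoPrimaryKeyError and the SQL Server table is created without a primary key, one thing worth checking (this is only a guess, not a confirmed cause) is whether the apply process knows which columns identify a row on the remote side. A minimal sketch on DB22, assuming the "no" column is the logical key:

BEGIN
  -- Hypothetical: declare "no" as the substitute key for the remote table,
  -- because the SQL Server table has no primary key defined.
  DBMS_APPLY_ADM.SET_KEY_COLUMNS(
    object_name         => 'DUMMYUSR.DUMMYTBL',
    column_list         => 'no',
    apply_database_link => 'DB30');
END;
/

The alternative, of course, is simply to define a primary key on the SQL Server table itself.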
I encountered the following error while trying to set up Streams replication at the database level using dbms_streams_adm.maintain_global:

begin
*
ERROR at line 1:
ORA-23616: Failure in executing block 6 for script
E00C49DDDB27C899E040A8C04C0119DA with
ORA-06550: line 21, column 3:
PL/SQL: ORA-00942: table or view does not exist
ORA-06550: line 21, column 3:
PL/SQL: SQL Statement ignored
ORA-06550: line 23, column 3:
PL/SQL: ORA-00942: table or view
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 659
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 682
ORA-06512: at "SYS.DBMS_STREAMS_MT", line 2427
ORA-06512: at "SYS.DBMS_STREAMS_ADM", line 3004
ORA-06512: at line 2

SQL> select forward_block from dba_recoverable_script_blocks where script_id = ' [code]....
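One hedged way to dig further, assuming the script id shown in the error, is to look at the recoverable-script error view for the exact failing statement, and only then resume (or purge) the operation once the missing object is sorted out:

-- Show the error recorded for the failing block (block 6 in this case).
SELECT script_id, block_num, error_message
  FROM dba_recoverable_script_errors
 WHERE script_id = 'E00C49DDDB27C899E040A8C04C0119DA';

-- After fixing the cause, the operation can be resumed, rolled back or purged.
BEGIN
  DBMS_STREAMS_ADM.RECOVER_OPERATION(
    script_id      => 'E00C49DDDB27C899E040A8C04C0119DA',
    operation_mode => 'FORWARD');
END;
/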
I am trying to implement Oracle Streams replication (using Metalink Note 733691.1). I have configured the steps, but in my alert log I am getting the below error:
Check that the primary and standby are using a password file and remote_login_passwordfile is set to SHARED or EXCLUSIVE, and that the SYS password is same in the password files. returning error ORA-16191
I am successfully able to connect to both DBs from both servers:
From 1st server: sqlplus sys@2nddb as sysdba
From 2nd server: sqlplus sys@1stdb as sysdba
While connecting it asks for the password.
Both DBs were created with the same Oracle SYS user password. After this I disabled case sensitivity, but the error still persists.
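A hedged checklist that often clears ORA-16191 in this kind of setup (file name, SID and password below are placeholders, and the file location depends on the platform): recreate the password file on both sides with the same SYS password and case sensitivity disabled, then confirm the parameter and the password file contents.

-- On each server (operating-system command, names are placeholders):
-- $ orapwd file=$ORACLE_HOME/dbs/orapwORCL password=MySysPwd ignorecase=y force=y

-- Then verify on both databases:
SHOW PARAMETER remote_login_passwordfile;   -- should be EXCLUSIVE or SHARED
SELECT * FROM v$pwfile_users;               -- SYS should be listed with SYSDBA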
But unfortunately the ESB that we are using has a JMS component which seems to only be able to use a mono-consumer (single-consumer) queue. So we have created our queue using the following.
This therefore permits us to have a queue/queue table which pushes data to a single consumer. The problem comes when we try to add a table rule using the following command:
ORA-06512: on line 2
24039. 00000 - "Queue %s not created in queue table for multiple consumers"
*Cause:  Either an ADD_SUBSCRIBER, ALTER_SUBSCRIBER, or REMOVE_SUBSCRIBER procedure, or an ENQUEUE with a non-empty recipient list, was issued on a queue that was not created for multiple consumers.
*Action: Create the queue in a queue table that was created for multiple consumers and retry the call.
We are able to create the capture rule without any problem, but without the apply rule nothing seems to end up in the queue table. AQ is not a viable solution since it is troublesome when it comes to deletes and mass updates.
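For reference, the Streams rule procedures expect an ANYDATA queue created for multiple consumers, which is what ORA-24039 is complaining about. A minimal sketch of such a queue (owner and names are placeholders), as it would normally be created for Streams:

BEGIN
  -- Creates a multiple-consumer ANYDATA queue table and queue suitable for Streams rules.
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'strmadmin.streams_queue_table',
    queue_name  => 'strmadmin.streams_queue',
    queue_user  => 'strmadmin');
END;
/

The tension in this setup is that the JMS component wants a single-consumer queue while the Streams rules want a multi-consumer one, so the same queue cannot easily serve both.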
We have three Unix servers with four databases (10gR2) containing "HP Operation Management Unix" (OMU) server messages for monitoring purposes, and we now want to transfer these data to one new database on a new server for reporting purposes.
The message table in each OMU database keeps the message row until it is "Acknowledged", or for a maximum of fourteen days; then it is moved to a historic table where it stays for another three days. Keeping data for only seventeen days is a performance issue. The new "reporting database" is intended to hold message data for the last 90 days.
I wonder which method to use to move/replicate data from these databases. Materialized views using database links, with views on top of the MVs? How do I then keep rows longer than in the master (source) table, avoiding deletion when the master row is deleted?
Oracle Streams, with local capture and remote apply? How will this influence performance on the master databases? There are about 10,000 new messages in each OMU database every day. Is it possible to have four Streams connections against the reporting database?
Or should I simply use database triggers which fire after insert and update and apply the changes to the reporting database using database links (a rough sketch of that idea is below)?
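A minimal sketch of the trigger idea, assuming a message table OMU_MESSAGES and a database link REPDB to the reporting database (both names, and all column names, are made up for illustration). Inserts and updates are pushed row by row, and deletes are simply not propagated, which is how reporting rows can outlive the seventeen-day window on the source:

CREATE OR REPLACE TRIGGER omu_msg_to_reporting
AFTER INSERT OR UPDATE ON omu_messages
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    -- Push the new message to the reporting database over the link.
    INSERT INTO omu_messages@repdb (msg_id, node, severity, msg_text, received)
    VALUES (:NEW.msg_id, :NEW.node, :NEW.severity, :NEW.msg_text, :NEW.received);
  ELSE
    -- Keep the reporting copy in step when a message is updated/acknowledged.
    UPDATE omu_messages@repdb
       SET node     = :NEW.node,
           severity = :NEW.severity,
           msg_text = :NEW.msg_text,
           received = :NEW.received
     WHERE msg_id = :NEW.msg_id;
  END IF;
END;
/

The obvious drawback is that every OMU transaction then depends on the database link being available, which is exactly the coupling the MV and Streams options avoid.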
Till now my company has used SharePlex for replicating 6 DBs from different locations to one server (used for BI operations).
1) Replication is in real time
2) All DBs are Oracle but different versions, 9 and 10
3) We have horizontal and vertical replication. We replicate just some tables, for some tables just some columns, and for some tables just records matching some conditions (sketched below)
4) We have different charsets
5) SharePlex has a compare/repair tool to check whether the DBs are in sync
I started to look at Oracle Streams to use instead of SharePlex. Do you know if it is able to handle that type of replication? Do you know about any Oracle Streams limitation which would rule it out?
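For what it's worth, the horizontal and vertical cases in point 3 map onto Streams features; here is a hedged sketch (all object, rule and queue names are placeholders) of a row-subset capture rule plus a declarative transformation that drops a column:

BEGIN
  -- Horizontal: capture only the rows matching a condition for this table.
  DBMS_STREAMS_ADM.ADD_SUBSET_RULES(
    table_name    => 'scott.orders',
    dml_condition => 'region = ''EMEA''',
    streams_type  => 'capture',
    streams_name  => 'bi_capture',
    queue_name    => 'strmadmin.streams_queue');

  -- Vertical: strip a column from the captured LCRs with a declarative transformation.
  -- The rule name is a placeholder; use the actual name generated by ADD_SUBSET_RULES
  -- (visible in DBA_STREAMS_RULES).
  DBMS_STREAMS_ADM.DELETE_COLUMN(
    rule_name   => 'ORDERS_SUBSET_RULE',
    table_name  => 'scott.orders',
    column_name => 'credit_card_no');
END;
/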
I have a requirement where replication should be done between two 11gR2 RACs on Standard Edition. I have the following queries:
1. Does Standard Edition support DDL capture?
I'm asking this because on
[URL]......
it says "SE1/SE: no capture from redo". What does that really mean?
2. Is it possible to configure capture at the schema level and skip only some of the tables / triggers? (See the sketch below.)
Actually I am trying to replicate two DB servers, one in Hong Kong and another in China. When I try to establish the replication, I am getting the error 'ORA-04052: error occurred when looking up remote object', like this...
But the same approach works fine in my local network. I have tried schema replication through Enterprise Manager Grid Control.
We have a core banking database and this database includes our customer related tables and these tables are really huge.
And we have another database for some applications, and this database needs fresh customer data which resides in the core banking database. Our current method is materialized views, but we have some performance problems with it.
What can give better performance for synchronizing the tables between databases?
I'm trying to write a PL/SQL block to find all missing archived logs that are needed for Streams replication.
There is already an Oracle Metalink note for this, but it only gives the archive log name that contains my dba_capture.start_scn, and we still need to check whether the files exist on disk or not.
The problem here is that, when using ASM, the dba_registered_archived_log view is truncating the file name and it is really difficult to pinpoint the logs. So is it fine to join this view with v$archived_log? Would the DELETED and STATUS columns do the trick? I modified the PL/SQL as below. Is this fine/accurate?
dbms_output.put_line('Capture will restart from SCN ' || lScn || ' in the following file:');
for cr in (select decode(a.name, NULL, 'NOT FOUND', a.name) name,
                  to_char(a.completion_time, 'hh24:mi:ss') completion_time
             from v$archived_log a, dba_registered_archived_log b
            where lscn between b.first_scn and b.next_scn
              and a.deleted = 'YES'
              and a.status != 'A')
loop
  f_rec := 1;
  [code]......
Due to firewall restrictions, I have to configure Streams with queue forwarding. The intermediate database will hold two different IPs (one facing the source and another facing the target), and the source and target will not talk to each other directly.
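A hedged sketch of the forwarding leg that would run on the intermediate database in this layout (queue, link and propagation names are placeholders). With no rule set attached, everything that lands in the intermediate queue is forwarded on to the target queue:

BEGIN
  -- Forward all messages from the intermediate queue to the target database's queue.
  DBMS_PROPAGATION_ADM.CREATE_PROPAGATION(
    propagation_name   => 'fwd_to_target',
    source_queue       => 'strmadmin.intermediate_queue',
    destination_queue  => 'strmadmin.target_queue',
    destination_dblink => 'target_db',
    queue_to_queue     => TRUE);
END;
/

The source-side propagation then points at the intermediate queue instead of the target one, so the two end databases never need a direct route through the firewall.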
I create the following update conflict handlers, one after the other. These are working correctly; my problem is that when I create the 2nd one it replaces the first, and when I create the 3rd one it replaces the 2nd. I want to have all three in action simultaneously. How can I do this?
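My understanding (worth verifying against the documentation) is that DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER keeps one handler per distinct column_list for a given object, so a later call with the same column group replaces the earlier one, while calls naming different column groups coexist. A hedged sketch with placeholder table and column names:

DECLARE
  cols1 DBMS_UTILITY.NAME_ARRAY;
  cols2 DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols1(1) := 'salary';
  cols2(1) := 'bonus';

  -- Handler 1: for conflicts on SALARY, keep the incoming (new) value.
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'hr.employees',
    method_name       => 'OVERWRITE',
    resolution_column => 'salary',
    column_list       => cols1);

  -- Handler 2: for conflicts on BONUS, keep the existing (old) value.
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
    object_name       => 'hr.employees',
    method_name       => 'DISCARD',
    resolution_column => 'bonus',
    column_list       => cols2);
END;
/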
I am trying to set up Streams on 11.2.0.3 on a Windows 2008 R2 server. Due to an error in running propagation, I am trying to delete both the Capture and Apply queues. I have deleted the Apply queues but am unable to drop the Capture one.
SQL> EXEC DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'LAO_NLPG73_BLPU_APPLY');

SQL> select * from dba_apply;
LAO_NLPG73_BLPU_APPLY  NLPG73_BLPU_APPLY_Q  STRMADMIN  YES  RULESET$_732  STRMADMIN  STRMADMIN
301355  ABORTED  09/10/2013 17:34:21  1013  ORA-01013: user requested cancel of current operation
CAPTURED  STREAMS APPLY

SQL> select * from dba_queues;
STRMADMIN  NLPG73_BLPU_APPLY_Q
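A hedged sequence that usually clears a queue still referenced by an aborted apply process like the one shown above (the apply and queue names are taken from that output; the rest is an assumption, not a confirmed fix):

BEGIN
  -- Force the apply to stop even though it is in an ABORTED state.
  DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'LAO_NLPG73_BLPU_APPLY', force => TRUE);

  -- Drop the apply process so nothing references the queue any more.
  DBMS_APPLY_ADM.DROP_APPLY(apply_name            => 'LAO_NLPG73_BLPU_APPLY',
                            drop_unused_rule_sets => TRUE);

  -- Now the queue (and its queue table, if unused) can be removed.
  DBMS_STREAMS_ADM.REMOVE_QUEUE(queue_name              => 'STRMADMIN.NLPG73_BLPU_APPLY_Q',
                                cascade                 => FALSE,
                                drop_unused_queue_table => TRUE);
END;
/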
I have a test database with tables that occasionally have to be updated to the state of the production database tables. These databases both stream several tables to separate databases (i.e., database A streams tables to database X, and B to Y).
I export the needed schema daily from the production database "A". To update the test database "B", I import the needed tables from this export file. I then export the tables to be streamed from database B with the object_consistent=y parameter. I then import those tables to database Y using the streams_instantiation=y parameter.
So this is the process ...
1. Turn off capture on B
2. Turn off apply on Y
3. Drop the tables in B
4. Import tables from A to B
5. Prepare instantiation on tables in B (DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION)
6. Export from B with object_consistent=y
7. Drop tables in Y
8. Import from B to Y with streams_instantiation=y
9. Start apply on Y
10. Start capture on B
I ran this overnight through an OEM job and it appeared to work, just not how I expected. When I came in there was a lot of activity on both databases. All sessions were from the Streams admin account. This continued for some time (long after the job had finished). During this time the "captured scn", "applied scn" and "last enqueued scn" values were quite different on the source side (didn't jot those down). I compared these to the "last applied message number" on the target side and the values were quite a bit off.
After the activity subsided, the SCN values were all in the same ballpark. So I figure all the activity was changes getting applied from database B to database Y. I tested out the streaming and it's working fine.
I thought the parameters in the export and import statements were supposed to sync up the SCNs. That is I thought they would start off in the same state and there would be no need for applying changes. No one changed anything on the source database overnight.
I must not be understanding how the streams export/import parameters work.
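My hedged reading (not authoritative) is that object_consistent=y only makes the export consistent as of a single SCN and records that SCN, and streams_instantiation=y then uses it to set the instantiation SCN on import; the apply still has to work through every change captured after that SCN, which would explain the post-job activity. One way to see what each side thinks, assuming standard views on the destination:

-- On the destination (Y): what instantiation SCN does each table have,
-- and how far has each apply process got?
SELECT source_object_owner, source_object_name, instantiation_scn
  FROM dba_apply_instantiated_objects;

SELECT apply_name, source_database, applied_message_number
  FROM dba_apply_progress;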
We have configured a one-way Oracle Streams setup between two databases. The source database captures the changes (no downstream capture configured). The configuration was working fine, but the destination database was lagging very far behind, about 15 days behind the source database. We are OK with this, but the problem now is that, at the client's request, we restored a previous backup and opened the source database with the resetlogs option. After resetlogs, the archivelog sequence has changed and Streams is not working.
Can I apply the previous archivelogs (the ones from before the resetlogs) to the destination database anyway? The source database is a production database.
We are using oracle as database in our application. For high availability we have a cluster of multiple nodes and the data is replicated using oracle streams. All the nodes in the cluster are active. We do not have any concept of stand-by.
Now we are planning to use Oracle RMAN for backup and recovery. The RMAN user's guide doesn't recommend any strategies for such deployments; it mainly focuses on primary/standby deployments.
I have tried to install Oracle 10g on my Windows XP machine, but during the installation I got these errors:
"Enterprise manager configuration failed due to the following error. Error instanting EM configuration files. Refer to the log file U:\...\orcl6\emConfig.log for more details"
"VTA-1000 : The browser could not be launched with the file or
OK, I have a stored procedure that accesses another DB through a DB link. I can run select statements on the remote DB without issue, but I receive the ORA-04052 error when I try to do it through my stored procedure.
Here is the format of the query that it doesn't like:
SELECT STATUS INTO V_STATUS FROM myschema.ticket@remote.db.com WHERE ID = V_ID;
CREATE OR REPLACE PACKAGE BODY pkg_das_stag_to_master AS
  PROCEDURE sp_load_dasstage_security AS
    cursor cur_stag is
      select Asset_id,
             ID_ISIN,
             ID_SEDOL1,
             ID_CUSIP,
             [code].........
error:
66/35  PLS-00302: component 'ASSET_ID' must be declared
66/35  PLS-00302: component 'ASSET_ID' must be declared
66/18  PL/SQL: ORA-00904: "TYP_SECURITY_VAR"."ASSET_ID": invalid identifier
63/1   PL/SQL: SQL Statement ignored
I am new to Oracle and I have just installed 11g on my 64-bit laptop. After downloading and installing SQL Developer I can no longer log in using DBSNMP and my specified password, either from SQL*Plus or Developer, getting this error message: ERROR: ORA-01033: ORACLE initialization or shutdown in progress.