ORA-26713 Streams Apply Error And Gateway
Mar 27, 2013
My setup is as follows:
host22 (Oracle Linux 5.9)                                |  host30 (Windows XP) with SQL Server Express 2008
DB22                           Gateway to MSSQL          |  DB30
Local Capture Process --> Local Apply Process --> Gateway --------> DUMMYUSR.DUMMYTBL
The capture process on DB22 captures changes made to the local DUMMYUSR.DUMMYTBL table, enqueues them into the capture queue and propagates them to the apply queue of the local apply process, which, using the database link DB30, is supposed to apply these changes to the table DUMMYUSR.DUMMYTBL on the remote SQL Server database DB30.
The problem is that when the apply process executes a transaction, it aborts with the error:
ORA-26713: remote object does not exist or is inaccessible.
ORA-26712: remote object is "DUMMYUSR"."DUMMYTBL"@"DB30".
In the trace log I can see the following message:
kniNoPrimaryKeyError: ORA-26713, remote obj doesn't exist,
DUMMYUSR.DUMMYTBL@DB30
[code]...
What can cause the ORA-26713 error when the apply process applies changes to a SQL Server database through Oracle Database Gateway, if the object to be modified exists and is accessible from the Oracle database for the strmadmin user?
By the way, the database link and tables were created with the following commands:
DB22 (Oracle), connected as strmadmin:
create database link DB30 connect to "dummyusr" identified by "dummypwd" using 'DB30';
DB22 (Oracle), connected as dummyusr:
create table DUMMYTBL (no number, name char(6), ddate date, payload varchar2(2013));
DB30 (MSSQL), connected as dummyusr:
create schema DUMMYUSR authorization dummyusr;
create table DUMMYUSR.DUMMYTBL (no float(53), name char(6), ddate datetime2(0), payload varchar(2013));
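Since the trace mentions kniNoPrimaryKeyError and neither CREATE TABLE above defines a primary key, I am wondering whether the apply process needs substitute key columns declared for the remote table. A minimal sketch of what I understand that would look like (the column choice and the apply_database_link parameter are assumptions on my part, not something I have verified):

-- Sketch only: verify the remote table is reachable over the gateway link
SELECT COUNT(*) FROM "DUMMYUSR"."DUMMYTBL"@DB30;

-- Sketch only: declare substitute key columns for the remote apply,
-- since the remote table has no primary key
BEGIN
  DBMS_APPLY_ADM.SET_KEY_COLUMNS(
    object_name         => 'DUMMYUSR.DUMMYTBL',
    column_list         => 'NO',
    apply_database_link => 'DB30');
END;
/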
View 1 Replies
Nov 1, 2013
A Streams apply process which applies to a SQL Server database is increasing its PGA use continually until I stop the process and restart it. I need to stop it once every week or it will use too much of the PGA and the database will hang, causing paging etc.
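The weekly workaround is just a manual stop and restart of the apply process, roughly like this (a sketch; the apply process name is a placeholder, not my real one):

-- Sketch only: bounce the apply process to release its PGA
BEGIN
  DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'MSSQL_APPLY');
  DBMS_APPLY_ADM.START_APPLY(apply_name => 'MSSQL_APPLY');
END;
/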
View 1 Replies
View Related
Oct 10, 2013
I am trying to set up Streams on 11.2.0.3 on a Windows 2008 R2 server. Due to an error in running propagation, I am trying to delete both the Capture and Apply queues. I have deleted the Apply queue but am unable to drop the Capture one.
SQL> EXEC DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'LAO_NLPG73_BLPU_APPLY');
SQL> select * from dba_apply;
LAO_NLPG73_BLPU_APPLY NLPG73_BLPU_APPLY_Q STRMADMIN YES RULESET$_732 STRMADMIN STRMADMIN 301355 ABORTED 09/10/2013 17:34:21 1013
ORA-01013: user requested cancel of current operation
CAPTURED STREAMS APPLY
SQL> select * from dba_queues;
STRMADMIN NLPG73_BLPU_APPLY_Q
[code]...
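For reference, the sequence I would expect to need is roughly the following (a sketch only; the capture and queue names are placeholders, since the real capture name is not shown above):

-- Sketch only: stop and drop the capture process, then remove its queue
BEGIN
  DBMS_CAPTURE_ADM.STOP_CAPTURE(capture_name => 'LAO_NLPG73_BLPU_CAPTURE');
  DBMS_CAPTURE_ADM.DROP_CAPTURE(capture_name          => 'LAO_NLPG73_BLPU_CAPTURE',
                                drop_unused_rule_sets => TRUE);
  DBMS_STREAMS_ADM.REMOVE_QUEUE(queue_name              => 'STRMADMIN.NLPG73_BLPU_CAPTURE_Q',
                                cascade                 => FALSE,
                                drop_unused_queue_table => TRUE);
END;
/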
View 5 Replies
View Related
Feb 9, 2013
Database Version - 10.2.0.4.0
OS - Red Hat Linux
We have configured Oracle one-way Streams between two databases. The source database is capturing the changes (no downstream capture configured). The configuration was working fine, but the destination database was lagging behind very much, i.e. about 15 days behind the source database. We are OK with this, but the problem now is that, as per a client request, we have restored a previous backup and opened the source database with the resetlogs option. After resetlogs the archivelog sequence has changed and Streams is not working.
Can I apply the previous archivelogs (the ones from before the resetlogs) to the destination database anyway? The source database is a production database.
View 1 Replies
View Related
Oct 4, 2010
Platform: Windows 2003
Streams Set up: One way Streaming at table level
The error:
ORA-26786: A row with key ("REPT_NUM", "STATE_CODE", "SURVEY_ID") = (067305669, 49, J) exists but has conflicting column(s) "DATE_TIME", "PREV_PARENT_ID" in table TOPCATI_JOLTS.UNIT
ORA-01403: no data found
We are consistently getting this error every other day, and some weeks more often, with different records of course. On the capture side the application does a process called split cases. In this process the application takes an old PK case number and inserts a new PK case number with all of the data of the old case (parent level). At the unit (child) level the application changes all of the units to this new FK case number. This means the old case (parent) is left with no child units. This is all one transaction.
Is it possible that Streams may be applying the LCRs out of order, especially since that whole process is one transaction?
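One thing I have been looking at, though this is only a guess on my part, is the apply parameter commit_serialization, which controls whether non-dependent transactions may commit in a different order than on the source. A sketch of how to inspect and force full ordering (the apply name is a placeholder):

-- Sketch only: check and, if needed, force full commit ordering
SELECT parameter, value
  FROM dba_apply_parameters
 WHERE apply_name = 'JOLTS_APPLY'
   AND parameter  = 'COMMIT_SERIALIZATION';

BEGIN
  DBMS_APPLY_ADM.SET_PARAMETER(apply_name => 'JOLTS_APPLY',
                               parameter  => 'commit_serialization',
                               value      => 'full');
END;
/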
View 2 Replies
View Related
Jun 26, 2013
I encountered the following error while trying to set up Streams replication at the database level using dbms_streams_adm.maintain_global. Desmond
begin
*
ERROR at line 1:
ORA-23616: Failure in executing block 6 for script
E00C49DDDB27C899E040A8C04C0119DA with
ORA-06550: line 21, column 3:
PL/SQL: ORA-00942: table or view does not exist
ORA-06550: line 21, column 3:
PL/SQL: SQL Statement ignored
ORA-06550: line 23, column 3:
PL/SQL: ORA-00942: table or view
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 659
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 682
ORA-06512: at "SYS.DBMS_STREAMS_MT", line 2427
ORA-06512: at "SYS.DBMS_STREAMS_ADM", line 3004
ORA-06512: at line 2
SQL> select forward_block from dba_recoverable_script_blocks where script_id = '
[code]....
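For context, the call was essentially the standard MAINTAIN_GLOBAL invocation; the sketch below only shows its general shape, with placeholder directory objects and database names rather than my real values:

-- Sketch only: database-level Streams setup (directory objects and TNS names are placeholders)
BEGIN
  DBMS_STREAMS_ADM.MAINTAIN_GLOBAL(
    source_directory_object      => 'SOURCE_DIR',
    destination_directory_object => 'DEST_DIR',
    source_database              => 'SRCDB',
    destination_database         => 'DSTDB',
    perform_actions              => TRUE,
    include_ddl                  => TRUE);
END;
/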
View 1 Replies
View Related
Aug 29, 2011
I have an 11g RAC database on a Solaris 10 64-bit server. When I start the messaging gateway agent on node 1, it goes fine. But on node 2, I am getting the following error:
java.lang.UnsatisfiedLinkError: /usr/app/oracle/product/11.1.0/lib/libocijdbc11.so: Can't load Sparc v9 64-bit .so on a Sparc 32-bit platform
I tried note 467166.1 but it didn't sort out the issue.
View 1 Replies
View Related
Mar 11, 2013
I want to connect Oracle 11gR2, which is on Linux, with an MS SQL Server. Do I need to use some third-party drivers like Easysoft? Can I configure the database gateway without any third-party drivers?
Secondly, if my Oracle is on Linux, can I configure the gateway on a separate Windows machine (that will connect to the Linux Oracle and the MS SQL Server)? I have found the gateway installation setup only for the Windows version.
View 3 Replies
View Related
Sep 6, 2009
I'm trying to connect from Oracle 11g (11.2.0.1.0) to Sybase Adaptive Server Anywhere (9.0.2.3527). My Oracle environment is running on Linux CentOS 5.3; the Sybase database runs on a Windows server.
All my attempts have failed so far - both through Oracle Database Gateway for Sybase and via Oracle Database Gateway for ODBC in combination with freeTDS. Using either way I'm ending up with apparently the same error:
ASA Error -611: Transact-SQL feature not supported
When I use db link created via dg4sybs (Oracle Database Gateway for Sybase) I get this:
SQL> SELECT * FROM aaa@hvx;
SELECT * FROM aaa@hvx
*
ERROR at line 1:
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[code]....
When I use db link created via dg4odbc (Oracle Database Gateway for ODBC) and freeTDS driver I get this:
SQL> SELECT * FROM aaa@hvx1;
SELECT * FROM aaa@hvx1
*
ERROR at line 1:
ORA-28511: lost RPC connection to heterogeneous remote agent using
SID=ORA-28511: lost RPC connection to heterogeneous remote agent using
[code]....
It seems both drivers use Transact-SQL instructions which are then rejected by ASA. I'm not familiar with Sybase products at all, but as far as my googling revealed, Transact-SQL is only supported in Adaptive Server Enterprise (the enterprise-class version of Sybase's database). I couldn't figure out whether there is a way to disable Transact-SQL in the driver configuration. How can I connect from Oracle (10g or 11g) to Sybase Adaptive Server Anywhere?
View 3 Replies
View Related
Feb 11, 2011
I have trouble reading a field called "Description" from a SQL Server 2008 DB using the gateway:
SQL> select "Description"
from opportunity@mscrm
select "Description"
from opportunity@mscrm
*
Error at line 1
ORA-00904: "Description": invalid identifier
My first thought was that "Description" might be a reserved word in Oracle, but it's not:
SQL> create table delete_this ("Description" varchar2 (50 char))
Table created.
SQL> select "Description" from delete_this
no rows selected.
So everything is fine here. Then I thought that I might have misspelled the column name, but no:
SQL> select column_name
from v_mscrm_tab_columns
where table_name = 'opportunity'
and column_name like 'Desc%'
[code]....
View 3 Replies
View Related
Dec 2, 2010
We are planning on interfacing Oracle with SQL Server, and I'm researching how to create a dblink to SQL Server 2005. I'm a little confused between Generic Connectivity and Transparent Gateways.
I think the Transparent Gateway is not free and needs to be licensed; is it better in terms of performance? Also, do I need to configure this gateway on a separate server?
I think Generic Connectivity can be configured on the same Oracle database server itself, but I see there could be some performance issues with it.
We are running on Oracle 10.2; it looks like HSODBC is not available in 10g?
View 2 Replies
View Related
Dec 5, 2011
Where can I download the Oracle gateway for SQL Server? I can't find the download web site.
View 4 Replies
View Related
Sep 4, 2012
We have a Microsoft SQL Server, an Oracle gateway server (11g) and a database server (10.2.0.5). We have already done the full setup, and it's working.
We created a materialized view in the Oracle database server to read data from SQL Server, but the MV refresh takes a long time, so we are planning to create the materialized view on the Oracle gateway server.
Can we create the MV on the Oracle gateway server? I mean, I don't want to install an Oracle database instance for only one MV.
View 5 Replies
View Related
Nov 21, 2010
I get the following error while running a report in ASP:
Error Code 1460: Timeout
Background: The gateway could not receive a timely response from the website you are trying to access, a DNS server, or another gateway server. This might indicate that the network is congested or that the website is experiencing technical difficulties.
View 4 Replies
View Related
Jan 27, 2013
Oracle 11gR2
Linux 6.x
Sybase 12 (on Windows)
unixODBC 3.2.x
Sybase Client 12
------------------------
Given the following files or structures that are important to the proper operation of the ODBC Gateway connecting to a remote Sybase database:
listener.ora
tnsnames.ora
initmydblink.ora ( my version of the initdg4odbc.ini )
dblink (not a file but the dblink in the Oracle database)
odbc.ini
odbcinst.ini
What files (and what properties within those files) may or must be changed if the remote Sybase db and system are different (i.e. the Sybase database source has been changed to something 'new')? For example, I can see that the odbc.ini file is going to have to be altered to point to the new database server IP address, database port and database name.
View 1 Replies
View Related
Jan 24, 2013
What are the pros and cons of using the APEX Listener vs the Embedded PL/SQL Gateway?
We just started using APEX and on a development server, we set it up using the EPG because this seemed like the easier thing to do. However, they now have a couple apps that are approaching production status and I now need to look at placing the APEX runtime on our production Oracle Database server. So I am trying to decide if I should continue using the EPG or if I should use the APEX Listener.
The server is Windows Server 2008 Enterprise R2 (64 Bit) with Oracle 11g R2 (11.2.0.3.0) also running at 64 bit. I am planning on using APEX 4.2.1.
I have also not found information about how to move APEX application from development to production. Do I export the schema from the dev server and import it to the production server? Do I have to recreate the same workspace name on the production server as well?
Should I use a separate dev and production server setup for APEX?
View 3 Replies
View Related
Dec 5, 2011
How can I run DML statements (such as INSERT, UPDATE, DELETE) through the Oracle Transparent Gateway for SQL Server?
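As far as I understand it, DML over the gateway is written against the database link just like a query; a minimal sketch (the link and table names are placeholders):

-- Sketch only: DML against a SQL Server table through a gateway database link
INSERT INTO emp@mssql_link (empno, ename) VALUES (1, 'SCOTT');
UPDATE emp@mssql_link SET ename = 'SMITH' WHERE empno = 1;
DELETE FROM emp@mssql_link WHERE empno = 1;
COMMIT;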
View 4 Replies
View Related
Feb 3, 2012
Source DB is logical standby 10.2.0.5
Destination DB is 11.2.0.3
Is it possible to set up streams on the logical DB to replicate some schemas to the 11gR2 DB?
View 10 Replies
View Related
Apr 18, 2013
I am trying to implement Oracle Streams replication (using Metalink note 733691.1). I have configured the steps, but in my alert log I am getting the below error:
Check that the primary and standby are using a password file and remote_login_passwordfile is set to SHARED or EXCLUSIVE, and that the SYS password is same in the password files. returning error ORA-16191
I am successfully able to connect to the DBs from both servers:
From 1st server
sqlplus sys@2nddb as sysdba
From 2nd server
sqlplus sys@1stdb as sysdba
While connecting it asks for the password.
Both DBs were created with the same Oracle SYS user password. After this I disabled case sensitivity, but still the error persists.
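The checks I have in mind on both sides are roughly these (a sketch; the TNS alias is a placeholder):

-- Sketch only: confirm password-file authentication is in place on both databases
SHOW PARAMETER remote_login_passwordfile    -- expect EXCLUSIVE or SHARED
SELECT * FROM v$pwfile_users;               -- SYS should be listed with SYSDBA
SHOW PARAMETER sec_case_sensitive_logon     -- 11g password case-sensitivity switch

-- Sketch only: connection test to the other database as SYS
CONNECT sys@otherdb AS SYSDBA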
View 1 Replies
View Related
Mar 26, 2013
I am trying to set up a one-way queue that will be consumed by an in-house ESB. We are currently able to create a multi-consumer queue using:
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(queue_table => 'strmadmin.streams_queue_table',
                                queue_name  => 'strmadmin.streams_queue');
END;
But unfortunately the ESB that we are using relies on a JMS component which seems to only be able to consume from a single-consumer queue. So we have created our queue using the following:
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(queue_table        => '"STRMADMIN"."STREAMS_QUEUE_TABLE"',
                                queue_payload_type => 'SYS.ANYDATA',
                                storage_clause     => 'PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 TABLESPACE
[code]...
This therefore gives us a queue/queue table which pushes data to a single consumer. The problem comes when we try to add a table rule using the following command:
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(table_name   => 'source.t1',
                                   streams_type => 'apply',
                                   streams_name => 'streams_apply',
                                   queue_name   => 'strmadmin.streams_queue',
[code]...
We then get the following error:
ORA-06512: at line 2
24039. 00000 - "Queue %s not created in queue table for multiple consumers"
*Cause: Either an ADD_SUBSCRIBER, ALTER_SUBSCRIBER, or REMOVE_SUBSCRIBER procedure, or an ENQUEUE with a non-empty recipient list, was issued on a queue that was not created for multiple consumers.
*Action: Create the queue in a queue table that was created for multiple consumers and retry the call.
We are able to create the capture rule without any problem, but without the apply rule nothing seems to end up in the queue table. Plain AQ is not a viable solution since it is troublesome when it comes to deletes and mass updates.
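For comparison, the multi-consumer variant that the error's Action text points at would look roughly like the sketch below, which is exactly what we are trying to avoid for the JMS component (the names reuse our existing objects only by assumption):

-- Sketch only: the same queue table, but created for multiple consumers
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(queue_table        => '"STRMADMIN"."STREAMS_QUEUE_TABLE"',
                                queue_payload_type => 'SYS.ANYDATA',
                                multiple_consumers => TRUE);
  DBMS_AQADM.CREATE_QUEUE(queue_name  => 'STRMADMIN.STREAMS_QUEUE',
                          queue_table => '"STRMADMIN"."STREAMS_QUEUE_TABLE"');
  DBMS_AQADM.START_QUEUE(queue_name   => 'STRMADMIN.STREAMS_QUEUE');
END;
/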
View 1 Replies
View Related
Jan 28, 2013
We have a core banking database, and this database includes our customer-related tables, which are really huge.
And we have another database for some applications, and this database needs fresh customer data that lives in the core banking database. Our current method is materialized views, but we have some performance problems with it.
What can give better performance for synchronizing the tables between databases?
-oracle streams?
-triggers?
-dbms_comparison?
View 3 Replies
View Related
Mar 2, 2012
I'm trying to write a PL/SQL block to find all missing archived logs that are needed for Streams replication.
There is already an Oracle Metalink note for this, but it gives only the archive log name that contains my dba_capture.start_scn, and we need to check whether the files exist on disk or not.
The problem here is that, when using ASM, the dba_registered_archived_log view truncates the file name and it is really difficult to pinpoint the logs. So is it fine to join this view with v$archived_log? Would the deleted and status columns do the trick? I modified the PL/SQL as below. Is this fine/accurate?
dbms_output.put_line('Capture will restart from SCN ' || lScn || ' in the following file:');
for cr in
  (select decode(a.name, NULL, 'NOT FOUND', a.name) name,
          to_char(a.completion_time, 'hh24:mi:ss') completion_time
     from v$archived_log a, dba_registered_archived_log b
    where lScn between b.first_scn and b.next_scn
      and a.deleted = 'YES' and a.status != 'A')
loop
  f_rec := 1;
[code]......
View 1 Replies
View Related
Mar 21, 2011
Can we use materialized views in Streams the same way we use tables?
View 11 Replies
View Related
Jun 2, 2010
This topic was already discussed in "Physical standby on oracle 11g - Reporting needs" but was never concluded.
View 5 Replies
View Related
May 27, 2011
We have three Unix servers with four databases (10gR2) containing "HP Operation Management Unix" (OMU) server messages for monitoring purposes, and we now want to transfer these data to one new database on a new server for reporting purposes.
The message table in each OMU database keeps the message row until it is "Acknowledged" or for a maximum of fourteen days, then it is moved to a historic table where it stays for another three days. Keeping data for only seventeen days is a performance issue. The new "Reporting database" is intended to hold message data for the last 90 days.
I wonder which method to use to move/replicate data from these databases. Materialized views using database links, with a view on top of the MVs? How would I keep rows longer than in the master (source) table, avoiding deletion when the master row is deleted?
Oracle Streams, with local capture and remote apply? How will this influence the master database performance? There are about 10000 new messages in each OMU database every day. Is it possible to have four Streams connections against the reporting database?
Or should I simply use database triggers which fire after insert and update and apply changes to the reporting database using database links?
View 2 Replies
View Related
Aug 31, 2011
Until now my company has used SharePlex for replicating 6 DBs from different locations to one server (used for BI operations).
1) Replication is in real time.
2) All DBs are Oracle but different versions, 9 and 10.
3) We have horizontal and vertical replication. We replicate just some tables, for some tables just some columns, and for some tables just records that meet some conditions.
4) We have different charsets.
5) SharePlex has a compare/repair tool to check whether the DBs are in sync.
I have started to look at Oracle Streams to use instead of SharePlex. Do you know if it is able to handle these types of replication? Do you know about any Oracle Streams limitation which would rule it out?
View 5 Replies
View Related
Jan 21, 2013
Due to firewall restrictions, I have to configure Streams with queue forwarding. The intermediate database will hold two different IPs (one towards the source and another towards the target), and the source and target will not talk to each other.
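As I understand it, queue forwarding means the intermediate database simply propagates the LCRs from its inbound queue on to the target's queue; a rough sketch of that second hop (all queue names and the database link are placeholders):

-- Sketch only: propagation from the intermediate queue to the target queue
BEGIN
  DBMS_PROPAGATION_ADM.CREATE_PROPAGATION(
    propagation_name   => 'FWD_TO_TARGET',
    source_queue       => 'STRMADMIN.INTERMEDIATE_Q',
    destination_queue  => 'STRMADMIN.TARGET_Q',
    destination_dblink => 'TARGETDB',
    queue_to_queue     => TRUE);
END;
/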
View 1 Replies
View Related
Aug 15, 2012
I create the following update conflict handlers, one after the other. They work correctly; my problem is that when I create the 2nd one it replaces the first, and when I create the 3rd one it replaces the 2nd. I want to have all three in action simultaneously. How can I do this?
DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'MODIFIED_DATE';
  cols(2) := 'MODIFIED_BY';
  cols(3) := 'RATE_CHANGE_DATE';
  cols(4) := 'PENSION_AMOUNT';
  cols(5) := 'EASYPAISA_WORK_ON';
  cols(6) := 'PPO_NO';
  cols(7) := 'NEW_NIC_NO';
  cols(8) := 'OLD_NIC_NO';
  cols(9) := 'CLAIM_NO';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(object_name       => 'CORE_BUSINESS.CB_PENSIONER',
                                             method_name       => 'OVERWRITE',
                                             resolution_column => 'CLAIM_NO',
                                             column_list       => cols);
END;

DECLARE
  cols DBMS_UTILITY.NAME_ARRAY;
BEGIN
  cols(1) := 'MODIFIED_DATE';
  cols(2) := 'MODIFIED_BY';
  cols(3) := 'RATE_CHANGE_DATE';
  cols(4) := 'PENSION_AMOUNT';
  cols(5) := 'CLAIM_NO';
  DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(object_name =>
[code]....
View 3 Replies
View Related
Nov 30, 2011
I have a test database with tables that occasionally have to be updated to the state of the production database tables. These databases both stream several tables to separate databases (i.e. database A streams tables to database X, and B to Y).
I export the needed schema daily from the production database "A". To update the test database "B", I import the needed tables from this export file. I then export the tables to be streamed from database B with the object_consistent=y parameter. I then import those tables to database Y using the streams_instantiation=y parameter.
So this is the process ...
1. Turn off capture on B
2. Turn off apply on Y
3. Drop the tables in B
4. Import tables from A to B
5. Prepare instantiation on tables in B (DBMS_CAPTURE_ADM.PREPARE_TABLE_INSTANTIATION)
6. Export from B with object_consistent=y
7. Drop tables in Y
8. Import from B to Y with streams_instantiation=y
9. Start apply on Y
10. Start capture on B
I ran this overnight through an OEM job and it appeared to work, just not how I expected. When I came in there was a lot of activity on both databases. All sessions were from the Streams admin account. This continued for some time (long after the job had finished). During this time the "captured scn", "applied scn" & "last enqueued scn" values were quite different on the source side (I didn't jot those down). I compared these to the "last applied message number" on the target side and the values were quite a bit off.
After the activity subsided, the SCN values were all in the same ballpark. So I figure all the activity was changes getting applied from database B to database Y. I tested out the streaming and it's working fine.
I thought the parameters in the export and import statements were supposed to sync up the SCNs. That is I thought they would start off in the same state and there would be no need for applying changes. No one changed anything on the source database overnight.
I must not be understanding how the streams export/import parameters work.
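My mental model of what streams_instantiation=y does is roughly the manual instantiation below; this is a sketch of my understanding only, with placeholder table names, not something I have verified:

-- Sketch only: run on source B around export time, to get a consistent SCN
SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER() FROM dual;

-- Sketch only: run on target Y, so apply skips LCRs older than the imported data
BEGIN
  DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
    source_object_name   => 'SCOTT.EMP',
    source_database_name => 'B',
    instantiation_scn    => 1234567);   -- the SCN captured above (placeholder)
END;
/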
View 5 Replies
View Related
Oct 8, 2013
I have a requirement where replication should be done between two 11gR2 RAC databases on Standard Edition. I have the following queries:
1. Does Standard Edition support DDL capture?
I'm asking this because on
[URL]......
it says "SE1/SE: no capture from redo". What does that really mean?
2. Is it possible to configure capture at schema level and skip only some of the tables / triggers? (A sketch of what I have in mind follows below.)
View 2 Replies
View Related