Replication :: Oracle 8i - Web Application Working With Replicated Database
Feb 26, 2008
I'm looking for solutions to do real-time replication from an Oracle 8i database.
For more details :
- we have an existing database with Oracle 8i
- we want to replicate this database: the replicated database will be used by a web application.
- we need the replication to be in real-time (we wish to have the shortest lag)
So far I haven't found much information on the web, and it seems that replication options for version 8i are quite limited.
Some more details :
- Management insists that the web application work with a replicated database, without interacting directly with the original one.
- An Oracle upgrade is planned, but only within a few years, so we are trying to find a solution quickly on 8i.
I would like to know which replication method is fast and what the best approach is. We need two schemas to be moved/replicated to a new reporting database. It appears the data will flow in one direction only. Should we proceed with materialized view replication, or should we consider Oracle Streams or Advanced Replication? What are the factors for deciding on a replication method?
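Since the data flows one way only, a fast-refresh materialized view per table is usually the lightest-weight fit for a reporting copy. A minimal sketch, assuming a source table APP.ORDERS, a database link SRC_LINK from the reporting database back to the source, and a 5-minute refresh interval (all of these names are placeholders, not taken from the post):

-- On the source database: create a log so the MV can fast refresh
CREATE MATERIALIZED VIEW LOG ON app.orders WITH PRIMARY KEY;

-- On the reporting database: pull changes over the link on a schedule
CREATE DATABASE LINK src_link CONNECT TO app IDENTIFIED BY app USING 'SRCDB';

CREATE MATERIALIZED VIEW rep.orders_mv
  REFRESH FAST
  START WITH SYSDATE NEXT SYSDATE + 5/1440   -- refresh every 5 minutes
  AS SELECT * FROM app.orders@src_link;

As a rough rule of thumb: Streams (9.2 onwards) gives lower latency and can replicate whole schemas including DDL, but it needs more administration; Advanced Replication is aimed at multimaster and updatable-snapshot setups. For a strictly one-way reporting feed where a lag of minutes is acceptable, materialized views are usually the simplest choice.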
I was considering a solution to maintain a replicated copy of a database in a remote office. However, we are using the SE One edition of Oracle, so native support for Data Guard is not available. There should certainly be scripted solutions for this task, but I haven't found any to date.
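A home-grown standby on Standard Edition usually comes down to restoring a backup of the primary on the remote server, mounting it with a standby controlfile, shipping archived logs by script (scp, rsync, robocopy and the like), and applying them manually. A rough SQL*Plus sketch of the apply side only, assuming the archived logs have already been copied into the standby's archive destination (all paths and the scheduling mechanism are assumptions, not a tested procedure):

-- One-time, on the primary: create a standby controlfile to copy to the remote server
ALTER DATABASE CREATE STANDBY CONTROLFILE AS '/tmp/standby.ctl';

-- Repeated by the apply script on the standby (e.g. from cron) after new logs arrive:
STARTUP MOUNT;                 -- mounted using the standby controlfile
SET AUTORECOVERY ON;
RECOVER STANDBY DATABASE;      -- applies whatever archived logs are present, then errors when it runs out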
I have created a replication process for a single table using the script below.
connect sys/afccv@afccv as sysdba
show parameter open_cursor
create user STRMADMIN identified by STRMADMIN;
ALTER USER STRMADMIN DEFAULT TABLESPACE USERS TEMPORARY TABLESPACE TEMP QUOTA UNLIMITED ON USERS;
GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE, DBA to STRMADMIN;
execute DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE('STRMADMIN');
connect sys/vxmldb@vxmldb as sysdba
[code]....
According to the alert log file, the replication capture process started (refer to the alert log content below):
Wed Oct 05 12:45:53 2011
Streams CAPTURE CP01 for STRMADMIN_CAPTURE started with pid=256, OS id=8692
Starting persistent Logminer Session with sid = 41 for Streams Capture STRMADMIN_CAPTURE
Wed Oct 05 12:46:20 2011
LOGMINER: Parameters summary for session# = 41
[code]....
Now when I check the apply status, it shows a dequeue message:
SQL> l
  1* select apply_name, state, dequeue_time from V$STREAMS_APPLY_READER
SQL> /

APPLY_NAME                     STATE             DEQUEUE_TIME
------------------------------ ----------------- --------------------
STRMADMIN_APPLY                DEQUEUE MESSAGES
But when I check the number of rows in both tables (source and destination), nothing is being applied at the destination.
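"DEQUEUE MESSAGES" only means the apply reader is waiting for work, so it is worth checking whether the capture side is actually sending changes and whether the apply process has silently parked transactions in its error queue. A few diagnostic queries against the standard Streams views (run on the destination unless noted):

-- Has the apply process hit errors and parked the transactions?
SELECT apply_name, source_database, local_transaction_id, error_message
  FROM dba_apply_error;

-- Is the coordinator actually receiving and applying transactions?
SELECT apply_name, state, total_received, total_applied, total_errors
  FROM v$streams_apply_coordinator;

-- On the source: is capture enqueuing messages at all?
SELECT capture_name, state, total_messages_captured, total_messages_enqueued
  FROM v$streams_capture;

One common cause of "everything is running but nothing applies" is that the instantiation SCN was never set for the destination table (DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN), in which case the apply process simply ignores the incoming changes for that table.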
I need to set up a central server with all the master tables and two other local databases which will hold updatable materialized views of the master tables. The databases must be synchronized with the central server, and users will work on the materialized view databases.
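The usual pattern for that topology is a master site on the central server and updatable materialized views at each local site, refreshed through a refresh group so local changes are pushed up and master changes pulled down in one operation. A minimal sketch, assuming a master table HQ.CUSTOMERS and a link CENTRAL_LINK from each local site to the central server (names are placeholders):

-- On the central (master) site:
CREATE MATERIALIZED VIEW LOG ON hq.customers WITH PRIMARY KEY;

-- On each local site:
CREATE MATERIALIZED VIEW hq.customers
  REFRESH FAST
  FOR UPDATE
  AS SELECT * FROM hq.customers@central_link;

BEGIN
  DBMS_REFRESH.MAKE(
    name      => 'local_grp',
    list      => 'hq.customers',
    next_date => SYSDATE,
    interval  => 'SYSDATE + 15/1440');   -- synchronise every 15 minutes
END;
/

Note that updatable materialized views only push their changes back to the master if the MV site is registered in a materialized view group (DBMS_REPCAT.CREATE_MVIEW_REPGROUP / CREATE_MVIEW_REPOBJECT) against a master group on the central server; the statements above only cover the refresh plumbing.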
I have enabled SSO for my application. It was working in version 4.1. I have upgraded to 4.2 now and the same code no longer works. I have created an authentication scheme, and in the authorization scheme I check whether a particular person is an employee based on a flag; if they are, I return 1. It now always returns 0 even when the flag is set to 'Y'. I have tried hard-coding the user_id as well; it doesn't work.
I tried printing :APP_USER but it does not give any output.
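Without seeing the scheme it is hard to say why it returns 0, but one thing that bites some SSO upgrades is the case and source of :APP_USER. Since your scheme returns 1/0 it may be an "Exists SQL query" type; the equivalent check written as a "PL/SQL Function Returning Boolean" authorization scheme would look roughly like this (EMPLOYEES table and the USER_ID / ACTIVE_FLAG columns are assumptions, not your actual names):

-- Authorization scheme: PL/SQL Function Returning Boolean
DECLARE
  l_flag employees.active_flag%TYPE;
BEGIN
  SELECT active_flag
    INTO l_flag
    FROM employees
   WHERE UPPER(user_id) = UPPER(:APP_USER);   -- compare case-insensitively
  RETURN l_flag = 'Y';
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN FALSE;
END;

If :APP_USER really prints as empty, the upgraded SSO authentication scheme is not setting the user at all, so no authorization scheme can pass; check the sentry/post-authentication settings of the scheme before debugging the flag lookup.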
We are facing a surprising problem with an Oracle 10g database. Previously we were able to connect to our Oracle 10g database using OS authentication with the "sqlplus / as sysdba" command. Last Wednesday the maximum number of processes on our Linux server was exceeded and we had to increase the server's soft limit. After that, without restarting the database, all applications [OID 10g] kept working fine, but we are no longer able to connect using OS authentication. It shows the following:
$ export ORACLE_HOME=/a01/OID
$ export ORACLE_SID=OID
$ export PATH=$PATH:/a01/OID/bin
$ sqlplus / as sysdba

SQL*Plus: Release 10.1.0.5.0 - Production on Tue Sep 10 06:45:08 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Connected to an idle instance.

SQL>
Whereas I can connect to the instance after providing @OID [SID]:

$ sqlplus sys@OID as sysdba

SQL*Plus: Release 10.1.0.5.0 - Production on Tue Sep 10 06:47:07 2013
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Enter password:
Connected to:
Oracle Database 10g Enterprise Edition Release 10.1.0.5.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL>

What am I missing? How can I connect to the instance with the "sqlplus / as sysdba" command?
We have three Unix servers with four databases (10gR2) containing "HP Operation Management Unix" (OMU) server messages for monitoring purposes, and we now want to transfer these data to one new database on a new server for reporting purposes.
The message table in each OMU database keeps a message row until it is acknowledged, or for a maximum of fourteen days; then it is moved to a historic table where it stays for another three days. Keeping data for only seventeen days is a performance constraint. The new reporting database is intended to hold message data for the last 90 days.
I wonder which method to use to move/replicate the data between the databases. Materialized views over database links, with views on top of the MVs? How do I keep rows longer than in the master (source) table, avoiding deletion when the master row is deleted?
Oracle Streams, with local capture and remote apply? How will this affect the master database's performance? There are about 10,000 new messages in each OMU database every day. Is it possible to have four Streams connections into the reporting database?
Or should I simply use database triggers which fire after insert and update and apply the changes to the reporting database using database links, as in the sketch below?
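The trigger approach is workable at these volumes (about 10,000 rows per day per source), but it makes every OMU insert/update synchronous with the remote database and will fail or hang if the reporting database or the link is down. A minimal sketch of what such a trigger could look like, assuming a message table OMU.MESSAGES keyed by MESSAGE_ID and a link RPT_LINK to the reporting database (all names and columns are placeholders):

CREATE OR REPLACE TRIGGER omu.messages_to_rpt
AFTER INSERT OR UPDATE ON omu.messages
FOR EACH ROW
BEGIN
  -- push the changed row to the reporting database;
  -- MERGE keeps one row per message id and tolerates repeated update firings
  MERGE INTO messages_rpt@rpt_link t
  USING (SELECT :NEW.message_id AS message_id,
                :NEW.node       AS node,
                :NEW.severity   AS severity,
                :NEW.msg_text   AS msg_text,
                :NEW.created    AS created
           FROM dual) s
  ON (t.message_id = s.message_id)
  WHEN MATCHED THEN UPDATE SET t.severity = s.severity, t.msg_text = s.msg_text
  WHEN NOT MATCHED THEN INSERT (message_id, node, severity, msg_text, created)
       VALUES (s.message_id, s.node, s.severity, s.msg_text, s.created);
END;
/

Because the reporting copy is never deleted by this trigger, the 90-day retention falls out naturally (purged by a local job); that same property is what makes a plain fast-refresh materialized view awkward here, since an MV mirrors the master's deletes.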
I'm trying to connect to an Oracle 11g database from an Oracle 9i database by creating a database link on the 9i database, but my session hangs when I try this. Here are the steps I followed on both databases. The Oracle 9i database is my source and Oracle 11g is my target.
Steps/settings applied on the Oracle 11g database:
SQL> SHOW PARAMETER SEC_CASE_SENSITIVE_LOGON
NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
sec_case_sensitive_logon             boolean     TRUE
SQL> alter user SSA identified by SSA;
User altered.
The password file was recreated on the Oracle 11g database server.
DROP DATABASE LINK "SSA11G.SSA_DB1.WBTEA.COM";
CREATE DATABASE LINK "SSA11G.SSA_DB1.WBTEA.COM" CONNECT TO SSA IDENTIFIED BY SSA USING 'DEVDATA.DEV11G';
I am trying to replicate two database servers, one in Hong Kong and another in China. When I try to establish the replication, I get the error 'ORA-04052: error occurred when looking up remote object', like this...
But when I try the same thing on my local network, it works fine. I have tried schema replication through Enterprise Manager Grid Control.
The Database Configuration Assistant for creating a new database is not working in Oracle 8.1.7 on Windows Server 2000 / Server 2003. How can I create a new database?
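If DBCA keeps failing on 8.1.7, a database can also be created by hand from SQL*Plus (or Server Manager) with a minimal init.ora. A rough sketch only, with the SID, character set and all file paths and sizes as placeholders for your environment:

-- set ORACLE_SID, prepare an init<SID>.ora containing at least db_name and control_files
STARTUP NOMOUNT PFILE='D:\oracle\admin\TEST\pfile\initTEST.ora'

CREATE DATABASE test
  LOGFILE GROUP 1 ('D:\oracle\oradata\test\redo01.log') SIZE 10M,
          GROUP 2 ('D:\oracle\oradata\test\redo02.log') SIZE 10M
  DATAFILE 'D:\oracle\oradata\test\system01.dbf' SIZE 200M AUTOEXTEND ON
  CHARACTER SET WE8ISO8859P1;

-- then, still connected as SYS/INTERNAL, build the dictionary views and PL/SQL packages:
@?/rdbms/admin/catalog.sql
@?/rdbms/admin/catproc.sql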
At the moment we load files into our system serially. This is a very old and established system. We would like to incorporate parallel loading so our loaders can load data into the database concurrently.
Most of the issues would be due to multiple inserts happening because the files are loaded in parallel. For various reasons, we cannot issue regular commits until the entire batch of items is processed, in case the process needs to roll back. A file can contain different batches of items clubbed together for loading.
The issue is that until the first file finishes loading and commits, the second file just hangs. In fact, multiple files might hang waiting for the first file to finish. What can I do to overcome this? I tried to use "LOCK TABLE t1 IN SHARE ROW EXCLUSIVE MODE NOWAIT". When the leading process is doing inserts, the failing process fails with "resource busy and acquire with NOWAIT specified". We would catch this exception and redirect that batch to an error file to be reloaded at a later date.
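That pattern can be coded so a loader either gets the table or fails fast instead of queuing behind another file. A minimal PL/SQL sketch of the idea; the actual load and redirect logic are left as placeholders:

DECLARE
  resource_busy EXCEPTION;
  PRAGMA EXCEPTION_INIT(resource_busy, -54);  -- ORA-00054: resource busy ... NOWAIT specified
BEGIN
  LOCK TABLE t1 IN SHARE ROW EXCLUSIVE MODE NOWAIT;

  -- got the lock: run the existing batch load here, then commit once at the end
  NULL;           -- placeholder for the loader logic
  COMMIT;
EXCEPTION
  WHEN resource_busy THEN
    -- another file holds the table: divert this batch to the error/retry file
    NULL;         -- placeholder for the redirect/requeue logic
END;
/

Note that SHARE ROW EXCLUSIVE serialises the loaders completely. If the real requirement is only "never commit a partial batch", row-level locking with a single commit per batch already gives that, and the explicit table lock is only needed when two files can genuinely collide on the same rows.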
I have materialized view replication set up in an Oracle 10g environment. Inserts and updates are being propagated as expected. However, when a record in the master table is deleted, no entries are written to the materialized view logs, and hence the delete is not propagated to the materialized view on the remote server.
The materialized view is defined with FAST refresh and the refresh occurs every 5 minutes. This allows me to see which entries are written to the MV logs before they are consumed. I thought that all DML statements get written to the MV logs and am at a loss to explain this behaviour. The master table definition is not very complex: it has a PK defined and two UK constraints, but does not even have any FK constraints. There are a couple of triggers defined on this table for insert, update and delete which write audit records, but I can't see how this could affect things. There are no errors being generated on the master table side in the database or seen by the application.
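One way to confirm whether deletes ever reach the log is to watch the MLOG$_ table directly between refreshes; every log row carries a DMLTYPE$$ of 'I', 'U' or 'D'. A quick check, assuming the master table is called ORDERS in schema APP (placeholders for your names):

-- On the master site, between refreshes:
SELECT dmltype$$, COUNT(*)
  FROM app.mlog$_orders
 GROUP BY dmltype$$;

-- Confirm which log is attached to the table and how it is defined:
SELECT log_owner, master, log_table, rowids, primary_key, filter_columns
  FROM dba_mview_logs
 WHERE master = 'ORDERS';

If 'D' rows never appear in the log at all, it is worth checking whether the rows are really removed with DELETE statements; TRUNCATE and partition maintenance operations are not recorded in materialized view logs.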
I want to configure synchronous replication in Oracle 8i. Using the GUI I am able to create the replication, but whatever I modify stays in the deferred transaction queue and I have to run the push job manually.
I also tried to configure a continuous asynchronous setup by setting a delay rate of 500000, but that was not working either.
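In 8i Advanced Replication, asynchronous changes sit in the deferred transaction queue until a push job runs, so "near real time" comes from scheduling a continuous or very frequent push rather than from the delay value alone. A hedged sketch of scheduling the push at the site where the changes originate (the destination link name is a placeholder):

BEGIN
  DBMS_DEFER_SYS.SCHEDULE_PUSH(
    destination   => 'REMOTE_DB.WORLD',       -- db link of the other site
    interval      => 'SYSDATE + 1/(60*24)',   -- re-evaluate the job roughly every minute
    next_date     => SYSDATE,
    parallelism   => 1,
    delay_seconds => 1200);                   -- keep the push "awake" between intervals
  -- purge applied transactions so the deferred queue does not grow without bound
  DBMS_DEFER_SYS.SCHEDULE_PURGE(
    next_date => SYSDATE,
    interval  => 'SYSDATE + 1/24');
END;
/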
Is it possible to replicate table data in real time from SQL Server (2005 32-bit or SQL Server 2000 32-bit) to Oracle 10g running on 64-bit Linux? If yes, what are the steps?
It will be one-way replication from SQL Server to Oracle. Which option is best for replicating the table data: SQL Server DTS or Oracle Streams replication?
I am installing Oracle Database 8.1.7 on a Dell PowerEdge 2650 server. The first time, the database installed successfully, but when I try to create a new database with the Database Configuration Assistant, it does not work for new database creation.
After the upgrade, applications that existed before the upgrade are not working properly.
Errors faced:
1. When we edit any page in an application, we get a "no data found" error.
2. When we run any application, we get a "no data found" error.
We are able to create new applications without any issues. Luckily, we had an export backup of one application taken before the upgrade. We imported that back (only one application) and that application is working fine. We imported the application into apex.oracle.com as well and get the same error [URL]... The error occurs during page 'Edit' as well.
I have an application built in APEX 4.2. It has been running fine for several weeks and I've not made any changes to the code. Suddenly it developed what seems to be a cache problem. If I enter a search criterion, it displays results; I enter a different search criterion and it brings back the same results as the first. I enter a search criterion in a different field and I still get the results from the first search. I switch from Internet Explorer to Mozilla.
Different search criteria, but the same issue. Google Chrome, same issue. The obvious answer is to clear the cache. I have a "Clear Cache for Items" process set up, on submit after computations and validations, and I list all the items individually. This process has been there all along.
I have never had this problem before. If I log out and come back in, same issue. Whatever I enter as the first search criterion determines the results, no matter how I search.
Email is no longer working for users after upgrading from 4.0 to 4.1. The new principal APEX_040100 was added to the same ACL as the previous version's principal APEX_040000; in fact, both still exist in the ACL.
Error code in APEX Admin Mail Queue: ORA-24247: network access denied by access control list (ACL)
What could be causing the email facility within APEX to no longer work?
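ORA-24247 after an upgrade usually means the new APEX schema's connect privilege is missing, never got committed, or the ACL itself is not assigned to the SMTP host/port. A few checks and the assignment call, with the ACL name, host and port as placeholders for your mail setup:

-- What privileges do the old and new schemas actually have?
SELECT acl, principal, privilege, is_grant
  FROM dba_network_acl_privileges
 WHERE principal IN ('APEX_040000', 'APEX_040100');

-- Which hosts/ports is each ACL assigned to?
SELECT acl, host, lower_port, upper_port
  FROM dba_network_acls;

-- If the assignment is missing, bind the ACL to the SMTP server:
BEGIN
  DBMS_NETWORK_ACL_ADMIN.ASSIGN_ACL(
    acl        => 'apex_mail.xml',        -- placeholder ACL name
    host       => 'smtp.example.com',     -- placeholder SMTP host
    lower_port => 25,
    upper_port => 25);
  COMMIT;
END;
/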
I have an application where I use the zero session ID for public users. The problem is, I have a region that contains a report linking to PDF files stored in a BLOB, and I use GET_BLOB_FILE_SRC to link to the PDFs. I've included the report on my global page (page 0) and want it to be available to all visitors. The problem is that it only works when a user has authenticated; it throws a "page not found" error for public users.
I have created a custom download procedure to download uploaded files from custom table records, which are displayed via links, and I call my procedure as #OWNER#.my_file?p_file=#ID#. The problem is that this procedure call works on my development system but not on any other system.
Is there any grant I need to provide? I don't want to grant to PUBLIC either. This is APEX 4.1.1.

create or replace PROCEDURE my_file(p_file in number) AS
  v_mime      VARCHAR2(500);
  v_length    NUMBER;
  v_file_name VARCHAR2(400);
  Lob_loc     BLOB;
BEGIN
  SELECT trim(MIME_TYPE), BLOB_CONTENT, trim(filename), DBMS_LOB.GETLENGTH(blob_content)
    INTO v_mime, lob_loc, v_file_name, v_length
    FROM EMP.FILE_attachments
   WHERE id = p_file;

  owa_util.mime_header(nvl(v_mime, 'application/octet'), FALSE);
  htp.p('Content-length: ' || v_length);
  htp.p('Content-Disposition: attachment; filename="' ||
        replace(replace(substr(v_file_name, instr(v_file_name, '/') + 1), chr(10), null), chr(13), null) || '"');
  owa_util.http_header_close;
  wpg_docload.download_file(Lob_loc);
END my_file;
/
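When a procedure is called straight from the URL like this, it runs as the gateway's database account, not as the application owner, so "works in dev, fails elsewhere" is usually a missing execute grant or a request-validation whitelist that only exists on the dev instance. The exact account and whitelist mechanism depend on whether you use the embedded PL/SQL gateway, mod_plsql or the APEX Listener, so treat the following as assumptions to verify rather than a recipe (the EMP schema is assumed from the table reference above):

-- Grant execute to the account the gateway connects as (one of these, not PUBLIC):
GRANT EXECUTE ON emp.my_file TO anonymous;          -- embedded PL/SQL gateway
GRANT EXECUTE ON emp.my_file TO apex_public_user;   -- mod_plsql / APEX Listener

-- For the EPG, the function wwv_flow_epg_include_mod_local must also
-- return TRUE for 'my_file', otherwise the request is rejected up front.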
I am facing a very strange situation. In one of my interactive reports, the Filter option is not working (the processing symbol appears at the top of the page, but nothing happens). Is there any setting or something similar that could cause this?
I had a complex drill-down dashboard where I was setting page items through $s(item, value) from the chart links. But when I did the same thing in APEX 4.2 EA on [URL]... it simply opens a blank page. Has that JavaScript API been removed in 4.2?
How can I maintain a replication scheme between a production database and a standby? I was looking at Oracle's advanced replication methods, but what I want is to run a process in the evening that updates the copy incrementally, and then leave it until the next night.
The server I want to allocate to the standby database also runs other processes, so my setup would be:
Production: Oracle Database 11g on Linux 5.5
Standby: Oracle Database 11g on Windows 2003
In case that detail is important, let me make clear that what I want is for the copy database to be updated incrementally.
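If the copy only needs to catch up once a night, a scheduled fast refresh of materialized views is a simple incremental mechanism that does not care that the two servers run different operating systems. A small sketch of the nightly job on the copy, assuming the materialized views and their MV logs on the production side already exist (all names and the schedule are placeholders):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_REFRESH',
    job_type        => 'PLSQL_BLOCK',
    job_action      => q'[BEGIN DBMS_MVIEW.REFRESH(list => 'RPT.ORDERS_MV,RPT.CUSTOMERS_MV', method => 'F'); END;]',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=23;BYMINUTE=30',
    enabled         => TRUE);
END;
/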
I have a table MYTABLE in database mydb1 that is duplicated, via a materialized view, a materialized view log and refresh_snapshot commands, to a MYTABLE on the mydb2 database.
I would like to duplicate this table MYTABLE to a third database, using the same method (materialized view and refresh_snapshot command).
Is it possible? What happens to the materialized view log when I launch a refresh_snapshot from mydb2? How does the materialized view log get truncated?
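A single materialized view log can serve any number of snapshot sites: the master records each registered snapshot's last refresh and only purges a log row once every registered site has refreshed past it. So adding a third site does not break mydb2's refresh, but a site that stops refreshing will make the log grow until it is cleaned up. Useful checks on the master, using standard dictionary views (the owner, MV and site names in the purge call are placeholders):

-- Which snapshot/MV sites are registered against the master?
SELECT owner, name, snapshot_site, snapshot_id
  FROM dba_registered_snapshots;

-- If an abandoned site is keeping the log from being purged, remove its entry:
BEGIN
  DBMS_MVIEW.PURGE_MVIEW_FROM_LOG(
    mviewowner => 'REP',
    mviewname  => 'MYTABLE',
    mviewsite  => 'OLD_SITE');
END;
/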
There is a database db1 which has a user U1 that contains a table T1.
Likewise,
there is another database db2 which has a user named U2 containing a table T2.
Now
I want to use the concept of joins to join table T1 of database DB1 with table T2 of database DB2, and access the result from database DB3 using the materialized view concept.
What shall I do to access the tables of DB1 and DB2 from database DB3 using materialized views?
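The usual pattern is to bring both tables into DB3 as local materialized views over database links and then join the local copies; a single MV that joins tables across two different links cannot fast refresh. A hedged sketch, with the link names, refresh interval and join columns as placeholders:

-- On the masters, to allow fast refresh:
CREATE MATERIALIZED VIEW LOG ON u1.t1 WITH PRIMARY KEY;   -- run on DB1
CREATE MATERIALIZED VIEW LOG ON u2.t2 WITH PRIMARY KEY;   -- run on DB2

-- On DB3:
CREATE DATABASE LINK db1_link CONNECT TO u1 IDENTIFIED BY u1 USING 'DB1';
CREATE DATABASE LINK db2_link CONNECT TO u2 IDENTIFIED BY u2 USING 'DB2';

CREATE MATERIALIZED VIEW t1_mv
  REFRESH FAST NEXT SYSDATE + 1/24
  AS SELECT * FROM u1.t1@db1_link;

CREATE MATERIALIZED VIEW t2_mv
  REFRESH FAST NEXT SYSDATE + 1/24
  AS SELECT * FROM u2.t2@db2_link;

-- Join the two local copies; expose it as a view (or another local MV if needed):
CREATE OR REPLACE VIEW t1_t2_join AS
  SELECT a.*, b.col_x                              -- col_x is a placeholder column
    FROM t1_mv a JOIN t2_mv b ON a.id = b.id;      -- join keys are placeholders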