I have the task of migrating the entire database (an exact copy is to be moved to another server); the current server is going to be formatted. After I did the steps below, the sizes of the 4 application tablespaces (databases) match, but some of the default tablespaces, i.e. TEMP and SYSTEM, do not match:

TEMP tablespace: current server - 4.0 (approximately); migrating server - 160 MB
SYSTEM tablespace: current server - 580 MB; migrating server - 220 MB

I also checked that the tables match for the 4 databases. Please provide the correct solution or method.
Steps done for migrating (by me):

EXPORTING DATA USING DATA PUMP
1. From the command prompt: mkdir c:\oraclexe\app\tmp
2. From the SQL prompt: conn system/kotak
3. create or replace directory dmpdir as 'c:\oraclexe\app\tmp';
4. grant read, write on directory dmpdir to kotak;
5. From the command prompt:
   expdp system/kotak@xe full=Y directory=dmpdir dumpfile=xe.dmp logfile=expdpxe.log

IMPORTING DATA USING DATA PUMP (on the other server machine)
1. From the SQL prompt: conn system/kotak
2. create or replace directory dmpdir as 'c:\oraclexe\app\tmp';
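The import command itself is missing from the steps above; a sketch of what the full import would typically look like on the new server, assuming the same directory object and dump file name as in the export:

impdp system/kotak@xe full=Y directory=dmpdir dumpfile=xe.dmp logfile=impdpxe.log

For what it is worth, a size difference in SYSTEM and TEMP after a logical (export/import) migration is generally expected, since those tablespaces are created fresh by the new instance rather than copied from the source.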
We want to replicate data between the databases. We have 4 databases in a network. If there is any change in database 1, e.g. an update to a table, it should automatically replicate to the other 3 databases; likewise, if a user changes something in database 2, it should replicate to the other 3 databases, and vice versa. All 4 databases have the same schema and the same configuration.
We want to use a database link to connect to a database for running SELECT, UPDATE and similar commands. Our destination database is WE8ISO8859P1 and our current database is the AR8MSWIN1256 character set, but when we run a command to view data, all non-English characters appear garbled and we cannot recognize the text; we cannot view the right characters over our database link. Using the CONVERT function makes no difference either:
select convert(menu_name, 'US7ASCII', 'WE8ISO8859P1'),
       convert(menu_name, 'AR8MSWIN1256', 'WE8ISO8859P1'),
       convert(convert(menu_name, 'US7ASCII', 'WE8ISO8859P1'), 'AR8MSWIN1256', 'WE8ISO8859P1'),
       menu_name
from   T$R_MENU@"TO201.US.ORACLE.COM"
where  MENU_ID = 601011;
The result is:
EU?iY ?C?ICa? OCOaI? E???? ?C?IC?? ?C??I?
EU?iY ?C?ICa? OCOaI? E???? ?C?IC?? ?C??I?
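Not a fix, but a useful first check is to confirm what character set each end of the link actually reports, since data is converted automatically between the two database character sets when it crosses the link, and any character with no mapping in the target character set is replaced. A minimal diagnostic sketch, using the link name from the query above:

select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET';
select value from nls_database_parameters@"TO201.US.ORACLE.COM" where parameter = 'NLS_CHARACTERSET';

CONVERT cannot bring the characters back once they have been replaced, which would explain why it appears to do nothing here.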
I have one primary database server and one physical standby database server, but I am unable to fix ORA-16607 while enabling the configuration. During a log switch the redo data is being applied on the physical standby side, but something goes wrong in the broker.

Here are the details of my configuration:

Primary database name: PRIM (net service PRIM)
Physical standby database name: STAN (net service STAN)
DGMGRL> show configuration
Configuration Name: test
Enabled: YES
Protection Mode: MaxPerformance
Fast-Start Failover: DISABLED
Databases:
  prim - Primary database
  stan - Physical standby database
Current status for "test": Warning: ORA-16607: one or more databases have failed
The first part of the alert log shows no problem with redo being transmitted to the standby:

LNS1 started with pid=48, OS id=5408
Thu Sep 21 21:32:03 2006
Thread 1 advanced to log sequence 59
  Current log# 1 seq# 59 mem# 0: /u01/app/oracle/oradata/PRIM/redo01.log
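ORA-16607 at the configuration level is only a summary; the broker normally records the underlying error against one of the databases. A sketch of the next diagnostic step in DGMGRL, using the database names from the configuration above:

DGMGRL> show database verbose 'prim';
DGMGRL> show database verbose 'stan';
DGMGRL> show database 'stan' 'StatusReport';
DGMGRL> show database 'stan' 'LogXptStatus';

The broker's own log (drc<SID>.log in the background dump destination) usually contains more detail about these failures than the alert log.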
I have one primary database and one physical standby database in my Data Guard environment.

Now I want one more physical standby database in my Data Guard environment, meaning I want 2 physical standbys. Since this is a production environment, I need to take great care; however, downtime will be mandatory.

init.ora for production:

pup2.__db_cache_size=6241124352
pup2.__java_pool_size=33554432
pup2.__large_pool_size=134217728
pup2.__oracle_base='/oracle/app/oracle'#ORACLE_BASE set from environment
pup2.__pga_aggregate_target=5771362304
...

init.ora for my first physical standby:

pup2.__db_cache_size=6509559808
pup2.__java_pool_size=33554432
pup2.__large_pool_size=67108864
pup2.__oracle_base='/oracle/app/oracle'#ORACLE_BASE set from environment
pup2.__pga_aggregate_target=5502926848
...
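For reference, a hedged sketch of the Data Guard-related parameters on the primary once a second standby is added, assuming the existing databases keep the DB_UNIQUE_NAMEs pup2 and pup2stb1 and the new standby gets a hypothetical name pup2stb2 (services, transport mode and paths must be adjusted to your environment):

ALTER SYSTEM SET log_archive_config='DG_CONFIG=(pup2,pup2stb1,pup2stb2)' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_2='SERVICE=pup2stb1 ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=pup2stb1' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_3='SERVICE=pup2stb2 ASYNC VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=pup2stb2' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_state_3='ENABLE' SCOPE=BOTH;

These parameters are dynamic, and the existing standby can keep running while the new one is instantiated (for example with RMAN DUPLICATE ... FOR STANDBY), so adding the third destination by itself should not require primary downtime.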
Oracle 11.2.0.3. I have insert-only tables that receive records from multiple processes at a rate of about 200/second. Each transaction can have up to 100 records. I have another set of processes that queries this table for the latest data. These processes run anywhere from once a minute to once an hour. The processes do not get all of the data; they get data based on a type field.

Both of these are Java middle tiers. The process that queries data (the subscriber) does so at the request of many remote servers (there will be vast numbers of them). I am not allowed to expose these downstream databases to the internet (they are not Oracle DBs anyway), so I cannot use Streams or GoldenGate.

So basically:
Insert process: multiple sessions that combined insert records at up to 200/second, with between 1 and 100 records per commit.
Query process: a downstream process makes a request to my middle tier; the middle tier runs a query to get the latest data and passes it back. This design is set and I cannot change it.
1. Right now we capture the insert time of the record. However, at this insert rate some processes will commit faster than others, so I cannot use a 'greater than my insert time' query.
2. Streams/GoldenGate will not work; I cannot register these DBs.
3. I do not want to serialize my inserts, because I am not sure I can keep up with the insert rate. I do not even know what the specs will be for the production hardware; I have to deliver this before that is decided, so I am being conservative.
4. I really want to avoid updates on this table if possible, in part due to my limited ability to test.
5. Due to the number of downstream processes, it is possible that one will request data and for some reason fail to insert it locally. The downstream application will keep track of the latest data it received, which means a subscriber may need to request the same data again.
Is there a way to set up change data capture with multiple subscribers to handle this, if my subscribers are just queries? All the queries come from the same servers (there will be several, but all doing the same thing). If so, when I performance test this, are there any wait events I should keep an eye on?
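Not CDC as such, but one common alternative for "give me everything committed since my last pull" is to key the subscriber query on the commit SCN rather than on the insert time, since ORA_ROWSCN is assigned at commit and so does not suffer from the out-of-order commit problem described in point 1. A minimal sketch, assuming a hypothetical table feed_events created with ROWDEPENDENCIES so the SCN is tracked per row:

create table feed_events (
  event_id   number,
  type_code  varchar2(30),
  payload    varchar2(4000),
  insert_ts  timestamp default systimestamp
) rowdependencies;

-- Subscriber remembers the highest commit SCN it has processed and asks for anything newer:
select e.*, ora_rowscn as commit_scn
from   feed_events e
where  e.type_code = :type_code
and    ora_rowscn  > :last_scn_seen;

Because re-reading from an older SCN simply returns the same rows again, this also fits requirement 5 (a subscriber can safely re-request data it failed to store). It would still need to be load-tested at your insert rate; the usual waits to watch in this insert-heavy pattern are buffer busy waits and log file sync.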
How could I copy data from tableA to tableB given the scenario below?

1) Currently we have no unique key / primary key defined on either of these tables, due to the nature of the data, so insert/update will not work.
2) We cannot truncate/insert, because users will be accessing these tables while the copy takes place, and we do not want to end up with the table containing no data at some point in time.

Will a materialized view complete (full) refresh work for refreshing the table, so we can avoid the problems in points 1 and 2?
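A minimal sketch of the materialized-view route, assuming tableB already exists with the same structure as tableA so it can be registered as a prebuilt table; with atomic_refresh => TRUE a complete refresh runs as DELETE plus INSERT inside one transaction, so readers never see an empty table (point 2), and a complete refresh needs no primary key (point 1):

create materialized view tableB
  on prebuilt table
  refresh complete on demand
  as select * from tableA;

begin
  dbms_mview.refresh(list => 'TABLEB', method => 'C', atomic_refresh => TRUE);
end;
/

The trade-off is that the atomic (delete-based) refresh generates far more undo and redo than a truncate-based one, which matters if tableA is large.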
I have two tables, PERSON and WIFE. I want to make the WIFE data available in the PERSON table while keeping the entries in WIFE maintained, and at the same time filling in some of the PERSON column values for the wife's data.
PERSON table
PK  NAME      ADDRESS  IS_MARRIED
1   John      ab city  Y
2   Varvatos  cd town  N
3   Smith     ef town  Y
4   Henry     gh city  Y
5   Lynda     gh city  Y
WIFE table
PK  PERSON_ID (FK)  NAME
1   1               Alice
2   3               Rosy
3   4               Lynda
Now I want to copy the data of the WIFE table into the PERSON table like this:

PERSON table
PK  NAME      ADDRESS  IS_MARRIED
1   John      ab city  Y
2   Varvatos  cd town  N
3   Smith     ef town  Y
4   Henry     gh city  Y
5   Lynda     gh city  Y
6   Alice     ab city  Y
7   Rosy      ef town  Y
As you may have noticed in the example, the wife's ADDRESS is the same as her spouse's, and the same goes for the IS_MARRIED column. Moreover, the PK is not duplicated. How do I go about this?

Another important factor is that Lynda already exists in the PERSON table, so I certainly do not want to duplicate her entry.
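A sketch of one way to do the copy with a single INSERT ... SELECT, assuming the new PK values can be generated as MAX(PK) plus a running number (a sequence would be more robust in real code) and that "already exists" can be decided on NAME:

insert into person (pk, name, address, is_married)
select (select max(pk) from person) + row_number() over (order by w.pk),
       w.name,
       p.address,        -- the wife takes her spouse's ADDRESS
       p.is_married      -- and his IS_MARRIED value
from   wife w
join   person p on p.pk = w.person_id
where  not exists (select 1 from person x where x.name = w.name);   -- skips Lynda, who is already there

With the sample data this inserts Alice (ab city) and Rosy (ef town) and leaves Lynda alone.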
I am developing an application for making travel entries for my organization's employees.

For that purpose I have 3 data blocks, 1) Travel Header, 2) Employee Detail and 3) Travel Detail, which are interlinked through constraints.

I want to offer 2 options, i.e. either a single booking entry or a group booking entry.

The form works properly for a single booking entry. But when I make a group booking, where many employees travel between the same locations, I have to fill in all the travel details again for every employee. I want to avoid that, since it makes the application impractical; I would rather copy the travel details I already entered for the 1st employee. I am uploading the form layout.
I have a remote database A and a database B. DB A has 10 views and DB B has 10 tables. I have to pull data from the views in DB A and load it into the tables in DB B at regular intervals. How do I do this job?
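One common approach is a small PL/SQL procedure per view/table pair, run on DB B through a database link and scheduled with DBMS_SCHEDULER. A sketch with hypothetical names (dblink_a is a link from B to A; whether you delete, truncate or merge depends on what "load" should mean for your data):

create or replace procedure pull_view1 as
begin
  delete from local_table1;                    -- or a MERGE if existing rows must be preserved
  insert into local_table1
  select * from remote_view1@dblink_a;         -- pull from the view in DB A
  commit;
end;
/

begin
  dbms_scheduler.create_job(
    job_name        => 'PULL_VIEW1_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'PULL_VIEW1',
    start_date      => systimestamp,
    repeat_interval => 'FREQ=HOURLY;INTERVAL=1',   -- adjust to the required interval
    enabled         => TRUE);
end;
/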
I have recently configured Data Guard with Database 10g (10.2.0.4, 64-bit) on Windows 2007 server. My Data Guard configuration shows SUCCESS status with 2 databases in the same (local) location. My questions are:

1. When I query SHOW PARAMETER LOG_ARCH, the output shows DG_CONFIG(PRMDB); only 1 database (the primary) is displayed, not 2 databases, e.g. DG_CONFIG(PRMDB,STLDB).

2. How do I check the log apply interval or time (whether it is applied transaction by transaction, the timing, etc.)?
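For what it is worth, a sketch of both checks using the names shown above; LOG_ARCHIVE_CONFIG is an initialization parameter (the broker normally maintains it once both databases are in the configuration, but it can be set manually), and apply progress is visible in V$ARCHIVED_LOG on the standby:

-- 1. On the primary, if the parameter really lists only one DB_UNIQUE_NAME:
alter system set log_archive_config='DG_CONFIG=(PRMDB,STLDB)' scope=both;

-- 2. On the standby, see which log sequences have arrived and been applied, and when:
select sequence#, applied, completion_time
from   v$archived_log
order  by sequence#;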
Triggers for an Oracle database: what I am trying to do is copy the original data from one table to another table before an update, insert, or delete occurs. Basically we are trying to keep a transactional history without having to restore the data. Here is what I have created to date; however, it is not executing consistently.
CREATE OR REPLACE TRIGGER DATA_COPY
BEFORE INSERT OR DELETE OR UPDATE ON "DB1"."TABLE_1"
...
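Since the body of the trigger is cut off above, here is a minimal sketch of the usual shape of such a history trigger, assuming a hypothetical history table TABLE_1_HIST with the copied columns plus audit columns; note that FOR EACH ROW is required, otherwise the trigger fires once per statement and never sees the :OLD values:

CREATE OR REPLACE TRIGGER data_copy
BEFORE INSERT OR UPDATE OR DELETE ON "DB1"."TABLE_1"
FOR EACH ROW
BEGIN
  IF UPDATING OR DELETING THEN
    -- col1/col2 are placeholders for the real column list
    INSERT INTO table_1_hist (col1, col2, change_type, change_date)
    VALUES (:OLD.col1, :OLD.col2,
            CASE WHEN UPDATING THEN 'UPDATE' ELSE 'DELETE' END,
            SYSDATE);
  END IF;
END;
/

(For an INSERT there is no original row to preserve, which is why the IF only covers UPDATING and DELETING.)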
I have a disk group at normal redundancy (two failure groups) working fine. I want to add a new failure group to the disk group (changing the redundancy level to high).

Which ASM process will manage the copy of data to the new failure group? Is there any way to control or tune this process?
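A sketch of the add/monitor side with hypothetical disk group, failure group and disk names: the data movement itself is an ASM rebalance, coordinated by the RBAL background process and carried out by the ARBn slave processes, and the main tuning knob is the rebalance power, which can be given on the statement and watched in V$ASM_OPERATION:

alter diskgroup data
  add failgroup fg3 disk '/dev/asm-disk5', '/dev/asm-disk6'
  rebalance power 4;

-- monitor progress and the estimated time remaining
select group_number, operation, state, power, sofar, est_work, est_minutes
from   v$asm_operation;

The power can also be changed while the rebalance is running with ALTER DISKGROUP data REBALANCE POWER n.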
I have a control block in which the user enters two values. Next is a data block in which two items have the property 'Copy Value from Item' set to control_block.item1 and control_block.item2 respectively. If these items are made database items, will they be sent as parameters during EXECUTE-QUERY in addition to the default WHERE clause? When I check the last query, it does not contain these parameters, yet the records returned are already filtered by the control block items.
I am very new to Oracle and SQL. I am trying to create a stored procedure that will copy 14 days of data into a table and then truncate the original table. When I compile the following code...
CREATE OR REPLACE PROCEDURE STOPROC_TRUNCATE ( dateNum IN NUMBER )
IS
BEGIN
  create table AUDIT_14Days as select * from AUDIT where TIMESTAMP >= (SYSDATE - dateNum);
  truncate table AUDIT drop storage;
  ...
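DDL such as CREATE TABLE and TRUNCATE cannot be written as static statements inside PL/SQL, which is the usual reason a procedure like this fails to compile; it has to go through EXECUTE IMMEDIATE. A corrected sketch under that assumption (the table and column names are taken from the fragment above and may need adjusting; AUDIT itself is a reserved word in Oracle):

CREATE OR REPLACE PROCEDURE stoproc_truncate (dateNum IN NUMBER)
IS
BEGIN
  -- keep the last dateNum days of data in a side table
  EXECUTE IMMEDIATE
    'CREATE TABLE audit_14days AS
     SELECT * FROM audit WHERE timestamp >= (SYSDATE - ' || dateNum || ')';
  -- then empty the original table
  EXECUTE IMMEDIATE 'TRUNCATE TABLE audit DROP STORAGE';
END;
/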
I need to copy data from certain tables from one DB to another at random intervals. The table structure for the ones being copied can be the same in both DBs.

After reading various posts here, I understand that it can be done using Oracle Replication or Oracle Streams.

How do these 2 methods work, and how are they different from each other?
I have 2 tables with the same number of columns, range partitioned on a date. At the end of the month I want to copy the data of one table to the other. Instead of copying the data row by row, I want to copy one partition completely into the corresponding partition of the other table.
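Two common ways to do this, sketched with hypothetical table and partition names: a direct-path insert that reads only the one partition (a true copy), or a partition exchange through a staging table (which swaps the segment rather than copying it):

-- Option 1: copy one partition with a direct-path insert
insert /*+ append */ into target_tab partition (p_jan)
select * from source_tab partition (p_jan);
commit;

-- Option 2: swap segments via a non-partitioned staging table (moves, does not copy)
alter table source_tab exchange partition p_jan with table stage_tab;
alter table target_tab exchange partition p_jan with table stage_tab;

Option 2 is almost instantaneous because no data is rewritten, but the source partition ends up empty, so it only fits if "copy" can really be "move".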
I'm cloning a database from an existing DBCA template. The clone is failing with "datafile does not exist", because during the copying step all the datafile names are changed. How do I force DBCA to keep the same names as the original database?
I have a table with 15 columns containing millions of rows, which is updated every day. I have created a similar report table with only the columns I need to report on from the main table. What PL/SQL script (or better alternative) do I need to write to copy the data for those columns from the main table to the new report table? The new report table is to be updated every day at 01:00 with the data coming from the main table, and the update is automated.
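A minimal sketch of that nightly copy, assuming hypothetical table and column names and that the report table can simply be rebuilt from the main table each run (a MERGE would be the alternative if only changed rows should be applied):

create or replace procedure refresh_report_tab as
begin
  execute immediate 'truncate table report_tab';
  -- direct-path insert of just the columns the report needs
  insert /*+ append */ into report_tab (col1, col2, col3)
  select col1, col2, col3
  from   main_tab;
  commit;
end;
/

It can then be scheduled daily at 01:00 with DBMS_SCHEDULER, e.g. repeat_interval => 'FREQ=DAILY;BYHOUR=1;BYMINUTE=0'.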
I need a good suggestion. Basically I have two different locations:

1. Factory
2. Head office

I have developed the PRODUCTION module at the factory and it is working fine. Now I want to send data to head office on a daily basis, so I have developed a form which creates a backup in export format (.dmp); the user then sends it to head office via email.

The backup file should be saved in a pre-defined location, and the user will then use a form I developed for loading the data at head office. There are two different buttons on this form:

The first is used to load the data; it first loads the data into a temporary user, which is created whenever the user presses this button. The second is used to copy the data into the application user, but it first checks whether the data already exists: if it does, it updates, otherwise it inserts.
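The "check if data exists then update otherwise insert" step in the second button can usually be collapsed into one MERGE per table. A sketch with hypothetical schema, table and column names (TEMP_USER holds the freshly imported data, APP_USER is the application schema, PROD_ID is the matching key):

merge into app_user.production_detail d
using temp_user.production_detail s
on    (d.prod_id = s.prod_id)
when matched then
  update set d.qty        = s.qty,
             d.entry_date = s.entry_date
when not matched then
  insert (prod_id, qty, entry_date)
  values (s.prod_id, s.qty, s.entry_date);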