I have to migrate the current production database to a 10.2.0.4 test instance on Windows. Is there any non-export way to upgrade 9i to 10g?
I have the following steps:

1) ALTER DATABASE BACKUP CONTROLFILE TO TRACE
2) Shut down the Oracle 9i database on server A
3) Copy the datafiles, controlfiles, redo logs, and other files to the new server B
4) Edit the controlfile backup (trace) with the new locations of bdump, udump, the log files, and the datafiles
5) Use oradim: ORADIM -NEW -SID SID [-INTPWD PASSWORD] -MAXUSERS USERS -STARTMODE AUTO -PFILE ORACLE_HOME\DATABASE\INITSID.ORA
6) Start the database in upgrade mode
7) Run catupgrd.sql and utlu102s.sql
8) Take a backup
9) Open the database for users
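As a minimal sketch of steps 5-7 on server B (the SID, password, and the script name crctl.sql are placeholders; crctl.sql is assumed to be the CREATE CONTROLFILE script edited in step 4):

C:\> oradim -NEW -SID PROD -INTPWD secret -MAXUSERS 10 -STARTMODE AUTO -PFILE %ORACLE_HOME%\DATABASE\INITPROD.ORA
C:\> set ORACLE_SID=PROD
C:\> sqlplus / as sysdba
SQL> STARTUP NOMOUNT
SQL> @crctl.sql                       -- recreate the controlfile with the new file locations
SQL> ALTER DATABASE OPEN RESETLOGS;   -- or plain OPEN, matching the option used in the trace
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP UPGRADE
SQL> SPOOL upgrade.log
SQL> @?\rdbms\admin\catupgrd.sql
SQL> @?\rdbms\admin\utlu102s.sql      -- post-upgrade status tool
SQL> @?\rdbms\admin\utlrp.sql         -- recompile invalid objects
SQL> SPOOL OFF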
I would like to know how I can automate the export from production to the test server. I need direction on creating a process to import data from production (server A) into test (server B). I want to automate the import from production to test.
1) Export the production schema
2) Import into the test server?

How can I automate this? Currently I am doing it manually as follows:

1) expdp the production schema
2) Kill all connections to the test schema on the test server
3) Drop the test user (DROP USER ... CASCADE)
4) Recreate the user
5) impdp the production schema into test

But I want it automated or scheduled so I don't have to log in every night!
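A minimal sketch of such a nightly job, assuming Linux/cron, a schema APP, a directory object DP_DIR, and a database link PROD_LINK from test to production (all of these names are placeholders); using NETWORK_LINK pulls the schema straight from production, so no dump file has to be copied between servers:

#!/bin/sh
# refresh_test.sh -- all names and passwords are placeholders
export ORACLE_SID=TEST

sqlplus -s / as sysdba <<EOF
-- kill remaining connections to the test schema
BEGIN
  FOR s IN (SELECT sid, serial# FROM v\$session WHERE username = 'APP') LOOP
    EXECUTE IMMEDIATE 'ALTER SYSTEM KILL SESSION '''||s.sid||','||s.serial#||''' IMMEDIATE';
  END LOOP;
END;
/
DROP USER app CASCADE;
@create_app_user.sql
EXIT
EOF

# re-import the schema directly over the database link
impdp system/*** schemas=APP network_link=PROD_LINK directory=DP_DIR logfile=refresh_app.log

Scheduled with cron (for example 0 2 * * * /home/oracle/refresh_test.sh), or as an equivalent .bat file under Windows Task Scheduler.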
We have a production database 'X'. Now I have created a test database 'T' and didn't configure another listener for it. The issue is that when I connect to Oracle through SQL*Plus I connect directly to the test database 'T', not the production database 'X'. Of course I can log in to the production DB afterwards, but initially I want to access the production database 'X'.
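A local (bequeath) connection goes to whatever instance the ORACLE_SID environment variable names, so the usual fixes are to set it before starting SQL*Plus or to name the database explicitly through the listener; a sketch, assuming Windows and an existing TNS alias X:

C:\> set ORACLE_SID=X
C:\> sqlplus / as sysdba

or, through the listener:

C:\> sqlplus scott/tiger@X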
I need to refresh a PROD database into a TEST database. Both PROD and TEST run on 10g. I need a full refresh. Are there any prerequisites I should keep in mind?
I am trying to refresh the validation database with an old production backup, which is our requirement. Below are the RMAN script I executed, the output error message, and the RMAN configuration settings. In the next post I'll include the currently successfully running RMAN script for your reference.
rman TARGET sys/passwd@Production CATALOG rman_pmxp/rman@catlog AUXILIARY sys/passwd@validation

RMAN> show all;

RMAN configuration parameters are:
CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 90 DAYS;
CONFIGURE BACKUP OPTIMIZATION OFF; # default
CONFIGURE DEFAULT DEVICE TYPE TO DISK;
CONFIGURE CONTROLFILE AUTOBACKUP ON;
CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO 'H:\backups\DB_Sunday_Backup\Prod_%d_%F.rman.ctl';
CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE 'SBT_TAPE' TO '%F';
[code]...
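For reference, a duplicate to an older point in time usually takes roughly this shape (a sketch only: the UNTIL TIME value is a placeholder, and the auxiliary instance must already be started NOMOUNT with the backup pieces visible to it):

RMAN> RUN {
  SET UNTIL TIME "TO_DATE('2012-07-01 03:00:00','YYYY-MM-DD HH24:MI:SS')";
  ALLOCATE AUXILIARY CHANNEL aux1 DEVICE TYPE DISK;
  DUPLICATE TARGET DATABASE TO validation;
}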
I got the following error when I gathered stats at the schema level.
SQL> EXEC dbms_stats.gather_schema_stats(ownname=>'KDB', estimate_percent=>dbms_stats.auto_sample_size, cascade=>TRUE, force=>TRUE);
BEGIN dbms_stats.gather_schema_stats(ownname=>'MFDB', estimate_percent=>dbms_stats.auto_sample_size, cascade=>TRUE, force=>TRUE); END;

*
ERROR at line 1:
ORA-20003: Specified bug number (9196440) does not exist
ORA-06512: at "SYS.DBMS_STATS", line 15342
ORA-06512: at "SYS.DBMS_STATS", line 15688
ORA-06512: at "SYS.DBMS_STATS", line 15766
ORA-06512: at "SYS.DBMS_STATS", line 15725
ORA-06512: at line 1

BEGIN dbms_stats.gather_schema_stats(ownname=>'CDOMIG_DATA', estimate_percent=>dbms_stats.auto_sample_size, cascade=>TRUE, DEGREE=>10); END;

*
ERROR at line 1:
ORA-20003: Specified bug number (9196440) does not exist
ORA-06512: at "SYS.DBMS_STATS", line 15342
ORA-06512: at "SYS.DBMS_STATS", line 15688
ORA-06512: at "SYS.DBMS_STATS", line 15766
ORA-06512: at "SYS.DBMS_STATS", line 15725
ORA-06512: at line 1
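One place this ORA-20003 is known to come from is leftover statistics preferences that name a fix/bug number the installed DBMS_STATS does not recognize (for instance after a patch was rolled back). As a diagnostic starting point only, and assuming 11g where the view exists, the stored per-table preferences can be listed:

SELECT owner, table_name, preference_name, preference_value
FROM   dba_tab_stat_prefs
WHERE  owner IN ('KDB', 'MFDB', 'CDOMIG_DATA');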
I've performed a stress test of a database in two configurations:

1) Single instance 11.2 with 8 GB SGA and 4 GB PGA on Linux, 6 cores, with instance caging = 4 and resource_manager_plan = default_plan
2) Two-node RAC 11.2 with 8 GB SGA and 4 GB PGA on both nodes on Linux, 32 cores, with instance caging = 4 and resource_manager_plan = default_plan

I used Swingbench; the transactions per minute are equal, but the average transaction time in RAC is greater than on the single instance (4 ms vs 40 ms). I know that RAC is not for high performance. My question is: is the RAC configuration better than the single instance, and if yes, by what percentage?
I want to create a database trigger which will test the database link; if it is OK then it will use the dblink and do its work.

If it fails then it will send the data into a logfile on its own server.

I wrote:
CREATE OR REPLACE TRIGGER PERMIT.TESTTRG
AFTER INSERT OR UPDATE ON PERMIT.TR_LP_M_H_COMPANY_25072012
REFERENCING NEW AS New OLD AS Old
FOR EACH ROW
DECLARE
  l     NUMBER;
  nIgn  PLS_INTEGER;
  nRows PLS_INTEGER := 0;
[code]....
When I execute it, it gives this error:
LINE/COL ERROR
-------- -----------------------------------------------------------------
4/11     PLS-00201: identifier 'EXEC_SQL.CONNTYPE' must be declared
4/11     PL/SQL: Item ignored
8/2      PLS-00320: the declaration of the type of this expression is incomplete or malformed
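EXEC_SQL is a client-side (Oracle Forms) package, so it is not declared inside the database, which is what PLS-00201 is pointing at. Inside the server a simpler pattern is to probe the link and fall back in an exception handler. A sketch, assuming a link named LINK1 and a hypothetical local log table PERMIT.ERR_LOG:

CREATE OR REPLACE TRIGGER permit.testtrg
AFTER INSERT OR UPDATE ON permit.tr_lp_m_h_company_25072012
FOR EACH ROW
DECLARE
  l_ok NUMBER;
BEGIN
  SELECT 1 INTO l_ok FROM dual@link1;   -- probe the database link
  -- link is alive: do the remote work here
EXCEPTION
  WHEN OTHERS THEN
    -- link is down: keep the data locally instead
    INSERT INTO permit.err_log (logged_at, msg)
    VALUES (SYSDATE, 'LINK1 unavailable: ' || SQLERRM);
END;
/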
We are planning to consolidate our production Oracle DBs onto one server. We are basically a Windows shop. Is it feasible to run 8 production Oracle DBs on one Windows server? None of the DBs are really transaction-intensive: two are around 300 GB and the others average 40 GB in size.

I can take care of the disk slicing so Oracle does not run into an I/O bottleneck. We are planning to go for external NAS or SAN storage.

My main concern is processor usage. The processor we are thinking about is 2 x Intel Xeon quad-core. Will there be a processor bottleneck, or is there a way in Oracle to assign processor usage? (I believe there are not many tweaking options here.)
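If the instances are (or can be moved to) 11g, each one can be capped with instance caging: enable a Resource Manager plan and lower CPU_COUNT per instance. A sketch, with the cap value as a placeholder; 10g has no direct equivalent:

ALTER SYSTEM SET resource_manager_plan = 'DEFAULT_PLAN';
ALTER SYSTEM SET cpu_count = 2;   -- cap this instance at 2 of the 8 cores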
I need to use Data Pump for the first time on my production database. Currently on the testing database, when I take a schema-level export there are no errors or warnings in the log file, but when I import, the import log file contains the following ORA warning. The only fix I found on Google is to recompile the invalid objects. How can I avoid these warnings in the log file?
"ORA-39082: Object type ALTER_PROCEDURE:"QUANTISV4"."P_CTM_ABN_INVST_EQUITY" created with compilation warnings"
We have quite a number of sessions in database MES (production) coming from another machine.

In v$session, the program is oracle@WID27 (TNS V1-V3). This host WID27 contains quite a number of development databases. We have to trace which jobs are actually triggering this, as WID27 is not supposed to connect to production databases.

How can we tell whether the sessions came in over a dblink or from the machine itself?
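A PROGRAM of the form oracle@<host> normally means the connection was opened by an Oracle server process on that host, typically a database link or a scheduler job there, whereas an end-user tool reports its own executable name. One starting point is to list what those sessions are doing, and then to check DBA_DB_LINKS in each development database on WID27 for links pointing at MES; a sketch:

SELECT sid, serial#, username, osuser, machine, program, module, logon_time
FROM   v$session
WHERE  program LIKE 'oracle@WID27%';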
While trying to copy a database, I used the command RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL; and it gave:
ORA-00279: change 7866782806751 generated at 07/12/2010 19:39:23 needed for thread 1
ORA-00289: suggestion : /moteurs/oracle/r11110/R11110/archivelog/2010_07_13/o1_mf_1_1158_%u_.arc
ORA-00280: change 7866782806751 for thread 1 is in sequence #1158

Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
CANCEL
ORA-01547: warning: RECOVER succeeded but OPEN RESETLOGS would get error below
ORA-01194: file 1 needs more recovery to be consistent
ORA-01110: data file 1: '/data01/oracle/r11110/data/SYSTEM01.dbf'
ORA-01112: media recovery not started

What can I do in this case?
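ORA-01194 after cancelling means the redo in sequence #1158 still has to be applied before file 1 is consistent. If that sequence was never archived, it is usually still in one of the copied online redo logs, and that log file can be supplied at the prompt by hand; a sketch, with a hypothetical redo log path:

SQL> RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL;
...
Specify log: {<RET>=suggested | filename | AUTO | CANCEL}
/data01/oracle/r11110/data/redo01.log     -- try each online log until one is accepted
SQL> ALTER DATABASE OPEN RESETLOGS;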
I need to copy all the database objects from one database to another. I have a 32-bit Windows environment. I tried an export to a dump file, but the dump file size exceeds the file size limit; my database is about 300 GB.

What would be an efficient way to replicate all the database objects to another box?
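Data Pump can split the dump into multiple pieces so no single file hits the size limit, using FILESIZE and the %U substitution variable; a sketch, with placeholder credentials and a directory object dp_dir:

expdp system/*** FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp FILESIZE=2G LOGFILE=full_exp.log
impdp system/*** FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp LOGFILE=full_imp.log

Classic exp also accepts FILESIZE together with a list of files (FILE=f1.dmp,f2.dmp,...) if Data Pump is not an option.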
I have a database named tom on one PC, named rose1, and I have to move it to my other PC, named rose2. I do not want to use RMAN.

The tom database is in archivelog mode; rose1 runs Windows XP 32-bit and rose2 runs Windows 7 64-bit. How can I move the tom database from the rose1 PC to the rose2 PC?
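Without RMAN this is a cold copy plus a controlfile recreation, and because the move also crosses from 32-bit to 64-bit, the PL/SQL has to be converted with utlirp.sql afterwards. A sketch, assuming the same Oracle version is installed on rose2 and that crctl.sql is the edited CREATE CONTROLFILE script taken from a BACKUP CONTROLFILE TO TRACE on rose1 (paths are placeholders):

-- on rose1: ALTER DATABASE BACKUP CONTROLFILE TO TRACE; SHUTDOWN IMMEDIATE;
-- copy the datafiles, redo logs, and pfile to rose2, then on rose2:
C:\> oradim -NEW -SID tom -STARTMODE AUTO -PFILE C:\oracle\database\INITtom.ORA
C:\> set ORACLE_SID=tom
C:\> sqlplus / as sysdba
SQL> STARTUP NOMOUNT
SQL> @crctl.sql                     -- recreate the controlfile with the new paths
SQL> ALTER DATABASE OPEN;           -- or OPEN RESETLOGS, matching the trace option
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP UPGRADE
SQL> @?\rdbms\admin\utlirp.sql      -- 32-bit to 64-bit word-size conversion
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP
SQL> @?\rdbms\admin\utlrp.sql       -- recompile invalid objects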
2) Created a source directory --> DR1
3) Created a target directory --> DR2
4) Created a DB link: CREATE DATABASE LINK LINK1 CONNECT TO <username of the remote DB> IDENTIFIED BY <password> USING '<remote DB name taken in the TNS file>';
5) On the local server I had written the command below.
create or replace procedure proc1 is
  cursor c1 is
    select recid, substr(name, 37) "ARCH_FILE"
    from v$archived_log;
  var1 c1%rowtype;
begin
  open c1;
  fetch c1 into var1;
[code]...
It runs, but it is not copying the files from local to remote.
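Assuming the goal is to push each archived log from the local DR1 directory to the remote DR2 directory, the server-side API for that is DBMS_FILE_TRANSFER.PUT_FILE over the LINK1 database link; a sketch with a hypothetical file name (note that PUT_FILE requires the file size to be a multiple of 512 bytes, which archived logs are):

BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DR1',
    source_file_name             => 'o1_mf_1_1158.arc',   -- hypothetical file name
    destination_directory_object => 'DR2',
    destination_file_name        => 'o1_mf_1_1158.arc',
    destination_database         => 'LINK1');
END;
/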
I tried to convert a physical SuSE Linux machine running Oracle 9i into a virtual machine (ESXi Server 5.0) using VMware vCenter Converter. Before starting, I shut down the database:

su - oracle
sqlplus '/ as sysdba'
shutdown

and then ran the converter.
But I have a problem with copying /u01 and /u02 (the database files):

Error: Unable to clone the volume mounted on '/u01'

So I want to clone the machine without /u01 and /u02 and then copy the files over. What should I do to get this right?
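One workable pattern is to exclude /u01 and /u02 in the converter, then copy them across the network once the VM boots, preserving ownership and permissions, with the database still shut down; a sketch (the VM hostname is a placeholder):

# on the physical machine:
tar cpf - /u01 /u02 | ssh root@newvm 'cd / && tar xpf -'

# then on the VM:
su - oracle
sqlplus '/ as sysdba'
SQL> startup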
I will have to proceed with an Oracle 9i database refresh from the production server to the integration server. The five biggest schemas must be exported and imported; they account for 97% of the space used in the database. This is a very big database, so I would like to be sure that everything will go smoothly. That is why I want to ask you some questions.

Have you got any advice for me before I start with exp/imp? From my side I will tell you that I will have to exp/imp schema by schema, because there is little disk space for a dump on both production and integration. The first thing I thought of was the dependencies between the schemas that are exported and those that are not, and also between the schemas that are exported/imported one by one.
This is the procedure that I plan (a command-level sketch follows the list):
For every schema that is to be refreshed {
  1. Export the schema with ROWS=N CONSTRAINTS=Y
  2. Export the schema with ROWS=Y CONSTRAINTS=N
  3. Import the schema from step one
  4. Disable all the foreign key constraints using ALTER TABLE ... DISABLE CONSTRAINT
  5. Import the schema with rows
}
ALTER TABLE ... ENABLE CONSTRAINT
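For one schema the loop body might look like this with classic exp/imp (schema and file names are placeholders):

exp system/*** OWNER=app FILE=app_meta.dmp ROWS=N CONSTRAINTS=Y
exp system/*** OWNER=app FILE=app_rows.dmp ROWS=Y CONSTRAINTS=N
imp system/*** FROMUSER=app TOUSER=app FILE=app_meta.dmp

-- generate the DISABLE statements for step 4:
SELECT 'ALTER TABLE '||owner||'.'||table_name||' DISABLE CONSTRAINT '||constraint_name||';'
FROM   dba_constraints
WHERE  owner = 'APP' AND constraint_type = 'R';

imp system/*** FROMUSER=app TOUSER=app FILE=app_rows.dmp IGNORE=Y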
With the above procedure I think I will avoid problems with dependencies between the schemas exported/imported one by one. But my concern is whether there are any dependencies between those schemas and the schemas that are not exported. Is there a way to check this before the refresh?
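Both kinds of boundary crossings can be listed up front from the data dictionary; a sketch with placeholder schema names:

-- objects in the refreshed schemas that depend on objects outside the set
SELECT owner, name, type, referenced_owner, referenced_name
FROM   dba_dependencies
WHERE  owner IN ('S1','S2','S3','S4','S5')
AND    referenced_owner NOT IN ('S1','S2','S3','S4','S5','SYS','SYSTEM','PUBLIC');

-- foreign keys that cross the schema boundary
SELECT c.owner, c.table_name, c.constraint_name,
       r.owner AS ref_owner, r.table_name AS ref_table
FROM   dba_constraints c
JOIN   dba_constraints r
  ON   r.owner = c.r_owner AND r.constraint_name = c.r_constraint_name
WHERE  c.constraint_type = 'R'
AND    c.owner IN ('S1','S2','S3','S4','S5')
AND    r.owner NOT IN ('S1','S2','S3','S4','S5');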