I want to automate the import from production to test: 1) export the production schema, 2) import it into the test server.
How can I automate this? Currently I am doing it manually as follows:
1) expdp the production schema, 2) kill all connections to the test schema on the test server, 3) drop the test user cascade, 4) recreate the user, 5) impdp the production schema into test.
But I want it automated or scheduled so I don't have to log in every night!
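Since nothing in that sequence needs interaction, the whole thing can go in one shell script driven by cron. A minimal sketch, assuming a schema APPUSER, TNS aliases PRODDB and TESTDB, a directory object DP_DIR on both servers pointing at /u01/dp, and ssh access between the hosts; every one of those names is made up:

    #!/bin/sh
    # refresh_test.sh -- nightly refresh of the APPUSER schema (sketch only)

    # 1) export on production; the dump lands in DP_DIR on the PROD
    #    server, so clear yesterday's file first and pull it across
    #    (expdp refuses to overwrite an existing dump file)
    ssh oracle@prodhost rm -f /u01/dp/appuser.dmp
    expdp system@PRODDB schemas=APPUSER directory=DP_DIR \
          dumpfile=appuser.dmp logfile=exp_appuser.log
    scp oracle@prodhost:/u01/dp/appuser.dmp /u01/dp/

    # 2) + 3) on test: kill APPUSER sessions, then drop the user
    export ORACLE_SID=TESTDB
    sqlplus -s / as sysdba <<'EOF'
    BEGIN
      FOR s IN (SELECT sid, serial# FROM v$session
                WHERE  username = 'APPUSER') LOOP
        EXECUTE IMMEDIATE 'ALTER SYSTEM KILL SESSION '''
                          || s.sid || ',' || s.serial# || ''' IMMEDIATE';
      END LOOP;
    END;
    /
    DROP USER appuser CASCADE;
    EXIT
    EOF

    # 4) + 5) a schema-mode dump taken by a privileged user contains the
    #    CREATE USER statement, so impdp recreates the user and loads it
    impdp system@TESTDB schemas=APPUSER directory=DP_DIR \
          dumpfile=appuser.dmp logfile=imp_appuser.log

Schedule it from cron, e.g. 0 2 * * * /home/oracle/refresh_test.sh, and the nightly login goes away. One open point: expdp/impdp will prompt for a password as written, so you would use OS authentication or a wallet rather than hard-coding one.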
I have to perform an Oracle 9 database refresh from the production server to the integration server. The 5 biggest schemas must be exported and imported; they constitute 97% of the space used in the database. This is a very big database, so I would like to be sure that everything goes smoothly. That is why I want to ask you some questions.
Have you got any clues for me before I start with exp/imp? For my part, I can tell you that I will have to exp/imp schema by schema, because there is little disk space for a dump on both the production and integration sides. The first thing I thought of was dependencies between the schemas that are exported and those that are not, and also between the schemas that are exported/imported one by one.
This is the procedure I plan:
For every schema that is to be refreshed:
1. Export the schema with ROWS=N CONSTRAINTS=Y
2. Export the schema with ROWS=Y CONSTRAINTS=N
3. Import the schema from step 1
4. Disable all the foreign key constraints using ALTER TABLE ... DISABLE CONSTRAINT
5. Import the schema with rows
Then, at the end, ALTER TABLE ... ENABLE CONSTRAINT.
With the above procedure I think I will avoid problems with dependencies between the schemas exported/imported one by one. But my concern is whether there are any dependencies between those schemas and the schemas that are not exported. Is there a way to check this before the refresh?
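One way to get a first look is to query the dictionary for anything crossing the refresh boundary. A sketch, with the refreshed schemas standing in as SCHEMA1/SCHEMA2 (placeholders for your five):

    -- Object-level dependencies (views, procedures, ...) that reference
    -- something outside the set being refreshed:
    SELECT owner, name, type, referenced_owner, referenced_name
    FROM   dba_dependencies
    WHERE  owner IN ('SCHEMA1','SCHEMA2')
    AND    referenced_owner NOT IN ('SCHEMA1','SCHEMA2','SYS','PUBLIC');

    -- Foreign keys pointing at tables owned by schemas outside the set:
    SELECT c.owner, c.table_name, c.constraint_name,
           r.owner AS ref_owner, r.table_name AS ref_table
    FROM   dba_constraints c
    JOIN   dba_constraints r
           ON r.owner = c.r_owner
          AND r.constraint_name = c.r_constraint_name
    WHERE  c.constraint_type = 'R'
    AND    c.owner IN ('SCHEMA1','SCHEMA2')
    AND    r.owner NOT IN ('SCHEMA1','SCHEMA2');

Running both queries in the other direction (swap the IN and NOT IN sets) shows the non-exported schemas that depend on the refreshed ones.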
I'm trying to import a table of data (character set: CL8ISO8859P5) into another database (character set: AL32UTF8) using the exp/imp utility. After the import, all the Cyrillic text was corrupted!
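The usual culprit with exp/imp is the client NLS_LANG during the export: if it does not match the source database character set, the data is converted (often lossily) on the way into the dump file. A sketch of the safe sequence, assuming a Unix client and made-up credentials:

    # Export with NLS_LANG matching the SOURCE database character set,
    # so the dump is written in CL8ISO8859P5 with no conversion:
    export NLS_LANG=RUSSIAN_RUSSIA.CL8ISO8859P5
    exp user/pass@sourcedb tables=MY_TABLE file=my_table.dmp

    # Import with the same NLS_LANG; imp then performs a single
    # CL8ISO8859P5 -> AL32UTF8 conversion inside the target database:
    export NLS_LANG=RUSSIAN_RUSSIA.CL8ISO8859P5
    imp user/pass@targetdb file=my_table.dmp full=y

If the original export was taken with a Western NLS_LANG, the Cyrillic bytes were already mangled in the dump and the export has to be redone.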
We have quite a number of sessions in database MES (production) coming from another machine.
From v$session, the program is oracle@WID27 (TNS V1-V3). This host, WID27, runs quite a number of development databases. We have to trace which jobs are actually triggering this, as WID27 is not supposed to connect to production databases.
How can we tell whether the sessions come in over a dblink or from the machine itself?
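A program of oracle@<host> usually means a server process on the remote box, which is typical of dblink traffic, but to prove it you can record it at logon. A sketch, assuming 10gR2 or later (where the DBLINK_INFO attribute of USERENV exists); the table and trigger names are made up:

    -- Record, at logon, whether a session arrived over a database link.
    CREATE TABLE session_audit (
      logon_time   DATE,
      username     VARCHAR2(128),
      host         VARCHAR2(256),
      dblink_info  VARCHAR2(4000)   -- NULL when not from a dblink
    );

    CREATE OR REPLACE TRIGGER trg_logon_audit
    AFTER LOGON ON DATABASE
    DECLARE
      PRAGMA AUTONOMOUS_TRANSACTION;  -- do not leave an open transaction
    BEGIN
      INSERT INTO session_audit
      VALUES (SYSDATE,
              SYS_CONTEXT('USERENV', 'SESSION_USER'),
              SYS_CONTEXT('USERENV', 'HOST'),
              SYS_CONTEXT('USERENV', 'DBLINK_INFO'));
      COMMIT;
    END;
    /

Sessions logged with a non-null dblink_info came through a database link (the value names the source database); for the rest, the HOST column shows which machine connected directly.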
I need to use Data Pump for the first time on my production database. Currently, on the testing database, when I take a schema-level export there are no errors or warnings in the log file, but when I import it, the following ORA warning appears in the import log file. I searched on Google, and the only fix I found is to recompile the invalid objects. How can I avoid these warnings in the log file?
"ORA-39082: Object type ALTER_PROCEDURE:"QUANTISV4"."P_CTM_ABN_INVST_EQUITY" created with compilation warnings"
I would like to know how I can automate the export from production to the test server. I need direction on creating a process to import data from production (server A) to the test server (server B).
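If you would rather keep the whole process inside the database than script it at the OS level, the Data Pump API plus the scheduler can do a network-mode import with no dump file at all. A sketch, run on server B, assuming a database link PROD_LINK back to server A and a schema APPUSER; every name here is hypothetical and error handling is omitted:

    CREATE OR REPLACE PROCEDURE refresh_from_prod AS
      h       NUMBER;
      l_state VARCHAR2(30);
    BEGIN
      -- pull the schema straight from A over the link
      h := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                              job_mode    => 'SCHEMA',
                              remote_link => 'PROD_LINK');
      DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APPUSER'')');
      DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'REPLACE');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, l_state);
    END;
    /

    -- Run it every night at 02:00
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'NIGHTLY_REFRESH',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'REFRESH_FROM_PROD',
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /

The network-mode job replaces tables rather than dropping the whole user, which avoids the kill/drop/recreate dance at the cost of the transfer running over the link.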
2. I have got 4 disks in one disk group (Disk1 to Disk4). I have not defined any failure group, and as per my understanding each disk will be added to its own failure group, without mirroring and striping.
According to my understanding, if Disk1 fails, Disk4 facilitates normal operations, and when there is a space crunch it operates with reduced redundancy. Am I right?
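To check how ASM actually laid things out, you can query the ASM views directly; a minimal sketch, run on the ASM instance (the disk group name DATA is a placeholder):

    -- Redundancy type of the disk group (EXTERN, NORMAL or HIGH):
    SELECT name, type, state
    FROM   v$asm_diskgroup
    WHERE  name = 'DATA';

    -- When no failure groups are named, each disk gets its own:
    SELECT name, failgroup, path
    FROM   v$asm_disk
    WHERE  group_number = (SELECT group_number
                           FROM   v$asm_diskgroup
                           WHERE  name = 'DATA');

Whether a surviving disk takes over for a failed one depends on the TYPE column: mirroring across failure groups only happens with NORMAL or HIGH redundancy, not with EXTERNAL.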
I want to turn OFF tablespace AUTOEXTEND on a production system; we have many RAC databases and this will be done on all of them. I found a document on the net, written on 29-Jun-2007, which says that to turn off AUTOEXTEND for a tablespace you first need to turn it off on the underlying datafiles of that tablespace, but that document is for Oracle 8.1.7.2.0.
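AUTOEXTEND is still a datafile-level attribute in later releases, so the same approach applies; a sketch that generates the statements for one tablespace (USERS is a placeholder name):

    -- One ALTER per autoextensible datafile of the tablespace:
    SELECT 'ALTER DATABASE DATAFILE ''' || file_name
           || ''' AUTOEXTEND OFF;'
    FROM   dba_data_files
    WHERE  tablespace_name = 'USERS'
    AND    autoextensible = 'YES';

Run the generated statements, then re-query dba_data_files to confirm autoextensible = 'NO'. Dropping the tablespace filter produces the script for a whole database, which is convenient when repeating this across many RAC databases.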
We are planning to consolidate our Oracle production DBs onto one server. We are basically a Windows shop. Is it feasible to run 8 production Oracle DBs on one Windows server? None of the DBs are really transaction-intensive: 2 DBs are around 300GB in size, and the others average around 40GB.
I can take care of the disk slicing so Oracle does not run into an I/O bottleneck. We are planning to go with external NAS or SAN storage.
My main concern is processor usage. We are thinking of 2 x Intel Xeon quad-core processors. Will there be a processor bottleneck, or is there a way in Oracle to assign processor usage? (I believe there are not many tweaking options here.)
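There is one knob worth knowing about: from 11g onwards an instance can be capped with instance caging, i.e. a per-instance CPU_COUNT combined with an active Resource Manager plan, so 8 instances can each be pinned to a slice of the 8 cores. A sketch, run in each instance (the cap of 2 is only an example, and this assumes the consolidated databases are 11g):

    -- Enable a Resource Manager plan, then cap this instance at 2 cores:
    ALTER SYSTEM SET resource_manager_plan = 'DEFAULT_PLAN' SCOPE=BOTH;
    ALTER SYSTEM SET cpu_count = 2 SCOPE=BOTH;

Without caging, all instances share the CPUs freely, which on mostly idle, non-transaction-intensive databases is often fine anyway.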
We have an Oracle Server database sized at 50 GB, holding 10 GB of data, and we are planning to move to a new 200 GB database server. My question is: after moving all of the 10 GB of data to the 200 GB database server, will the performance of the system come down? Will it reduce the speed?
Now I have a problem. I have two Oracle 10g databases installed on two servers (call them A and B). I have already created a database link between the two databases. Each database has over 200 tables.
The structure of the two databases is basically the same, but database A holds the real-time data and database B is a standby server. Now I want to synchronize all the data from database A to database B. What should I do?
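The purpose-built tool for a standby is Data Guard, but if you only need periodic one-way synchronization over the existing database link, materialized views sketch the idea; table and link names below are made up, and this has to be repeated per table:

    -- On database A: a log so the view can refresh incrementally
    CREATE MATERIALIZED VIEW LOG ON customers;

    -- On database B: reuse the existing local copy of the table and
    -- refresh it from A every 10 minutes over the link
    CREATE MATERIALIZED VIEW customers
      ON PREBUILT TABLE
      REFRESH FAST
      START WITH SYSDATE NEXT SYSDATE + 10/1440
      AS SELECT * FROM customers@link_to_a;

ON PREBUILT TABLE fits your case because the tables already exist on B; without it you would have to drop the local copies first. For a real disaster-recovery standby, though, Data Guard keeps B consistent transactionally rather than table by table.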
We have a requirement from the customer to start using data and index compression in our 11g database. Is this available in Oracle 10g/11g without any additional cost? We are not sure whether it will work with our application, so we will have to test it in-house. Is it possible to compress existing table data/indexes to test it out?
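Existing segments can be rebuilt compressed; a sketch with placeholder names (note the licensing split: basic table compression and index key compression come with Enterprise Edition, but the 11g OLTP flavor, COMPRESS FOR OLTP, is the separately licensed Advanced Compression option, and basic compression only compresses direct-path/bulk-loaded rows):

    -- Rebuild an existing table compressed (basic compression):
    ALTER TABLE orders MOVE COMPRESS;

    -- The MOVE leaves the table's indexes unusable, so rebuild them;
    -- a multi-column index can be key-compressed at the same time:
    ALTER INDEX orders_cust_ix REBUILD COMPRESS;

    -- Compare segment sizes before and after:
    SELECT segment_name, bytes/1024/1024 AS mb
    FROM   user_segments
    WHERE  segment_name IN ('ORDERS', 'ORDERS_CUST_IX');

Doing this on a test copy of the schema gives you both the size savings and a way to regression-test the application against the compressed objects.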
I am continuously inserting data into an Oracle database. After some time, around 2 hours, Oracle disconnects with errors like:

ERROR:
ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist
Process ID: 0
Session ID: 0
Serial number: 0

After restarting the database with shutdown immediate and startup, if I start inserting records it shows errors like:
ORA-01653: unable to extend table SYSTEM.GLT_PROT_TRAFFIC_SUM_VOIP by 8192 in tablespace GLCOMM
But I created the tablespace as a bigfile tablespace with autoextend on and maxsize unlimited. I have 400GB of disk space and created redo logs of 15GB. I have tried reinstalling Oracle several times, but the problem is not solved.
The same problem happens with smallfile datafiles as well.
Operating system: Windows Server 2008 R2 Standard; Oracle server: 11g; Oracle client: 64-bit.
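ORA-01653 means the next extent could not be allocated in tablespace GLCOMM, so the first thing to verify is what its files really look like; a small sketch:

    -- Is GLCOMM really bigfile, and can its files still grow?
    SELECT tablespace_name, bigfile
    FROM   dba_tablespaces
    WHERE  tablespace_name = 'GLCOMM';

    SELECT file_name, bytes/1024/1024 AS mb,
           autoextensible, maxbytes/1024/1024 AS max_mb
    FROM   dba_data_files
    WHERE  tablespace_name = 'GLCOMM';

Even with MAXSIZE UNLIMITED, a datafile cannot grow past the free space on the Windows volume it sits on, so compare those numbers against what the 400GB disk actually has free. And since ORA-01034 means the instance itself went down mid-run, the alert log for that time window is worth reading as well.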
I am having a problem with my tnsnames.ora file. It seems the file cannot handle connection information for more than one DB. It worked fine in the beginning; after that, whenever I had to connect to a DB, I had to keep switching and renaming 5 files according to whichever database I wanted to connect to. These are the entries in the file for 2 DBs, and it won't work; with just one entry in the file it works, but not with two.
I am sure there is nothing wrong with the IP/virtual name, because with just one of these in the file they work excellently. I tried leaving a blank line between the connection strings for each DB, not leaving lines, etc.
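For reference, a tnsnames.ora with two databases is just two top-level entries; a sketch with made-up names (the usual reasons a second entry breaks the file are an unbalanced parenthesis in the first entry, or an alias line that does not start in column 1 — continuation lines must be indented, alias names must not be):

    DB1 =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = host1.example.com)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = db1svc))
      )

    DB2 =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = host2.example.com)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = db2svc))
      )

Checking each alias with tnsping DB1 and tnsping DB2 confirms whether the file parses past the first entry.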
Import: Release 10.2.0.1.0 - Production on Wednesday, 17 March, 2010 11:07:02
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.1.0 - Production
Starting "MUBA"."SYS_IMPORT_TABLE_01": muba/******** tables=FUNCTION_NO directory=testdump NETWORK_LINK=DBLINK1
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
We are migrating from [IBM P5 - AIX 5.3 - Oracle DB 10.2.0.1 64bit] to [IBM P7 - AIX 6.1 - Oracle DB 10.2.0.5 64bit].
The new database is already up and running. Our next steps are the following: 1. Create tablespaces (done), 2. Export the database from P5, 3. Import the dump into P7.
We have the following options:
I. Use a Windows XP workstation (32-bit) with Oracle 10.2.0.1 Database software to export from P5, then import the dump to P7 (Export/Import Utility 10.2.0.1 Windows 32-bit).
II. Use the Export Utility of P7 (10.2.0.5 AIX 64-bit) to do the export and import.
III. Use the Export Utility of P5 (10.2.0.1 AIX 64-bit), then use the Import Utility of P7 (10.2.0.5 AIX 64-bit).
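For what it's worth, the long-standing rule of thumb with exp/imp across versions is to export with the exp of the lower (source) version and import with the imp matching the target, which is option III; a sketch of those invocations, with placeholder credentials and file names:

    # On P5, using its own 10.2.0.1 exp:
    exp system/*** full=y file=full_p5.dmp log=full_p5_exp.log

    # Transfer the dump in BINARY mode (an ASCII-mode FTP corrupts
    # dump files), then on P7, using its 10.2.0.5 imp:
    imp system/*** full=y ignore=y file=full_p5.dmp log=full_p5_imp.log

The dump file format is platform-independent, so the AIX-to-AIX move itself is not a concern; the utility versions are.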
BTW, our colleague tried the following:
A. Use the Windows Export Utility 10.1.0.2.0 (32-bit) to make a dump of our database (Oracle 10.2.0.1.0 AIX 64-bit).
B. Use the Windows Import Utility 10.1.0.2.0 (32-bit) to import the dump file from step A into our new database (Oracle 10.2.0.1.0 AIX 64-bit).
But after issuing

imp system/password@NEWDB file=(a.dmp, b.dmp... c.dmp) full=y ignore=y statistics=none

the import seems to hang here:

importing SYSTEM's objects into SYSTEM