The production database has 30 tables spread across different schemas. Some tables hold 5 years of data, some 8 years, and some 6 years, and these tables contain anywhere from millions to billions of records. Let us assume the tables in the production database are:

Table1  -- 8 years of data -- 3,538,969,000 records
Table2  -- 6 years of data --   592,844,435 records
Table3  -- 3 years of data --    33,224,993 records
Table4  -- 4 years of data --    52,361,756 records
Table5  -- 5 years of data --     8,948,567 records
Table15 -- 6 years of data --   308,476,987 records
Now I want to transfer the data from these 15 tables to a test database, based on the following conditions:

For Table1, I want to transfer 6 years of data to the test database, keep 2 years of data in production, and delete the transferred 6 years from the production database.
For Table2, I want to transfer 4 years of data to the test database, keep 2 years of data in production, and delete the transferred 4 years from the production database.
For Table3, I want to transfer 2 years of data to the test database, keep 1 year of data in production, and delete the transferred 2 years from the production database.
This will be done periodically. That is, suppose we run the script now and transfer the requested data to the test database; after a year, once the data in production has grown again, we have to run the script again and transfer data to the test database based on similar conditions, so it should keep working for a long time. What is a good, fast process for transferring the data?
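For illustration, a minimal sketch of the per-table move, assuming a database link named TEST_DB to the test database and a DATE column named CREATED_DT on each table (both names are illustrative, not given above). For tables of this size, range partitioning by date would make the periodic purge far cheaper than bulk deletes:

-- Copy rows older than the 2-year retention window to the test database
-- (TEST_DB and CREATED_DT are assumed names):
INSERT /*+ APPEND */ INTO table1@test_db
SELECT * FROM table1
WHERE created_dt < ADD_MONTHS(TRUNC(SYSDATE), -24);
COMMIT;

-- ...then remove the same rows from production:
DELETE FROM table1
WHERE created_dt < ADD_MONTHS(TRUNC(SYSDATE), -24);
COMMIT;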
I'm doing a project for my college and I need to transfer a database. I have created tables in the database and linked them to VB using ADODC. I have done all of this on the computer at my home, so the whole project is on my computer. I want to show the same project at my college, but I can't, as the database is stored on my personal computer at home. Is there any way I could transfer the database from my house to college, for example by storing it on a pen drive? I used SQL*Plus on Oracle 9i to create the tables.
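One approach that fits Oracle 9i is the classic export/import utilities; a minimal sketch, assuming the tables live in a schema named SCOTT (the schema name and connect string are illustrative):

REM On the home machine, export the schema to a dump file:
exp scott/tiger@orcl file=project.dmp log=exp.log owner=scott

REM Copy project.dmp to the pen drive, then import on the college machine:
imp scott/tiger@orcl file=project.dmp log=imp.log fromuser=scott touser=scott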
I wanted to transfer my database from Oracle 11.1.0 on Windows Server 2003 32-bit to Oracle 11.2.0 on Windows Server 2008 64-bit.

I have done the following:

1) On the new 11.2.0 Windows 2008 64-bit server, I used the RMAN backup from Oracle 11.1.0 on the Windows 2003 32-bit server to restore and recover the database. When I tried to open the database with ALTER DATABASE OPEN, I got a message that I should start the database in upgrade mode, so I started it with:
STARTUP UPGRADE

and then ran the following scripts in order:

@?/rdbms/admin/utlirp.sql
@?/rdbms/admin/utlu112i.sql
@?/rdbms/admin/catupgrd.sql
During the run of catupgrd.sql, after around 30 minutes, I got the error ORA-03113: end-of-file on communication channel. I reconnected to the database, shut it down, and opened it in normal mode; it opened, but there were thousands of invalid (uncompiled) objects, which I recompiled. The only invalid objects left are 60 owned by OLAPSYS (a locked user) and 30 owned by PUBLIC (all synonyms).
Have I taken the correct steps? What should I do now to verify that everything will be all right in the future?
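For reference, a quick way to see what is still invalid and to retry a bulk recompile is the standard data-dictionary query and stock Oracle script below (nothing here is specific to this particular upgrade):

-- Summarize the remaining invalid objects by owner and type:
SELECT owner, object_type, COUNT(*) AS invalid_count
FROM dba_objects
WHERE status = 'INVALID'
GROUP BY owner, object_type;

-- Recompile everything that can still be compiled:
@?/rdbms/admin/utlrp.sql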
I have an Oracle 11.2 database running on Oracle Enterprise Linux 5.9. How can I transfer data from the Oracle database to a flat file on a Windows server? What I have done so far is use UTL_FILE to create a CSV file on the Oracle server, and I am now attempting to transfer this file. I was going to use scp or rcp but was unable to get that to work (I was looking at FileZilla). Another option is FTP, as I have a UNIX script I can run to do this. All of this is driven by an Oracle package that runs hourly through DBMS_SCHEDULER. I have been using sp_host_command to run UNIX commands directly from PL/SQL, so as a last resort I can use it to run a UNIX script if I can't find an easier way to automate this.
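For reference, a minimal sketch of the UTL_FILE step described above, assuming a directory object named EXPORT_DIR and a table named SALES (both names are illustrative, not from the original package):

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  -- Open the CSV for writing via the EXPORT_DIR directory object:
  f := UTL_FILE.FOPEN('EXPORT_DIR', 'sales.csv', 'w');
  FOR r IN (SELECT id, amount FROM sales) LOOP
    UTL_FILE.PUT_LINE(f, r.id || ',' || r.amount);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/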
We transferred our Oracle 11.1.0.7 database from Windows 2003 Enterprise Edition 32-bit to Windows 2008 Enterprise Edition 64-bit. The database is working fine, but we have 53 invalid (uncompiled) objects, all related to OLAPSYS and PUBLIC.
I have written makefiles that compile .pc files on UNIX. This was for several projects that use an oralib source code directory. Just running proc on one target .pc file works fine on UNIX. I am trying to use proc (Oracle 10.2.0) on Windows, and I keep getting:

unable to open include file

for #include <stdio.h> and other C library headers.

I am doing all development under Cygwin; this way I can write a makefile just as on UNIX instead of using nmake. All the C library headers are in /usr/include. When I run proc on Solaris like this:

proc program.pc

there are no problems, and I do get program.c. However, on Windows I get the error message above. I have tried proc include=/user/include program.pc and proc include=/user/include parse=full program.pc, but I still get the same error message.
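As a hedged suggestion (not part of the original question): on Windows, Pro*C usually needs the C system header location passed explicitly via its sys_include option, since it does not search Cygwin's /usr/include on its own; the paths below are illustrative:

proc sys_include=C:\cygwin\usr\include include=C:\myproject\include parse=full program.pc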
I have database A (working in the live environment) and database B, a copy of database A (not live). Last week I restored the whole of database A onto database B from an RMAN backup file. Now I don't want to change anything in any schema; I only want to import the updated and new records into the tables in database B. There are around 20 schemas. Database B already has all the required database objects (procedures, functions, packages, and indexes on all tables) as well as the table data; I just want to add the new and updated data.
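One way to express that per table is a MERGE over a database link; a minimal sketch, assuming a link named LIVE_DB to database A, a primary key column ID, and a LAST_UPDATED column (all assumed, since no columns are named above):

MERGE INTO emp b
USING (SELECT id, name, last_updated
       FROM emp@live_db
       WHERE last_updated > (SELECT NVL(MAX(last_updated), DATE '1900-01-01')
                             FROM emp)) a
ON (b.id = a.id)
WHEN MATCHED THEN
  UPDATE SET b.name = a.name, b.last_updated = a.last_updated
WHEN NOT MATCHED THEN
  INSERT (id, name, last_updated) VALUES (a.id, a.name, a.last_updated);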
create directory TEST13 as 'C:\temp';
create table test102 (id number(4), aaa bfile);
insert into test102 values (1234, bfilename('TEST13','sg1.pdf'));
select * from test102;
-- returns: 1234 (BFILE)
select id, DBMS_LOB.GETLENGTH(aaa) from test102;
-- ORA-22288: file or LOB operation GETLENGTH failed
I stored some PDF files in C:\temp on my local machine. The DBA suggests not putting the files on the DB server. Is there a way to read the PDF files without storing them on the DB server?
I can delete the old archive logs with the .arc extension. Today, when I query V$RECOVERY_FILE_DEST, there are files with today's date. Can I delete these files without endangering the database?
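For reference, archived logs are normally removed through RMAN rather than at the OS level, so the control file (and catalog, if used) stays in sync; a minimal sketch, where the 7-day window is illustrative:

RMAN> CROSSCHECK ARCHIVELOG ALL;
RMAN> DELETE ARCHIVELOG ALL COMPLETED BEFORE 'SYSDATE-7';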
I'm an SAP consultant working with SQL Server on NT platforms, and this is the first conversion from Oracle that I have done. My client has provided us with a "cold" backup of the Oracle database on a hard drive formatted under UNIX. I have the partition mounted and I'm able to view the files; I have the ORADATA folder with all the .DBF files.

Q: How do I extract the data from the .DBF files? I need to export it to something workable with SQL Server. The original database was on UNIX; I'm operating on a Windows platform.
I am running the following command to back up my archive logs; I want to keep at least two hours of .arc files on disk.
rman <<EOT
connect target login/password
connect catalog login/password
show all;
sql "alter system archive log current";
backup filesperset 5 archivelog until time 'sysdate-2/24'
  format $FILE_DEST_ARCH delete input;
resync catalog;
EOT
On some occasions there will not be any archive logs that meet this criterion, and I get the following error:
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of backup command at 12/01/2010 12:50:58
RMAN-06004: ORACLE error from recovery catalog database: RMAN-20242: specification does not match any archive log in the recovery catalog
Is there anything I can do to the RMAN command above to prevent this error while keeping the same functionality?
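One hedged workaround (an assumption, not from the original script) is to have the driving shell script count the qualifying logs first and skip the backup step when there are none:

-- If this returns 0, skip the archivelog backup for this run:
SELECT COUNT(*)
FROM v$archived_log
WHERE completion_time <= SYSDATE - 2/24
AND deleted = 'NO';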
The trace directory is full of .trc files containing the text below:
-------- Dumping Sorted Master Trigger List --------
Trigger Owner : INPUT    Trigger Name : CUS_TST
-------- Dumping Trigger Sublists --------
A new file is generated roughly every minute, and I can't stop it from happening. I have tried setting the trace_enabled parameter to FALSE, but without success.
We're planning a data migration from one application (Oracle-based) to another (also on an Oracle database).

The source is a database of roughly 80 GB, so many millions of records have to be migrated (and before the records can be loaded into the destination tables, they have to be transformed).

The current concept is to receive all source data as XML files, load them into a staging area (a dedicated migration schema in Oracle), and then transform and load them into the destination tables.

We have three days for the whole migration (including the extract from the source database, transformation, loading, and a backup after completion).

My question is whether migrating via XML files is a good concept. I think XML processing is much slower than doing the same with CSV files. My proposal to migrate an Oracle dump (so we would have the original data in our staging area) was declined.

Is migrating mass data with XML files workable, or are there performance or other issues?
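For comparison, the CSV route is typically served by SQL*Loader or an external table, which turns the staging load into a plain set-based insert; a minimal external-table sketch, where the directory object, table, and column names are all illustrative:

CREATE TABLE stg_customers_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY mig_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('customers.csv')
);

-- Staging load (stg_customers is the assumed staging table):
INSERT /*+ APPEND */ INTO stg_customers SELECT * FROM stg_customers_ext;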
I would like to know if there is a way to create custom log files, or a way to utilize Oracle's built-in logging. Here is the code showing what I am trying to do:
IF (V_CONDITION IS NOT NULL) THEN
  IF (V_CONDITION <> 'NS' OR V_CONDITION <> 'NE' OR V_CONDITION <> 'AR'
      OR V_CONDITION <> 'OH' OR V_CONDITION <> 'SV') THEN
[Code]....
OK, so where you see V_CONDITION := 'XX';, I would like the value of V_CONDITION to be written to a log file as opposed to changing the value to 'XX'. So I would have something like:
RAISE BAD_CONDITION_CODE;
The problem is that I am not sure how to create the actual log file that the value of V_CONDITION will be stored in.
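A common pattern for this (all names below are assumed, not from the code above) is to log to a table through an autonomous transaction, so the log entry survives even if the main transaction rolls back; if an OS file is required instead, UTL_FILE can replace the INSERT:

CREATE TABLE condition_log (
  logged_at DATE,
  bad_value VARCHAR2(10)
);

CREATE OR REPLACE PROCEDURE log_bad_condition(p_value IN VARCHAR2) IS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO condition_log (logged_at, bad_value)
  VALUES (SYSDATE, p_value);
  COMMIT;  -- commits only this autonomous transaction
END;
/

-- In the IF branch, instead of V_CONDITION := 'XX':
--   log_bad_condition(V_CONDITION);
--   RAISE BAD_CONDITION_CODE;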
How can I open a PDF (Acrobat Reader) file from Oracle Forms 6i through a procedure? I have written a procedure that opens .DOC and .XLS files using OLE2.Create_Object, but I am unable to do the same for .PDF files. I am attaching the code in a txt file showing how I did it for Word; OPEN_WORD is the name of the procedure/program unit.
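One commonly suggested alternative (an assumption on my part, not from the attached code) is the Forms HOST built-in, which hands the file to the operating system's registered viewer; the path below is illustrative:

-- Let Windows open the PDF with its default viewer (e.g. Acrobat Reader):
HOST('cmd /c start "" "C:\docs\manual.pdf"', NO_SCREEN);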
I have the Oracle base version and our client version. In the Oracle base version we have the products.fmb file; in our client version we have the item.fmb file, which was created from the base version. Since then, many people have made changes to the client version. Now I want to compare those two files, but there is no Form Builder on our system. Is it possible to compare them without Form Builder?
I want to export an Oracle table to Excel format. I am using Oracle SQL Developer 3.2. I know the option to export a table through the GUI, but I want to achieve this at the script or procedure level (in SQL Developer 3.2). I have tried several methods but do not get the proper output.
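One script-level option in SQL Developer 3.x (worth verifying against your exact build) is its output-format hint, run as a script (F5), which produces a CSV file that Excel opens directly; the spool path and table name are illustrative:

spool c:\temp\emp.csv
select /*csv*/ * from emp;
spool off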
I have a couple of reports that are produced as .CSV files from PL/SQL, but I am missing some design features. Is it possible to define text and background colors, and the height and width of fields? (I am using the UTL_FILE package.)
We use Oracle-managed files for storing datafiles in ASM disk groups. To add datafiles to a tablespace we usually issue:

SQL> ALTER TABLESPACE CADL_WM_TBS ADD DATAFILE '+DATA' SIZE 10g AUTOEXTEND OFF;

Tablespace altered.

The new datafile is then created at the location below:

+DATA/orcl/datafile/cadl_wm_tbs.893.767888027

But our db_create_file_dest is set only to +DATA:

SQL> show parameter db_create_file

NAME                  TYPE    VALUE
--------------------  ------  ------
db_create_file_dest   string  +DATA

Although the above datafile got created in the desired location (i.e. inside the +DATA/<dbname>/datafile/ directory), how did this happen without us setting the db_create_file_dest parameter to +DATA/orcl/datafile?
We have a customer who was backing up his data by copying the Oracle XE data folder (all the .dbf files) to a backup drive. His server crashed, and he formatted it and reinstalled the database from scratch with 11g.

Now he needs to re-attach the .DBF files. Is there a way?