I have to create a function that takes in three VARCHAR2 parameters: (SQL, Find, Replace).
1) SQL is the actual SQL statement. 2) Find is the table name to look for. 3) Replace is the table name that replaces the one found.
Moreover, there are 19 Find/Replace pairs, and the function should loop through all 19 to replace every table name matching Find with Replace. I could write the function, but the problem is how do I define these sets of table names?
I tried defining the table names as

    Script(1,1)  = "PS_CGF_TBL"     Script(1,2)  = "PS_CGF_C_TBL"
    ...
    Script(19,1) = "PS_CGF_AC_TBL"  Script(19,2) = "PS_CGF_XX_TBL"

and then tried to loop through them by declaring an integer index:

    For Index = 1 To 19
        string = ReplaceAll(SQL, Script(Index, 1), Script(Index, 2))
    Next
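One way to hold the 19 pairs in PL/SQL is a collection of collections declared inside the function, so no extra parameters are needed. A minimal sketch, assuming the pairs can be hard-coded; the function name, type names, and the single-parameter signature here are invented:

    CREATE OR REPLACE FUNCTION replace_tables (p_sql IN VARCHAR2)
      RETURN VARCHAR2
    IS
      TYPE t_pair  IS VARRAY(2)  OF VARCHAR2(30);
      TYPE t_pairs IS VARRAY(19) OF t_pair;
      l_map t_pairs := t_pairs(
                         t_pair('PS_CGF_TBL',    'PS_CGF_C_TBL'),
                         -- ... the remaining 17 pairs ...
                         t_pair('PS_CGF_AC_TBL', 'PS_CGF_XX_TBL'));
      l_sql VARCHAR2(32767) := p_sql;
    BEGIN
      -- Apply every Find/Replace pair in turn to the incoming SQL text.
      FOR i IN 1 .. l_map.COUNT LOOP
        l_sql := REPLACE(l_sql, l_map(i)(1), l_map(i)(2));
      END LOOP;
      RETURN l_sql;
    END replace_tables;
    /

One caveat: plain REPLACE also matches substrings, so if one table name is a prefix of another, order the pairs from longest to shortest name.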
I am studying multidimensional nested tables and have the code below:
    DECLARE
      TYPE table_type1 IS TABLE OF INTEGER;
      TYPE table_type2 IS TABLE OF table_type1;
      table_tab1 table_type1 := table_type1();
      table_tab2 table_type2 := table_type2(table_tab1);
    BEGIN
      FOR i IN 1 .. 2 LOOP
        table_tab2.extend;
        table_tab2(i) := table_type1();
[Code]...
    exception when others then dbms_output.put_line(sqlerrm);
    END;

This code is working fine as of now. But it fails if I comment out the code below (table_tab2 is also extended later):
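For reference, here is a self-contained variant of the same pattern that runs as posted; the key point is that each inner nested table must be initialised (and extended) before its elements can be assigned:

    DECLARE
      TYPE table_type1 IS TABLE OF INTEGER;
      TYPE table_type2 IS TABLE OF table_type1;
      table_tab2 table_type2 := table_type2();
    BEGIN
      FOR i IN 1 .. 2 LOOP
        table_tab2.EXTEND;
        table_tab2(i) := table_type1();     -- initialise the inner table first
        FOR j IN 1 .. 3 LOOP
          table_tab2(i).EXTEND;
          table_tab2(i)(j) := i * 10 + j;   -- element (i, j)
        END LOOP;
      END LOOP;
      DBMS_OUTPUT.PUT_LINE(table_tab2(2)(3));  -- prints 23
    END;
    /

Commenting out the table_type1() initialisation makes the first EXTEND on the inner table raise ORA-06531 (reference to uninitialized collection), which is likely what the elided experiment shows.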
I don't care about the order of the values in the row. In other words, I want to get disjoint sets of data connected by either of the two values. Every pair in the input table is unique.
I have seen on the web that it is possible to do this using CONNECT BY and hierarchical retrieval, but I have tried a lot of combinations and cannot reproduce the output.
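For what it's worth, here is one shape that query can take, assuming the input is a table pairs(a, b): expand each pair in both directions, walk the graph with CONNECT BY NOCYCLE, and label every value with the smallest value in its connected set. A sketch; it traverses from every row, so it is fine for small inputs but will not scale to large tables:

    WITH edges AS (
      SELECT a, b FROM pairs
      UNION
      SELECT b, a FROM pairs            -- make every pair walkable both ways
    )
    SELECT val, MIN(grp) AS group_id    -- smallest reachable value = set label
    FROM (
      SELECT CONNECT_BY_ROOT a AS grp, b AS val
      FROM   edges
      CONNECT BY NOCYCLE PRIOR b = a    -- NOCYCLE stops the back-and-forth loops
    )
    GROUP BY val
    ORDER BY group_id, val;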
Does Oracle have any capability of automatically checking which lossless compression algorithm it should apply by analyzing a data stream on data load? Does Oracle have any compression advisors/wizards that would make recommendations as to the type and level of compression?
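I can't speak to automatic per-stream algorithm selection, but from 11gR2 onwards there is an advisor, DBMS_COMPRESSION.GET_COMPRESSION_RATIO, which samples an existing table and estimates the ratio a given compression type would achieve. A sketch (the schema, table, and scratch tablespace names are placeholders; check the parameter list against your release's documentation):

    DECLARE
      l_blkcnt_cmp   PLS_INTEGER;
      l_blkcnt_uncmp PLS_INTEGER;
      l_row_cmp      PLS_INTEGER;
      l_row_uncmp    PLS_INTEGER;
      l_ratio        NUMBER;
      l_comptype     VARCHAR2(100);
    BEGIN
      DBMS_COMPRESSION.GET_COMPRESSION_RATIO(
        scratchtbsname => 'USERS',
        ownname        => 'SCOTT',
        tabname        => 'EMP',
        partname       => NULL,
        comptype       => DBMS_COMPRESSION.COMP_FOR_OLTP,
        blkcnt_cmp     => l_blkcnt_cmp,
        blkcnt_uncmp   => l_blkcnt_uncmp,
        row_cmp        => l_row_cmp,
        row_uncmp      => l_row_uncmp,
        cmp_ratio      => l_ratio,
        comptypestr    => l_comptype);
      DBMS_OUTPUT.PUT_LINE('Estimated ratio ' || l_ratio || ' for ' || l_comptype);
    END;
    /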
I have configured a physical standby on my local system. To check log shipping I created a table on the primary DB, but when I tried to check it on the standby, it says the table does not exist. Below are the primary and standby alert entries.
Primary alert log
Fatal NI connect error 12514, connecting to: (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=172.16.0.98)(PORT=1522))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=STAND)(SERVER=dedicated)(CID=(PROGRAM=d:\oracle11g\app\administrator\product\11.1.0\db_1\bin\ORACLE.EXE)(HOST=A960M)(USER=SYSTEM))(SERVER=dedicated)))
VERSION INFORMATION: TNS for 64-bit Windows: Version 11.1.0.6.0 - Production
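Independent of the alert log, one quick way to check shipping and apply is to compare archived log sequences on both sides (v$archived_log is a standard view; the dest_id = 2 filter assumes the standby is archive destination 2):

    -- On the primary: highest sequence shipped to the standby destination
    SELECT MAX(sequence#) FROM v$archived_log WHERE dest_id = 2;

    -- On the standby: highest sequence actually applied
    SELECT MAX(sequence#) FROM v$archived_log WHERE applied = 'YES';

Also note that on a physical standby, a new table only becomes visible once the logs carrying it have been applied and the database is opened read-only.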
I have a few tables in Oracle 9i/10g, and they already have data in them. I am trying to migrate data coming from various source systems into these Oracle tables. There is a chance that after loading I might get some unwanted data in these tables.
How do I remove just the data I loaded recently, without disturbing the original data already there?
I could back up those tables and reload the data if there is any problem, but I am looking for a different approach. I just don't want to change the existing system, as a lot of users use it.
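One low-impact approach, if the tables have primary keys and you are on 10g with enough undo retention: record the SCN just before each load, then delete only the rows that did not exist at that SCN. A sketch; the_table and pk_col are placeholders:

    -- Before the load, note the current SCN:
    SELECT current_scn FROM v$database;    -- e.g. 1234567

    -- After a bad load, remove only the newly loaded rows:
    DELETE FROM the_table t
    WHERE NOT EXISTS (
            SELECT 1
            FROM   the_table AS OF SCN 1234567 o
            WHERE  o.pk_col = t.pk_col);

This assumes the load only inserts rows; if it also updates existing rows, a flashback-based comparison of column values would be needed instead.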
I have database A (working in a live environment) and database B, a copy of A (not live). Last week I restored a whole-database RMAN backup of A onto database B. Now I don't want to change anything in any schema; I only want to import the new and updated records into the tables in database B.
There are around 20 schemas. Database B already has all the required database objects: procedures, functions, packages, indexes on all tables, and data in the tables. I just want to bring in the new and updated data.
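If every table has a primary key, one way that avoids re-restoring is a MERGE per table over a database link from B back to A, which picks up both new and changed rows. A sketch; src_link and the emp columns are placeholders:

    MERGE INTO emp b
    USING (SELECT * FROM emp@src_link) a
    ON (b.empno = a.empno)
    WHEN MATCHED THEN
      UPDATE SET b.ename = a.ename, b.sal = a.sal
    WHEN NOT MATCHED THEN
      INSERT (empno, ename, sal) VALUES (a.empno, a.ename, a.sal);

Deletes on A would not be picked up this way, and with ~20 schemas the statements are worth generating from the data dictionary rather than writing by hand.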
I'm trying to add an .edmx file to my project for the first time. I want to choose the Oracle provider ODP.NET but cannot find Oracle in the data source list. I have Oracle 11g installed, ODP and ODT installed, and can access the database from the solution as well. I did see Oracle listed under data sources when I connected the solution to the database through Server Explorer. The solution is connected to the Oracle database through ODP.
Statspack has been configured for Active Data Guard on the primary database. We got a spike of buffer busy waits for about 5 minutes in Active Data Guard, which degraded application SQL response times during that 5-minute window. Below is what I got from the statspack report for one hour:
    Snapshot     Snap Id   Snap Time           Sessions  Curs/Sess  Comment
    ~~~~~~~~   ---------   ------------------  --------  ---------  -------------------
    Begin Snap:     18611   21-Feb-13 22:00:02       236        2.2
    End Snap:       18613   21-Feb-13 23:00:02       237        2.1
    Elapsed:        60.00 (mins)
    [code]...
Why could there be a sudden spike of demand on UNDO data in Active Data Guard?
In my WHEN-BUTTON-PRESSED trigger, I want the form to first delete all the previous data and then populate the data from the table; that's why I used CLEAR_BLOCK first, but CLEAR_BLOCK is not working here. My code is given below.
    go_block('show');
    clear_block(NO_VALIDATE);
    declare
      cursor c1 is select * from qtr_demand order by 1;
    begin
    [code]..........
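For comparison, a hedged sketch of the whole trigger in the usual populate-a-block shape: navigate to the block, clear it, then walk the cursor and create the records. The item and column names (qtr, demand) are invented, since the original block layout isn't shown:

    GO_BLOCK('show');
    CLEAR_BLOCK(NO_VALIDATE);
    DECLARE
      CURSOR c1 IS SELECT * FROM qtr_demand ORDER BY 1;
    BEGIN
      FOR rec IN c1 LOOP
        :show.qtr    := rec.qtr;     -- invented item/column names
        :show.demand := rec.demand;
        NEXT_RECORD;                 -- move to a fresh record for the next row
      END LOOP;
      FIRST_RECORD;                  -- leave the cursor on the first row
    END;

If CLEAR_BLOCK appears to do nothing, it is worth checking that the trigger actually reaches it (no earlier navigation error) and that 'show' is the block the cursor is in when the trigger runs.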
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether writing the dump files to tape is possible. If so, would the data be accessible if needed later?
1) Split values from the "INST" column: suppose 23.
2) Find all values from the "NUM" column for the above split value, i.e. 23.

Eg:

For INST = 23, its corresponding "NUM" values are: 1234, 1298

3) Save these values into a table Y, with INST, NUM as column names:

    INST  NUM
    23    1234,1298

1) I have a thousand records in table X, and for all of those records I need to split and save data into table Y. Hence, I need to do this task with the best possible performance.
2) After this, whenever new data comes into table X, the above split-and-save operation should automatically be called and append the corresponding data wherever possible; see the sketch after this list.
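On 11gR2 the backfill can be a single LISTAGG aggregation, and the incremental part a row-level trigger. A sketch, assuming table X is table_x(inst, num) and table Y is table_y(inst, num_list); all names here are invented, and the trigger ignores concurrent-insert ordering:

    -- One-time backfill of table Y from table X:
    INSERT INTO table_y (inst, num_list)
    SELECT inst,
           LISTAGG(TO_CHAR(num), ',') WITHIN GROUP (ORDER BY num)
    FROM   table_x
    GROUP  BY inst;

    -- Keep table Y current as new rows arrive in table X:
    CREATE OR REPLACE TRIGGER trg_x_split_save
    AFTER INSERT ON table_x
    FOR EACH ROW
    BEGIN
      MERGE INTO table_y y
      USING (SELECT :NEW.inst AS inst, TO_CHAR(:NEW.num) AS num FROM dual) s
      ON (y.inst = s.inst)
      WHEN MATCHED THEN
        UPDATE SET y.num_list = y.num_list || ',' || s.num
      WHEN NOT MATCHED THEN
        INSERT (inst, num_list) VALUES (s.inst, s.num);
    END;
    /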
We are migrating a Pro*C application as described below.
Old Env: UNIX Old DB: Oracle 8i
New Env: Linux New DB: Oracle 11g
New modules are successfully compiled in the Linux environment, but we are facing issues writing the output of a VARCHAR datatype to a file.

Find below an extract of the code:

    EXEC SQL BEGIN DECLARE SECTION;
        varchar mcolmnvarchar[4];
    EXEC SQL END DECLARE SECTION;

    EXEC SQL DECLARE crs CURSOR FOR SELECT NVL(colmn, ' ') FROM table1;

    memset(mcolmnvarchar.arr, '\0', 4);  /* added only for the Linux migration; not present in the Unix env */

    EXEC SQL FETCH crs INTO :mcolmnvarchar;

    cout << "Data at Stage one" << mcolmnvarchar.arr << endl;
    mcolmnvarchar.arr[mcolmnvarchar.len] = '\0';
    cout << "Data at Stage two" << mcolmnvarchar.arr << endl;
    fprintf(fptr, "%-4s", mcolmnvarchar.arr);
The above code works absolutely fine in the Unix environment with Oracle 8i, but with the Linux environment and Oracle 11g it is not working. There are no compilation or runtime errors. The "Data at Stage one" line prints the database output properly, but after the null-terminator line, the "Data at Stage two" statement prints without any value. The value is lost after the null-terminator code.
I am trying to transfer data from an Oracle database table into another table that resides in SQL Server 2008. A database link is already set up, and data can be selected/queried using
select * from a_table@db_link
The thing is, when trying to insert Arabic data from Oracle into the SQL Server table with
    insert into a_table@db_link
    values ('N''some data in Arabic'' , '11-oct-00', 'some data in English');
    commit;
it inserts all right, but when we try to display the data it shows garbage instead of the Arabic; the English data is fine, but the date also shows as garbage?!
When inserting Arabic data directly through SQL Server it is fine, though. I suggested that we transfer the data through ODBC to flat files like Access and then to SQL Server, but the team rejected that since they're going to do it daily and the data is huge (we are talking more than 28,000 records).
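If the intent was an NVARCHAR (Unicode) literal, note that in Oracle the N prefix belongs outside the quotes; as posted the quoting is mangled, and if the N ends up inside the string the Unicode marker is lost. A sketch of the corrected statement, assuming '11-oct-00' meant 11 October 2000:

    insert into a_table@db_link
    values (N'some data in Arabic', DATE '2000-10-11', 'some data in English');
    commit;

Garbage on retrieval can also point at a character-set mismatch between the Oracle client/database character set and the gateway/ODBC settings, which is worth checking independently of the literal syntax.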
I need to export only the data from schemas or tables. How do I do that with Oracle Data Pump? When we use the SCHEMAS parameter, it exports the whole schema, not only the data, right?
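Data Pump's CONTENT parameter controls exactly this: CONTENT=DATA_ONLY unloads table rows without the object definitions (the default is ALL; METADATA_ONLY is the reverse). A sketch, assuming a directory object dp_dir already exists:

    expdp system/password SCHEMAS=hr CONTENT=DATA_ONLY DIRECTORY=dp_dir DUMPFILE=hr_data.dmp LOGFILE=hr_data.log

The same parameter also works with TABLES=... instead of SCHEMAS=..., and on the impdp side.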
I am working in a bank as a system consultant. I have a SAN storage area and Oracle set up as below.
SAN 1
This interface holds the data files of the Oracle tablespaces.
SAN 2
SAN 1 mirrors the data files of the Oracle tablespaces to SAN 2.
1. Can I rely on real-time data recovery from SAN 2?
2. If the data files on SAN 1 are corrupted, will the SAN 2 data files be corrupted as well?
3. If SAN 2 is corrupted, what Oracle features can be used to have uncorrupted data?
I did everything as written, but when I run

    SQL> alter database recover managed standby database disconnect from session;

and go and look in the standby database alert log file, this is what's written:
    ORA-01157: cannot identify/lock data file 1 - see DBWR trace file
    ORA-01110: data file 1: 'F:\ORACLE\PRODUCT\10.2.0\ORADATA\DB\SYSTEM01.DBF'
    ORA-27041: unable to open file
    OSD-04002: unable to open file
    O/S-Error: (OS 3) The system cannot find the path specified.
[code]....
The strange thing is that it resolves the primary database files on drive F and goes looking for them there; I don't understand what could be the reason for this, although I'm running this command while the primary database is shut down!
I am configuring a logical standby online. When I execute DBMS_LOGSTDBY.BUILD the first time,
SQL> EXECUTE DBMS_LOGSTDBY.BUILD;
it was blocked by another session, so I killed the holding session, but that didn't work. Then I cancelled this step and executed it again. The error is:
    SQL> EXECUTE DBMS_LOGSTDBY.BUILD;
    BEGIN DBMS_LOGSTDBY.BUILD; END;

    *
    ERROR at line 1:
    ORA-01354: Supplemental log data must be added to run this command
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 3669
    ORA-06512: at "SYS.DBMS_LOGMNR_INTERNAL", line 3755
    ORA-06512: at "SYS.DBMS_LOGMNR_D", line 12
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_INTERNAL_LOGSTDBY", line 370
    ORA-06512: at "SYS.DBMS_LOGSTDBY", line 157
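ORA-01354 points at missing supplemental logging, which DBMS_LOGSTDBY.BUILD requires. A sketch of the usual prerequisite (run on the primary, then verify through v$database):

    ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY, UNIQUE INDEX) COLUMNS;

    SELECT supplemental_log_data_pk, supplemental_log_data_ui FROM v$database;

The killed session may also have left the LogMiner dictionary build half-done, so checking for leftover holding sessions before re-running BUILD is worth doing.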
I have set up a cross-platform (Microsoft Windows IA (32-bit) -> Linux x86 64-bit) Data Guard and it worked fine. Then I did a switchover (which again worked) and found out the data is not getting replicated at all. I checked the data files visible from the new primary database and found they are in the Windows format, as below:
SQL> select name from v$datafile;
    NAME
    --------------------------------------------------------------------------------
    D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSTEM01.DBF
    D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSAUX01.DBF
    D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\UNDOTBS01.DBF
    D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\USERS01.DBF
    D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\RMAN\RMAN_TS01.DBF
while physically they were created under '/home/app/oracle/product/11.2.0/db_1/dbs/'.
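The usual fix for unlike directory layouts between primary and standby is the name-convert parameter pair on the standby side; these are static parameters, so SCOPE=SPFILE plus a restart is needed, and the Linux target path below is illustrative:

    ALTER SYSTEM SET db_file_name_convert =
      'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/mfs/'
      SCOPE = SPFILE;
    ALTER SYSTEM SET log_file_name_convert =
      'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/mfs/'
      SCOPE = SPFILE;

Files that were already created under $ORACLE_HOME/dbs would still need to be moved and re-pointed with ALTER DATABASE RENAME FILE.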
1) The SCN differs between the standby and the primary (I checked: one day's difference). How do I make the SCNs the same?
2) I created a table on the primary and it's not reflecting on the standby (below I've pasted the alert log entries):
    ORA-27041: unable to open file
    OSD-04002: unable to open file
    O/S-Error: (OS 2) The system cannot find the file specified.
    Errors in file d:\oracle11g\app\administrator\diag\rdbms\stand\stand\trace\stand_dbw0_6916.trc:
    ORA-01157: cannot identify/lock data file 2 - see DBWR trace file
    ORA-01110: data file 2: 'D:\ORACLE11G\APP\ADMINISTRATOR\ORADATA\STAND\SYSAUX01.DBF'
[code]....
3) When I try to open the standby database in read-only mode, it gives the error below:

    ERROR at line 1:
    ORA-16004: backup database requires recovery
    ORA-01157: cannot identify/lock data file 1 - see DBWR trace file
    ORA-01110: data file 1: 'D:\ORACLE11G\APP\ADMINISTRATOR\ORADATA\STAND\SYSTEM01.DBF'
Let us say I enter empno 10 (which is not in the database) on the FIND screen and press the FIND button; it shows 'QUERY CAUSED NO RECORDS'. Up to this point it's working fine. But after this, if I press CTRL+F11 in block A, it's not pulling records; only in this case does it fail to pull records.
But if I enter something else on the FIND screen that does return data, then pressing CTRL+F11 pulls all records.
Why does it fail to pull records only when the first query finds nothing?
I created a data warehouse in Oracle 10g with three dimensions and one cube; after that it created 4 tables. How do I use INSERT statements to load data into those tables, and how do I access the data afterwards?
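The 4 tables are most likely the three dimension tables plus one fact table. A generic sketch with invented names: load the dimensions first so that the fact table's foreign keys resolve, then query the star with joins:

    INSERT INTO dim_time    (time_id, month_name)    VALUES (1, 'JAN');
    INSERT INTO dim_product (prod_id, prod_name)     VALUES (10, 'WIDGET');
    INSERT INTO dim_region  (region_id, region_name) VALUES (100, 'EMEA');

    INSERT INTO sales_fact (time_id, prod_id, region_id, sales_amt)
    VALUES (1, 10, 100, 2500);

    -- Read the data back by joining the fact table to its dimensions:
    SELECT t.month_name, p.prod_name, r.region_name, f.sales_amt
    FROM   sales_fact f
    JOIN   dim_time    t ON t.time_id   = f.time_id
    JOIN   dim_product p ON p.prod_id   = f.prod_id
    JOIN   dim_region  r ON r.region_id = f.region_id;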