I had a problem with ODI 11g while loading a ";"-delimited file.
The record is this one:
Header
PIATTAFORMA; FUNZIONE; OPERAZIONE; SUB_OPERAZIONE; DESC_FUNZIONE; SYSTEM; SUBSYSTEM; DES_OPERAZIONE; PROGRESSIVO; TIPOLOGIA; COMMIT; LOGGATA; MOTIVO_ESCLUSIONE
Data
UCAMP-PWS;CDBXF001;;;Inquiry su GEBA;CD;BX;;;Lettura;NO;NO;Funzione senza dati cliente
The field DES_OPERAZIONE is empty for that set of records, but that should not be a problem because it is only a description. My target table is created with all VARCHAR2 columns except for one NUMBER field (PROGRESSIVO in the example), which is also empty in this case; again, not a problem, because it is not a key or anything like that.
What happens is that if the field DES_OPERAZIONE is empty, the record is not loaded, and without any error or warning message. I used NVL on the field to say "if empty then 'ND'", but it doesn't work. The strange thing is that if I leave the previous field empty, for example, it doesn't matter, and if I use NVL on that previous field it works; on the field DES_OPERAZIONE it does not!
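To see how Oracle itself parses that row outside of ODI, I could load the file through an external table; this is only a rough sketch, and the directory object DATA_DIR, the file name funzioni.csv, and the column lengths are assumptions (SYSTEM and COMMIT are renamed to avoid clashing with SQL keywords):

CREATE TABLE funzioni_ext (
  piattaforma       VARCHAR2(30),
  funzione          VARCHAR2(30),
  operazione        VARCHAR2(30),
  sub_operazione    VARCHAR2(30),
  desc_funzione     VARCHAR2(100),
  system_code       VARCHAR2(10),
  subsystem         VARCHAR2(10),
  des_operazione    VARCHAR2(100),
  progressivo       NUMBER,
  tipologia         VARCHAR2(30),
  commit_flag       VARCHAR2(3),
  loggata           VARCHAR2(3),
  motivo_esclusione VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                          -- skip the header line
    FIELDS TERMINATED BY ';'
    MISSING FIELD VALUES ARE NULL   -- empty fields load as NULL instead of rejecting the row
  )
  LOCATION ('funzioni.csv')
)
REJECT LIMIT UNLIMITED;

Selecting from this table shows whether the row with the empty DES_OPERAZIONE comes back as NULL or gets rejected at parse time.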
I don't care about the order of the values in a row. In other words, I want to get disjoint sets of rows connected by either of the two values. Every pair in the input table is unique.
I have seen on the web that it is possible to do this using CONNECT BY and hierarchical retrieval, but I have been trying a lot of combinations and I cannot reproduce the expected output.
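To be concrete, what I am after is something along the lines of the sketch below, assuming the input table is called PAIRS with columns VAL1 and VAL2 (names are mine): duplicate the pairs in reverse so the links work in both directions, walk them with CONNECT BY NOCYCLE, and label each value with the smallest value reachable from it, so rows sharing a label form one disjoint set.

WITH edges AS (
  SELECT val1, val2 FROM pairs
  UNION
  SELECT val2, val1 FROM pairs          -- reversed copies make the links undirected
)
SELECT member,
       MIN(root_val) AS group_id        -- smallest reachable value labels the whole set
FROM (
  SELECT CONNECT_BY_ROOT val1 AS root_val,
         val1                 AS member
  FROM   edges
  CONNECT BY NOCYCLE PRIOR val2 = val1  -- follow links, pruning paths that loop back
)
GROUP BY member
ORDER BY group_id, member;

On large inputs this kind of traversal can get expensive, so it may only be practical for modest volumes.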
I'm trying to add an .edmx file to my project for the first time. I want to choose the Oracle provider (ODP.NET), but I cannot find Oracle in the data source list. I have Oracle 11g installed, with ODP and ODT installed, and I can access the database from the solution as well. I saw Oracle listed under the data sources when I connected the solution to the database through Server Explorer. The solution is connected to the Oracle database through ODP.
I need to export only the data from schemas or tables. How do I do that with Oracle Data Pump? When we use the SCHEMAS parameter it exports the whole schema, not only the data, right?
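For example (DP_DIR and the HR schema are just placeholders), is CONTENT=DATA_ONLY the parameter I should be using?

expdp system/password schemas=HR content=DATA_ONLY directory=DP_DIR dumpfile=hr_data.dmp logfile=hr_data.log

expdp system/password tables=HR.EMPLOYEES content=DATA_ONLY directory=DP_DIR dumpfile=emp_data.dmp logfile=emp_data.log

The first call would export only the rows of every table in the HR schema; the second restricts it to a single table.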
I have set up a cross-platform (Microsoft Windows IA 32-bit -> Linux x86 64-bit) Data Guard configuration and it worked fine. Then I did a switchover (which again worked) and found out that the data is not getting replicated at all. I checked the data files visible from the new primary database and found that they are in the Windows format, as below:
SQL> select name from v$datafile;
NAME
--------------------------------------------------------------------------------
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSTEM01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSAUX01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\UNDOTBS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\USERS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\RMAN\RMAN_TS01.DBF
and physically they were created at '/home/app/oracle/product/11.2.0/db_1/dbs/' and as
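Is the fix something along these lines, i.e. converting the Windows paths on the Linux side and repointing the files the controlfile already records under Windows names? The Linux paths below are only examples:

-- on the Linux side, in the spfile; both convert parameters are static, so a restart is needed
ALTER SYSTEM SET db_file_name_convert  = 'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/' SCOPE=SPFILE;
ALTER SYSTEM SET log_file_name_convert = 'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/' SCOPE=SPFILE;

-- for files already recorded under Windows names: move the physical file to its
-- Linux location first, then repoint the controlfile entry
ALTER DATABASE RENAME FILE
  'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\USERS01.DBF'
  TO '/u01/app/oracle/oradata/MFS/users01.dbf';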
Log shipping stopped logging/shipping into the standby database on one of our Oracle Data Guard servers about two months ago, and no change had been carried out at that time. The Data Guard configuration has only one standby database. I have been managing the log shipping and recovery manually since then.
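For what it is worth, these are the kinds of checks I can run on both sides (DEST_ID 2 is only an assumption; adjust it to the LOG_ARCHIVE_DEST_n actually in use):

-- on the primary: state and last error of the standby archive destination
SELECT dest_id, status, error
FROM   v$archive_dest
WHERE  dest_id = 2;

-- on the primary: highest log sequence generated per thread
SELECT thread#, MAX(sequence#) AS last_generated
FROM   v$archived_log
GROUP  BY thread#;

-- on the standby: highest log sequence applied per thread
SELECT thread#, MAX(sequence#) AS last_applied
FROM   v$archived_log
WHERE  applied = 'YES'
GROUP  BY thread#;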
I would like to know if it is possible to create an automated standby-database disaster-recovery and high-availability solution (like Data Guard) using HP Data Protector with Oracle.
We need to transfer data from Oracle 10g to Oracle 9i under the following conditions.
There will be two database servers: an online server, where online users fill in forms generated by a Java/Spring/Hibernate application running against the 10g database, and an offline server running Oracle 9i. At day end I need to run a scheduled process that transfers the data from the online server to the offline server. For security reasons the client does not keep the two databases on the same network. My challenge is to transfer the data from the online server to the offline server while respecting the client's security norms. I have options like:
1) Use Oracle replication: create materialized views on the remote server and refresh them at regular intervals (see the sketch after this list). But the database connectivity is not continuous; should I go for that?
2) Write a Java application on an intermediate server that obtains connections to both database servers. From the Java application we call a procedure that selects the data from the Oracle 10g database and inserts it into the Oracle 9i database, using a flag on both sides to identify how many rows have been transferred and how many remain.
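For option 1, what I have in mind is roughly the following, assuming a TNS alias ONLINE_TNS for the 10g server and a source table ORDERS with a primary key (all names are illustrative); a fast refresh also needs a materialized view log on the source table:

-- on the 10g (online) server: track changes so fast refresh is possible
CREATE MATERIALIZED VIEW LOG ON orders;

-- on the 9i (offline) server: link to the online server and build the replica
CREATE DATABASE LINK online_link
  CONNECT TO app_user IDENTIFIED BY app_pwd
  USING 'ONLINE_TNS';

CREATE MATERIALIZED VIEW orders_mv
  REFRESH FAST ON DEMAND
  AS SELECT * FROM orders@online_link;

-- day-end job: pull only the changes accumulated since the last refresh
BEGIN
  DBMS_MVIEW.REFRESH('ORDERS_MV', 'F');
END;
/

Connectivity would only be needed for the duration of the day-end refresh.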
We have a requirement that whenever a stored procedure is executed, its resultant records have to be stored in an Excel file (just like a report). No third-party or reporting tools are to be used.
Is there any option in Oracle (a stored procedure or built-in packages) that can create an Excel file with the resultant records?
Excel file name: Product 01
Excel columns:

File sent on 13/02/2011:
Arrival Date   Product Code   Gate Pass   Quantity   Inspection
01/02/2011     00002          Y           2          Y
03/02/2011     00001          Y           10         Y
04/02/2011     00005          Y           14         Y
03/02/2011     00006          Y           74         Y

File sent on 14/02/2011:
Arrival Date   Product Code   Gate Pass   Quantity   Inspection
01/02/2011     00002          Y           2          Y
03/02/2011     00001          Y           10         Y
04/02/2011     00005          Y           14         Y
03/02/2011     00006          Y           74         Y
--- New updated data
05/02/2011     00002          Y           2          Y
06/02/2011     00001          Y           10         Y
05/02/2011     00005          Y           14         Y
05/02/2011     00006          Y           74         Y
I just want to insert data according to my structure. But if the same file is sent again with updated data, only the new rows should be loaded, because the previous data has already been imported (see the sketch after the table structure below).
Oracle table structure: Arrival_Date DATE, Product_Code CHAR(5), Quantity NUMBER
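What I want to achieve for the re-sent file is roughly this, assuming the spreadsheet rows are first loaded into a staging table PRODUCT_STG with the same three columns (all table and column names are mine):

-- insert only the arrival-date / product-code combinations not already loaded
INSERT INTO product_arrivals (arrival_date, product_code, quantity)
SELECT s.arrival_date, s.product_code, s.quantity
FROM   product_stg s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   product_arrivals t
         WHERE  t.arrival_date = s.arrival_date
         AND    t.product_code = s.product_code
       );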
Now we have 100+ SQL queries and we are turning all of them into procedures. After that we want to schedule those procedures and export their data into Excel files.
So we are planning to use UTL_FILE for the export to Excel. We may have more than 30,000 rows. Will UTL_FILE be able to write all of those rows for Excel, and will there be any performance issues?
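What we have in mind is roughly the following, assuming a directory object EXPORT_DIR exists and has been granted (directory, file, table, and column names are placeholders); it writes a CSV file, which Excel opens directly:

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- 32767 is the maximum line size UTL_FILE accepts
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'products.csv', 'w', 32767);
  UTL_FILE.PUT_LINE(l_file, 'ARRIVAL_DATE,PRODUCT_CODE,QUANTITY');
  FOR r IN (SELECT arrival_date, product_code, quantity FROM product_arrivals) LOOP
    UTL_FILE.PUT_LINE(l_file,
      TO_CHAR(r.arrival_date, 'DD/MM/YYYY') || ',' ||
      r.product_code || ',' || r.quantity);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- close the handle if it was opened, then re-raise
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/

Would 30,000+ rows through a loop like this cause any performance problem?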
I have XML data that needs to be stored in an Oracle table and is read from a Java application. Which datatype is better suited for the XML storage, VARCHAR2 or CLOB? The length of the XML will be less than 4000.
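For context, the kind of table I am thinking of looks like this (names are made up); I could also make the column a CLOB or XMLTYPE instead of VARCHAR2(4000) if that is the better choice:

CREATE TABLE xml_messages (
  msg_id  NUMBER PRIMARY KEY,
  payload XMLTYPE      -- or CLOB / VARCHAR2(4000), depending on the recommendation
);

-- the document is produced and read by the Java application
INSERT INTO xml_messages (msg_id, payload)
VALUES (1, XMLTYPE('<order><id>1</id></order>'));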
Oracle9i Enterprise Edition Release 9.2.0.8.0 - 64bit Production
PL/SQL Release 9.2.0.8.0 - Production
CORE 9.2.0.8.0 Production
TNS for HPUX: Version 9.2.0.8.0 - Production
NLSRTL Version 9.2.0.8.0 - Production
I have this query
select dept_id,qc_subtype_id,equip_code,drive_id from (select distinct dept_id, decode(qc_subtype_id,
When we have a primary key on 4 columns and, say, 20 million rows, and we want to add one extra row, how does Oracle check that the primary-key values of the row being added are unique with respect to the 20 million existing rows? Does it actually compare the new row against every row present in the table?
I need to create a stored procedure in Oracle 9i which will automatically delete data row by row from a particular table and then, within the same procedure, insert the records row by row back into the same table.
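What I have in mind is roughly the pattern below, assuming a table MY_TABLE with a key column ID and two payload columns (all names are mine); each iteration deletes one row and re-inserts it:

CREATE OR REPLACE PROCEDURE recycle_rows IS
BEGIN
  FOR r IN (SELECT id, col1, col2 FROM my_table) LOOP
    DELETE FROM my_table WHERE id = r.id;      -- remove the row
    INSERT INTO my_table (id, col1, col2)
    VALUES (r.id, r.col1, r.col2);             -- put it back
  END LOOP;
  COMMIT;
END recycle_rows;
/

The cursor is read-consistent, so the loop sees the table as it was when the procedure started, even though rows are being deleted and re-inserted underneath it.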
Can I get some documents on Oracle RAC database encryption? What are the pros and cons of encryption? Does it come with the Oracle Database, or is it something we need to buy from Oracle's sales people?