I'm an SAP consultant working with SQL Server on NT platforms. This is the first conversion from Oracle that I have done. My client has provided us with a "cold" backup of the Oracle database on a hard drive formatted under Unix. I have the partition mounted and I'm able to view the files. I have the ORDATA folder with all the .DBF files.
Q: How do I extract the data from the .DBF files? I need to export to something workable with SQL Server.
The original database was on Unix; I'm operating on a Windows platform.
I'm having a problem writing an appropriate query for a report in my web application. I need it to extract data from three related tables:
CAR (PK CAR_ID INT NOT NULL, TYPE VARCHAR NOT NULL)
REPAIR_CENTER (PK REPAIR_CENTER_ID INT NOT NULL, NAME VARCHAR NOT NULL)
[code]...
I need the report to display only available cars. Available cars must have these characteristics:
1. If the CAR_REPAIR table is empty, display all entries from the CAR table.
2. If a car has multiple entries in the CAR_REPAIR table, display it only if its latest DATE_RETURN is lower than today's date (SYSDATE); otherwise don't display that car.
3. Don't display cars that are in the CAR_REPAIR table and have a DATE_RETURN value of NULL.
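A hedged sketch of one query that covers all three rules at once, assuming CAR_REPAIR carries a CAR_ID foreign key and a DATE_RETURN column: a car is available exactly when it has no open repair row (DATE_RETURN IS NULL) and no repair row returning today or later, which also holds trivially when CAR_REPAIR is empty.

SELECT c.car_id, c.type
  FROM car c
 WHERE NOT EXISTS (
         SELECT 1
           FROM car_repair cr
          WHERE cr.car_id = c.car_id
            AND (cr.date_return IS NULL           -- rule 3: still in repair
                 OR cr.date_return >= SYSDATE));  -- rule 2: latest return not yet passed

Rule 2 is covered because the latest DATE_RETURN is lower than SYSDATE exactly when no row has DATE_RETURN >= SYSDATE, and rule 1 is covered because NOT EXISTS is true for every car when CAR_REPAIR has no rows.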
Is there a way I can extract the data from Oracle Express 6 (OLAP data) into Excel or any other format, so that it can be loaded into MS SQL or a normal Oracle database?
We are getting the following error when trying to extract data from ASM.
GGS ERROR 500 Oracle GoldenGate Capture for Oracle, ext_1.prm:
Getting attributes for ASM file +DATA/testgg/onlinelog/group_1.257.742844671,
SQL <BEGIN dbms_diskgroup.getfileattr('+DATA/testgg/onlinelog/group_1.257.742844671', :filetype, :filesize, :lblksize); END;>: (6550)
ORA-06550: line 1, column 7:
PLS-00201: identifier 'DBMS_DISKGROUP.GETFILEATTR' must be declared
ORA-06550: line 1, column 7:
PL/SQL: Statement ignored
Not able to establish initial position for begin time 2011-02-16 16:42:05.
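For reference, a hedged sketch of the extract parameter lines that usually resolve this: DBMS_DISKGROUP exists only in the ASM instance, so the error typically means the capture process is not logging in to the ASM instance as a SYSDBA-capable user. The TNS alias and credentials below are placeholders.

EXTRACT ext_1
USERID ogg_owner, PASSWORD ogg_pw
TRANLOGOPTIONS ASMUSER sys@ASM_ALIAS, ASMPASSWORD asm_pw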
How do I extract the data from XML using the XSD file? Files are attached.
Explanation: first check the EmailMessage tag from order_conf.xml against Email.xml (<xsd:element name="EmailMessage">); if it exists, go to the next node.
-> Next, <ns1:emailNotificationype>: this tag should follow under the EmailMessage tag (<xsd:element ref="emailNotificationype"> in Email.xml).
-> Next, <ns1:orderNotification>: check this tag against <xsd:element name="orderNotification"> in Email.xml.
-> Next, <ns1:templateFormatInfo>: it should follow under <xsd:element name="orderNotification"> in Email.xml.
-> Next, <ns1:templateFormatInfo> should follow these tags: <xsd:element name="templateFormatInfo">, <xsd:element ref="templatecode"/>, <xsd:element ref="templateversion"/>.
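If the goal is simply to verify that order_conf.xml conforms to the schema before extracting from it, a minimal sketch using Oracle XML DB, assuming a directory object XML_DIR holding both files (all names here are illustrative):

BEGIN
  DBMS_XMLSCHEMA.registerSchema(
    schemaurl => 'Email.xml',
    schemadoc => XMLType(bfilename('XML_DIR', 'Email.xml'),
                         nls_charset_id('AL32UTF8')),
    local     => TRUE);
END;
/

DECLARE
  doc XMLType := XMLType(bfilename('XML_DIR', 'order_conf.xml'),
                         nls_charset_id('AL32UTF8'));
BEGIN
  doc := doc.createSchemaBasedXML('Email.xml');
  doc.schemaValidate();  -- raises an error if the document does not conform
END;
/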
I have a table which has two columns: a date on an hourly basis and a response time. I want to pull the previous date's data on an hourly basis with the corresponding response time. The data is loaded into the table every midnight.
e.g. today's date is 23/10/2012; I want to pull data from 22/10/12 00 to 22/10/12 23.
The query below pulls the date as required, but I am not able to pull the response time.
WITH a AS (SELECT MIN(TRUNC(lhour)) AS mindate,
                  MAX(TRUNC(lhour)) AS maxdate
             FROM avg_hr)
SELECT TO_CHAR(maxdate + (LEVEL/25), 'dd/mm/yyyy hh24') AS dates
  FROM a
CONNECT BY LEVEL <= (1)*24;
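Since every hour is already a row in the table, a hedged alternative is to filter directly on yesterday's range and carry the response time along; the response-time column name RESP_TIME is an assumption.

SELECT TO_CHAR(lhour, 'dd/mm/yyyy hh24') AS dates,
       resp_time
  FROM avg_hr
 WHERE lhour >= TRUNC(SYSDATE) - 1   -- from 22/10/2012 00
   AND lhour <  TRUNC(SYSDATE)       -- up to 22/10/2012 23
 ORDER BY lhour;

If some hours can be missing from the table, keep the CONNECT BY hour generator and outer-join it to AVG_HR on the truncated hour instead.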
I have a requirement to extract the data from a table using the UTL_FILE utilities.
My problem is: say I have a table t1 with columns c1, c2, c3, c4, c5. This table t1 gets loaded every day. I need to pick up only the data that was changed or inserted in the last load. How can I achieve this? There is no timestamp in this table.
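Without a timestamp, one hedged approach is to keep a snapshot of the previous load and diff against it with MINUS, then write the delta out with UTL_FILE; the snapshot table t1_prev and the directory object DATA_DIR are assumptions.

-- rows that are new or changed since the previous load
CREATE TABLE t1_delta AS
  SELECT c1, c2, c3, c4, c5 FROM t1
  MINUS
  SELECT c1, c2, c3, c4, c5 FROM t1_prev;

DECLARE
  fh UTL_FILE.FILE_TYPE;
BEGIN
  fh := UTL_FILE.FOPEN('DATA_DIR', 't1_delta.csv', 'w');
  FOR r IN (SELECT c1, c2, c3, c4, c5 FROM t1_delta) LOOP
    UTL_FILE.PUT_LINE(fh, r.c1 || ',' || r.c2 || ',' || r.c3 || ',' || r.c4 || ',' || r.c5);
  END LOOP;
  UTL_FILE.FCLOSE(fh);
END;
/

-- refresh the snapshot for the next run
TRUNCATE TABLE t1_prev;
INSERT INTO t1_prev SELECT * FROM t1;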
SELECT t1.c1 AS gr1,
       t2.c1 AS gr2,
       t1.c2
  FROM test_data t1,
       test_data t2
 WHERE t1.c1 <> t2.c1
   AND t1.c2 = t2.c2
   AND (SELECT COUNT(*) FROM test_data t3 WHERE t3.c1 = t1.c1) =
       (SELECT COUNT(*) FROM test_data t4 WHERE t4.c1 = t2.c1)
 ORDER BY 1 ASC, 2 ASC;
but I can't find a way to re-filter the result to group the data as expected. The idea is to find the subsets and, for each set of data, show the values of column c1 that share it.
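If the intent is to list, per identical set of c2 values, which c1 values share it, a hedged sketch using LISTAGG (requires 11gR2 or later; this reading of "subsets" is an assumption):

SELECT LISTAGG(c1, ',') WITHIN GROUP (ORDER BY c1) AS c1_values,
       c2_set
  FROM (SELECT c1,
               LISTAGG(c2, ',') WITHIN GROUP (ORDER BY c2) AS c2_set
          FROM test_data
         GROUP BY c1)
 GROUP BY c2_set;

The inner query collapses each c1 to its ordered set of c2 values; the outer query then groups together the c1 values whose sets match exactly.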
Is it possible for Access to extract data from an Oracle database and upload it directly?
Currently we have a business process where data is extracted by scheduled queries (30+) to Excel spreadsheets, then manually edited to remove heading lines and imported into an Access database. I see an opportunity to automate a time-consuming manual activity by having the Access db extract the data and upload it directly.
I am considering the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know if importing to tape is possible. If so, would the data be accessible if needed later?
In our application, we allow users to upload data using an Excel sheet in the UI. We use a PHP script in the UI and SQL*Loader to load the data from the Excel sheet into temp_table.
The temp_table has a primary key.
My question is: is there a way to put a batch id on every upload in that table automatically, so that we can easily extract the data by batch id? We are using Oracle 11g.
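A hedged sketch of one way to do it: take one sequence value per upload and stamp it on every row through a small package; all object names here are illustrative.

CREATE SEQUENCE upload_batch_seq;

CREATE OR REPLACE PACKAGE batch_pkg AS
  FUNCTION current_batch_id RETURN NUMBER;
END batch_pkg;
/
CREATE OR REPLACE PACKAGE BODY batch_pkg AS
  g_batch_id NUMBER;  -- cached once per session, i.e. once per sqlldr run
  FUNCTION current_batch_id RETURN NUMBER IS
  BEGIN
    IF g_batch_id IS NULL THEN
      g_batch_id := upload_batch_seq.NEXTVAL;
    END IF;
    RETURN g_batch_id;
  END;
END batch_pkg;
/

In the SQL*Loader control file, the column is then filled from the function rather than from the data file:

..., batch_id EXPRESSION "batch_pkg.current_batch_id()"

Because SQL*Loader performs each load in a single session, every row of one upload gets the same batch id, and the next upload gets the next sequence value.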
I am working in a bank as a system consultant. I have a SAN storage area and Oracle set up as below.
SAN 1
This interface includes the DATA FILES of the Oracle tablespace.
SAN 2
SAN 1 mirrors the DATA FILES of the Oracle tablespace to SAN 2.
1. Can I rely on real-time data recovery from SAN 2?
2. If the SAN 1 data files are corrupted, will the SAN 2 data files be corrupted as well?
3. If SAN 2 is corrupted, what Oracle features can be used to have uncorrupted data?
I have set up a cross-platform (Microsoft Windows IA (32-bit) -> Linux x86 64-bit) Data Guard configuration and it worked fine. Then I did a switchover (which again worked) and found out the data is not getting replicated at all. I checked the data files available from the new primary database and found out they are in the Windows format, as below.
SQL> select name from v$datafile;
NAME
--------------------------------------------------------------------------------
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSTEM01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSAUX01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\UNDOTBS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\USERS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\RMAN\RMAN_TS01.DBF
and physically they were created at '/home/app/oracle/product/11.2.0/db_1/dbs/'...
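A hedged sketch of the usual remedy: set the name-conversion parameters on the standby so the Windows paths are translated. The Linux target path below is an assumption, and both parameters are static, so they take effect after a restart.

ALTER SYSTEM SET db_file_name_convert =
  'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/'
  SCOPE = SPFILE;

ALTER SYSTEM SET log_file_name_convert =
  'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/'
  SCOPE = SPFILE;

Setting standby_file_management = AUTO as well lets datafiles added on the primary be created on the standby automatically.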
We're planning a data migration from one application (Oracle-based) to another (also with an Oracle db).
The origin is a database of roughly 80 GB, so many millions of records are to be migrated (before loading the records into the destination tables, they have to be transformed).
The current concept is to receive all origin data in XML files, load them into a staging area (a dedicated migration schema in Oracle), then transform and load them into the destination tables.
We have three days for the whole migration (including extract from the origin database, transform, load, and backup after completion).
My question is whether a migration with XML files is a good concept. I think XML processing is much slower than doing the same with CSV files. My proposal to migrate an Oracle dump (so we would have the original data in our staging area) was declined.
Is migrating mass data with XML files good, or are there performance or other issues?
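For comparison, a minimal sketch of the CSV alternative: exposing one extract to the staging schema as an external table, so the transform can run as a set-based INSERT ... SELECT. The directory object STAGE_DIR, the file customers.csv, and all column names are illustrative assumptions.

CREATE TABLE stg_customers_ext (
  customer_id NUMBER,
  name        VARCHAR2(100),
  city        VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ';'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customers.csv')
)
REJECT LIMIT UNLIMITED;

-- transform and load in one set-based, direct-path pass
INSERT /*+ APPEND */ INTO customers (customer_id, name, city)
SELECT customer_id, UPPER(TRIM(name)), city
  FROM stg_customers_ext;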
We use Oracle Managed Files for storing datafiles in ASM disk groups. To add datafiles to a tablespace we usually issue:

SQL> ALTER TABLESPACE CADL_WM_TBS ADD DATAFILE '+DATA' SIZE 10G AUTOEXTEND OFF;
Tablespace altered.

And the new datafile will be created at the location below:
+DATA/orcl/datafile/cadl_wm_tbs.893.767888027

But my db_create_file_dest is set only to +DATA:

SQL> show parameter db_create_file

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
db_create_file_dest                  string      +DATA
Although the above datafile got created in the desired location (i.e. inside the +DATA/<dbname>/datafile/ directory), how did this happen without us setting the db_create_file_dest parameter to +DATA/orcl/datafile?
I was trying to load data from XML files into an Oracle database table. I followed the steps below to load that file data into a table. I created XML_DIR1 as an Oracle directory where I have kept all the XML files.
CREATE TABLE import_rpt_xml OF XMLType
  XMLTYPE STORE AS BINARY XML;

INSERT INTO import_rpt_xml
VALUES (XMLType(bfilename('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'),
        nls_charset_id('AL32UTF8')));
This insert shows the error below:

Error starting at line 80 in command:
INSERT INTO import_rpt_xml
VALUES (XMLType(bfilename('XML_DIR1', 'I-Yamanouchi-20040525-501.SGM'),
        nls_charset_id('AL32UTF8')))

Error report:
SQL Error: ORA-31061: XDB error: XML event error
ORA-19202: Error occurred in XML processing
In line 69 of orastream:
LPX-00217: invalid character 142 (U+008E)

I tried to look into my XML and found that it has some Japanese characters in it.
How do I deal with the Japanese characters in the XML? I don't want to lose those characters. My database NLS_CHARACTERSET is 'AL32UTF8'.
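One hedged guess: byte 142 (0x8E) is not valid UTF-8, so the file is probably not UTF-8 encoded. If it is actually Shift-JIS (an assumption; check the file's encoding declaration), naming that character set when constructing the XMLType lets Oracle convert the Japanese characters to AL32UTF8 instead of rejecting them.

INSERT INTO import_rpt_xml
VALUES (XMLType(bfilename('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'),
        nls_charset_id('JA16SJIS')));  -- JA16SJIS is the assumed source encoding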