I have data in multiple Oracle tables. I have to create an extract flat file after applying some validation and business logic, and store it on a Unix server with the naming convention FF_RMS_SC_<<YYYYMMDDhhmm>>.txt. This job will be scheduled to run daily to create the flat file. I guess PL/SQL and Unix need to be used.
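A minimal sketch of the PL/SQL side, assuming a directory object EXTRACT_DIR that points at the Unix target path and a hypothetical view VALIDATED_EXTRACT_VIEW that already applies the validation and business rules (both names are placeholders, not from the original post). A daily cron entry or DBMS_SCHEDULER job can then invoke this block:

DECLARE
   l_file UTL_FILE.FILE_TYPE;
   -- file name built from SYSDATE so each run produces FF_RMS_SC_<YYYYMMDDhhmm>.txt
   l_name VARCHAR2(60) := 'FF_RMS_SC_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MI') || '.txt';
BEGIN
   l_file := UTL_FILE.FOPEN('EXTRACT_DIR', l_name, 'w', 32767);
   FOR r IN (SELECT col1 || '|' || col2 AS line
             FROM   validated_extract_view) LOOP   -- hypothetical source of validated rows
      UTL_FILE.PUT_LINE(l_file, r.line);
   END LOOP;
   UTL_FILE.FCLOSE(l_file);
END;
/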
A big table, more than 4 GB in size, in a 10g DB needs to be extracted/exported into a text file; the column delimiter is "&|" and the row delimiter is "$#". I cannot do it from TOAD, as it hangs while extracting the big table.
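One hedged alternative to TOAD is a SQL*Plus spool, which streams the rows instead of buffering them in a GUI. This sketch assumes a hypothetical table BIG_TAB with columns C1, C2, C3; SET DEFINE OFF stops SQL*Plus from treating the & in the column delimiter as a substitution variable:

SET DEFINE OFF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 TERMOUT OFF TRIMSPOOL ON LINESIZE 32767
SPOOL /tmp/big_tab_extract.txt
SELECT c1 || '&|' || c2 || '&|' || c3 || '$#'
FROM   big_tab;
SPOOL OFF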
How do I extract the data from XML using the XSD file? (Files attached.)
Explanation: first check the EmailMessage tag from order_conf.xml against Email.xml (<xsd:element name="EmailMessage">); if it exists, go to the next node. Under EmailMessage, the next tag <ns1:emailNotificationype> should correspond to <xsd:element ref="emailNotificationype"> in Email.xml. Next, <ns1:orderNotification> should be checked against <xsd:element name="orderNotification"> in Email.xml. Next, <ns1:templateFormatInfo> should follow under <xsd:element name="orderNotification">, and it should contain the elements declared under <xsd:element name="templateFormatInfo">: <xsd:element ref="templatecode"/> and <xsd:element ref="templateversion"/>.
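A minimal sketch of pulling the leaf values with XMLTABLE, assuming order_conf.xml is readable through a directory object XML_DIR and that the ns1 namespace URI is http://example.com/ns1 (both are assumptions, not taken from the attached files). For strict validation against the XSD itself, the schema would first have to be registered with DBMS_XMLSCHEMA and the document checked with XMLType.schemaValidate.

SELECT x.templatecode, x.templateversion
FROM   XMLTABLE(
          XMLNAMESPACES(DEFAULT 'http://example.com/ns1'),   -- hypothetical namespace URI
          '/EmailMessage/emailNotificationype/orderNotification/templateFormatInfo'
          PASSING XMLTYPE(BFILENAME('XML_DIR', 'order_conf.xml'), NLS_CHARSET_ID('AL32UTF8'))
          COLUMNS templatecode    VARCHAR2(30) PATH 'templatecode',
                  templateversion VARCHAR2(30) PATH 'templateversion'
       ) x;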
I have a requirement to extract the data from a table using the UTL_FILE utility.
My problem is: say I have a table T1 with columns C1, C2, C3, C4, C5. This table T1 gets loaded every day. I need to pick up only the data that was changed/inserted in the last load. How can I achieve this? There is no timestamp in this table.
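Without a timestamp column, one hedged approach is to keep a snapshot copy of the previous load and diff against it with MINUS; T1_PREV and T1_DELTA below are hypothetical tables with the same columns as T1. Alternatives such as a trigger, a materialized view log, or ORA_ROWSCN also exist but need more setup. The delta rows can then be written out with UTL_FILE.

INSERT INTO t1_delta
SELECT c1, c2, c3, c4, c5 FROM t1
MINUS
SELECT c1, c2, c3, c4, c5 FROM t1_prev;

-- refresh the snapshot so the next run diffs against today's data
TRUNCATE TABLE t1_prev;
INSERT INTO t1_prev SELECT * FROM t1;
COMMIT;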
Upgrading from 10.1.0.2 to 10.1.0.5. Enterprise Manager requires the 'newest' version of the Oracle JDBC driver. I downloaded what I believe to be the correct file (classes12.jar). I'm unclear what to do with this; my readings have pointed me in the following direction:
1) copy to c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin 2) extract
Here is the problem... I tried:
1) just clicking on it (nothing)
2) c:\program files\java\jre1.6.0_03\bin\javaw -jar classes12.jar -- Error: Failed to load Main-Class manifest attribute from c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin\classes12.jar
Is my location correct? I've been hunting everywhere and making no progress.
I have a Bash script that counts the rows of a CSV file, extracts the fields, and writes the inserts into a SQL file. Then it logs into SQL*Plus and calls the insert file. The SQL file looks like this:
I rely on "WHENEVER SQLERROR EXIT" for things to go the right path. However, sometimes, because of the contents of the CSV files (which I can't control), some rows don't get inserted, but SQL*Plus doesn't see that as an error, doesn't exit, and I end up with the wrong number of rows being reported in the second insert. Is there some kind of "if-then-else" construct in SQL? After all the inserts are made, do a "select count(*)" and compare that number to the one reported by the script. If they match, make the final insert and commit; else exit.
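There is no if-then-else at the plain SQL level, but an anonymous PL/SQL block at the end of the generated file can do the comparison and raise an error, which WHENEVER SQLERROR then treats as a failure. A hedged sketch, assuming hypothetical names target_table and load_audit, with the expected count supplied by the bash script as the substitution variable &expected_rows:

WHENEVER SQLERROR EXIT SQL.SQLCODE ROLLBACK
-- ... the generated INSERT statements go here ...
DECLARE
   l_cnt NUMBER;
BEGIN
   SELECT COUNT(*) INTO l_cnt FROM target_table;
   IF l_cnt <> &expected_rows THEN
      RAISE_APPLICATION_ERROR(-20001,
         'Loaded ' || l_cnt || ' rows, expected &expected_rows');
   END IF;
END;
/
-- the final insert and commit only run if the block above did not raise
INSERT INTO load_audit VALUES (SYSDATE, &expected_rows);
COMMIT;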
I am creating a primary and a physical standby on the same Windows machine. I have done all the settings. I can TNSPING both ORACLE_SIDs. I am trying to connect as SYSDBA to bring up the physical standby, but it gives the error below:
Enter user-name: sys as sysdba
Enter password:
ERROR:
ORA-12560: TNS:protocol adapter error
So I tried creating a password file for the physical standby, but it gives the error below:
Unable to find error file %ORACLE_HOME%\RDBMS\opw<lang>.msb
When creating a PDF report in Reports 6 under Oracle 10g, the error message displayed is: "The file is damaged and could not be repaired." If I change the report definition from a PDF to a TXT file, the error is not displayed, but the output has the margins distorted. It only happens when using Reports 6.
How do I create a file in a folder based on today's date? I need to know how to define a variable in SQL*Plus and assign a value to it. Here is the code below. The code gets executed without creating a spool file.
DEFINE _DATE = replace('C:\_sysdate_EU001.csv', '_sysdate_', TO_CHAR(SYSDATE, 'DD-MON-YYYY'))
spool _DATE
set serveroutput on size 100000
select * from dual;
spool off
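DEFINE stores the text literally and never evaluates REPLACE or SYSDATE, which is why no dated spool file appears. A hedged rework using COLUMN ... NEW_VALUE lets a query result become the substitution variable (the variable name spool_date and the target path are placeholders):

COLUMN dt NEW_VALUE spool_date NOPRINT
SELECT TO_CHAR(SYSDATE, 'DD-MON-YYYY') AS dt FROM dual;
SPOOL C:\&spool_date._EU001.csv
SET SERVEROUTPUT ON SIZE 100000
SELECT * FROM dual;
SPOOL OFF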
Using PL/SQL code, I am creating a text file on a specific path on the database server. I need to compress this file. How do I compress the text file using PL/SQL?
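One hedged option that stays inside PL/SQL is UTL_COMPRESS, which produces a gzip-compatible stream; this sketch assumes the text file and the target .gz both sit in a directory object EXTRACT_DIR and that the file name is report.txt (all placeholders). Alternatives are a DBMS_SCHEDULER external job or a Java stored procedure calling the OS gzip.

DECLARE
   l_src   BFILE := BFILENAME('EXTRACT_DIR', 'report.txt');
   l_plain BLOB;
   l_gz    BLOB;
   l_out   UTL_FILE.FILE_TYPE;
   l_pos   INTEGER := 1;
   l_amt   INTEGER;
   l_chunk RAW(32767);
BEGIN
   -- load the text file into a temporary BLOB
   DBMS_LOB.CREATETEMPORARY(l_plain, TRUE);
   DBMS_LOB.FILEOPEN(l_src, DBMS_LOB.FILE_READONLY);
   DBMS_LOB.LOADFROMFILE(l_plain, l_src, DBMS_LOB.GETLENGTH(l_src));
   DBMS_LOB.FILECLOSE(l_src);

   -- compress to a gzip-compatible stream
   l_gz := UTL_COMPRESS.LZ_COMPRESS(l_plain);

   -- write the compressed bytes back out as report.txt.gz
   l_out := UTL_FILE.FOPEN('EXTRACT_DIR', 'report.txt.gz', 'wb', 32767);
   WHILE l_pos <= DBMS_LOB.GETLENGTH(l_gz) LOOP
      l_amt := LEAST(32767, DBMS_LOB.GETLENGTH(l_gz) - l_pos + 1);
      DBMS_LOB.READ(l_gz, l_amt, l_pos, l_chunk);
      UTL_FILE.PUT_RAW(l_out, l_chunk, TRUE);
      l_pos := l_pos + l_amt;
   END LOOP;
   UTL_FILE.FCLOSE(l_out);
END;
/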
Next week I will be getting an input file which will contain over 1,000 data columns to be loaded into Oracle. It's about 6,400 characters in length.
My question is... has anyone ever created a huge CTL file like this to be used with SQL*Loader, using so many columns? I will be sending certain columns (data) to certain tables, so it's not just going into one table. It will be about 6 tables.
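SQL*Loader does handle wide fixed-length records and multiple INTO TABLE clauses in one control file. A heavily trimmed sketch with hypothetical table names, column names, and positions:

LOAD DATA
INFILE 'daily_feed.dat'
APPEND
INTO TABLE table_a
  ( a_col1  POSITION(1:10)   CHAR,
    a_col2  POSITION(11:25)  CHAR )
INTO TABLE table_b
  ( b_col1  POSITION(26:33)  CHAR,
    b_col2  POSITION(34:45)  CHAR )

With absolute POSITION fields, each INTO TABLE clause can take any slice of the 6,400-character record, and the control file itself can be generated from the column-layout document rather than typed by hand.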
I'm trying to create a stored procedure that will accept a username from a flat file, but I don't know how to read the file into the stored procedure.
Below is a sample stored procedure I created by itself to add a user; it compiled okay, but when I execute it I get the error displayed below.
create or replace procedure addUsers(userNam in varchar2) is
begin
   EXECUTE IMMEDIATE 'CREATE USER'||userNam||'IDENTIFIED BY "pass1234" DEFAULT TABLESPACE USERS'||'QUOTA "1M" ON USERS'||'PASSWORD EXPIRE';
end addUsers;
/
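The execute error is most likely the missing spaces around the concatenated keywords ('CREATE USER'||userNam builds CREATE USERSCOTT..., and QUOTA "1M" should be QUOTA 1M), so spaces inside the literals are needed. For reading the usernames, UTL_FILE can drive the procedure; a minimal sketch assuming a directory object USER_FILES and a file users.txt with one name per line (both are placeholders):

DECLARE
   l_file UTL_FILE.FILE_TYPE;
   l_name VARCHAR2(128);
BEGIN
   l_file := UTL_FILE.FOPEN('USER_FILES', 'users.txt', 'r');
   LOOP
      BEGIN
         UTL_FILE.GET_LINE(l_file, l_name);
      EXCEPTION
         WHEN NO_DATA_FOUND THEN EXIT;   -- end of file reached
      END;
      addUsers(TRIM(l_name));            -- call the procedure for each username
   END LOOP;
   UTL_FILE.FCLOSE(l_file);
END;
/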
Is there any option available in DBMS_METADATA.GET_DDL such that I can extract the script (user creation + grants) only for that particular schema?
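There is no single switch, but combining GET_DDL for the USER object with GET_GRANTED_DDL covers the creation statement plus the grants. A sketch for a hypothetical schema SCOTT; note that each GET_GRANTED_DDL call raises ORA-31608 if that grant type does not exist for the user:

SET LONG 1000000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('USER', 'SCOTT')                 FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', 'SCOTT') FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('ROLE_GRANT', 'SCOTT')   FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('OBJECT_GRANT', 'SCOTT') FROM dual;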
-- Create table
drop table age_rate;

CREATE TABLE age_rate (
   age_0_4   NUMBER(4),
   age_5_20  NUMBER(4),
   age_21_34 NUMBER(4),
   age_35_44 NUMBER(4)
);

-- Insertion
INSERT INTO age_rate
SELECT 45, 50, 60, 90 FROM dual UNION ALL
SELECT 45, 50, 60, 88 FROM dual UNION ALL
SELECT 40, 50, 60, 90 FROM dual UNION ALL
SELECT  5, 50, 60, 88 FROM dual;

-- Query on table
SELECT * FROM age_rate;

Query output:
age_0_4  age_5_20  age_21_34  age_35_44
45       50        60         90
45       50        60         88
40       50        60         90
 5       50        60         88

Required output: Rate  Min_age  Max_age
-- The below rate is for age band 0_4
45   0   4
45   0   4
40   0   4
 5   0   4
-- The below rate is for age band 5_20
50   5   20
50   5   20
50   5   20
50   5   20
-- The below rate is for age band 21_34
60   21  34
60   21  34
60   21  34
60   21  34
-- The below rate is for age band 35_44
90   35  44
88   35  44
90   35  44
88   35  44
Rules: I have all the data in rows, so each column in a row should create a separate row, and two columns, Min_age and Max_age, should be added automatically, with their values derived from the column name. For example, if the column name is like age_0_4, put 0 in Min_age and 4 in Max_age; that is, the values for Min_age and Max_age are extracted from the column name. I don't know if it is possible or not.
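If the database is 11g or later, UNPIVOT plus REGEXP_SUBSTR on the generated column-name literal can produce exactly that shape; a sketch against the age_rate table above:

SELECT rate,
       TO_NUMBER(REGEXP_SUBSTR(band, '[0-9]+', 1, 1)) AS min_age,
       TO_NUMBER(REGEXP_SUBSTR(band, '[0-9]+', 1, 2)) AS max_age
FROM   age_rate
UNPIVOT (rate FOR band IN (age_0_4   AS 'AGE_0_4',
                           age_5_20  AS 'AGE_5_20',
                           age_21_34 AS 'AGE_21_34',
                           age_35_44 AS 'AGE_35_44'))
ORDER  BY min_age;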
I am using Oracle database version 11.2.1 and would like to extract the level change and level start date where reason_code is 'PROMO' split by ID.
The test script is below:
create table test(
   id          number,
   start_date  date,
   reason_code varchar2(10),
   level       number
);

insert into test values(001, '01-JAN-13', 'PROMO', 2);
[code]....
The expected output would be:
Fields - ID, old_level, old_level_start_date, new_level, new_level_start_date
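A hedged sketch using LAG, assuming the test table above (the LEVEL column is quoted here because LEVEL is an Oracle reserved word); a row is reported when the PROMO row's level differs from the previous level for the same ID:

SELECT id, old_level, old_level_start_date, new_level, new_level_start_date
FROM  (SELECT id,
              LAG("LEVEL")    OVER (PARTITION BY id ORDER BY start_date) AS old_level,
              LAG(start_date) OVER (PARTITION BY id ORDER BY start_date) AS old_level_start_date,
              "LEVEL"         AS new_level,
              start_date      AS new_level_start_date,
              reason_code
       FROM   test)
WHERE  reason_code = 'PROMO'
AND    old_level IS NOT NULL
AND    old_level <> new_level;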
Here I face a problem: the numbers must be followed by a dot ".". It is not correct if the statement contains only numbers without a dot; those should not be extracted. As in:
SELECT REGEXP_SUBSTR('hello to 8898989898989 jkjk nnnm mnj',
                     '([0-9]+.[0-9]*)' ||  -- starts with digit(s) (may or may not have digits after .)
                     '|'               ||  -- or
                     '(.[0-9]+)'           -- starts with decimal point
       ) AS result
FROM dual;
But I mean I have to add "." after the numbers, as follows:
SELECT REGEXP_SUBSTR('hello to 8898989898989 jkjk nnnm mnj',
                     '([0-9]+.[0-9]*)' ||  -- starts with digit(s) (may or may not have digits after .)
                     '|'               ||  -- or
                     '(.[0-9]+)'           -- starts with decimal point
       ) AS result
FROM dual;
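If the intent is to return the digits only when a literal dot follows them, escaping the dot and making it mandatory does that; in the posted pattern the unescaped . matches any character (including the space after the number), which is why bare numbers still match. A sketch:

-- matches: the digits are followed by a literal dot
SELECT REGEXP_SUBSTR('hello to 8898989898989. jkjk nnnm mnj',
                     '[0-9]+\.[0-9]*|\.[0-9]+') AS result
FROM   dual;

-- returns NULL: the string holds only a bare number without a dot
SELECT REGEXP_SUBSTR('hello to 8898989898989 jkjk nnnm mnj',
                     '[0-9]+\.[0-9]*|\.[0-9]+') AS result
FROM   dual;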
I'm a SAP consultant working in SQL on NT platforms. This is the first conversion from Oracle that I have done. My client has provided us with a "cold" backup of the Oracle database on a HD formatted in Unix. I have the partition mounted and I'm able to view the files. I have the ORADATA folder with all the .DBF files.
Q: How do I extract the data from the .DBF files? I need to export it to something workable with SQL.
Original database was on Unix, I'm operating on Windows platform.
I am trying to extract data from Oracle into a flat file (.txt) using UTL_FILE, but NULLs in the Oracle tables are getting converted into a space, and if I try loading the file back into a table, they get loaded as a space.
I need to extract DDLs without storage parameters. If I use export and import using INDEXFILE, or if I try to extract using the DBMS_METADATA.GET_DDL package, in both cases my output includes the storage parameters.
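DBMS_METADATA can be told to drop those clauses through session-level transform parameters set before calling GET_DDL; a sketch, with SCOTT/EMP as placeholders. Note that SEGMENT_ATTRIBUTES also suppresses the tablespace and logging clauses, so use only the STORAGE parameter if those should stay.

BEGIN
   DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'STORAGE',            FALSE);
   DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'SEGMENT_ATTRIBUTES', FALSE);
END;
/
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMP', 'SCOTT') FROM dual;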