I am trying to create a procedure that can update CLOB data in one or more different tables - but only in those tables that contain the search (indexed) word.
I have 3 tables: item, common_lookup, member
CREATE INDEX i_item_barcode ON item (item_barcode) INDEXTYPE IS ctxsys.context;
CREATE INDEX i_item_title ON item (item_title) INDEXTYPE IS ctxsys.context;
CREATE INDEX i_item_subtitle ON item (item_subtitle) INDEXTYPE IS ctxsys.context;
CREATE INDEX i_common_lookup_meaning ON common_lookup (common_lookup_meaning) INDEXTYPE IS ctxsys.context;
CREATE INDEX i_account_number ON member (account_number) INDEXTYPE IS ctxsys.context;
[Code]....
Now the question: is it possible to create a procedure search_execute which can add data depending on the submitted parameters?
If it is possible, can you give me an example (dbms_output/update_clob), so that search_execute can receive something like this:
search_execute ('Wars', item, item_title)
Result of select * from the CLOB data:
1 1068 ASIN: B00003CX5P 1014 Star Wars - Episode I The Phantom Menace (CLOB) (BLOB) PG MPAA 22-mar-2005 3 16-lut-2012 17:31:55 3 16-lut-2012 17:31:55
2 1069 ASIN: B00006HBUJ 1014 Star Wars - Episode II Attack of the Clones (CLOB) (BLOB) PG MPAA 22-mar-2005 3 16-lut-
[Code]....
I tried to write something like this (it does not compile - I know where the mistake is and have tried to work around it in several different ways - this is just an expression of my way of thinking):
CREATE OR REPLACE PROCEDURE search_execute (v_text IN VARCHAR2, v_column_name IN VARCHAR2, v_table_name IN VARCHAR2)
IS
   v_clob     CLOB;
   v_tmp_clob CLOB;
   -- a static cursor cannot take its table or column name from a PL/SQL variable,
   -- so c1 is declared as a cursor variable and opened later for a dynamically built statement
   c1         SYS_REFCURSOR;
   CURSOR c2 IS SELECT column_name FROM user_tab_cols WHERE table_name = UPPER (v_table_name);
BEGIN
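   -- A guess at one way to finish the body (not the original code): open c1 for a
   -- dynamically built query - only the search text can be passed as a bind variable,
   -- the table and column names have to be concatenated into the statement - and
   -- print every matching value with DBMS_OUTPUT.
   OPEN c1 FOR 'SELECT ' || v_column_name
            || ' FROM '  || v_table_name
            || ' WHERE CONTAINS (' || v_column_name || ', :txt) > 0'
      USING v_text;
   LOOP
      FETCH c1 INTO v_tmp_clob;   -- a VARCHAR2 column converts implicitly into the CLOB variable
      EXIT WHEN c1%NOTFOUND;
      DBMS_OUTPUT.put_line (DBMS_LOB.substr (v_tmp_clob, 4000, 1));
   END LOOP;
   CLOSE c1;
END search_execute;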
I'm facing some issues with my form and getting stuck... problem description below: I have Table A, which has the columns car_year, car_type, line_num, line_text. Table B has country, car_year, car_type, line_num, data_entry_amt.
Table A contains data which gets updated only once every year. It holds the model year of the car and the type of car. Line_num has numbers starting from 1 which indicate the different part numbers, and line_text has the description for that line_num, e.g.:
2010 Toyota 1 Windshield
2010 Toyota 2 Door
2010 Toyota 3 Tire
2010 BMW 1 Windshield
2010 BMW 2 Door
2010 BMW 3 Tire
2010 BMW 4 Rear_mirror
Table B contains specific data related to Table A: the country the car details and prices belong to, car_year, car_type, line_num, and the amount ($) for that part. For example:
Australia 2010 Toyota 1 400.50
Australia 2010 Toyota 2 200.40
Australia 2010 Toyota 3 308.25
So in the year 2010, in Australia, a Toyota door was sold at $200.40. Now, Table A will have similar data for this year and users will enter data for Table B throughout the year. I tried a master-detail form for this but it doesn't work, because line_num in Table A changes every year and therefore I can't implement a fixed number of rows on the form for the Table B amounts.
I could perhaps use a dynamic listing, but I'm not familiar with it. So how should I go about doing this? My form has the structure below:
I used SQL*Loader to import data from a CSV file into my DB, but every time the column positions are changed, so I need a dynamic way to insert the data into the correct columns of the table.
The CSV file contains the column names and I insert this data into a temp table; after that I want to read the data by column name. I also read the column names from ALL_TAB_COLUMNS to match the column names between the temp table and ALL_TAB_COLUMNS, so the data gets inserted into the right place...
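A minimal sketch of that idea, assuming a staging table STAGE_CSV whose columns follow the CSV header and a target table TARGET_TAB (both names are made up here; LISTAGG needs 11gR2, otherwise a loop would build the list): only the column names the two tables have in common go into the generated insert.

DECLARE
   v_cols VARCHAR2 (4000);
BEGIN
   -- columns present in both the staging table and the target table
   SELECT LISTAGG (column_name, ', ') WITHIN GROUP (ORDER BY column_name)
     INTO v_cols
     FROM user_tab_columns
    WHERE table_name = 'STAGE_CSV'
      AND column_name IN (SELECT column_name
                            FROM user_tab_columns
                           WHERE table_name = 'TARGET_TAB');

   EXECUTE IMMEDIATE 'INSERT INTO target_tab (' || v_cols || ') '
                  || 'SELECT ' || v_cols || ' FROM stage_csv';
END;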
The code works "fine" here, but the XML in the fb_xml table is not EXECUTED; it is simply escaped by Oracle. I know I should add something so that the XML actually gets executed and generated.
We get data from our customers which we load into temporary tables. The goal is to consolidate this data into one single table.
Following are the rules:
1) final table should have all the columns from all the tables. If there are common column(s) then add only one column with that name.
2) the join would be based on all the common columns
3) if there is a common row, we merge the row into one (example, the row with DOMAIN = ACME.COM)
4) There could be 'N' number of tables
Following is the most realistic data.
1) T1/T2/T3 has the sample data which cover most of our test cases
2) We are expected to transform the data from T1/T2/T3 as depicted in table T4.
3) we might have more than 3 tables in our production environment, so the query should work for N tables.
4) I have given the explanation of how each row should be derived to be inserted in T4
5) the only information we have to work with is the TABLE_NAME(s) and its metadata from USER_TAB_COLUMNS
DROP TABLE T1; DROP TABLE T2; DROP TABLE T3; DROP TABLE T4;
[code].....
Explanation for each row:
row1) This row comes from T1 and T2 (not T3, because HOSTNAME would not match)
row2) This row comes from T1 and T3 (not T2, because HOSTNAME would not match)
row3) This row comes from T1 and T3
row4) This row comes from T2 and T3
row5) This row comes from T3
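Not a full solution, but a sketch of the metadata-driven starting point under the rules above (LISTAGG needs 11gR2): derive the full column list for T4 and the common (join) columns from USER_TAB_COLUMNS, then the dynamic query can be generated from those two lists for however many tables are in the list.

DECLARE
   v_all_cols  VARCHAR2 (4000);
   v_join_cols VARCHAR2 (4000);
BEGIN
   -- every distinct column across the source tables -> column list of the final table T4
   SELECT LISTAGG (column_name, ', ') WITHIN GROUP (ORDER BY column_name)
     INTO v_all_cols
     FROM (SELECT DISTINCT column_name
             FROM user_tab_columns
            WHERE table_name IN ('T1', 'T2', 'T3'));   -- the N table names go here

   -- columns present in every source table -> the join / merge key
   SELECT LISTAGG (column_name, ', ') WITHIN GROUP (ORDER BY column_name)
     INTO v_join_cols
     FROM (SELECT column_name
             FROM user_tab_columns
            WHERE table_name IN ('T1', 'T2', 'T3')
            GROUP BY column_name
           HAVING COUNT (*) = 3);                      -- 3 = number of tables in the list

   DBMS_OUTPUT.put_line ('All columns : ' || v_all_cols);
   DBMS_OUTPUT.put_line ('Join columns: ' || v_join_cols);
END;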
I'm working in an Oracle 10g database on an IBM AIX server.
I have 3 tables (tables A, B and C).
Table A has columns -- product, rate and expiration date.
Table B has columns -- product, rate and deductible.
Table C has columns -- product, rider, gender, age and rate.
I also have a Master table which is used to store the data from Tables A, B and C via the insert statement.
I'm trying to create a dynamic SQL insert statement using a shell script to insert data from the columns in Tables A, B and C into my Master table. The Master table does contain all the columns from Tables A, B and C, although a column name could be spelled differently. For example, the Master table contains a column named "deduct", while Table B has the same column spelled as "deductible".
I build the dynamic query using a for loop in my shell script (see below).
The problem is that I can't get the correct columns of the Master table into the dynamic SQL for the insert, because the columns differ depending on the table I'm selecting from. So how do I get the correct columns in the SQL for the Master table?
Example Shell Script
--Archive_Rates.txt contains: Table A, Table B, Table C (but the next time my process runs, Archive_Rates might contain Table D, Table E and Table F -- each of which has different columns... but all the columns are still in the Master table)
for tbl in `more Archive_Rates.txt`
do
  echo 'BEGIN WORK; '            >  rc1.sql
  echo ' '                       >> rc1.sql
  echo 'insert into Master'      >> rc1.sql
  echo '(prod, rate, rate_exp) ' >> rc1.sql
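One way to get the right Master column list per source table would be to let the script spool it from the dictionary. A rough sketch of such a query, where COLUMN_NAME_MAP is a made-up mapping table holding only the few differently spelled names (e.g. DEDUCTIBLE -> DEDUCT) and &source_table is a SQL*Plus substitution variable:

SELECT NVL (m.master_column, s.column_name) AS master_column,
       s.column_name                        AS source_column
  FROM user_tab_columns s
  LEFT JOIN column_name_map m
         ON m.source_table  = s.table_name
        AND m.source_column = s.column_name
 WHERE s.table_name = UPPER ('&source_table')   -- e.g. TABLE_B
 ORDER BY s.column_id;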
I am trying to execute dynamic SQL in a stored function and I don't know how to do this.
Explanation:
In the function I am calling pr_createtab, which is a procedure that creates a physical table and returns the table name in the OUT variable v_tbl_nm.
I need to query this dynamic table and return the result as the function's return value, but I am not able to do it.
Here T_web_loylty_report_table is a type.
CREATE OR REPLACE function CDW_DSS.f_ReturnTable (i_mrkt_id in number, i_cmpgn_year in number)
   return T_web_loylty_report_table
is
   v_tbl_nm   varchar2(50);
   i_cntry_cd varchar2(20);
   v_sql_str  varchar2(32567);
[code]......
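I can't see the full body, but a minimal self-contained sketch of the pattern - assuming T_web_loylty_report_table is a SQL-level collection of an object type (called t_web_loylty_report_row here, a made-up name), and guessing at pr_createtab's parameter list - would bulk collect the dynamic query straight into the collection and return it:

CREATE OR REPLACE FUNCTION CDW_DSS.f_ReturnTable (i_mrkt_id in number, i_cmpgn_year in number)
   RETURN T_web_loylty_report_table
IS
   v_tbl_nm  varchar2(50);
   v_sql_str varchar2(32567);
   v_result  T_web_loylty_report_table;
BEGIN
   pr_createtab (i_mrkt_id, i_cmpgn_year, v_tbl_nm);   -- assumed parameter list

   -- the object constructor and column names below are placeholders
   v_sql_str := 'SELECT t_web_loylty_report_row (col1, col2) FROM ' || v_tbl_nm;

   EXECUTE IMMEDIATE v_sql_str BULK COLLECT INTO v_result;
   RETURN v_result;
END;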
I do not know the table name but have a query that for sure returns the table name and I want to select a column value from the table into my PLSQL variable.
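If the table name itself comes from a query, two steps of native dynamic SQL cover it - a rough sketch with made-up names (meta_tab holds the table name; the value is read from a column called some_col):

DECLARE
   v_table_name VARCHAR2 (30);
   v_value      VARCHAR2 (100);
BEGIN
   -- step 1: the query that is guaranteed to return the table name
   SELECT table_name INTO v_table_name FROM meta_tab WHERE ROWNUM = 1;

   -- step 2: select the column value from that table into a PL/SQL variable
   EXECUTE IMMEDIATE 'SELECT some_col FROM ' || v_table_name || ' WHERE ROWNUM = 1'
      INTO v_value;
END;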
I am practicing how to get dynamic columns, as I have learnt from the OraFAQ SQL/PLSQL forum. I am using the default EMP table... the first one is running smoothly, as below:
I have 2 tables which can be mapped on a common column. The rest of the second table's columns should become columns (like a pivot), and the second table also has one more column called value; the sum of this value becomes the value of the pivot column. For this I am using a CASE structure and writing an individual CASE for each column, but I am sure that beyond some point CASE doesn't work at all, and I would like to do it dynamically (whenever a new entry comes into the source table, my proc has to pick up the new value automatically and make it part of the pivot structure), so please share your inputs and, if possible, provide some sample code.
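A rough sketch of the dynamic version with made-up tables t_map (key_col) and t_detail (key_col, attr, val) - it generates one SUM(CASE ...) per distinct attr value and opens the statement through a ref cursor (LISTAGG needs 11gR2, and the attr values are assumed to be usable as column aliases):

DECLARE
   v_cols VARCHAR2 (4000);
   v_sql  VARCHAR2 (32767);
   v_cur  SYS_REFCURSOR;
BEGIN
   -- one "SUM(CASE ...)" per distinct attribute value
   SELECT LISTAGG ('SUM(CASE WHEN d.attr = ''' || attr || ''' THEN d.val END) AS ' || attr, ', ')
             WITHIN GROUP (ORDER BY attr)
     INTO v_cols
     FROM (SELECT DISTINCT attr FROM t_detail);

   v_sql := 'SELECT m.key_col, ' || v_cols
         || '  FROM t_map m JOIN t_detail d ON d.key_col = m.key_col'
         || ' GROUP BY m.key_col';

   OPEN v_cur FOR v_sql;   -- fetch or describe from here as needed
END;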
I want to execute a dynamic query which is stored in a Table.
The output of that query should be stored on the database server.
Is there any way I can create a dynamic procedure? I have created some sample code, but the issue is that I cannot make the data type below dynamic as per the query.
en com_fund_info_m%ROWTYPE;

CREATE TABLE temp (sqlstatement VARCHAR2(100));

DECLARE
   TYPE r_cursor IS REF CURSOR;
   c_emp r_cursor;
   en    com_fund_info_m%ROWTYPE;
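A sketch of running the stored statement: the %ROWTYPE itself can't be made dynamic, so either keep fetching through a weak ref cursor into a known record (which only works while the query selects exactly those columns), or, for truly arbitrary queries, describe the columns at run time with DBMS_SQL. The minimal ref-cursor form:

DECLARE
   v_stmt VARCHAR2 (100);
   c_emp  SYS_REFCURSOR;
   en     com_fund_info_m%ROWTYPE;   -- only valid while the stored query selects these columns
BEGIN
   SELECT sqlstatement INTO v_stmt FROM temp WHERE ROWNUM = 1;

   OPEN c_emp FOR v_stmt;
   LOOP
      FETCH c_emp INTO en;
      EXIT WHEN c_emp%NOTFOUND;
      NULL;   -- process / store the row here
   END LOOP;
   CLOSE c_emp;
END;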
create table src (id number, val number, data varchar2(100));
insert into src values (1,1,'SUN');
insert into src values (2,2,'WED');
insert into src values (3,3,'MON');
create table trg (id number, val number, data varchar2(100));
Required rows to be inserted in the target table:
insert into trg values (1,1,'SUNDAY');
insert into trg values (2,0,NULL);
insert into trg values (2,0,NULL);
insert into trg values (3,0,NULL);
insert into trg values (3,0,NULL);
insert into trg values (3,0,NULL);
Based on the value of the source table src's column val, I need to populate my target table trg. If the value of val is 1, then only one row is created in the target. If the value of val in the source table src is 2, then the target is populated with 2 rows. The values of the target columns are mapped as follows:
1) id - as it is
2) val - if the val of src is 1 then map val as it is. If the value of val is more than one, then create as many rows as the value of val; id stays as it is and the values of val and data will be null
3) data - if the val of src is 1 then expand the abbreviation, else null.
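A sketch of one way to do it in a single insert, using a CONNECT BY row generator and following the sample rows above (which show 0 rather than NULL for val on the extra rows); the day-name expansion is simply hard-coded with DECODE here:

INSERT INTO trg (id, val, data)
SELECT s.id,
       CASE WHEN s.val = 1 THEN s.val ELSE 0 END,
       CASE WHEN s.val = 1
            THEN DECODE (s.data, 'SUN', 'SUNDAY',
                                 'MON', 'MONDAY',
                                 'WED', 'WEDNESDAY', s.data)
       END
  FROM src s,
       (SELECT LEVEL n FROM dual CONNECT BY LEVEL <= (SELECT MAX (val) FROM src)) r
 WHERE r.n <= s.val;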
The primary key constraint on transaction_dtl_bk is affecting the insertion of the next correct rows.
CREATE OR REPLACE PROCEDURE NP_DB.san_po_nt_wnpg_1 (dt DATE)
IS
   v_sql_error VARCHAR2 (100);   -- added by sanjiv
   v_sqlcode   VARCHAR2 (100);   -- added by sanjiv
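If the goal is simply to keep loading the correct rows and set the duplicates aside, DML error logging is one option - a sketch, assuming the load is a plain INSERT ... SELECT (transaction_dtl_stage is a made-up name for wherever the rows come from):

BEGIN
   DBMS_ERRLOG.create_error_log (dml_table_name => 'TRANSACTION_DTL_BK');   -- creates ERR$_TRANSACTION_DTL_BK
END;
/

INSERT INTO transaction_dtl_bk
   SELECT *
     FROM transaction_dtl_stage
      LOG ERRORS INTO err$_transaction_dtl_bk ('daily load') REJECT LIMIT UNLIMITED;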
Oracle 11g. I have a large table of 125 million records - t3_universe. This table never gets updated or altered once loaded, but holds data that we receive from a lead company. I need to select records from this large table that fit certain demographic criteria and insert them into a smaller table - T3_Leads - that will be updated with regard to when the lead is mailed and with other relevant information. So I need to select records from this 125-million-record table to insert into the smaller table.
I have tried a variety of things - views, materialized views, direct insert into the smaller table... I think I am probably missing other approaches. My current attempt has been to create a view using the query that selects the records, as shown below, and then use a second query that inserts into T3_Leads from this view V_Market. This is very slow. Can I just use an INSERT INTO T3_Leads with this query - it did not seem to work with the WITH clause? My index on the large table is t3_universe_composite and includes zip_code, address_key, household_key.
CREATE VIEW V_Market AS
WITH got_pairs AS
(
   SELECT /*+ INDEX_FFS(t3_universe t3_universe_composite) */
          l.zip_code, l.zip_plus_4, l.p1_givenname, l.surname, l.address, l.city, l.state,
          l.household_key, l.hh_type AS l_hh_type, l.address_key, l.narrowband_income,
          l.p1_ms, l.p1_gender, l.p1_exact_age, l.p1_personkey,
          e.hh_type AS filler_data,
          l.p1_seq_no, l.p2_seq_no,
          ROW_NUMBER () OVER (PARTITION BY l.address_key
                              ORDER BY l.hh_verification_date DESC) AS r_num
     FROM t3_universe e
     JOIN t3_universe l
       ON l.address_key = e.address_key
      AND l.zip_code = e.zip_code
      AND l.p1_gender != e.p1_gender
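On the WITH question: the subquery factoring clause can go inside the INSERT, it just has to come after the target table and column list rather than before the INSERT keyword - roughly like this (the column list and the r_num = 1 filter are only placeholders for whatever the real statement needs):

INSERT /*+ APPEND */ INTO t3_leads (zip_code, address_key, household_key /* , ... */)
WITH got_pairs AS
(
   SELECT /*+ INDEX_FFS(t3_universe t3_universe_composite) */
          l.zip_code, l.address_key, l.household_key,
          ROW_NUMBER () OVER (PARTITION BY l.address_key
                              ORDER BY l.hh_verification_date DESC) AS r_num
     FROM t3_universe e
     JOIN t3_universe l
       ON l.address_key = e.address_key
      AND l.zip_code = e.zip_code
      AND l.p1_gender != e.p1_gender
)
SELECT zip_code, address_key, household_key
  FROM got_pairs
 WHERE r_num = 1;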
I have a PL/SQL block construct where I want to use a FOR loop dynamically; the query for the loop's cursor will accept the table names from parameters and join them to return the result. The resultant data will be iterated in the loop and processed.
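A dynamic query can't be used in a plain cursor FOR loop, but a cursor variable with an explicit fetch loop gives the same effect - a sketch as a procedure with made-up column names (id, name):

CREATE OR REPLACE PROCEDURE join_two_tables (p_tab1 VARCHAR2, p_tab2 VARCHAR2)
IS
   v_cur  SYS_REFCURSOR;
   v_id   NUMBER;
   v_name VARCHAR2 (100);
BEGIN
   -- the table names come from the parameters, so the join has to be built as a string
   OPEN v_cur FOR 'SELECT a.id, b.name FROM ' || p_tab1 || ' a JOIN ' || p_tab2
               || ' b ON b.id = a.id';
   LOOP
      FETCH v_cur INTO v_id, v_name;
      EXIT WHEN v_cur%NOTFOUND;
      NULL;   -- per-row processing goes here
   END LOOP;
   CLOSE v_cur;
END;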
I'm playing around with NDS (native dynamic SQL) and trying to add a column on the fly. Basically the procedure takes a table name, a column name, and eventually a data type, and adds the column to the table.
It works fine without the bind variable for the column name, accepting the table name on the fly. As soon as it tries to use the column name through a bind variable I get an ORA-00904 invalid identifier exception.
Here is the procedure I'm using
create or replace procedure test (tbl_name varchar2, col_name varchar2)
IS
   qry varchar2(500);
begin
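   -- A guess at the rest of the body: identifiers (table and column names) cannot be
   -- supplied through bind variables - only values can - which is where the ORA-00904
   -- comes from. The names have to be concatenated into the statement instead;
   -- DBMS_ASSERT.SIMPLE_SQL_NAME guards them against injection. Data type hard-coded here.
   qry := 'ALTER TABLE ' || dbms_assert.simple_sql_name (tbl_name)
       || ' ADD '        || dbms_assert.simple_sql_name (col_name)
       || ' VARCHAR2(50)';
   execute immediate qry;
end;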
My scenario is that I need to insert into a History table when a record is updated in a tabular form (insert the updated record along with the additional columns Action_by, Action_type (like update or delete) and Action_date into the History table). That is, the History table contains the same columns as the main table that is visible in the tabular form, along with these additional columns: Action_by, Action_type and Action_date.
So I don't want to create a before/after update trigger on the base table; rather, I would like to create a generic procedure which will insert the updated record into the History table, taking the page alias and page ID as parameters (a GENERIC procedure being one that applies to all the tabular forms (tables) contained in the application).
I need to take a snapshot of a table before an insert or update happens to that table... in Oracle 10g. I am reading the MV docs from Oracle and the link below...
[URL].......
How should the MV be written for this, and how do I schedule it in dbms_job for auto refresh?
Assuming that t1 is the table where the DML operations are going to happen, so before any insert or update a snapshot has to be taken, I am assuming that to do this it would look something like this?
create materialized view my_view refresh fast as select * from t1;
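For fast refresh the base table also needs a materialized view log, and a 10g-style DBMS_JOB can drive the refresh - a sketch, assuming t1 has a primary key (the hourly interval is just an example):

CREATE MATERIALIZED VIEW LOG ON t1;        -- required for REFRESH FAST

CREATE MATERIALIZED VIEW my_view
   REFRESH FAST ON DEMAND
   AS SELECT * FROM t1;

DECLARE
   v_job BINARY_INTEGER;
BEGIN
   DBMS_JOB.submit (job       => v_job,
                    what      => 'DBMS_MVIEW.REFRESH(''MY_VIEW'', ''F'');',
                    next_date => SYSDATE,
                    interval  => 'SYSDATE + 1/24');   -- refresh every hour
   COMMIT;
END;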
Step 1. Insert all records from the external table into the export table. Truncate the export table first.
Step 2. Read in a record from the export map table.
Step 3. Search through the export table records looking for the key words BRANCH =. Compare the branch code with the branch code from the map table.
Step 4. If a match is found, mark all records in the export table for that worksheet with the global ID from the export map table, as follows. The first line of a worksheet is marked by the word WKSHTS. The last line of the worksheet is marked by the words COMPANY CONFIDENTIAL. We will need to capture the line break, so also mark the next line after the COMPANY CONFIDENTIAL line.
Step 5. Continue with Steps 2 - 4 until all records have been processed from the export map table.
First I have to create a procedure to insert data from the external table into the export table. Global ID will be blank; it will be updated with the mapping table's Global ID when the EB column's data (i.e. 8P, 2B etc.) matches the BRANCH = NA, 2B etc. of the datasheet loaded from the external table. FOLLOWING IS THE SAMPLE DATASHEET
WKSHTS AAAAA BBBBBBBBBBB ELECTRONICS INC. TIME REPORT-DATE PAGE SORT - BR, SLSREP AEC FIELD SALES REPRESENTATIVE 16:14 09/21/12 1 BRANCH = 2B EMPLOYEE NAME SALVAAG, GREGG Days in the Month 28 [code]....
There are 2 pages. I have to split this long report, stored in the WKSHT_LINE column of the export table, into 2 records; likewise, 500 pages means 500 records. And then find BRANCH= and the word that comes after it, i.e. NA, 2B etc.; if it matches the mapping table's EB column's data, then the mapping table's Global ID is written into the export table's Global ID, which is blank.
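Just for the matching step, a sketch of the update (the table and column names export, export_map, wksht_line, eb and global_id are taken from the description above, so treat them as guesses): the code after BRANCH = is pulled out with regular expressions and looked up in the map table.

UPDATE export e
   SET e.global_id =
       (SELECT m.global_id
          FROM export_map m
         WHERE m.eb = TRIM (REGEXP_REPLACE (
                               REGEXP_SUBSTR (e.wksht_line, 'BRANCH *= *[^ ]+'),
                               'BRANCH *= *', '')))
 WHERE e.global_id IS NULL
   AND INSTR (e.wksht_line, 'BRANCH') > 0;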
I have a table with 4 columns, as mentioned below.
For a particular prod, a value less than 5 should be rounded to 5 and a value greater than 5 should be rounded to 10. The rounded quantity should then be adjusted within the product, working in rank order within the prod; otherwise leave it.
I have taken all the records into a cursor. After rounding the quantity of the 1st rank, the values of the next rank are adjusted. When I then try to round the value for the 2nd rank as was done for the 1st rank, it is not picking up the recently updated value (i.e. the value adjusted while rounding the 1st rank).
This is because the cursor still holds the old value. Is there any way to handle such scenarios, where the cursor records get dynamically updated when a table record is updated?
I will have to take the employee names and create a table structure. The number of employee names can vary from day to day. So, whenever I execute my procedure with the table type, I will have to build the table columns from the employee names.
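A rough sketch of building such a table on the fly (EMP/ENAME as the source of the names is only an assumption, LISTAGG needs 11gR2, and a real version would also have to deal with names that are not valid or unique identifiers):

DECLARE
   v_cols VARCHAR2 (4000);
BEGIN
   -- one VARCHAR2 column per employee name, quoted so that spaces survive
   SELECT LISTAGG ('"' || ename || '" VARCHAR2(100)', ', ')
             WITHIN GROUP (ORDER BY ename)
     INTO v_cols
     FROM emp;

   EXECUTE IMMEDIATE 'CREATE TABLE emp_name_cols (' || v_cols || ')';
END;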
I'm trying to auto-insert the newest records from one table into another table. I have a vendor-provided table that is part of my database (running Oracle 9i), so I can't change its underlying structure or their process stops working. However, I can add a trigger to it. What I want to do is this:
When the vendor's software inserts a new row (through their own automated process) I want to insert data from that same new record into another table of my own. (where of course I can re-format it, etc., and make the data my own)
The original vendor table does not have an insertion timestamp field to work off of. What is the best way to trigger an insert off the latest inserted record? Replacing all the records from the entire vendor table works, but I only want to insert one record at a time.
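A row-level AFTER INSERT trigger fires once for each newly inserted row and sees only that row through :NEW, so no timestamp column is needed - a sketch with made-up table and column names:

CREATE OR REPLACE TRIGGER trg_copy_vendor_row
   AFTER INSERT ON vendor_table        -- the vendor-provided table
   FOR EACH ROW
BEGIN
   -- only the row that was just inserted is visible here, via :NEW
   INSERT INTO my_copy_table (id, some_value, loaded_at)
   VALUES (:NEW.id, :NEW.some_value, SYSDATE);
END;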