Create Trigger So That Whenever Record In Employee Table Is Deleted?
Dec 7, 2009
I'm trying to create a trigger so that whenever a record in the Employee table is deleted, the corresponding records in the Job History table are automatically deleted as well, and the Employee record is archived to EmployeeArchive before it is removed. It compiles, but with warnings. Here's what I've got:
CREATE TABLE EmployeeArchive
( EmployeeID   Int,
  FirstName    Char(20),   -- lengths are assumed; a bare Char defaults to Char(1)
  LastName     Char(25),
  EMail        Char(25),
  PhoneNumber  Int,
  HireDate     Date,
  JobID        Char(10),
  Salary       Int,
  Commission   Int,
  ManagerID    Int,
  DepartmentID Char(10));
[Code]....
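Since the posted trigger body is hidden behind the [Code] link, here is a minimal sketch of the kind of trigger being described, assuming an HR-style Employee table and a Job_History table keyed on EmployeeID (both names are assumptions, not the original code):

CREATE OR REPLACE TRIGGER trg_employee_archive
BEFORE DELETE ON Employee
FOR EACH ROW
BEGIN
   -- remove the dependent job-history rows first
   DELETE FROM Job_History
    WHERE EmployeeID = :OLD.EmployeeID;

   -- archive the employee row before it disappears
   INSERT INTO EmployeeArchive
     (EmployeeID, FirstName, LastName, EMail, PhoneNumber, HireDate,
      JobID, Salary, Commission, ManagerID, DepartmentID)
   VALUES
     (:OLD.EmployeeID, :OLD.FirstName, :OLD.LastName, :OLD.EMail,
      :OLD.PhoneNumber, :OLD.HireDate, :OLD.JobID, :OLD.Salary,
      :OLD.Commission, :OLD.ManagerID, :OLD.DepartmentID);
END;
/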
Nov 23, 2008
I gave this a shot on my own, and failed, so I am putting the question to you:
I am looking to create a trigger on the "Account" table:
CREATE TABLE Account
( Acct_no  NUMBER,
  BillAmt  NUMBER,
  OpenDate DATE,
  SSN      NUMBER,
  PRIMARY KEY (Acct_no),
  FOREIGN KEY (SSN) REFERENCES ResponsibleParty(SSN));
which will update the BillAmt field whenever a record belonging to the account is inserted, updated, or deleted in the following tables:
CREATE TABLE Plan
( Plan_id NUMBER,
  Cost    NUMBER,
  Minutes NUMBER,
  PRIMARY KEY (Plan_id));
CREATE TABLE Feature
( Feature_id  NUMBER,
  Description VARCHAR(20),
  Cost        NUMBER,
  PRIMARY KEY (Feature_id));
These tables are linked through the following table:
CREATE TABLE Phone_Line
( MTN     NUMBER,
  cStart  DATE,
  cEnd    DATE,
  rDate   DATE,
  Status  CHAR,
  ESN     NUMBER,
  Acct_no NUMBER,
  Plan_id NUMBER,
  PRIMARY KEY (MTN),
  FOREIGN KEY (ESN) REFERENCES Equiptment(ESN),
  FOREIGN KEY (Acct_no) REFERENCES Account(Acct_no),
  FOREIGN KEY (Plan_id) REFERENCES Plan(Plan_id));
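For what it's worth, here is a minimal sketch of one of the triggers this would need, assuming BillAmt is simply the sum of the plan costs on the account's lines (the table linking Feature to a phone line is not shown in the post, so feature costs are left out). A statement-level trigger is used because a row-level trigger on Phone_Line could not re-read Phone_Line without hitting the mutating-table error:

CREATE OR REPLACE TRIGGER trg_phone_line_bill
AFTER INSERT OR UPDATE OR DELETE ON Phone_Line
BEGIN
   -- recompute every account's BillAmt from the plan costs of its lines;
   -- a statement-level trigger may safely query Phone_Line here
   UPDATE Account a
      SET a.BillAmt = (SELECT NVL(SUM(p.Cost), 0)
                         FROM Phone_Line pl
                         JOIN Plan p ON p.Plan_id = pl.Plan_id
                        WHERE pl.Acct_no = a.Acct_no);
END;
/

Equivalent triggers on Plan and Feature (plus whatever table links features to lines) would cover changes to the costs themselves.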
Oct 21, 2011
I have a CSV file in this format:
qwerty SCHEMATIC FILE
; Version 2007.7.1
Project,qwerty Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,
abc,25150-28407.dat,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1
xyz, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Product,00094416505678,19133,"24-36""X80"" FOLD DR VIA MEH",,4.5,81,3, 8421504,,0,,L.T.L. WHOLESALE.,abc,,0,0,0,
Product,00094416502345,37154,"24-36""X80"" xyz WHITE",,5,81,3, 8421504,,0,,L.T.L. WHOLESALE.,abc,,0,0,
Product,00094416501111,83120,"24-36""X80"" abc WH",,5,81,3, 8421504,,0,,L.T.L. WHOLESALE.,abc,,0
Can I create an external table to load this file, with the selection of fields driven by the record identifier, which is the first field of each record (all records except the first two lines)?
For example, in the above file, for records with 'abc' in the first field I want to load fields 2, 3, 6 and 8 (comma-delimited; the values may well be null), and for records with 'Product' in the first field I want to load fields 5, 8, 9 and 10 from the same file. Basically, I want to know whether I can use the instr function to choose fields line by line based on a search criterion in the file. My table will have fewer columns than the file has fields, so I guess I have to mention the 'MISSING FIELD VALUES ARE NULL' option. There is another challenge too: I have to skip loading the first two lines of the file into the table.
I have written a big PL/SQL procedure that does the same thing using the utl_file.get_line option, but it is still untested, and I would be extremely happy to learn that this can be achieved by creating an external table too.
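It can, at least in outline. A hedged sketch, assuming the simplest route: expose every comma-separated field as a generic column, skip the two header lines with SKIP, and do the per-record-type field selection in ordinary SQL on top. The names ext_schematic, csv_dir and f1..f10 are made up for illustration:

CREATE TABLE ext_schematic
( f1 VARCHAR2(100), f2 VARCHAR2(100), f3 VARCHAR2(100), f4 VARCHAR2(100),
  f5 VARCHAR2(100), f6 VARCHAR2(100), f7 VARCHAR2(100), f8 VARCHAR2(100),
  f9 VARCHAR2(100), f10 VARCHAR2(100) )
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    SKIP 2                              -- drop the two header lines
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL )
  LOCATION ('schematic.csv') )
REJECT LIMIT UNLIMITED;

-- the per-record-type field selection then happens in plain SQL:
SELECT f2, f3, f6, f8  FROM ext_schematic WHERE f1 = 'abc';
SELECT f5, f8, f9, f10 FROM ext_schematic WHERE f1 = 'Product';

If separate target tables are wanted, another route is one external table per record type with a LOAD WHEN clause in the access parameters; instr is not needed either way, since the record identifier lands in its own column (f1 above).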
Apr 21, 2010
I have the following case to solve:
Example Table:
Nr_ord  Pos1  Pos2  Item     qty
------  ----  ----  -------  ------
O4018      5     1  0000107  170,00
O4018      5     2  0000107   30,00
O4065     10     1  0000107  500,00
O4065     10     2  0000107   50,00
O4114      5     1  0000107  300,00
O3114     10     1  0000107   50,00
O3114      5     2  0000107   50,00
I need to create a query that returns, record by record, a field qty_progr holding the cumulative qty over all previous records. The result should be the following:
Nr_ord  Pos1  Pos2  Item     qty     qty_progr
------  ----  ----  -------  ------  ---------
O4018      5     1  0000107  170,00     170,00
O4018      5     2  0000107   30,00     200,00
O4065     10     1  0000107  500,00     700,00
O4065     10     2  0000107   50,00     750,00
O4114      5     1  0000107  300,00    1050,00
O3114     10     1  0000107   50,00    1100,00
O3114      5     2  0000107   50,00    1150,00
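A sketch of the usual analytic-function answer, assuming a source table orders_t with the columns shown. Note that the expected output is not sorted by any visible column (O4114 comes before O3114), so a hypothetical seq column is assumed to carry the posted row order:

SELECT Nr_ord, Pos1, Pos2, Item, qty,
       SUM(qty) OVER (ORDER BY seq) AS qty_progr   -- running total, no self-join needed
  FROM orders_t
 ORDER BY seq;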