I've been trying for some hours now to get information from another table of the same schema from inside a trigger. My trigger table is PSE_BKB.NUM_PHANTOM_BP.
First I tried a simple solution ...
CREATE OR REPLACE TRIGGER PSE_BKB.NUM_TR_PHANTOM_BP
BEFORE DELETE OR INSERT OR UPDATE
ON PSE_BKB.NUM_PHANTOM_BP
REFERENCING NEW AS NEW OLD AS OLD
[code]...
The last version I tried used a cursor definition inside the trigger, as in the code block below. For debugging purposes I inserted a RAISE_APPLICATION_ERROR in the inner loop - see below. The variable v_obj_key is never set, as in all the other variations I tried - I always see the predefined 'gugus' from the declare section.
It seems Oracle cannot read from other tables at this point. The :old.phantom_key is set (in this simple example exactly one obj_key should come back).
trigger:
CREATE OR REPLACE TRIGGER PSE_BKB.NUM_TR_PHANTOM_BP
BEFORE DELETE OR INSERT OR UPDATE
ON PSE_BKB.NUM_PHANTOM_BP
REFERENCING NEW AS NEW OLD AS OLD
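The rest of the posted trigger did not survive, but for reference: :OLD and :NEW are only available in a FOR EACH ROW (row-level) trigger, and :old.phantom_key is NULL on INSERT; reading from other tables inside a row trigger is allowed as long as it is not the trigger's own table. A minimal sketch of a row-level version (NUM_OBJECTS is a hypothetical name for the other table):

CREATE OR REPLACE TRIGGER PSE_BKB.NUM_TR_PHANTOM_BP
BEFORE DELETE OR INSERT OR UPDATE
ON PSE_BKB.NUM_PHANTOM_BP
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
DECLARE
  v_obj_key VARCHAR2(100) := 'gugus';
BEGIN
  -- Look up the matching OBJ_KEY in another table of the same schema.
  FOR r IN (SELECT obj_key
              FROM pse_bkb.num_objects          -- hypothetical source table
             WHERE phantom_key = :OLD.phantom_key) LOOP
    v_obj_key := r.obj_key;
  END LOOP;
  RAISE_APPLICATION_ERROR(-20000, 'v_obj_key = ' || v_obj_key);  -- debug only
END;
/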
We have an application that must be connected to our database for specific requirements in our company, but this application has a very bad flaw: we must write the super DB password (the HR password, for example) in clear text in some files, and these files must be shared, so developers can use the HR password to do any action! I know that this application is a problem, but we have to install it.
I could do this by creating a trigger on each table that restricts DML. For example: if the operating system user is XXX, the trigger restricts the action. But it makes no sense at all to create more than 1000 triggers on the schema (this would impact DB performance badly).
So I need to create one trigger that fires before any DML on all schema tables. For example: if "MMM", the administrator operating system user, tries to do an insert, he can do the action. But if "DEV", the developer operating system user, tries to do an insert, the trigger must fire and restrict the action.
Note also that I need this trigger not to depend on any specific tool like Toad, since any user can simply rename the Toad exe file and bypass the trigger. At minimum, the trigger must depend on the operating system user and the action type.
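Oracle has no schema-wide DML trigger, so one common workaround is a database-level logon trigger keyed off the OS user; it blocks the whole session rather than individual statements, which implicitly blocks all DML. A minimal sketch, assuming the deny-listed OS user is DEV (note that the OS user name is supplied by the client and can be spoofed, and users with the ADMINISTER DATABASE TRIGGER privilege are not stopped by logon-trigger errors):

CREATE OR REPLACE TRIGGER block_dev_sessions
AFTER LOGON ON DATABASE
DECLARE
  v_os_user VARCHAR2(128) := SYS_CONTEXT('USERENV', 'OS_USER');
BEGIN
  IF UPPER(v_os_user) = 'DEV' THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Sessions are not allowed for OS user ' || v_os_user);
  END IF;
END;
/

Per-statement, per-OS-user control without one trigger per table generally needs a product-level feature such as Database Vault.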
A user is using an ad hoc tool similar to SQL Developer called PeopleSoft Application Designer.
He creates a connection to the db, then issues an alter session set current_schema = 'restricted_schema'. The connected user does not have direct privileges on the "restricted_schema", which they call SYSADM.
After changing the schema context in that manner, he creates objects in SYSADM. A schema trigger then fires and grants privileges on the new objects created in SYSADM. Doing the same in either SQL*Plus or SQL Developer does not fire the schema trigger.
I think SQL*Plus and SQL Developer are working as they should. Altering the session like that does not change your identity - just the schema context. But when you examine v_$session, the connection from this other tool looks exactly the same as one from SQL*Plus or SQL Developer after changing the schema context in the session.
Instead of trying to figure out what this other tool is doing, is there any way to get that schema trigger to fire when using this process from one of our tools?
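One hedged alternative, rather than reverse-engineering the tool: replace the ON SCHEMA trigger with an AFTER CREATE ON DATABASE trigger filtered on ora_dict_obj_owner, which fires no matter which user or tool creates the object. A sketch (some_role is a placeholder; the grant is queued via DBMS_JOB because GRANT is DDL and cannot run inline in the trigger's transaction):

CREATE OR REPLACE TRIGGER grant_on_sysadm_objects
AFTER CREATE ON DATABASE
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  IF ora_dict_obj_owner = 'SYSADM' AND ora_dict_obj_type = 'TABLE' THEN
    -- The job runs in its own transaction just after the CREATE completes.
    DBMS_JOB.SUBMIT(l_job,
      'BEGIN EXECUTE IMMEDIATE ''GRANT SELECT ON SYSADM."'
      || ora_dict_obj_name || '" TO some_role''; END;');
  END IF;
END;
/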
I have created a database-level trigger in the SYSTEM schema. When I create new tables in the SYSTEM schema, the trigger logs the entry, but when I create a table in the SCOTT schema it does not fire.
CREATE OR REPLACE TRIGGER ddltrigger
AFTER DDL ON DATABASE
BEGIN
  INSERT INTO aud_log
    (user_name, ddl_date, ddl_type, object_type, owner, object_name)
  VALUES
    (ora_login_user, SYSDATE, ora_sysevent,
     ora_dict_obj_type, ora_dict_obj_owner, ora_dict_obj_name);
END;
I have created 2 different databases, and each has its own schema user. For example, we have database ALIVE with user allive and database RLIVE with user rllive. We have implemented a new module in the ALIVE database, and in the process we created many tables, synonyms, indexes and other objects. Now we want to list all the tables and sequences that are not present in RLIVE as scripts, and create them in RLIVE as new objects.
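A possible approach, assuming a database link from ALIVE to RLIVE (rlive_link here is a placeholder): diff the two dictionaries with MINUS, then script each missing object with DBMS_METADATA.

SELECT object_type, object_name
  FROM dba_objects
 WHERE owner = 'ALLIVE'
   AND object_type IN ('TABLE', 'SEQUENCE')
MINUS
SELECT object_type, object_name
  FROM dba_objects@rlive_link
 WHERE owner = 'RLLIVE';

-- then, for each missing object, generate its creation script:
SELECT DBMS_METADATA.GET_DDL('TABLE', 'SOME_TABLE', 'ALLIVE') FROM dual;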
I would like to create a trigger that will fire whenever any user accesses a schema of the Oracle DB (for each and every login), regardless of whether the access is through an application or SQL*Plus, and this trigger must insert the information below into a table.
1) IP address
2) Machine name
3) Login time
4) Logout time
5) Name of accessed schema
I need help writing this trigger and creating the table that will hold the required data.
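A minimal sketch with illustrative names: a logon trigger for the first four columns plus a companion logoff trigger for the logout time (sessions that are killed or crash never fire the logoff trigger, so logout_time can stay NULL):

CREATE TABLE login_audit (
  ip_address   VARCHAR2(45),
  machine_name VARCHAR2(64),
  login_time   DATE,
  logout_time  DATE,
  schema_name  VARCHAR2(30),
  audsid       NUMBER
);

CREATE OR REPLACE TRIGGER trg_logon_audit
AFTER LOGON ON DATABASE
BEGIN
  INSERT INTO login_audit
    (ip_address, machine_name, login_time, schema_name, audsid)
  VALUES
    (SYS_CONTEXT('USERENV', 'IP_ADDRESS'),
     SYS_CONTEXT('USERENV', 'HOST'),
     SYSDATE,
     SYS_CONTEXT('USERENV', 'SESSION_USER'),
     SYS_CONTEXT('USERENV', 'SESSIONID'));
END;
/

CREATE OR REPLACE TRIGGER trg_logoff_audit
BEFORE LOGOFF ON DATABASE
BEGIN
  UPDATE login_audit
     SET logout_time = SYSDATE
   WHERE audsid = SYS_CONTEXT('USERENV', 'SESSIONID');
END;
/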
I have two schemas named ODI_MASTER and ODI_WORK. Under ODI_WORK there are some tables like TEMP1, TEMP2. Furthermore, whenever any new table is created under ODI_WORK, I need to automatically grant SELECT on it to the ODI_MASTER schema.
For this purpose I chose a trigger and a stored procedure.
CREATE OR REPLACE TRIGGER ODI_WORK.TRIG1
AFTER CREATE ON ODI_WORK.SCHEMA
ENABLE
CALL sp1(ora_login_user)
[code]...
I searched a lot of blogs; it seems that if an EXECUTE IMMEDIATE command runs inside a trigger there is a problem, while plain insert/update/delete statements in a trigger are fine.
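The usual explanation: GRANT is DDL, which commits implicitly, so it cannot run (even via EXECUTE IMMEDIATE in a called procedure) inside the trigger's own transaction. The standard workaround is to queue the grant with DBMS_JOB so it runs in its own transaction right after the CREATE finishes; a sketch:

CREATE OR REPLACE TRIGGER odi_work.trig1
AFTER CREATE ON odi_work.schema
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  IF ora_dict_obj_type = 'TABLE' THEN
    -- DBMS_JOB.SUBMIT does not commit, so it is legal inside the trigger;
    -- the queued block then performs the DDL grant separately.
    DBMS_JOB.SUBMIT(l_job,
      'BEGIN EXECUTE IMMEDIATE ''GRANT SELECT ON ODI_WORK."'
      || ora_dict_obj_name || '" TO ODI_MASTER''; END;');
  END IF;
END;
/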
I am trying to get a row count(*) for all the tables in my schema. The NUM_ROWS column in DBA_TABLES is not appropriate in this case because those values are only as current as the last analyze, so I need real-time counts.
I tried the following code, but I can't seem to spot my error.
DECLARE
  l_sql varchar2(150);
  cursor tablelist is
    select table_name from dba_tables where owner = 'ME';
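The fragment does not show where it fails, but a minimal working sketch of the idea (owner 'ME' kept from the original; EXECUTE IMMEDIATE ... INTO performs the dynamic count):

DECLARE
  l_cnt NUMBER;
BEGIN
  FOR r IN (SELECT table_name FROM dba_tables WHERE owner = 'ME') LOOP
    EXECUTE IMMEDIATE
      'SELECT COUNT(*) FROM ME."' || r.table_name || '"' INTO l_cnt;
    DBMS_OUTPUT.PUT_LINE(r.table_name || ': ' || l_cnt);
  END LOOP;
END;
/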
User1 has 10000 tables in his schema. How can I grant SELECT on all tables of user1 to another schema (user2), so that when user1 creates tables in the future, user2 automatically gets SELECT access on those tables as well?
I don't want user2 to have the SELECT ANY TABLE privilege.
User2 should not have the DROP privilege on user1's tables.
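For the existing tables, a one-off grant loop run while connected as user1 covers it; for tables created later, the usual approach is an AFTER CREATE schema trigger that queues the grant as a job (as in the DBMS_JOB sketches above). A minimal sketch of the loop:

-- run connected as USER1
BEGIN
  FOR r IN (SELECT table_name FROM user_tables) LOOP
    EXECUTE IMMEDIATE 'GRANT SELECT ON "' || r.table_name || '" TO USER2';
  END LOOP;
END;
/

Plain object grants like this give user2 SELECT only, so no DROP privilege is involved and SELECT ANY TABLE is not needed.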
I want to take a backup of selected data from the tables of a schema (there is huge data that is not used, which causes slow query performance). I planned to create a separate backup schema and tablespace to store the data from these tables, then write procedures that can move the data to and fro between the tables of those schemas, and create partitioned indexes on the backup tables.
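A minimal sketch of one such mover procedure, with all schema, table and column names illustrative:

CREATE OR REPLACE PROCEDURE archive_old_rows(p_cutoff IN DATE) AS
BEGIN
  -- Copy the cold rows to the backup twin, then remove them from the live table.
  INSERT INTO backup_schema.orders_bkp
    SELECT * FROM live_schema.orders WHERE order_date < p_cutoff;
  DELETE FROM live_schema.orders WHERE order_date < p_cutoff;
  COMMIT;
END;
/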
I have this PDF (2 volumes) from the Oracle University course "Oracle Fusion Middleware 11g: Build Applications with Oracle Forms". In order to follow the examples you must have the "Summit Office Supply Schema": the fmb, mmb and pll files plus the tables (.dmp).
Indicatively, some of the tables (not to be confused with the other summit user): customers, departments, warehouses, orders, employees, order_items, product_information, inventories.
I need to set up a trigger to check and enforce that an employee is older than 18 at the hire date when a new record is inserted. The age comes from the PERSON table, and the hire date is on the EMPLOYEE table.
My attempt:
CREATE TRIGGER AGE_HIRE
BEFORE INSERT OR UPDATE ON EMPLOYEE
FOR EACH ROW
BEGIN
  IF FLOOR EXTRACT (YEAR FROM (PERSON.BIRTHDATE - EMPLOYEE.HIREDATE)) > 18 THEN
    DROP NEW
  ELSE
    INSERT NEW FROM PERSON, EMPLOYEE
END;
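A corrected sketch: a row trigger cannot name other tables' columns directly, so the birth date has to be selected into a variable, and the rule is enforced by raising an error rather than "dropping" the row. This assumes the EMPLOYEE row carries a PERSON_ID foreign key (a hypothetical column name):

CREATE OR REPLACE TRIGGER age_hire
BEFORE INSERT OR UPDATE ON employee
FOR EACH ROW
DECLARE
  v_birthdate person.birthdate%TYPE;
BEGIN
  -- Look up the birth date of the person being hired (person_id is assumed).
  SELECT birthdate INTO v_birthdate
    FROM person
   WHERE person_id = :NEW.person_id;

  IF MONTHS_BETWEEN(:NEW.hiredate, v_birthdate) / 12 < 18 THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Employee must be at least 18 years old at the hire date');
  END IF;
END;
/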
I would like to be able to sort through a schema's tables to find whether particular vendor numbers exist. The decode cross-tab works fine, except that it needs static data for each table. How can I write PL/SQL to populate the table_name and give a count for each row? See the example below:
col table_name format a15
set numwidth 10
set lines 1000
set pages 50
select b1.table_name,
       count(case when a.vendor = '86444' then 1 else null end) "86444",
       count(case when a.vendor = '86445' then 1 else null end) "86445",
       count(case when a.vendor = '86469' then 1 else null end) "86469",
       count(case when a.vendor = '86470' then 1 else null end) "86470",
[code]........
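One way to avoid hard-coding each table: loop over every table that actually has a VENDOR column and count dynamically. A sketch, assuming owner 'ME' and keeping the vendor numbers from the example:

DECLARE
  l_cnt NUMBER;
BEGIN
  FOR r IN (SELECT owner, table_name
              FROM all_tab_columns
             WHERE owner = 'ME'
               AND column_name = 'VENDOR') LOOP
    EXECUTE IMMEDIATE
      'SELECT COUNT(*) FROM ' || r.owner || '."' || r.table_name || '"'
      || ' WHERE vendor IN (''86444'',''86445'',''86469'',''86470'')'
      INTO l_cnt;
    DBMS_OUTPUT.PUT_LINE(RPAD(r.table_name, 30) || l_cnt);
  END LOOP;
END;
/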
I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).
For another 10 schemas I need them empty, with the exception of a table in some of them which needs to be exported/imported with all its data. Is it possible to do this with the Data Pump utilities (expdp, impdp)?
Afterwards I will be running some scripts to populate the DB instance with critical data / metadata.
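This should be doable in passes with expdp; a sketch, with schema and table names as placeholders:

# 1) the schema that keeps its data
expdp system/*** schemas=SCHEMA1 dumpfile=schema1.dmp

# 2) the ten schemas, structure only
expdp system/*** schemas=SCHEMA2,SCHEMA3 content=METADATA_ONLY dumpfile=meta.dmp

# 3) only the tables whose data must travel too
expdp system/*** tables=SCHEMA2.T1,SCHEMA5.T2 dumpfile=tabdata.dmp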
I've got a schema in which I've truncated all the tables. I have a full schema export I took a while back, and I want to import it into the schema to basically 'reset' it.
On the first run, I got:
ORA-39151: Table "xyz.tablename" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
I've been reading through, and I see suggestions to add to the par file:
CONTENT=DATA_ONLY
TABLE_EXISTS_ACTION=APPEND
And I've seen others use the option:
table_exists_action=replace
I basically want to put the data back into the tables and have the indexes rebuilt.
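A hedged comparison of the two suggestions: CONTENT=DATA_ONLY with TABLE_EXISTS_ACTION=APPEND loads rows into the existing (truncated) tables and maintains the indexes that are already there, while TABLE_EXISTS_ACTION=REPLACE drops and recreates each table from the dump, index definitions included. For the "reset" described, the first reading might look like:

impdp system/*** schemas=XYZ dumpfile=xyz.dmp content=DATA_ONLY table_exists_action=APPEND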
I want to create a block of code that searches all tables in a schema for columns whose data_length is, let's say, 4000. This data length is dedicated to a comment column. Wherever one is found, every not-null value in the 4000-length column will be changed to the string "Comment has been removed". I have the following script below, but it seems to be lacking.
begin
  for rec in (select table_name, column_name
                from user_tab_columns
               where data_length = 4000
               order by 1, 2) loop
    begin
      execute immediate 'update ' || rec.table_name ||
                        ' set '   || rec.column_name ||
                        ' = ''Comment has been removed''' ||
                        ' where ' || rec.column_name || ' is not null';
      commit;
[code]........
We are adding support for a barcode scanner in one part of our information system, so that the mechanics can add parts to a work order's bill of materials themselves, using the barcode scanner to scan the part and enter the qty.
Via SQL I can add a part to the bill of materials and reserve the part. But then the part needs to be issued to make it disappear from the stock.
I have found a table that contains information about the part and the bill of material; it has a column named "QTY ISSUED". I have tried, via an update command, to set qty issued = 1 for the reserved part. The table is updated, and via SQL everything looks fine, but the part doesn't disappear from the stock. So my guess is that there is a trigger or function somewhere that I need for this.
I have to write an "after update" trigger. Here I have to update the stock table from other inventory tables (with a complex query). I have written the trigger below; how do I make it correct?
create or replace trigger trg_stk_upd_pur
after update on O_STOCK_EFFECTS
REFERENCING NEW AS new OLD AS old
FOR EACH ROW
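The complex query is not shown, so here is only a minimal sketch of a body, assuming a STOCK table keyed by ITEM_CODE and a QTY column on O_STOCK_EFFECTS (all names illustrative). Note that the body must not select from O_STOCK_EFFECTS itself, or it will raise ORA-04091 (mutating table):

create or replace trigger trg_stk_upd_pur
after update on o_stock_effects
referencing new as new old as old
for each row
begin
  -- Apply the delta of the updated row to the running stock balance.
  update stock s
     set s.qty_on_hand = s.qty_on_hand + (:new.qty - :old.qty)
   where s.item_code = :new.item_code;
end;
/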
I need an efficient and fast method for audit-trailing tables in a schema for any insert/update/delete on a table. I do not want to use DB triggers because of performance issues.
Presently we have a system that does the audit trail as below:
1. If a user changes a column value in a table (update/insert/delete), we call a DB package and pass parameters such as table name, column name, user, operation (insert/update/delete), old value, new value, date modified, user modified, etc.
2. The called package inserts a record into the audit trail table with the parameters passed.
3. The audit trail table is used for report generation.
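A trigger-free alternative worth evaluating is fine-grained auditing (an Enterprise Edition feature), which records DML against a table without any trigger code. A minimal sketch with illustrative schema, table and policy names:

BEGIN
  DBMS_FGA.ADD_POLICY(
    object_schema   => 'APP',
    object_name     => 'ORDERS',
    policy_name     => 'ORDERS_DML_AUDIT',
    statement_types => 'INSERT,UPDATE,DELETE');
END;
/
-- audit records then appear in DBA_FGA_AUDIT_TRAIL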
Our testing DB is running in NOARCHIVELOG mode. I did a schema-level import by dropping the existing user that contained the tables, recreating the user and finishing the import. Now they want the old tables back. Is there any way to recover the old tables?
I would like to create a trigger on a table which populates a log table. In addition to using the table where the trigger will exist, I would like to populate a couple more fields in the log table with data from 2 other tables.
e.g.
NAME_TABLE: reg_id, name
ADDRESS_TABLE (trigger to be fired when a new record is created here): reg_id, srv_id
PROCESS_TABLE: srv_id, start_time, end_time
This is what I would like the logging table to look like:
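The example layout did not survive the post, but a minimal sketch of such a trigger, assuming the log table carries reg_id, name, srv_id, start_time and end_time:

CREATE OR REPLACE TRIGGER trg_address_log
AFTER INSERT ON address_table
FOR EACH ROW
DECLARE
  v_name       name_table.name%TYPE;
  v_start_time process_table.start_time%TYPE;
  v_end_time   process_table.end_time%TYPE;
BEGIN
  -- Pull the extra fields from the two related tables.
  SELECT name INTO v_name
    FROM name_table
   WHERE reg_id = :NEW.reg_id;

  SELECT start_time, end_time INTO v_start_time, v_end_time
    FROM process_table
   WHERE srv_id = :NEW.srv_id;

  INSERT INTO log_table (reg_id, name, srv_id, start_time, end_time)
  VALUES (:NEW.reg_id, v_name, :NEW.srv_id, v_start_time, v_end_time);
END;
/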
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "MVANMANNEKES"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "MVANMANNEKES"."SYS_IMPORT_SCHEMA_01": mvanmannekes/******** schemas=cmsstagingb remap_tablespace=cmsliveb_data:cmslivea_data
We have a single master schema that many developers access; all share the same password.
Now I would like to trace all the changes made by each user, so I create individual users for all of them and grant them permission to access that schema. Is there a possibility of auditing the changes made by each user in that particular schema?
We have an application with many separate databases (one per customer). Given that they share the same business requirements (service hours, change mgmt etc.), we're interested in potentially consolidating the separate DBs (which are relatively small) into separate schemas within a smaller number of databases to reduce the overhead.
Our issue is that the application is hard-coded to use a specific administrator and application connection user name. Changing this is unfortunately not an option.
Given this limitation, is there any possibility to map a generic user onto a customer-specific schema based on the database service they connect to? Each customer connects to a different database service but may use the same user name. We considered private synonyms, but this seems to achieve the opposite (i.e. many different users could connect and map to a single user's schema). One thing to point out: where there is a single user name, it is acceptable for a single password to be used across the different customer DBs, as they will be a single admin/user.
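One possibility, sketched under the assumption that each customer connects through a distinct service: a logon trigger that switches the generic user's CURRENT_SCHEMA based on SYS_CONTEXT('USERENV', 'SERVICE_NAME'). Service, user and schema names below are placeholders:

CREATE OR REPLACE TRIGGER set_customer_schema
AFTER LOGON ON DATABASE
DECLARE
  v_service VARCHAR2(64) := SYS_CONTEXT('USERENV', 'SERVICE_NAME');
BEGIN
  IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'APP_USER' THEN
    IF v_service = 'CUST_A_SVC' THEN
      EXECUTE IMMEDIATE 'ALTER SESSION SET CURRENT_SCHEMA = CUST_A';
    ELSIF v_service = 'CUST_B_SVC' THEN
      EXECUTE IMMEDIATE 'ALTER SESSION SET CURRENT_SCHEMA = CUST_B';
    END IF;
  END IF;
END;
/

As noted in the earlier post, this changes name resolution only, not identity, so the generic user still needs object privileges on each customer schema.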
I would like to create a table in another schema (CBF) exactly as it already exists in my schema (TLC), without data, but the related indexes, synonyms and grants should be included.
How can I do this without using export/import? I am using TOAD 9.0.1.
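One route that avoids export/import is DBMS_METADATA: pull the DDL for the table and its dependents, edit the schema references from TLC to CBF, and run it (MY_TABLE is a placeholder; synonyms would need their own GET_DDL calls):

-- the table definition
SELECT DBMS_METADATA.GET_DDL('TABLE', 'MY_TABLE', 'TLC') FROM dual;
-- its dependent indexes and grants
SELECT DBMS_METADATA.GET_DEPENDENT_DDL('INDEX', 'MY_TABLE', 'TLC') FROM dual;
SELECT DBMS_METADATA.GET_DEPENDENT_DDL('OBJECT_GRANT', 'MY_TABLE', 'TLC') FROM dual;

-- or, for the bare structure alone:
CREATE TABLE cbf.my_table AS SELECT * FROM tlc.my_table WHERE 1 = 0;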