Security :: Viewing Report Of Audit Trail From DBA_AUDIT_TRAIL
Apr 11, 2013
In my environment auditing is working with audit_trail=db,extended. I am also viewing the audit trail report from DBA_AUDIT_TRAIL or AUD$. The problem is that I have to generate a report showing which audit options are running on which schema objects,
or find out from which views we can get information about the following commands; a sketch of such a query follows the statements.
AUDIT ALL BY xx_test BY ACCESS;
AUDIT SELECT TABLE, UPDATE TABLE, INSERT TABLE, DELETE TABLE BY xx_test BY ACCESS;
AUDIT EXECUTE PROCEDURE BY xx_test BY ACCESS;
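A sketch of where these settings are usually visible with 10g/11g standard auditing: statement and privilege options set with AUDIT ... BY <user> appear in DBA_STMT_AUDIT_OPTS and DBA_PRIV_AUDIT_OPTS, while options set with AUDIT ... ON <object> appear in DBA_OBJ_AUDIT_OPTS. XX_TEST matches the user in the statements above.

-- statement and privilege audit options enabled per user
SELECT user_name, audit_option, success, failure FROM dba_stmt_audit_opts WHERE user_name = 'XX_TEST';
SELECT user_name, privilege, success, failure FROM dba_priv_audit_opts WHERE user_name = 'XX_TEST';

-- per-object audit options (one three-letter column per statement type: SEL, UPD, INS, DEL, EXE, ...)
SELECT owner, object_name, object_type, sel, upd, ins, del, exe
FROM   dba_obj_audit_opts
WHERE  owner = 'XX_TEST';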
I am trying to set up logon/logoff auditing for our databases, which are 9i and 10g on Sun Solaris servers. I have been asked to turn on auditing and send the audit data to syslog. How exactly do you do that?
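A minimal sketch for the 10g databases (AUDIT_SYSLOG_LEVEL was introduced in 10gR2, so it does not apply to 9i; the LOCAL1.WARNING facility.priority value is an assumption and must match an entry in the server's syslog configuration):

-- route the audit trail to the OS / syslog (restart required)
ALTER SYSTEM SET audit_trail=os SCOPE=SPFILE;
ALTER SYSTEM SET audit_syslog_level='LOCAL1.WARNING' SCOPE=SPFILE;

-- after the restart, record logons and logoffs
AUDIT SESSION;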
In Oracle Database 11.2.0.2, to delete audit trails after the audit records have been inserted into Oracle Audit Vault, is it necessary to schedule Oracle Audit Vault jobs to clean up the audit trails on a scheduled basis, or does Audit Vault clean them up automatically once the records have been inserted? I know there is the DBMS_AUDIT_MGMT package, but in 11gR2 isn't the deletion of audit trails done automatically?
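My understanding is that the purge is not automatic: a purge job has to be created with DBMS_AUDIT_MGMT, and it can be told to delete only records older than the last archive timestamp maintained by the collector. A sketch of such a job (the 24-hour intervals and the job name are assumptions):

BEGIN
  -- one-time initialization of audit trail management
  DBMS_AUDIT_MGMT.INIT_CLEANUP(
    audit_trail_type         => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
    default_cleanup_interval => 24);

  -- purge only records already collected (older than the last archive timestamp)
  DBMS_AUDIT_MGMT.CREATE_PURGE_JOB(
    audit_trail_type           => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
    audit_trail_purge_interval => 24,
    audit_trail_purge_name     => 'PURGE_STD_AUD_TRAIL',
    use_last_arch_timestamp    => TRUE);
END;
/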
I installed Audit Vault Server 12 (without the firewall component) in an Oracle Linux VMware guest and activated an agent for an Oracle 11g Release 2 database on a Windows 7 x64 VMware guest, following the Oracle Audit Vault and Database Firewall Installation Guide and Administrator's Guide Release 12.1.0, as follows:
1) ALTER SYSTEM SET AUDIT_TRAIL=XML, EXTENDED SCOPE=SPFILE; then restarted the database.
2) Register the Oracle Database Host Machine
3) Deploy Agent and Request Activation on the Host Machine
4) Create user accounts on the secured targets and set up Oracle AVDF user privileges on an Oracle Database secured target.
5) Register the secured target in the Audit Vault Server with the user account from step 4: jdbc:oracle:thin:@//IP:1521/orcl
6) Configure an audit trail in the Audit Vault Server: TABLE - sys.aud$ or DVSYS.audit_trail$, DIRECTORY - the directory where the audit trail XML files are saved.
I turned off the firewall just in case. The administrator web page of AVDF showed only "request completed" messages after configuring an audit trail in the Audit Vault Server, but the collection state was a red downward arrow, and the auditor web page showed the same state. I could not see any audit trails in the auditor web page.
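One sanity check on the secured target side (a sketch): with AUDIT_TRAIL set to XML, EXTENDED the audit records are written as XML files under AUDIT_FILE_DEST rather than into SYS.AUD$, so a TABLE trail on sys.aud$ would have nothing to collect. The current settings can be confirmed with:

SELECT name, value
FROM   v$parameter
WHERE  name IN ('audit_trail', 'audit_file_dest');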
I have to find all the failed logins through an audit report, and the results then have to be loaded into a table. The script, whether on Windows or Unix, should be reusable and able to read the files one by one.
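A minimal sketch of the query side, assuming a database audit trail; return code 1017 corresponds to ORA-01017 (invalid username/password), and FAILED_LOGINS is a hypothetical target table:

-- load failed login attempts into a staging table
INSERT INTO failed_logins (username, os_username, userhost, terminal, logon_time, returncode)
SELECT username, os_username, userhost, terminal, timestamp, returncode
FROM   dba_audit_session
WHERE  returncode = 1017;
COMMIT;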
I am using Oracle Developer 10g. I want to know the status of the printer I am printing to. If the running report is printing or queued, a record should be inserted into a table as an audit trail of printing. I don't want to do this manually.
[oracle@RSASPGERP02 ~]$ cd /
[oracle@RSASPGERP02 ~]$ . .bash_profile
[oracle@RSASPGERP02 ~]$ sqlplus / as sysdba
SQL*Plus: Release 10.2.0.4.0 - Production
Error: ORA-09925: Unable to create audit trail file
[code]....
It is a standby archiving problem. The problem appears when we try to connect directly or through telnet: when we log in directly as the oracle user, the login fails with the message "GDM could not write to your authorization file. This could mean that you are out of disk space or that your home directory could not be opened for writing."
I am trying to understand how to enable auditing so we can capture OEM logins as well.
Here is my setup. Let's say the database I am auditing is called audit_db (audit trail set to DB), sitting on a host called host_db, with the Grid Control agent on this box. My Grid Control setup: let's say my OMS and repository are on a host called OMS_host.
We run a query against dba_audit_session to get information about who made failed login attempts.
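The exact query isn't shown here, but a sketch of that kind of query against DBA_AUDIT_SESSION would be:

SELECT username, os_username, userhost, terminal, timestamp, returncode
FROM   dba_audit_session
WHERE  returncode != 0
ORDER  BY timestamp DESC;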
Now to the part that is not working.
-- This is the good part. When I intentionally log in to audit_db with a SQL*Plus client from my laptop with a wrong username/password, that is captured: we get the username, os_username, userhost and terminal.
Here is the sample output: username is the wrong user I tried to log in as, os_username is my local username (AD account), userhost is my laptop name, and terminal is the laptop name.
From the above we can figure out who was trying (and failing) to log in.
-- This is the bad part. But let's say I try to log in to audit_db through Grid Control and use a wrong username/password. That gets captured too, but not all of it: we get the username, os_username, userhost and terminal.
Here is the sample output: username is the wrong user I tried to log in as, os_username is the OS user of the OMS repository database (oracle), userhost is oms_host, and terminal is unknown.
With the above information, we cannot figure out who tried to log in with bad credentials.
Can viewing a report and printing it directly produce the same result? I always find some differences, not in the data but in the layout: a report viewed on the monitor through Internet Explorer is slightly displaced and uses smaller fonts compared with the same report printed directly to the printer without viewing it on screen.
Let us say I want to audit data updates and deletes on an existing table EMP_TAB that has a few hundred thousand records. I created a shadow table EMP_TAB_AUDIT and added a few audit columns.
Emp_tab (
  Empno NUMBER NOT NULL,
  Ename VARCHAR2(10),
  Job   VARCHAR2(9),
[code]...
I am mostly interested in UPDATEs and DELETEs, but I decided to also capture INSERTs so I have the full history for each employee in one table (the audit schema) instead of querying two tables all the time (the production table and the audit table) to see the changes.
I created this AFTER INSERT, UPDATE, DELETE trigger. I decided to copy the :NEW values for INSERT and UPDATE and the :OLD values for DELETE (trigger attached).
So when an insert happens, the first audit row is created in EMP_TAB_AUDIT; when an update happens, a second row is created in EMP_TAB_AUDIT.
The problem I am facing is the old records that currently exist. If someone updates an old row, I am copying the :NEW values, so I won't have a copy of the :OLD values unless I create two rows (one for the old image and one for the new).
Do you think I should copy all of the hundreds of thousands of existing records into the audit table for this to work?
*******************************************************************
CREATE OR REPLACE TRIGGER TRG_EMP_AUDIT
AFTER INSERT OR DELETE OR UPDATE ON EMP_TAB
FOR EACH ROW
DECLARE
  v_operation VARCHAR2(10) := NULL;
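Since the rest of the attached trigger is not shown here, a minimal sketch of how such a trigger might look in full (repeating the header above); EMP_TAB_AUDIT is assumed to mirror the EMP_TAB columns shown earlier plus three hypothetical columns AUDIT_OPERATION, AUDIT_USER and AUDIT_TIMESTAMP, which may differ from the real table:

CREATE OR REPLACE TRIGGER TRG_EMP_AUDIT
AFTER INSERT OR DELETE OR UPDATE ON EMP_TAB
FOR EACH ROW
DECLARE
  v_operation VARCHAR2(10) := NULL;
BEGIN
  -- work out which DML operation fired the trigger
  IF INSERTING THEN
    v_operation := 'INSERT';
  ELSIF UPDATING THEN
    v_operation := 'UPDATE';
  ELSE
    v_operation := 'DELETE';
  END IF;

  IF DELETING THEN
    -- keep the :OLD image for deletes
    INSERT INTO emp_tab_audit (empno, ename, job, audit_operation, audit_user, audit_timestamp)
    VALUES (:OLD.empno, :OLD.ename, :OLD.job, v_operation, USER, SYSTIMESTAMP);
  ELSE
    -- keep the :NEW image for inserts and updates
    INSERT INTO emp_tab_audit (empno, ename, job, audit_operation, audit_user, audit_timestamp)
    VALUES (:NEW.empno, :NEW.ename, :NEW.job, v_operation, USER, SYSTIMESTAMP);
  END IF;
END;
/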
I am trying to get the logon timestamp for our auditing process. Some websites say that to get the logon timestamp I should select the TIMESTAMP column of DBA_AUDIT_SESSION, but when I do this, some results have a logoff earlier than the logon. Is the TIMESTAMP column really the logon timestamp?
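For reference, a sketch of the two columns side by side; as far as I know, in DBA_AUDIT_SESSION the TIMESTAMP column is the logon time and LOGOFF_TIME is the logoff time, so rows where LOGOFF_TIME precedes TIMESTAMP are worth examining individually:

SELECT username, userhost, timestamp AS logon_time, logoff_time
FROM   dba_audit_session
WHERE  logoff_time < timestamp;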
Oracle Audit Vault 10.2.3.2 & Linux Red Hat 5 on a V.M. box
I'm new to Audit Vault and am experiencing some issues. Right now my biggest question is how A.V. handles TNS resolution (allowing the collectors to find remote target databases) when we do not add any entries to the local tnsnames.ora file.
We've recently added more space, and our sysadmins have moved all of our AV data to the new disk space and have supposedly updated the pointers so we can continue seamlessly. However, now my collectors won't start; they fail with the message below. They had started successfully before the space was added.
==============================================================
Dec 13, 2010 11:44:35 AM Thread-10 FINEST: resp.getData:<?xml version='1.0' encoding='UTF-8'?>
<auditException errKey="av.auditservice.DAO_INITIALIZATION_FAILED.9" >
  <nestedException message="ORA-12154: TNS:could not resolve the connect identifier specified " exceptionClass="java.sql.SQLException"/>
</auditException>
==============================================================
I want to trace user activity: I want to generate a report with the SQL statements issued by a particular user. I tried looking in SYS.AUD$, but all I get is logon/logoff and location information, with no SQL text.
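A minimal sketch of one way to get the statement text with standard auditing, assuming the EXTENDED setting can be enabled (restart required) and using SCOTT as a placeholder user:

-- SQL text is only recorded with the EXTENDED setting
ALTER SYSTEM SET audit_trail=db,extended SCOPE=SPFILE;

-- after the restart, audit the statements of interest for that user
AUDIT SELECT TABLE, INSERT TABLE, UPDATE TABLE, DELETE TABLE BY scott BY ACCESS;

-- the captured statements then show up here
SELECT username, timestamp, action_name, sql_text
FROM   dba_audit_trail
WHERE  username = 'SCOTT'
ORDER  BY timestamp;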
I am using Novell Sentinel Log Manager to collect/fetch logs from my Oracle 11g R2 database. To enable auditing, I first did the following:
Log in as SYS, then:
SQL> create user testuser identified by "testuser";
SQL> grant connect to testuser;
SQL> grant dba to sharf;
SQL> grant CREATE SESSION to testuser;
SQL> grant select on v_$session to testuser;
SQL> grant select on v_$version to testuser;
SQL> grant select on SYS.DBA_AUDIT_TRAIL to testuser;
SQL> grant select_catalog_role to testuser;
SQL> grant select any dictionary to testuser;
Now logons/logoffs of user testuser are logged, as is testuser dropping or creating a table. But when testuser inserts a new record, that is not logged, while I need to know exactly what was added, e.g. SQL> insert into emp (empid, name, salary) values (10002, 'Ron', 6000);
Likewise, if testuser modifies/updates an existing record, it is not logged either: SQL> update emp set salary=700 where empid=10001;
Which SQL statements do I have to execute to start auditing INSERT and UPDATE, so that I know what was inserted and exactly what was updated/changed by user testuser?
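A minimal sketch, assuming the emp table lives in the testuser schema and that the audit trail can be set to db,extended (restart required) so the full statement text, including the inserted and updated values, is captured:

-- capture statement text in the database audit trail
ALTER SYSTEM SET audit_trail=db,extended SCOPE=SPFILE;

-- audit DML on the table itself
AUDIT INSERT, UPDATE ON testuser.emp BY ACCESS;

-- the captured statements are then visible here
SELECT username, timestamp, action_name, sql_text
FROM   dba_audit_trail
WHERE  username = 'TESTUSER'
AND    obj_name = 'EMP';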
I want to audit user connections on my reporting database and send a report to the application team on a monthly basis with a list of users who have not connected for a month, so they can be removed.
What would be the best method? I know there is the LOGON trigger, or database-level auditing.
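A minimal sketch using database-level auditing, assuming the audit trail is already enabled; the 30-day window and the exclusion list are assumptions to adapt:

-- record one audit row per session
AUDIT SESSION;

-- users with no successful recorded logon in the last 30 days
SELECT u.username
FROM   dba_users u
WHERE  NOT EXISTS (
         SELECT 1
         FROM   dba_audit_session s
         WHERE  s.username   = u.username
         AND    s.timestamp  > SYSDATE - 30
         AND    s.returncode = 0)
AND    u.username NOT IN ('SYS', 'SYSTEM');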
My database hangs while generating the AWR report. I observe that MMON has been running for a long time. Is it okay if I kill it, and will that have any impact on the database?