I am researching a performance problem on an Oracle pre-production database in a RAC cluster that uses AQ. The queue table holds 88 records, is about 900 MB in size, and takes 90+ seconds for a SELECT COUNT(*). In Prod the same table holds about 44 records, is about 80 MB in size, and takes about 9 seconds to query. The database is at 10.2.0.4 running on a Linux/Sun host.

In the USER_DATA column I am seeing, for each record in the table, a STR_VALUE that reads 'Unable to successfully deliver a message after "MaxDeliveryCnt" attempts. Please verify that the onMessage() method of the MDB does not throw a RuntimeException'. I don't have direct access to the Prod database, so I cannot verify this message in that environment.

My question: is this table holding onto records that should have been cleared out, and should it be dropped and rebuilt to reduce its size? I have seen this technique improve performance for other, non-queueing tables, but I don't know whether it is possible with a queue table.
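In case it is useful, this is a minimal sketch of purging such a queue table with DBMS_AQADM.PURGE_QUEUE_TABLE instead of dropping it; the schema and queue table names are placeholders, and whether the stuck messages are actually safe to remove depends on the application:

DECLARE
  -- A sketch only: 'MYSCHEMA.MY_QUEUE_TABLE' is a placeholder name
  po DBMS_AQADM.AQ$_PURGE_OPTIONS_T;
BEGIN
  po.block := FALSE;  -- do not take an exclusive lock on the whole queue table
  DBMS_AQADM.PURGE_QUEUE_TABLE(
    queue_table     => 'MYSCHEMA.MY_QUEUE_TABLE',
    purge_condition => NULL,   -- NULL purges everything; a condition can restrict it
    purge_options   => po);
END;
/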
I am trying to drop a user but I get the following error:
* ERROR at line 1:
ORA-00604: error occurred at recursive SQL level 1
ORA-24005: must use DBMS_AQADM.DROP_QUEUE_TABLE to drop queue tables
A couple of questions on this error:
- I did a search on the forum and the thread [URL] appears to have a solution to it, which uses
DBMS_AQADM.DROP_QUEUE_TABLE(queue_table => 'DEF$_AQCALL', force => TRUE);
What I am not sure of is which queue table the above statement will drop if it is run as SYS and there are multiple schemas, with different schema names but the same queue table name.
- We recently rebuilt the database server, and prior to the rebuild I had always managed to drop a user without hitting the above error. Is it likely that some setting somewhere has changed?
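For what it is worth, the queue_table parameter accepts a schema-qualified name, so when running as SYS the owner can be named explicitly and there is no ambiguity; a sketch with a placeholder owner:

BEGIN
  -- 'SCOTT' is a placeholder for the schema that actually owns the queue table
  DBMS_AQADM.DROP_QUEUE_TABLE(
    queue_table => 'SCOTT.DEF$_AQCALL',
    force       => TRUE);
END;
/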
I have noticed that Oracle Text related objects, particularly the $I tables, are some of the largest objects in our database. I have been actively pursuing Oracle Advanced Compression in our databases for OLTP table compression and LOB compression, but I have been unable to find any documentation or notes on whether it is advisable to implement either OLTP table compression or LOB compression for Oracle Text objects.
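For reference, if compression on the $I table were attempted, the storage clause would normally be supplied through a BASIC_STORAGE preference at index creation time rather than by altering the DR$...$I table directly. A sketch only (preference, index, table and column names are made up, and this is not a claim that compressing Text tables is advisable):

BEGIN
  CTX_DDL.CREATE_PREFERENCE('my_text_storage', 'BASIC_STORAGE');
  -- The I_TABLE_CLAUSE text is appended to the CREATE TABLE of the DR$...$I table
  CTX_DDL.SET_ATTRIBUTE('my_text_storage', 'I_TABLE_CLAUSE',
                        'tablespace users compress for oltp');
END;
/

CREATE INDEX my_text_idx ON my_docs (doc_text)
  INDEXTYPE IS CTXSYS.CONTEXT
  PARAMETERS ('storage my_text_storage');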
I am trying to propagate messages between two queues that are in the same database. I did exactly what is described here: URL.....
The message is propagated successfully. The problem is that it does not dequeue the message with the default subscriber of the destination queue.
The only difference between what I did and what is in the example above is that I didn't create a database link. I don't think I need one, because the queues are in the same database, and the problem is not the propagation because, as I said, the message has moved to the destination queue.
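In case it helps to rule things out, this is roughly what a dequeue against the destination queue for a named subscriber looks like; the queue, subscriber and payload type names are placeholders for whatever the actual setup uses:

DECLARE
  deq_opts  DBMS_AQ.DEQUEUE_OPTIONS_T;
  msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  payload   my_payload_type;   -- placeholder object type of the queue
  msg_id    RAW(16);
BEGIN
  deq_opts.consumer_name := 'MY_SUBSCRIBER';  -- must match the subscriber on the destination queue
  deq_opts.navigation    := DBMS_AQ.FIRST_MESSAGE;
  deq_opts.wait          := DBMS_AQ.NO_WAIT;

  DBMS_AQ.DEQUEUE(
    queue_name         => 'MYSCHEMA.DEST_QUEUE',
    dequeue_options    => deq_opts,
    message_properties => msg_props,
    payload            => payload,
    msgid              => msg_id);
  COMMIT;
END;
/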
Hybrid Columnar Compression is dependent on the underlying storage system. See Oracle Database Licensing Information for more information.
The below is from the Oracle® Database PL/SQL Packages and Types Reference:

Compression Constant     Compression Level   Description
COMP_FOR_OLTP                    2           OLTP compression
COMP_FOR_QUERY_HIGH              4           High compression level for query operations
COMP_FOR_QUERY_LOW               8           Low compression level for query operations
COMP_FOR_ARCHIVE_HIGH           16           High compression level for archive operations
COMP_FOR_ARCHIVE_LOW            32           Low compression level for archive operations
To use compression level 4 or higher, do we have to have ZFS or Pillar storage?
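As a side note, on 11.2 the query/archive levels can at least be estimated without the special storage by using DBMS_COMPRESSION.GET_COMPRESSION_RATIO; a sketch with placeholder schema, table and scratch tablespace names (actually storing data at those levels is a separate storage/licensing question):

DECLARE
  l_blkcnt_cmp   PLS_INTEGER;
  l_blkcnt_uncmp PLS_INTEGER;
  l_row_cmp      PLS_INTEGER;
  l_row_uncmp    PLS_INTEGER;
  l_cmp_ratio    NUMBER;
  l_comptype_str VARCHAR2(100);
BEGIN
  DBMS_COMPRESSION.GET_COMPRESSION_RATIO(
    scratchtbsname => 'USERS',          -- placeholder scratch tablespace
    ownname        => 'MYSCHEMA',       -- placeholder owner
    tabname        => 'MY_BIG_TABLE',   -- placeholder table
    partname       => NULL,
    comptype       => DBMS_COMPRESSION.COMP_FOR_QUERY_HIGH,  -- level 4
    blkcnt_cmp     => l_blkcnt_cmp,
    blkcnt_uncmp   => l_blkcnt_uncmp,
    row_cmp        => l_row_cmp,
    row_uncmp      => l_row_uncmp,
    cmp_ratio      => l_cmp_ratio,
    comptype_str   => l_comptype_str);

  DBMS_OUTPUT.PUT_LINE('Estimated ratio: ' || l_cmp_ratio || ' (' || l_comptype_str || ')');
END;
/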
I have a few DBs replicated using Advanced Replication; some tables are read-only (Basic Replication) and others are updatable (Advanced Replication).
The infrastructure is materialized view replication. My problem is that I do not know how I should treat the updatable tables that have foreign keys. I read in other FAQs that you should replicate the indexes the same way you replicate the tables (dbms_repcat.create_master_repobject).
What is the correct way to treat the foreign keys with this kind of updatable snapshot?
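If it helps, the pattern usually shown for replicating a supporting index is the same call used for the table, just with type => 'INDEX'; a sketch only, with placeholder group, index and schema names (whether the foreign-key constraint itself also needs to be defined at each site is a separate decision):

BEGIN
  DBMS_REPCAT.CREATE_MASTER_REPOBJECT(
    gname => 'MY_REP_GROUP',     -- placeholder replication group
    type  => 'INDEX',
    oname => 'CHILD_FK_IDX',     -- placeholder index on the foreign-key column
    sname => 'MYSCHEMA');        -- placeholder owner
END;
/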
I need to modify the data type of the MSGID column from RAW to BLOB for a queue table, and I'm getting the following errors.
BANNER
----------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
PL/SQL Release 11.2.0.2.0 - Production
CORE    11.2.0.2.0    Production

YUV > DESC PDA_REPORT_MESSAGE_QTAB
 Name                          Null?     Type
 ----------------------------- --------- ----------------
 Q_NAME                                  VARCHAR2(30)
 MSGID                         NOT NULL  RAW(16)

YUV > ALTER TABLE PDA_REPORT_MESSAGE_QTAB MODIFY (MSGID BLOB);
ALTER TABLE PDA_REPORT_MESSAGE_QTAB MODIFY (MSGID BLOB)
*
ERROR at line 1:
ORA-22858: invalid alteration of datatype

YUV > ALTER TABLE PDA_REPORT_MESSAGE_QTAB ADD (MSGID_NEW BLOB);
ALTER TABLE PDA_REPORT_MESSAGE_QTAB ADD (MSGID_NEW BLOB)
*
ERROR at line 1:
ORA-24005: Inappropriate utilities used to perform DDL on AQ table PDADBA.PDA_REPORT_MESSAGE_QTAB
I want to send an email notification using Oracle AQ for events such as a message going to the error queue. Similarly, can I send an email notification when I create a queue or drop a queue?
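For the error-queue side, e-mail notifications are registered with DBMS_AQ.REGISTER once the SMTP settings have been configured through DBMS_AQELM; a sketch only, with placeholder host, queue, subscriber and address names:

BEGIN
  -- One-time SMTP configuration for AQ e-mail notifications (placeholder values)
  DBMS_AQELM.SET_MAILHOST('smtp.example.com');
  DBMS_AQELM.SET_MAILPORT(25);
  DBMS_AQELM.SET_SENDFROM('aq.notify@example.com');

  -- Register an e-mail callback for a subscriber on the queue of interest
  DBMS_AQ.REGISTER(
    SYS.AQ$_REG_INFO_LIST(
      SYS.AQ$_REG_INFO(
        'MYSCHEMA.MY_QUEUE:MY_SUBSCRIBER',   -- placeholder queue:subscriber
        DBMS_AQ.NAMESPACE_AQ,
        'mailto:dba.team@example.com',       -- placeholder recipient
        HEXTORAW('FF'))),
    1);
END;
/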
Due to a firewall restriction, I have to configure Streams with queue forwarding. The intermediate database will have two different IPs (one toward the source and another toward the target), and the source and target will not talk to each other.
I am calling a report from a parameter form; the report is launched using a Run Report button, and that all works great. What I want to do now is add a 'Cancel Report' button to the form so that the end user can cancel the report and remove it from the reports queue.
I am trying to set up Streams on 11.2.0.3 on a Windows 2008 R2 server. Due to an error in running propagation, I am trying to delete both the Capture and Apply queues. I have deleted the Apply queues but am unable to drop the Capture queue.
SQL> EXEC DBMS_APPLY_ADM.STOP_APPLY(apply_name => 'LAO_NLPG73_BLPU_APPLY');

SQL> select * from dba_apply;
LAO_NLPG73_BLPU_APPLY  NLPG73_BLPU_APPLY_Q  STRMADMIN  YES  RULESET$_732  STRMADMIN  STRMADMIN
301355  ABORTED  09/10/2013 17:34:21  1013
ORA-01013: user requested cancel of current operation
CAPTURED  STREAMS APPLY

SQL> select * from dba_queues;
STRMADMIN  NLPG73_BLPU_APPLY_Q
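In case it is useful, a queue normally cannot be removed while a capture (or apply) process still references it; a sketch of the order I would expect, where the capture and queue names are assumptions based on the naming above:

BEGIN
  -- Stop and drop the capture process that still references the queue
  DBMS_CAPTURE_ADM.STOP_CAPTURE(capture_name => 'LAO_NLPG73_BLPU_CAPTURE');  -- assumed name
  DBMS_CAPTURE_ADM.DROP_CAPTURE(capture_name          => 'LAO_NLPG73_BLPU_CAPTURE',
                                drop_unused_rule_sets => TRUE);

  -- Then remove the queue (and its queue table if nothing else uses it)
  DBMS_STREAMS_ADM.REMOVE_QUEUE(
    queue_name              => 'STRMADMIN.NLPG73_BLPU_CAPTURE_Q',   -- assumed queue name
    cascade                 => FALSE,
    drop_unused_queue_table => TRUE);
END;
/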
I am using the Java print API (javax.print package) to send a bunch of documents for printing. Below is the code section that I am using to print documents through a Java program. When the document is sent for printing, I see the job name is created properly in the print spooling queue on the Windows machine. But when I go to the actual printer, the job name on the printer is different from what I saw in spooling. Since I am printing hundreds of documents in a batch, it gets very difficult to identify which document did not print in case of issues. I also used the java.awt.print package. java.awt.print.PrinterJob has a method setJobName(String); when I used this package, the name appeared properly in both places. But I wish to use javax.print with the document name appearing in the printer queue.
public void printDocument(File pDoc, PrintService pService, DocFlavor pFlavor) throws Exception {
    logMessage(true, "Printing Doc::" + pDoc.getAbsolutePath());
    FileInputStream is = new FileInputStream(pDoc);
    // Create the print job
    DocPrintJob job = pService.createPrintJob();
    // Set print request attributes with file name as job
    [code].....
I have a problem on Oracle 11gR2 where I have to import data from a source database into an existing table, without truncating or dropping the target table in the target database.
We have found something called TABLE_EXISTS_ACTION=APPEND in impdp.
I have to clean up data from our tables (production environment) that contain millions of rows. Apart from the solution of partitioned tables, what alternative solution does Oracle recommend?
Should we delete from these tables using a PL/SQL block with a cursor, or export/import the database and, for the tables from which we want to remove the old rows, use the QUERY option of the Data Pump utility?
I have used both approaches and I have to admit that the Data Pump solution is much, much faster than the deletion, which suffers from disk I/O. The question, again, is which of these two methods is more reliable and less risky for the health of the database.
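For completeness, the PL/SQL route is usually written as a batched delete with intermittent commits so that undo stays bounded; a minimal sketch, assuming a hypothetical table, date column and retention rule:

BEGIN
  LOOP
    DELETE FROM my_big_table                          -- placeholder table
     WHERE created_date < ADD_MONTHS(SYSDATE, -36)    -- placeholder retention rule
       AND ROWNUM <= 10000;                           -- batch size
    EXIT WHEN SQL%ROWCOUNT = 0;
    COMMIT;
  END LOOP;
  COMMIT;
END;
/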
I came across an implementation where data from DB2 tables is moved to Oracle tables for a BI solution, using Oracle procedures called from MS SQL DTS packages that run as scheduled jobs. Just being curious: can this be done using OWB or ODI rather than the above detour? I suppose some changes are applied in those procedures before the data is loaded into the Oracle tables; can't this be done using OWB/ODI as well? And can it be scheduled as jobs using OWB/ODI?
In our schema we have a corresponding audit table for most of the production tables.
Example:
Table name    Audit table
EMP           EMP_AU
DEPT          DEPT_AU
Audit tables have all the columns of the production table, along with the audit columns AUDIT_DATE and AUDIT_OPERATION. There are a few production tables which do not have audit tables. I need to write a script to identify the following (a sketch of such queries follows the list):
1) Production tables where the corresponding audit table is missing
2) Column differences between a production table and its audit table (i.e. any column missing from the audit table)
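The kind of dictionary queries I have in mind, assuming the _AU naming convention shown above and that everything lives in the current schema:

-- 1) Production tables with no corresponding audit table
SELECT t.table_name
  FROM user_tables t
 WHERE t.table_name NOT LIKE '%\_AU' ESCAPE '\'
   AND NOT EXISTS (SELECT 1
                     FROM user_tables a
                    WHERE a.table_name = t.table_name || '_AU');

-- 2) Columns present in a production table but missing from its audit table
SELECT c.table_name, c.column_name
  FROM user_tab_columns c
 WHERE c.table_name NOT LIKE '%\_AU' ESCAPE '\'
   AND EXISTS (SELECT 1
                 FROM user_tables a
                WHERE a.table_name = c.table_name || '_AU')
MINUS
SELECT SUBSTR(a.table_name, 1, LENGTH(a.table_name) - 3), a.column_name
  FROM user_tab_columns a
 WHERE a.table_name LIKE '%\_AU' ESCAPE '\';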
Also, I've got tables tab2 and tab3 with an indicatorid column in each table.
My need is to prevent inserting values into the indicatorid column of tables tab2 and tab3 that are absent from the id column of table tab1.
For example, at this moment the id column in table tab1 has two values, 1 and 2, yet I can insert the value 3 into the indicatorid column of tables tab2 and tab3. I need to prevent the value 3 from being inserted into the indicatorid column of tab2 and tab3 while it is absent from the id column of the tab1 table.
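A minimal sketch of doing this declaratively with foreign keys, assuming tab1.id can be made the primary (or at least a unique) key; the constraint names are placeholders:

ALTER TABLE tab1 ADD CONSTRAINT tab1_id_pk PRIMARY KEY (id);

ALTER TABLE tab2 ADD CONSTRAINT tab2_indicatorid_fk
  FOREIGN KEY (indicatorid) REFERENCES tab1 (id);

ALTER TABLE tab3 ADD CONSTRAINT tab3_indicatorid_fk
  FOREIGN KEY (indicatorid) REFERENCES tab1 (id);

-- After this, inserting 3 into tab2.indicatorid or tab3.indicatorid fails with
-- ORA-02291 (parent key not found) until 3 exists in tab1.id.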
I have a table named LOCATION and a synonym named LOCATION in my DB. I am trying to create a procedure in which I query the table, but it is not working.
Example:
CREATE PROCEDURE TESTPROC
AS
BEGIN
  DBMS_OUTPUT.PUT_LINE('testing');
  select count(*) from LOCATION;
END;
Compile error: Error(5,22): PL/SQL: ORA-00942: table or view does not exist
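For reference, a minimal sketch of the shape that does compile, under two assumptions flagged in the comments: inside PL/SQL the SELECT needs an INTO target, and if LOCATION belongs to another schema the procedure owner needs a direct grant on it (privileges granted only through a role are not visible to a definer-rights procedure):

CREATE OR REPLACE PROCEDURE testproc AS
  l_cnt NUMBER;
BEGIN
  DBMS_OUTPUT.PUT_LINE('testing');
  -- 'other_schema' is an assumption; qualify with the real owner or a valid synonym
  SELECT COUNT(*) INTO l_cnt FROM other_schema.location;
  DBMS_OUTPUT.PUT_LINE('rows: ' || l_cnt);
END;
/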
Can we link a table that contains a foreign key with a sequence table? I have a table that has a sequence-generated bookid, and I want to link this table to another table that contains a foreign key on bookid. Can I link these two tables, and how?
If the command is successful:
> alter table my_table shrink space;
the segment will be defragmented and the High Water Mark (HWM) will be moved. But what is the importance of the HWM?
What is the difference between these commands?
> alter table my_table shrink space;          -- moves the HWM
> alter table my_table shrink space compact;  -- does not move the HWM
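For reference, the full sequence looks like this; it assumes my_table sits in an ASSM tablespace and that enabling row movement is acceptable:

ALTER TABLE my_table ENABLE ROW MOVEMENT;

-- Phase 1: relocate rows toward the start of the segment; the HWM is not moved,
-- so this can run while normal DML continues against the table.
ALTER TABLE my_table SHRINK SPACE COMPACT;

-- Phase 2: move the HWM down and release the freed space to the tablespace
-- (takes a short exclusive lock at the end).
ALTER TABLE my_table SHRINK SPACE;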
I have one table in a particular database and will have to create another table in a different database. I will have to join these two tables from the two different DBs.
How should I go about doing this? Is this good practice, or is it always better to use a single DB?
What are the disadvantages of joining two tables from two different DBs?
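For what it's worth, the usual way to do this is a database link from one database to the other; a minimal sketch where all object names, credentials and the TNS alias are placeholders:

-- On the local database, pointing at the remote one
CREATE DATABASE LINK remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'REMOTE_TNS_ALIAS';

-- The remote table is then addressed with @remote_db in the join
SELECT l.id, l.local_col, r.remote_col
  FROM local_table l
  JOIN remote_table@remote_db r
    ON r.id = l.id;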
Data migration for three tables. I have three tables, which are:
1. npi_p_mig, containing the columns (pr_id, mi_id, qty, sl_dt, fac_code)
2. np_detail (pr_id, mil_id, qty, sl_dt, facility)
3. np_ref_tab (facility, fac_code)
I need to migrate the data, based on the two tables (np_detail, np_ref_tab), into the new table npi_p_mig (pr_id, mi_id, qty, sl_dt, fac_code). I need a SQL script to migrate those two tables into the new table (npi_p_mig).
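A sketch of the kind of statement I mean, under two assumptions: np_ref_tab maps each facility to exactly one fac_code, and np_detail.mil_id corresponds to mi_id in the target:

INSERT INTO npi_p_mig (pr_id, mi_id, qty, sl_dt, fac_code)
SELECT d.pr_id,
       d.mil_id,        -- assumed to map onto mi_id in the target
       d.qty,
       d.sl_dt,
       r.fac_code
  FROM np_detail  d
  JOIN np_ref_tab r
    ON r.facility = d.facility;

COMMIT;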
I'm trying to do a sum over 2 different tables but can't get it to work. This is the idea: I have a table A with client ID, time-id (per day), purchase amount and segment code.
In another table (let's call it B) I have a lot of client IDs and also their purchase amount, time-id and segment code. I want to sum the purchase amount for every client from tables A and B, for clients with a certain segment code in table B.
This is what I have now:
select B.client_id, purchase_amountA + purchase_amountB
  from tableA A, tableB B
 where A.client_id = B.client_id
   and time_id between 20090101 and 20091001
   and B.segment_code = 'A'
This does the job, but it selects only the client_ids which are in both tables. I want to select all client_ids from table B with segment_code 'A' and add the purchase_amount from table A to their purchase amount from table B, at least if they have any purchase amount in table A.
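What I am after is probably an outer join from table B to table A, something like the sketch below; the purchase_amountA / purchase_amountB column names are kept from the query above, and whether the date filter should also be applied to table A is an assumption:

SELECT b.client_id,
       b.purchase_amountB + NVL(a.purchase_amountA, 0) AS total_purchase
  FROM tableB b
  LEFT OUTER JOIN tableA a
    ON a.client_id = b.client_id
 WHERE b.segment_code = 'A'
   AND b.time_id BETWEEN 20090101 AND 20091001;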
I have 2 tables SEC_MASTER_HISTA and SEC_MASTER_HISTB.
Now, I need to compare the data of the two tables column-wise.
Ideally the 2 tables should have the same security_alias values, but in my case they do not, as the two tables belong to 2 different client models. There are, however, main SECURITY_MASTERA and SECURITY_MASTERB tables which have the security_alias recorded, plus a primary_asset_id column whose value can act as a link between SEC_MASTER_HISTA and SEC_MASTER_HISTB. But I have not been able to figure out the exact query which would be ideal.
Attached are the table structures and the data they contain.
Note: I need to compare the Coupon and Freq column values of SEC_MASTER_HISTA and SEC_MASTER_HISTB.
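Roughly the join path I have in mind, using the security_alias / primary_asset_id linkage described above; the table and column names are taken from the post, the rest is an assumption:

SELECT ma.primary_asset_id,
       ha.coupon AS coupon_a, hb.coupon AS coupon_b,
       ha.freq   AS freq_a,   hb.freq   AS freq_b
  FROM sec_master_hista ha
  JOIN security_mastera ma ON ma.security_alias   = ha.security_alias
  JOIN security_masterb mb ON mb.primary_asset_id = ma.primary_asset_id
  JOIN sec_master_histb hb ON hb.security_alias   = mb.security_alias
 WHERE DECODE(ha.coupon, hb.coupon, 0, 1) = 1   -- DECODE treats two NULLs as equal
    OR DECODE(ha.freq,   hb.freq,   0, 1) = 1;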