SSO Preventing Multiple Concurrent Oracle Sessions From Loading
Nov 16, 2012
We are experiencing a problem with SSO causing second or third concurrent Oracle sessions to hang. The Oracle application hangs during loading, and Task Manager has to be used to close it.
I have tested logging onto our application servers using SSO, and I cannot load more than three concurrent Oracle sessions. When I bypass SSO and log on to the same server, I can load more than 20.
I have a stored procedure that is run from a command within our Clarity application.
The procedure involves some SQL reads and SQL inserts.
We have had users running the SP at the same time (there is only a slim chance of this happening), and it created duplicate entries.
Is there a clever way of preventing the same SP from being run concurrently?
Initially I was thinking of having the first step of the SP check a flag in a custom table, which the SP sets to 1 while it is running and back to 0 at the end.
Are there better, more efficient/effective ways of doing this?
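A flag table can work, but it needs careful handling of sessions that crash and leave the flag stuck at 1. One more robust alternative is to serialize the procedure with DBMS_LOCK; below is a minimal sketch, assuming the schema has EXECUTE privilege on DBMS_LOCK (the lock name 'MY_SP_LOCK' and the procedure skeleton are illustrative):

CREATE OR REPLACE PROCEDURE my_sp AS
  l_lock_handle VARCHAR2(128);
  l_status      INTEGER;
BEGIN
  -- Map a named lock to a handle (created on first use).
  DBMS_LOCK.ALLOCATE_UNIQUE(lockname => 'MY_SP_LOCK', lockhandle => l_lock_handle);
  -- Request an exclusive lock; give up after 5 seconds instead of queuing forever.
  l_status := DBMS_LOCK.REQUEST(lockhandle        => l_lock_handle,
                                lockmode          => DBMS_LOCK.X_MODE,
                                timeout           => 5,
                                release_on_commit => FALSE);
  IF l_status <> 0 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Another session is already running this procedure.');
  END IF;

  -- ... existing SQL reads and inserts go here ...

  l_status := DBMS_LOCK.RELEASE(l_lock_handle);
END;
/

The main advantage over a hand-rolled flag is that a session which dies releases its lock automatically.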
During a batch process, a record is entered in a detail table as well as a summary table.
The process first checks whether a record exists in the summary table for the same group_no; if yes, it updates the record with the newly added amount (sums it), otherwise it inserts a new record. In the detail table it inserts the record directly.
Now, if the batch process runs in parallel, two different sessions (out of many) can insert the same group_no. This happens because while the second session inserts a record, the first session inserting the same record (group_no) has not yet committed. So the second session, not knowing that the same group_no (101) has already been inserted, inserts another record with the same group_no rather than summing it.
Can this be solved without using a temp table or SELECT FOR UPDATE?
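One approach that may work without either is to rely on a unique constraint on summary_table(group_no): the second session blocks on the unique index entry until the first session commits, then receives ORA-00001, which can be caught and turned into the update. A minimal sketch (table and column names are illustrative):

-- Assumes: ALTER TABLE summary_table ADD CONSTRAINT uk_group UNIQUE (group_no);
BEGIN
  INSERT INTO summary_table (group_no, amount)
  VALUES (:group_no, :amount);
EXCEPTION
  WHEN DUP_VAL_ON_INDEX THEN
    -- The row already exists (possibly committed by another session
    -- while we waited on the index), so sum into it instead.
    UPDATE summary_table
       SET amount = amount + :amount
     WHERE group_no = :group_no;
END;
/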
I was just wondering how Oracle manages multiple sessions performing DML in a database. I believe this is related to 'read consistency'; I tried to search for this but could not find any satisfactory online documents.
CASE 1:
- User A logs in to database1, issues a select on table A, then inserts 4 rows.
- User B logs in to database1, issues a select on table A, inserts 5 rows, then issues a rollback.
- User C logs in to database1, issues a select on table A, inserts 6 rows, then issues a commit.
How many rows can user C see in table A when he issues his select?
CASE 2:
- User A logs in to database1, issues a select on table A, then inserts 4 rows.
- User B logs in to database1, issues a select on table A, then inserts 5 rows.
- User C logs in to database1.
- User B issues a rollback.
- User C issues a select on table A, inserts 6 rows, then issues a commit.
How many rows can user C see in table A when he issues his select?
NOTE: All the users are currently logged in to the same database and none has logged out.
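For anyone wanting to observe the behavior directly: uncommitted changes made by one session are never visible to another session. A quick two-session experiment (the table name t is illustrative):

-- Session 1
INSERT INTO t VALUES (1);   -- not committed yet
-- Session 2
SELECT COUNT(*) FROM t;     -- does not see session 1's uncommitted row
-- Session 1
COMMIT;
-- Session 2
SELECT COUNT(*) FROM t;     -- now sees the committed row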
I am trying to load multiple XML files into an Oracle DB using SQL*Loader. The filenames of the XML files start with a description followed by numbers, where the numbers are different each time.
Here's my CTL file:
LOAD DATA
INFILE *
INTO TABLE XML_TABLE
TRUNCATE
XMLTYPE(XML_TABLE)
FIELDS
(
[code]....
I don't want to keep having to go into the CTL file and change the numbers of the XML file. Is there a way I could just load all .xml files that begin with 'description'? Like maybe with a wildcard?
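One workaround is to pass the data file on the command line instead of hard-coding it; a sketch assuming a Unix shell and that the control file's INFILE * is replaced with a named INFILE (the user ID, control file name, and file pattern are illustrative). The data= command-line parameter generally overrides the INFILE in the control file:

for f in description*.xml; do
  # Load one file per sqlldr run, with a matching log file.
  sqlldr userid=scott/tiger control=xml_load.ctl data="$f" log="${f%.xml}.log"
done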
I have a bunch of data in 50 Excel files. I need to load these 50 files into 50 different tables, and I would like to do this in one script. I went through the forum for this information; people suggested creating a shell script, listing the sqlldr command multiple times, etc.
Please provide some clarity on what the best approach is. If it is through shell scripting, please provide the shell script and instructions to execute it; I am new to shell scripting.
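For what it's worth, SQL*Loader reads text files, so the Excel files first need to be saved as CSV (or similar). After that, one script can drive all 50 loads; a minimal sketch, assuming one control file per table and consistently numbered file names (all names are illustrative):

#!/bin/sh
# load_all.sh -- run sqlldr once per data file;
# each control file maps its CSV file to its target table.
i=1
while [ $i -le 50 ]; do
  sqlldr userid=scott/tiger control=table${i}.ctl data=file${i}.csv log=table${i}.log
  i=`expr $i + 1`
done

To execute it: save the script as load_all.sh, make it executable with chmod +x load_all.sh, and run ./load_all.sh.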
I have the following situation: there is a directory named /dat/global/stock/, and inside it I will get files named differently each time, for example: abcdef.112, dfgrt.2, ...
I want to load these files one by one into an external table and generate one more file based on some enrichment.
Step 1: Take the first file and load it into the external table.
Step 2: Enrichment.
Step 3: File generation.
The problem I am facing is that I usually get about 1,000 files in that directory, so I need to pick up the files one by one and then move each one to another directory. How can I process the files one at a time and generate the output using the Oracle loader?
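One pattern that may fit is to repoint the external table at each file in turn; a sketch, assuming an external table EXT_STOCK over a directory object and a target table STOCK_ENRICHED (all names and the enrichment column are illustrative):

DECLARE
  l_file VARCHAR2(255) := 'abcdef.112';   -- next file to process (illustrative)
BEGIN
  -- Repoint the external table at the current file.
  EXECUTE IMMEDIATE 'ALTER TABLE ext_stock LOCATION (''' || l_file || ''')';

  -- Load with enrichment into the target table.
  INSERT INTO stock_enriched
  SELECT s.*, SYSDATE AS load_date        -- illustrative enrichment
  FROM   ext_stock s;

  COMMIT;
END;
/

The list of file names can be produced by a shell script into a driver file, and UTL_FILE.FRENAME can move each processed file into the second directory.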
I have 1M records coming from an external data source as a flat file (using ETL). Now I need to load only yesterday's data into my database table.
I think this can be done using a bulk load and a filter,
but how do I write the code?
Second part:
Hint: I need to update only those records that have been updated. Say the Address1 field is updated; then this record needs to be updated in my Master Customer table.
If I have many fields in the table, how do I identify any records that are modified (coming to me from the external data source as a flat file) and update them in my Master Customer table?
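One way that may cover both parts is to load the flat file into a staging table and then MERGE it into the master table; a sketch, assuming a staging table STG_CUSTOMER with a LOAD_DATE column and a key of CUSTOMER_ID (all names are illustrative):

MERGE INTO master_customer m
USING (SELECT customer_id, address1
       FROM   stg_customer
       WHERE  load_date >= TRUNC(SYSDATE) - 1    -- yesterday's data only
       AND    load_date <  TRUNC(SYSDATE)) s
ON (m.customer_id = s.customer_id)
WHEN MATCHED THEN
  UPDATE SET m.address1 = s.address1
  WHERE DECODE(m.address1, s.address1, 0, 1) = 1 -- touch only rows whose value changed (NULL-safe)
WHEN NOT MATCHED THEN
  INSERT (customer_id, address1)
  VALUES (s.customer_id, s.address1);

Comparing with DECODE rather than = keeps the check NULL-safe; extend the SET/WHERE lists with every field that should be tracked.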
Our DB shows more than 200 INACTIVE sessions, and the DBA plans to reboot the database to get rid of these sessions. Can we not KILL these sessions and avoid the reboot?
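For reference, individual sessions can usually be killed without a bounce; a sketch that generates the KILL statements (the idle-time filter is illustrative and should be reviewed before running anything):

-- Generate KILL statements for user sessions idle for more than an hour.
SELECT 'ALTER SYSTEM KILL SESSION ''' || sid || ',' || serial# || ''' IMMEDIATE;'
FROM   v$session
WHERE  status = 'INACTIVE'
AND    type   = 'USER'
AND    last_call_et > 3600;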
I am using Oracle 10g as the server in my lab. I faced some problems initially, but after increasing the USERS tablespace it is working fine.
But there is still one problem. During query execution, some queries get blocked, and this prevents any subsequent queries from the same user from executing.
The blocked sessions are displayed on the admin page under the Blocking Sessions link. There is an option to kill the session, but when I do that it affects all the users and the connection is lost for everyone; I then have to start the database up from the beginning again.
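It may be possible to kill only the blocker rather than restarting everything; in 10g, v$session exposes the blocker directly (a sketch):

-- Find who is blocking whom.
SELECT sid, serial#, username, blocking_session
FROM   v$session
WHERE  blocking_session IS NOT NULL;

-- Then kill just the blocking session (substitute the values found above):
-- ALTER SYSTEM KILL SESSION 'sid,serial#';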
I have some questions about Oracle with EMC shared storage. I have an Oracle 11gR1 RAC (2 nodes) + ASM environment with an EMC Clariion AX4 shared storage array.
The database is running in NOARCHIVELOG mode. I'd like to implement point-in-time recovery using SnapView snapshots.
Currently my AX4 platform has the following LUNs:
LUN1 - registry 1
LUN2 - registry 2
LUN3 - vote 1
LUN4 - vote 2
LUN5 - vote 3
LUN6 - ASM - DISKGROUP DATA DISK DATA01 (actual db datafiles)
LUN7 - ASM - DISKGROUP DATA DISK DATA02 (actual db datafiles)
With the source LUNs in a consistent session, I will take sync snapshots of all the LUNs backing my database. Once something happens to the database, the LUNs can be rolled back to the specific point in time when the consistent snapshot session was taken, and I expect the database to come up and continue working.
Questions:
1. Is my approach correct at all? (The database is running in NOARCHIVELOG mode, so I am not dealing with hot backups; point-in-time recovery would be implemented purely via EMC snapshot consistent sessions.)
2. The AX4 platform has a limit of 8 source LUNs per session. If I understand correctly, I can't place more than 8 LUNs in a session. What would be the workaround if my database occupies more than 8 LUNs and I'd like to take a consistent snapshot of them all? For example:
LUN1 - registry 1
LUN2 - registry 2
LUN3 - vote 1
LUN4 - vote 2
LUN5 - vote 3
LUN6 - ASM - DISKGROUP DATA DISK DATA01 (actual db datafiles)
LUN7 - ASM - DISKGROUP DATA DISK DATA02 (actual db datafiles)
LUN8 - ASM - DISKGROUP DATA DISK DATA03 (actual db datafiles)
LUN9 - ASM - DISKGROUP DATA DISK DATA04 (actual db datafiles)
I have a database using the AL32UTF8 character set. The database contains character strings (VARCHAR2 columns) that may contain both Western European and Eastern European characters (and maybe even other kinds of characters, such as Cyrillic or Asian).
Suppose a client application has set the NLS_LANG character set to WE8ISO8859P1. Western European characters will then be shown correctly, while Eastern European characters that have no equivalent in WE8ISO8859P1 will be converted and shown as '?' (question marks) in the client application. If a user of this application fetches a record with Eastern European characters, modifies the record, and then writes it back to the database, the Eastern European characters will be rewritten to the database as question marks; i.e., the Eastern European data has been corrupted.
I would like to prevent this by detecting that the data was not converted properly during the fetch and then showing the record to the user in read-only mode to avoid data loss, but I have not been able to detect the conversion error.
The application fetches data through the OCI interface using the "ofetch" function. The error code set by ofetch is the same (i.e. no error) regardless of whether the record contains Eastern European characters or not.
I thought I could manage this by setting the database parameter NLS_NCHAR_CONV_EXCP to TRUE, but this has no effect. Apparently it only applies to operations directly inside the database.
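One server-side check that may work is to round-trip the column through the client character set with CONVERT and compare: if the round-trip changes the value, the fetch will be lossy for that client. A sketch (table and column names are illustrative):

SELECT id,
       CASE
         WHEN text_col = CONVERT(CONVERT(text_col, 'WE8ISO8859P1'),
                                 'AL32UTF8', 'WE8ISO8859P1')
         THEN 'LOSSLESS'
         ELSE 'LOSSY'   -- at least one character has no WE8ISO8859P1 equivalent
       END AS conversion_check
FROM   my_table;

Rows flagged LOSSY could then be opened read-only by the application before any fetch-modify-rewrite cycle can corrupt them.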
I have a console application that needs to be scheduled via the Windows Task Scheduler. My system is a 32-bit operating system, and I'm using Oracle.DataAccess.dll in my application to establish a connection to the Oracle DB. The version is 2.112.1.0, and the processor architecture of this DLL in C:\Windows\Assembly is x86. On my local machine this DLL works fine with all three build/target platforms: x64, x86, and AnyCPU. But when I copy the files to my staging server, which is a 64-bit OS, I get the following exception. (Note: I also have Oracle.DataAccess version 10.2.0.100, which is also x86, available in C:\Windows\Assembly.)
System.BadImageFormatException: Could not load file or assembly 'Oracle.DataAccess, Version=2.112.1.0, Culture=neutral, PublicKeyToken=89b483f429c47342' or one of its dependencies. An attempt was made to load a program with an incorrect format.
File name: 'Oracle.DataAccess, Version=2.112.1.0, Culture=neutral, PublicKeyToken=89b483f429c47342' at
Assembly manager loaded from: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\clr.dll
Running under executable D:\ProjectFolder\MyExecutable.exe
A detailed error log follows:
=== Pre-bind state information ===
LOG: User = UserId
LOG: DisplayName = Oracle.DataAccess, Version=2.112.1.0, Culture=neutral, PublicKeyToken=89b483f429c47342 (Fully-specified)
[Code] ......
I have tried building the application targeting AnyCPU, x64, and x86; it fails in all three scenarios. There are other applications on the staging server that can establish Oracle connections, so ODP.NET should be registered on the server. So it looks like a problem with my console app.
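One observation: the log shows the 64-bit CLR (Framework64) loading the process, and a 64-bit process cannot load an x86 Oracle.DataAccess. The bitness of the process, the managed assembly, and the unmanaged Oracle client on the server all have to match. A diagnostic sketch using corflags from the Windows SDK to confirm the bitness of each piece (paths are illustrative):

corflags D:\ProjectFolder\MyExecutable.exe
corflags Oracle.DataAccess.dll
rem 32BIT/32BITREQ = 1 means x86-only; 0 together with ILONLY = 1 means AnyCPU.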
We have a requirement such that whenever a stored procedure is executed, its resulting records have to be stored in an Excel file (just like a report). No third-party or reporting tools are to be used.
Is there any option in Oracle (a stored procedure or built-in package) that can create an Excel file with the resulting records?
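One built-in route is UTL_FILE writing a tab-separated file, which Excel opens directly; a minimal sketch, assuming a directory object REPORT_DIR exists and is writable by the schema (file, table, and column names are illustrative):

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('REPORT_DIR', 'report.xls', 'w');
  -- Header row, then one tab-separated line per record.
  UTL_FILE.PUT_LINE(l_file, 'PO_NUMBER' || CHR(9) || 'VENDOR_NUMBER');
  FOR r IN (SELECT po_number, vendor_number FROM invoices) LOOP
    UTL_FILE.PUT_LINE(l_file, r.po_number || CHR(9) || r.vendor_number);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/

This produces a tab-delimited text file rather than a native workbook, but Excel opens it as a spreadsheet.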
Here is one way to create an Excel file from an Oracle SQL query and prevent Excel from displaying large numbers in scientific notation (exponential notation):
set feedback off
set verify off
set heading off
spool c:\excel_test.xls
select 'PO_NUMBER'||chr(9)||'VENDOR_NUMBER' from dual
union
select '=PROPER('||po_number||')'||chr(9)||'=PROPER('||vendor_number||')'||chr(9)
from invoices
where rownum < 12
order by 1 desc;
spool off
Note that PO_NUMBER is 16 characters and VENDOR_NUMBER is 15 characters in the invoices table.
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "badfile": expecting one of: "column, enclosed,
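For context, KUP-01005 complaining about "badfile" usually means the BADFILE clause sits in the wrong place in the ACCESS PARAMETERS; it belongs in the record-format section, before FIELDS. A sketch of a typical valid ordering (directory and file names are illustrative):

ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  BADFILE 'BAD_DIR':'load.bad'
  LOGFILE 'LOG_DIR':'load.log'
  FIELDS TERMINATED BY ','
  MISSING FIELD VALUES ARE NULL
)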
I am loading data from an XML file into an Oracle table. This program works fine for small XML files, but if I try to load a large XML file with multiple pages, only the first ten records are loaded. Here is the procedure.
PROCEDURE Test_xml_read(p_tag varchar2, p_xml_file varchar2, p_path varchar2) AS
BEGIN
  INSERT INTO stg_xml_table (productid, productname, price)
  SELECT y1.productid, y1.productname, y1.price,
         y2.categoryid, y2.categoryname, y2.categorypath
  FROM xmltable('ProductFeed/Products/Product'
                passing xmltype(bfilename('TEST_DIR1', 'sample.xml'),
                                nls_charset_id('CHAR_CS'))
[code]...
What changes need to be made to load all the pages of data into the table?
I am loading an Excel file into an Oracle database using an Oracle Forms 6i form, and I am getting an ORA-302000 error. This form runs and loads the file into the database on many PCs.
But on one of the PCs it cannot load the Excel file into the Oracle database. I also can't create an Excel file from the database through Forms 6i on that PC.
We have an Oracle 11g database, but it does not support the UTF8 character set. Now we have received a file that contains records with the trademark symbol (a special character). When we try to load these records, the symbol is loaded as a "?". As a result, when we try to display the records in the web application, the application cannot show them properly.
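As a first step, it may help to confirm what the database character set actually is, since only a character set that includes the trademark symbol (such as AL32UTF8 or WE8MSWIN1252) can store it without replacement:

SELECT parameter, value
FROM   nls_database_parameters
WHERE  parameter = 'NLS_CHARACTERSET';

If the current character set cannot represent the symbol, the data has to be migrated to one that can; there is no parameter-level fix.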
I am using Oracle 10g. I have a table on my computer that I made for a friend; when I load it on their computer, the select statements say "no data found". If I use select * from table_name, all the data shows.
But with a condition on a column, select * from table_name where duty_date = '05-JAN-11' returns no data found.
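This behavior often comes down to the session NLS_DATE_FORMAT differing between the two machines, or to a time component stored in duty_date; an explicit format mask plus TRUNC sidesteps both (a sketch; names are illustrative):

SELECT *
FROM   table_name
WHERE  TRUNC(duty_date) = TO_DATE('05-JAN-11', 'DD-MON-RR');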
You have a stock_amt value in one table, and there is a procedure that updates and subtracts from this stock_amt.
Let's say in a store I have a stock amount of 50 items, and for each sale this procedure subtracts from this value; it is not allowed to go below zero. Besides the update on this column (set stock_amt = stock_amt - x), the process does a lot of other updates on other tables, and in total it takes about 0.5 seconds. Everything is fine until I want to execute this procedure with 50 users in parallel.
In the initial implementation, to avoid deadlocks, we put a lock on that column (stock_amt), but there were too many waits; we cannot hold that lock for 0.5 seconds.
What would be the best approach for this? For this stock_amt problem, maybe the solution can be a trade-off: do not update that column every time, but only once in a while, via another process or materialized-view-style logic.
But what if my column is a critical value, like a prepaid balance or bank balance, that needs to be updated in near real time? What would you do?
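For the non-critical case, one common pattern is to stop updating the hot row on every sale and instead record each sale as a delta row, deriving the balance on demand; a sketch (table and column names are illustrative):

-- Each sale inserts a delta row; sessions never contend on one hot row.
INSERT INTO stock_txn (item_id, delta, txn_time)
VALUES (:item_id, -:qty, SYSTIMESTAMP);

-- The current balance is derived when needed.
SELECT s.base_amt + NVL(SUM(t.delta), 0) AS current_amt
FROM   stock s
LEFT JOIN stock_txn t ON t.item_id = s.item_id
WHERE  s.item_id = :item_id
GROUP BY s.base_amt;

For a balance that must be enforced in real time (no negatives allowed), some serialization on the row is unavoidable; the usual compromise is to keep the locked window as short as possible, i.e. do the other half-second of work first and take the row lock only for the final balance check and update.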