Forms :: Slow Performance Using OLE2 Reading Xls File
Oct 10, 2011
I'm trying to import some information from Excel into Oracle using OLE2 in Oracle Forms 6i, but it's very slow when the import runs to around 10k lines. Is there anything to optimize this? The code I used follows...
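The usual bottleneck with OLE2 reads is one COM round trip per cell. A hedged workaround sketch (the file names and the xlCSV format constant 6 are assumptions, not from the original post): have Excel save the sheet as CSV once, then read it line by line with TEXT_IO, which is far fewer calls:

DECLARE
  app   OLE2.OBJ_TYPE;
  wbs   OLE2.OBJ_TYPE;
  wb    OLE2.OBJ_TYPE;
  args  OLE2.LIST_TYPE;
  f     TEXT_IO.FILE_TYPE;
  linha VARCHAR2(4000);
BEGIN
  app := OLE2.CREATE_OBJ('Excel.Application');
  OLE2.SET_PROPERTY(app, 'DisplayAlerts', 0);        -- suppress "keep CSV format?" prompts
  wbs := OLE2.GET_OBJ_PROPERTY(app, 'Workbooks');
  args := OLE2.CREATE_ARGLIST;
  OLE2.ADD_ARG(args, 'C:\data\input.xls');           -- assumed source file
  wb := OLE2.INVOKE_OBJ(wbs, 'Open', args);
  OLE2.DESTROY_ARGLIST(args);
  args := OLE2.CREATE_ARGLIST;
  OLE2.ADD_ARG(args, 'C:\data\input.csv');           -- assumed target file
  OLE2.ADD_ARG(args, 6);                             -- 6 = xlCSV file format
  OLE2.INVOKE(wb, 'SaveAs', args);
  OLE2.DESTROY_ARGLIST(args);
  OLE2.INVOKE(wb, 'Close');
  OLE2.INVOKE(app, 'Quit');
  OLE2.RELEASE_OBJ(wb);
  OLE2.RELEASE_OBJ(wbs);
  OLE2.RELEASE_OBJ(app);
  f := TEXT_IO.FOPEN('C:\data\input.csv', 'R');      -- one fast GET_LINE per row
  BEGIN
    LOOP
      TEXT_IO.GET_LINE(f, linha);
      NULL;                                          -- split the line and insert here
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;                    -- end of file
  END;
  TEXT_IO.FCLOSE(f);
END;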
I am using OLE2 to update an Excel spreadsheet from Oracle Forms 6i. If the Excel spreadsheet is already open, the code runs through fine and even asks whether you want to replace the file when it reaches these lines:
OLE2.INVOKE(worksheet, 'Save');
OLE2.INVOKE(workbook, 'Save');
However, the file never gets updated, and no error appears. I would like the program to check whether the file is open before running.
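One common cause is that Save on a worksheet handle does nothing, and the workbook is never closed, so Excel discards the change when the hidden instance exits. A hedged sketch of a sequence that usually makes the write stick (the application handle name and the path are assumptions):

DECLARE
  args OLE2.LIST_TYPE;
BEGIN
  args := OLE2.CREATE_ARGLIST;
  OLE2.ADD_ARG(args, 'C:\data\output.xls');  -- assumed full path
  OLE2.INVOKE(workbook, 'SaveAs', args);     -- SaveAs on the workbook, not the worksheet
  OLE2.DESTROY_ARGLIST(args);
  OLE2.INVOKE(workbook, 'Close');
  OLE2.INVOKE(application, 'Quit');          -- 'application' = the Excel.Application handle
  OLE2.RELEASE_OBJ(worksheet);
  OLE2.RELEASE_OBJ(workbook);
  OLE2.RELEASE_OBJ(application);
END;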
I'm trying to display a Word file from a BLOB column of a table in Reports 11g. In Reports 6i the OLE2 file format was available for that, but in 11g it became obsolete.
I need to read data from a text file (located on the application server, the DB server, or some other server; in any case the path is known to me) and then append some data to it.
Data will be read and written on a daily basis, so I want to clear all data when the date changes.
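A minimal UTL_FILE sketch, assuming a directory object DATA_DIR and a file daily.txt (both placeholders), and that the last run date is tracked somewhere such as a control table: mode 'A' appends within the day, and reopening with mode 'W' truncates the file when the date changes.

DECLARE
  f        UTL_FILE.FILE_TYPE;
  last_run DATE := TRUNC(SYSDATE);  -- placeholder: read this from a control table
BEGIN
  IF TRUNC(SYSDATE) > last_run THEN
    f := UTL_FILE.FOPEN('DATA_DIR', 'daily.txt', 'W');  -- 'W' truncates on date change
  ELSE
    f := UTL_FILE.FOPEN('DATA_DIR', 'daily.txt', 'A');  -- 'A' appends within the day
  END IF;
  UTL_FILE.PUT_LINE(f, 'some data');
  UTL_FILE.FCLOSE(f);
END;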
I know this topic is not new, but maybe my case is a bit different from the common one. My form has a multi-record block (4 columns) into which I want to upload a CSV file.
The form will have a "Browse..." button to point to the CSV file (on the local user's disk, not where the Forms server resides). After pointing to the CSV file, the user will press an UPLOAD button which will import the CSV data into the block; after that the data is visible in the block. But the whole form must not be refreshed.
I'm studying the UTL_FILE package, but it seems it manipulates files on the server, not on local users' machines. Do Forms have such a "Browse... -> UPLOAD" capability, or do I need to learn Java and create Java functionality inside Forms?
Or is it impossible to load data directly into a form? (Do I have to use SQL*Loader to load the data into a temp table and then into the block on the form?)
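With web-deployed Forms, WebUtil covers exactly this "Browse... -> UPLOAD" flow without Java coding. A hedged sketch, assuming WebUtil is attached and configured; the block and item names are placeholders, and the column parsing is deliberately naive:

DECLARE
  fname VARCHAR2(512);
  f     CLIENT_TEXT_IO.FILE_TYPE;
  line  VARCHAR2(4000);
BEGIN
  -- "Browse..." dialog on the user's PC
  fname := CLIENT_GET_FILE_NAME(file_filter => 'CSV Files (*.csv)|*.csv|');
  IF fname IS NULL THEN
    RETURN;
  END IF;
  f := CLIENT_TEXT_IO.FOPEN(fname, 'R');
  GO_BLOCK('MY_BLOCK');                              -- placeholder block name
  LAST_RECORD;
  LOOP
    BEGIN
      CLIENT_TEXT_IO.GET_LINE(f, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;                  -- end of file
    END;
    CREATE_RECORD;
    -- naive parse of the first of 4 comma-separated columns; repeat for the rest
    :my_block.col1 := SUBSTR(line, 1, INSTR(line || ',', ',') - 1);
  END LOOP;
  CLIENT_TEXT_IO.FCLOSE(f);
END;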
I have an Excel file which contains some columns and rows. I need to load that Excel file into a form and import the form data into a database table, using the DDE method.
Put simply: just read the Excel file and load it into a form, from which it can be imported into a table later.
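A hedged DDE sketch of the reading side, based on the Forms DDE built-ins; both paths, the range, and the timeout are assumptions:

DECLARE
  app_id  PLS_INTEGER;
  conv_id PLS_INTEGER;
  buffer  VARCHAR2(4000);
BEGIN
  -- start Excel with the workbook loaded; both paths are assumptions
  app_id := DDE.APP_BEGIN('C:\Program Files\Microsoft Office\Office\EXCEL.EXE C:\data\input.xls',
                          DDE.APP_MODE_MINIMIZED);
  conv_id := DDE.INITIATE('EXCEL', 'C:\data\input.xls');
  -- request rows 1-10, columns 1-4, as text (R1C1 notation; 10-second timeout)
  DDE.REQUEST(conv_id, 'R1C1:R10C4', buffer, DDE.CF_TEXT, 10000);
  -- buffer now holds tab-separated values, one row per line:
  -- parse it and assign the pieces to the block items
  DDE.TERMINATE(conv_id);
  DDE.APP_END(app_id);
END;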
I need someone to provide me with a doc, PDF, PPT, text file, anything, to understand how to use OLE2.
I know you will say search this forum or Google it; I already searched and googled, but all the results are examples or questions for a specific requirement. No one explains how the code works, not even Oracle itself; check this link:
[URL].......
There is no explanation. I don't want that much; at least something like this:
DECLARE
  app       OLE2.OBJ_TYPE;   -- this is the automation server itself (e.g. Word.Application)
  docs      OLE2.OBJ_TYPE;   -- this is the application's Documents collection
  doc       OLE2.OBJ_TYPE;   -- this is one document opened or created from that collection
  selection OLE2.OBJ_TYPE;   -- this is the Selection (typing cursor) inside the document
  args      OLE2.LIST_TYPE;  -- this is the argument list passed to OLE2 invocations
BEGIN
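  -- A hedged continuation (not from the original post) showing how those
  -- handles are typically obtained and used for Word automation:
  app  := OLE2.CREATE_OBJ('Word.Application');           -- start Word
  OLE2.SET_PROPERTY(app, 'Visible', 1);                  -- show the window
  docs := OLE2.GET_OBJ_PROPERTY(app, 'Documents');       -- Documents collection
  doc  := OLE2.INVOKE_OBJ(docs, 'Add');                  -- create a new document
  selection := OLE2.GET_OBJ_PROPERTY(app, 'Selection');  -- typing cursor
  args := OLE2.CREATE_ARGLIST;
  OLE2.ADD_ARG(args, 'Hello from Forms');
  OLE2.INVOKE(selection, 'TypeText', args);              -- Selection.TypeText(...)
  OLE2.DESTROY_ARGLIST(args);
  OLE2.RELEASE_OBJ(selection);
  OLE2.RELEASE_OBJ(doc);
  OLE2.RELEASE_OBJ(docs);
  OLE2.RELEASE_OBJ(app);
END;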
For example, we have a table ACCOUNT (a snowflake dimension containing other dimension keys), and many fact tables are based on this dimension. Normally a data warehouse load runs with the dimensions loaded first and then the facts. Our load frequency is 30 minutes.
To make data available in the facts sooner (it is a financial application), I am considering two batches, one for the dimensions and one for the facts. I came to this conclusion because there is no hard dependency requiring dimensions to load before facts; at worst an update gets missed occasionally. But if I do that, while the dimension is being loaded it will be read by the fact load in another session. Will this affect performance?
In short: loading (INSERT/UPDATE) and selecting data from a table at the same time: will it affect performance in any way?
I want to export forms data into an Excel sheet. For that I am using CLIENT_OLE2. I have attached the WebUtil object library and PL/SQL library, but I still cannot export data from the form to an Excel sheet.
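When CLIENT_OLE2 silently does nothing, WebUtil is usually not fully configured (webutil.pll attached and compiled, the WebUtil virtual path set up, the Jacob jars in place), and the call must come from a user-initiated trigger such as WHEN-BUTTON-PRESSED. A minimal export sketch, with block and item names as placeholders:

DECLARE
  app  CLIENT_OLE2.OBJ_TYPE;
  wbs  CLIENT_OLE2.OBJ_TYPE;
  wb   CLIENT_OLE2.OBJ_TYPE;
  ws   CLIENT_OLE2.OBJ_TYPE;
  cell CLIENT_OLE2.OBJ_TYPE;
  args CLIENT_OLE2.LIST_TYPE;
BEGIN
  app := CLIENT_OLE2.CREATE_OBJ('Excel.Application');
  CLIENT_OLE2.SET_PROPERTY(app, 'Visible', 1);
  wbs := CLIENT_OLE2.GET_OBJ_PROPERTY(app, 'Workbooks');
  wb  := CLIENT_OLE2.INVOKE_OBJ(wbs, 'Add');
  ws  := CLIENT_OLE2.GET_OBJ_PROPERTY(wb, 'ActiveSheet');
  -- write one cell; in the real form, loop over the block's records and items
  args := CLIENT_OLE2.CREATE_ARGLIST;
  CLIENT_OLE2.ADD_ARG(args, 1);                         -- row
  CLIENT_OLE2.ADD_ARG(args, 1);                         -- column
  cell := CLIENT_OLE2.GET_OBJ_PROPERTY(ws, 'Cells', args);
  CLIENT_OLE2.DESTROY_ARGLIST(args);
  CLIENT_OLE2.SET_PROPERTY(cell, 'Value', :my_block.my_item);  -- placeholder item
  CLIENT_OLE2.RELEASE_OBJ(cell);
  CLIENT_OLE2.RELEASE_OBJ(ws);
  CLIENT_OLE2.RELEASE_OBJ(wb);
  CLIENT_OLE2.RELEASE_OBJ(wbs);
  CLIENT_OLE2.RELEASE_OBJ(app);
END;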
I am looking for a code/script to read values from an Excel file and run a PL/SQL script.
I have a PL/SQL script that generates a report and takes two values, which I have to change every time to generate a new report. All I want is to create a script that reads the values from an Excel file and runs the other script.
I have been searching for a long time and have only found the UTL_FILE package, which relies on CREATE OR REPLACE FUNCTION and on creating something like a virtual table.
The problem is I don't have CREATE authorization in the database, so I am not able to use UTL_FILE. Is there any simple way to read values from an Excel file?
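A hedged workaround that needs no database objects at all: save the sheet as CSV and feed each row's two values to the existing report script as SQL*Plus positional parameters. All file, script, and credential names below are placeholders.

-- report.sql: the existing script, rewritten to take &1 and &2 instead of hard-coded values
SELECT ...
FROM   my_report_view
WHERE  start_value = '&1'
AND    end_value   = '&2';

Then a Windows batch driver, where values.csv has two comma-separated columns per line:

for /f "tokens=1,2 delims=," %%a in (values.csv) do sqlplus -s user/pass @report.sql %%a %%b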
I'm trying to use UTL_FILE to read a txt file and import the data into a table in Oracle. I've read various forums and researched the Oracle documentation site and the internet, but cannot find the answer to the problem.
The source follows:
SET SERVEROUTPUT ON

DECLARE
  arquivo_ler UTL_FILE.FILE_TYPE;
  linha       VARCHAR2(1000);
BEGIN
  arquivo_ler := UTL_FILE.FOPEN('INTRANET_LOAD', 'carga_intranet.txt', 'R', 32767);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(arquivo_ler, linha);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- GET_LINE raises this at end of file
    END;
    DBMS_OUTPUT.PUT_LINE(linha);
  END LOOP;
  UTL_FILE.FCLOSE(arquivo_ler);
  DBMS_OUTPUT.PUT_LINE('File processed with success.');
END;
/
The errors:
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 633
ORA-29283: invalid file operation
ORA-06512: at line 5
What has been done:
Created the DIRECTORY (INTRANET_LOAD) and granted READ, WRITE on it to the user.
On the Linux server where Oracle is installed, the oracle OS user was given full access to the folder /u01/app/oracle/product/11.2.0/db_1/adp.
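For reference, a hedged checklist (the grantee name is a placeholder): ORA-29283 at FOPEN usually means the file is not readable by the oracle OS user, or the DIRECTORY path does not match the actual folder.

CREATE OR REPLACE DIRECTORY intranet_load AS '/u01/app/oracle/product/11.2.0/db_1/adp';
GRANT READ, WRITE ON DIRECTORY intranet_load TO some_user;  -- placeholder user
-- then, as the oracle OS user on the server:
--   ls -l /u01/app/oracle/product/11.2.0/db_1/adp/carga_intranet.txt
-- every directory level in the path must be traversable (x) by the oracle user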
We are busy migrating a database from Oracle 10g on a Windows 2003 platform to Oracle 11gR2 on Linux.
We exported/imported the data and it looks OK, and the explain plans look the same, but our heavy batches are twice as slow as on the Windows box. The two top wait events are disk related, sequential and scattered reads, which account for 90% of the batch job's time. I read some white papers and found that ASM can be bad in some cases, as can Linux for this particular kind of scattered read. I was wondering whether just changing the SGA to 10GB instead of 4GB, to get more cache, would speed things up.
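A hedged sketch of the SGA change being considered, assuming an spfile is in use; it only helps if the scattered/sequential reads are re-reads of blocks that would fit in a larger cache, so it is worth confirming that before restarting:

ALTER SYSTEM SET sga_max_size = 10G SCOPE = SPFILE;
ALTER SYSTEM SET sga_target   = 10G SCOPE = SPFILE;
-- restart the instance, then verify with: SHOW PARAMETER sga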
I am running one simple DELETE statement on one table with ROWNUM < 10000, but it is taking nearly 10 to 15 minutes. The table has no child tables or triggers.
A doc file is stored in the database in a column of data type BLOB. To read this file I have written the following procedure:
CREATE OR REPLACE PROCEDURE xx_read_blobfile1 AS
  b BLOB;
  c CLOB;
  n NUMBER;
BEGIN
  SELECT file_data INTO b FROM fnd_lobs WHERE file_id = 322420;
  IF (b IS NULL) THEN
[code]......
The data displays with some boxes before and after it.
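A hedged note with a sketch: the boxes are expected, because a .doc file is a binary format and the bytes around the text are Word's own structures, not characters. Converting the BLOB to a CLOB changes only the character handling, not that fact. Assuming the same fnd_lobs row as above:

DECLARE
  b        BLOB;
  c        CLOB;
  d_offset INTEGER := 1;
  s_offset INTEGER := 1;
  lang     INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  warn     INTEGER;
BEGIN
  SELECT file_data INTO b FROM fnd_lobs WHERE file_id = 322420;
  DBMS_LOB.CREATETEMPORARY(c, TRUE);
  DBMS_LOB.CONVERTTOCLOB(c, b, DBMS_LOB.LOBMAXSIZE,
                         d_offset, s_offset,
                         DBMS_LOB.DEFAULT_CSID, lang, warn);
  DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(c, 200, 1));  -- first 200 chars, binary junk included
  DBMS_LOB.FREETEMPORARY(c);
END;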
I am working on loading data from a flat file into a table; the validation conditions given are below. I checked the UTL_FILE built-in package but could not figure out how to identify the column header in the flat file. A sketch based on these rules follows the list.
1. Skip the header, if any. The header is the first record, and starts with '000'.
2. Skip the trailer, if any. The trailer is the last record, and starts with '999'.
3. Log an error, but continue, if a line exceeds 512 characters.
4. Log an error, but continue, if a line is blank.
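A hedged UTL_FILE sketch implementing those four rules; the directory object DATA_DIR and file name input.dat are placeholders:

DECLARE
  f    UTL_FILE.FILE_TYPE;
  line VARCHAR2(32767);
  n    PLS_INTEGER := 0;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'input.dat', 'R', 32767);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;                  -- end of file
    END;
    n := n + 1;
    IF n = 1 AND SUBSTR(line, 1, 3) = '000' THEN
      NULL;                                          -- rule 1: skip the header
    ELSIF SUBSTR(line, 1, 3) = '999' THEN
      NULL;                                          -- rule 2: skip the trailer
    ELSIF LENGTH(line) > 512 THEN
      DBMS_OUTPUT.PUT_LINE('Line '||n||' exceeds 512 characters');  -- rule 3: log, continue
    ELSIF line IS NULL THEN
      DBMS_OUTPUT.PUT_LINE('Line '||n||' is blank'); -- rule 4: log, continue
    ELSE
      NULL;                                          -- parse and insert the record here
    END IF;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;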
When I click the Excel report button on the parameter form, Excel runs behind the form instead of in front of it, making it partially invisible; I have to minimise the form to get a clear view. This was not the case with Forms 10g. Is there extra OLE2 code to send the form to the background so that the Excel output can take over the screen?
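A hedged sketch: making the Excel window visible and activating it after the export usually brings it to the front. The handle name app is assumed to be the Excel.Application object already created in the report code.

DECLARE
  win OLE2.OBJ_TYPE;
BEGIN
  OLE2.SET_PROPERTY(app, 'Visible', 1);            -- show Excel
  win := OLE2.GET_OBJ_PROPERTY(app, 'ActiveWindow');
  OLE2.INVOKE(win, 'Activate');                    -- bring the Excel window forward
  OLE2.RELEASE_OBJ(win);
END;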
I am trying to insert a huge amount of data into another huge table, which takes around 2-3 hours. See my query below:
INSERT /*+ APPEND *//*+ NOLOGGING */ INTO DB1.Table1
SELECT * FROM DB2.Table2;
COMMIT;
Both Table1 and Table2 have the same structure. Table1 is the master table holding 100 billion records and Table2 holds 30 million records. This is a direct insert carried out each day.
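A hedged rewrite, assuming recovery rules allow NOLOGGING and parallel DML: Oracle only reads the first comment after INSERT as hints, and NOLOGGING is a table attribute rather than a hint, so the original statement effectively runs with APPEND alone. The degree of 8 is an assumption.

ALTER TABLE db1.table1 NOLOGGING;            -- NOLOGGING is set on the table, not hinted
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(t1, 8) */ INTO db1.table1 t1
SELECT /*+ PARALLEL(t2, 8) */ * FROM db2.table2 t2;
COMMIT;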
We have an MV which fetches data from around 27 tables with 26 joins, of which 25 are outer joins. Some tables in the query are referred to multiple times through different aliases, so the actual number of distinct physical tables used is 18. This MV takes about 50 minutes to refresh with the complete refresh mechanism. We decided to make it fast refresh and made these configurations:
- Created MV logs based on ROWID for each of the base tables.
- Recreated the MV using FAST refresh, with the primary key option enabled.
- Pulled ROWID for all these tables into the select column list.
Even after following all the recommendations suggested by Oracle for fast refresh MVs, we are still getting a refresh time of around 65 minutes (the refresh time increased!). We already have indexes on all the join columns of the base tables. What else do we need to do to make this a "fast" refresh MV?
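Before more restructuring, it may be worth asking Oracle why the refresh is slow: DBMS_MVIEW.EXPLAIN_MVIEW reports which fast-refresh capabilities the MV actually has and why any are disabled. A hedged sketch (the MV name is a placeholder, and MV_CAPABILITIES_TABLE must first be created with $ORACLE_HOME/rdbms/admin/utlxmv.sql):

EXEC DBMS_MVIEW.EXPLAIN_MVIEW('MY_SCHEMA.MY_MV');

SELECT capability_name, possible, msgtxt
FROM   mv_capabilities_table;

Note that with this many outer joins a "fast" (incremental) refresh can genuinely be slower than a complete one, since each MV log row fans out across the join.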
I'm extracting/retrieving data from an Oracle database using a Java application and it's a bit slow. However, when I retrieve the same data from SQL Server, it's faster than Oracle.
My ERP application responds quickly when running reports or saving entries if Oracle 10g Express Edition (XE) is installed, but on Oracle 10g Enterprise Edition or Standard Edition the same application runs very slowly.
If a table (with a primary key) is empty (after a TRUNCATE), DML against it (INSERT, UPDATE) is very quick, but if the table has many rows, around 10,000,000, the DML is very slow. Why?
We are using Oracle 9i on an AIX server. While customers were accessing the database, power was accidentally shut down. We restarted the server and the Oracle database, and everything resumed successfully.
However, "Payments by the customer" now takes a lot of time to insert even a single payment record into the database. The database is live and our customers are very frustrated.
I am using 11gR2 on Windows Server. This query runs many times a day and badly affects database performance. I don't know much about this query:
SELECT TO_CHAR(current_timestamp AT TIME ZONE 'GMT',
               'YYYY-MM-DD HH24:MI:SS TZD') AS curr_timestamp,
       COUNT(username) AS failed_count
FROM   sys.dba_audit_session
WHERE  returncode != 0
AND    TO_CHAR(timestamp, 'YYYY-MM-DD HH24:MI:SS') >=
       TO_CHAR(current_timestamp - TO_DSINTERVAL('0 0:30:00'),
               'YYYY-MM-DD HH24:MI:SS')
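This appears to be the Enterprise Manager failed-login metric; it reads SYS.AUD$ through dba_audit_session, so it slows down as the audit trail grows. A hedged sketch of purging old audit rows with DBMS_AUDIT_MGMT (available in 11g; the 90-day retention and 24-hour interval are assumptions):

BEGIN
  -- one-time setup that enables managed cleanup of the standard audit trail
  DBMS_AUDIT_MGMT.INIT_CLEANUP(
    audit_trail_type         => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
    default_cleanup_interval => 24);
  DBMS_AUDIT_MGMT.SET_LAST_ARCHIVE_TIMESTAMP(
    audit_trail_type  => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
    last_archive_time => SYSTIMESTAMP - INTERVAL '90' DAY);
  DBMS_AUDIT_MGMT.CLEAN_AUDIT_TRAIL(
    audit_trail_type        => DBMS_AUDIT_MGMT.AUDIT_TRAIL_AUD_STD,
    use_last_arch_timestamp => TRUE);
END;
/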
The product I work on requires a query to tell us what tables are dependent on certain types.
SELECT dba_tab_cols.owner,
       dba_tab_cols.table_name,
       dba_tab_cols.data_type_owner,
       dba_tab_cols.data_type
FROM   dba_tab_cols
JOIN   dba_types
       ON  dba_types.owner     = dba_tab_cols.data_type_owner
       AND dba_types.type_name = dba_tab_cols.data_type
WHERE  (dba_types.owner IN ('SCHEMA1', 'SCHEMA2'......))
I find this query to be pretty slow. I think it is because data_type_owner in dba_tab_cols is not indexed. Adding an index is not an option because users expect our product to be read-only.
A few days ago, my database server lost access to its storage box; after I rebooted it, everything worked fine again. But now the DB import process is too slow. Previously, a 100GB DB import completed within 10 hours with the server running normally; it has now been running for 2 days and is still not complete.
How should I investigate this issue? Have I perhaps missed increasing some parameters on the server or in Oracle?
Here is brief info on my server:
RAM is 16GB, SWAP size is 16GB, CPU 12 cores
SQL> show sga;
Total System Global Area 4294967296 bytes
Fixed Size                  1984144 bytes
Variable Size             369105264 bytes
Database Buffers         3909091328 bytes
Redo Buffers               14786560 bytes
I need a way to FTP a file to a remote server, reading the data from a table. I searched a couple of sites which suggested Chris's xutl_ftp package, but unfortunately the site is not accessible.
Here is the code
CREATE OR REPLACE PACKAGE UTL_FTP
AUTHID CURRENT_USER
AS
/**
 * LICENSE: GNU Lesser General Public License (LGPL)
 * Copyright (C) 2003-2006 Russ Johnson (john_2885@yahoo.com)
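Since that package is cut off here and the original site is gone, a minimal control-channel sketch with UTL_TCP may be a starting point; host, port, and credentials are placeholders, and a real transfer additionally needs PASV handling plus a data connection carrying the file contents read from the table.

DECLARE
  c    UTL_TCP.CONNECTION;
  ret  PLS_INTEGER;
  resp VARCHAR2(4000);
BEGIN
  c := UTL_TCP.OPEN_CONNECTION(remote_host => 'ftp.example.com', remote_port => 21);
  resp := UTL_TCP.GET_LINE(c, TRUE);             -- expect the 220 banner
  ret  := UTL_TCP.WRITE_LINE(c, 'USER myuser');  -- placeholder credentials
  resp := UTL_TCP.GET_LINE(c, TRUE);
  ret  := UTL_TCP.WRITE_LINE(c, 'PASS mypass');
  resp := UTL_TCP.GET_LINE(c, TRUE);
  DBMS_OUTPUT.PUT_LINE(resp);                    -- expect 230 (logged in)
  ret  := UTL_TCP.WRITE_LINE(c, 'QUIT');
  UTL_TCP.CLOSE_CONNECTION(c);
END;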