SQL & PL/SQL :: Uploading Raster Data Into ArcGIS Slow?
Mar 14, 2012
I have an issue while uploading raster data in .img format: the upload process is very slow. Previously it took 6-8 minutes for 10 images, but now it takes 30 minutes and sometimes even an hour.
I am trying to load data into the EMP table through SQL*Loader and am facing the following error.
COL1 in the data file has a length of 300.
After reducing the length to 200, the rows are inserted successfully.
Table definition: EMP ----- COL1 VARCHAR2(300)
Error in the SQL*Loader log file:
value used for ROWS parameter changed from 64 to 10
Record 1: Rejected - Error on table "SCOTT"."EMP", column COL_1.
Field in data file exceeds maximum length
Version : SQL*Loader: Release 11.2.0.3.0 - Production on Mon Mar 4 15:53:55 2013.
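The "Field in data file exceeds maximum length" rejection usually comes from SQL*Loader itself rather than from the table: character fields default to CHAR(255) in the control file, so any longer value is rejected even though the column is VARCHAR2(300). Declaring the field length explicitly normally fixes it. A minimal control-file sketch, assuming a comma-delimited file (the file name and delimiter are placeholders):

LOAD DATA
INFILE 'emp_data.dat'
APPEND
INTO TABLE SCOTT.EMP
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  COL1 CHAR(300)  -- explicit length; without it SQL*Loader assumes CHAR(255)
)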
I am running the command emca -config all db -repos recreate to recreate the repository. The drop and create phases go fine, but I am getting an error while uploading to the repository. The exception I am getting is:
CONFIG: Error uploading configuration data to the repository
oracle.sysman.emdrep.config.ConfigurationException: FATAL Configuration Exceptions
at oracle.sysman.emdrep.config.EMSchemaConfiguration.perform(EMSchemaConfiguration.java:232)
at oracle.sysman.emcp.EMReposConfig.uploadConfigDataToRepository(EMReposConfig.java:699)
at oracle.sysman.emcp.EMReposConfig.invoke(EMReposConfig.java:385)
[Code]..
And the emca config file shows an exception involving the i18n jar file:
oracle.sysman.emdrep.config.EMSchemaConfiguration$ConfigInstance run
SEVERE: null
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[Code]...
Exception Occured during Execution of oracle.sysman.emdrep.util.TransxWrapper
Why is this exception raised while uploading the repository?
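For what it's worth, one commonly suggested alternative to -repos recreate is to run the drop and create phases as two separate emca invocations, which sometimes gets past upload failures; a sketch assuming a single-instance database with Database Control:

emca -deconfig dbcontrol db -repos drop
emca -config dbcontrol db -repos create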
I am trying to insert a huge volume of data into another huge table, which takes around 2-3 hours. See my query below:
INSERT /*+ APPEND *//*+ NOLOGGING */ INTO DB1.Table1 SELECT * FROM DB2.Table2 ; COMMIT;
Both Table1 and Table2 have the same structure; Table1 is the master table holding 100 billion records, and Table2 holds 30 million records. This is a direct-path insert, and the operation is carried out each day.
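One detail worth checking: Oracle only honours the first comment in the hint position, so in /*+ APPEND *//*+ NOLOGGING */ the second comment is ignored, and NOLOGGING is not a hint at all (it is a table attribute). A hedged sketch of the usual direct-path pattern, assuming reduced recoverability is acceptable and parallel DML is available (the degree of 8 is a placeholder):

-- reduce redo generation for the direct-path load (table attribute, not a hint)
ALTER TABLE DB1.Table1 NOLOGGING;

-- allow the INSERT itself to run in parallel
ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND PARALLEL(t, 8) */ INTO DB1.Table1 t
SELECT * FROM DB2.Table2;
COMMIT;

-- restore logging afterwards (has no effect if the database is in FORCE LOGGING mode)
ALTER TABLE DB1.Table1 LOGGING;

After a direct-path insert the table cannot be queried in the same transaction until you commit, so the COMMIT directly after the INSERT is required.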
I'm retrieving data from the Oracle database using a Java application, and it's a bit slow. However, when I retrieve the same data from SQL Server, it's faster than Oracle.
I am trying to access and modify data in a table of another schema that contains 80,000-90,000 records. My procedure takes about 30 minutes to complete the operation. I need faster access to, and updating of, the table data.
Details: I have two schemas, TEST and PROD, and I am running the code below from the TEST schema.

/* CODE START HERE */
DECLARE
  exc_bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT (exc_bulk_errors, -24381);
  v_block_count   NUMBER := 1000;
[Code]....
The above code takes about 30 minutes to run.
I have also tried another approach: creating a procedure in the PROD schema to update the COMPONENT_MASTER table, and calling it from the code above, passing in the component code.

/* PROCEDURE CALL FROM ABOVE CODE IN TEST SCHEMA */
PROD.PROCEDURE_TO_UPDATE(v_comp_code);
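Since the snippet traps ORA-24381 ("error(s) in array DML"), the code is presumably built around FORALL ... SAVE EXCEPTIONS. For a cross-schema update of this size, the usual shape is to BULK COLLECT keys in chunks and issue one FORALL per chunk, which is normally far faster than row-by-row calls. A sketch under those assumptions (the column, cursor, and filter names here are illustrative, not the poster's actual ones):

DECLARE
  exc_bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT (exc_bulk_errors, -24381);
  v_block_count   CONSTANT PLS_INTEGER := 1000;

  TYPE t_codes IS TABLE OF prod.component_master.comp_code%TYPE;  -- hypothetical column
  v_codes t_codes;

  CURSOR c_codes IS
    SELECT comp_code FROM prod.component_master WHERE status = 'PENDING';  -- illustrative filter
BEGIN
  OPEN c_codes;
  LOOP
    FETCH c_codes BULK COLLECT INTO v_codes LIMIT v_block_count;
    EXIT WHEN v_codes.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. v_codes.COUNT SAVE EXCEPTIONS
        UPDATE prod.component_master
           SET last_updated = SYSDATE          -- hypothetical column
         WHERE comp_code = v_codes(i);
    EXCEPTION
      WHEN exc_bulk_errors THEN
        -- report the failed rows but keep processing the remaining chunks
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                               ': ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
    COMMIT;  -- one commit per chunk keeps undo usage bounded
  END LOOP;
  CLOSE c_codes;
END;
/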
Getting an error while uploading a payment using the API ap_checks_pkg.Insert_Row:
ORA-20001: APP-SQLAP-10000: ORA-01403: no data found occurred in AP_AC_TABLE_HANDLER_PKG.INSERT_ROW<-AP_CHECKS_PKG.INSERT_ROW<- with parameters (ROWID = , CHECK_ID = 39368) while performing the follow
I have a form that uploads an Excel file to the database. I added a Browse button for the file to be uploaded and capture its path (e.g. c:\upload.xls). When the upload procedure runs, it always searches for that path on the application server, and that's wrong; I want it to search for the path on the client machine.
PROCEDURE XLS IS
BEGIN
  DECLARE
    application OLE2.OBJ_TYPE;
    workbooks   OLE2.OBJ_TYPE;
    workbook    OLE2.OBJ_TYPE;
    worksheets  OLE2.OBJ_TYPE;
    worksheet   OLE2.OBJ_TYPE;
    cell        OLE2.OBJ_TYPE;
    args        OLE2.OBJ_TYPE;
[code]........
You can see this line (5th line from the BEGIN block):
filename := :block7.FILE_NAME;
:block7.FILE_NAME is the path I gave as an example, c:\upload.xls. That path is on the client machine, but the form tries to open it from the application server.
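That is expected in a three-tier deployment: plain OLE2 calls execute on the application server. WebUtil ships a CLIENT_OLE2 package with the same interface that executes on the client machine instead, so prefixing the OLE2 calls with CLIENT_ is usually the fix. A sketch assuming WebUtil is attached to the form (the Excel automation follows the same pattern as the original code):

DECLARE
  application CLIENT_OLE2.OBJ_TYPE;
  workbooks   CLIENT_OLE2.OBJ_TYPE;
  workbook    CLIENT_OLE2.OBJ_TYPE;
  args        CLIENT_OLE2.OBJ_TYPE;
BEGIN
  application := CLIENT_OLE2.CREATE_OBJ('Excel.Application');
  workbooks   := CLIENT_OLE2.GET_OBJ_PROPERTY(application, 'Workbooks');
  args        := CLIENT_OLE2.CREATE_ARGLIST;
  CLIENT_OLE2.ADD_ARG(args, :block7.FILE_NAME);  -- now resolved on the client machine
  workbook    := CLIENT_OLE2.GET_OBJ_PROPERTY(workbooks, 'Open', args);
  CLIENT_OLE2.DESTROY_ARGLIST(args);
  -- ... read the worksheet cells as before, then release everything
  CLIENT_OLE2.RELEASE_OBJ(workbook);
  CLIENT_OLE2.RELEASE_OBJ(workbooks);
  CLIENT_OLE2.RELEASE_OBJ(application);
END;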
When I upload data from an Excel file to my form on Windows, the screen works fine, but when I upload the same Excel file to my form on AIX (where the application is running), the data is uploaded with decimal points.
The field is a CHAR type and no format mask is set.
E.g.: Actual data: 4080026; In Form (Windows): 4080026; In Form (AIX): 4080026.0
Likewise, when I upload data from an Excel file to my form on Windows, the screen works fine, but when I upload the same Excel file to my form on AIX (where the application is running), the data is uploaded incorrectly. In this case the field is a NUMBER type and no format mask is set.
E.g.: Actual data: 3101513480750030000; In Form (Windows): 3101513480750030000; In Form (AIX): 3101513480750029800
I have multibyte CSV files (an extract from BI): Excel identifies them as "Unicode text", and when I save them from Excel as "Text CSV", they take up half the size on disk.
Here is the piece of code where the uploaded file gets converted from BLOB to CLOB and then to VARCHAR2 (the CSV utility from Oleg Lihvoinen, [URL]...):
SELECT blob_content INTO v_blob_data FROM wwv_flow_files WHERE NAME = p_file_name;
[code]...
I have tried different values for blob_csid := 873; but without any visible effect. (By the way, the list of possible values for this parameter is very difficult to find. I know there is a function that maps a character-set name to a character-set ID, but a list would be great.) If I use the APEX CSV uploader app, the result is the same as with this code.
Here is an example of what I get:
�O�R�A�C�L�E�
instead of:
ORACLE
How can I import these files without first converting them in Excel?
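The symptom (a stray character between every letter, and the file halving in size when re-saved) is typical of a UTF-16 file being decoded as a single-byte character set; Excel's "Unicode Text" export is UTF-16LE. The mapping function alluded to above is NLS_CHARSET_ID, and the accepted names can be listed from V$NLS_VALID_VALUES. A sketch, assuming the file really is UTF-16LE and that p_file_name identifies the row in WWV_FLOW_FILES as in the original utility:

DECLARE
  v_blob     BLOB;
  v_clob     CLOB;
  v_dest_off INTEGER := 1;
  v_src_off  INTEGER := 1;
  v_lang     INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  v_warn     INTEGER;
BEGIN
  SELECT blob_content INTO v_blob FROM wwv_flow_files WHERE name = p_file_name;
  DBMS_LOB.CREATETEMPORARY(v_clob, TRUE);
  DBMS_LOB.CONVERTTOCLOB(
    dest_lob     => v_clob,
    src_blob     => v_blob,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => v_dest_off,
    src_offset   => v_src_off,
    blob_csid    => NLS_CHARSET_ID('AL16UTF16LE'),  -- assumption: UTF-16 little-endian source
    lang_context => v_lang,
    warning      => v_warn);
  -- v_clob now holds readable text; parse the CSV from here
END;

To see every name NLS_CHARSET_ID accepts:

SELECT value FROM v$nls_valid_values WHERE parameter = 'CHARACTERSET' ORDER BY value;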
I got error "Upload New Image must be specified. Click browse to select an image from your local computer" while i upload image in shared components>image>create ...
my logo is saved in local drive D. when i browse file and click to upload it gives error that i mentioned. This is happening with every uploading weather its image or weather its plugin.
I get an ERR-7621 from APEX whenever I do anything in an application that tries to read a file, for example importing images, CSS files, themes, or applications. Even the Data Loader app gets the error if you choose to load a CSV file. The following appears in my APEX Listener log (version 2 Early Adopter). I am running APEX 4.1.1 and also have another server running the same versions where the problem does not exist. The log output whenever the load occurs is:
Sep 28, 2012 11:21:18 AM com.sun.grizzly.http.servlet.ServletAdapter doService
SEVERE: service exception:
java.lang.NumberFormatException: For input string: ""
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
I'm uploading data from Excel to our database using Forms through WebUtil. The problem is that if I open the same Excel file while the upload is in progress, the upload stops, but no error is displayed, not even when I click Help > Show Error.
How can I prevent this, and how can I trap and display an error message?
I have been searching the forum (and Google) for tips on how to ensure users can only upload files of specific formats (Word, PDF, etc.) for specific document types as defined within the application (e.g. Curriculum Vitae (Word), Copy of Transcripts (PDF)).
While I have used this research to start work on a server-side solution, I would like to know if there is an APEX-friendly way to validate a file-browse item based on MIME type.
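One APEX-native option is a page validation of type "PL/SQL Function Returning Boolean" that checks the MIME type recorded with the upload. A sketch, assuming the file-browse item stores into the default WWV_FLOW_FILES table and is named P10_FILE (both names are illustrative); bear in mind the MIME type is supplied by the browser, so a server-side content check remains worthwhile:

DECLARE
  v_mime wwv_flow_files.mime_type%TYPE;
BEGIN
  SELECT mime_type
    INTO v_mime
    FROM wwv_flow_files
   WHERE name = :P10_FILE;  -- hypothetical item name
  -- accept only PDF and Word for this document type
  RETURN v_mime IN ('application/pdf',
                    'application/msword',
                    'application/vnd.openxmlformats-officedocument.wordprocessingml.document');
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN FALSE;  -- nothing was uploaded
END;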
I am trying to load an image item, which is not a database item, from the operating system. I am using Forms 10g on database 10g. My platform is Windows Vista SP1.
I am trying to use:
GO_ITEM('BEEEE');
Set_Custom_Property('SALGRADE.BEEEE', 1, 'READIMGFILE', 'C:/users/ajayia/Desktop/Beeee.jpg');
But I just get a blank image at runtime. I am using the WHEN-NEW-BLOCK-INSTANCE trigger; I have also tried WHEN-NEW-FORM-INSTANCE, but that doesn't work either. The file I am trying to load is in JPG format, but I can't find that file type among the image item's properties. What should I change it to, or how do I go about changing it?
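Set_Custom_Property with READIMGFILE targets a Java bean item, so on a standard image item it does nothing. For a plain Forms image item the built-in READ_IMAGE_FILE is the usual route; JPG is among its documented file types, and the file-type argument may even be passed as NULL to let Forms detect it. A minimal sketch reusing the item and path from the post:

-- e.g. in WHEN-NEW-FORM-INSTANCE
GO_ITEM('SALGRADE.BEEEE');
READ_IMAGE_FILE('C:\users\ajayia\Desktop\Beeee.jpg', 'JPG', 'SALGRADE.BEEEE');

IF NOT FORM_SUCCESS THEN
  MESSAGE('Could not load the image file.');
END IF;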
I want to call a web form that uploads an image from my local machine. I have created a form that takes the necessary data about an employee; now I want to insert an image for that employee into a table, and as I am new, I am stuck on the image-upload form. I have since seen the Enter & Maintain form, which has a Picture button; pressing this button opens a new web form from which the image can be uploaded.
The issue is slow insertion into one particular table (table A). Inserts into all the other tables (B, C, D) in the same schema behave properly; only inserts into table A take a long time to complete. Daily insertion is 6,000 rows.
I have checked all the usual details: tablespace size, analysis of the table, analysis of the indexes, and so on. There are no errors in the alert log file.
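When one table is slow while its neighbours are fine, the usual suspects are row-level triggers, a large number of indexes to maintain, or foreign-key checks on every insert. A diagnostic sketch against the dictionary, assuming the table is called A and lives in a schema named APPOWNER (both placeholders):

-- any row-level triggers firing per insert?
SELECT trigger_name, triggering_event, status
  FROM dba_triggers
 WHERE table_owner = 'APPOWNER' AND table_name = 'A';

-- how many indexes must be maintained per insert?
SELECT index_name, index_type, status
  FROM dba_indexes
 WHERE table_owner = 'APPOWNER' AND table_name = 'A';

-- referential constraints checked on every row
SELECT constraint_name, r_constraint_name, status
  FROM dba_constraints
 WHERE owner = 'APPOWNER' AND table_name = 'A' AND constraint_type = 'R';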
We have two database instances on the same server. One was left at 9.2.0.7 and one was upgraded to 10.2.0.3. Connecting externally (sqlplus '/ as sysdba') to the 9.2.0.7 database is lightning fast; connecting externally to the 10.2.0.3 database is very slow, comparatively speaking. This is on an IBM AIX 5L (64-bit) machine, and we are using tnsnames.
I am migrating an Oracle 9i database to 11g R3, and I can only use imp. As the database is huge, I have split the export dump by schema. In my recent test I split the schemas across 4 separate imp threads running against the new Oracle 11g database, each thread covering schemas of roughly similar total size (e.g. thread 1: schemas 1, 2, 3; thread 2: schemas 4, 5, 6; and so on).
All the dump files sit on the same mount point. When I execute the 4 import threads together, each thread takes between 2.5 and 3.5 days.
When I then tried a single thread on its own, it took only 2 hours. So could this be an I/O issue or an Oracle memory problem?
We have cloned our Oracle Applications (11i) environment, and since then the performance of the ERP has slowed down (for example, fetching data). What can we do to increase the performance?
We are busy migrating a database from a Windows 2003 / Oracle 10g platform to Linux and Oracle 11gR2.
We exported/imported the data and it looks OK, and the explain plans look the same, but our heavy batches are twice as slow as on the Windows box. The two top wait events are disk-related, sequential and scattered reads, and they account for 90% of the batch job's time. I read some white papers and found that using ASM can be bad in some cases, and the same goes for Linux with this particular kind of scattered reads. I was wondering whether simply changing the SGA to 10GB instead of 4GB, to get more cache, would speed things up.
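If you want to test the bigger-cache theory cheaply, resizing the SGA is straightforward. A sketch assuming an spfile, automatic shared memory management, and enough physical RAM on the box (on 11gR2 you could alternatively work with memory_target):

-- as SYSDBA; a restart is needed because sga_max_size is a static parameter
ALTER SYSTEM SET sga_max_size = 10G SCOPE = SPFILE;
ALTER SYSTEM SET sga_target   = 10G SCOPE = SPFILE;
SHUTDOWN IMMEDIATE
STARTUP

-- verify after restart
SHOW PARAMETER sga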
We are using a piece of software, a test tool, to measure database posting speed from the server to client systems. On Windows 2008 R2, database posting speed is very slow compared to Windows 2003 Server.
The server configuration is the same on both machines (RAID 5, 4 GB RAM). How can we improve write performance in Oracle?
Database: Oracle 8i. Application Server: Oracle AS 9i. Developer Suite: Oracle 6i (Forms & Reports).
I have created some character-mode reports in Oracle Reports 6i. When the reports are run from my ERP (built on Oracle 6i), generating a report on the server usually takes a long time, and sometimes my ERP hangs because of the busy report generation; we then have to kill some processes to get the character reports out on an emergency basis. What is a valid reason for the slow generation of a character-mode report?
I have a nightly import (about 20 tables) and it takes up to 5 hours. We have one table of about 800,000 rows and the rest are between 1,000 and 200,000 rows, so this is very slow. When I monitor the import I see a very long wait on SQL*Net message from client,
even though I run the import on the database server itself. If I check the current statement, I see it moving from one query to the next; for instance I see:

SELECT /*+ all_rows ordered */ "A".ROWID, 'REPORT', 'CONTRACT_LVL', 'SYS_C001329497'
FROM "REPORT"."CONTRACT_LVL" "A"
WHERE NOT (LENGTH (bonus_nat) <= 31)

then

SELECT /*+ all_rows ordered */ "A".ROWID, 'REPORT', 'CONTRACT_LVL', 'SYS_C001684584'
FROM "REPORT"."CONTRACT_LVL" "A"
WHERE NOT (LENGTH (outcome_cd) <= 1)

and so on, and it takes hours. The database is on Windows 2003 running Oracle RDBMS 9.2.0.7, and by then the import screen shows 185,000 rows imported. I also see the session's consistent gets rising steeply during that time. Would it be better to export/import without statistics?
I should also mention that the dump file comes from a Linux-hosted database, though I don't think that makes a difference for exp/imp. It's a PeopleSoft database with more than 15,000 tables, and if I take the table mentioned above and try to check its constraints, it takes ages before Toad can display them. I have seen that we have an incredible number of constraints on those tables, which might be the reason.
I just wonder if the system catalog needs to be tuned. /* Update */ Not sure why, but the huge number of waits now shows up as "library cache lock".
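Those SELECT ... WHERE NOT (LENGTH(col) <= n) statements are Oracle validating check constraints as imp re-enables them, and each validation is a full scan of the table; with the number of constraints PeopleSoft carries, that can easily dominate the run. A common workaround is to import without constraints and create them afterwards with NOVALIDATE so existing rows are not rescanned (the DDL can be pulled from the dump with imp ... show=y). A sketch with placeholder file names and passwords:

imp system/password file=nightly.dmp log=nightly_imp.log fromuser=REPORT touser=REPORT constraints=n statistics=none

-- then, for each check constraint (name and definition here are placeholders):
ALTER TABLE report.contract_lvl ADD CONSTRAINT contract_lvl_ck1
  CHECK (LENGTH(bonus_nat) <= 31) ENABLE NOVALIDATE;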
I am working on an SAP application migration project using an Oracle 10.2.0.2 database. We are migrating the application from Windows to Solaris.
During the process we are facing a problem with a very slow insert operation on one particular table. The server's capacity is very good, so there is no resource bottleneck.
The table contains around 270,000 rows, and rows are being inserted at around 100 rows per 10 seconds.
The table contains the following data types:
SQL> desc SAPDATDB.CAF_GP_VALDEF
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 VAL_UUID                                  NOT NULL NVARCHAR2(34)
 VAL_GUID                                  NOT NULL NUMBER(10)
 VAL_CLOB                                           NCLOB
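With an NCLOB in the row, insert throughput often hinges on the LOB segment's storage options: a NOCACHE LOB (the default) does direct physical I/O for every row, which can throttle inserts to roughly this kind of rate. A diagnostic-plus-fix sketch, on the assumption that the default LOB settings are in effect:

-- check how the LOB column is stored
SELECT column_name, cache, logging, in_row
  FROM dba_lobs
 WHERE owner = 'SAPDATDB' AND table_name = 'CAF_GP_VALDEF';

-- route LOB I/O through the buffer cache
ALTER TABLE SAPDATDB.CAF_GP_VALDEF MODIFY LOB (VAL_CLOB) (CACHE);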