XML DB :: Error When Parsing XML Data That Is Loaded Using SQL Loader
Sep 7, 2012
I am loading the content of an XML file into a table using SQL*Loader. Below is my control file script -
LOAD DATA
INFILE *
INTO TABLE xx_cc_response_xml_stg TRUNCATE
xmltype(XML_DATA)
FIELDS
( COLUMN_ID constant 1,
file_name filler char(4000),
XML_DATA LOBFILE(file_name) TERMINATED BY EOF)
BEGINDATA
B2B_MasterDataUpdate_20120906152137.xml
------------------------------------------------------------------------------------
The file B2B_MasterDataUpdate_20120906152137.xml is correct and the XML is well formed. When I query the XML_DATA column (datatype XMLType) in the table, I cannot see any content in the record; it just appears as (XMLTYPE). When I parse this XML using the below,
select value(d)
from xxnbn_cc_response_xml_stg a,
table(xmlsequence(extract(a.xml_data,'/InventorySearch'))) d;
------------------------------------------------------------------------------------
I get this error:
------------------------------------------------------------------------------------
ORA-00600: internal error code, arguments: [qmcxdsSelf4], [], [], [], [], [], [], [], [], [], [], []
00600. 00000 - "internal error code, arguments: [%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s]"
*Cause: This is the generic internal error number for Oracle program
exceptions. This indicates that a process has encountered an
exceptional condition.
*Action: Report as a bug - the first argument is the internal error number
------------------------------------------------------------------------------------
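ORA-00600 is an internal error, so this is ultimately a support/patch case, but before filing it, it may be worth trying the XMLTABLE syntax instead of the deprecated extract/xmlsequence combination; a minimal sketch against the same table, assuming InventorySearch is the document root:

select x.column_value
from xx_cc_response_xml_stg a,
     xmltable('/InventorySearch' passing a.xml_data) x;

If XMLTABLE returns the expected content, the failure is specific to the extract/xmlsequence code path.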
LOAD DATA INFILE "gateway.csv" truncate INTO TABLE GATEWAY Fields terminated by "," Optionally enclosed by '"' trailing nullcols
[code]....
and I got the following error:
zcyds891:/opt/oracle> sqlldr gwcem/gwcem@pfs control=gateway.ctl log=/tmp/ldr.log bad=/tmp/bad.log

SQL*Loader: Release 9.2.0.8.0 - Production on Tue Dec 7 05:07:59 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

SQL*Loader-350: Syntax error at line 12.
Expecting "," or ")", found "INTERGER".
GATEWAYPROTOCOL INTERGER,
                ^
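The message itself contains the fix: "INTERGER" is a misspelling of a keyword the parser knows. SQL*Loader's keyword is INTEGER, or INTEGER EXTERNAL for numbers stored as text in a CSV, so the offending line presumably should read something like:

GATEWAYPROTOCOL INTEGER EXTERNAL,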
I have a few tables in Oracle 9i/10g, and they already have data in them. I am trying to migrate data coming from various source systems into these Oracle tables. There is a chance that after loading I might get some unwanted data in them.
How do I remove just the data that I have loaded recently, without disturbing the original data the tables already have?
I could back up those tables and reload the data if there is any problem, but I am looking for a different approach, as shown below. I just don't want to change the existing system, as a lot of users use it.
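One low-impact approach (a sketch of one option, not necessarily the best fit here): stamp every load with a batch identifier in an audit column, so a bad load can be deleted precisely without touching pre-existing rows. The column, table, and tag names below are hypothetical.

-- in the control file: tag each loaded row
LOAD_BATCH_ID CONSTANT 'RUN_20121001_1',

-- to back out just that load later, leaving older rows untouched:
DELETE FROM target_table WHERE load_batch_id = 'RUN_20121001_1';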
I'm trying to create a relation from the child block to the master block that I've created. The foreign key is there from the child to the parent table. The error I get is below:
FRM-15004: Error while parsing join condition
The join condition is correct, but I'm still clueless as to what the cause could be. Is it a Forms bug?
I am trying to create an EXT table but am constantly having the following problem, and I am not sure why. I have done a few checks and used the standard scripts, but I am still experiencing an error.
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "minussign": expecting one of: "double-quoted-string, identifier, single-quoted-string"
KUP-01007: at line 9 column 1
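KUP-01005 reporting "minussign" means the access-parameter parser hit something beginning with a hyphen at line 9; a classic culprit is a "--" comment line, which ORACLE_LOADER access parameters do not accept. A minimal sketch of a clean, comment-free ACCESS PARAMETERS section (the directory object and options are assumptions):

ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  BADFILE log_dir:'ext.bad'
  LOGFILE log_dir:'ext.log'
  FIELDS TERMINATED BY ','
  OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
)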
I receive a "FRM-15004 Error while parsing join condition" when attempting to create a relation between block1 (parent table) and block2 (child table). If I do a simple straight join the statement is accepted but if I use a Decode then it results in an error. How to successfully use a decode or a nvl in a join statement of a relation?
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Solaris: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production

I'm trying to load a small table (110 rows, 6 columns). One of the columns, called NOTES, errors when I run the load, saying that the column size exceeds the max limit. As you can see here, the table column is set to 4000 bytes:
CREATE TABLE NRIS.NRN_REPORT_NOTES (
  NOTES_CN     VARCHAR2(40 BYTE) DEFAULT sys_guid() NOT NULL,
  REPORT_GROUP VARCHAR2(100 BYTE) NOT NULL,
  AREACODE     VARCHAR2(50 BYTE) NOT NULL,
  ROUND        NUMBER(3) NOT NULL,
  NOTES        VARCHAR2(4000 BYTE),
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "binary_double,
[code].....
This happens when I select count(*) on the external table created below.
SQL> select * from v$instance;

INSTANCE_NUMBER INSTANCE_NAME
--------------- ----------------
HOST_NAME
----------------------------------------------------------------
VERSION           STARTUP_T STATUS       PAR    THREAD# ARCHIVE LOG_SWITCH_WAIT
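Coming back to the NOTES failure above, one thing worth checking: both SQL*Loader and ORACLE_LOADER external tables default character fields to 255 bytes regardless of the table column's size, so a 4000-byte column must be sized explicitly in the field list, e.g.:

NOTES CHAR(4000)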
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables. The table export works fine, but I encounter the following error when I import the file (I tried data only and data + DDL).
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...= ORA-39001: argument value invalid ORA-39000: .... ORA-31619: ...
The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are similar.
Is there any way to find out the division between the time taken for query parsing, creating the execution plan, and the actual data retrieval separately? If I enable 'set timing on', I see the elapsed time, which is the total time taken for all three. Some of my queries take a long time the first time I run them, so I want to know what is taking so long: is it the parsing or creating the execution plan, and if so, what can I optimize?
The problem is: our HFM application is running very slowly, but we don't know where the bottleneck is. I would like to prove that the DB is not the one causing the issue.
The desired solution: show AWR metrics that indicate the utilization of the DB.
I've been reading through a lot of articles on the net (Oracle sites and others), but I cannot find a clear AWR baseline that will tell me whether my database is heavily loaded, i.e. how much of my DB capacity I am using in a given time period. The different OEM graphs show: during non-peak hours, the DB is relatively idle; at peak time, the graphs suggest we are using just 30-40% of max capacity. This is when a data load into the DB happens.
So, is my DB loaded if my AWR Load Profile stats are:
Load Profile                  Per Second    Per Transaction    Per Exec    Per Call
~~~~~~~~~~~~             ---------------    ---------------  ----------  ----------
      DB Time(s):                    1.0                0.3        0.00        0.00
       DB CPU(s):                    0.8                0.3        0.00        0.00
       Redo size:              408,893.4          125,411.1
   Logical reads:                5,606.3            1,719.5
   Block changes:                2,119.2              650.0
  Physical reads:                  455.9              139.8
 Physical writes:                   83.0               25.5
      User calls:                  469.4              144.0
          Parses:                   82.4               25.3
     Hard parses:                   44.5               13.7
W/A MB processed:            2,383,203.7          730,949.0
          Logons:                    0.2                0.1
        Executes:                  345.9              106.1
       Rollbacks:                    0.0                0.0
    Transactions:                    3.3
Top 5 Timed Foreground Events
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                                                            Avg wait   % DB
Event                               Waits      Time(s)          (ms)   time  Wait Class
------------------------------ ------------ -----------   ----------- ------ ----------
DB CPU                                             8,687                 79.5
db file sequential read           1,699,159        1,539             1   14.1  User I/O
log file sync                        35,518          170             5    1.6  Commit
direct path read                    418,577          165             0    1.5  User I/O
enq: TX - index contention            8,136          121            15    1.1  Concurrenc

Host CPU (CPUs: 64 Cores: 8 Sockets: 1)
~~~~~~~~
Load Average Begin    End      %User   %System   %WIO   %Idle
          1.03        1.29       1.7       0.5    0.0    97.8
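Taking these numbers at face value: DB Time of 1.0 s/s means roughly one active session on average, and DB CPU of 0.8 s/s spread over 64 CPUs works out to about 0.8 / 64 ≈ 1.3% database-driven CPU, which lines up with the 97.8% idle host figure. On this profile the database looks lightly loaded, so if HFM is slow, the evidence points away from the DB.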
I have a table emp_up; daily, this table is uploaded by a SQL*Loader (with the REPLACE option) script run by a UNIX job. There is no timestamp column in this table. Is it possible to know when / at what time the table was uploaded?
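If no timestamp column exists, one approximation is the ORA_ROWSCN pseudocolumn converted to a timestamp; a sketch, with real caveats: without ROWDEPENDENCIES the SCN is tracked per block rather than per row, and SCN_TO_TIMESTAMP only maps fairly recent SCNs (a few days, depending on retention).

SELECT SCN_TO_TIMESTAMP(MAX(ORA_ROWSCN)) AS approx_load_time
FROM emp_up;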
How can I get the name of the partition that was most recently loaded? When my load starts, it truncates the partition and loads the data. Once the data is loaded, there is another job which reads from the partition loaded earlier. I want to know, on the fly, which partition was loaded most recently.
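One possibility (a sketch; it relies on DML monitoring, and the view is only current after the monitoring info is flushed): user_tab_modifications records inserts and truncates per partition with a timestamp. The table name below is hypothetical.

EXEC DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO;

SELECT partition_name, inserts, truncated, timestamp
FROM   user_tab_modifications
WHERE  table_name = 'MY_PART_TABLE'
ORDER  BY timestamp DESC;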
Ideally we would include the DESCRIPTION in tnsnames.ora, but it is taking time to contact the DBA here, hence we tried this workaround. The same connect string works fine with SQL*Plus, but SQL*Loader gives this error:
LRM-00116: syntax error at 'ADDRESS' following '('
SQL*Loader: Release 11.2.0.2.0 - Production on Thu Jan 10 21:31:32 2013
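If a plain host/port/service is enough (i.e., the DESCRIPTION was not needed for failover or similar), one workaround that avoids parentheses on the command line entirely is the EZConnect form, which the 11.2 client accepts; the host, port, service, and file names below are placeholders:

sqlldr scott/tiger@//dbhost.example.com:1521/ORCL control=load.ctl log=load.log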
My Oracle database sits on UNIX. I have a SQL*Loader script which loads data into Oracle every 10 minutes, and a bad file is written into a directory. Since the file names are the same, it overwrites the bad file whenever there are error records. I can devise code to write the bad file with a different name, but I want to write the error records into an Oracle table instead. Is this possible, and how can I achieve it?
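SQL*Loader itself only writes bad files, but the same flat file can be read through an external table and loaded with DML error logging, which captures rejected rows in a table. A sketch, where the external table and target names are hypothetical (note: rows the external table itself cannot parse still go to its bad file; LOG ERRORS catches rows that fail during the INSERT):

-- one-time setup: creates ERR$_TARGET_TABLE
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG('TARGET_TABLE');
END;
/

INSERT INTO target_table
SELECT * FROM ext_stage
LOG ERRORS INTO err$_target_table (TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'))
REJECT LIMIT UNLIMITED;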
I have a table which is loaded by SQL*Loader. When I load the table without direct path set to TRUE it works well, but when DIRECT is set to TRUE, it fails with the following error:
SQL*Loader-702: Internal error - Unknown column for OCI_ATTR_COL_COUNT
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
The control file looks like below:
load data
BADFILE '/backup/temp/rajesh/RIO/BadFiles/FILENAME'
append into table TEMP_rio_RESP_TIME_LND
TRAILING NULLCOLS
( INSTALLATION_ID CHAR
Since there is an extra double quote (denoting inches) in the third column, I'm getting an error. Is there any way to avoid this error without modifying the CSV file?
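Depending on where the stray quote falls, one workaround sketch that leaves the CSV untouched is to override the enclosure for just that field and strip any quote characters with a SQL string in the control file; the field name below is hypothetical:

ITEM_SIZE CHAR(100) TERMINATED BY ',' "REPLACE(:ITEM_SIZE, CHR(34))"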
I have an Oracle 10g (10.2.0.3) database. I am exporting the schema using Grid Control 10g.
Servers:
Source: Oracle 10.2.0.3.0
To: Oracle RAC 11.2.0.3
I am trying to import the schema into the Oracle RAC 11gR2 (11.2.0.3) server using Cloud Control 12c, and it presents the following error:
Error: The job IMPORTA_EGESTAO_ESQUEMA_NOVO was reopened on Tuesday, 05 March 2013 11:02
Restarting "SYSTEM"."IMPORTA_EGESTAO_ESQUEMA_NOVO":
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Execute Failed: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at line 18
ORA-06512: at line 44
(DBD ERROR: OCIStmtExecute)
I am trying to load the data from a .csv file into the table using SQL*Loader.
The table has the following schema:

src_id  : number
dest_id : number
range   : intsys.interval_typ  --> a type containing (lower, upper)
payload : varchar2(100)
The loader.ctl file is:
load data
infile *
append into table sb_packet
fields terminated by "," optionally enclosed by ' " '
(src_id, dest_id, range, payload)
BEGINDATA
3,32,intsys.interval_typ(10043,142703),"misc data"
When I use this ctl file to transfer the data, I get the following error:
SQL*Loader-418: Bad datafile datatype for column RANGE
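SQL*Loader-418 arises here because the constructor call in the data is just text to SQL*Loader; it cannot evaluate intsys.interval_typ(...) from the datafile. One workaround sketch: carry the two attribute values as BOUNDFILLER fields (which, unlike plain FILLER, can be referenced in SQL strings) and build the object with a SQL expression. This assumes interval_typ can be constructed from its two numeric bounds, and it changes the datafile layout accordingly:

load data
infile *
append into table sb_packet
fields terminated by "," optionally enclosed by '"'
( src_id,
  dest_id,
  r_low  BOUNDFILLER,
  r_high BOUNDFILLER,
  range  EXPRESSION "intsys.interval_typ(:r_low, :r_high)",
  payload
)
BEGINDATA
3,32,10043,142703,"misc data"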
The function below has been used to transform data and is called in the SQL*Loader control file:

CREATE OR REPLACE function return_domain( domain_name varchar2 )
return varchar2 as
  v_dmn varchar2(100);
[code]...
The SQL*Loader control file is as below:
load data
BADFILE '/backup/temp/rajesh/CERNER/BadFiles/FILENAME'
append into table TEMP_CERNER_RESP_TIME_LND
WHEN CLINICAL_TRANSACTION_ID = 'USR:ERM PMSEARCH ENCOUNTER RESULTS DISPLAY'
TRAILING NULLCOLS
[code]...
The function takes the parameter as 'DOMAIN50_LPAR5002_slainterval051712_rj35cmi102_08_45_00.csv'.
FILENAME in the control file will be replaced by DOMAIN50_LPAR5002_slainterval051712_rj35cmi102_08_45_00.csv. When I run the loader, I get the below error:
Record 1: Rejected - Error on table TEMP_CERNER_RESP_TIME_LND.
ORA-00604: error occurred at recursive SQL level 1
ORA-01843: not a valid month
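ORA-01843 in recursive SQL often means a date value is being parsed with the session's default NLS format rather than the format actually present in the file. Giving the date field an explicit mask in the control file removes that dependency; the field name and mask below are hypothetical and should match what the CSV actually contains:

RESPONSE_DATE DATE "MM/DD/YYYY HH24:MI:SS"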
I have one main form, and there is one button; when I click on it, another form should open which displays multiple records with a check box for each record.
I have implemented this so far. When the new form is opened, by default all the records should be checked; if the user does not want some of them, they can be unchecked (this can be done manually).
I have put the below code in a WHEN-NEW-BLOCK-INSTANCE trigger.
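For records fetched by a query, WHEN-NEW-BLOCK-INSTANCE fires only once, so the usual place to default a per-record checkbox is the block's POST-QUERY trigger, which fires once per fetched row; the block/item names and checked value below are hypothetical:

-- POST-QUERY trigger on the multi-record block
:multi_blk.select_flag := 'Y';  -- 'Y' = the item's "Value when Checked"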
When I run my form, I immediately get this error message:
oracle.forms.webutil.file.FileFunctions bean not found. CLIENT_TEXT_IO.fopen will not work
When I searched on Metalink I found a solution, but I do not know how to implement it:
Symptoms: When running a customized WebUtil form, an error similar to the following is displayed:
oracle.forms.webutil.file.FileFunctions bean not found. CLIENT_TEXT_IO.fopen will not work.
The exact function that "will not work" may change in the error message depending on the WebUtil code used. There are no errors displayed in the Java Console. The original WebUtil Demo Form runs successfully. Cause: The WebUtil code that is failing is placed in triggers that fire before Forms instantiates the WebUtil PJCs.
This is not allowed when using WebUtil and it is explained in the WebUtil User Guide release 1.0.6, section 5.3: "Once the WebUtil library has been attached to your form you can start to add calls to the various PL/SQL APIs defined by the utility. However, there is an important restriction in the use of WebUtil functions: WebUtil can only start to communicate with the client once the Form has instantiated the WebUtil PJCs. This means that you cannot call WebUtil functions before the Forms user interface is rendered.
This would include triggers such as PRE-FORM, WHEN-NEW-FORM-INSTANCE and WHEN-NEW-BLOCK-INSTANCE for the first block in the Form." Solution: Do not use WebUtil code in triggers like PRE-FORM, WHEN-NEW-FORM-INSTANCE and WHEN-NEW-BLOCK-INSTANCE, because these triggers fire before Forms instantiates the WebUtil PJCs. Instead, you can test the WebUtil functionality with a WHEN-BUTTON-PRESSED trigger, and you can use it in your application in other allowed locations.
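A sketch of the relocation the note describes: the same CLIENT_TEXT_IO call that fails in WHEN-NEW-FORM-INSTANCE works once moved to an allowed trigger such as WHEN-BUTTON-PRESSED (the file path below is a placeholder):

-- WHEN-BUTTON-PRESSED
DECLARE
  f CLIENT_TEXT_IO.FILE_TYPE;
BEGIN
  f := CLIENT_TEXT_IO.FOPEN('C:\temp\demo.txt', 'R');
  -- ... read with CLIENT_TEXT_IO.GET_LINE as needed ...
  CLIENT_TEXT_IO.FCLOSE(f);
END;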