Server Utilities :: Sqlloader Detect Invisible Characters

Aug 20, 2013

Is there a way to detect bogus characters in the datafile?

SQL*Loader on the original file gives:

Record 1: Rejected - Error on table DP, column STARTTIME.

ORA-01858: a non-numeric character was found where a numeric was expected

If I copy the data into the control file using Notepad++, there are no errors.
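One way to pinpoint such characters, as a sketch: load the rejected lines unparsed into a single-column staging table (BAD_LINES here is a hypothetical name) and dump the raw bytes of any row that contains non-printable characters:

-- BAD_LINES(line VARCHAR2(4000)) is assumed to hold the rejected records as plain text
SELECT line,
       DUMP(line, 1016) AS bytes_hex            -- 1016 = hex dump plus character set name
FROM   bad_lines
WHERE  REGEXP_LIKE(line, '[^[:print:]]');       -- only rows containing non-printable characters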

View 13 Replies



Server Utilities :: CONCAT Fails In SQLLoader?

Mar 2, 2010

I'm trying to concatenate a local phone number field. The LDAP system only has the last 5 digits, but for the directory database we need all 7 digits. I've tried every combination I can think of to get the concatenation to work, but every combination results in just the first two digits being imported, e.g.,

LOCAL_NUM "'20'||:local_num",

results in just 20 being imported. Every iteration I've tried that didn't result in an error imported only the 20 and ignored the ||. I've also tried calling the CONCAT directly, e.g.,

LOCAL_NUM "CONCAT('20', :local_num)",

The result is the same. The problem seems to be that the loader is ignoring the concatenation altogether. I've tried the statements outside of the loader via SQL*Plus with the expected result, so I'm confused as to why it's not working within the loader.
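For reference, a control-file sketch of the syntax that normally works (file, table and field names here are hypothetical). One thing worth checking is the declared field length: SQL*Loader sizes its buffer from the data-file field definition rather than from the SQL expression, so an explicit, generous CHAR length avoids the concatenated value being cut short:

LOAD DATA
INFILE 'phones.dat'                           -- hypothetical data file
APPEND INTO TABLE phone_directory             -- hypothetical target table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
  emp_id     CHAR(10),
  local_num  CHAR(20) "'20' || :local_num"    -- wide CHAR so the 7-digit result is not truncated
)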

View 2 Replies View Related

10g SQLLoader With Multibyte Characters

May 11, 2011

Here's an odd problem. I'm trying to load German characters positionally (not CSV) using Linux 10g. I don't get this error on Windows or via CSV, but I'm bound to the method and platform.

The problem, simplified, is this: I have 2 columns, the 1st varchar2(8) and the 2nd numeric(3). The error I'm getting is "invalid number", and only on rows with special characters. Let me demonstrate.

The file has been loaded into Linux and corrected using iconv.

[oracle@basic sqlldr]$ cat jh.txt
ELEKTROM001
ZEIPR�SI002

This is the loader control file

[oracle@basic sqlldr]$ cat jh.ctl
load data
characterset utf8
infile 'jh.txt'
replace into table TEMP1
(
FLD1 POSITION(1:8) CHAR,
FLD2 POSITION(9:11)
)

The 1st row is accepted, but the second fails in sqlldr with

Record 2: Rejected - Error on table TEMP1, column FLD2.
ORA-01722: invalid number

The logical assumption is that the double-width character is not being read properly by sqlldr, but I can find no advice on other settings.
My NLS parameters look like this.

PARAMETER VALUE
------------------------------ ------------------------------
NLS_LANGUAGE ENGLISH
NLS_TERRITORY UNITED KINGDOM
NLS_CURRENCY #
NLS_ISO_CURRENCY UNITED KINGDOM
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE ENGLISH
NLS_CHARACTERSET UTF8
NLS_SORT BINARY
NLS_TIME_FORMAT HH24.MI.SSXFF

NLS_TIMESTAMP_FORMAT DD-MON-RR HH24.MI.SSXFF
NLS_TIME_TZ_FORMAT HH24.MI.SSXFF TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH24.MI.SSXFF TZR
NLS_DUAL_CURRENCY ?
NLS_NCHAR_CHARACTERSET UTF8
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS CHAR
NLS_NCHAR_CONV_EXCP FALSE

I've tried other sqlldr options such as LENGTH SEMANTICS and BYTEORDER, but with no success.
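One detail that may matter here: the start and end values in POSITION are counted in bytes, so the two-byte UTF-8 umlaut in row 2 pushes everything one byte to the right and the last letter of the text slides into the numeric positions. Since these are German characters, one workaround is to keep the data file in a single-byte character set, so byte positions and character positions coincide, and let SQL*Loader convert to the database character set on load. A sketch (assumes the file is currently UTF-8, as the control file suggests):

iconv -f UTF-8 -t ISO-8859-1 jh.txt > jh_latin1.txt

load data
characterset we8iso8859p1     -- describes the data file; sqlldr converts to the DB charset
infile 'jh_latin1.txt'
replace into table TEMP1
(
FLD1 POSITION(1:8) CHAR,
FLD2 POSITION(9:11) INTEGER EXTERNAL
)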

View 1 Replies View Related

Server Utilities :: Return A Column Value Using Sqlloader After Loading

Dec 1, 2011

I have the following table intra_trades with t_id as the primary key. There is a trigger on that table that gets the next sequence value and inserts it into the t_id column for every insert. I need to load data into that table using SQL*Loader in chunks of 3000 rows and return the t_id values back to the script that runs SQL*Loader, so that it can use those t_ids for the next process in the script.

intra_trades
t_id NUMBER(15) pk
t_name VARCHAR2(30)
t_loc VARCHAR2(40)
t_start TIMESTAMP
t_end TIMESTAMP
[code]....

The problem is that the only unique key on that table is t_id, which is populated from a sequence and is the PK. There can be duplicate rows in the table to meet the business needs of the company, so it is hard to associate the rest of the data in a row with its t_id. The only thing I can think of is to return the t_ids in the order they were inserted, so that if the script keeps the order of rows in memory it can associate each t_id with the rest of the intra_trades info. How can I make SQL*Loader return an array of the t_ids it inserted? I need the t_ids in insertion order so that the script can associate each t_id with the rest of the data in its row.
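SQL*Loader itself cannot hand the generated keys back, so one workaround (a sketch, assuming a new column and a per-run tag can be added) is to stamp every row of a load with a batch identifier via a CONSTANT in the control file and have the script read the keys back afterwards. Because the trigger assigns t_id from a sequence, ordering by t_id approximates insertion order for a single-session conventional-path load:

-- in the control file (LOAD_BATCH is a hypothetical column added to intra_trades):
--   load_batch CONSTANT 'BATCH_20111201_01'

-- in the script, after sqlldr finishes:
SELECT t_id
FROM   intra_trades
WHERE  load_batch = 'BATCH_20111201_01'
ORDER  BY t_id;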

View 4 Replies View Related

Server Utilities :: Difference Between Sqlloader And External Tables?

Feb 9, 2011

I would like to know which of the above is faster for the same conditions.

i.e. If I am loading 1 million rows for the same conditions which will perform faster?

View 9 Replies View Related

Server Utilities :: Trim Date Characters Within SQL Loader

Sep 9, 2010

I am attempting to insert date data into a column using sqlldr... Here's the current format:

2010-03-01 00:20:19.277

So far, I haven't gotten anything to work. I would like to trim the .277 from the existing date. Here's my latest attempt:

birthdate DATE "to_date(substr(birthdate,1,19),'YYYY-MM-DD HH24:MI:SS')"
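Two details that may be in play (sketches; the field name is the one from the post): inside a SQL*Loader SQL string the field reference normally needs a leading colon (:birthdate), and the raw field should be declared wide enough to hold all 23 characters before the function is applied. Alternatively the value can be parsed as a timestamp, and the DATE column will drop the fraction on conversion:

birthdate CHAR(30) "to_date(substr(:birthdate,1,19),'YYYY-MM-DD HH24:MI:SS')"

birthdate TIMESTAMP "YYYY-MM-DD HH24:MI:SS.FF3"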

View 6 Replies View Related

Server Utilities :: SQL Loader Is Not Uploading Non-ASCII Characters?

Jan 10, 2012

I am using SQL*Loader to load data from a text file into the DB. Non-ASCII characters present in the text file are not uploaded correctly to the DB.

Sample Data

test data üindex

Data in DB

test data ?index
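Usually this comes down to SQL*Loader being told (via NLS_LANG) the wrong encoding for the input file, so the bytes are mis-converted on the way in. A sketch of declaring the file's encoding in the control file itself (WE8MSWIN1252, the table and the column are assumptions; the character set must match how the text file is actually encoded):

LOAD DATA
CHARACTERSET WE8MSWIN1252      -- encoding of the data file, not of the database
INFILE 'data.txt'
APPEND INTO TABLE target_table
FIELDS TERMINATED BY ','
(
  col1 CHAR(4000)
)

Also worth checking that the database character set can actually hold ü; if the database is US7ASCII, the character will be replaced no matter what the loader does.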

View 1 Replies View Related

Server Utilities :: Loader Loading Some Garbage Characters At The End Of The Data Field

Mar 25, 2011

After exporting some data to excel, I noticed that on one row all the columns shifted over some. So I queried this record in the database and noticed that the ADDRESS field has some unknown characters at the end of it. They are little squares. I think they are TABS.

2630 LINDEN BLVD, APT. #8G(2 squares are in here)

ADDRESS_1 "TRIM(:ADDRESS_1)",

Besides trimming the data, is there some other function I can use to clean up the address further?
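If they are tabs or other control characters, a regular expression strips them more reliably than TRIM; a sketch (assumes 10g or later for REGEXP_REPLACE inside the control-file SQL string):

ADDRESS_1 "TRIM(REGEXP_REPLACE(:ADDRESS_1, '[[:cntrl:]]', ''))",    -- removes tabs, CR/LF and other control characters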

View 9 Replies View Related

Forms :: Items On Tab Page Getting Invisible?

Oct 12, 2010

When I select a record from a LOV associated with a button on a tab page of a tabbed canvas, all the items on the tab page become invisible. But when I navigate to another tab page and then come back to the former tab page, all the items reappear.

What could possibly be the reason for this, and how can I stop the items from disappearing?

View 4 Replies View Related

Invisible Index Getting Accessed In Trace File 11g

Jun 16, 2011

I am working on tuning the performance of one of the concurrent requests in our 11i ERP system, which has database 11.1.0.7.

I enabled an oradebug trace for the request and generated tkprof output from it. For the query below, which is taking time, I found that in the generated trace the wait event is "db file sequential read" on the PO_LINES_N10 index, but in the tkprof output for the same query a full table scan of PO_LINES_ALL is happening, and that table is 600 MB in size.

Below is the query:
===============
UPDATE PO_LINES_ALL A
SET A.VENDOR_PRODUCT_NUM = (SELECT SUPPLIER_ITEM FROM APPS.IRPO_IN_BPAUPDATE_TMP C WHERE BATCH_ID = :B1 AND PROCESSED_FLAG = 'P' AND ACTION = 'UPDATE' AND C.LINE_ID =A.PO_LINE_ID AND ROWNUM = 1 AND SUPPLIER_ITEM IS NOT NULL),
LAST_UPDATE_DATE = SYSDATE
===============

Index PO_LINES_N10 is on the column LAST_UPDATE_DATE; logically, for such a query, the index should not have been used, as that indexed column is not in the SELECT or WHERE clause.

Also, why is there a discrepancy between the tkprof output and the raw trace for the same query?

So I decided to make the index PO_LINES_N10 invisible, but that index is still being accessed in the trace file.

I have also checked the parameter below, which is FALSE, so the optimizer should not make use of invisible indexes during query execution.

SQL> show parameter invisible

NAME                             TYPE     VALUE
-------------------------------- -------- -----
optimizer_use_invisible_indexes  boolean  FALSE
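One point that may explain this: making an index invisible only hides it from the optimizer as an access path; the index is still maintained by every DML, so an UPDATE that touches LAST_UPDATE_DATE still reads and writes PO_LINES_N10 blocks, which is what the "db file sequential read" waits in the raw trace are. A quick check, as a sketch:

-- confirm the index really is invisible
SELECT index_name, visibility
FROM   dba_indexes
WHERE  index_name = 'PO_LINES_N10';

-- the waits are index maintenance, not an access path; only making the index unusable
-- (or dropping it) stops that maintenance I/O; test carefully first
-- ALTER INDEX po_lines_n10 UNUSABLE;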

View 5 Replies View Related

SQLloader Database Saturation?

Jul 2, 2013

I have a situation I can't find an explanation for. Here it is:

I have a table, let's say:
table_a
-------------
name
last name
dept
status
notes

And this table has an insert trigger that does a lot of validation on the incoming data and changes the status field of the new record according to the results of the validation. Some of the validations are:

○ check for the name existing in a dictionary
○ check for the last name existing in a dictionary
○ check that fields (name, last name, dept) aren't already inserted in table_b... and so on

The thing is, if I do an insert on the table via a plain query, like

insert into table_a
(name,last_name,dept,status,notes)
values
('john','smith',1,0,'new');

it takes only 173 ms to run the whole validation, update the status field and insert the record into the table (the validation does all its searches via indexes). But if I try this via SQL*Loader, reading a file with 5000 records, it takes about 40 minutes to validate and insert 149 records (of course I killed it). I tried loading the data with the trigger disabled (to check speed) and it loads all the records in less than 10 seconds.

So my question is, what can I do to improve this process? My only theory is that I could be saturating the database because the load runs so fast and fires so many executions of the trigger, but I really don't know.

My objective is to load around 60 files with info and validate them through the process in the trigger (willing to try other options though).
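One option that usually scales better (a sketch; table and column names are the ones from the post, the dictionary table and status codes are assumed): load with the trigger disabled into a "pending" status, then validate the whole batch with one set-based statement per rule instead of once per row, for example:

-- after the load, flag rows whose name is not in the dictionary (NAME_DICTIONARY is assumed)
UPDATE table_a a
SET    a.status = 2,                          -- hypothetical "failed validation" status
       a.notes  = 'name not found'
WHERE  a.status = 0                           -- rows still pending validation
AND    NOT EXISTS (SELECT 1
                   FROM   name_dictionary d
                   WHERE  d.name = a.name);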

View 1 Replies View Related

SQL & PL/SQL :: Removing Special Characters And Get Desired Characters From Column Values

Jul 23, 2013

create table test
(
name varchar2(50),
descd varchar2(50)
)
insert into test values ('kethlin','da,dad!tyerx');
insert into test values ('tauwatson','#$dfegr');
insert into test values ('jennybrown','fsa!!trtw$ fda');
insert into test values ('tauwatson','#$dfegr ,try');

how do I get the first three characters and last three characters from name field and remove all the junk characters from descd field?

So my output would be like:

('ketlin','dadadtyerx')
('tauson','dfegr')
('jenown','fsatrtw fda')
('tauson','dfegr try')
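A sketch of one way to read that (assuming "junk" means anything other than letters, digits and spaces):

SELECT SUBSTR(name, 1, 3) || SUBSTR(name, -3)        AS short_name,
       REGEXP_REPLACE(descd, '[^[:alnum:] ]', '')    AS clean_descd
FROM   test;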

View 6 Replies View Related

Forms :: Detect Browser Name?

Dec 8, 2010

I need to detect the type of the user's web browser (Opera, IE, Chrome, Firefox) under which Forms is launched.

View 3 Replies View Related

PL/SQL :: Detect I/O Before Running Statement?

Apr 29, 2013

How do I detect the I/O a SQL statement will do before running it?
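If the goal is to see the expected I/O before executing, one common approach is to read the optimizer's estimates from EXPLAIN PLAN; a sketch (EMP is a placeholder table and statement):

EXPLAIN PLAN FOR
  SELECT * FROM emp WHERE deptno = 10;                          -- placeholder statement

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY(NULL, NULL, 'ALL'));     -- cost, bytes and I/O estimates per step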

View 3 Replies View Related

Forms :: How To Detect When ENTER Key Pressed

Mar 5, 2010

I have 2 text fields as search criteria and a push button to search. When the user enters a value in the first field (or in any text field) and presses the ENTER key, I want to execute the search.

I tried the KEY-NEXT-ITEM trigger, but it also fires when I press the TAB key.

How can I detect ENTER key?

View 8 Replies View Related

SQL & PL/SQL :: Detect And Fix Negative Time In Oracle

Jul 15, 2011

I found data in the DB with a negative time, as below:

SELECT c.time
FROM partition c
WHERE c.time < to_date('0001-01-01','YYYY-MM-DD');

DEACTIVATION_TIME
------------------------------------------------------------------------
17-JUN-08 04.06.22.893 PM

The data type for c.time is Timestamp.

When I use JDBC to retrieve the data and convert it to milliseconds, it also shows as a negative time. My question: the retrieved time doesn't display as a negative date, but obviously it is a BC date earlier than 0001-01-01.

1. How can we detect/show the date correctly for the BC date without checking c.time < to_date('0001-01-01','YYYY-MM-DD')? (I am wondering if there is a format configuration to show the time like 17-JUN-08 04.06.22.893 PM BC, or something else to show that the time is negative.)

2. How could we fix the negative time to make it positive?
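On point 1, the era can be made visible with the AD/BC (or signed-year SYYYY) format elements; a sketch using the query from the post:

SELECT TO_CHAR(c.time, 'DD-MON-SYYYY HH24.MI.SS.FF3 AD') AS time_with_era
FROM   partition c;

For point 2, once the intended AD value is known, fixing it is an ordinary UPDATE of the column.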

View 6 Replies View Related

SQL & PL/SQL :: Detect And Move Duplicate Values

Mar 16, 2010

The table contains duplicate data, and I have to move data to another table. Criteria: check for duplicate values; if duplicates exist, move all duplicates except one to the history table. While moving to the other table, check whether the record being moved already exists there.

SOURCE TABLE: ODS_OWNER
grp_id grp_name face_id address1 city zipcode

3456789 NIKE AERO 457899 707 CROFT GRAND RA 12345
1256789 NIKE AERO CORP 678899 707 CROFT SE GRAND RA 12345
5465455 BB SHIPPING 809708 201 SOUTH CT DESPLAINE 45434

[Code]....

The first 4 records are duplicates, from which 1 record goes to w_grp and one goes to the history table. Which record out of the duplicates goes into w_grp depends on the last modified date of each.

Distinct values go into the w_grp table.
Duplicates go into the match_his table.
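A common pattern for this (a sketch; the columns that define a "duplicate" and the last-modified column are assumed from the description, so the PARTITION BY list needs adjusting):

-- rank duplicates, newest first; rn = 1 is the row to keep, the rest go to history
INSERT INTO match_his (grp_id, grp_name, face_id, address1, city, zipcode)
SELECT grp_id, grp_name, face_id, address1, city, zipcode
FROM  (SELECT s.*,
              ROW_NUMBER() OVER (PARTITION BY grp_name, address1, city, zipcode   -- assumed duplicate key
                                 ORDER BY last_modified_date DESC) AS rn          -- assumed column
       FROM   ods_owner.src_table s)                                              -- assumed schema.table
WHERE  rn > 1;

The same inline view with rn = 1 feeds the insert into w_grp, with a NOT EXISTS check against the target to skip records that already exist there.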

View 3 Replies View Related

Windows :: Detect Oracle Version And Platform Installed?

Mar 14, 2012

Is there a way to check which versions of Oracle are installed on the machine from the command line?

The thing is, I don't want to log in to the database, so I can't use SQL*Plus either. Until now I ran tnsping and checked the output, but when several versions on different platforms are installed I get only the one version that comes first on the PATH environment variable.

View 1 Replies View Related

Server Utilities :: Finding Versions Of Exp And Imp Utilities Of Database Server?

Feb 23, 2012

How do I find the versions of the exp and imp utilities of a database server from the Windows command prompt?

Note: currently I have 10.2.0.10 Oracle software installed on my local machine.
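For what it's worth, running each utility with help=y normally prints its release banner without connecting to a database; a sketch:

exp help=y
imp help=y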

View 4 Replies View Related

Precompilers, OCI & OCCI :: How To Detect Offending Record For Update Statement

Jul 23, 2008

Is there a way to run an update statement through OCI and make it stop on the first offending record and then identify that record? Here is an example of what I need to do: I run an update with 5 records in the input buffer like this:

- the 1st record exists in the table
- the 2nd record exists in the table
- the 3rd record does not exist in the table
- the 4th record exists in the table but violates a unique constraint
- the 5th record exists in the table

I need to report that the 4th record from the input buffer has a problem, and then commit records 1, 2 and 3. Do not commit record 5. So far I have tried the following:

- OCIStmtExecute with OCI_DEFAULT. It stops on the first error, but I don't know how to identify the bad record. OCI_ATTR_ROW_COUNT returns the number of affected records in the target table. I didn't find any other attributes that would return 4, for the 4th bad record in the input buffer.

- OCIStmtExecute with OCI_BATCH_ERROR. I can identify the bad records in the input buffer by using OCI_ATTR_DML_ROW_OFFSET. However, OCIStmtExecute does NOT stop on the first violating record. It loads the whole input buffer and then reports which records were in error. I don't know how to commit only up to the first bad record -- the 4th in the example.

View 6 Replies View Related

Application Express :: Possible To Detect Status Of Hide / Show Region

Nov 11, 2012

Is it possible to detect the status of Hide / Show Region.

I have three hide/show regions: RX, RY, RZ. When an event EX happens, region RX should expand and the rest of the regions should collapse. When an event EY happens, region RY should expand and the rest of the regions should collapse.
........
...
We can use this JS on DA to change the status of a region:

$("RX.uRegionControl").click();
$("RY.uRegionControl").click();

But first we need to detect the status to know where to apply that DA. I am working on APEX 4.2, theme 25.

View 2 Replies View Related

Server Administration :: Prevent Bilingual Characters In Database?

Jan 22, 2010

We are using Oracle 10g. Currently my character set is WE8MSWIN1252. At present, somehow, we have non-English characters (for example Spanish) in the database. We want to stop these kinds of characters from being entered. Can we prevent this by changing the current character set?
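Changing the character set is not really the tool for this; if the aim is simply to reject accented input on particular columns, a check constraint is one option (a sketch; table and column names are placeholders, and the allowed range here is printable ASCII only):

ALTER TABLE customers
  ADD CONSTRAINT chk_cust_name_ascii
  CHECK (REGEXP_LIKE(cust_name, '^[ -~]*$'));    -- space (0x20) through tilde (0x7E)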

View 2 Replies View Related

Server Utilities :: Do EXPDP Utilities Does Backup At Block Level As What RMAN Is Doing

May 29, 2013

I have one doubt about expdp and RMAN. Does the expdp utility do its backup at block level the way RMAN does? Which one is faster, expdp or RMAN?

View 16 Replies View Related

Server Administration :: Unable To See / Insert Chinese Characters In Oracle Database

Jun 14, 2013

We are having a problem with the Chinese character set. My current character set is as follows.

PARAMETER VALUE
---------------------------------------------------------------- ----------------------------------------------------------------
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
[code]....

My column description for the table product is as follows.

PART_NBR   VARCHAR2(30 BYTE)
PART_DESC  NVARCHAR2(2000)

When trying to insert a Chinese character using the insert command below:

insert into product(part_nbr,part_desc,cust_name) values('322341',unistr('功'),'test');

I am getting the following value when selecting the same record using the select command:

select a.part_nbr,a.part_desc,a.cust_name from product a where a.part_nbr='322341';

322341  ¿  test

When I run this command in TOAD:

select a.rowid,a.part_nbr,a.part_desc,a.cust_name from product a where a.part_nbr='322341'

and manually edit/insert the '功' character in the output from the select command above, then I am able to get the same Chinese character the next time I run the select.
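Two things are usually behind the '¿' (a sketch, assuming the character is being mangled between client and server rather than in storage): the client's NLS_LANG has to describe what the client actually sends, and for an NVARCHAR2 column the value can be supplied by code point so that no client-side conversion is involved at all:

-- U+529F is the code point for 功; the escape form avoids any client character conversion
INSERT INTO product (part_nbr, part_desc, cust_name)
VALUES ('322342', UNISTR('\529F'), 'test');

Whether the character then displays correctly on SELECT still depends on the querying tool's own encoding settings.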

View 7 Replies View Related

Server Utilities :: Intermediate COMMITS When Using Sqlldr Or Imp Utilities

Oct 29, 2013

I want to load lakhs of records into a table. My problem is that after loading about a quarter of the records, my process abends due to the size of my rollback segment area. I don't have the option to increase it. So, is there any way to get intermediate commits when I am using the imp or sqlldr utilities, so the entire data set loads without abending?
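Both utilities have knobs for this (a sketch; credentials, file names and sizes are placeholders to be tuned): conventional-path sqlldr commits once per bind array, sized by ROWS/BINDSIZE, and imp can commit after every buffer with COMMIT=Y:

sqlldr userid=scott/tiger control=load.ctl rows=5000 bindsize=1048576                  -- commit roughly every 5000 rows
imp userid=scott/tiger file=exp.dmp fromuser=src touser=dest commit=y buffer=1048576   -- commit per buffer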

View 2 Replies View Related

Server Utilities :: Netca And Netmgr Utilities?

Feb 20, 2012

I am familiar with the netca tool. However, there is one more utility with the same functionality, which is netmgr. I checked with many DBAs for the exact difference, but I did not get a good answer from them. I have also checked Google but did not find the exact difference. Please list the exact difference between those two tools (netca, netmgr).

View 1 Replies View Related

Server Utilities :: How To Execute SQLLDR / Where Data File Reside In Another Server

Mar 24, 2010

How do I execute SQL*Loader when the data file resides on another server?

View 1 Replies View Related

Server Utilities :: Exported Data Of One User Importing Into Another Schema At Another Server

Sep 15, 2010

I have exported data of one user and am importing it into another schema on another server. When I try to import, it works fine for quite a number of tables, but after some time it starts giving me the error mentioned below...

IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<‏ے
IMP-00008: unrecognized statement in the export file:
+A
IMP-00008: unrecognized statement in the export file:
[code]...

View 6 Replies View Related

Server Utilities :: How To Access Local File Residing At Client From DB Server

Nov 23, 2011

I have a requirement to read a flat text file (around 15000 lines) residing at a client location from the DB server and write it into one cell of a table.

I tried UTL_FILE and DBMS_LOB, but I am not able to access the client location to read the file, as both read the path from an Oracle directory object.

e.g.
My client path is 198.168.1.1 and my DB server is on Unix, say 192.168.1.10.
The file location is: \\192.168.1.1\share\abc.txt
So I created one Oracle directory, MY_DIR, having DIRECTORY_PATH '\\192.168.1.1\share'.
But neither UTL_FILE nor DBMS_LOB is able to access the file.

Error Message:
-------------
Unable to process CLOB -22288 ~ ORA-22288: file or LOB operation FILEOPEN
failed
No such file or directory

Few Details for reference:
-------------------------
File location: \\192.168.1.1\share\abc.txt
Unix DB Server location: 192.168.1.10
Table : Test (filename varchar2(30), Content CLOB)
Oracle Dir: MYDIR
Directory_Path: \\192.168.1.1\share
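For reference, a directory object can only point at a path the database server's own OS can see, so the client share has to be mounted (NFS/SMB) on the Unix box first; the mount point below is an assumption. A sketch of loading the file into the CLOB column of the Test table once that is in place:

CREATE OR REPLACE DIRECTORY my_dir AS '/mnt/share';    -- server-side mount of \\192.168.1.1\share (assumed path)

DECLARE
  l_bfile BFILE := BFILENAME('MY_DIR', 'abc.txt');
  l_clob  CLOB;
  l_dest  INTEGER := 1;
  l_src   INTEGER := 1;
  l_lang  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn  INTEGER;
BEGIN
  -- create the row with an empty CLOB and keep a locator to it
  INSERT INTO test (filename, content) VALUES ('abc.txt', EMPTY_CLOB())
  RETURNING content INTO l_clob;

  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(l_clob, l_bfile, DBMS_LOB.LOBMAXSIZE,
                            l_dest, l_src, DBMS_LOB.DEFAULT_CSID, l_lang, l_warn);
  DBMS_LOB.FILECLOSE(l_bfile);
  COMMIT;
END;
/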

View 7 Replies View Related

Server Utilities :: Difference In Character Sets Of Server And Client?

Mar 16, 2011

I have a question regarding a difference in character sets when taking an export (logical backup) of the database directly on the server (Linux RHEL 2.1 AS) versus on a client (a Windows XP Professional machine where only an Oracle 9i client is installed). On the server it seems fine, but on the client node I'm getting the following warning for almost all tables.

EXP-00091: Exporting questionable statistics.

My question is :

[1] Will it create any sort of problem if, later on, I import the data that was taken from the client node?

[2] Why is there a (marginal) difference in the dump (.dmp) file size?

[3] Is there any way to overcome it, or is it just natural behaviour and not a problem?

[4] If I'm using a LONG or BLOB datatype in some of my tables, will they have any problem if I proceed as above?

Additional information about the character sets. On the server node:

Export done in US7ASCII character set and AL16UTF16 NCHAR character set server uses WE8ISO8859P1 character set (possible charset conversion)

On client node :

Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set server uses US7ASCII character set (possible charset conversion)
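The EXP-00091 warning itself only concerns the optimizer statistics captured in the dump, not the row data; a commonly used way to avoid it (a sketch; credentials and file name are placeholders) is to skip exporting statistics and gather them again after the import:

exp scott/tiger@orcl file=scott.dmp owner=scott statistics=none

Setting NLS_LANG on the Windows client to match the database character set also removes the "possible charset conversion" note and tends to make the two dump sizes comparable.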

View 7 Replies View Related






