Globalization :: Bad Character Set In Dump File?

Jul 23, 2013

Importing (via impdp) a dump file that someone handed over to me into Oracle XE results in special characters, e.g. umlauts, being mangled.

In a hex editor, the dump file shows (a) the token WE8MSWIN1252 near the beginning, but (b) umlauts evidently encoded in DOS code page 850: for example, "König" is encoded as 4b 94(!) 6e 69 67, where 0x94 is "ö" in CP850 rather than the 0xF6 that WE8MSWIN1252 would use. Does this prove that the dump file is badly formed, and that I have to resign myself to the complicated approach mentioned at the end of [URL]...
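If the bytes really are CP850 but were loaded as if they were WE8MSWIN1252, one possible after-the-fact repair is to reinterpret the stored bytes. A minimal sketch, assuming the data landed in a WE8MSWIN1252 XE database (table and column names are hypothetical):

-- Treat the stored bytes as CP850 (WE8PC850) and convert them to the
-- database character set; verify a few rows by hand before committing.
UPDATE customers
   SET name = CONVERT(name, 'WE8MSWIN1252', 'WE8PC850');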

View 4 Replies


Globalization :: Character Encoding - LATIN-1 Character In UTF-8 DB?

Feb 9, 2013

I am using the C++ OCI library to insert report data from a remote OCI client into an Oracle 11 server. This data is read by another process to create the report. The DB character set is UTF-8, but the report tool expects the data to be ISO-8859-1 encoded, so while inserting the data I specify the following NLS_LANG and character set ID for my table column on the client:

The target DB charset is UTF-8.
NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1

ub2 csid = 871; /* 871 = Oracle character set ID for UTF8; OCI_ATTR_CHARSET_ID expects a ub2 */
OCIAttrSet((void *) bnd1p, (ub4) OCI_HTYPE_BIND,
           (void *) &csid,
           (ub4) 0,
           (ub4) OCI_ATTR_CHARSET_ID, errhp);

This solution works for almost all ASCII and extended-ASCII characters, but we face issues with a few specific ones. If we try to insert a single beta character [β] through the client, the data arrives empty in the column.

Beta character details:
DEC   OCT   HEX   BIN        Symbol   Description
223   337   DF    11011111   ß        Latin small letter sharp s (ess-zed)

DB output after inserting a single β:
select rawtohex(NAME) from PERSONS where EID=333;

RAWTOHEX(NAME)
---------------------------

But if the string is "ββ", everything works fine:
DB output for "ββ":
select rawtohex(NAME) from PERSONS where EID=333;

RAWTOHEX(NAME)
---------------------------
DFDF
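A useful first step in cases like this is to look at the bytes the server actually stored, rather than what the client renders. A minimal probe, reusing the table from the example above:

-- 1016 = hexadecimal representation plus the character set of the value
SELECT DUMP(NAME, 1016) FROM PERSONS WHERE EID = 333;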

View 6 Replies View Related

Globalization :: Database Character Set

Jun 6, 2012

We have a production 10g DB with character set US7ASCII. This DB stores both Arabic and English data. The production DB is located on an HP-UX system.

When I query data from the DB through SQL Developer, the data is shown as junk or unknown characters (square boxes).

Client settings (the Windows XP workstation from which the query is issued via SQL Developer): NLS_LANG = AMERICAN_AMERICA.US7ASCII

The Oracle 10g client is installed on the workstation, and from there I query the data through SQL Developer. The problem is that I am unable to see Arabic characters; they are displayed as junk. English characters and numeric values, however, are displayed properly.

To make sure the data is not corrupted, I converted the "Name" column to its hex value (RAWTOHEX) and then executed the query below in a UTF-8 DB:

select UTL_I18N.RAW_TO_CHAR(hex_value_of_name) from dual;

This displayed the Arabic name properly in the UTF-8 DB.
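A more explicit variant of that check names the presumed source encoding. Here AR8MSWIN1256 (Windows-1256 Arabic) and the byte string are illustrative assumptions, not taken from the poster's data:

-- Run in the UTF-8 DB: reinterpret the raw bytes as Windows-1256 Arabic.
-- E3CDE3CF is an illustrative CP1256 byte sequence (a common Arabic name).
SELECT UTL_I18N.RAW_TO_CHAR(HEXTORAW('E3CDE3CF'), 'AR8MSWIN1256') FROM dual;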

The character set of this production DB cannot be changed at this time; many applications are based on it, and all of them are capable of converting the junk data back to Arabic for display.

My concern is: what do I need to do to view the Arabic data properly through SQL Developer? Are there any settings that need to be changed on my client workstation?

View 15 Replies View Related

Globalization :: Character Set For Local Language

Dec 2, 2012

I am using an Oracle 10g database on Windows XP. I have a backup that contains data in a local language (Marathi), and I want to read this data in Oracle itself. Which character set do I need to choose?
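Marathi uses the Devanagari script, which generally requires a Unicode database character set such as AL32UTF8. A quick way to see what the current database uses:

SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');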

View 6 Replies View Related

Globalization :: SQL Loader Loads Special Character Incorrectly

May 31, 2013

We are trying to load some xml files via

sqlldr user/pswd@xe control='C:\xmlldr.ctl' data='listoffiles.dat'

where xmlldr.ctl is the following:

load data
CHARACTERSET WE8ISO8859P1
replace
into table LOADER_CTG_PROTOCOL
(
  xmlfile    lobfile(nct_code) terminated by eof,
  nct_code   char(15) "substr(:nct_code,1,11)",
  created_dt sysdate
)

I was playing with different CHARACTERSET values, but some special characters, e.g. "greater than or equal", do not get loaded/displayed correctly in the database. I also tried changing the NLS_LANG registry key and following some advice in the Oracle doc:

[URL].......
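One likely culprit: "greater than or equal" (U+2265) simply does not exist in WE8ISO8859P1, so no CHARACTERSET clause can make an ISO-8859-1 load preserve it; the files would have to be loaded with a Unicode character set such as UTF8 or AL32UTF8. A hedged check that the database side can represent the character at all:

-- UNISTR builds the character in the national character set, so this shows
-- whether storage (as opposed to the load path) is the limiting factor.
SELECT UNISTR('\2265') FROM dual;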

View 1 Replies View Related

Globalization :: Convert Chinese Character From US7ASCII To AL32UTF8?

Sep 17, 2012

Source Database: AMERICAN_AMERICA.US7ASCII
Target Database: AMERICAN_AMERICA.AL32UTF8

In the source database, the Chinese characters are stored in some schema tables. The csscan result reports convertible, truncation, and lossy data. So I tried to use exp/imp for the conversion; however, all Chinese characters came out invalid and can no longer be read. How can I convert them from US7ASCII to a UTF8 database?

I have also tried building another database with AMERICAN_AMERICA.ZHT16MSWIN950 and using exp/imp for the conversion again; from there, the Chinese characters are readable in the AL32UTF8 database.

- source database (US7ASCII)
export NLS_LANG=AMERICAN_AMERICA.US7ASCII
export LANG=AMERICAN_AMERICA.US7ASCII
exp userid='/ as sysdba' file=export.dmp full=y

- target database (AL32UTF8)
export NLS_LANG=AMERICAN_AMERICA.US7ASCII
export LANG=AMERICAN_AMERICA.US7ASCII
imp userid='/ as sysdba' file=export.dmp full=y ignore=y

Result:
from US7ASCII to AL32UTF8: the Chinese characters cannot be read
from US7ASCII to ZHT16MSWIN950: the Chinese characters cannot be read
from ZHT16MSWIN950 to AL32UTF8: the Chinese characters can be read

How can I convert the Chinese characters from the US7ASCII database to the UTF8 database?
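The pattern of results suggests the "US7ASCII" data is really ZHT16MSWIN950 (Big5) bytes stored pass-through, so any conversion that treats them as 7-bit ASCII destroys them. A hedged check that can be run in the AL32UTF8 database (the hex value is an illustrative Big5 byte pair, not the poster's data):

-- If this renders a readable Chinese character, then RAW_TO_CHAR with the
-- true source character set is the right conversion path for the raw bytes.
SELECT UTL_I18N.RAW_TO_CHAR(HEXTORAW('A4A4'), 'ZHT16MSWIN950') FROM dual;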

View 6 Replies View Related

Globalization :: Non-English Character In Oracle 10g Express Edition

Oct 5, 2012

I have a table named INSTITUTION. It has a NUMBER column INS_ID and an NVARCHAR2(50) column INS_NAME. INS_NAME can contain Turkish characters such as "ğ, ü, ş, ç, ö". According to the business logic, there cannot be a duplicate INS_NAME. The user enters the institution name in a textbox in ASP.NET, and I check the name against the database from C# code; if there is no duplicate, we add the record.

The problem: when a user enters an institution name that contains Turkish characters, the duplicate check fails. If there is an institution named "su işleri", both queries SELECT * FROM INSTITUTION WHERE INS_NAME = 'su işleri'; and SELECT * FROM INSTITUTION WHERE INS_NAME = 'su isleri'; return no result, even though the row is there. If the institution name contains no Turkish characters ("oracle corporation"), the query succeeds. I have the same problem in Toad for Oracle 11.5.1.2: when I run SELECT * FROM INSTITUTION, the phrase "su işleri" appears, but SELECT * FROM INSTITUTION WHERE INS_NAME = 'su işleri'; again returns no result. And when I connect to the database directly and run SELECT * FROM INSTITUTION, the phrase appears as "su isleri" (not "su işleri").

Here are the language settings of the database:

National Language Support
National Language Parameter    Value
NLS_CALENDAR                   GREGORIAN
NLS_CHARACTERSET               WE8MSWIN1252
NLS_COMP                       BINARY
NLS_CURRENCY                   TL
NLS_DATE_FORMAT                DD/MM/RRRR
NLS_DATE_LANGUAGE              TURKISH
NLS_DUAL_CURRENCY              YTL
[code]....
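Worth noting: WE8MSWIN1252 cannot represent "ş" or "ğ" at all, which is why the directly connected session shows "su isleri". With NVARCHAR2 columns the data itself can survive, but the comparison needs a national-character literal; a minimal sketch (whether the literal reaches the server intact still depends on the client character set and driver):

-- The N prefix makes the literal NVARCHAR2, so it is not squeezed through
-- the WE8MSWIN1252 database character set before the comparison.
SELECT * FROM INSTITUTION WHERE INS_NAME = N'su işleri';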

View 8 Replies View Related

Globalization :: Junk Character Insertion In Oracle Table

Feb 6, 2013

How do I avoid junk characters being inserted into an Oracle table? I have prepared scripts containing text like, say:

customer - info

After running the script, the data appears like this in production:

Customer ¿ info

We use the command prompt for script execution in the production environment; I use PL/SQL Developer and SQL Developer for development. I cannot see the junk data in PL/SQL Developer or in the latest SQL Developer, but it shows up in an old version of SQL Developer, and I can also see the junk data in the application.
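The "¿" is typically Oracle's default replacement character, written when a client byte cannot be converted to the database character set (commonly an NLS_LANG mismatch in the command-prompt session that runs the scripts). A hedged way to count affected rows (table and column names are placeholders):

-- CHR(191) is the inverted question mark in WE8MSWIN1252-style character sets.
SELECT COUNT(*) FROM customer_notes WHERE note_text LIKE '%' || CHR(191) || '%';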

View 6 Replies View Related

Globalization :: Forms / Reports Chinese Character Support?

Jul 4, 2013

We have an existing DB (10.2.0.4.0) and Forms (11.1.2.1.0) application that we're trying to extend to support Chinese characters. We're looking to add some Unicode (NVARCHAR2) columns to existing tables, rather than converting the whole DB character set. I've pasted my environment settings below. What I've found so far in trying to create a local test form (i.e. running the form in Builder with a local WebLogic running) is that I can insert the characters OK (using PL/SQL Developer) and the test form can display them correctly, but it cannot write them back to the database: they appear as upside-down question marks in any records the form has created.

Env variables: NLS_LANG = .UTF8
Database: NLS_CHARACTERSET = WE8MSWIN1252, NLS_NCHAR_CHARACTERSET = AL16UTF16

1) How do I get the form to write the characters back into the database correctly?

2) The Chinese characters will only be relevant to a few forms inside the app; are there any settings local to the form that would enable Unicode support, rather than setting it at OS level, i.e. an ALTER SESSION or equivalent?

3) Oracle Reports doesn't appear to have an NCHAR datatype, unlike Forms; is there any way to get Reports (generating PDFs) to include Chinese?

I now have the characters writing back to the DB OK. If you do it via an INSERT statement from inside the form, it doesn't work: the value appears to be sent to the DB in the normal character set rather than the national character set, and it's written as a question mark. If you pass the value from the form into a back-end stored procedure (which does the insert), it works.
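A minimal sketch of the workaround the poster describes: a back-end procedure with an NVARCHAR2 parameter, so the value stays in the national character set end to end (table and names are placeholders):

CREATE OR REPLACE PROCEDURE insert_cn_name (p_id IN NUMBER, p_name IN NVARCHAR2) AS
BEGIN
  INSERT INTO translations (id, name_cn) VALUES (p_id, p_name);
END;
/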

View 3 Replies View Related

Globalization :: Migrating Character Data Using A Full Export And Import

Jul 23, 2013

I have a database on my local machine that doesn't support Turkish characters. Its NLS_CHARACTERSET is WE8ISO8859P1; it must be changed to WE8ISO8859P9, which supports the full set of Turkish characters. I would like to migrate the character data using a full export and import, and my strategy is as follows:

1. Create a full export to a location on the network.

2. Create a new database on the local machine whose NLS_CHARACTERSET is WE8ISO8859P9 (I would also like to change NLS_LANGUAGE and NLS_TERRITORY along the way).

3. Perform a full import into the newly created database.

I've implemented the first step, but I am stuck on the second. I created the new database using the Toad editor (Create -> New Database), but I cannot connect to it, and I must be able to connect to the new database in order to perform the full import.

Details: NLS_LANGUAGE AMERICAN, NLS_TERRITORY AMERICA, NLS_CURRENCY ...
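For step 2, the part that actually fixes the character set is the CHARACTER SET clause of CREATE DATABASE. A minimal sketch, to be run against an instance started NOMOUNT; all paths, sizes and passwords are placeholders:

CREATE DATABASE newdb
   USER SYS IDENTIFIED BY change_me
   USER SYSTEM IDENTIFIED BY change_me
   CHARACTER SET WE8ISO8859P9
   NATIONAL CHARACTER SET AL16UTF16
   LOGFILE GROUP 1 ('/u01/oradata/newdb/redo01.log') SIZE 50M,
           GROUP 2 ('/u01/oradata/newdb/redo02.log') SIZE 50M
   DATAFILE '/u01/oradata/newdb/system01.dbf' SIZE 500M
   SYSAUX DATAFILE '/u01/oradata/newdb/sysaux01.dbf' SIZE 300M
   DEFAULT TEMPORARY TABLESPACE temp TEMPFILE '/u01/oradata/newdb/temp01.dbf' SIZE 100M
   UNDO TABLESPACE undotbs1 DATAFILE '/u01/oradata/newdb/undo01.dbf' SIZE 200M;

(Tools like DBCA generate essentially the same statement; the character set choices are the point of interest here.)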

View 8 Replies View Related

Globalization :: Spool Brazilian Characters To A File

Jun 6, 2013

Oracle version: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production, running on CentOS Linux release 6.0 (Final), kernel 2.6.32-71.29.1.el6.x86_64.

I am having a hard time spooling to a file and displaying special Brazilian characters, even though I can see them correctly in SQL Developer:
LEOPOLDO COUTO DE MAGALHÃES JÚNIOR

Spool:
LEOPOLDO COUTO DE MAGALH?ES JUNIOR

I've tried changing NLS_LANG at the session level, but that cannot be done. I don't want to change the default language of my DB, but I really need these characters to display correctly in the file.
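That matches how NLS_LANG works: it is a client environment variable read at process startup, not a session parameter, so it has to be set in the shell before SQL*Plus is launched; the database character set is untouched. A minimal sketch, assuming a UTF-8-capable terminal (script name is a placeholder):

export NLS_LANG="BRAZILIAN PORTUGUESE_BRAZIL.AL32UTF8"
sqlplus user/password@db @spool_report.sql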

View 16 Replies View Related

Backup & Recovery :: ORACLE Dump File To Text File

Apr 9, 2011

Oracle 10g to Sybase 12.5 migration: how can an Oracle dump file be converted to a text/xls file that can later be loaded into the Sybase database through BCP?

That is:
1. Export objects as a dump file from Oracle.
2. Is there any tool/process available that can convert this into a csv/txt/xls file?
3. These files can then be loaded into Sybase.
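There is no supported converter from the proprietary dump format to CSV; the usual alternative is to spool delimited text straight from the source tables. A minimal SQL*Plus sketch (table and column names are placeholders):

SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON
SPOOL /tmp/customers.csv
SELECT cust_id || ',' || cust_name || ',' || TO_CHAR(created_dt, 'YYYY-MM-DD')
  FROM customers;
SPOOL OFF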

View 2 Replies View Related

Import Database Dump File?

May 28, 2011

I imported a database dump file; the source database had a user with username/password, say, abc/xyz. My database username/password is system/vinod. After importing, I was unable to log in with system/vinod (error: invalid username/password), and one strange thing happened: a user had been created with abc/xyz as username/password. Only after altering my user was I able to log in with my original credentials.
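This is consistent with a full import, which recreates user accounts from the source dump and, depending on the tool and options, can end up replacing existing passwords (including SYSTEM's) with the source values. The recovery the poster describes boils down to (run as SYSDBA):

-- restore the local SYSTEM password after the full import overwrote it
ALTER USER system IDENTIFIED BY vinod;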

View 1 Replies View Related

Import A Dump File (oracle 10g)

Mar 14, 2012

My problem is that whenever I import a dump file (Oracle 10g), Oracle imports just 4 tables and then hangs (not responding). I'm using the old import method (not Data Pump).

View 1 Replies View Related

Import Only 1 Table From Full EXP Dump File?

Jan 14, 2011

We have an old full-export .dmp file from a 10g DB, and there are 451 records in one specific table that we need. Is it possible to IMP just that one table from a full dump? Or, as another option, can we extract the records of that one table in the .dmp file into an XML file?
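Importing a single table from a full original-export dump is supported through the FROMUSER/TABLES parameters. A hedged sketch (schema, table and credentials are placeholders):

imp system/password file=full_export.dmp fromuser=APP_OWNER touser=APP_OWNER tables=TARGET_TABLE ignore=y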

View 14 Replies View Related

Server Utilities :: Dump File Determination

Mar 29, 2013

Is it possible to determine whether a dump file was created using Data Pump export or the normal export method just by looking at the file? If yes, how?

The reason I ask: normal export and Data Pump export both create dump files with the same extension, filename.dmp, so to avoid confusion during import I want to determine which method created the file.

This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know which method created the dump file).
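From 10g onwards there is also a programmatic answer: DBMS_DATAPUMP.GET_DUMPFILE_INFO reports a file type for any dump it can parse. A minimal sketch (directory object and file name are placeholders):

SET SERVEROUTPUT ON
DECLARE
  l_info     ku$_dumpfile_info;
  l_filetype NUMBER;
BEGIN
  DBMS_DATAPUMP.GET_DUMPFILE_INFO(
    filename   => 'export.dmp',
    directory  => 'DATA_PUMP_DIR',
    info_table => l_info,
    filetype   => l_filetype);
  -- 1 = Data Pump dump file, 2 = original export dump file, 0 = unknown
  DBMS_OUTPUT.PUT_LINE('filetype = ' || l_filetype);
END;
/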

View 23 Replies View Related

Server Utilities :: FTP Dump File Over Network

Apr 19, 2010

I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server, and I'm trying to move the dump file to a directory on a Windows Server 2003 machine over the network, using an FTP script that runs automatically after the export process finishes.
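A minimal sketch of such a script (host, credentials and paths are placeholders); binary mode matters, since FTP ASCII mode corrupts dump files:

# run this after expdp has completed successfully
ftp -n winserver2003 <<EOF
user backupuser secret_password
binary
put /bak/daily_export.dmp daily_export.dmp
bye
EOF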

View 9 Replies View Related

Server Utilities :: Dump File Generation

Jan 18, 2012

I have a question on export dump file generation.

select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';

The above query reports the schema size as 15 GB, but when I export that same schema, the generated dump file is 2 GB. What is the difference between the two scenarios; how can there be such a variation in size?
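Part of the gap is usually structural: index segments are exported as DDL only (no data is written for them), and dba_segments counts allocated space, including empty blocks below the high-water mark. A breakdown shows how much of the 15 GB is actually table data:

SELECT segment_type, ROUND(SUM(bytes)/1024/1024/1024, 2) AS gb
  FROM dba_segments
 WHERE owner = 'JACK'
 GROUP BY segment_type
 ORDER BY gb DESC;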

View 6 Replies View Related

Server Utilities :: How To Reimport 9i Dump File To 10g

May 29, 2010

I am in the process of upgrading our 9i DB to 10g. As they are on different servers, I have installed 10g on the new server and applied the latest patchset, 10.2.0.4.

I am creating the production database and importing the 9i dump file into it. Then I will be testing the whole application that uses this database. After a week, I need to take the latest 9i dump and bring it into the new 10g DB.

Do I just need to import the latest 9i dump into the 10g DB, or do I need to do anything else?

View 3 Replies View Related

Server Utilities :: Import Dump File In 11g

Feb 24, 2012

I am facing a problem importing a DMP file into 11g. While importing, it gives me an error and stops responding; I have attached a JPG file to show what goes wrong during the import. My dump is from 9i, and I want to import it into 11g R2.

View 4 Replies View Related

Server Utilities :: Export Dump File

Jul 29, 2011

Is it possible to identify the level of an export by looking at the export dump file, i.e. whether it is a schema export, full export, table export, etc.?

If yes, how?

View 3 Replies View Related

Importing From A Dump File - Incompatible Version Number

Sep 3, 2012

I am trying to import from a dump (.dmp) file. I'm running Oracle Database 11g Enterprise Edition Release 11.1.0.6.0.

The import statement I'm using is:

impdp system/password@orcl full=Y DIRECTORY=data_pump_dir dumpfile=mydmpfile.dmp logfile=min.log

and the error I'm getting is "incompatible version number 3.1 in the dump file mydmpfile.dmp".

The dump file was exported using Oracle 11.2.0.2.0. I tried downloading and unzipping the client version of Instant Client 11.2.0.2, adding it to the PATH variable in Windows, and re-running the script, but it didn't work.

How should I go about importing this dump file without reinstalling the whole database?
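Dump file version 3.1 is the Data Pump format written by the 11.2.0.2 source, and the 11.1.0.6 server cannot read it no matter which client is on the PATH; it is the server, not the client, that parses the file. If the 11.2.0.2 source is still reachable, a hedged fix is to re-export with a downlevel VERSION parameter:

expdp system/password@source full=y directory=data_pump_dir dumpfile=mydmpfile_v111.dmp version=11.1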

View 3 Replies View Related

Create Flatfile From Dump File Created By Expdp?

Mar 23, 2013

Can we create a flat file from a dump file created by expdp?
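Not directly: expdp dump files are proprietary, and the closest built-in option extracts DDL only (no row data) via impdp's SQLFILE parameter, which writes the SQL that would be executed without importing anything. A hedged sketch (names are placeholders):

impdp system/password directory=data_pump_dir dumpfile=export.dmp sqlfile=extracted_ddl.sql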

View 2 Replies View Related

Server Utilities :: Import Dump File Without 2 Tables

Jan 3, 2012

I want to import a dump file, but without 2 of its tables. The dump file contains 100 tables plus indexes and constraints; out of the 100 tables, I want to import 98.
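If the dump was created with expdp, Data Pump's EXCLUDE parameter handles this directly; the original imp utility has no exclude option, so there the alternative is listing the 98 wanted tables in TABLES=. A hedged sketch (table names are placeholders; the quoting shown suits a Unix shell or a parameter file):

impdp username/password directory=data_pump_dir dumpfile=export.dmp full=y exclude=TABLE:"IN ('TABLE_A','TABLE_B')"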

View 13 Replies View Related

Server Utilities :: IMPDP (ORA-31619 Invalid Dump File)

May 29, 2012

I need to recreate/clone my database on a new machine. The two machines are not connected over the network.

Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp

Step 2.
FTP dump files to Windows

Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y

I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"

Done in AIX:
create directory dp as '/bak';
grant read, write on directory dp to public;
grant exp_full_database to username;

Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;

View 8 Replies View Related

Server Utilities :: Error While Importing Dump File In Oracle 10g R1

Nov 1, 2012

While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - Import utility version cannot be more recent than the Data Pump server. The following is the version information of the source and target DBs and the utilities:

Source DB server : 10.1.0.2.0
Export utility : 10.1.0.2.0
Import utility : 10.1.0.2.0

Target DB server : 10.1.0.2.0
Export utility : 10.2.0.1.0
Import utility : 10.2.0.1.0
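UDI-00018 is raised by the newer client itself: here a 10.2.0.1 impdp binary is being pointed at a 10.1.0.2 server. A hedged fix is to invoke the Data Pump utilities from the target's own 10.1 Oracle home instead (paths and names are placeholders):

$ORACLE_HOME/bin/impdp system/password@target10g schemas=APP_SCHEMA directory=data_pump_dir dumpfile=app_schema.dmp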

View 5 Replies View Related

Server Utilities :: What Kind Of Backup By Looking At Export Dump File

Jan 2, 2012

Is there a way to know what kind of backup (table/tablespace/full/schema) an export dump file contains just by looking at the file? If yes, what is the command?

View 1 Replies View Related

Backup & Recovery :: Remap Schema From Dump File To New User?

Jan 18, 2012

I am trying to remap a schema from a dump file to a new user, but my import fails with an error.

-----------------------------------
Using username "oracle".
Last login: Tue Jan 17 17:41:30 2012 from 192.168.100.11
[oracle@cvs ~]$ dbstart
Processing Database instance "cvsdbm": log file /home/oracle/oracle/product/10.2.0/db_1/startup.log

[code]....
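Since the failing command itself is elided above, here is the typical shape of a Data Pump schema remap for comparison (all names are placeholders):

impdp system/password directory=dp_dir dumpfile=exp_schema.dmp logfile=imp_schema.log remap_schema=old_user:new_user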

View 1 Replies View Related

Server Utilities :: EXP Versus IMPDP Interrogating Dump File

Jul 16, 2010

I have a bit of an issue with Oracle datapump dump files.

Today I manage the export and import of Oracle dump files. As part of the batch export process, I have a script which essentially says:

For each schema related to my application in THIS instance, export the schema via the SYSTEM user (the SYSTEM user gives me privileges on all schemas).

On the import UI side of things, I am able to run a "head -20" command on the .dmp file and determine the export client version, the date the schema was dumped, and which schema it was dumped from; all useful info presented in my UI.

Sample output: Begin

EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##

[code]....

Sample output: End

This matters in that I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on the naming convention I can determine the schema type (production or test), and, probably most importantly, I am assured of where the data has come from.

Looking at expdp and its dump file using the same method as above, it appears the Data Pump dump does NOT carry similar headers. Because of this, I can extract very little useful info from the dump file.

I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time-consuming compared to a "head -20" on the dump file.

View 4 Replies View Related

Direct Path Exported Dump File Contains Illegal Column Length

Apr 28, 2011

I am trying to import the database and I see the following error:

IMP-00051: Direct path exported dump file contains illegal column length

imp stops abruptly. My source and destination databases are as follows:

Source: 9.2.0.8
Destination: 11gR2

I used exp with the following options:

direct=y
buffer=899989898
recordlength=64000
File=1.dmp,2.dmp,3.dmp,4.dmp,5.dmp,6.dmp,7.dmp,8.dmp,9.dmp,10.dmp,11.dmp,12.dmp,13.dmp,14.dmp,15.dmp,16.dmp,17.dmp,18.dmp,19.dmp,20.dmp,21.dmp,22.dmp,23.dmp,24.dmp,25.dmp,26.dmp,27.dmp,28.dmp,29.dmp,30.dmp
grants=y
compress=y
FILESIZE=25g
owner=BRM

I used imp with the following options:

indexes=n
buffer=99989898
recordlength=64000
File=1.dmp,2.dmp,3.dmp,4.dmp,5.dmp,6.dmp,7.dmp,8.dmp,9.dmp,10.dmp,11.dmp,12.dmp,13.dmp,14.dmp,15.dmp,16.dmp,17.dmp,18.dmp,19.dmp,20.dmp,21.dmp,22.dmp,23.dmp,24.dmp,25.dmp,26.dmp,27.dmp,28.dmp,29.dmp,30.dmp
analyze=n
grants=y
fromuser=brm
touser=brm
statistics=none

Why is my import failing?
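IMP-00051 on a direct-path export often traces back to the export side, for example an NLS_LANG during exp that did not match the source database character set (direct path skips the usual conversion layer); that is a hedged guess here, not a diagnosis. If the 9.2.0.8 source is still accessible, one common workaround is a conventional-path re-export:

exp brm/password direct=n buffer=10485760 grants=y compress=y filesize=25g owner=BRM file=conv1.dmp,conv2.dmp,conv3.dmp log=exp_conv.log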

View 4 Replies View Related






