Globalization :: Data Migration From WE8MSWIN1252 To AR8MSWIN1256

Feb 19, 2013

I have Arabic data stored in the two encodings below in an Oracle AL32UTF8 database:

1 million rows in WE8MSWIN1252
0.5 million rows in AR8MSWIN1256

I would like to convert the 1 million WE8MSWIN1252 rows into AR8MSWIN1256. I was able to convert the data encoding from 1252 to 1256 using SQL Developer, but had no luck with the Oracle export/import utilities (both exp and expdp). I'm thinking a certain locale may be required for export/import to work.

Also, my company says SQL Developer is a free utility that may not be supported by Oracle, so I should use export and import for this. I need to convert only one table.
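A character-level repair can also be expressed in plain SQL, independent of export/import. Below is a minimal sketch of the double-CONVERT technique; the table, column and WHERE clause are placeholders, it only applies if the Arabic was originally loaded through a WE8MSWIN1252 client, a few byte values undefined in WE8MSWIN1252 cannot survive the round trip, and it should be verified on a copy of the table first.

-- table, column and WHERE clause are placeholders; test on a copy before touching production
UPDATE arabic_docs d
SET    d.doc_text =
         CONVERT(
           CONVERT(d.doc_text, 'WE8MSWIN1252', 'AL32UTF8'),   -- recover the original single-byte codes
           'AL32UTF8', 'AR8MSWIN1256')                        -- reinterpret those bytes as AR8MSWIN1256
WHERE  d.load_charset = 'WE8MSWIN1252';                       -- hypothetical marker for the 1 million affected rows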

Similar case

[URL]...........

View 5 Replies



Globalization :: Having An Oracle Database That Stores Data In Different Languages?

May 1, 2013

Is it possible to have an Oracle database that stores data in different languages?

I have gone through a few blogs which say Oracle supports Unicode characters and that UTF8 supports all languages, including multi-byte ones.

I would like to know which languages are supported by Oracle 10g.
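As a quick check, the NLS language names a given release recognises can be listed from the database itself; a minimal query sketch:

-- list the NLS languages this database release supports
SELECT value AS nls_language
FROM   v$nls_valid_values
WHERE  parameter = 'LANGUAGE'
ORDER  BY value;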

View 2 Replies View Related

Globalization :: Migrating Character Data Using A Full Export And Import

Jul 23, 2013

I have a database on my local machine that doesn't support Turkish characters. My NLS_CHARACTERSET is WE8ISO8859P1; it must be changed to WE8ISO8859P9, since that supports the full Turkish character set. I would like to migrate the character data using a full export and import, and my strategy is as follows:

1- create a full export to a location in network,

2- create a new database on the local machine whose NLS_CHARACTERSET is WE8ISO8859P9 (I would also like to change NLS_LANGUAGE and NLS_TERRITORY),

3- perform a full import into the newly created database. I have completed the first step, but not the second. I attempted the second step in the Toad editor by clicking Create -> New Database, but I cannot connect to the new database, and I must connect to it in order to perform the full import.

Details:
NLS_LANGUAGE ..................... AMERICAN
NLS_TERRITORY .................... AMERICA
NLS_CURRENCY .....................
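Once the new database exists and accepts connections, it is worth confirming its NLS settings before running the full import; a minimal check, run on the new database:

SELECT parameter, value
FROM   nls_database_parameters
WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_LANGUAGE', 'NLS_TERRITORY');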

View 8 Replies View Related

Character Set - Convert Source From WE8MSWIN1252 To Unicode

Feb 5, 2013

I want to convert my database character set from WE8MSWIN1252 to Unicode, because I have to transport a tablespace to a destination database; the destination is Unicode and the source is WE8MSWIN1252. I was not able to import the transportable tablespace for this reason, so I want to convert the source from WE8MSWIN1252 to Unicode.

View 7 Replies View Related

Client Tools :: Unable To Insert Euro Symbol In Database With Character Set WE8MSWIN1252?

Mar 29, 2011

I am using oracle 9.2.0.6.0 on HP-UX.

I am unable to insert, or even display, the euro symbol from the server as well as from a Windows client.

Following are the details of my database server

SQL> select * from nls_database_parameters;
PARAMETER VALUE
------------------------------ ----------------------------------------
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET WE8MSWIN1252

[code]....

1) When I try to insert € from the DB server (using PuTTY) with Alt+0128, nothing is printed (nothing is typed on the screen), not even junk characters. The following query also prints nothing:

SQL> select chr(128) from dual;
C
-

2) When I set NLS_LANG on the client and try to insert €, Alt+0128 produces a question mark, and the following query displays a junk character:

SQL> select chr(128) from dual;

C
-
Ç

3) Regardless, when I inserted a couple of rows and tried UNISTR, the following was the result:

SQL> select unistr(v) from t;
Error 45 initializing SQL*Plus
Internal error
$
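A minimal sketch that may help separate a storage problem from a display problem (the test table is hypothetical): in WE8MSWIN1252 the euro sign is code point 128, and DUMP shows the stored bytes regardless of what the terminal can render.

CREATE TABLE euro_test (val VARCHAR2(10));

INSERT INTO euro_test VALUES (CHR(128));         -- 0x80 is the euro sign in WE8MSWIN1252
INSERT INTO euro_test VALUES (UNISTR('\20AC'));  -- euro by Unicode code point

SELECT val, DUMP(val, 16) AS stored_bytes FROM euro_test;

If both rows show the byte 80, the data is stored correctly and the remaining issue is the terminal or the client NLS_LANG setting.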

View 19 Replies View Related

Data Migration From 9i To 11g R2

Mar 9, 2011

We have our current production running on 9i and eventually want to migrate to 11g-r2. But the challenges are as follows:

9i production is running in San Francisco data center. 11g-r2 Production already setup up and running in Atlanta data center.

The database size on 9i is around 2 TB. We are looking to transfer this to 11gR2 and wondering what options we have at our disposal. I was looking into EXP/IMP, but somebody said database links would be much faster and more reliable than EXP/IMP.
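A minimal sketch of the database-link approach for one table; the link name, TNS alias, credentials and table names are assumptions, and at 2 TB the transfer time is dominated by the network either way:

-- all names and credentials below are placeholders
CREATE DATABASE LINK src9i
  CONNECT TO app_owner IDENTIFIED BY app_owner_pwd
  USING 'SF_9I';   -- TNS alias pointing at the 9i database in San Francisco

-- direct-path copy of one table over the link
INSERT /*+ APPEND */ INTO orders SELECT * FROM orders@src9i;
COMMIT;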

View 10 Replies View Related

Data Migration With XML Files?

Aug 25, 2010

We're planning a data migration from one application (Oracle-based) to another (also with an Oracle DB).

The origin is roughly an 80 GB database, so many millions of records are to be migrated (before loading records into the destination tables, they have to be transformed).

The current concept is to receive all origin data as XML files, load them into a staging area (a dedicated migration schema in Oracle), then transform and load them into the destination tables.

We have three days for the whole migration (including extraction from the origin database, transformation, loading, and a backup after completion).

My question is whether a migration with XML files is a good concept. I think XML processing is much slower than doing the same with CSV files. My proposal to migrate an Oracle dump (so we would have the original data in our staging area) was declined.

Is migrating mass data with XML files reasonable, or are there performance or other issues?
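For comparison, this is roughly what relational staging of XML looks like, assuming the files are loaded into a staging table with an XMLTYPE column (all names and the XML structure are assumptions); the XMLTABLE shredding step is where most of the extra cost over a plain CSV load shows up:

CREATE TABLE stg_orders_xml (doc XMLTYPE);

SELECT x.order_id, x.customer_id, x.amount
FROM   stg_orders_xml s,
       XMLTABLE('/orders/order' PASSING s.doc
                COLUMNS order_id    NUMBER PATH 'id',
                        customer_id NUMBER PATH 'customer',
                        amount      NUMBER PATH 'amount') x;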

View 2 Replies View Related

SQL & PL/SQL :: Script For Data Migration

Mar 23, 2012

I have a table with the columns below:

employee_id
payroll_pay_week1
payroll_pay_week2
payroll_pay_week3
payroll_pay_week26

I have created a new table so that, instead of having 26 columns for the payment amount (one per week), there is one column for the pay amount and one column for the week number, as below:

employee_id
payroll_pay
payroll_pay_week

How do I migrate the data from the old table to the new table?
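A minimal sketch using UNPIVOT (available from 11g onwards), with payroll_old and payroll_new as assumed names for the old and new tables; on older releases the same result can be produced with one INSERT ... SELECT per week column or a UNION ALL:

INSERT INTO payroll_new (employee_id, payroll_pay, payroll_pay_week)
SELECT employee_id, payroll_pay, payroll_pay_week
FROM   payroll_old
UNPIVOT (payroll_pay FOR payroll_pay_week IN (
           payroll_pay_week1  AS 1,
           payroll_pay_week2  AS 2,
           payroll_pay_week3  AS 3,
           -- ...list the remaining week columns here...
           payroll_pay_week26 AS 26));
-- add INCLUDE NULLS after UNPIVOT if weeks with NULL pay must also be carried over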

View 6 Replies View Related

PL/SQL :: Oracle Data Migration

Nov 9, 2013

Could you send me sample data migration scripts so I can gain some knowledge of data migration?

View 10 Replies View Related

Remote Migration Of Data

Oct 30, 2012

Is there any Oracle utility to do remote migration of data, or is it possible only with third-party software?
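One built-in option is Data Pump in network mode, which pulls a schema over a database link without writing an intermediate dump file. A minimal PL/SQL sketch; the link name, schema name and directory object are assumptions:

DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  -- SRC_LINK is an assumed database link to the remote source database
  h := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                          job_mode    => 'SCHEMA',
                          remote_link => 'SRC_LINK');
  DBMS_DATAPUMP.ADD_FILE(h, 'net_import.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP_OWNER'')');  -- schema is an assumption
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/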

View 1 Replies View Related

Data Migration From MySQL Database

Oct 31, 2008

I need to migrate data from a MySQL database to an Oracle 11g database.

a) Is there any method available to export all the SQL from MySQL (table scripts, constraint scripts, data/insert scripts) so that we can apply it directly to the Oracle schema after making the necessary changes (such as data types)?

b) Is there any free tool available for the migration? (A CSV-based route is sketched below.)
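If the MySQL data can be exported to CSV files (for example with SELECT ... INTO OUTFILE), one free route on the Oracle side is an external table plus INSERT ... SELECT; a minimal sketch, with all object, column and file names as assumptions:

CREATE DIRECTORY mig_dir AS '/u01/migration/csv';   -- path is an assumption

CREATE TABLE customers_ext (
  id    NUMBER,
  name  VARCHAR2(100),
  email VARCHAR2(200)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY mig_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customers.csv')
);

-- load into the real Oracle table (assumed to already exist with matching columns)
INSERT INTO customers SELECT * FROM customers_ext;
COMMIT;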

View 1 Replies View Related

Migration From Data Integrator To Java

May 6, 2013

How do I migrate from ODI to Java? How should one approach this type of migration?

View 1 Replies View Related

Data Migration From WORD To ORACLE?

Sep 20, 2011

I have a long file in Word; as I try to load it into Oracle, the quotes become periods, e.g.:

insert into mytab values ('myname, 26);

When I copy this and paste it into Oracle (UNIX environment), it comes through as insert into mytab values (.myname.,26); the quotes are not recognized.

I tried copying from Word to Notepad to Oracle; same problem.
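This usually points to Word's "smart" (curly) quotes rather than straight ASCII quotes; turning off AutoFormat/AutoCorrect in Word avoids it at the source. As an illustration, a minimal cleanup sketch that swaps the Windows-1252 curly-quote codes for straight quotes, assuming the text has first been loaded into a staging table (table and column names are placeholders, and the byte values assume a WE8MSWIN1252 database):

-- 145/146 are curly single quotes and 147/148 curly double quotes in WE8MSWIN1252;
-- 39 and 34 are the straight single and double quote
SELECT TRANSLATE(stmt_text,
                 CHR(145) || CHR(146) || CHR(147) || CHR(148),
                 CHR(39)  || CHR(39)  || CHR(34)  || CHR(34)) AS cleaned_stmt
FROM   stg_word_scripts;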

View 5 Replies View Related

Real Time Data Migration

Sep 13, 2013

We have a requirement to migrate data in real time from a source DB to a target DB, as well as to a queue.

Can we achieve this using any custom technique?

We tried exploring Streams with CDC, but with Streams being deprecated and CDC removed in Oracle 12c, we are somewhat stuck.

View 1 Replies View Related

SQL & PL/SQL :: Generate Data Migration Reports

May 9, 2011

I need to generate a report, using PL/SQL, that counts the number of rows of all the tables in the source and target databases. The report should consist of the following columns:

table name | source table row count | target table row count | mismatch. Can you provide the PL/SQL code?
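A minimal sketch, assuming the counts are run on the source database and the target is reachable over a database link named target_db (the link name, and the assumption that the same user owns the tables on both sides, are mine):

SET SERVEROUTPUT ON
DECLARE
  v_src NUMBER;
  v_tgt NUMBER;
BEGIN
  DBMS_OUTPUT.PUT_LINE('TABLE_NAME | SOURCE_ROWS | TARGET_ROWS | MISMATCH');
  FOR t IN (SELECT table_name FROM user_tables ORDER BY table_name) LOOP
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || t.table_name INTO v_src;
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || t.table_name || '@target_db' INTO v_tgt;
    DBMS_OUTPUT.PUT_LINE(t.table_name || ' | ' || v_src || ' | ' || v_tgt || ' | ' ||
                         CASE WHEN v_src = v_tgt THEN 'NO' ELSE 'YES' END);
  END LOOP;
END;
/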

View 6 Replies View Related

Server Utilities :: Data Migration Using Datapump?

May 11, 2011

I got an assignment to create an Oracle 11g DB. I will be provided with a full Data Pump export dump of an Oracle 10g DB on Linux, and I need to import it into an 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the DB with the SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.

View 28 Replies View Related

Server Administration :: Data Migration Between Two Different Application?

Mar 9, 2011

We are planning to migrate data from an application called Clintrace to another application called Argus Safety. Both applications are related to pharmacovigilance safety operations and their functionality is similar, so both databases hold the same kind of data even though the table structures might differ. Both databases are Oracle: the Clintrace DB is 9i and the Argus DB is 11g.

View 8 Replies View Related

Performance Tuning :: Oracle Data Migration

Jun 7, 2011

We have data migration scripts written for Oracle. The data is not huge, but we are observing that the migration runs fast in the development lab and is about 5x slower at the production site.

The development Oracle setup is on Windows and the production setup on Solaris. I have attached the AWR report generated for a period where the migration ran for 3 hours and was stopped due to slow performance.

Here is my initial analysis.

1) The first timed event is DB CPU. Hence I feel the migration scripts can be modified to run in parallel so that they finish faster. However, the question then arises why they run faster in the development environment if this is the issue.
2) I tried increasing the following:
a. large_pool_size to 512M
b. sga_max_size to 8G
c. sga_target to 8G
(from 0, 4G and 4G respectively).

I have attached the AWR and below are the etc/system contents for solaris settings.

* Begin MDD root info (do not edit)
rootdev:/pseudo/md@0:0,1,blk
* End MDD root info (do not edit)
set noexec_user_stack=1
set noexec_user_stack_log=1
* IBMdpo vpath_START (do not remove)
* default SCSI timeout is 60 seconds
* uncomment to change SCSI timeout
* set sd:sd_io_time=0x1e
forceload: drv/vpathdd
* IBMdpo vpath_END (do not remove)

set noexec_user_stack=1
set semsys:seminfo_semmni=100
set semsys:seminfo_semmns=1024
set semsys:seminfo_semmsl=256
set semsys:seminfo_semvmx=32767
set shmsys:shminfo_shmmax=4294967295
set shmsys:shminfo_shmmin=1
set shmsys:shminfo_shmmni=100
set shmsys:shminfo_shmseg=10

P.S. The awr report is renamed to .txt from .html to be able to upload the file.

View 6 Replies View Related

Security :: Tablespace Encryption - Data Migration?

Oct 22, 2010

Is it possible to migrate everything (tables, indexes) from unencrypted to encrypted tablespaces online, i.e. while the database is being used (DML)?
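Online redefinition is one way to move a table and its dependents into an encrypted tablespace while DML continues; each table has to be processed in turn. A minimal sketch for a single table, where the schema, table, interim table and tablespace names are all assumptions and the table is assumed to have a primary key:

-- interim table created empty in the encrypted tablespace (names are placeholders)
CREATE TABLE scott.emp_int TABLESPACE enc_tbs AS
  SELECT * FROM scott.emp WHERE 1 = 0;

DECLARE
  v_errors PLS_INTEGER;
BEGIN
  DBMS_REDEFINITION.CAN_REDEF_TABLE('SCOTT', 'EMP', DBMS_REDEFINITION.CONS_USE_PK);
  DBMS_REDEFINITION.START_REDEF_TABLE('SCOTT', 'EMP', 'EMP_INT');
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('SCOTT', 'EMP', 'EMP_INT',
      DBMS_REDEFINITION.CONS_ORIG_PARAMS, TRUE, TRUE, TRUE, FALSE, v_errors);
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('SCOTT', 'EMP', 'EMP_INT');
END;
/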

View 3 Replies View Related

Data Migration From Oracle To SQL Server 2005

Oct 25, 2012

How can data be migrated from Oracle to MS SQL Server, or vice versa?

I came to know about 2 methods:

1) Using SQL Developer
2) Using ODBC (see the sketch below).
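A minimal sketch of the second option: with Oracle Database Gateway for ODBC (DG4ODBC) configured against SQL Server, its tables can be read through a database link and copied with plain SQL. The link name, TNS alias, credentials and table names below are assumptions:

CREATE DATABASE LINK mssql_link
  CONNECT TO "sqlserver_user" IDENTIFIED BY "sqlserver_password"
  USING 'DG4ODBC_MSSQL';   -- TNS entry for the gateway, assumed to be set up already

-- copy one SQL Server table into an existing Oracle table
INSERT INTO customers_ora
SELECT * FROM "customers"@mssql_link;
COMMIT;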

View 2 Replies View Related

SQL Queries Taking Long Time After 11g Data Migration

Sep 14, 2010

I have upgraded an Oracle database from 9i to 11g using the export and import utilities. After the migration we are facing a performance issue in report generation: the first execution of a report takes a very long time, and when we generate the same report 2-3 times the execution time improves considerably compared with the first run.

Two days back I restarted the database and saw the same issue. There are around 300 reports, and it is not feasible to generate all of them 2-3 times every time we restart the database.
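One item commonly checked after an export/import upgrade is optimizer statistics, since missing or stale statistics can lead to poor execution plans; the first run is also slower simply because the buffer cache and shared pool are empty after a restart. A minimal sketch (the schema name is an assumption, and whether this is the actual cause here would need an AWR report or trace to confirm):

BEGIN
  DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'REPORT_OWNER', cascade => TRUE);  -- schema is a placeholder
  DBMS_STATS.GATHER_DICTIONARY_STATS;
  DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
END;
/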

View 5 Replies View Related

Export/Import/SQL Loader :: Data Pump For Migration

Sep 9, 2013

We need to migrate our 10gR2 single-instance database on a conventional file system to a two-node 11gR2 RAC on ASM (on the same Windows Server platform…).

How can I migrate my production database using Data Pump? I have a full Data Pump export from the target, but I don't know how to import it: schema by schema or as a full import? Do I need to manually create the tablespaces on the destination first? Should I exclude indexes, constraints and statistics?

View 3 Replies View Related

Server Utilities :: Data Migration - Remote Operations Not Permitted

May 10, 2011

I am trying to insert data into a table called tstcntr_mstr in the ibpslive instance, as the ibpslive user.

My source tables are in the ncfiiidv instance.

Query is as follows:

insert into tstcntr_mstr
(select * from tstcntr_mstr@dlink_ncfmdv)

The error I get is: remote operations not permitted on object tables or user-defined type columns.

Table tstcntr_mstr@dlink_ncfmdv contains types.

How can I complete the migration of this data?

View 20 Replies View Related

PL/SQL :: Legacy Data Migration Project - Cyclic Dependency Records

Mar 1, 2013

It's been a while since I worked with SQL*Plus. I am using Oracle 11g. We are working on a legacy data migration project, and I have a table containing records with circular dependencies that I am trying to identify. The table has the following columns: Product, Source, Target. I want to identify the records which form a loop, e.g.:

Source Target
A B
B C
C D
D A

The last record closes a loop; I need to identify these records. My query is below:

SELECT DISTINCT SOURCE,TARGET FROM RULESELIB WHERE CONNECT_BY_ISCYCLE=1 CONNECT BY NOCYCLE SOURCE=PRIOR TARGET;

I ran this query on two tables, one with 75,000 records and the other with 25,000. It works fine on the 75,000-record table, completing within a minute, but it does not complete on the other table. I can't figure out whether the issue is with the query or whether something about the data is causing it to loop indefinitely.

View 11 Replies View Related

Globalization :: Database Character Set

Jun 6, 2012

We have a 10g production DB with character set US7ASCII. This DB stores Arabic and English data. The production DB is located on an HP-UX operating system.

When I query data from the DB through SQL Developer, the data is shown as junk or unknown characters (square boxes).

Client settings (the Windows XP workstation from which the query is issued via SQL Developer): NLS_LANG = AMERICAN_AMERICA.US7ASCII

The Oracle 10g client is installed on the client workstation, from where I query data through SQL Developer. The problem is that I am unable to see Arabic characters; they are displayed as junk. English characters and numeric values, however, are displayed properly.

I tried the following to make sure the data is not corrupted: I converted the "Name" column to its hex value (rawtohex) and then executed the query below in a UTF-8 DB:

select UTL_I18N.RAW_TO_CHAR(hex_value_of-name) from dual;

This displayed the Arabic name properly in the UTF8 DB.

The character set of this production DB cannot be changed at this time. There are many applications based on this DB, and all of them are capable of converting the junk data to Arabic for display.

My concern is: what do I need to do to view the Arabic data properly through SQL Developer? Are there any settings that need to be changed on my client workstation?

View 15 Replies View Related

Globalization :: Character Set For Local Language

Dec 2, 2012

I am using an Oracle 10g database on Windows XP. I have a backup that contains data in a local language (Marathi). I want to read this data in Oracle itself. Which character set do I need to choose?

View 6 Replies View Related

Globalization :: Wrong Result For Query With Like And %?

Jun 28, 2012

I have a strange problem with a query using LIKE and %.

When I run this script:

ALTER SESSION SET NLS_SORT = 'BINARY_CI';
ALTER SESSION SET NLS_COMP = 'LINGUISTIC';
-- SELECT * FROM NLS_SESSION_PARAMETERS;
-- drop table test1;
CREATE TABLE TEST1(K1 NVARCHAR2(80));

[code]....

When I change the data type to VARCHAR2, this code works correctly.

The execution plan:

PLAN_TABLE_OUTPUT
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SQL_ID d3d64aupz4bb5, child number 2
-------------------------------------
select * from TEST1 where k1 like N'Ł%'

[code]....

Note - dynamic sampling used for this statement (level=2)

View 2 Replies View Related

Globalization :: Handling Multibyte Characters

Aug 8, 2013

I have created a procedure which sends e-mail using UTL_SMTP. The procedure has a part in which we add attachments to the e-mail. The issue is that when I add an attachment containing multibyte characters, those characters are replaced with '?'.
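The '?' substitution typically appears when attachment text is written with utl_smtp.write_data, which converts the data through the session character set. Since the procedure itself is not shown, the following is only a sketch of one possible workaround, with v_conn and v_chunk as placeholders for the existing connection and attachment chunk: convert the text explicitly and send it as raw bytes, and declare the matching charset in the attachment's Content-Type header.

-- v_conn and v_chunk are placeholders; send the chunk as UTF-8 bytes untouched
UTL_SMTP.WRITE_RAW_DATA(v_conn, UTL_RAW.CAST_TO_RAW(CONVERT(v_chunk, 'AL32UTF8')));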

View 6 Replies View Related

Globalization :: Bad Character Set In Dump File?

Jul 23, 2013

Importing (impdp) a dump file that someone has handed me into Oracle XE results in special characters, i.e. umlauts, being messed up.

In a hex editor, the dump file shows a) the token WE8MSWIN1252 near the beginning, but b) umlauts apparently encoded in DOS code page 850; for example, "König" is encoded as 4b 94(!) 6e 69 67. Does this prove that the dump file is badly formed and that I have to resign myself to the complicated approach mentioned at the end of [URL]...
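A quick way to test the diagnosis is to decode the observed bytes under both candidate character sets (WE8PC850 is Oracle's name for DOS code page 850); a minimal sketch:

-- 4B 94 6E 69 67 are the bytes seen in the dump file for "König"
SELECT UTL_I18N.RAW_TO_CHAR(HEXTORAW('4B946E6967'), 'WE8PC850')     AS decoded_as_dos850,
       UTL_I18N.RAW_TO_CHAR(HEXTORAW('4B946E6967'), 'WE8MSWIN1252') AS decoded_as_win1252
FROM   dual;

If the first column reads "König" while the second shows a curly closing quote in place of the ö, the bytes really are code page 850 despite the WE8MSWIN1252 label.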

View 4 Replies View Related

Globalization :: Spool Brazilian Characters To A File

Jun 6, 2013

Oracle version: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production, running on CentOS Linux release 6.0 (Final), kernel 2.6.32-71.29.1.el6.x86_64.

I am having a hard time spooling a file and displaying special Brazilian characters, even though I can see them correctly in SQL Developer:
LEOPOLDO COUTO DE MAGALHÃES JÚNIOR

Spool:
LEOPOLDO COUTO DE MAGALH?ES JUNIOR

I've tried changing NLS_LANG at the session level, but that cannot be done. I don't want to change the default language of my DB, but I really need these characters to appear correctly in the file.
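NLS_LANG is read from the client environment when SQL*Plus starts, which is why it cannot be set with ALTER SESSION; it does not change the database character set, only how this one client converts data. A minimal sketch (the character set, file name, table and column are assumptions, and the terminal or editor used to view the spool file must also be set to UTF-8):

$ export NLS_LANG=AMERICAN_AMERICA.AL32UTF8    # set in the shell before starting SQL*Plus
$ sqlplus scott/tiger

SQL> SPOOL /tmp/nomes.txt
SQL> SELECT nome FROM funcionarios;            -- table and column names are assumptions
SQL> SPOOL OFF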

View 16 Replies View Related






