Data Migration With XML Files?
Aug 25, 2010
We're planning a data migration from one application (Oracle-based) to another (also with an Oracle DB).
The origin is a database of roughly 80 GB, so many millions of records are to be migrated. (Before loading the records into the destination tables, they have to be transformed.)
The current concept is to receive all origin data as XML files, load them into a staging area (a dedicated migration schema in Oracle), transform them, and load them into the destination tables.
We have three days for the whole migration (including the extract from the origin database, transform, load, and backup after completion).
My question is whether a migration with XML files is a good concept. I think XML processing is much slower than doing the same with CSV files. My proposal to migrate an Oracle dump (so we would have the original data in our staging area) was declined.
Is migrating mass data with XML files a good approach, or are there performance or other issues?
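For comparison, here is a minimal sketch of how an XML file could be shredded into a relational staging table with XMLTABLE; the directory object, file, element and column names are all made up for illustration:
INSERT INTO stg_customer (cust_id, cust_name)
SELECT x.cust_id, x.cust_name
FROM   XMLTABLE('/customers/customer'
         PASSING XMLTYPE(BFILENAME('MIG_XML_DIR', 'customers.xml'), NLS_CHARSET_ID('AL32UTF8'))
         COLUMNS cust_id   NUMBER        PATH 'id',
                 cust_name VARCHAR2(100) PATH 'name') x;
The XML parsing adds CPU overhead compared with flat-file loads (external tables or SQL*Loader direct path), which is one reason CSV is usually preferred for bulk volumes.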
View 2 Replies
Mar 9, 2011
We have our current production running on 9i and eventually want to migrate to 11gR2, but the challenges are as follows:
The 9i production database runs in a San Francisco data center. The 11gR2 production database is already set up and running in an Atlanta data center.
The database size is around 2 TB on 9i. We are looking to transfer this to 11gR2 and wondering what options we have at our disposal. I was looking into EXP/IMP, but somebody said database links would be much faster and more reliable than EXP/IMP.
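As a rough illustration of the db-link approach (the link and table names are hypothetical), the data could be pulled from the 9i source with a direct-path insert on the 11gR2 side:
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND */ INTO big_table
SELECT * FROM big_table@src_9i;
COMMIT;
Note that Data Pump's NETWORK_LINK mode is not available here because the source is 9i, which predates Data Pump, so the realistic choices are classic exp/imp or inserts over a database link.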
View 10 Replies
View Related
Mar 23, 2012
I had a table with columns as below
employee_id
payroll_pay_week1
payroll_pay_week2
payroll_pay_week3
payroll_pay_week26
I have created a new table so that, instead of having 26 columns for the payment amount (one per week), there is one column for the pay amount and one column for the week, as below:
employee_id
payroll_pay
payroll_pay_week
How do I migrate the data from the old table to the new table?
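One possible sketch, assuming the old and new tables are called payroll_old and payroll_new, is an INSERT ... SELECT with UNPIVOT (11g and later):
INSERT INTO payroll_new (employee_id, payroll_pay, payroll_pay_week)
SELECT employee_id, payroll_pay, payroll_pay_week
FROM   payroll_old
UNPIVOT (payroll_pay FOR payroll_pay_week IN
          (payroll_pay_week1  AS 1,
           payroll_pay_week2  AS 2,
           payroll_pay_week3  AS 3,
           -- ... list weeks 4 to 25 the same way ...
           payroll_pay_week26 AS 26));
On versions without UNPIVOT, the same result can be produced with 26 SELECTs combined by UNION ALL, one per week column.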
View 6 Replies
View Related
Nov 9, 2013
Please send me sample data migration scripts so I can build up some knowledge on data migration.
View 10 Replies
View Related
Oct 30, 2012
Is there any Oracle utility to do remote migration of data, or is it only possible using some third-party software?
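One built-in option worth noting is Data Pump over a database link, which moves the data without an intermediate dump file; a hedged example with made-up names (src_link is a database link on the target pointing at the source):
impdp system/password NETWORK_LINK=src_link SCHEMAS=app_owner DIRECTORY=DATA_PUMP_DIR LOGFILE=remote_mig.log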
View 1 Replies
View Related
Oct 31, 2008
I need to migrate data from a MySQL database to an Oracle 11g database.
a) Is there any method available to export all the SQL from MySQL (table scripts, constraint scripts, data/insert scripts) so that we can apply it directly to the Oracle schema after making the necessary changes (like datatypes)?
b) Is there any free tool available for the migration?
View 1 Replies
View Related
May 6, 2013
How do I migrate from ODI to Java, and how does one approach this type of migration?
View 1 Replies
View Related
Sep 20, 2011
I have a long file in Word. As I try to load it into Oracle, the quotes become periods. For example:
insert into mytab values ('myname', 26);
When I copy this and paste it into Oracle (UNIX environment), it translates to insert into mytab values (.myname., 26). It does not recognize the quotes.
I tried copying from Word to Notepad to Oracle: same problem.
View 5 Replies
View Related
Sep 13, 2013
We have a requirement to migrate data in real time from a source DB to a target DB as well as to a queue.
Can we achieve this using any custom technique?
We tried exploring Streams with CDC, but with Streams being deprecated and CDC removed in Oracle 12c, we are kind of stuck.
View 1 Replies
View Related
May 9, 2011
I need to generate a report using PL/SQL code that counts the number of rows of all the tables in the source and target databases. The report should consist of the following columns:
table name | source table row count | target table row count | mismatch
Can you provide the PL/SQL code?
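A minimal sketch, assuming the tables belong to the current schema on the source and the target is reachable through a database link called target_db (adjust names as needed):
SET SERVEROUTPUT ON
DECLARE
  l_src NUMBER;
  l_tgt NUMBER;
BEGIN
  DBMS_OUTPUT.PUT_LINE('table name|source table row count|target table row count|mismatch');
  FOR t IN (SELECT table_name FROM user_tables ORDER BY table_name) LOOP
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || t.table_name INTO l_src;
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || t.table_name || '@target_db' INTO l_tgt;
    -- flag a mismatch whenever the two counts differ
    DBMS_OUTPUT.PUT_LINE(t.table_name || '|' || l_src || '|' || l_tgt || '|' ||
                         CASE WHEN l_src = l_tgt THEN 'N' ELSE 'Y' END);
  END LOOP;
END;
/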
View 6 Replies
View Related
May 11, 2011
I got an assignment to create an Oracle 11g DB. I will be provided with a full Data Pump export dump of an Oracle 10g DB on Linux, and I need to import it into an 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the DB with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
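Since the tablespace layout in the dump is unknown, one hedged approach is to extract the DDL from the dump first and then import with remaps; the parameters are standard Data Pump options, while the file and tablespace names are placeholders:
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full10g.dmp FULL=Y SQLFILE=preview_ddl.sql
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full10g.dmp FULL=Y REMAP_TABLESPACE=old_ts:USERS LOGFILE=imp_full.log
The first command only writes the DDL contained in the dump to a script, which shows which tablespaces and users the import expects; the second performs the actual import, remapping any source tablespace you chose not to create.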
View 28 Replies
View Related
Mar 9, 2011
We are planning to migrate data from an application called Clintrace to another application called Argus Safety. Both applications are related to pharmacovigilance safety operations and their functionality is similar, so both databases hold the same data, although the table structures might be different. Both databases are Oracle: the Clintrace DB is 9i and the Argus DB is 11g.
View 8 Replies
View Related
Jun 7, 2011
We have data migration scripts written for Oracle. The data is not huge, but we are observing that the migration is faster in the development labs and 5x slower at the production site.
The development Oracle setup is on Windows and the production setup is on Solaris. I have attached the AWR report generated for a period where the migration was run for 3 hours and stopped due to slow performance.
Here is my initial analysis:
1) The first timed event is DB CPU. Hence I feel the migration scripts can be modified to run in parallel so that they finish faster (see the parallel DML sketch after the /etc/system listing below). However, the question then arises why it runs faster in the development environment if this is the issue.
2) I tried increasing the following parameters:
a. large_pool_size to 512M (from 0)
b. sga_max_size to 8G (from 4G)
c. sga_target to 8G (from 4G)
I have attached the AWR report, and below are the /etc/system contents for the Solaris settings.
* Begin MDD root info (do not edit)
rootdev:/pseudo/md@0:0,1,blk
* End MDD root info (do not edit)
set noexec_user_stack=1
set noexec_user_stack_log=1
* IBMdpo vpath_START (do not remove)
* default SCSI timeout is 60 seconds
* uncomment to change SCSI timeout
* set sd:sd_io_time=0x1e
forceload: drv/vpathdd
* IBMdpo vpath_END (do not remove)
set noexec_user_stack=1
set semsys:seminfo_semmni=100
set semsys:seminfo_semmns=1024
set semsys:seminfo_semmsl=256
set semsys:seminfo_semvmx=32767
set shmsys:shminfo_shmmax=4294967295
set shmsys:shminfo_shmmin=1
set shmsys:shminfo_shmmni=100
set shmsys:shminfo_shmseg=10
P.S. The AWR report was renamed from .html to .txt in order to upload the file.
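Regarding point 1 above, a minimal sketch of what parallelising one of the migration inserts could look like (the table names and degree are placeholders):
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(tgt, 4) */ INTO target_table tgt
SELECT /*+ PARALLEL(src, 4) */ *
FROM   source_table src;
COMMIT;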
View 6 Replies
View Related
Oct 22, 2010
Is it possible to migrate everything (tables, indexes) from unencrypted to encrypted tablespaces online, i.e. while the database is being used (DML)?
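One commonly cited way to do this online is DBMS_REDEFINITION, redefining each table into an interim table created in the encrypted tablespace; a rough sketch with hypothetical names, assuming ORDERS_INT has already been created in the encrypted tablespace with the same structure as APP.ORDERS:
DECLARE
  l_errors PLS_INTEGER;
BEGIN
  DBMS_REDEFINITION.CAN_REDEF_TABLE('APP', 'ORDERS', DBMS_REDEFINITION.CONS_USE_PK);
  DBMS_REDEFINITION.START_REDEF_TABLE('APP', 'ORDERS', 'ORDERS_INT');
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('APP', 'ORDERS', 'ORDERS_INT', num_errors => l_errors);
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('APP', 'ORDERS', 'ORDERS_INT');
END;
/
DML against the original table continues while the redefinition runs, with only a brief lock during the finish step.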
View 3 Replies
View Related
Oct 25, 2012
How do I migrate data from Oracle to MS SQL Server, or vice versa?
I came to know about two methods:
1) Using SQL Developer
2) Using ODBC.
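For the ODBC route, here is a hedged sketch of what it looks like from the Oracle side once Database Gateway for ODBC (dg4odbc) and the corresponding listener entry are configured; every name below is a placeholder:
CREATE DATABASE LINK mssql_link
  CONNECT TO "sql_user" IDENTIFIED BY "sql_password"
  USING 'DG4ODBC_MSSQL';
INSERT INTO ora_customers
SELECT * FROM "dbo"."Customers"@mssql_link;
The quoted identifiers preserve the case of the SQL Server object names; going the other way (Oracle to SQL Server) is usually done from the SQL Server side with a linked server, or with SQL Developer's migration support.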
View 2 Replies
View Related
Feb 19, 2013
I have Arabic data stored in the two encodings below in an Oracle AL32UTF8 database:
1 million rows in WE8MSWIN1252
0.5 million rows in AR8MSWIN1256
In all cases I would like to convert the 1 million WE8MSWIN1252 rows into AR8MSWIN1256. I could convert the data encoding from 1252 to 1256 using SQL Developer, but had no luck with the Oracle export/import utilities (both exp and expdp). I'm thinking a certain locale may be required for export/import to work.
Also, my company said SQL Developer is a free utility that may not be supported by Oracle, so I should use export and import for this. I need to convert only one table.
Similar case
[URL]...........
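If the 1252-tagged rows are really AR8MSWIN1256 bytes that were loaded with the wrong client character set (so-called pass-through corruption), one sketch that is sometimes used inside the database is a double CONVERT; this is only an assumption about how the data got there and should be verified on a handful of rows first (table and column names are placeholders):
CREATE TABLE arabic_fixed AS
SELECT id,
       CONVERT(CONVERT(arabic_text, 'WE8MSWIN1252', 'AL32UTF8'), 'AL32UTF8', 'AR8MSWIN1256') AS arabic_text
FROM   arabic_src;
The inner CONVERT undoes the unwanted 1252-to-UTF8 conversion and the outer one reinterprets the recovered bytes as AR8MSWIN1256. Note that classic exp/imp convert character data according to the client NLS_LANG, while Data Pump always uses the database character set, which may explain the difference in behaviour you saw.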
View 5 Replies
View Related
Sep 24, 2010
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether writing the export to tape is possible and, if so, whether the data would be accessible if needed later.
View 4 Replies
View Related
Mar 29, 2011
I am working in a bank as a systems consultant. I have a SAN storage area and Oracle set up as below.
SAN 1
This interface holds the data files of the Oracle tablespaces.
SAN 2
SAN 1 mirrors the data files of the Oracle tablespaces to SAN 2.
1. Can I rely on real-time data recovery from SAN 2?
2. If the data files on SAN 1 are corrupted, will the SAN 2 data files be corrupted as well?
3. If SAN 2 is corrupted, what Oracle features can be used to obtain uncorrupted data?
View 5 Replies
View Related
Jun 15, 2012
I have set up a cross-platform (Microsoft Windows IA 32-bit -> Linux x86 64-bit) Data Guard configuration and it worked fine. Then I did a switchover (which again worked) and found out that the data is not getting replicated at all. I checked the data files available on the new primary database and found that they are in the Windows format, as below:
SQL> select name from v$datafile;
NAME
--------------------------------------------------------------------------------
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSTEM01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\SYSAUX01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\UNDOTBS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\USERS01.DBF
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\RMAN\RMAN_TS01.DBF
and physically they were created under '/home/app/oracle/product/11.2.0/db_1/dbs/' with names such as:
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\REDO02.LOG
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\REDO03.LOG
D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\RMAN\RMAN_TS01.DBF
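The usual fix for this symptom is to set the file-name conversion parameters on the standby before the files get created there; the Linux path below is only an example:
ALTER SYSTEM SET db_file_name_convert = 'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/' SCOPE=SPFILE;
ALTER SYSTEM SET log_file_name_convert = 'D:\ORACLE\APP\ADMINISTRATOR\ORADATA\MFS\', '/u01/app/oracle/oradata/MFS/' SCOPE=SPFILE;
Both parameters are static, so the standby instance has to be restarted for them to take effect.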
View 5 Replies
View Related
Sep 14, 2010
I have upgraded an Oracle database from 9i to 11g using the export and import utilities. After the migration we are facing a performance issue in report generation: the first execution of a report takes a very long time, and when we generate the same report 2-3 times the execution time improves considerably compared with the first execution.
Two days back I restarted the database and found the same issue. There are around 300 reports, and it is not possible to generate all the reports 2-3 times every time we restart the database.
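One hedged possibility, given the export/import upgrade path, is that optimizer, dictionary and fixed-object statistics are missing or stale, so every first execution pays for hard parses and poor plans on a cold cache. A sketch of refreshing them (the schema name is a placeholder):
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'REPORT_OWNER', cascade => TRUE);
EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;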
View 5 Replies
View Related
Sep 9, 2013
We need to migrate our 10gR2 single-instance database with a conventional file system to a two-node 11gR2 RAC on ASM (on the same Windows Server platform).
How can I migrate my production database using Data Pump? I have a full Data Pump export of the source database, but I don't know how to import it: schema by schema or a full import? Do I need to create the tablespaces manually on the destination first? Should I exclude indexes, constraints and statistics?
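A hedged outline, assuming the dump file has been copied into a directory object on one of the new RAC nodes: pre-create (or remap) the tablespaces that should live on ASM, then run a full import; all names below are placeholders:
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=full_10g.dmp FULL=Y REMAP_TABLESPACE=old_data_ts:new_data_ts EXCLUDE=STATISTICS LOGFILE=imp_rac.log
Excluding statistics and regathering them with DBMS_STATS after the import is often quicker than importing them; whether to exclude indexes or constraints is more of a judgement call, and importing them in one pass is simpler unless the load time becomes a problem.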
View 3 Replies
View Related
May 10, 2011
I am trying to insert data into one of the tables, called tstcntr_mstr, in the ibpslive instance as the ibpslive user.
My source tables are in the ncfiiidv instance.
The query is as follows:
insert into tstcntr_mstr
(select * from tstcntr_mstr@dlink_ncfmdv)
The error that I get is: remote operations not permitted on object tables or user-defined type columns.
Table tstcntr_mstr@dlink_ncfmdv contains types.
How should I approach the migration of this data?
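Since tables with user-defined type columns cannot be referenced through a database link, one workaround that is often suggested is to move just those tables with Data Pump instead of the insert-over-db-link approach; a sketch with placeholder credentials:
expdp source_user/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tstcntr_mstr.dmp TABLES=tstcntr_mstr
impdp ibpslive/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tstcntr_mstr.dmp TABLES=tstcntr_mstr TABLE_EXISTS_ACTION=APPEND
The export runs on the ncfiiidv side, the dump file is copied across, and the import appends into the existing table on ibpslive.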
View 20 Replies
View Related
Mar 1, 2013
It's been a while since I worked with SQL*Plus. I am using Oracle 11g. We are working on a legacy data migration project. I have a table of records with circular dependencies and I am trying to identify those records. I have the following columns: Product, Source, Target. I want to identify the records which form a loop. For example:
Source Target
A B
B C
C D
D A
The last record forms a loop; I need to identify these records. My query is below:
SELECT DISTINCT SOURCE,TARGET FROM RULESELIB WHERE CONNECT_BY_ISCYCLE=1 CONNECT BY NOCYCLE SOURCE=PRIOR TARGET;
I ran this query on two tables, one with 75,000 records and the other with 25,000 records. It works fine on the table with 75,000 records and completes within a minute, but it does not complete on the other table. I can't figure out whether the issue is with the query or whether there is something about the data that causes it to loop indefinitely.
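For what it's worth, without a START WITH clause the CONNECT BY builds a hierarchy rooted at every single row, which can blow up combinatorially on densely connected data; a hedged variant that restricts the starting rows to those that can actually sit on a cycle (i.e. whose SOURCE also appears as a TARGET):
SELECT DISTINCT source, target
FROM   ruleselib
WHERE  CONNECT_BY_ISCYCLE = 1
CONNECT BY NOCYCLE PRIOR target = source
START WITH source IN (SELECT target FROM ruleselib);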
View 11 Replies
View Related
Aug 13, 2010
I'm an SAP consultant working in SQL on NT platforms. This is the first conversion from Oracle that I have done. My client has provided us with a "cold" backup of the Oracle database on a hard drive formatted in Unix. I have the partition mounted and I'm able to view the files. I have the ORADATA folder with all the .DBF files.
Q: How do I extract the data from the .DBF files? I need to export the data to something workable with SQL.
The original database was on Unix; I'm operating on a Windows platform.
View 4 Replies
View Related
Aug 6, 2012
DB version : 11.2.0.2
Platform : RHEL 5.4
We use Oracle Managed files for storing datafiles in ASM diskgroups. To add datafiles to a tablespace we usually issue
SQL > ALTER TABLESPACE CADL_WM_TBS ADD DATAFILE '+DATA' SIZE 10g AUTOEXTEND Off;
Tablespace altered.
The new datafile will be created at the location below:
+DATA/orcl/datafile/cadl_wm_tbs.893.767888027
But my db_create_file_dest is set only to +DATA:
SQL > show parameter db_create_file
NAME TYPE VALUE
------------------------------------ ----------- ------------------------------
db_create_file_dest string +DATA
Although the above datafile got created in the desired location (i.e. inside the +DATA/<dbname>/datafile/ directory), how did this happen without us setting the db_create_file_dest parameter to +DATA/orcl/datafile?
View 4 Replies
View Related
Jul 3, 2013
I was trying to load data from XML files into an Oracle database table. I followed the steps below. I created XML_DIR1 as an Oracle directory where I have kept all the XML files.
CREATE TABLE import_rpt_xml OF XMLTYPE
  XMLTYPE STORE AS BINARY XML;
INSERT INTO import_rpt_xml
VALUES (XMLTYPE(BFILENAME('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'), NLS_CHARSET_ID('AL32UTF8')));
This insert raises the error below:
Error starting at line 80 in command: insert into import_rpt_xml values (xmltype(bfilename('XML_DIR1', 'I-Yamanouchi-20040525-501.SGM'), nls_charset_id('AL32UTF8')))
Error report: SQL Error: ORA-31061: XDB error: XML event error
ORA-19202: Error occurred in XML processing
In line 69 of orastream: LPX-00217: invalid character 142 (U+008E)
I looked into my XML and found that it contains some Japanese characters.
How do I deal with Japanese characters in XML? I don't want to lose those characters. My database NLS_CHARACTERSET is 'AL32UTF8'.
My sample XML file looks like this.
<ichicsr lang="ja">
<ichicsrmessageheader>
<messagetype>ichicsr</messagetype>
<messageformatversion>2.1</messageformatversion>
<messageformatrelease>2.0</messageformatrelease>
<messagenumb>US-Yamanouchi-W2004050033-4</messagenumb>
<messagesenderidentifier>Yamanouchi</messagesenderidentifier>
<messagereceiveridentifier>PMDA</messagereceiveridentifier>
[Code]...
and so on.
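LPX-00217 usually means the byte stream does not match the character set the parser was told to expect, so if the file is not actually UTF-8, one hedged experiment is to pass the file's real encoding to nls_charset_id instead, for example Shift-JIS:
INSERT INTO import_rpt_xml
VALUES (XMLTYPE(BFILENAME('XML_DIR1', 'I-Yamanouchi-20040525-501.xml'), NLS_CHARSET_ID('JA16SJIS')));
The actual encoding should be confirmed from the file's XML declaration or from whoever produced the file before settling on a character set name.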
View 4 Replies
View Related
Jul 24, 2012
I noticed that three data files on my standby show as corrupted. What would be the easiest and fastest way to sync them up from my production database?
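On 11g, one common approach is to take image copies of just those datafiles on the primary, ship them across, and catalog and restore them on the standby (with managed recovery stopped); a rough RMAN sketch with hypothetical file numbers and paths.
On the primary:
BACKUP AS COPY DATAFILE 5, 6, 7 FORMAT '/stage/df_%f.cpy';
Copy the files in /stage to the standby host, then on the standby:
CATALOG START WITH '/stage/';
RESTORE DATAFILE 5, 6, 7;
Afterwards restart managed recovery so the restored files catch up from the archived logs. From 12c onwards RMAN can also refresh standby datafiles directly over the network with RESTORE DATAFILE ... FROM SERVICE.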
View 1 Replies
View Related
May 13, 2004
I would like to extract some data values from Oracle to a text file, and I'm not sure how to set the delimiter between the column values.
SET echo off
SET space 0
SET pagesize 0
SPOOL a.txt
SELECT emp_id, name, add
FROM table1
/
SPOOL OFF
Where do I set the delimiter?
Can I do something like in SQL*Loader?
fields terminated by ',' enclosed by '"'
I would like the text file to be displayed as:
"123","ABCD","123 abc road"
"234","XYZ","234 xyz road"
View 5 Replies
View Related
Jun 12, 2013
I am trying to insert data into three tables from three CSV files simultaneously. This is what I have so far:
---insert all data from three csv files
DECLARE
--zenobject
F UTL_FILE.FILE_TYPE;
[Code]....
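Since the declaration uses UTL_FILE, here is a minimal sketch for one of the three files, assuming a directory object CSV_DIR, a simple two-column CSV and a hypothetical staging table; the same pattern would be repeated (or parameterised in a procedure) for the other two files:
DECLARE
  f    UTL_FILE.FILE_TYPE;
  line VARCHAR2(4000);
BEGIN
  f := UTL_FILE.FOPEN('CSV_DIR', 'zenobject.csv', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file reached
    END;
    -- split the line on commas and insert the pieces
    INSERT INTO zenobject_stage (col1, col2)
    VALUES (REGEXP_SUBSTR(line, '[^,]+', 1, 1),
            REGEXP_SUBSTR(line, '[^,]+', 1, 2));
  END LOOP;
  UTL_FILE.FCLOSE(f);
  COMMIT;
END;
/
For straightforward CSV files, external tables (or SQL*Loader) are often a simpler alternative to hand-written UTL_FILE loops.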
View 5 Replies
View Related
Dec 28, 2012
The DB can only be opened if all the datafiles and controlfiles are synchronized. I wonder, if the DB crashes and we don't have any kind of backup, is there any way to synchronize the controlfile with the datafiles? The DB is not idle when it crashes, and we can manage the data loss; we just want our database open.
View 1 Replies
View Related