Remote Migration Of Data
Oct 30, 2012
Is there any Oracle utility to do remote migration of data, or is it possible only with some third-party software?
I am trying to insert data into a table called tstcntr_mstr in the ibpslive instance, as the ibpslive user.
My source tables are on the ncfiiidv instance.
The query is as follows:
insert into tstcntr_mstr
  select * from tstcntr_mstr@dlink_ncfmdv;
The error I get is: ORA-22804: remote operations not permitted on object tables or user-defined type columns.
The table tstcntr_mstr@dlink_ncfmdv contains user-defined types. How can I carry out the migration of the data?
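A common workaround (a sketch only; the type attributes and column names below are assumptions) is to flatten the object-type columns into scalar columns on the remote side, for example with a view, and pull from that instead:

-- On the remote database (ncfiiidv): expand each attribute of the
-- hypothetical object column addr into a plain scalar column.
create or replace view tstcntr_mstr_flat as
select t.cntr_id,
       t.addr.street as addr_street,
       t.addr.city   as addr_city
from   tstcntr_mstr t;

-- On the local database (ibpslive): only scalar columns cross the link.
insert into tstcntr_mstr_stage (cntr_id, addr_street, addr_city)
select cntr_id, addr_street, addr_city
from   tstcntr_mstr_flat@dlink_ncfmdv;

The staged scalar columns can then be reassembled into the object column locally.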
We have our current production running on 9i and eventually want to migrate to 11gR2. The challenges are as follows:
The 9i production runs in a San Francisco data center. The 11gR2 production is already set up and running in an Atlanta data center.
The database is around 2 TB on 9i. We are looking to transfer this to 11gR2 and are wondering what options we have at our disposal. I was looking into EXP/IMP, but somebody said database links would be much faster and more reliable than EXP/IMP.
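For reference, a minimal sketch of both approaches (connect strings and table names are assumptions; note that Data Pump is not an option here, since expdp/impdp require 10g or later on both ends):

# Classic export on the 9i side, then import on the 11gR2 side
exp system/pwd@sf9i full=y file=prod_full.dmp log=exp.log
imp system/pwd@atl11g full=y file=prod_full.dmp log=imp.log

-- Or table by table over a database link, run from the 11gR2 side,
-- provided the two versions can still talk over SQL*Net:
create table big_table as select * from big_table@sf9i_link;

The dblink route avoids staging and shipping a 2 TB dump file, but needs per-table scripting and care with LOBs, constraints and indexes.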
We're planning a data migration from one application (Oracle-based) to another (also on an Oracle DB).
The origin is a roughly 80 GB database, so many millions of records are to be migrated. (Before loading records into the destination tables, they have to be transformed.)
The current concept is to receive all origin data as XML files, load them into a staging area (a dedicated migration schema in Oracle), then transform and load them into the destination tables.
We have three days for the whole migration (including extraction from the origin database, transformation, loading, and a backup after completion).
My question is whether a migration via XML files is a good concept. I think XML processing is much slower than doing the same with CSV files. My proposal to migrate via an Oracle dump (so that we would have the original data in our staging area) was declined.
Is migrating mass data with XML files workable, or are there performance or other issues?
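For comparison, a minimal sketch of both loading styles (directory, file and column names are assumptions). CSV goes through an external table or SQL*Loader with no parsing beyond field splitting, while each XML document must be parsed first, which is where the extra CPU cost comes from:

-- CSV via an external table: usually the fastest route into a staging area
create table stg_customer_ext (
  cust_id   number,
  cust_name varchar2(100)
)
organization external (
  type oracle_loader
  default directory mig_dir
  access parameters (fields terminated by ',')
  location ('customers.csv')
);

-- XML via XMLTABLE: every document is parsed before any rows appear
insert into stg_customer (cust_id, cust_name)
select x.cust_id, x.cust_name
from   xmltable('/customers/customer'
         passing xmltype(bfilename('mig_dir', 'customers.xml'),
                         nls_charset_id('AL32UTF8'))
         columns cust_id   number        path 'id',
                 cust_name varchar2(100) path 'name') x;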
I had a table with the columns below:
employee_id
payroll_pay_week1
payroll_pay_week2
payroll_pay_week3
...
payroll_pay_week26
I have created a new table so that, instead of 26 columns holding a payment amount for each week, there is one column for the pay amount and one column representing the week, as below:
employee_id
payroll_pay
payroll_pay_week
How do I migrate the data from the old table to the new table?
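A minimal sketch using the 11g UNPIVOT operator (old_payroll and new_payroll are hypothetical table names; by default rows where the pay amount is NULL are skipped):

insert into new_payroll (employee_id, payroll_pay, payroll_pay_week)
select employee_id, payroll_pay, payroll_pay_week
from   old_payroll
unpivot (payroll_pay for payroll_pay_week in (
           payroll_pay_week1  as 1,
           payroll_pay_week2  as 2,
           payroll_pay_week3  as 3,
           -- ... list weeks 4 through 25 the same way ...
           payroll_pay_week26 as 26));

On releases before 11g, the same effect needs a UNION ALL of 26 selects or a cross join against a 26-row generator.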
Please send me some sample data migration scripts so I can learn about data migration.
I am trying to load data into a table in a remote database schema, and my files reside on another remote server. The server holding the files does not have a database installed. I just need to know whether this is possible or not.
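It should be, since SQL*Loader reads files on the machine where it runs and ships the rows over SQL*Net; installing just the Oracle Client on the file server (or any host that can see the files) is enough. A sketch with hypothetical names:

-- control file load_emp.ctl
load data
infile '/data/exports/emp.csv'
into table emp_stage
fields terminated by ','
(emp_id, emp_name, salary)

# invoked on the file server, loading into the remote database
sqlldr userid=scott/tiger@remotedb control=load_emp.ctl log=load_emp.log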
View 2 Replies View RelatedI have one PLSQL package that does a join of two tables of remote database instance via the datalink. I just wonder where the most calculation(the join) is done, local machine or remote machine? Is there any best practice to have a better performance for such configuration.
I need to migrate data from a MySQL database to an Oracle 11g database.
a) Is there any method to export all the SQL from MySQL (table scripts, constraint scripts, data/insert scripts) so that we can apply it directly to the Oracle schema after making the necessary changes (such as datatypes)?
b) Is there any free tool available for the migration?
How does one migrate from ODI to Java, and how should this type of migration be approached?
I have a long file in Word. When I try to load it into Oracle, the quotes become periods, e.g.:
insert into mytab values ('myname', 26);
When I copy this and paste it into Oracle (UNIX environment), it comes through as insert into mytab values (.myname.,26) and the quotes are not recognized.
I tried copying from Word to Notepad and then into Oracle: same problem.
We have a requirement to migrate data in real time from a source DB to a target DB, as well as to a queue. Can we achieve this with a custom technique?
We tried exploring Streams with CDC, but with Streams deprecated and CDC removed in Oracle 12c, we are stuck.
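One custom approach (a sketch only; the type, queue, table and column names are assumptions, and Oracle GoldenGate is the supported successor to Streams) is to capture changes with a row trigger and enqueue them into Advanced Queuing, from which consumers can feed both the target DB and the external queue:

create type change_msg as object
  (tab varchar2(30), pk number, op varchar2(1));
/
begin
  dbms_aqadm.create_queue_table(queue_table        => 'chg_qt',
                                queue_payload_type => 'CHANGE_MSG');
  dbms_aqadm.create_queue(queue_name => 'chg_q', queue_table => 'chg_qt');
  dbms_aqadm.start_queue(queue_name => 'chg_q');
end;
/
create or replace trigger src_tab_cdc
after insert or update or delete on src_tab
for each row
declare
  l_opts  dbms_aq.enqueue_options_t;
  l_props dbms_aq.message_properties_t;
  l_msgid raw(16);
begin
  -- enqueue the table name, primary key and operation type
  dbms_aq.enqueue('chg_q', l_opts, l_props,
                  change_msg('SRC_TAB',
                             nvl(:new.id, :old.id),
                             case when inserting then 'I'
                                  when updating  then 'U'
                                  else 'D' end),
                  l_msgid);
end;
/

Row triggers add overhead to every DML statement on the source, so this only approximates real time; for heavy volumes GoldenGate or log mining is usually the safer route.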
I need to generate a report, using PL/SQL, that counts the rows of all the tables in the source and target databases. The report should consist of the following columns:
table name | source table row count | target table row count | mismatch. Can you provide the PL/SQL code?
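A minimal sketch, assuming a database link named target_db from the source to the target and identically named tables in both schemas:

set serveroutput on
declare
  v_src number;
  v_tgt number;
begin
  dbms_output.put_line('TABLE_NAME | SOURCE | TARGET | MISMATCH');
  for t in (select table_name from user_tables order by table_name) loop
    execute immediate 'select count(*) from ' || t.table_name into v_src;
    execute immediate 'select count(*) from ' || t.table_name || '@target_db'
      into v_tgt;
    dbms_output.put_line(t.table_name || ' | ' || v_src || ' | ' || v_tgt ||
                         ' | ' || case when v_src = v_tgt then 'N' else 'Y' end);
  end loop;
end;
/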
I need a way to FTP a file to a remote server by reading data from a table. I searched a couple of sites, which suggested using Chris's xutl_ftp package, but unfortunately the site is not accessible.
Here is the code I have so far:
CREATE OR REPLACE PACKAGE UTL_FTP
AUTHID CURRENT_USER
AS
/**
* LICENSE: GNU Lesser General Public License (LGPL)
* Copyright (C) 2003-2006 Russ Johnson (john_2885@yahoo.com)
[code].....
I need to spool data from a remote server to a local machine, using PuTTY (SQL*Plus). There are credentials I need to give before accessing the remote databases, and I am able to do that. I tried the script below, but the spool file (CSV or TXT) is not created on the local machine.
set colsep ,
set pagesize 120
set trimspool on
set headsep off
set linesize 1000
set numwidth 10
spool D:\testmyfile.csv
select table_name, tablespace_name from all_tables;
spool off
I got an assignment to create an Oracle 11g database. I will be provided the full Data Pump export dump of an Oracle 10g database on Linux, and I need to import it into the 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the database with the SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
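One way to see what the dump expects before importing (the directory, file and tablespace names below are assumptions) is to have impdp write the DDL to a script instead of executing it:

impdp system/pwd directory=dp_dir dumpfile=full10g.dmp full=y sqlfile=ddl_preview.sql

The generated ddl_preview.sql shows every tablespace and user the import will need; you can then pre-create them, or remap the unknown tablespaces onto the ones already built:

impdp system/pwd directory=dp_dir dumpfile=full10g.dmp full=y remap_tablespace=OLD_TS:USERS logfile=imp.log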
We are planning to migrate data from an application called Clintrace to another application called Argus Safety. Both applications relate to pharmacovigilance safety operations, and their functionality is similar, so both databases hold the same data even though the table structures may differ. Both databases are Oracle: the Clintrace DB is 9i and the Argus DB is 11g.
We have data migration scripts written for Oracle. The data is not huge, but we observe that the migration is fast in the development lab and about 5x slower at the production site.
The development Oracle setup is on Windows and the production setup on Solaris. I have attached the AWR report generated for a period in which the migration ran for 3 hours and was stopped due to slow performance.
Here is my initial analysis.
1) The top timed event is DB CPU. Hence I feel the migration scripts could be modified to run in parallel so that they finish faster. But then the question arises: why would it run faster in the development environment if this were the issue?
2) I tried increasing the following parameters:
a. large_pool_size to 512M (from 0)
b. sga_max_size to 8G (from 4G)
c. sga_target to 8G (from 4G)
I have attached the AWR report; below are the /etc/system contents for the Solaris settings.
* Begin MDD root info (do not edit)
rootdev:/pseudo/md@0:0,1,blk
* End MDD root info (do not edit)
set noexec_user_stack=1
set noexec_user_stack_log=1
* IBMdpo vpath_START (do not remove)
* default SCSI timeout is 60 seconds
* uncomment to change SCSI timeout
* set sd:sd_io_time=0x1e
forceload: drv/vpathdd
* IBMdpo vpath_END (do not remove)
set noexec_user_stack=1
set semsys:seminfo_semmni=100
set semsys:seminfo_semmns=1024
set semsys:seminfo_semmsl=256
set semsys:seminfo_semvmx=32767
set shmsys:shminfo_shmmax=4294967295
set shmsys:shminfo_shmmin=1
set shmsys:shminfo_shmmni=100
set shmsys:shminfo_shmseg=10
P.S. The AWR report was renamed from .html to .txt so the file could be uploaded.
Is it possible to migrate everything (tables, indexes) from unencrypted to encrypted tablespaces online, i.e. while the database is in use (DML)?
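On 11g, one online option is DBMS_REDEFINITION: DML continues against the original table while its contents are copied into an interim table created in the encrypted tablespace. A sketch with hypothetical names (the table needs a primary key for the default redefinition method):

-- Interim copy in the encrypted tablespace
create table scott.emp_enc tablespace enc_ts
as select * from scott.emp where 1 = 0;

declare
  v_errors pls_integer;
begin
  dbms_redefinition.can_redef_table('SCOTT', 'EMP');  -- raises if not redefinable
  dbms_redefinition.start_redef_table('SCOTT', 'EMP', 'EMP_ENC');
  dbms_redefinition.copy_table_dependents('SCOTT', 'EMP', 'EMP_ENC',
                                          num_errors => v_errors);
  dbms_redefinition.finish_redef_table('SCOTT', 'EMP', 'EMP_ENC');  -- brief lock here
end;
/

copy_table_dependents recreates the indexes, triggers and constraints on the interim table; each table would be redefined in turn.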
How can I migrate data from Oracle to MS SQL Server, or vice versa?
I came to know about 2 methods:
1) Using SQL Developer
2) Using ODBC.
I have Arabic data stored in an Oracle AL32UTF8 database in the two encodings below:
1 million rows in WE8MSWIN1252
0.5 million rows in AR8MSWIN1256
I would like to convert the 1 million WE8MSWIN1252 rows into AR8MSWIN1256. I could convert the encoding from 1252 to 1256 using SQL Developer, but had no luck with the Oracle export/import utilities (both exp and expdp). I'm thinking maybe a certain locale setting is required for export/import to work.
Also, my company says that SQL Developer, being a free utility, may not be supported by Oracle, so I should use export and import for this. I need to convert only one table.
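For a single table, an in-database sketch using the CONVERT function may be simpler than export/import (the table and column names are hypothetical, and the right source/target pair depends on how the bytes actually landed in the database, so verify on a copy first):

update arabic_data
set    text_col = convert(text_col, 'AR8MSWIN1256', 'WE8MSWIN1252')
where  enc_marker = '1252';   -- hypothetical column flagging the 1252 rows

CONVERT(value, dest_charset, source_charset) transcodes character data from the source to the destination character set without leaving the database.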
A similar case: [URL]
I have two servers, 'A' and 'B'. On server A there is a schema named "test" containing a table "t1". I want to import this t1 table to server B.
Is it possible to export a dump to a remote host using expdp?
I found that there is an option for this, "network_link". To test it, I created a dblink named "vxmldb" from server "B" to "A".
When I use the command below on server B, I get the following error:
C:\>expdp directory=data_pump_dir logfile=test.log network_link=vxmldb schemas=test dumpfile=test.dmp
Export: Release 11.1.0.6.0 - 64bit Production on Tuesday, 05 June, 2012 14:22:07
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: system/vxmldb@vxmldb
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39170: Schema expression 'TEST' does not correspond to any schemas.
In the above command:
directory ---------> location on server "B"
network_link ------> dblink name, created on server "B" to access server "A"
schemas -----------> schema to be exported; exists in the server "A" DB
username/password -> higher-level username/password for server "A"
@connectString ----> connects to server "A"
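For what it is worth, with NETWORK_LINK the expdp client is normally supposed to log in to the local database (server B) and let the database link reach server A; the schema lookup happens at the far end of the link, so the link must also connect as a user privileged to see TEST. A hedged sketch, where localdb_on_B is an assumed TNS alias for the database on server B:

C:\>expdp system/password@localdb_on_B directory=data_pump_dir logfile=test.log network_link=vxmldb schemas=test dumpfile=test.dmp

Logging the client in to server A directly, as in the original command, is one common cause of ORA-39170 in this setup.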
Is there a way to copy WF_EVENT_T type column data from a local database to a remote database?
I have upgraded an Oracle database from 9i to 11g using the export and import utilities. After the migration we are facing a performance issue in report generation: the first execution of a report takes a very long time, and when we run the same report 2-3 times the execution time improves considerably.
Two days back I restarted the database and saw the same issue. There are around 300 reports, and it is not feasible to run every report 2-3 times each time we restart the database.
We need to migrate our 10gR2 single-instance database on a conventional file system to a two-node 11gR2 RAC on ASM (on the same Windows Server platform).
How can I migrate my production database using Data Pump? I have a full Data Pump export, but I don't know how to import it: schema by schema, or a full import? Do I need to manually create the tablespaces on the destination first? Should I exclude the indexes, constraints and statistics?
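A common pattern (a sketch; the directory and names are assumptions) is a single full import with tablespace remapping, deferring statistics:

impdp system/pwd full=y directory=dp_dir dumpfile=prod_full.dmp logfile=imp.log exclude=statistics remap_tablespace=OLD_DATA:NEW_DATA

Tablespaces that are not remapped need to exist beforehand (a full import tries to create them with the original datafile paths, which rarely match a new ASM layout); indexes and constraints are usually left in so the import builds them itself, while statistics are often excluded and regathered afterwards.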
I have a situation where I want to configure a primary database (11.2) with two remote destinations. dest_2 is the default and points to a standby on host_2. However, I also want the primary DB to continue transporting redo to dest_3 on node_3 when node_2 is taken down (planned or unplanned). Two options I am considering (a sketch of the first follows the list):
1) Configure the ALTERNATE attribute of dest_2 to point to dest_3.
2) Configure tnsnames client-side failover on primary host to point to 2 nodes (node_2 and node_3).
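A sketch of option 1 with hypothetical service and db_unique_name values; the ALTERNATE attribute keeps dest_3 dormant until dest_2 fails:

alter system set log_archive_dest_2='service=stby_node2 async valid_for=(online_logfiles,primary_role) db_unique_name=stby2 alternate=log_archive_dest_3';
alter system set log_archive_dest_3='service=stby_node3 async valid_for=(online_logfiles,primary_role) db_unique_name=stby3';
alter system set log_archive_dest_state_3='alternate';

Option 2 (client-side TNS failover in the destination's service definition) can also work, but failover then depends on connect-time behaviour rather than on the destination state.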
I was considering a solution for maintaining a replicated copy of a database in a remote office. However, we are using the SE One edition of Oracle, so native Data Guard support is not available. There should be scripting solutions for this task, but I haven't found any to date.
It's been a while since I worked with SQL*Plus. I am using Oracle 11g. We are working on a legacy data migration project. I have a table containing records with circular dependencies, and I am trying to identify those records. I have the following columns: product, source, target. I want to identify the records that form a loop. For example:
Source Target
A B
B C
C D
D A
The last record closes a loop; I need to identify these records. My query is below:
SELECT DISTINCT source, target
FROM   ruleselib
WHERE  CONNECT_BY_ISCYCLE = 1
CONNECT BY NOCYCLE source = PRIOR target;
I ran this query on two tables, one with 75,000 records and the other with 25,000. On the 75,000-record table it completes within a minute, but on the other table it does not complete. Is there an issue with the query, or is something about the data causing it to run for so long?
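One observation: without a START WITH clause, every row is treated as a root, so the number of paths walked grows combinatorially with chain depth; densely connected data can make the smaller table far more expensive. A hedged variant that prunes the roots (only rows whose source also appears as a target can sit inside a cycle):

SELECT DISTINCT source, target
FROM   ruleselib
WHERE  CONNECT_BY_ISCYCLE = 1
CONNECT BY NOCYCLE source = PRIOR target
START WITH source IN (SELECT target FROM ruleselib);

This does not change which cycles are found, only how many redundant traversals the query performs.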
I have one prod server (11.1.0.6.0) on Windows 2003 R2 64-bit.
Server Name (PRODDB)
I do not have access to that prod server. I want to take a Data Pump export from my client machine, and due to a space issue on the prod server I want to keep the dump file on the client machine itself. I can take a traditional export and keep the dump file on my client machine, but I do not know how to achieve the same via Data Pump.
How can I generate the dump file on the client machine itself via Data Pump?
I have two DBs with the same tables, and I need to insert the data from the db1 tables into the db2 tables. The primary ID starts from 1 in both tables.
Example:
Table1:
ID Name c3 c4 c5
1 Oracle1 x y b
2 Oracle2 x n b
3 Oracle3 x f b
4 Oracle4 x f b
5 Oracle5 x f b
Table2:
ID Name c3 c4 c5
1 ONT1 t y t
2 ONT2 t m h
3 ONT2 t b n
All the table1 data should be moved to table2. The desired output should be like the following:
ID Name c3 c4 c5
1 ONT1 t y t
2 ONT2 t m h
3 ONT2 t b n
4 Oracle1 x y b
5 Oracle2 x n b
6 Oracle3 x f b
7 Oracle4 x f b
8 Oracle5 x f b
The data volume is huge: there are 1,500 tables, with on average 10 million records per table. Currently I am using DB links.
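A minimal per-table sketch over a database link (db1_link is an assumed link name, created in db2), offsetting the incoming IDs past the current maximum so they append cleanly after the existing rows:

insert /*+ append */ into table2 (id, name, c3, c4, c5)
select (select nvl(max(id), 0) from table2)
         + row_number() over (order by id) as id,   -- renumber past current max
       name, c3, c4, c5
from   table1@db1_link;
commit;

For 1,500 tables the statements would be generated from the data dictionary rather than written by hand; note also that renumbering the IDs silently breaks any foreign keys that reference them, so child tables would need the same offset applied.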