Performance Tuning :: Import And Export Of Database
Oct 14, 2010
I am doing import and export of a database. Before loading the data I drop all the tables and then import. Is there any issue if we drop tables and import data frequently?
Testing our 9i to 11g upgrade, we've imported the entire database onto the new machine. We've found that certain procedures are really suffering performance problems. But we've also found that if we check out a production copy of the procedure from our source code control and reinstall it, the performance issue goes away. Just altering the procedure and recompiling does NOT work.
The new machine where the 11g database lives is slightly different from the source, but it's not as if we have this problem with every procedure. It's only a couple.
Is there any possible reason that we'd have to reinstall a procedure to correct a performance problem?
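For reference, a minimal sketch of the two actions being contrasted (the procedure name is hypothetical):

-- recompiling in place reuses the stored source; this did NOT fix it
ALTER PROCEDURE app_owner.calc_totals COMPILE;

-- reinstalling from source control replaces the object; this DID fix it
CREATE OR REPLACE PROCEDURE app_owner.calc_totals AS
BEGIN
  NULL; -- the real body comes from the checked-out production source
END;
/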
A SQL query is taking far longer than usual and not completing even after being left for hours. The query joins a table with a quite complex view.
The same query in a test database completes in less than 2 minutes.
I would like to export the SQL plan from the test database to the prod database.
How do I export/import the execution plan of a particular SQL statement in version 10.2.0.4?
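SQL plan baselines only arrive in 11g, but in 10.2 a tuned plan can travel between databases as a SQL profile via DBMS_SQLTUNE staging tables. A sketch, assuming a profile named MY_PROFILE already exists on the test database (all names here are hypothetical):

-- on TEST: create a staging table and pack the profile into it
BEGIN
  DBMS_SQLTUNE.CREATE_STGTAB_SQLPROF(table_name => 'PROF_STG', schema_name => 'SCOTT');
  DBMS_SQLTUNE.PACK_STGTAB_SQLPROF(profile_name => 'MY_PROFILE',
                                   staging_table_name => 'PROF_STG',
                                   staging_schema_owner => 'SCOTT');
END;
/
-- move SCOTT.PROF_STG to PROD with exp/imp, then on PROD:
BEGIN
  DBMS_SQLTUNE.UNPACK_STGTAB_SQLPROF(replace => TRUE,
                                     staging_table_name => 'PROF_STG',
                                     staging_schema_owner => 'SCOTT');
END;
/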
1) Is there a way to skip database jobs while exporting (EXPDP)?
2) Is there a way to skip database jobs while importing (IMPDP)?
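Both utilities take an EXCLUDE parameter; EXCLUDE=JOB skips DBMS_JOB jobs (scheduler objects travel under the PROCOBJ path instead). A sketch, with the directory and file names assumed:

expdp system/*** schemas=scott exclude=JOB directory=dpump_dir dumpfile=scott.dmp logfile=scott_exp.log
impdp system/*** exclude=JOB directory=dpump_dir dumpfile=scott.dmp logfile=scott_imp.log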
I have an issue with export (expdp).
When I export a user with the expdp utility, the load on the server goes up to 5. The size of the database is 180GB. Below is the command that I use for the export.
expdp sys/xxxx directory=dbpdump dumpfile=expdp_trk_backup.dmp logfile=expdp_trk_backup.log exclude=statistics schemas=trk
Do I need to look into any memory parameters for this?
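Data Pump allocates its queues from the streams pool, so that is one memory parameter worth a look; the PARALLEL parameter controls how many workers the job spins up. A quick check, as a sketch:

SQL> show parameter streams_pool_size
SQL> show parameter sga_target

-- and, to keep the load down, run the job single-threaded:
expdp sys/xxxx directory=dbpdump dumpfile=expdp_trk_backup.dmp logfile=expdp_trk_backup.log exclude=statistics schemas=trk parallel=1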
I would like to export an entire DB's metadata, excluding the data. Is that possible? We have 100+ users, and we very often get requests to restore a package from their schemas, so I am thinking of creating a job to export the entire DB metadata.
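Data Pump supports exactly this with CONTENT=METADATA_ONLY; a sketch of a full metadata-only export (directory and file names assumed):

expdp system/*** full=y content=metadata_only directory=dpump_dir dumpfile=meta_full.dmp logfile=meta_full.log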
Looking to understand the difference between instance tuning and database tuning.
What is the difference between these two tuning exercises? I understand that an instance is the memory-based (logical) structures, whereas the database consists of the physical structures.
However, how does one tune the database, the physical structure? Does it have to do with file placement, block sizes, etc.? Would you agree that a lot of that is taken care of by ASM now in 11g? What tools are required/available (third party as well as Oracle-supplied) for these types of tuning scenarios?
A few days ago my database server lost access to its storage box; I rebooted it, and afterwards it worked fine. But now the DB import process is too slow. Previously a 100GB DB import completed within 10 hours with the server running normally. It has now been working for 2 days and is still not complete.
How do I investigate this issue? Maybe I missed increasing some parameters on the server or in Oracle?
Here is my server's brief info:
RAM: 16GB
Swap size: 16GB
CPU: 12 cores
SQL> show sga;
Total System Global Area 4294967296 bytes
Fixed Size 1984144 bytes
Variable Size 369105264 bytes
Database Buffers 3909091328 bytes
Redo Buffers 14786560 bytes
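As a starting point for the investigation, one hedged sketch is to look at what the import session is actually waiting on, which separates an I/O problem (plausible after a storage incident) from a memory or CPU one:

SELECT sid, event, wait_class, seconds_in_wait
  FROM v$session
 WHERE program LIKE '%imp%';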
1) I have 5 exported dump files.
2) All 5 dump files were taken at different time periods.
3) Many of the dump files contain the same partition records.
e.g.:
Dump 1: 01-06-2010 to 31-11-2010
Dump 2: 01-09-2010 to 31-12-2010
4) Now I want to import all that partition data into a single table, without any duplication.
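One sketch (table names are hypothetical): import each dump into its own staging table, then populate the final table with UNION, which removes duplicate rows:

-- after importing the five dumps into stg1 .. stg5
INSERT INTO target_partitioned_tab
SELECT * FROM stg1
UNION
SELECT * FROM stg2
UNION
SELECT * FROM stg3
UNION
SELECT * FROM stg4
UNION
SELECT * FROM stg5;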
Why does export/import require temporary tablespace? Since export/import behaves like DML, when does the Data Pump utility need temporary tablespace?
I need to refresh a PROD database into a TEST database. Both PROD and TEST run on 10g. I need a full refresh. Are there any prerequisites I should keep in mind?
There is a simple way to increase the performance of a query by reducing the number of rows in the table it hits. I used it in the past by dividing the table into smaller parts and querying the respective smaller table in each query.
What is this method called? I've just forgotten it and can't recall it. What is this type of row-reduction optimization called?
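What is being described sounds like horizontal partitioning, which Oracle implements natively as table partitioning; a minimal range-partitioning sketch (all names are hypothetical):

CREATE TABLE sales_part (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER
)
PARTITION BY RANGE (sale_date) (
  PARTITION p2010_h1 VALUES LESS THAN (TO_DATE('01-07-2010','DD-MM-YYYY')),
  PARTITION p2010_h2 VALUES LESS THAN (TO_DATE('01-01-2011','DD-MM-YYYY'))
);
-- queries filtering on sale_date read only the matching partition (partition pruning)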
I want to export a particular schema from a database and then import it into the same database, but saved under a different schema name.
And I need to use the imp/exp utility for it.
One schema with the data, the other one with just the metadata.
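With the original utilities this is OWNER on export and FROMUSER/TOUSER on import; ROWS=N gives the metadata-only copy. A sketch (schema and file names are hypothetical):

exp system/*** owner=SRC file=src.dmp log=src_exp.log
imp system/*** fromuser=SRC touser=COPY_DATA file=src.dmp log=imp_data.log
imp system/*** fromuser=SRC touser=COPY_META rows=n file=src.dmp log=imp_meta.log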
I need the steps to export/import a database, with options, using the exp/imp commands. I want to access the exported database's tables from my current user schema. I exported a database earlier, but I'm not able to access its tables directly; rather, I have to qualify them with the exported schema name, like abc.tablename.
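Two common ways to drop the schema prefix, as a sketch (abc.tablename taken from the post):

-- option 1: point the session at the other schema
ALTER SESSION SET CURRENT_SCHEMA = abc;
SELECT * FROM tablename;

-- option 2: create a synonym in your own schema (still needs SELECT granted on the table)
CREATE SYNONYM tablename FOR abc.tablename;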
How do I import and export a database in Oracle? And what is the difference between backup and restore?
How can I export FGA / row-level security policies from one database to another? I have created a new version of my school's ERP database, with upgraded application software, and now need to get the policies from our current production system onto the new one.
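One approach is to read the policy definitions out of the dictionary on the source and re-create them on the target with DBMS_RLS (and DBMS_FGA for the audit policies); a sketch for a single RLS policy, showing only a subset of the parameters (all names are hypothetical):

-- on the source, list what needs to move
SELECT object_owner, object_name, policy_name FROM dba_policies;

-- on the target, re-create each policy
BEGIN
  DBMS_RLS.ADD_POLICY(
    object_schema   => 'ERP',
    object_name     => 'STUDENTS',
    policy_name     => 'STUDENT_ACCESS',
    function_schema => 'SEC',
    policy_function => 'student_predicate');
END;
/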
Version 10.2.0.3 on Windows (old DB on Solaris, 8.1.7.0).
I have to refresh the database data. The number of users/schemas is 400+. The fastest way to do that would be a full exp/imp, but first dropping the current users with CASCADE (is there any command to drop all users in one go?), then validating that all tables/schemas are the same and up to date. I am thinking of checking the full exp logs on the old and new DBs (but that could take forever, manually going through thousands of tables, etc.).
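There is no single DROP ALL USERS command, but a PL/SQL loop over DBA_USERS can generate the drops; a sketch that deliberately skips the built-in accounts (the exclusion list is an assumption, extend it for your installation):

BEGIN
  FOR u IN (SELECT username FROM dba_users
             WHERE username NOT IN ('SYS','SYSTEM','OUTLN','DBSNMP','SYSMAN'))
  LOOP
    EXECUTE IMMEDIATE 'DROP USER ' || u.username || ' CASCADE';
  END LOOP;
END;
/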
I need to migrate a 10g database to 11gR2 on the same Red Hat Linux platform (although different servers with different versions of Linux). The difference between the two databases, however, is the SID; the new one has a different SID, which means the datafiles will be named differently (as our datafile names include the SID). Otherwise everything else is the same.
I propose to take the following steps:
- install 11gR2 on the new server
- create the 11gR2 database with the new SID using DBCA
- take a full export of the 10g database
- do a full import of the dump file into the 11gR2 database
I do not, however, have experience of how this will work with respect to the full import where the datafiles are named differently. For example, tablespace TEST in the source database has datafile TEST_SOURCE.dbf, but the same tablespace TEST in the target database will have datafile TEST_TARGET.dbf.
Will all the data in the source database be correctly imported into the new database?
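For what it's worth, exp/imp dump files reference tablespaces by name, not by datafile path, so the usual approach is to pre-create the tablespaces on the target with the new file names before the full import; a sketch (path and size are assumptions):

CREATE TABLESPACE test
  DATAFILE '/u01/oradata/NEWSID/TEST_TARGET.dbf' SIZE 500M AUTOEXTEND ON;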
I inherited a backup procedure described in
[URL]......
I am asking whether this procedure works on Oracle Database 10g R2. I have an Oracle DB 10g R2 on a Linux machine and I want to copy it to a Windows Oracle DB 10g R2, or, if that does not work, to another Linux machine.
I want to create two or three schemas on my production server which should be exact copies from my other (second) production server. I access this second server through a VPN connection in Toad 9.0.1, and I access my production server through VNC Viewer and the database through Toad.
How could I create the schemas on my first prod server from the second server?
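Since both servers are reachable over the network, one sketch is to run exp against the second server through its TNS alias, move the dump file, and imp it into the first (aliases and schema names are hypothetical):

exp system/***@second_prod owner=SCHEMA1,SCHEMA2 file=schemas.dmp log=exp.log
imp system/***@first_prod fromuser=SCHEMA1 touser=SCHEMA1 file=schemas.dmp log=imp.log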
How do I import and export a database in Oracle, with a proper example and procedure?
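As a minimal worked example with both generations of the tools (file names, directory object and connect strings are assumptions):

-- original utilities: full export, then full import
exp system/*** full=y file=full.dmp log=full_exp.log
imp system/*** full=y file=full.dmp log=full_imp.log

-- Data Pump (10g and later), using a server-side directory object
expdp system/*** full=y directory=dpump_dir dumpfile=full.dmp logfile=full_exp.log
impdp system/*** full=y directory=dpump_dir dumpfile=full.dmp logfile=full_imp.log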
Can I use impdp to restore to a different database name? If yes, what is the syntax? This is 10.2 on Linux.
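Dump files are not tied to the database name, so a plain import into the differently named database works; only schema or tablespace names need remapping, and only if they changed. A sketch (connect string, directory and file names assumed):

impdp system/***@newdb full=y directory=dpump_dir dumpfile=source.dmp logfile=imp.log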
I need to copy more than 1000 database users (without their objects) from Oracle 9i to Oracle 11g. We are not allowed to use any graphical tools. Which is the best way to complete this task? Does conventional export/import work for users only?
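One non-graphical sketch: let DBMS_METADATA on the 9i side generate the CREATE USER statements, spool them, and run the script on 11g (the exclusion list is an assumption):

SET LONG 100000 PAGESIZE 0
SPOOL create_users.sql
SELECT DBMS_METADATA.GET_DDL('USER', username)
  FROM dba_users
 WHERE username NOT IN ('SYS','SYSTEM');
SPOOL OFF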
I have a RedHat server with Oracle 8i Enterprise Edition. I did a full export of all my current databases (DCTEST, DCPROG, DCUSAC, DCCAND), and I now have the DC*.dmp files. How can I now import these files into Oracle Database 11gR2?
Do you have a step-by-step how-to? If I don't have the scripts to re-create the databases, how do I go forward?
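A newer imp can read dump files written by an older exp, so one sketch per database: create an empty 11gR2 database with DBCA, pre-create the tablespaces the dump expects, then run a full import (the path is an assumption):

imp system/*** full=y file=/backup/DCTEST.dmp log=DCTEST_imp.log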
I want to perform a full export + import of an Oracle 11g database as fast as possible. I was thinking of performing the exp + imp in the same command. With exp I can do something like this:
mknod /oracle/migration/exp_pipe p
exp '/ AS SYSDBA' file=/oracle/migration/exp_pipe full=y | imp system/***@oracle_db file=/oracle/migration/exp_pipe full=y
I know that I can do both actions with impdp when using a dblink, but the problem is that some objects in the database cannot be copied via a dblink. The question is whether there is a Data Pump command corresponding to the old exp+imp pipe trick presented above.
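There is no exact Data Pump equivalent of the pipe trick, because Data Pump writes its files server-side only; the closest one-command analogue is the NETWORK_LINK import already mentioned, which needs no dump file at all (the link name is hypothetical). Note that in 11g tables with LONG columns are among the objects that cannot travel over the link:

impdp system/*** full=y network_link=source_db logfile=net_imp.log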
I want to copy data from one Oracle database to another.
I have looked at the import/export utilities, but the problem is that the import utility doesn't support conflict-resolution techniques between rows.
For example, suppose a table in the source database has the same row key as one in the destination database. If I use the IGNORE parameter with value = y, the destination table will end up with duplicate rows.
I want to ask whether there's another way to import data from one Oracle database into another with some mechanism for detecting the conflicts and resolving them?
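One sketch: import into a staging table, then MERGE into the real table so that key collisions become updates instead of duplicates (all names are hypothetical):

MERGE INTO customers t
USING customers_stg s
   ON (t.customer_id = s.customer_id)
WHEN MATCHED THEN
  UPDATE SET t.name = s.name, t.balance = s.balance
WHEN NOT MATCHED THEN
  INSERT (customer_id, name, balance)
  VALUES (s.customer_id, s.name, s.balance);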
I'm studying SQL*Loader. All I've learned is that it needs:
1. One text input file
2. A control file
3. A bad file...
But I'm confused about where to put the input file, where to put the control file and in which format, and what I should write in the control file (see the sketch after the version listing below).
My Oracle version is:
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod
PL/SQL Release 10.2.0.3.0 - Production
CORE 10.2.0.3.0 Production
TNS for 32-bit Windows: Version 10.2.0.3.0 - Production
NLSRTL Version 10.2.0.3.0 - Production
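A minimal sketch: the data file and control file can live in any readable directory; you point sqlldr at them on the command line. Assuming a table EMP(EMPNO, ENAME) and files under C:\loader (every name here is an assumption):

-- emp.ctl
LOAD DATA
INFILE 'C:\loader\emp.dat'
APPEND INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename)

-- emp.dat
7839,KING
7782,CLARK

-- run it from the command prompt; the bad file is created automatically on rejects
sqlldr scott/tiger control=C:\loader\emp.ctl log=C:\loader\emp.log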
The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle Client 11g. The application is able to start a Data Pump export, but as a matter of fact the dump files are always written to the Linux server. The directory object is defined as DATA_PUMP_DIR (which is the default directory).
Now we are supposed to change the Data Pump export so that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting Data Pump from the client always raises the errors:
ORA-39002: ... ORA-39070: ... ORA-29283: ... ORA-06512: in "SYS.UTL_FILE", line 536 ORA-29283: ...
Is it possible at all to start a Data Pump export from a Windows client and write the dump files to the Windows server itself? Or are the dump files always written to the database server?
I am upgrading one of our 9i databases, which supports a 3rd-party application, to 11g. ***The vendor provided over-simplified documentation*** and recommends moving from 9i to 10g before going to 11g. There are a few changes from 9i to 10g:
1) db_block_size
2) character sets
etc.
Anyway, I created the database DBUPGTEST on 10.2.0.1 (ultimately moving to 11gR2, so no point patching to 10.2.0.5, is there?) with all the parameter changes. At this point, these are the 2 DBs in play:
Current production DB: Oracle 9i, dbname PROD => 2K (2048-byte) db block size
Current migration target: Oracle 10g, dbname DBUPGTEST => 8K (8192-byte) db block size
Steps
According to the vendor's notes/documentation:
1) create the db
2) exp full from 9i
3) imp full into 10g
Problems
1) The import ended with "completed unsuccessfully".
2) User accounts whose default tablespace is USERS are imported (because that tablespace had already been created during DB creation); but user accounts (schema accounts) with a different default tablespace are not imported.
Looking at the imp log, it seems to be complaining about the db_block_size during tablespace creation, which explains why those schema accounts are not imported: their tablespaces were never created.
My questions
1) How do I import into 10g? Can I create all the tablespaces in 10g first and then import? Will the import fail because they already exist, or will it go on to import the objects in the schemas?
2) How do I refresh the data from PROD? Remember, this is 9i, so most of the expdp functionality is not available. And I cannot simply re-exp and re-imp, because there are post-migration steps (SQL to run) after moving to 10g to fix some software-upgrade table mappings. If I re-exp from 9i and re-imp to 10g, won't I have to re-run all those steps before the application will run?
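On question 1, pre-creating the tablespaces (with 10g-appropriate storage) and then importing with IGNORE=Y is the classic pattern: the "already exists" errors are suppressed and imp carries on with the users and schema objects. A sketch (path, size and file names are assumptions):

-- on the 10g target, before the import
CREATE TABLESPACE app_data
  DATAFILE '/u01/oradata/DBUPGTEST/app_data01.dbf' SIZE 2G;

-- then the full import, ignoring creation errors for pre-built objects
imp system/*** full=y ignore=y file=prod_full.dmp log=prod_imp.log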
I have a .dmp file and I want to use the data in this file for my further practice, so I need to load the data from the .dmp file into a schema that exists in the database.
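A sketch for both possibilities, depending on which utility created the dump (schema, directory and file names are hypothetical):

-- if it is a classic exp dump
imp system/*** fromuser=OLDUSER touser=PRACTICE file=data.dmp log=imp.log

-- if it is a Data Pump dump
impdp system/*** remap_schema=OLDUSER:PRACTICE directory=dpump_dir dumpfile=data.dmp logfile=imp.log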