Server Utilities :: Export / Import Tables Based On Date Condition?
Jul 26, 2012
I'm trying to do an export/import process from the command prompt. The idea is to export records based on a date condition, with the dates supplied as parameters. My command looks like this:
exp <username>/<password>@<database> file=<table_name>.dmp tables=<source_table> query="where <date> between &start_date and &end_date"
Is it possible to do it like this, so that it prompts you to enter the start and end dates?
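exp itself doesn't perform SQL*Plus-style &variable substitution, so it won't prompt for &start_date and &end_date. A common workaround is a small shell wrapper that prompts for the dates and writes a parameter file; in the sketch below the user, table, and column names are placeholders, and the parfile sidesteps the shell-quoting pitfalls of QUERY.

#!/bin/sh
# Prompt for the date range, then pass it to exp via a parameter file.
echo "Enter start date (DD-MON-YYYY):"; read START_DATE
echo "Enter end date (DD-MON-YYYY):"; read END_DATE

cat > exp_range.par <<EOF
file=emp.dmp
tables=emp
query="where hire_date between to_date('${START_DATE}','DD-MON-YYYY') and to_date('${END_DATE}','DD-MON-YYYY')"
EOF

exp scott/tiger@orcl parfile=exp_range.par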
I was told to move 8 tables, along with their constraints, indexes, grants, rows, and triggers, from one database to another. I did an export and import for that. The command I used, and the export log, were as follows:
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified tables via Direct Path ...
. . exporting table                           tab1         12 rows exported
EXP-00091: Exporting questionable statistics.
EXP-00091: Exporting questionable statistics.
[code]...
Here is the import output log:

Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V10.02.01 via direct path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses WE8ISO8859P1 character set (possible charset conversion)
. importing JAM's objects into JAM
[code]...
Everything was imported successfully. Still, I have a doubt about the export and import commands: was the command I used correct, or does anything need to be added to it?
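For reference, a hedged sketch of table-mode exp/imp covering all of those object types (names are placeholders; GRANTS, INDEXES, CONSTRAINTS, TRIGGERS, and ROWS all default to Y in exp, so spelling them out is mostly for documentation):

exp jam/secret@srcdb tables=(tab1,tab2,tab3,tab4,tab5,tab6,tab7,tab8) \
    file=tabs.dmp log=tabs_exp.log \
    grants=y indexes=y constraints=y triggers=y rows=y

imp jam/secret@tgtdb file=tabs.dmp log=tabs_imp.log full=y ignore=y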
The tables have the same column names but different index structures, and the target tables are to be partitioned, hence I only want to import the content. Each table on the source database has a sequence-number column, and I only want to extract the last few months of data.
My draft parameters:

TABLES=table1,table2,...
DUMPFILE=dump_dir
CONTENT=data_only
QUERY=table1:"WHERE seq_num > 100"

I want to use expdp but am not sure how to ensure all tables get the WHERE seq_num > 100 condition. If I leave the table1: prefix out and just have QUERY="WHERE seq_num > 100", will the condition be applied to all tables, which is what we want?
I'm assuming I can also use impdp with CONTENT=data_only?
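To the QUERY question: per the Data Pump documentation, a QUERY clause without a schema.table prefix is applied to every table in the job, so dropping the table1: prefix does what you want. And yes, impdp also accepts CONTENT=data_only. A hedged parfile sketch (directory and file names are placeholders; note that DIRECTORY and DUMPFILE are separate parameters):

# expdp parfile: the unprefixed QUERY applies to all tables in the job
TABLES=table1,table2
DIRECTORY=dump_dir
DUMPFILE=recent.dmp
CONTENT=data_only
QUERY="WHERE seq_num > 100"

# impdp side, loading rows only into the pre-created partitioned tables:
impdp user/pass DIRECTORY=dump_dir DUMPFILE=recent.dmp CONTENT=data_only TABLE_EXISTS_ACTION=append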
I was asked to export/import some schemas from 10g (Linux) to 11g (AIX) using the original export/import method. I did not consider the character set and started doing the export and import. While exporting, I get the questionable statistics error in the export log file. In the import log, I see an error like CREATE DATABASE LINK "xxxxxxxxxxxxx" CONNECT TO "xxxx" IDENTIFIED BY...
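On the EXP-00091 warnings: they typically appear when the export client's NLS_LANG character set doesn't match the database character set. Either set NLS_LANG to match before exporting, or export with STATISTICS=NONE and gather statistics afresh after the import. A sketch with example values (substitute your database's actual character set and names):

export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
exp system/manager@src10g owner=app file=app.dmp log=app_exp.log statistics=none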
How can I monitor an export or import job, and how can I increase its performance?
Can I monitor the export/import job by checking the log and dump files it creates, and can its performance be increased by configuring parallelism? Am I right or not?
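Roughly right. For original exp/imp, the log file is the main monitoring tool and there is no parallel option; Data Pump jobs can additionally be watched from the dictionary, re-attached to interactively, and parallelized. A hedged sketch (job, directory, and file names are placeholders):

-- From another session, list running Data Pump jobs:
SELECT owner_name, job_name, operation, state
FROM dba_datapump_jobs;

# Parallel export; %U numbers one dump file per worker:
expdp system/manager FULL=y DIRECTORY=dump_dir DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=full_exp.log

# Re-attach to the running job and ask for progress:
expdp system/manager ATTACH=SYS_EXPORT_FULL_01
Export> STATUS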
I would like to run a daily job that exports table data from a SQL Server table and imports it into an Oracle table. I might also need to run a query to update a flag in the SQL Server table once the job is done. How can I do this, using either SQL Server or Oracle?
We have Oracle 9.2 and SQL Server 2005.
Normally I work from a flat file or CSV file generated by a developer or user at the source (not me), and I load it into Oracle using SQL*Loader. This time I have to extract/export the data directly from MS SQL Server and load it into an Oracle table; mostly it will be a full reload, so I may need to do some data massaging during the load.
Does SQL*Loader have any facility to use a data source to connect to MS SQL Server, fetch the data, and insert it into Oracle? I have access to SQL Server, but I don't know how to do this from SQL Server, or even from Oracle, and I have to schedule it because it will be a daily job.
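SQL*Loader itself can't connect to SQL Server; it only reads files. Two common routes are a Heterogeneous Services / Transparent Gateway database link from Oracle to SQL Server, or a scheduled script that exports with SQL Server's bcp utility and loads with SQL*Loader. A hedged sketch of the bcp route, run where the SQL Server client tools are installed (hosts, credentials, object names, and the transferred flag column are all placeholders):

#!/bin/sh
# Daily job: dump the SQL Server table to CSV, load it into Oracle,
# then set the flag on the SQL Server side.
bcp mydb.dbo.src_table out /tmp/src_table.csv -c -t',' -S sqlhost -U sqluser -P sqlpass
sqlldr scott/tiger@orcl control=src_table.ctl data=/tmp/src_table.csv log=src_table.log
sqlcmd -S sqlhost -U sqluser -P sqlpass -d mydb -Q "UPDATE dbo.src_table SET transferred = 1"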
We are doing a daily cold backup. Due to lack of disk space, we couldn't do a hot backup, but we want our database to be up while backups run. Since only export/import is possible in our scenario, clarify a few queries:
1) The export is done from the live server during the off-peak period.
2) We have a development server on which we have to update our database daily. Can I overwrite the development server using IMPORT daily? Since this import might show lots of errors (object already exists), what parameters can I use for the import?
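Yes, a daily refresh via import is workable. IGNORE=Y suppresses the "object already exists" errors, but imp then inserts into the existing tables, so truncate or drop the development copies first or the rows will duplicate. A hedged sketch (names are placeholders):

# After truncating (or dropping and letting imp recreate) the dev objects:
imp system/manager@devdb file=prod_daily.dmp log=prod_imp.log fromuser=appuser touser=appuser ignore=y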
I want to create two or three schemas on my production server which should be exact copies of schemas on my second production server. I access this second server through a VPN connection in TOAD 9.0.1, and I access my first production server through VNC Viewer, with the database through TOAD.
How could I create the schemas on my first production server from the second server?
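One hedged route (all names are placeholders): take a schema-mode export from the second server, transfer the dump, and import it on the first. Both commands can run from any machine with the Oracle client tools installed, connecting over the network.

exp system/manager@prod2 owner=(app1,app2) file=app_schemas.dmp log=app_exp.log

# copy app_schemas.dmp to a host that can reach the first server, then
# (the target users must already exist on prod1):
imp system/manager@prod1 file=app_schemas.dmp log=app_imp.log fromuser=(app1,app2) touser=(app1,app2)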
Is it also possible to use the EXPORT and IMPORT utilities from a client machine? I want to give these utilities to one of my developers without allowing him to sit in front of my Oracle server.
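Yes. The original exp and imp binaries ship with the Oracle Client installation, connect through a normal TNS connect string, and write the dump file on the client machine, so a developer's PC works fine (names below are placeholders). Note that Data Pump's expdp/impdp differ here: their dump files are always written on the server, via a directory object.

# From the developer's machine, with the Oracle Client installed:
exp devuser/devpass@PRODDB tables=(orders) file=orders.dmp log=orders_exp.log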
1) I want to perform an export from a production schema and import the results into a test schema, BUT I do not want to export ALL objects from production (only a subset of tables). Is this possible? Any documentation on how to do this (rather than a complete export and then a complete import)? (A sketch follows after question 3.)
2) I have two test instances of Oracle on the same development server, UNIT and SIT. I am using the Oracle SQL Developer tool. While in the UNIT instance, is there a way to select data from the SIT instance? An example of the syntax to use?
3) Can tables in the UNIT instance be compared to tables in the SIT instance through any existing Oracle utilities?
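For question 1, a table-mode export of just the wanted subset does the job; for questions 2 and 3, a database link from UNIT to SIT covers both cross-instance selects and comparisons. Hedged sketches (users, links, and table names are placeholders):

# (1) Export a subset of production tables, import into the test schema:
exp produser/prodpass@PROD tables=(orders,customers,items) file=subset.dmp log=subset_exp.log
imp testuser/testpass@TEST file=subset.dmp fromuser=produser touser=testuser log=subset_imp.log

-- (2) In UNIT, create a link to SIT and query across it:
CREATE DATABASE LINK sit_link CONNECT TO situser IDENTIFIED BY sitpass USING 'SIT';
SELECT * FROM orders@sit_link;

-- (3) Compare: rows present in UNIT but missing from SIT
-- (swap the operands for the reverse direction):
SELECT * FROM orders
MINUS
SELECT * FROM orders@sit_link;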
In Oracle 11gR2, I face a problem when I export/import XML records in tables. Every time, it takes a huge amount of time (2 or 3 days) to import or export the data, yet the dump is very small (4 GB). The dump comes from a vendor, so I can't tell whether it is a data-structure problem or whether this is normal behaviour when importing XML records.
I have not worked with XML records in an Oracle database before. I am using the Data Pump feature here. I should also mention that when I drop the schema into which I imported the XML data, that likewise takes 2-3 days. The import hangs at the following stage:
Import: Release 11.2.0.1.0 - Production on Sun Jan 29 11:10:38 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/******** DIRECTORY=isb_dir DUMPFILE=jblbld_20110301_01.dmp,jblbld_20110301_02.dmp,jblbld_20110301_03.dmp logfile=JBLBLD_28Jan2012.log schemas=TBLD
Processing object type SCHEMA_EXPORT/USER
[code]...
The machine has 16 CPUs and 32 GB of RAM, and during the export/import most of the memory remains free.
I am new to Oracle DBA work and am facing one problem. I exported and imported my data from one Oracle DB to another using exp/imp. In both the exp and imp log files, it shows the following messages at the end:
"Import terminated successfully with warnings." "Export terminated successfully with warnings."
But when I count, some rows are missing in some tables.
What could be the cause? Is there any other way to cross-check whether the export/import was successful?
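"Terminated successfully with warnings" means the log itemizes each problem above that summary line, so first search both logs for EXP- and IMP- messages. A hedged cross-check is to count rows per table on both sides and diff the output; the DBMS_XMLGEN trick below returns exact counts in one query (NUM_ROWS in the dictionary is only as fresh as the last statistics run, so live counts are safer):

-- Run as the schema owner on both databases, then compare the output:
SELECT table_name,
       TO_NUMBER(EXTRACTVALUE(XMLTYPE(
           DBMS_XMLGEN.GETXML('SELECT COUNT(*) c FROM "' || table_name || '"')),
           '/ROWSET/ROW/C')) AS row_count
FROM user_tables
ORDER BY table_name;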
Send me the command for exporting multiple tables (1000+) in a Linux environment on a 9i DB. I know we can build it using the SPOOL command, but I don't know exactly how to put it together. I know how to do this using Data Pump, but this is 9i.
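A hedged sketch of the SPOOL approach (schema and file names are placeholders): spool one table name per line, fold the list into a one-line parameter file from the shell, and run exp against it. If all 1000+ tables belong to a single schema, owner=<schema> avoids the list entirely, and a very long TABLES list may still need splitting across a few runs.

-- In SQL*Plus, as the schema owner:
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 TRIMSPOOL ON
SPOOL /tmp/tabs.lst
SELECT table_name || ',' FROM user_tables ORDER BY table_name;
SPOOL OFF

# Then in the shell:
{
  echo "file=alltabs.dmp"
  echo "log=alltabs.log"
  echo "tables=($(tr -d ' \n' < /tmp/tabs.lst | sed 's/,$//'))"
} > exp_tabs.par
exp appuser/password parfile=exp_tabs.par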
I want to import a dump file without 2 of its tables. The dump file contains 100 tables plus indexes and constraints, so out of the 100 tables I want to import 98.
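Original imp has no exclude option, so with exp/imp you would have to list the 98 wanted tables in TABLES=. With Data Pump the two unwanted tables can be excluded directly; a hedged sketch (directory, file, and table names are placeholders; a parfile avoids shell-escaping the filter):

# imp_excl.par
DIRECTORY=dump_dir
DUMPFILE=full.dmp
LOGFILE=imp_excl.log
EXCLUDE=TABLE:"IN ('TAB_A','TAB_B')"

# then:
impdp system/manager parfile=imp_excl.par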
There's an Application Express application which is based on a schema named TRAFOGLED. In order to let users test new features, there's a test application (Apex has export/import capabilities; no problem about that) which is based on another schema whose name is TRAFOTEST.
I'd like to export TRAFOGLED and import it into TRAFOTEST. I'm using the 10gR2 EXPDP utility with a parameter file. Everything seems to be OK, except that I'm unable to export global temporary tables (GTTs). How can I tell? I didn't see them after the import!
These are my GTTs:

SQL> show user
USER is "TRAFOGLED"
SQL>
SQL> select table_name from user_tables where temporary = 'Y';
[code]...
C:\TEMP>

No tables were exported. Certainly, I don't expect any data to be exported, but I'd be happy with the CREATE TABLE statements so that I don't have to create these tables separately.
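Whatever the export job does with the GTTs, a hedged workaround is to pull their DDL with DBMS_METADATA and run the generated CREATE GLOBAL TEMPORARY TABLE statements in TRAFOTEST:

-- As TRAFOGLED, in SQL*Plus:
SET LONG 100000 PAGESIZE 0
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name)
FROM user_tables
WHERE temporary = 'Y';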
Are there any other utilities we can use to load data from our PROD server (10g) into our DEV server (9i)? I've read in related topics here that it's not possible to import from a HIGHER to a LOWER version of Oracle. We've tried (many times) exporting selected tables from the 10g DB and then importing into the 9i DB, and we haven't succeeded. PROD and DEV have different schemas/owners but the same table structures.
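The usual trick for a higher-to-lower move is to run the lower version's exp client against the higher-version database: a 9i exp connecting over SQL*Net to the 10g instance produces a dump that 9i imp will read. A hedged sketch (names are placeholders):

# On the 9i server, using its own exp binary against the 10g database:
exp produser/prodpass@PROD10G tables=(orders,customers) file=prod_tabs.dmp log=prod_exp.log

# Then import locally into 9i, remapping the owner:
imp devuser/devpass@DEV9I file=prod_tabs.dmp fromuser=produser touser=devuser log=prod_imp.log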
I have two queries that return the same columns from two different sets of tables (the column mapping has been taken care of). The return type is an OUT ref cursor (P_SUPPLY_REORDER).

Query 2 (Xcom):

select null as sMO_NO,
       xso.created_date as SPLY_ORD_DT,
       xso.fk_cust_id as cust_id,
       cust.cust_po_no as cust_PO_NO,
       (sta.SHIP_TO_ATTN_FIRST_NAME || ' ' || sta.SHIP_TO_ATTN_LAST_NAME) as attention_name,
       xsol.CARTONS_ORDERED as SPLY_ORD_QTY,
[code].......
Now the requirement: one of four conditions is possible for each Supply Reorder Number:

- Both table queries return no records: populate all the P_SUPPLY_REORDER output fields with nulls.
- SUPPLY_ORDER returns a record, but XCOM_ORDER_HEADER returns no records: populate the output fields from the join of SUPPLY_ORDER and SUPPLY_ORDER_LINE.
- SUPPLY_ORDER returns no records, but XCOM_ORDER_HEADER returns one record: populate the output fields from the join of XCOM_ORDER_HEADER and XCOM_ORDER_LINES.
- Both SUPPLY_ORDER and XCOM_ORDER_HEADER return a record: find the latest order by comparing max(SPLY_ORD_DT) from SUPPLY_ORDER with max(CREATED_DATE) from XCOM_ORDER_HEADER.
  - If the latest order is in SUPPLY_ORDER, populate the output fields from the join of SUPPLY_ORDER and SUPPLY_ORDER_LINE.
  - If the order dates are equal in both join results, populate the output fields from the join of SUPPLY_ORDER and SUPPLY_ORDER_LINE.
  - If the latest order is in XCOM_ORDER_HEADER, populate the output fields from the join of XCOM_ORDER_HEADER and XCOM_ORDER_LINES.
The question is how to switch between the queries to pull the respective dataset based on these conditions (checking which table join will return a row and, when both do, choosing by the latest order), with all of this logic in a single SQL statement returned through the OUT ref cursor. One pattern is sketched below.
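A hedged pattern is a UNION ALL in which each branch guards itself with the max-date comparison, so exactly one branch emits rows. The sketch below uses abbreviated column lists, invents a join key (ord_no), and ignores the per-Supply-Reorder-Number filter for brevity; only the table names and the two date columns come from the post.

OPEN p_supply_reorder FOR
  -- Branch 1: SUPPLY_ORDER has rows and its latest date wins (ties included)
  SELECT so.smo_no, sol.qty                      -- placeholder columns
  FROM   supply_order so
  JOIN   supply_order_line sol ON sol.ord_no = so.ord_no
  WHERE  (SELECT MAX(s.sply_ord_dt) FROM supply_order s) >=
         NVL((SELECT MAX(x.created_date) FROM xcom_order_header x),
             DATE '0001-01-01')
  UNION ALL
  -- Branch 2: XCOM_ORDER_HEADER is strictly newer, or SUPPLY_ORDER is empty
  SELECT NULL, xol.cartons_ordered
  FROM   xcom_order_header xoh
  JOIN   xcom_order_lines xol ON xol.ord_no = xoh.ord_no
  WHERE  NVL((SELECT MAX(s.sply_ord_dt) FROM supply_order s),
             DATE '0001-01-01') <
         (SELECT MAX(x.created_date) FROM xcom_order_header x)
  UNION ALL
  -- Branch 3: both empty -> one all-null row
  SELECT NULL, NULL
  FROM   dual
  WHERE  NOT EXISTS (SELECT 1 FROM supply_order)
  AND    NOT EXISTS (SELECT 1 FROM xcom_order_header);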
I have a list of 500 tables and want to delete data from them based on a condition (data from before 2008 needs to be deleted). Each table has a column on which the deletion condition is based. Provide code that does this efficiently and fast; BULK COLLECT is preferable.
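For a bulk purge like this, a plain DELETE per table driven by dynamic SQL is usually simpler and faster than fetching keys with BULK COLLECT and deleting row by row. A hedged sketch, assuming a driver table purge_list(table_name, date_column) that you populate with the 500 table/column pairs:

BEGIN
  FOR t IN (SELECT table_name, date_column FROM purge_list) LOOP
    EXECUTE IMMEDIATE
      'DELETE FROM ' || t.table_name ||
      ' WHERE '      || t.date_column || ' < DATE ''2008-01-01''';
    COMMIT;  -- per-table commit keeps undo usage bounded
  END LOOP;
END;
/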