Datapump - Importing With Inconsistent Table Structure
Apr 30, 2013
I am receiving two large export files from a vendor, so I have no control over the contents. I need to import these into our database. The two export files are very similar, except that one has slightly different columns in it. So, export file 1 may have a table:
COLUMN_A
COLUMN_B
COLUMN_C
The second file may have:
COLUMN_A
COLUMN_B
COLUMN_D
At the destination, I have a table that has:
COLUMN_A
COLUMN_B
COLUMN_C
COLUMN_D
Is there a parameter that would let me interchangeably import either (or both) files into this destination table? This is my first attempt at Data Pump, but I know this has caused me issues with the original import utility. Do the same limitations exist? Will the missing columns cause it to fail?
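As far as I know there is no impdp parameter that maps dump-file columns onto a different destination column list, so a safe workaround is to land each file in its own staging table and then merge with plain SQL. A sketch only (the dump table name SRC_TAB, the staging name SRC_TAB_STAGE, the destination DEST_TAB and the directory DP_DIR are all illustrative, and REMAP_TABLE needs 11g or later):

impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=vendor_file1.dmp TABLES=src_tab REMAP_TABLE=src_tab:src_tab_stage

-- The staging table carries only COLUMN_A/B/C from file 1;
-- COLUMN_D is simply left NULL in the destination.
INSERT INTO dest_tab (column_a, column_b, column_c)
SELECT column_a, column_b, column_c
  FROM src_tab_stage;

File 2 would go through a second staging table the same way, inserting COLUMN_A, COLUMN_B and COLUMN_D instead.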
-- this is for transaction details
CREATE TABLE txn_det (
  txnid   VARCHAR2(10) PRIMARY KEY,  -- ids such as 'tx0045' are strings, so VARCHAR2 rather than NUMBER
  amount  NUMBER,
  status  VARCHAR2(50),
  cust_id NUMBER
);

-- this is for customer details
CREATE TABLE cust_det (
  cust_id   NUMBER PRIMARY KEY,
  cust_name VARCHAR2(50),
  cust_acc  NUMBER(15)
);

-- data to insert for the customer table
INSERT INTO cust_det VALUES (101, 'Miller', 12345);
INSERT INTO cust_det VALUES (201, 'Scott', 45678);

-- data to insert for the txn table
INSERT INTO txn_det VALUES ('tx0045', 123.00, 'success', 101);
INSERT INTO txn_det VALUES ('tx0046', 4512.50, 'success', 101);
INSERT INTO txn_det VALUES ('tx0049', 78.12, 'success', 101);
INSERT INTO txn_det VALUES ('tx0055', 123.12, 'success', 201);
Now the problem is that the cust_det table's cust_id column may contain duplicates, so I thought of adding the txn_id column to the cust_det table, but I know that encourages redundancy.
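For what it's worth, with cust_id as the primary key of cust_det it cannot hold duplicates, and the transactions already point at the customer through txn_det.cust_id, so a join over the tables above returns both sides without copying txn_id into cust_det. A small sketch:

SELECT t.txnid, t.amount, t.status, c.cust_name, c.cust_acc
  FROM txn_det  t
  JOIN cust_det c ON c.cust_id = t.cust_id;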
How do I get a table's structure in Oracle? Actually I got it through this command: "SELECT dbms_metadata.get_ddl(a.object_type, a.object_name) FROM user_objects a WHERE object_type != 'PACKAGE BODY'"
Is there any other way to get it? I need something like: table name, field name, datatype.
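If only the column list is needed rather than full DDL, the data dictionary already has it. A simple sketch against USER_TAB_COLUMNS:

SELECT table_name, column_name, data_type, data_length, data_precision, data_scale, nullable
  FROM user_tab_columns
 ORDER BY table_name, column_id;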
create table my_rows (
  my_env varchar2(100),
  a      number(2),
  b      number(2)
)
/
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
insert into my_rows values ('A', 10, 20);
[code]....
The first row means that the value 10 represents 40% in the couple (10,20). Meaning if I have 100 rows with the couple (10,20), 40 rows will be marked with the value 10 and 60 will be marked with the value 20. To do this, I used to create a temporary table with the same structure as the my_rows table plus a new column "the_value", and I used to update this new column with a PL/SQL FOR loop. But I think it is doable in a single SQL statement.
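One way to do it in a single SQL statement is with analytic functions: number the rows in each (my_env, a, b) group and give the first 40% the value of A and the rest the value of B. A sketch with the 40% ratio hard-coded (in the real table the ratio presumably comes from a column lost in the truncated script above):

SELECT r.my_env, r.a, r.b,
       CASE
         WHEN ROW_NUMBER() OVER (PARTITION BY r.my_env, r.a, r.b ORDER BY ROWID)
              <= ROUND(COUNT(*) OVER (PARTITION BY r.my_env, r.a, r.b) * 0.4)
         THEN r.a
         ELSE r.b
       END AS the_value
  FROM my_rows r;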
I have a problem with DBMS_DATAPUMP.metadata_filter. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i...). Let's suppose that the list is dynamic, so I do NOT want to use
DBMS_DATAPUMP.metadata_filter (handle      => h1,
                               name        => 'NAME_EXPR',
                               value       => 'IN (''a'', ''b'', ...)',
                               object_type => NULL);

and would rather pass the list through a subquery, like this:

DBMS_DATAPUMP.metadata_filter (handle      => h1,
                               name        => 'NAME_EXPR',
                               value       => 'IN (SELECT a.export_object_name
                                                     FROM my_export_table a, user_objects b
                                                    WHERE a.export_object_name = b.object_name
                                                      AND b.object_type = ''TABLE'')',
                               object_type => NULL);

but it results in an error:
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
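The ORA-00942 here typically means the NAME_EXPR subquery is being evaluated by the Data Pump worker process, which may not be able to resolve my_export_table. A workaround sketch (directory, dump file and job setup are omitted and left as in the original job; LISTAGG needs 11gR2 or later, older releases can build the string in a loop) is to resolve the list in the calling session and pass it as a literal:

DECLARE
  h1     NUMBER;
  v_list VARCHAR2(32767);
BEGIN
  -- Build the quoted, comma-separated list in this session,
  -- so the worker never has to query my_export_table itself.
  SELECT LISTAGG('''' || a.export_object_name || '''', ',')
           WITHIN GROUP (ORDER BY a.export_object_name)
    INTO v_list
    FROM my_export_table a, user_objects b
   WHERE a.export_object_name = b.object_name
     AND b.object_type = 'TABLE';

  h1 := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');

  DBMS_DATAPUMP.metadata_filter(handle => h1,
                                name   => 'NAME_EXPR',
                                value  => 'IN (' || v_list || ')');

  -- add_file / start_job / wait_for_job calls go here, unchanged from the original job.
END;
/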
A table's structure is modified every now and then, and because of this a few packages become invalid. Is there any way to monitor which user has changed the table structure? Can a trigger be created on certain columns for table structure changes?
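One common approach is a schema-level DDL trigger that records who altered what. A minimal sketch, assuming a log table (ddl_audit_log, a hypothetical name) is created first:

CREATE TABLE ddl_audit_log (
  ddl_date    DATE,
  os_user     VARCHAR2(100),
  db_user     VARCHAR2(100),
  host        VARCHAR2(100),
  ddl_type    VARCHAR2(30),
  object_type VARCHAR2(30),
  object_name VARCHAR2(128)
);

CREATE OR REPLACE TRIGGER trg_audit_ddl
  AFTER CREATE OR ALTER OR DROP ON SCHEMA
BEGIN
  INSERT INTO ddl_audit_log
  VALUES (SYSDATE,
          SYS_CONTEXT('USERENV', 'OS_USER'),
          ora_login_user,
          SYS_CONTEXT('USERENV', 'HOST'),
          ora_sysevent,          -- ALTER / CREATE / DROP
          ora_dict_obj_type,     -- TABLE, INDEX, ...
          ora_dict_obj_name);
END;
/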
SQL> desc MXMS_BF_TXN_DTL_T
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 DOC_NO                                    NOT NULL VARCHAR2(200)
 SEQ_NO                                    NOT NULL NUMBER(24)
 GL_CODE                                             VARCHAR2(200)
 TXN_NATURE                                          VARCHAR2(200)
 TXN_TYPE_CODE                                        VARCHAR2(200)
[code].....
I need to collect the new and old data whenever an update statement fires on the DOC_NO, POLICY_KEY or CRT_USER columns. I have created only an audit table for the above, with the structure below.
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 TIMESTAMP                                            DATE
 WHO                                                  VARCHAR2(30)
 CNAME                                                VARCHAR2(30)
 OLD                                                  VARCHAR2(2000)
 NEW                                                  VARCHAR2(2000)
Description: TIMESTAMP is when the modification happened, WHO is the username, CNAME is the column which was modified, OLD is the old value of the modified column, and NEW is the new value of the modified column.
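A row-level trigger can populate that audit table. A sketch, assuming the audit table is named txn_dtl_audit (a hypothetical name) and that POLICY_KEY and CRT_USER are among the columns in the truncated part of the describe above:

CREATE OR REPLACE TRIGGER trg_aud_mxms_bf_txn_dtl
  AFTER UPDATE OF doc_no, policy_key, crt_user ON mxms_bf_txn_dtl_t
  FOR EACH ROW
BEGIN
  IF UPDATING('DOC_NO') THEN
    INSERT INTO txn_dtl_audit (timestamp, who, cname, old, new)
    VALUES (SYSDATE, USER, 'DOC_NO', :old.doc_no, :new.doc_no);
  END IF;

  IF UPDATING('POLICY_KEY') THEN
    INSERT INTO txn_dtl_audit (timestamp, who, cname, old, new)
    VALUES (SYSDATE, USER, 'POLICY_KEY', :old.policy_key, :new.policy_key);
  END IF;

  IF UPDATING('CRT_USER') THEN
    INSERT INTO txn_dtl_audit (timestamp, who, cname, old, new)
    VALUES (SYSDATE, USER, 'CRT_USER', :old.crt_user, :new.crt_user);
  END IF;
END;
/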
I have a small question: is it possible to find the details of a user who modified the structure of a table, including what command he ran to change the structure of the table?
I want to find the name of the user who made changes to a table structure, or created any index, constraint or unique key, or altered a column. Is there any way to find this in Oracle, as well as which change has been made in which table?
The following output is needed:
userid, username, schemaname, schemachangetime, "what_change_has_been_made", IP address or Computername
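Standard auditing can supply most of those columns without a custom trigger. A sketch (SQL text is only recorded when the audit_trail parameter is set to db,extended, and the client machine comes back as USERHOST rather than a guaranteed IP address):

-- Enable auditing of structural changes.
AUDIT ALTER TABLE;
AUDIT INDEX;

-- Later, report who changed what and from where.
SELECT username,
       os_username,
       userhost,
       owner       AS schemaname,
       obj_name,
       action_name,
       sql_text    AS what_change_has_been_made,
       timestamp   AS schemachangetime
  FROM dba_audit_trail
 WHERE action_name IN ('ALTER TABLE', 'CREATE INDEX', 'DROP INDEX')
 ORDER BY timestamp DESC;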
I have to load the contents of a .csv file into a table. I am using Oracle Forms 10g. I have kept the .csv file and the .ctl file in the application server's C:\ directory. In the form I have called the HOST command to execute the batch file, but no data is loading and there is no error. The batch file is also in the application server's C:\ path.
LOAD DATA
INFILE 'C:\GL2009.CSV'
REPLACE
INTO TABLE F_GL_SMRY_TEMP
FIELDS TERMINATED BY ","
(ref_id, tran_date, dr_amt, cr_amt, acct_code, sub_code, cctr_code, sundry_code, dept_code)
If I run test.bat manually, without the Forms application, it works fine with no errors. SQL*Loader is installed on the application server.
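When SQL*Loader is kicked off through HOST from Forms, a failure is easy to miss unless a log and bad file are written somewhere that can be inspected afterwards. A sketch of the batch file and the Forms call (connect string, paths and file names are illustrative):

REM test.bat -- write a log and bad file so failures are visible
sqlldr userid=scott/tiger@orcl control=C:\gl2009.ctl log=C:\gl2009.log bad=C:\gl2009.bad

and in the form, something like:

-- Run the batch file from Forms; check C:\gl2009.log afterwards for row counts and errors.
HOST('C:\test.bat', NO_SCREEN);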
Is there sample code in OCI in C for receiving the records of a table into an array of structures? Or for dynamically storing the result set in an array, using an array of pointers to structures?
I am trying to export a partition of a table and import it to another database. I get the below error when I try to import.
ORA-14400: inserted partition key does not map to any partition
If I export the table (for that particular partition) and import the table (after dropping the table) in the destination, the partitions and sub-partitions are created without any problem.
The table is range partitioned and sub-partitioned by list, so I have to perform the operations below if I want to retain the other data in the destination table.
1. Drop the existing partition
2. Create the partition and sub-partition, same as the source
3. Execute imp
In fact I have to perform step 2 because, if I split the partition instead, the sub-partition gets replicated in the new partition, which again throws the same error. Is there a better way of managing the partitions and sub-partitions in the destination with the exp/imp utility, so that I need not perform step 1 and step 2 manually?
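With the legacy exp/imp utilities I don't know of a way around preparing a matching partition, but if Data Pump is available on both sides, one hedged alternative is to land the source partition as a standalone staging table with PARTITION_OPTIONS=DEPARTITION and then insert-select it, adding a covering partition only when one is genuinely missing. All names below are illustrative, and the staging table name that Data Pump derives from the table and partition names should be confirmed from the import log:

impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=sales_p_apr.dmp TABLES=sales:p_apr PARTITION_OPTIONS=DEPARTITION

-- The rows then map to whatever partitions/sub-partitions already exist;
-- ORA-14400 at this point means a covering partition really is missing.
INSERT /*+ APPEND */ INTO sales
SELECT * FROM sales_p_apr;
COMMIT;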
I take an export of one table (the export completes successfully without warnings). When I import it into the production database, the data does not come into the table. I have pasted the table structure, the import command and the log file for the import.
CREATE TABLE t_id_rac_ra_header (
  ra_company     VARCHAR2(10) NOT NULL,
  ra_key         NUMBER NOT NULL,
  ra_doc_type    VARCHAR2(50) NOT NULL,
  ra_doc_number  VARCHAR2(25) NOT NULL,
  ra_doc_date    DATE DEFAULT SYSDATE NOT NULL,
  ra_reserve_key NUMBER,
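For what it's worth, the most common reason rows silently fail to appear with imp is that the table already exists in the target: imp fails the CREATE TABLE and, unless told otherwise, skips that table's rows as well. A sketch of the import with IGNORE=Y so the rows load into the pre-created table (connect string and file name are illustrative):

imp scott/tiger FILE=ra_header.dmp TABLES=t_id_rac_ra_header IGNORE=Y LOG=imp_ra_header.log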
I am working on Pro*C and I have a requirement where I need to select all the rows from a table into a C structure variable. Since I only get to know the number of rows being selected at run time, I need to create a pointer to the structure and allocate its size based on the count of rows in the table using malloc or calloc. I tried allocating memory using calloc and it does not show any error, but when the EXEC SQL SELECT statement runs it shows an error.
Statements I have used:
struct common *comp;
struct common_ind *comp_i;
I have a query regarding importing data into a partitioned table. Let me make myself clearer with an example:
I have a one-month table that contains 30 partitions, a single partition for a single day, on one machine, say machine A. On another machine, say machine B, I create the same table with the same script that is on machine A. I loaded data up to the 15th of the month into the table on machine A and the rest of the data, for days 15-30, into the table on machine B. At the end I want to import the data from the partitioned table on machine B, that is from the 15th to the 30th, into the machine A table. I just want to know whether the data will be properly imported or whether I need to specify something.
I take the export partition-wise (15th-30th, 15 partition dumps) and import them into the machine A table. Is it possible to import the day-wise partitions from the 15th to the 30th into a partitioned table which already contains data in the 1st-15th partitions?
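This generally works with partition-level exports and a table-level import, as long as the 16th-30th partitions already exist (empty) on machine A and IGNORE=Y is used so the pre-existing table is not a problem. A sketch with illustrative names, listing each of the 15 partitions in the TABLES clause (only three shown here):

exp scott/tiger FILE=p16_30.dmp TABLES=(month_tab:p16,month_tab:p17,month_tab:p18) LOG=exp_p16_30.log

imp scott/tiger FILE=p16_30.dmp FULL=Y IGNORE=Y LOG=imp_p16_30.log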
DECLARE
  cursor c1 is
    select /*+ INDEX(NI04.NI_DPR_DEALER IX_DPR_DEALER) */ DEALER_CODE
      from ni_dpr_dealer
     where not exists (select null
                         from dealer_processed
                        where ni_dpr_dealer.DEALER_CODE = dealer_processed.DEALER_CODE);
  type cur_type is REF CURSOR;
[code].......
show errors;
I am getting errors as below
Entering Dealers count ::13236
entering conditions
Dealer name at 1 => HOL202
DECLARE
*
ERROR at line 1:
ORA-00932: inconsistent datatypes: expected - got -
ORA-06512: at line 27
I have written PL/SQL packages for data loading through a pipelined function for better performance. The package below compiles successfully, but at run time it throws an error like "ORA-00932: inconsistent datatypes: expected - got -".
CREATE OR REPLACE PACKAGE pkg_mkt_hub_load
AS
  PROCEDURE sp_final_load_mkt_hub;

  FUNCTION fnc_pipe_tot_lvl_idx_mon_hub (pi_input_cur IN SYS_REFCURSOR)
    RETURN tot_lvl_idx_mon_tt PIPELINED;
[code]...
SHOW ERRORS
Error:
ERROR at line 1:
ORA-00932: inconsistent datatypes: expected - got -
ORA-06512: at "GPAIHMKTDTA.PKG_MKT_HUB_LOAD", line 33
ORA-06512: at "GPAIHMKTDTA.PKG_MKT_HUB_LOAD", line 55
ORA-06512: at "GPAIHMKTDTA.PKG_MKT_HUB_LOAD", line 92
ORA-06512: at line 1
types scripts:
create or replace type tot_lvl_idx_mon_ot as object
( SSIA_INDEX_ID VARCHAR2(60),
  start_date    date,
  CURRENCY      VARCHAR2(10),
  LEVEL1        NUMBER(31,11),
  TYPE          VARCHAR2(31),
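ORA-00932 from a pipelined function is most often a mismatch between the columns the ref cursor returns and the attributes of the object type (their number, order or datatypes). A minimal sketch of a shape that works, assuming tot_lvl_idx_mon_tt is a SQL-level nested table of tot_lvl_idx_mon_ot and that the cursor is opened so it returns the object constructor itself:

-- The collection type must be created at SQL level, not inside the package spec.
CREATE OR REPLACE TYPE tot_lvl_idx_mon_tt AS TABLE OF tot_lvl_idx_mon_ot;
/

CREATE OR REPLACE FUNCTION fnc_pipe_tot_lvl_idx_mon_hub (pi_input_cur IN SYS_REFCURSOR)
  RETURN tot_lvl_idx_mon_tt PIPELINED
IS
  l_row tot_lvl_idx_mon_ot;
BEGIN
  -- Works only if the cursor was opened as:
  --   OPEN cur FOR SELECT tot_lvl_idx_mon_ot(col1, col2, ...) FROM ...;
  -- i.e. it returns a single column of the object type, matching l_row exactly.
  LOOP
    FETCH pi_input_cur INTO l_row;
    EXIT WHEN pi_input_cur%NOTFOUND;
    PIPE ROW (l_row);
  END LOOP;
  CLOSE pi_input_cur;
  RETURN;
END;
/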
I am working on a data migration project where I am extracting data from an Oracle 8i database and writing it to a text file using the UTL_FILE utility. In one extract procedure, I need to write a LONG column along with some VARCHAR2 columns to the extract file. The procedure compiled fine, but I am getting the "ORA-00932: inconsistent datatypes" error while executing the final procedure that writes the data to a file.
Below is the code snippet. I am trying to write Account_Primarykey, Account_name and its AText (LONG datatype) together in an extract file:
create or replace procedure FORCE_W_ACCOUNT as
Begin
  declare
    file_handel UTL_FILE.FILE_TYPE;
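One approach that avoids ORA-00932 is to select the LONG column into a PL/SQL VARCHAR2 variable first (PL/SQL will hold up to 32,760 bytes of a LONG) and only then build the output line for UTL_FILE, since applying SQL functions or concatenation to a LONG inside the SELECT itself is what raises the error. A minimal sketch, assuming the source table and columns are W_ACCOUNT(account_primarykey, account_name, atext), the LONG values fit in 32K, and '/extract' is a path allowed by UTL_FILE_DIR on 8i (all names are illustrative):

create or replace procedure force_w_account as
  file_handle UTL_FILE.FILE_TYPE;
  v_text      VARCHAR2(32760);   -- the LONG value is fetched into a VARCHAR2
begin
  file_handle := UTL_FILE.FOPEN('/extract', 'w_account.txt', 'W', 32767);

  for rec in (select account_primarykey, account_name from w_account) loop
    -- Fetch the LONG separately into the VARCHAR2 variable.
    select atext
      into v_text
      from w_account
     where account_primarykey = rec.account_primarykey;

    UTL_FILE.PUT_LINE(file_handle,
                      rec.account_primarykey || '|' || rec.account_name || '|' || v_text);
  end loop;

  UTL_FILE.FCLOSE(file_handle);
end;
/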
I have the query below, which returns 664 records in total. With the WHERE clause (accountno = '13987135') it has 3 records, but when I take the count it returns 3 the first time and then returns 4 every time from then onwards.