Server Utilities :: ORA-00001 - Unique Constraint Violated Error During Impdp?
Oct 17, 2011
I am trying to import a dump using the impdp utility. I followed the steps below: I disabled all the constraints and then executed the following command:
impdp DIRECTORY=EXPDP_DAILY_BACKUP_DIR DUMPFILE=expdp_QA_full_11102011_214501.dmp
logfile=imp_EXPDP_QA_full_12102011_203254.log remap_schema=prod:prod remap_tablespace=prod:prod
schemas=prod TABLE_EXISTS_ACTION=truncate content=data_only
However, I get the error below for one or two tables. If I import those tables separately, they import successfully, and the error does not occur every time.
ORA-31693: Table data object "PROD"."DAS_ID_GENERATOR" failed to load/unload and is being skipped due to error:
ORA-00001: unique constraint (PROD.DAS_ID_GENERATOR_P) violated
ORA-31693: Table data object "PROD"."TKT_DIST_SRV_STAT" failed to load/unload and is being skipped due to error:
ORA-00001: unique constraint (PROD.SERVER_STATS_P) violated
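For reference, a minimal check I would run before a data_only load, given only as a sketch (it assumes access to the DBA views; the table and constraint names are taken from the errors above). A unique index created independently of a disabled constraint still enforces uniqueness on its own:
-- hedged sketch: which PK/unique constraints and unique indexes are still active on these tables?
SELECT c.table_name, c.constraint_name, c.status, i.index_name, i.uniqueness, i.status AS index_status
  FROM dba_constraints c
  LEFT JOIN dba_indexes i
    ON i.owner = c.index_owner AND i.index_name = c.index_name
 WHERE c.owner = 'PROD'
   AND c.constraint_type IN ('P', 'U')
   AND c.table_name IN ('DAS_ID_GENERATOR', 'TKT_DIST_SRV_STAT');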
View 8 Replies
May 14, 2010
I have faced the error below:
1 : 23000 : java.sql.BatchUpdateException: ORA-00001: unique constraint (SDS1.PK_EXP_TXT) violated
1 : 23000 : java.sql.SQLException: ORA-00001: unique constraint (SDS1.PK_EXP_TXT) violated
java.sql.BatchUpdateException: ORA-00001: unique constraint (SDS1.PK_EXP_TXT) violated
at oracle.jdbc.driver.DatabaseError.throwBatchUpdateException(DatabaseError.java:367)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:9119)
[Code] ......
After checking the ALL_INDEXES view, I found some old entries with LAST_ANALYZED dates from 2008 and 2009. A sample is below:
OWNER  INDEX_NAME  LAST_ANALYZED
SDS1   PK_ACTION   31-OCT-09 07:04:49
SDS1   AK_ACTION   31-OCT-09 07:04:49
SDS1   PK_COND     31-OCT-09 07:04:50
SDS1   AK_COND     31-OCT-09 07:04:50
SDS1   COND_FK1    31-OCT-09 07:04:50
SDS1   COND_FK2    31-OCT-09 07:04:50
Is the problem caused by this old data not being removed from ALL_INDEXES? If so, do I have to delete the old data from ALL_INDEXES manually?
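For context, a minimal sketch of the kind of dictionary query behind the listing above (the SDS1 owner is taken from the error; note that LAST_ANALYZED only records when optimizer statistics were last gathered for an index, not when its data was inserted):
-- hedged sketch: list the schema's indexes and when their statistics were last gathered
SELECT owner, index_name, last_analyzed
  FROM all_indexes
 WHERE owner = 'SDS1'
 ORDER BY last_analyzed;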
View 7 Replies
Aug 3, 2008
I got this error message in my replication environment.
ORA-12012: error on auto execute of job 2182370
ORA-12008: error in materialized view refresh path
ORA-00001: unique constraint (TE.S_TE_MTH_DBASE_SALES_INFO_A_U1) violated
ORA-06512: at "SYS.DBMS_SNAPSHOT", line 189
The materialized view and the base table each have only one unique index. There are no referential constraints on my base table, because the source table does not reference any other tables.
The materialized view log was created using ROWID, since there is no primary key constraint on my base table. When I refresh the materialized view manually, I get the same error:
ORA-00001: unique constraint (TE.S_TE_MTH_DBASE_SALES_INFO_A_U1) violated
How is this possible? There is no parent/child relationship on my source table, and its unique index means there is no way to enter duplicate records.
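A minimal check, written as a sketch with placeholder names (base_table and key_col stand for the materialized view's base table and the column(s) covered by the unique index TE.S_TE_MTH_DBASE_SALES_INFO_A_U1):
-- hedged sketch: does the refresh source contain more than one row per unique-index key?
SELECT key_col, COUNT(*)
  FROM base_table
 GROUP BY key_col
HAVING COUNT(*) > 1;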
View 8 Replies
Mar 15, 2011
I have disabled the primary key on a table using: ALTER TABLE C_LOV_BOROUGH DISABLE CONSTRAINT CLOVBOROUGH_PK CASCADE;
But when I now try to insert a duplicate value, I get this error:
ORA-00001: unique constraint (LOT_DBA.CLOVBOROUGH_PK) violated
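A minimal check, assuming the usual cause: a unique index that was created independently of the constraint (or kept with DISABLE ... KEEP INDEX) still enforces uniqueness even though the constraint itself is disabled.
-- hedged sketch: is there still a UNIQUE index on the table, and what backs the constraint?
SELECT index_name, uniqueness, status
  FROM user_indexes
 WHERE table_name = 'C_LOV_BOROUGH';

SELECT constraint_name, constraint_type, status, index_name
  FROM user_constraints
 WHERE table_name = 'C_LOV_BOROUGH';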
View 5 Replies
Mar 26, 2007
Following step-by-step instructions, I am asked to enter the following into SQL*Plus:
CREATE TABLE Lab2Lecturer
(staffNO VarCHAR2(10) NOT NULL,
title VARCHAR2(3),
fName VARCHAR2(30),
[code]...
Then the following:
INSERT INTO Lab2Lecturer
(staffNO, title, fName, lName, streetAddress, suburb, city, postCode, country, lecturerLevel, bankNO, bankName, salary, workLoad, researchArea)
VALUES
('1000', 'Dr', 'Johanna','Santoso',
'3 Robinson Av', 'Kew', 'Melbourne', '3080', 'Australia', 'C', '1000567237', 'CommBank', 65000.00,1.0, 'O-R DB');
and finally,
INSERT INTO Lab2Lecturer
(staffNO, title, fName, lName, streetAddress, suburb, city, postCode,country, lecturerLevel, BankNO,bankName, salary, workLoad, researchArea)
VALUES
('1000', 'Dr', 'Justine', 'Martin', '6 Algorithm AV', 'Montmorency', 'Melbourne', '3089', 'Australia', 'D', '1000123456', 'CommBank', 89000.00, 1.0, 'CBR');
When I try to enter the second one I get the error 'unique constraint violated'. So what exactly is wrong?
View 1 Replies
Oct 4, 2013
My DB version is
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
I am getting these errors in production. First of all, there are no duplicates in the table for which the error is raised. I have checked the index and its columns as well: there is NO DATA there, so there should be no chance of a unique constraint violation.
SELECT * FROM ORDER_OCC_REQUISITION_X_REF WHERE LAB_ORDER_OCC_TEST_ID IN(SELECT LAB_ORDER_OCC_TEST_ID FROM LAB_ORDER_OCC_TEST WHERE LAB_ORDER_OCC_ID = 7944858);
no rows selected
Yet when I try to insert one row into this table I get the error, even though, as you can see, there are no records for this occurrence id.
SELECT * FROM USER_INDEXES WHERE INDEX_NAME = 'ORD_OCC_REQ_UQ_TEST_IX_04';
--ORDER_OCC_REQUISITION_X_REF (Table name)
--MERGE_DT, LAB_ORDER_OCC_TEST_ID, TEST_ID, ACTIVE_YN (columns for the index 'ORD_OCC_REQ_UQ_TEST_IX_04')
As you can see there is no data, so this error should not be raised. The update procedure follows (a duplicate-check sketch is shown after it).
/*******************************************************************************************************************
* Name : UPDATE_REQUISITION_X_REF
* Description : This Procedure update ORDER_OCC_REQUISITION_X_REF table with requisition_id
* that was generated due to merge process.
* In parameters : IN_merge_id NUMBER The order_ref_no of the orders to be merged (comma separated)
***********************************************************************************************************************/
PROCEDURE UPDATE_REQUISITION_X_REF ( IN_merge_id IN TT_ORD_REQUISITION_WORK_AREA.merge_id%TYPE)
IS
[Code]....
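A minimal duplicate check, written as a sketch (the column list is taken from the index description above; the same GROUP BY should also be run against the SELECT that feeds the procedure's insert or merge, because a statement that produces two rows with the same key raises ORA-00001 even when the table itself is empty):
-- hedged sketch: look for key collisions on the unique-index columns
SELECT merge_dt, lab_order_occ_test_id, test_id, active_yn, COUNT(*)
  FROM order_occ_requisition_x_ref
 GROUP BY merge_dt, lab_order_occ_test_id, test_id, active_yn
HAVING COUNT(*) > 1;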
View 8 Replies
Jun 27, 2012
Somewhere in a PL/SQL block I have a statement like:
INSERT INTO TABLE1( id , col)
SELECT id, col
FROM TABLE2;
This statement raises ORA-00001: unique constraint violated, because id is the primary key of TABLE1. I would like to know which value of id raised the exception.
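One way to capture the offending keys is DML error logging, shown here as a hedged sketch (it assumes you may create an error table; ERR$_TABLE1 is the default name DBMS_ERRLOG generates, and this applies to a conventional-path insert; a direct-path /*+ APPEND */ insert would not log unique violations):
-- hedged sketch: route rejected rows into an error table instead of failing the whole insert
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'TABLE1');
END;
/
INSERT INTO table1 (id, col)
SELECT id, col
  FROM table2
  LOG ERRORS INTO err$_table1 ('find duplicate ids') REJECT LIMIT UNLIMITED;

-- the duplicate id values and the ORA-00001 text are now queryable
SELECT ora_err_number$, ora_err_mesg$, id FROM err$_table1;
A plain SELECT id FROM table2 INTERSECT SELECT id FROM table1 also lists the ids that already exist in TABLE1, though it does not catch duplicates inside TABLE2 itself.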
View 15 Replies
Nov 4, 2012
I also put in the relevant code in case it's needed.
SQL> @lab_05_01.sql
SQL> -- Oracle Database 10g: Administration Workshop II
SQL> -- Oracle Server Technologies - Curriculum Development
SQL> --
SQL> -- ***Training purposes only***
SQL> -- ***Not appropriate for production use***
SQL> --
SQL> -- This script performs a batch promotion update.
SQL> -- The logic of the updates is not important -
[code]....
View 2 Replies
Oct 8, 2010
I am getting the error below while running impdp:
Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema ADAS is needed to import this object, but is unaccessible
ORA-28031: maximum of 148 enabled roles exceeded
[code]...
ORA-06512: at "SYS.KUPW$WORKER", line 1342
ORA-06512: at line 2
Job "SYS"."SYS_IMPORT_FULL_01" stopped due to fatal error at 17:13:38
View 2 Replies
Oct 13, 2010
I have Oracle 9i, and yesterday I wanted to create a new database but could not do it.
In Database Configuration Assistant I chose Create a database, then New database, and used the default configuration. At 92% I got this error:
ORA-02291: integrity constraint (SH.COSTS_PRODUCT_FK) violated - parent key not found
View 2 Replies
Nov 9, 2012
I exported three databases (servicedesk, report and mostcmdb) on server A to dump files and moved them to a similar dump location on server B. Now I want to import them into B by using:
SQL> impdp system/OSS_MOS7100 dumpfile=MOSTCMDB_0211.DMP schemas=mostcmdb
The error was:
SP2-0734: unknown command beginning "impdp syst..." - rest of line ignored.
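SP2-0734 is a SQL*Plus message: the command was typed at the SQL> prompt, but impdp is an operating-system executable, so it has to be run from the OS shell. A hedged sketch (DATA_PUMP_DIR is an assumed directory object; use whichever directory object points at the dump location):
$ impdp system/OSS_MOS7100 directory=DATA_PUMP_DIR dumpfile=MOSTCMDB_0211.DMP logfile=mostcmdb_imp.log schemas=mostcmdb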
View 7 Replies
Sep 16, 2011
I get the error ORA-00904: "USERNAME": invalid identifier. What is the problem?
My impdp command:
impdp datapump/password@%ORACLE_SID% DIRECTORY=datapump schemas=gcom INCLUDE=PROCEDURE remap_tablespace=gpao_indx:indx remap_tablespace=gpao_data:data DUMPFILE=%ORA_DUMPFILE% LOGFILE=Imp_%annee%%mois%%jour%%hh%%min%%sec%_%ORACLE_SID%.log
The result :
;;;
Import: Release 10.2.0.4.0 - Production on Thursday, 15 September, 2011 17:47:16
Copyright (c) 2003, 2007, Oracle. All rights reserved.
;;;
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "DATAPUMP"."SYS_IMPORT_SCHEMA_29" successfully loaded/unloaded
[Code]....
View 2 Replies
Jan 14, 2011
I am using the command below to import a schema over a network link:
impdp system directory = IMP_DIR schemas = XYZ network_link = PQR remap_schema = XYZ:XYZ exclude=view: "= 'XYZ.VW_ACCEPTDETAILS'"
This command gives the error below:
LRM-00116: syntax error at 'view:' following '='
When I tried LIKE in place of the '=' sign, i.e. EXCLUDE=VIEW:"LIKE '%VW_ACCEPTDETAILS%'", I get the error below:
UDI-00014: invalid value for parameter, 'exclude'
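Both messages are typical of the OS shell stripping the quotes before impdp sees them. A hedged sketch that avoids shell escaping by moving EXCLUDE into a parameter file (the object name is given without the schema prefix, which is an assumption about the intent here; the parfile name is hypothetical):
# exclude_view.par
directory=IMP_DIR
schemas=XYZ
network_link=PQR
remap_schema=XYZ:XYZ
exclude=VIEW:"IN ('VW_ACCEPTDETAILS')"

$ impdp system parfile=exclude_view.par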
View 4 Replies
Jun 21, 2013
I am having an issue with impdp and Oracle virtual columns. I have the following table with a virtual column defined as NOT NULL; expdp runs fine without any issue.
DDL :
------
CREATE TABLE alert_hist
(
alertky INTEGER NOT NULL,
alertcreatedttm TIMESTAMP(6) DEFAULT systimestamp NOT NULL,
alertcreatedt DATE GENERATED ALWAYS AS (To_date(Trunc("alertcreatedttm"))) VIRTUAL NOT NULL
);
When I run the import (impdp), it fails with the following error:
. . imported "TESTSCHEMA"."VALART" 359.1 KB 4536 rows
ORA-31693: Table data object "TESTSCHEMA"."ALERT_HIST" failed to load/unload and is being skipped due to error:
ORA-39097: Data Pump job encountered unexpected error -1
After that, I dropped the NOT NULL virtual column and recreated it as nullable.
DDL :
-----
alter table alert_hist drop column alertcreatedt;
alter table alert_hist add alertcreatedt DATE GENERATED ALWAYS AS (To_date(Trunc("alertcreatedttm"))) VIRTUAL;
After that I ran expdp and impdp again, and everything went fine without any issue.
View 7 Replies
Apr 18, 2010
I have two different Java processes trying to insert into the same table at the same time for the same trade. The structure of the report table is:
create table report
( TRADE_ID NUMBER,
VERSION NUMBER,
MESSAGE_TIME TIMESTAMP)
There is a unique key on (TRADE_ID, VERSION). When a new trade_id is inserted its version is set to 1, the next becomes 2, and so on; the version is calculated as the trade's last version + 1. This worked fine until a new Java process was built that fires inserts from ten different instances at the same time, which results in unique key errors. In detail: if three records for a trade_id arrive at the same time, they should be allocated versions on a first-come, first-served basis and end up as versions 1, 2 and 3. With the multiple instances they all fire at once, each computes version 1, and the inserts then fail with the unique constraint error.
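One common pattern, given only as a hedged sketch and not necessarily the right fix here: serialize version allocation per trade by locking a parent row, so that MAX(version) + 1 is computed under a lock. The trade_master table below is a hypothetical parent table (one row per trade_id) used purely for illustration.
-- hedged sketch: take a per-trade lock before computing the next version
DECLARE
  l_dummy    NUMBER;
  l_version  NUMBER;
BEGIN
  SELECT trade_id INTO l_dummy
    FROM trade_master            -- hypothetical parent table, one row per trade
   WHERE trade_id = :trade_id
     FOR UPDATE;                 -- concurrent sessions for the same trade queue here

  SELECT NVL(MAX(version), 0) + 1 INTO l_version
    FROM report
   WHERE trade_id = :trade_id;

  INSERT INTO report (trade_id, version, message_time)
  VALUES (:trade_id, l_version, SYSTIMESTAMP);

  COMMIT;                        -- releases the lock
END;
/
If a gap-free, per-trade version number is not strictly required, an Oracle sequence avoids the lock and the race entirely.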
View 3 Replies
Jul 5, 2012
I have a table, for example:
CREATE TABLE RMD_2
(
"RMD_ID" NUMBER(10,0) NOT NULL ENABLE,
"ABB" VARCHAR2(16 BYTE),
"ACT_IND" VARCHAR2(1 BYTE) NOT NULL DISABLE,
CONSTRAINT "RMD_2_C1" CHECK (act_ind IN ('Y', 'N')) ENABLE
)
For the column ACT_IND there is a check constraint, RMD_2_C1, which should accept only Y or N.
But whenever I insert values other than Y or N they are accepted; it should reject them and raise ORA-02293 (check constraint violated).
That is not happening: it accepts all other values, even NULL. My requirement is that this column should accept only Y or N and reject everything else.
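A quick check, as a hedged sketch. Note also that a CHECK constraint never rejects NULL by itself; NULLs are only rejected by a NOT NULL constraint, and the column above is declared NOT NULL DISABLE.
-- hedged sketch: is the check constraint actually enabled and validated on this table?
SELECT constraint_name, constraint_type, search_condition, status, validated
  FROM user_constraints
 WHERE table_name = 'RMD_2';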
View 4 Replies
May 31, 2013
I am running a query and getting the error below: ORA-02291 integrity constraint violated, parent key not found. How can I find the records that are not in the parent table?
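A minimal sketch with placeholder names (child_table, parent_table, fk_col and pk_col are assumptions; substitute the tables and columns behind the constraint named in the full ORA-02291 message):
-- hedged sketch: child rows whose foreign key value has no matching parent row
SELECT c.*
  FROM child_table c
 WHERE c.fk_col IS NOT NULL
   AND NOT EXISTS (SELECT 1 FROM parent_table p WHERE p.pk_col = c.fk_col);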
View 11 Replies
Nov 4, 2013
Each time I import an application into my workspace in local 10g XE using the APEX 4.2 Application Builder, I get "ORA-02291, constraint wwv_flows_fk violated". The application I want to import was originally exported from [URL]. I found out that it has to do with wwv_flow_application_install.get_workspace_id in the installation script, which apparently returns NULL rather than my current workspace id, as I would expect. I replaced wwv_flow_application_install.get_workspace_id with '<my workspace id>', and then the import does work.
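For reference, a hedged alternative to editing the export file: set the target workspace through the documented wrapper before running the script. This is a sketch, assuming the export is run as a script from SQL*Plus or SQL Developer; the workspace name and file name are placeholders.
-- hedged sketch: tell the installer which workspace (and application id) to use
BEGIN
  apex_application_install.set_workspace_id(apex_util.find_security_group_id('MY_WORKSPACE'));
  apex_application_install.generate_application_id;   -- optional: avoid application id clashes
  apex_application_install.generate_offset;
END;
/
@f101.sql   -- the exported application script (placeholder file name)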
View 0 Replies
Apr 30, 2009
When I export a schema that owns several jobs (DBA_JOBS) using expdp and then import it into another database, the jobs change ownership to the user I used for the import.
For example :
the log_user, priv_user and schema_user used to be schema 'A',
but after the impdp, assuming I imported as SYSTEM, the log_user, priv_user and schema_user are now SYSTEM.
Is this expected behavior? How can I avoid it? The job will fail when executed as SYSTEM, and I need to manually recreate it as user A again.
View 5 Replies
Jul 9, 2013
I have a question regarding user creation during export and import.
Will the user get created, along with roles and privileges, by default when using the impdp command?
View 6 Replies
May 8, 2012
I did an export using the parfile below. I want to import all the objects associated with this schema into another database, but I do not want to overwrite any of the permissions, such as grants.
Is there a way to get the grants into a SQL file before I do the import? If so, provide an example (a sketch follows the parfile below).
cat exp_par
DUMPFILE=exp.dmp
LOGFILE=exp.log
DIRECTORY=DBBACKUP
schemas=t1
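A hedged sketch using impdp's SQLFILE mode, which writes the DDL it would execute into a script instead of running it (the INCLUDE filter and file names are assumptions; adding EXCLUDE=GRANT to the real import would then skip the grants themselves):
# grants_only.par (hypothetical)
DUMPFILE=exp.dmp
DIRECTORY=DBBACKUP
SQLFILE=t1_grants.sql
INCLUDE=GRANT

$ impdp system parfile=grants_only.par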
View 1 Replies
May 26, 2010
I'm trying to do a network datapump between oracle databases, and it seems to continually hang when it gets to the point where it should be processing table data.
C:>impdp DP_USER/DP_USER parfile=sde_webmap_2.par
Import: Release 11.1.0.7.0 - 64bit Production on Wednesday, 26 May, 2010 17:42:03
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "DP_USER"."SYS_IMPORT_FULL_01": DP_USER/******** parfile=sde_webmap_2.par
Estimate in progress using BLOCKS method...
[code]...
It just sits at this point indefinitely. The parfile, for those interested:
directory=datapumps
logfile=sde_webmap_2.log
network_link=backup
full=y
INCLUDE=SCHEMA:"IN ('SDE_WEBMAP_BUSINESS','SDE_WEBMAP_BUSINESS_A','SDE_WEBMAP_BUSINESS_B')"
And the results from V$SESSION_LONGOPS
69 SYS_IMPORT_FULL_01 IMPORT 0 1031 MB 5/26/2010 5:50:37 PM 5/26/2010 6:03:29 PM
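A hedged way to see what the job is actually doing: attach to it from a second session and check the workers (the job name is taken from the output above), and look at the Data Pump sessions from SQL*Plus:
C:\>impdp DP_USER/DP_USER attach=SYS_IMPORT_FULL_01
Import> STATUS

-- Data Pump sessions and their current wait events
SELECT s.sid, s.serial#, s.event, d.job_name
  FROM v$session s
  JOIN dba_datapump_sessions d ON d.saddr = s.saddr;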
View 3 Replies
Sep 25, 2011
As we know, the original imp utility has a parameter (INDEXFILE) used to generate CREATE INDEX DDL. Is there a comparable parameter in impdp?
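The usual equivalent is SQLFILE together with an INCLUDE filter. A hedged sketch (the directory, dump file and output file names are assumptions):
$ impdp system directory=DPUMP_DIR dumpfile=exp.dmp sqlfile=create_indexes.sql include=INDEX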
View 6 Replies
Oct 23, 2013
I was running impdp for my company's database. Everything is fine except for one problem: a big table (61.1 GB) could not be imported. The log shows "imported TBL_XXXX 61.1G 0 out of 147653981 rows". The alert log had no warning message. I didn't know
View 12 Replies
May 29, 2012
I need to recreate/clone my database on a new machine. The two machines are not connected over the network.
Step 1. (Oracle 10.2.0.5 AIX 64-bit)
expdp username/password@db1 full=y dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_expdp.log job_name=full_exp
Step 2.
FTP dump files to Windows
Step 3. (Oracle 10.1.0.2 Windows 32-bit)
impdp username/password@db1 dumpfile=dp:fpac052912_dp%U.dmp logfile=dp:fpac052912_impdp.log full=y
I got:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "C:\P7DB\fpac052912_dp01.dmp"
Done in AIX:
create directory dp as '/bak'
grant read, write on directory dp to public;
grant exp_full_database to username;
Done in Windows:
create directory dp as 'C:\P7DB';
grant read, write on directory dp to public;
grant exp_full_database to username;
grant imp_full_database to username;
View 8 Replies
Feb 13, 2012
1) My database dump is about 4 GB in size and is provided by the vendor.
2) The dump contains 364,949 objects in total:
Table : 121316
LOB object : 121315
(Normal+LOB) indexes : 122317
3) Now when I run the import as SYSTEM or another user, it hangs at the stage below for 70+ hours:
impdp ntest/ntest directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Import: Release 11.2.0.1.0 - Production on Fri Feb 10 09:49:50 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "NTEST"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "NTEST"."SYS_IMPORT_FULL_01": ntest/******** directory=test_dir dumpfile=JBLLIVE.31Jan2012.11.50AM.dmp remap_schema=JBLLIVE:NTEST logfile=ntest_10feb.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"NTEST" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
----
In this situation I observed the worker status and saw that some tables and some LOB objects, including LOB indexes, were being imported. The worker does this in the background, but it does not show up in the import log file (I do not understand why it is not shown there). It imports one table, one LOB, one LOB index, then again one table, one LOB, one LOB index, and so on.
My observation is that it first inserts data into the LOB segments and then into the normal table, and only when it starts inserting into the normal table does that table appear in the import log file.
An example of our data:
Objects :
===================================================
LOB_FD17_RGS_TSTCD2 LOB
FD17_RGS_VERSION TABLE
(here I see that each table has one LOB segment; in this way 121,316 tables have 121,316 LOBs)
SQL> desc FD17_RGS_VERSION
Name Null? Type
----------------------------------------- -------- ----------------------------
RECID VARCHAR2(255)
XMLRECORD BLOB
Our observation is that inserting the BLOBs is probably the main cause of the slowness. Is there any patch, or any known bug, regarding BLOB/LOB objects in Oracle 11gR2?
View 6 Replies
Apr 15, 2012
I started Data Pump without using the parallel option and issued the following command:
impdp t24/t24 directory=dp_dump dumpfile=bef_cob_%U.dmp schemas=t24
The total dump file size is 200 GB.
Now I want to add parallel processes to the job.
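Parallelism can be changed on a running job from the interactive prompt. A hedged sketch follows; the job name below is an assumption, so look it up first:
-- find the running job's name
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

$ impdp t24/t24 attach=SYS_IMPORT_SCHEMA_01
Import> PARALLEL=4
Import> STATUS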
View 9 Replies
May 24, 2012
We need to set up a test system using data from one of the offsite customer locations. The option we would like to use is impdp with a network link. Will the steps below make the import faster if we exclude indexes and constraints? (A parfile sketch follows the steps.)
Steps
Import the schema excluding constraints, ref_constraints and indexes, metadata only
Import the schema with data only
Spool the DDL from the source using dbms_metadata.get_ddl for constraints, ref_constraints and indexes
Execute the index creation DDL on the destination
Execute the constraint and ref_constraint DDL on the destination
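A hedged sketch of the two import passes described in the steps above (the directory object, database link and schema names are placeholders):
# pass 1: metadata only, without indexes and constraints
$ impdp system directory=DP_DIR network_link=CUST_LINK schemas=APPSCHEMA content=metadata_only exclude=index exclude=constraint exclude=ref_constraint

# pass 2: the data
$ impdp system directory=DP_DIR network_link=CUST_LINK schemas=APPSCHEMA content=data_only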
View 1 Replies
Jun 26, 2013
Below is my import command for importing a specific function from an export file, but I am getting the errors below.
impdp system/PASSWORD schemas=TNC6 directory=dumpdir dumpfile=FULL01-02-2011.dmp logfile=IMP.log include=FUNCTION:"IN ('TNC_IS_NUMBER')"
ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression
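ORA-39071 at this point is usually the OS shell consuming the quotes around the IN list before impdp parses it. A hedged sketch that moves INCLUDE into a parameter file so no shell escaping is needed (the parfile name is hypothetical):
# func_only.par
schemas=TNC6
directory=dumpdir
dumpfile=FULL01-02-2011.dmp
logfile=IMP.log
include=FUNCTION:"IN ('TNC_IS_NUMBER')"

$ impdp system parfile=func_only.par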
View 4 Replies
Jul 16, 2010
I have a bit of an issue with Oracle datapump dump files.
Today I manage the export and import of Oracle dump files. As part of the batch export process I have a script which essentially says:
For each schema related to my application in THIS instance, export the schema via the SYSTEM user (the SYSTEM user gives me privileges on all schemas).
On the import UI side of things I am able to run a "head -20" command on the dmp file and determine the export client version, the date the schema was dumped, and which schema it was dumped from. All useful info presented in my UI.
Sample output: Begin
EXPORT:V09.02.00
DSYSTEM
RUSERS
8192
Wed Jun 30 11:51:21 UserXXX.dmp
#C##
#C##
[code]....
Sample output: End
This matters because I allow the importation of production schemas into test schemas (contained in a different tablespace). Based on the naming convention I can determine the schema type (production or test), and, probably most importantly, I can be sure where the data has come from.
Looking at expdp and its dump file using the same method as above, the Data Pump dump does NOT appear to carry similar headers. Because of this, I can get very little useful info from the dump file.
I realize I could run impdp with "sqlfile=myfile.sql" and then interrogate the SQL file for the info, but on large dump files this would be fairly time consuming compared to a "head -20" on the dump file.
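One hedged alternative that reads only the dump header rather than generating a full SQLFILE: DBMS_DATAPUMP.GET_DUMPFILE_INFO returns header items (file version, creation date, job and database info) for a Data Pump dump. A sketch, assuming a directory object pointing at the file; the item codes vary by release, so this simply prints everything returned. The file and directory names are placeholders.
-- hedged sketch: read the Data Pump dump file header
SET SERVEROUTPUT ON
DECLARE
  l_info      sys.ku$_dumpfile_info;
  l_filetype  NUMBER;
BEGIN
  DBMS_DATAPUMP.get_dumpfile_info(
      filename   => 'UserXXX.dmp',      -- placeholder file name
      directory  => 'DATA_PUMP_DIR',    -- placeholder directory object
      info_table => l_info,
      filetype   => l_filetype);
  DBMS_OUTPUT.put_line('File type: ' || l_filetype);
  FOR i IN 1 .. l_info.COUNT LOOP
    DBMS_OUTPUT.put_line(l_info(i).item_code || ': ' || l_info(i).value);
  END LOOP;
END;
/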
View 4 Replies