Export/Import/SQL Loader :: Expdp Failing With Snapshot Too Old Error
Sep 4, 2012
We are getting the below errors while migrating partitioned tables using expdp.
The source and target databases are both running 10.2.0.5, and notably the source database has no active sessions: it is a clone of a production database and no one is accessing it.
ORA-31693: Table data object "DPMMGR"."WHSE_CTNR_EVNT_W":"MSG_PRCS_N"."MSG_PRCS_N_DC556" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number 31 with name "_SYSSMU31$" too small
ORA-31693: Table data object "DPMMGR"."RLTM_PRDCT_LOG":"RPL_20120814" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number 14 with name "_SYSSMU14$" too small
The undo tablespace has enough free space, but expdp still fails.
SQL>/
TABLESPACE Totalspace(MB) Used Space(MB) Freespace(MB) % Used % Free
--------------- --------------- -------------- ------------- ---------- ----------
UNDO01 145096 115338 29758 79.49 20.51
SQL> show parameter undo
[code]....
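One hedged diagnostic sketch, assuming automatic undo management is in use: compare how long the export runs against the undo retention the instance is actually achieving, and raise undo_retention if the export window is longer (the 14400-second value below is illustrative only):

SQL> show parameter undo_retention

-- How long the longest query ran, and the retention the instance tuned itself to
SELECT MAX(maxquerylen)         AS longest_query_sec,
       MAX(tuned_undoretention) AS tuned_retention_sec
FROM   v$undostat;

-- If the export runs longer than the retention actually achieved, raise the floor
ALTER SYSTEM SET undo_retention = 14400 SCOPE = BOTH;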
I am trying to export selective data from one of my production database tables, but not succeeding. I have been trying for the past two hours.
OS: Solaris SPARC, Oracle 10g. Query: WHERE E3RECV_DT LIKE '201305%' (I need to export the data matching this query).
Below is the script I am using:
exp E3USER@SGEBAPU2 statistics=none consistent=n buffer=100000000 file=exp_pipe_file TABLES=IFDATA query="WHERE E3RECV_DT LIKE '201305\%'" log=PGTB_IFDATA_conditional.log
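Shell quoting is the usual culprit with classic exp on Unix: the shell consumes the quotes and escapes before exp ever sees them. A minimal sketch, reusing the same parameters, is to move everything into a parameter file, where no shell escaping is needed (the parfile name is hypothetical):

# exp_ifdata.par -- hypothetical parameter file
userid=E3USER@SGEBAPU2
tables=IFDATA
query="WHERE E3RECV_DT LIKE '201305%'"
statistics=none
consistent=n
buffer=100000000
file=exp_pipe_file
log=PGTB_IFDATA_conditional.log

$ exp parfile=exp_ifdata.par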
I am using the expdp command to export a table with the QUERY parameter, but I am unable to export the table based on the condition.
Example:
EXPDP username/password dumpfile=employee.dmp logfile=emp.log directory=DATADIR_EXP TABLES=EMPLOYEE query=EMPLOYEE:"UPDATED_TIME >= '04-JUN-13' AND UPDATED_TIME >= '05-AUG-13'"

Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3 GB
ORA-31693: Table data object "<username>"."EMPLOYEE" failed to load/unload and is being skipped due to error:
ORA-00933: SQL command not properly ended
Master table "<username>"."EMPLOYEE" successfully loaded/unloaded
...
Dump file set for <username>.SYS_EXPORT_TABLE_01 is: E:\IMPDP\employee.dmp
Job "<username>"."SYS_EXPORT_TABLE_01" completed with 1 error(s) at 12:34:45

This is Oracle 11g.
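Two things stand out, offered as hedged guesses: the QUERY value is missing the WHERE keyword, and the OS shell may be stripping the double quotes, either of which can surface as ORA-00933. (The two >= comparisons also look like a typo for a date range.) A sketch using a parameter file, with a hypothetical parfile name and TO_DATE to avoid NLS-dependent date parsing:

# emp_query.par -- hypothetical parameter file
directory=DATADIR_EXP
dumpfile=employee.dmp
logfile=emp.log
tables=EMPLOYEE
query=EMPLOYEE:"WHERE updated_time >= TO_DATE('04-JUN-13','DD-MON-RR') AND updated_time <= TO_DATE('05-AUG-13','DD-MON-RR')"

> expdp username/password parfile=emp_query.par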
I want to export two schemas from a database under the following conditions:
1. Export schema_1 with all metadata objects plus data.
2. Export schema_2 with metadata objects only.
The Oracle version is Oracle EE 10.2.0.4.0; the OS is Microsoft Windows Server 2003 R2.
As far as I know, I cannot use the EXCLUDE parameter like:
EXCLUDE=TABLES:"IN ('SCHEMA_NAME.TABLE')"
(that would give me no tables at all), nor can I use something like CONTENT=SCHEMA_NAME.METADATA_ONLY. Maybe I could use QUERY=where table in (select tablename where schema is ....), but I have tables with the same name in both schemas.
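CONTENT is a job-level setting in 10.2, so one job cannot mix data-plus-metadata for one schema with metadata-only for another. A minimal sketch, assuming two separate jobs are acceptable (directory and file names are placeholders):

# Job 1: schema_1, metadata plus rows
expdp system/***** schemas=SCHEMA_1 content=ALL directory=DATA_PUMP_DIR dumpfile=schema1_all.dmp logfile=schema1_all.log

# Job 2: schema_2, metadata only
expdp system/***** schemas=SCHEMA_2 content=METADATA_ONLY directory=DATA_PUMP_DIR dumpfile=schema2_meta.dmp logfile=schema2_meta.log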
My database size is around 2900 GB on AIX 6.1, and the database version is 10.2.0.3. Every day I need to take an expdp dump backup of a single table which is only 57 MB in size. It takes around 55 minutes to complete.
I have noticed that when the backup starts, the first phase scans the tables (we have 330,000 tables), and only then does the actual export begin. My questions are:
1. How can I make my dump backup faster? 2. Is there any way to skip the table scan?
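A hedged suggestion for question 1: with 330,000 tables, the slow first phase is most likely Data Pump querying the data dictionary, so current dictionary statistics and the tightest possible job scope tend to help (object names below are placeholders):

-- Refresh dictionary statistics; on very large dictionaries this sometimes
-- shortens Data Pump's metadata phase considerably
EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;

-- Then scope the job to exactly one table and skip statistics export
expdp owner/***** tables=OWNER.DAILY_TABLE exclude=statistics directory=DATA_PUMP_DIR dumpfile=daily_table.dmp logfile=daily_table.log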
I have been asked to migrate from Oracle 10.2 to 11.2 on Red Hat. In 10g the database has around 50 users, including default users like SCOTT, XDB, etc. 1. Should I skip those users when I import? The schema MAHM05 owns the tables, and no one has direct access privileges on this schema.
All users access it through synonyms. 2. Which schema should I import first? 3. What things do I need to check?
To export table partitions we use tables=(T1:P1,T1:P2), but what if I have to export, say, 50 partitions? Do I have to write tables=(T1:P1,T1:P2,...,T1:P50), or is there a simpler way to do it in a single go?
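Two hedged options: if every partition of T1 is wanted, TABLES=T1 with no partition qualifier exports them all in one go; if only a subset is wanted, the list can be generated from the dictionary into a parameter file rather than typed by hand. A sketch:

-- Generate the TABLES entries for a parameter file
SELECT 'T1:' || partition_name || ','
FROM   user_tab_partitions
WHERE  table_name = 'T1'
ORDER  BY partition_position;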
I am using EXPDP to export a schema (Oracle 11g R2), and I need to exclude all the tablespaces that the schema is using. I have seen excluding Oracle objects like functions, tables, packages, indexes, etc., but I have not seen excluding tablespaces. Is it possible to exclude tablespaces while creating the export dump?
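For what it's worth, a schema-mode export does not carry CREATE TABLESPACE definitions anyway (only a full-database export does); the tablespace dependence usually bites at import time instead. A hedged sketch of redirecting segments when importing (schema, tablespace, and file names are placeholders):

impdp system/***** schemas=APPUSER directory=DATA_PUMP_DIR dumpfile=appuser.dmp remap_tablespace=APP_TBS:USERS logfile=appuser_imp.log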
We are importing data from one database to another. The schema contains 200 tables, and each table contains 1 million rows. We are using the QUERY option to subset the data with different conditions instead of loading the full table data.
We want to subset data from several tables, but the QUERY option in expdp only accepts about 1,800 characters. Why is it limited to 1,800 characters? Is there a restriction imposed on the QUERY option, and how can we increase the query clause length?
Below is an example of the export command:
otispa/********@otisua1 schemas=tbaadm directory=PA_OTIS_DIR dumpfile=tbaadm_data.dmp CONTENT=all table_exists_action=replace query='tbaadm.ACCOUNT_LIEN_HISTORY_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.DISCRET_ADVN_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.TEMP_DISCRET_ADVN_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.PYMNT_RCPT_DET_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.STOP_PAYMENT_ADDTNL_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.STOP_PAYMENT_REG_TABLE:"where ACID in (select FINACLE_INT_BNK_ACCT_ID from INIT_ACCT_LD)"','tbaadm.GEN_AC...
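The roughly 1,800-character ceiling is usually the OS command line rather than expdp itself; a parameter file sidesteps it, and QUERY may be repeated once per table. A shortened hedged sketch (two of the tables shown; the rest follow the same pattern, and the parfile name is hypothetical):

# subset.par -- hypothetical parameter file
schemas=tbaadm
directory=PA_OTIS_DIR
dumpfile=tbaadm_data.dmp
content=ALL
query=tbaadm.ACCOUNT_LIEN_HISTORY_TABLE:"WHERE ACID IN (SELECT FINACLE_INT_BNK_ACCT_ID FROM INIT_ACCT_LD)"
query=tbaadm.DISCRET_ADVN_TABLE:"WHERE ACID IN (SELECT FINACLE_INT_BNK_ACCT_ID FROM INIT_ACCT_LD)"

> expdp otispa/********@otisua1 parfile=subset.par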
The tables have the same column names but different index structures, and the target is to be partitioned, hence I only want to import the content. Each table on the source database has a sequence-number column, and I only want to extract the last few months of data.
TABLES=table1,table2... DUMPFILE=dump_dir CONTENT=data_only QUERY=table1:"WHERE seq_num >100"
I want to use expdp, but I am not sure how to ensure all tables get the WHERE seq_num > 100 condition. If I leave table1: out and just have QUERY="WHERE seq_num >100", will the condition be applied to all tables, which is what we want?
I'm assuming I can also use impdp with CONTENT=data_only?
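Yes on both counts, as far as I can tell: a QUERY with no table qualifier applies to every table in the job, and impdp accepts CONTENT=data_only. A hedged sketch, assuming every listed table really has a SEQ_NUM column (the job reports an error for any table that lacks it; file and directory names are placeholders):

# export.par -- hypothetical parameter file
tables=TABLE1,TABLE2
content=DATA_ONLY
query="WHERE seq_num > 100"
directory=DATA_PUMP_DIR
dumpfile=seq_subset.dmp
logfile=seq_subset.log

> expdp owner/***** parfile=export.par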
Import: Release 11.2.0.3.0 - Production on Tue Apr 23 13:10:51 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "IMPDB"."SYS_IMPORT_TABLE_01": abc/******** directory=DATA_PUMP_DIR network_link=TESTAR logfile=net_import_proddev.log TABLES=impdb.abc parallel=12 REMAP_SCHEMA=IMPDB:ABC
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables. The table export works fine, but I encounter the following error when I import the file (I have tried data only and data plus DDL):
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...= ORA-39001: argument value invalid ORA-39000: .... ORA-31619: ...
The file is in the right place, the Data Pump folder of the new database. The user is the same on both databases, and the database versions are similar.
Data Pump export is very slow. A 50 GB export has taken more than 24 hours, with the error below.
Database version: 11.2.0.2.0. OS: Windows Server 2008 R2. We added 10 GB of RAM and increased CPUs from 6 to 8, but the issue is the same.
Error:
ORA-31693: Table data object "BNCSDB"."MS_DATA_PTORE" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number 20 with name "_SYSSMU20_4037596720$" too small
Export log:
Export: Release 11.2.0.2.0 - Production on Tue May 14 20:03:25 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
;;; Connected to: Oracle Database 11g Release 11.2.0.2.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01": system/********@orcl dumpfile=BCSDB04_19.dmp logfile=BCSDB04_19.log
I have an Oracle 10g database (10.2.0.3). I am exporting the schema using Grid Control 10g.
Servers:
Source: Oracle 10.2.0.3.0
Target: Oracle RAC 11.2.0.3
I am trying to import the schema into the Oracle RAC 11gR2 (11.2.0.3) server using Cloud Control 12c, and it produces the following error:
Error:
The job IMPORTA_EGESTAO_ESQUEMA_NOVO was reopened on Tuesday, 05 March 2013 11:02
Restarting "SYSTEM"."IMPORTA_EGESTAO_ESQUEMA_NOVO":
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Execute Failed: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at line 18
ORA-06512: at line 44 (DBD ERROR: OCIStmtExecute)
I've been using Data Pump for a long time now, but I have not come across this problem before.
Importing just two tables: Table1 is 100 MB of data (11 million rows); Table2 is 4.2 GB of data (19.6 million rows).
Table1 ran for approx. 5 hours; Table2 ran for approx. 15 hours.
If I run impdp importing both tables with the same parameter file, the default tablespace of the user running the import runs out of space due to: ORA-01691: unable to extend lob segment <owner>.SYS_LOG0001175799C00045$$ by 512 in tablespace USERS. I do not understand why it creates objects in order to import tables into someone else's schema.
The environment is Red Hat Linux 4.1.2-51 running Oracle 11gR2 (11.2.0.1). This is a 9-node RAC using ASM.
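A hedged explanation: the Data Pump master control table (here SYS_IMPORT_TABLE_01) and its system-named LOB segments are created in the schema, and thus the default tablespace, of the user running impdp, not of the schema being loaded, and that is most likely what is growing in USERS. A workaround sketch (user and tablespace names are assumptions):

-- Point the importing user's default tablespace at something roomy
ALTER USER import_admin DEFAULT TABLESPACE big_work_tbs;
-- ... run the impdp job ...
ALTER USER import_admin DEFAULT TABLESPACE users;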
I am trying to export a schema from 11.2.0.2 and getting the below errors on export:
ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.GRANT_EXP(12425,1,...)
ORA-06502: PL/SQL: numeric or value error: NULL index table key value
ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 154
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 7049
ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.GRANT_EXP(12424,1,...)
ORA-06502: PL/SQL: numeric or value error: NULL index table key value
ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 154
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 7049
ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.GRANT_EXP(12423,1,...)
ORA-06502: PL/SQL: numeric or value error: NULL index table key value
ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 154
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 7049
ORA-39127: unexpected error from call to export_string := SYS.DBMS_SCHED_JOB_EXPORT.GRANT_EXP(2003412,1,...)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_SCHED_MAIN_EXPORT", line 2296
ORA-06512: at "SYS.DBMS_SCHED_JOB_EXPORT", line 52
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 7049
I have searched the forum and found similar errors, but I am struggling to understand the cause and solution. Forums seem to indicate it may be related to bug 4358907 (Re: Datapump export/import object grants given by another user), but I am not sure that is relevant, since I can see an ORA-01031: insufficient privileges in this log.
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for Solaris: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
I'm trying to load a table, small in size (110 rows, 6 columns). One of the columns, called NOTES, errors when I run the load; it says that the column size exceeds the maximum limit. As you can see here, the table column is set to 4000 bytes:
CREATE TABLE NRIS.NRN_REPORT_NOTES (
  NOTES_CN      VARCHAR2(40 BYTE)  DEFAULT sys_guid() NOT NULL,
  REPORT_GROUP  VARCHAR2(100 BYTE) NOT NULL,
  AREACODE      VARCHAR2(50 BYTE)  NOT NULL,
  ROUND         NUMBER(3)          NOT NULL,
  NOTES         VARCHAR2(4000 BYTE),
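If this is a SQL*Loader load, the usual cause is that character fields default to CHAR(255), so any NOTES value longer than that triggers the length error regardless of the 4000-byte column. A minimal control-file sketch, with a hypothetical data file name and delimiters:

LOAD DATA
INFILE 'notes.dat'
APPEND INTO TABLE NRIS.NRN_REPORT_NOTES
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( NOTES_CN,
  REPORT_GROUP,
  AREACODE,
  ROUND,
  NOTES CHAR(4000)   -- explicit length overrides SQL*Loader's 255-byte default
)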
Why does export/import require temporary tablespace? Since export/import behaves like DML, when is temporary tablespace needed by the Data Pump utility?
I have a process that exports a schema using expdp and imports it using impdp. Everything creates successfully except for a trigger: the trigger gives an error that the table or view does not exist. The account I use to import the schema is different from the schema user but is a highly privileged account. I notice that the schema in the CREATE OR REPLACE TRIGGER line of code is remapped (I am using remapping in the impdp syntax), while the rest of the trigger body (which is just a sequence trigger for a primary key column) does not carry the schema. To work around the issue, I have my bash script log into Oracle as the schema user after the import and execute the trigger code. Why do I have to do this for trigger code but not for other objects, like views, that create just fine?
How do I map the student_newId field to the student_id field in the STUDENT database table so that the new id is inserted into the student_id column? How do I specify the mapping in the control file? I don't want to create a new column in the student table. In the control file I will specify the below. Is this the best approach? Is there any other way?
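Since the original control-file snippet was not included, here is a hedged sketch of the usual approach: in a SQL*Loader control file the field list simply names the table column each data-file field loads into, so the incoming new-id value is described under STUDENT_ID and no extra table column is needed (file layout and names are assumptions):

LOAD DATA
INFILE 'students.dat'
APPEND INTO TABLE STUDENT
FIELDS TERMINATED BY ','
( STUDENT_ID,      -- first field in each record is the new id
  STUDENT_NAME     -- remaining fields map positionally
)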
The import query is still running, yet it does not show any count of rows imported.
I already created the tablespace the table was in previously, before the drop, but when I check the space in that tablespace, none is being consumed. One error I got previously while performing this task is:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "CDR"."SYS_IMPORT_TABLE_03" successfully loaded/unloaded
Starting "CDR"."SYS_IMPORT_TABLE_03": cdr/********@tsiindia directory=TEST_DIR dumpfile=CAT_IN_DATA_042012.DMP tables=CAT_IN_DATA_042012 logfile=impdpCAT_IN_DATA_042012.log
[code]....
I checked streams_pool_size; it showed zero, so I set it to 48M, and after that:
SQL> show parameter streams_pool_size

NAME                TYPE        VALUE
------------------- ----------- ------
streams_pool_size   big integer 48M
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production

OS version:
Windows 7 64-bit. I have a schema-level export of SCOTT, imported under a different name (SCOTT1). At regular intervals I need to import SCOTT into SCOTT1 without affecting existing records, such that:
1. Newly created records are appended.
2. Updated records are applied.
For the above requirement, I exported as follows:
expdp xxxx/******** schemas=SCOTT directory=dumpdir dumpfile=SCOTT_28-SEP-2012.dmp logfile=exp_SCOTT_28-SEP-2012.log
and imported as follows:
impdp xxxx/******** AS SYSDBA REMAP_SCHEMA=SCOTT:SCOTT1 directory=DUMPDIR dumpfile=SCOTT_28-SEP-2012.dmp logfile=imp_SCOTT2_28-09-2012.log TRANSFORM=SEGMENT_ATTRIBUTES:n TABLE_EXISTS_ACTION=APPEND
The problem is I was unable to append the records to the existing tables; the log shows errors like these:
ORA-31684: Object type USER:"SCOTT1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
[code].....
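A hedged note on the underlying requirement: TABLE_EXISTS_ACTION=APPEND only inserts rows, so updated records in SCOTT will never overwrite their older versions in SCOTT1 (and duplicates or constraint violations are likely). One common pattern is to import into a staging schema and reconcile with MERGE; a sketch with placeholder table and column names:

MERGE INTO scott1.emp t
USING scott_stage.emp s
ON (t.empno = s.empno)
WHEN MATCHED THEN
  UPDATE SET t.ename = s.ename, t.job = s.job, t.sal = s.sal
WHEN NOT MATCHED THEN
  INSERT (empno, ename, job, sal)
  VALUES (s.empno, s.ename, s.job, s.sal);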
AIX 6.1, 11.2.0.3. I have an expdp dump from prod to be imported into our test database. I imported it using impdp, but to my surprise the tables were imported while lots of indexes were not created, even though I used TRANSFORM=SEGMENT_ATTRIBUTES:N just to use the default USERS tablespace. How do I import the indexes separately, skipping the tables and other objects?
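A hedged sketch of a second pass over the same dump that picks up only the index definitions, or spools their DDL to a script first so it can be inspected and edited (directory and file names are placeholders):

# Re-run the import for indexes only
impdp system/***** directory=DATA_PUMP_DIR dumpfile=prod.dmp include=INDEX logfile=index_imp.log

# Or just extract the index DDL without executing it
impdp system/***** directory=DATA_PUMP_DIR dumpfile=prod.dmp include=INDEX sqlfile=create_indexes.sql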