Server Utilities :: Error During Expdp On Oracle 11gR2 On Solaris?

Dec 2, 2010

For some days now I have been getting this error during a Data Pump export:

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31687: error creating worker process with worker id 1
ORA-31687: error creating worker process with worker id 1
ORA-31688: Worker process failed during startup.

This error is random; if I retry after a few minutes, the expdp works correctly.

View 8 Replies


Server Utilities :: Using Expdp / Impdp To Migrate 4 TB Database From Solaris To Linux

Aug 4, 2011

I am using expdp/impdp to migrate a 4 TB database from Solaris to Linux, but the import process is taking forever.

View 13 Replies View Related

Server Utilities :: EXPDP With Query Error?

Jun 3, 2010

While trying to run expdp with a QUERY clause, I am getting the syntax-related errors shown below:

expdp system/xxxx SCHEMAS=LOG NETWORK_LINK=DBLINK1 INCLUDE=TABLE:"IN('DAILY_LOG')" QUERY=LOG.DAILY_LOG:"where entry_date< to_char(sysdate -1,'yyyymmdd')" DIRECTORY=dump DUMPFILE=log_exp.dmp logfile=log_exp.log

But it gives the following error:
ORA-31693: Table data object "LOG"."DAILY_LOG" failed to load/unload and is being skipped due to error:
ORA-00904: "YYYYMMDD": invalid identifier

I tried the same condition in plain SQL with YYYYMMDD and it works fine; entry_date is a CHAR field. Where am I going wrong in the QUERY clause?
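
The ORA-00904 here is usually a quoting problem rather than a SQL problem: by the time the QUERY clause reaches the database, the single quotes around 'yyyymmdd' have been consumed by the OS shell, so the format mask is parsed as an identifier. A way to sidestep shell quote handling entirely is to put the filter in a parameter file. A minimal sketch, reusing the parameters from the command above (the parfile name log_exp.par is hypothetical):

# log_exp.par
SCHEMAS=LOG
NETWORK_LINK=DBLINK1
INCLUDE=TABLE:"IN ('DAILY_LOG')"
QUERY=LOG.DAILY_LOG:"WHERE entry_date < to_char(sysdate - 1, 'yyyymmdd')"
DIRECTORY=dump
DUMPFILE=log_exp.dmp
LOGFILE=log_exp.log

expdp system/xxxx parfile=log_exp.par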

View 4 Replies View Related

Server Utilities :: Syntax Error In Using Query Parameter In Expdp

Aug 17, 2013

I want to take an export of table MESSAGE, filtered to the day of 17 JUL 2013 (just to limit the size). I used the following expdp command, but it is not working.

expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile=FA0001P_BG_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')

But with select query i am able to retrieve the rows for the specific date.

select * from MESSAGE where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
Here is the command with the syntax error:
[oracle@orcl log]$ expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile= DB_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
-bash: syntax error near unexpected token `('
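
The "-bash: syntax error near unexpected token `('" comes from the shell, not from expdp: because the QUERY value is not quoted, bash tries to interpret the parentheses and spaces itself and never even starts the export. The usual workaround is to move QUERY into a parameter file so no shell escaping is needed. A sketch under that assumption (the parfile name is hypothetical):

# message_exp.par
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=DB_16_08_2013.dmp
LOGFILE=FA0001P_BG_16_08_2013.log
TABLES=schema.MESSAGE
QUERY=schema.MESSAGE:"WHERE created_on BETWEEN to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') AND to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')"

expdp SYSTEM parfile=message_exp.par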

View 3 Replies View Related

Server Utilities :: Expdp Hangs In Oracle 11g?

May 27, 2011

I am trying to export a schema using the expdp command, but it hangs after a few minutes; it seems to get stuck somewhere. Even when I try with the standard SCOTT schema, it hangs.
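
When expdp appears to hang, a first step worth trying is to check from another session whether the Data Pump job was created at all and what state it and its sessions are in; a sketch, run as a DBA in SQL*Plus:

SELECT owner_name, job_name, operation, job_mode, state, attached_sessions
  FROM dba_datapump_jobs;

SELECT owner_name, job_name, saddr
  FROM dba_datapump_sessions;

If the job shows as EXECUTING but nothing moves, the log file in the DIRECTORY location and the alert log are the next places to look.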

View 16 Replies View Related

RAC/ASM Clusterware Installation :: 11gR2 - Solaris LDOM On Same Physical Server

Aug 14, 2013

I have a question: does anyone run, in production, two or more 11gR2 RAC clusters on different Solaris LDOMs on the same physical server?

View 0 Replies View Related

Server Utilities :: Overwrite Existing Dump File In Expdp In Oracle 10g?

Apr 13, 2012

How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the dump file already exists, we get the error below:

ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists

There is the REUSE_DUMPFILES=Y feature in 11g, but it does not work in 10g. Is there something that can overwrite an existing dump file in 10g?
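
Since 10g has no REUSE_DUMPFILES option, one workaround is to remove the old file before the export starts. If the export is driven from a script that already connects to the database, UTL_FILE.FREMOVE can delete the previous dump through its directory object; a sketch assuming a directory object named DATA_PUMP_DIR and the file name from the error above:

BEGIN
  UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'scott_emp.dmp');  -- delete last run's dump
EXCEPTION
  WHEN OTHERS THEN
    NULL;  -- ignore the error raised when the file does not exist yet
END;
/

Alternatively, delete the file at the OS level (del / rm) in the batch script that calls expdp.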

View 1 Replies View Related

Server Utilities :: Oracle Migration From 8 To 11gR2?

Sep 13, 2010

I have a Windows server running Oracle 8.0.5. I have to migrate to another server running Oracle 11gR2 on Windows 2008. Some say I can't do export and import directly, and some say a database link does not work from 11gR2 to 8.

What would be the fewest steps to migrate? Somebody suggested using an Oracle directory plus VB and PowerBuilder programs to send the data and DDL from 8 to 11gR2.

In short, I am migrating from an Oracle 8.0.5 machine to an 11gR2 machine.

View 6 Replies View Related

Server Utilities :: Does EXPDP Back Up At Block Level Like RMAN Does?

May 29, 2013

I have a question about expdp and RMAN. Does the expdp utility take its backup at the block level, the way RMAN does? Which one is faster, expdp or RMAN?

View 16 Replies View Related

Server Utilities :: Export / Import XML Records In Oracle 11gR2

Jan 28, 2012

In Oracle 11gR2 I face a problem when I export/import XML records in tables. Every time it takes a huge amount of time (2 or 3 days) to import or export the data, yet the dump size is very small (4 GB). The dump comes from a vendor, so I don't understand whether this is a data structure problem or whether this is normal behaviour when importing XML records.

I have not worked with XML records in an Oracle database before. I am using the Data Pump feature. I should also mention that when I drop the schema (the one into which I imported the XML data), that also takes 2-3 days. The import hangs at the following stage:

=========================
bash-3.2$ impdp system/sys123 DIRECTORY=isb_dir DUMPFILE=jblbld_20110301_01.dmp,jblbld_20110301_02.dmp,jblbld_20110301_03.dmp logfile=JBLBLD_28Jan2012.log schemas=TBLD;date

Import: Release 11.2.0.1.0 - Production on Sun Jan 29 11:10:38 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_02" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_02": system/******** DIRECTORY=isb_dir DUMPFILE=jblbld_20110301_01.dmp,jblbld_20110301_02.dmp,jblbld_20110301_03.dmp logfile=JBLBLD_28Jan2012.log schemas=TBLD
Processing object type SCHEMA_EXPORT/USER
[code]...

The machine has 16 CPUs and 32 GB of RAM, and during the export/import most of the memory stays free.
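
XML columns often prevent Data Pump from using direct path, so very slow row-by-row loading is a plausible cause; a first check is whether the job is progressing at all rather than truly hung. A sketch, matching the import job shown in the log above:

SELECT sid, serial#, opname, sofar, totalwork, units, time_remaining
  FROM v$session_longops
 WHERE opname LIKE '%IMPORT%'
   AND sofar <> totalwork;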

View 1 Replies View Related

Server Utilities :: Export Sequence Using Expdp

Apr 13, 2011

I would like to export specific tables (not the entire schema) including their metadata. I am using a parameter file for expdp.

Tables=emp,dept

Does this also include all metadata, or should I also add the INCLUDE line below to the parfile?

INCLUDE =Indexes,Sequences,Procedures,Views
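
In table mode (TABLES=emp,dept), objects that belong to the listed tables — indexes, constraints, grants, triggers, and table statistics — are exported by default, so no INCLUDE is needed for those. Sequences, procedures, and views are not table-dependent objects and are not picked up by a table-mode job; they need a schema-mode export, where INCLUDE takes one object type per occurrence. A sketch of both variants (directory and file names are hypothetical):

# table mode: dependent objects of EMP and DEPT come along automatically
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=emp_dept.dmp
LOGFILE=emp_dept.log
TABLES=emp,dept

# schema mode: needed if sequences/procedures/views must be in the same dump
# SCHEMAS=scott
# INCLUDE=TABLE:"IN ('EMP','DEPT')"
# INCLUDE=SEQUENCE
# INCLUDE=PROCEDURE
# INCLUDE=VIEW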

View 3 Replies View Related

Server Utilities :: Slow Expdp After Upgrade

Oct 1, 2012

We had AIX on a 570 machine with database 10.2.0.4. We took an expdp every night and it took about 2 hours to complete.

Now we have upgraded to 10.2.0.5 on a 770 machine, and the same command takes 6 hours to complete, even though both the database and the hardware have been upgraded.

Command is

expdp T24SILK/oracle directory=backup dumpfile=exp_beod_T24_%U_$dt.dmp logfile=exp_T24_$dt.log EXCLUDE=TABLE:"LIKE '%TRACE'" parallel=6

View 1 Replies View Related

Server Utilities :: Performance Diff In Exp And Expdp?

Jun 4, 2010

I export a table using the exp utility and it takes 30 minutes to complete. The same export done with the expdp utility takes only 10 minutes.

How does that happen?

View 3 Replies View Related

Server Utilities :: Excluding Data Of Some Tables In EXPDP

Oct 5, 2013

I want to exclude only the data of some particular tables, not the complete table objects, when exporting with expdp.
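
Data Pump can filter out just the rows of named tables while still exporting their definitions, via the TABLE_DATA object type in EXCLUDE. A minimal sketch, assuming a schema-mode export and hypothetical table names:

SCHEMAS=myschema
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=myschema.dmp
LOGFILE=myschema.log
EXCLUDE=TABLE_DATA:"IN ('BIG_LOG_TABLE','AUDIT_TRAIL')"

The excluded tables are still created on import, only empty.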

View 13 Replies View Related

Server Utilities :: Expdp Displays Output In German

Aug 26, 2012

I have a server configured for German and English. When I connect with SQL*Plus I get German-language server output, but running "alter session set nls_language='AMERICAN'" solves the issue for me.

I need the same for the expdp command, but I don't know how to do it. I tried adding an nls_language parameter, but expdp does not recognize it. Is it possible to somehow get the expdp output, and the messages written to its log file, in English?
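
expdp has no NLS_LANGUAGE parameter; its client messages (and what it writes to the log file) follow the NLS_LANG environment variable of the session that launches it. A sketch assuming a bash shell; the character-set part of NLS_LANG is only an example and should match your database:

export NLS_LANG=AMERICAN_AMERICA.AL32UTF8
expdp system/xxxx directory=DATA_PUMP_DIR schemas=test dumpfile=test.dmp logfile=test.log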

View 5 Replies View Related

Server Utilities :: Data Pump Expdp To ASM Daily?

Jun 16, 2011

I succeeded in running expdp to an ASM diskgroup, as follows:

create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log

SUCCESS MESSAGE

. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59

I would like to run this daily and delete dumps older than 14 days, but my script throws an error. What would be a working way to script this?

#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History

[code]...
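
Since the dump files live inside ASM, the aging-out part cannot use plain rm/find; it has to go through asmcmd (or date-stamped names that the script can compute and delete). A rough sketch, not the poster's original script, assuming the Oracle environment is already set for the cron user and that the dump name carries the date:

#!/bin/bash
DT=$(date +%Y%m%d)
# date from 14 days ago; GNU "date -d" is not available on Solaris, hence perl
OLD=$(perl -e '@t=localtime(time-14*86400); printf "%04d%02d%02d",$t[5]+1900,$t[4]+1,$t[3];')

expdp oraasfs/oraasfs2301 directory=asmexpdir \
      dumpfile=SBSR_EXP_${DT}.dmp tables=TM_SFS_CUST_01 \
      logfile=EXPDP_LOG:SBSR_EXP_${DT}.log

# remove the dump written 14 days ago; asmcmd must run with the ASM instance environment
asmcmd rm "+RECO/FILTDB/EXPDP/SBSR_EXP_${OLD}.dmp" 2>/dev/null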

View 9 Replies View Related

Server Utilities :: Export Data Using Expdp To Remote Host

Jun 5, 2012

I have two servers, 'A' and 'B'. On Server A there is a schema named "test" containing a table "t1". I want to import this t1 table into Server B.

Is it possible to export a dump to a remote host using expdp?

I found there is an option for this, "network_link". To test it, I created a database link named "vxmldb" from Server "B" to "A".

When I use the command below on Server B, I get the following error:

C:>expdp directory=data_pump_dir logfile=test.log network_link=vxmldb schemas=test dumpfile=test.dmp

Export: Release 11.1.0.6.0 - 64bit Production on Tuesday, 05 June, 2012 14:22:07
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: system/vxmldb@vxmldb
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39170: Schema expression 'TEST' does not correspond to any schemas.

In the above command:

directory ---> location on Server "B"
network_link ---> name of the db link created on Server "B" to access Server "A"
schemas ---> schema to be exported; it exists in the Server "A" database
username/password ---> privileged username/password for Server "A"
@connectString ---> connects to Server "A"
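
ORA-39170 means Data Pump looked through the database on the far end of the link and found no schema matching 'TEST'. One thing worth checking is the connection direction: for a network-mode export, the expdp client should log in to the local database (Server B) without the @vxmldb connect string, and let NETWORK_LINK reach across to Server A, where the TEST schema lives. A hedged sketch of that invocation, run on Server B:

expdp system/password directory=data_pump_dir network_link=vxmldb schemas=test dumpfile=test.dmp logfile=test.log

If the error persists, verify from Server B that the schema really is visible over the link, for example:

SELECT username FROM dba_users@vxmldb WHERE username = 'TEST';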

View 15 Replies View Related

Server Utilities :: Export / Dump With Expdp For Non-privileged Users?

May 29, 2012

We are DB users (not DBAs) and have always used exp/imp before application upgrades.

I was googling around and read something like "Oracle Data Pump - Time to let go of Exp / Imp". It seems exp/imp is obsolete.

Our system doesn't have the "expdp" command:

> find . -name expdp
>

Is this because our SQL*Plus / client installation is too old?

> sqlplus
SQL*Plus: Release 8.1.7.0.0 - Production on Tue May 29 16:05:28 2012
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Enter user-name: ^C^C

- does our DBA need to give us privileges to run expdp/impdp?

- Is it true that an expdp/impdp dump is written on the Oracle server (not on the client machine)?

View 4 Replies View Related

Server Utilities :: Using Expdp To Exclude The Stream Objects While Doing Import?

Feb 1, 2012

I have one prod and one devl database, with Streams set up on prod.

I have to refresh devl from prod, but if I go with a full expdp the db_links also get imported into devl and may cause problems there.

Is there any way, using expdp/impdp, to exclude the Streams objects and database links while doing the import?
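
Database links can be kept out of the dump with EXCLUDE=DB_LINK, and the schema that owns the Streams objects (queues, capture/apply configuration) can be skipped with an EXCLUDE=SCHEMA filter. A sketch of an export parfile under those assumptions — STRMADMIN is a hypothetical Streams administrator name:

FULL=Y
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=prod_full_%U.dmp
LOGFILE=prod_full.log
EXCLUDE=DB_LINK
EXCLUDE=SCHEMA:"='STRMADMIN'"

The same EXCLUDE parameters can instead be supplied to impdp if the dump has already been taken.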

View 1 Replies View Related

Server Utilities :: Expdp Using Remap_schema Will It Also Remap Grants And Synonyms

May 1, 2010

If I use expdp with REMAP_SCHEMA, will it also remap grants and synonyms?

View 5 Replies View Related

Expdp Slower After 11gR2 Database Upgrade?

Aug 6, 2012

After upgrading 11gR1 database (11.1.0.7.0) to 11gR2 (11.2.0.3.0), the datapump exports have been taking quite a bit longer. When database was 11gR1, a full expdp took approx. 40-45 minutes. After upgrade, it takes approx. 1 hour 40-50 minutes. These times were with parallel=4. I tried with parallel=8 and parallel=12, both of these took around 1 hour 5-10 minutes, better but still quite a bit slower than pre-11gR2 upgrade. I tried with exclude=statistics, index_statistics, indexes; it still took approx. 1 hour 40-45 minutes. This is a PeopleSoft database so there are many, many objects to be exported. The database was upgraded using dbua.

View 1 Replies View Related

Server Utilities :: Taking Export Dump Using Expdp Of Some Schema's Of Total Size Is 300GB

Mar 30, 2007

I'm taking an export dump of some schemas using expdp; the total size is 300 GB. This is the par file:

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema01.dmp,expdp_schema02.dmp,expdp_schema03.dmp,expdp_schema04.dmp,expdp_schema05.dmp,expdp_schema06.dmp,expdp_schema07.dmp,expdp_schema08.dmp,expdp_schema09.dmp,expdp_schema10.dmp,expdp_schema11.dmp,expdp_schema12.dmp,expdp_schema13.d
[code]....

One schema alone is 250 GB and the total size of all the schemas is 300 GB. The filesystem where I am writing the dump has 350 GB of space, but even then the expdp failed with:

ORA-39095: Dump file space has been exhausted: Unable to allocate 8192 bytes

Why did it fail, and how do I restart it and make sure it runs successfully without this error?
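
ORA-39095 is raised when a Data Pump worker needs a new dump file but none of the names it was given can accept more data — easy to hit with a fixed list of file names, especially combined with PARALLEL. Letting Data Pump generate pieces on demand with the %U wildcard avoids that, and a stopped job can be resumed instead of rerun. A sketch, keeping the 30 GB FILESIZE from the parfile above:

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema_%U.dmp
LOGFILE=expdp_schema.log

To resume the failed job rather than start over, attach to it by name (the job name appears in the original log) and restart it:

expdp system/xxxx attach=<JOB_NAME>
Export> START_JOB
Export> CONTINUE_CLIENT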

View 4 Replies View Related

Server Utilities :: Migrate 10gR2 To 11gR2 On Different Server

Jul 11, 2011

Here are the cards I've been dealt:

I've inherited a 10.2.0.1.0 instance running on a Windows 2003 box; it runs fine, with no problems other than that the system has been in production since 2005 and has gotten pretty old and tired. This old box has one tablespace on it... called "gateway".

I've installed 11.2.0.1 on a new (Windows 2008 R2 Enterprise 64-bit) server and created an empty database also called "gateway".

Now to move the data and views, objects, everything.

I've read up on a variety of migration techniques (oops, I mean "upgradation" LOL) and can follow the steps...

In short, I want to pull everything off of server a (10.2) and put it into production on server b (11.2). There seem to be quite a few options.

1. install 10.2 on my NEW server (server b), move the data over and get everything running, then install 11.2 and have it upgrade the database as part of the installation process.
2. drop the empty tablespace on server b, stop the database on server a, copy the files over from the old to the new home, run DBUA or set the compatibility attribute...
3. run some type of server a to server b utility that can bridge the two over the network. Some type of mirroring technique?
4. run file export scripts on server a, copy files to server b and run various import scripts

I tend to think that option 3 would be the best because both instances are in great health and are running right now. Is there a mechanism that allows the 11.2 instance to see and upgrade from a different server?

View 9 Replies View Related

Server Utilities :: ORACLE Import Error?

Feb 23, 2012

I have a problem with an import into Oracle 8.1.7. The size of the import file is 29600 KB and the tablespace size is 16 GB, and when I try to run the import, Oracle returns this message:

IMP-00003: ORACLE error 1659 encountered
ORA-01659: unable to allocate MINEXTENTS beyond 7 in tablespace DATA

The DATA tablespace is full. I think the import file contains storage information from the original tablespace from which the export was made, but I don't know how to resolve the problem.
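
ORA-01659 means the DATA tablespace could not provide the extents that the imported object's storage clause asks for, so the usual fix is to give the tablespace room to grow before re-running the import. A sketch with hypothetical datafile paths:

ALTER TABLESPACE DATA ADD DATAFILE '/u01/oradata/ORCL/data02.dbf' SIZE 2000M;

-- or let an existing file grow on demand
ALTER DATABASE DATAFILE '/u01/oradata/ORCL/data01.dbf' AUTOEXTEND ON NEXT 100M MAXSIZE UNLIMITED;

Another option with the classic imp utility is to pre-create the table with smaller extents and import with IGNORE=Y.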

View 10 Replies View Related

Server Utilities :: EXP-00008 / ORACLE Error 933 Encountered

Feb 27, 2012

Every day we take a full export of an Oracle database (9.2.0.1.0) on a Windows 2003 server. For the last couple of days I have been getting the error below. I checked the solutions on Google, which talk about user quotas; the user I take the full backup with has an unlimited quota, and I even tried another user that also has an unlimited quota on its default tablespace, but I still get the following:

E:\backup>exp system/manager file=full26022012.dmp log=full26022012.log full=y statistics=none
EXP-00008: ORACLE error 933 encountered
ORA-00933: SQL command not properly ended
ORA-06512: at "SYS.DBMS_RULE_EXP_RL_INTERNAL", line 311
ORA-06512: at "SYS.DBMS_RULE_EXP_RULES", line 142
ORA-06512: at line 1
EXP-00083: The previous problem occurred when calling SYS.DBMS_RULE_EXP_RULES.schema_info_exp

Even if I run the full export as any other user that has the full export privilege and unlimited quotas, I get the same error:

E:\backup>exp username/password file=full26022012.dmp log=full26022012.log full=y statistics=none
EXP-00008: ORACLE error 933 encountered
ORA-00933: SQL command not properly ended
ORA-06512: at "SYS.DBMS_RULE_EXP_RL_INTERNAL", line 311
ORA-06512: at "SYS.DBMS_RULE_EXP_RULES", line 142
ORA-06512: at line 1
EXP-00083: The previous problem occurred when calling SYS.DBMS_RULE_EXP_RULES.schema_info_exp

View 4 Replies View Related

Server Utilities :: Export Oracle Error Messages

Nov 5, 2004

I have an Oracle 10.1.0.3 instance running on Solaris 5.8, and when I export the database I get a bunch of error messages that I can't explain:

. exporting private type synonyms
. exporting object type definitions
. exporting system procedural objects and actions
EXP-00008: ORACLE error 6550 encountered
ORA-06550: line 1, column 13:
[code]........

View 9 Replies View Related

Server Utilities :: Error While Importing Dump File In Oracle 10g R1

Nov 1, 2012

While trying to import a schema using Data Pump, I am facing the following issue: UDI-00018 - Import utility version can not be more recent than the Data Pump server. Following is the version information of the source and target DBs and the utilities:

Source DB server : 10.1.0.2.0
Export utility : 10.1.0.2.0
Import utility : 10.1.0.2.0

Target DB server : 10.1.0.2.0
Export utility : 10.2.0.1.0
Import utility : 10.2.0.1.0

View 5 Replies View Related

Server Utilities :: Oracle Data Pump Import Error

Jul 26, 2010

I am trying to import a database dump using the following command:

impdp system/xxxx@xxxx schemas=staging
remap_schema=staging:staging directory=DUMPDIR dumpfile=staging.dmp logfile=impdpstaing.log
TRANSFORM=SEGMENT_ATTRIBUTES:n

It imports data fine up to some stage; after that, Oracle gives the following error:

Processing object type SCHEMA_EXPORT/JAVA_SOURCE/JAVA_SOURCE
ORA-39097: Data Pump job encountered unexpected error -1423
ORA-39065: unexpected master process exception in DISPATCH
ORA-01423: error encountered while checking for extra rows in exact fetch
ORA-04030: out of process memory when trying to allocate 123404 bytes (QERHJ has
h-joi,kllcqas:kllsltba)

ORA-39014: One or more workers have prematurely exited.
Job "SYSTEM"."SYS_IMPORT_SCHEMA_04" stopped due to fatal error at 11:42:03

I thought it was due to lack of memory, so I increased pga_aggregate_target from 512 MB to 600 MB, but I am still getting the same error.

View 5 Replies View Related

Server Administration :: Migration Of Oracle From AIX To Solaris

Jan 4, 2013

We have to migrate our database from current OS (AIX 5.3) to Solaris 10. We'll do it through expdp / impdp. Is there any other way to do it as well?

Also, we have Data Guard configured in the current setup. Is there any way we can take a backup of the existing database at the DR site and restore it on the new (Solaris) server, or will we have to take a fresh backup from the DC (after migration), ship it to DR, and create a new standby?

View 8 Replies View Related

Server Utilities :: Execution Of Oracle 11g Wrapped Procedure Gives Ora-00900 Error

Apr 4, 2013

I have a procedure that I wrapped using the Oracle 11g wrap utility. If I execute the wrapped procedure using JDBC, I get an ORA-00900 invalid SQL statement error.

The procedure contains only basic SQL statements. If I wrap the same procedure using Oracle 9i and execute it via JDBC, it works fine. Is there any difference between the Oracle 9i wrap utility and the Oracle 11g wrap utility?

I even tried the Oracle 10g wrap utility, and it does not work either.

View 4 Replies View Related






