Server Utilities :: Expdp Using Remap_schema Will It Also Remap Grants And Synonyms
May 1, 2010
If I use expdp with REMAP_SCHEMA, will it also remap grants and synonyms?
OS: RHEL
DB: 11.2.0.2
Every time I try to refresh my production DB with an old expdp dump file using Data Pump, I run into issues with grants and synonym creation. My DB has three schemas with many dependencies among them, and before refreshing I drop the schemas and recreate them:
Drop user user_name cascade;
So I want to know: is there a script that can capture all the grants in the DB before I drop the schemas, so that I can re-grant them after the import, and also a query that lists all the synonyms in the DB?
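A minimal sketch of capturing that information before the drop, assuming the three schemas are named SCHEMA1, SCHEMA2 and SCHEMA3 (placeholder names) and that it is run as a DBA; note that DBMS_METADATA.GET_GRANTED_DDL raises an error for a user that has no grants of the requested type:

SET LONG 100000 PAGESIZE 0 LINESIZE 200 TRIMSPOOL ON
SPOOL grants_and_synonyms.sql
-- object and system grants made to the three schemas
SELECT DBMS_METADATA.GET_GRANTED_DDL('OBJECT_GRANT', username)
FROM   dba_users WHERE username IN ('SCHEMA1','SCHEMA2','SCHEMA3');
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', username)
FROM   dba_users WHERE username IN ('SCHEMA1','SCHEMA2','SCHEMA3');
-- DDL for every synonym owned by the three schemas
SELECT DBMS_METADATA.GET_DDL('SYNONYM', synonym_name, owner)
FROM   dba_synonyms WHERE owner IN ('SCHEMA1','SCHEMA2','SCHEMA3');
SPOOL OFF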
Is it possible to remap the database schema during export?
Our developers have their databases stored in individual schemas, and I want to provide a dump file that each developer can easily import into his own schema. But when I want to impdp the dump file, I have to know the schema name inside the dump in order to remap it to the individual developer's schema, so being able to set a specific schema name inside the dump file would be great.
At the moment I'm getting ORA-39146 (schema does not exist) when importing the database.
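A sketch of what the per-developer import could look like, with placeholder names for the directory object, dump file and schemas; the second command only writes DDL to a script, which is one way to discover the schema name stored in the dump:

impdp system DIRECTORY=DP_DIR DUMPFILE=app.dmp LOGFILE=imp_dev42.log REMAP_SCHEMA=SRC_SCHEMA:DEV42
impdp system DIRECTORY=DP_DIR DUMPFILE=app.dmp SQLFILE=contents.sql INCLUDE=USER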
I did an export from 8.1.7 and imported it into 10g. All the tables were imported. I created all the users and gave them the necessary grants. When I try to compile the synonyms for the schema I imported, I get the following error in Enterprise Manager:
Failed to compile: ORA-00980: synonym translation is no longer valid
I cannot log in to my application with these users even though they have all the grants. I receive the following error:
FRM-40735:PRE-FORM trigger raised unhandled exception ORA-00942
When I click OK, the screen disappears.
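Since ORA-00980 and ORA-00942 usually mean a synonym points at an object that did not come across, one simple check is to list the synonyms of the imported schema (MYAPP is a placeholder name) whose targets do not exist:

SELECT s.owner, s.synonym_name, s.table_owner, s.table_name
FROM   dba_synonyms s
WHERE  s.owner = 'MYAPP'
AND NOT EXISTS (SELECT 1 FROM dba_objects o
                WHERE o.owner = s.table_owner
                AND   o.object_name = s.table_name);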
I did an export using the following parfile (see below). I want to import all the objects associated with this schema into another DB, but I don't want to overwrite any of the permissions, such as grants.
Is there a way I can get the grants into a SQL file before I do the import? If so, provide an example.
cat exp_par
DUMPFILE=exp.dmp
LOGFILE=exp.log
DIRECTORY=DBBACKUP
schemas=t1
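One possible approach, sketched with the same directory and dump file as the parfile above (the SQLFILE name is a placeholder): have impdp write only the grant DDL to a script instead of executing it, then run the real import with grants excluded.

impdp system DIRECTORY=DBBACKUP DUMPFILE=exp.dmp SQLFILE=grants_only.sql INCLUDE=GRANT
impdp system DIRECTORY=DBBACKUP DUMPFILE=exp.dmp LOGFILE=imp.log EXCLUDE=GRANT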
I want to do a schema export from database A. There are hundreds of users under this schema. I have to import this schema into another database, say B. My questions are:
1) Do I need to pre-create only the schema user, or all the users under it?
2) Will the schema export include all the roles, procedures, packages, synonyms, functions and triggers? (One way to check is sketched after this list.)
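A sketch of one way to check, with placeholder directory and file names: an import run with SQLFILE writes the DDL contained in the dump (users, grants, procedures, synonyms and so on) to a script without creating anything, so you can see exactly what the schema export carried over.

impdp system DIRECTORY=DP_DIR DUMPFILE=schema_a.dmp SQLFILE=preview.sql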
I have a question about expdp and RMAN. Does the expdp utility back up at the block level the way RMAN does? Which one is faster, expdp or RMAN?
I would like to export specific tables (not the entire schema) including metadata. I am using a parameter file for expdp.
Tables=emp,dept
Does this also include all the metadata, or should I also add the INCLUDE below to the parfile?
INCLUDE =Indexes,Sequences,Procedures,Views
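For reference, a sketch of a table-mode parfile with placeholder directory and file names; a table-mode export already carries the tables' dependent objects (indexes, constraints, triggers, grants), whereas standalone objects such as sequences, procedures and views are not part of table mode and would need a schema-mode job instead of an INCLUDE.

DIRECTORY=DP_DIR
DUMPFILE=emp_dept.dmp
LOGFILE=emp_dept.log
TABLES=emp,dept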
We had AIX on a 570 machine and database 10.2.0.4. We ran expdp and it took around two hours to complete every night.
Now we have upgraded to 10.2.0.5 on a 770 machine, and the same command takes 6 hours to complete, even though both the database and the hardware were upgraded.
Command is
expdp T24SILK/oracle directory=backup dumpfile=exp_beod_T24_%U_$dt.dmp logfile=exp_T24_$dt.log EXCLUDE=TABLE:"LIKE '%TRACE'" parallel=6
I am trying to export a schema using the expdp command, but it hangs after a few minutes; it seems to get stuck somewhere. Even when I try with the normal SCOTT schema, it hangs.
I export a table using the exp utility and it takes 30 minutes to complete. The same export done with the expdp utility takes 10 minutes to complete.
How does that happen?
While trying to run expdp with a QUERY clause, I am getting the syntax-related errors shown below:
expdp system/xxxx SCHEMAS=LOG NETWORK_LINK=DBLINK1 INCLUDE=TABLE:"IN('DAILY_LOG')" QUERY=LOG.DAILY_LOG:"where entry_date< to_char(sysdate -1,'yyyymmdd')" DIRECTORY=dump DUMPFILE=log_exp.dmp logfile=log_exp.log
But it gives the following error:
ORA-31693: Table data object "LOG"."DAILY_LOG" failed to load/unload and is being skipped due to error:
ORA-00904: "YYYYMMDD": invalid identifier
I tried the same thing in plain SQL with 'yyyymmdd' and it works fine; entry_date is a CHAR field. Where am I going wrong in the QUERY clause?
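One likely cause is the operating-system shell stripping the quotes around the format mask, which leaves the bare identifier YYYYMMDD in the WHERE clause. A common workaround is to move the QUERY (and INCLUDE) into a parameter file, sketched below with the names from the post, and run expdp system PARFILE=log_exp.par:

SCHEMAS=LOG
NETWORK_LINK=DBLINK1
INCLUDE=TABLE:"IN('DAILY_LOG')"
QUERY=LOG.DAILY_LOG:"WHERE entry_date < to_char(sysdate-1,'yyyymmdd')"
DIRECTORY=dump
DUMPFILE=log_exp.dmp
LOGFILE=log_exp.log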
I want to exclude only the data of some particular tables, not the complete table objects, when exporting with expdp.
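A sketch of one way to do that with the TABLE_DATA filter, using placeholder schema, directory and table names in a parfile:

SCHEMAS=MYSCHEMA
DIRECTORY=DP_DIR
DUMPFILE=myschema.dmp
LOGFILE=myschema.log
EXCLUDE=TABLE_DATA:"IN ('BIG_TABLE1','BIG_TABLE2')"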
I have a server configured for German and English. When I connect with SQL*Plus I get German-language server output, but running "alter session set nls_language='AMERICAN'" solves the issue for me.
I need the same for the expdp command, but I don't know how to do it. I tried adding an nls_language parameter, but expdp doesn't recognize it. Is it possible to somehow get the expdp output, and the log file it writes, in English?
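A hedged sketch: the Data Pump client takes its language from the NLS_LANG environment variable rather than from a command-line parameter, so setting it before the call usually switches the client messages to English (whether the server-written log file follows depends on the session settings; connection details and names below are placeholders).

export NLS_LANG=AMERICAN_AMERICA.AL32UTF8
expdp user/password DIRECTORY=DP_DIR DUMPFILE=exp.dmp LOGFILE=exp.log SCHEMAS=MYSCHEMA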
I succeeded in running expdp to an ASM diskgroup, like this:
create directory asmexpdir as '+RECO/FILTDB/EXPDP';
grant read,write on directory asmexpdir to oraasfs;
expdp oraasfs/oraasfs2301 directory=asmexpdir dumpfile=SBSR_EXP.dmp tables=TM_SFS_CUST_01 logfile=EXPDP_LOG:SBSR_EXP.log
SUCCESS MESSAGE
. . exported "ORAASFS"."TM_SFS_CUST_01" 387.2 MB 817684 rows
Master table "ORAASFS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ORAASFS.SYS_EXPORT_TABLE_01 is:
+RECO/filtdb/expdp/sbsr_exp.dmp
Job "ORAASFS"."SYS_EXPORT_TABLE_01" successfully completed at 03:34:59
I would like to run this daily and delete dumps older than 14 days, but my script shows an error. What can be the solution to make this script run?
#!/bin/bash
#Script to Perform Datapump Export backup Every Day
################################################################
#Change History
[code]...
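Since the rest of the script is not shown, here is only a sketch of the usual shape of such a job, with placeholder paths and a filesystem-based directory object; dumps written into ASM (as in the example above) would need asmcmd rather than find for the cleanup step.

#!/bin/bash
# daily Data Pump export with a date stamp, then purge files older than 14 days
export ORACLE_HOME=/u01/app/oracle/product/11.2.0/dbhome_1   # placeholder
export ORACLE_SID=FILTDB
export PATH=$ORACLE_HOME/bin:$PATH
dt=$(date +%Y%m%d)
bkpdir=/u01/exports    # must match the EXPDIR directory object (placeholder)
expdp oraasfs/oraasfs2301 directory=EXPDIR dumpfile=SBSR_EXP_${dt}.dmp logfile=SBSR_EXP_${dt}.log tables=TM_SFS_CUST_01
find "$bkpdir" -name 'SBSR_EXP_*' -mtime +14 -type f -delete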
I want to take an export of the table MESSAGE and filter it to the day of 17 JUL 2013 (just to limit the size). I used the following expdp command, but it's not working.
expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile=FA0001P_BG_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
But with a SELECT query I am able to retrieve the rows for that specific date.
select * from MESSAGE where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
Here is the command with the syntax error:
[oracle@orcl log]$ expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile= DB_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
-bash: syntax error near unexpected token `('
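The shell is choking on the parentheses and spaces because the QUERY value is not quoted on the command line. A sketch of the same job as a parameter file (the name msg_exp.par is a placeholder), run with expdp SYSTEM PARFILE=msg_exp.par:

DIRECTORY=DATA_PUMP_DIR
DUMPFILE=DB_16_08_2013.dmp
LOGFILE=DB_16_08_2013.log
TABLES=schema.MESSAGE
QUERY=schema.MESSAGE:"WHERE created_on BETWEEN to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') AND to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')"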
I have two servers, 'A' and 'B'. On server A there is a schema named "test" with a table "t1". I want to import this t1 table to server B.
Is it possible to export a dump to a remote host using expdp?
I found that there is an option for this called "network_link". To test it, I created a dblink from server "B" to "A" named "vxmldb".
When I use the command below on server B, I get the following error.
C:>expdp directory=data_pump_dir logfile=test.log network_link=vxmldb schemas=test dumpfile=test.dmp
Export: Release 11.1.0.6.0 - 64bit Production on Tuesday, 05 June, 2012 14:22:07
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Username: system/vxmldb@vxmldb
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39170: Schema expression 'TEST' does not correspond to any schemas.
In the above command:
directory ---> location on server "B"
network_link ---> dblink name created on server "B" to access server "A"
schemas ---> schema name to be exported; it exists in the server "A" DB
username/password ---> a higher-level username/password for server "A"
@connectString ---> connects to server "A"
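One simple check, sketched with the link name from the post: confirm over the same database link that the schema is actually visible to the user the link connects as, since ORA-39170 means the schema expression matched nothing on the source side.

SELECT username FROM all_users@vxmldb WHERE username = 'TEST';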
We are DB users (not DBAs) and have always used exp/imp before application upgrades.
I was googling around and read something like "Oracle Data Pump - Time to let go of Exp / Imp". It seems exp/imp is obsolete.
Our system doesn't have the "expdp" command:
> find . -name expdp
>
Is this because of a too-old SQL*Plus?
> sqlplus
SQL*Plus: Release 8.1.7.0.0 - Production on Tue May 29 16:05:28 2012
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Enter user-name: ^C^C
- Does our DBA need to give us privileges to run expdp/impdp?
- Is it true that an expdp/impdp dump ends up on the Oracle server (not on the client machine)?
I have one prod and one devl database, with Streams set up on prod.
I have to refresh devl from prod, but if I go with a full expdp the db_links also get imported into devl and may cause problems there.
Is there any way with expdp/impdp to exclude the Streams objects while doing the import?
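A hedged sketch of a full-export parfile that leaves the database links out of the dump and, assuming the Streams configuration lives in a dedicated administrator schema (STRMADMIN is only a placeholder), skips that schema too; directory and file names are placeholders as well.

FULL=Y
DIRECTORY=DP_DIR
DUMPFILE=prod_full.dmp
LOGFILE=prod_full.log
EXCLUDE=DB_LINK
EXCLUDE=SCHEMA:"='STRMADMIN'"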
For some days now I have been getting this error during Data Pump export:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31626: job does not exist
ORA-31687: error creating worker process with worker id 1
ORA-31687: error creating worker process with worker id 1
ORA-31688: Worker process failed during startup.
This error is random; if I retry after a few minutes, expdp works correctly.
How can we overwrite an existing dump file with expdp in Oracle 10g? Every time we execute expdp and the dmp file already exists, we get the error below.
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "C:\scott_emp.dmp"
ORA-27038: created file already exists
OSD-04010: <create> option specified, file already exists
In 11g we have the feature reuse_dumpfiles=y, which doesn't work in 10g. Is there something that can overwrite an existing dump file in 10g?
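In 10g the usual workaround is simply to remove or rename the old file at the operating-system level before the export; a sketch with placeholder paths and names (on Windows the equivalent would be del on the file named in the error):

rm -f /u01/dumps/scott_emp.dmp
expdp scott/tiger DIRECTORY=DP_DIR DUMPFILE=scott_emp.dmp LOGFILE=scott_emp.log TABLES=emp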
I am using expdp/impdp to migrate a 4 TB database from Solaris to Linux, but the import process is taking forever.
I'm taking an export dump using expdp of some schemas with a total size of 300 GB. This is the par file:
DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema01.dmp,expdp_schema02.dmp,expdp_schema03.dmp,expdp_schema04.dmp,expdp_schema05.dmp,expdp_schema06.dmp,expdp_schema07.dmp,expdp_schema08.dmp,expdp_schema09.dmp,expdp_schema10.dmp,expdp_schema11.dmp,expdp_schema12.dmp,expdp_schema13.d
[code]....
Here the biggest schema is 250 GB and the total size of all the schemas is 300 GB. The filesystem where I am taking the dump has 350 GB of space, but even then expdp failed, saying:
ORA-39095: Dump file space has been exhausted: Unable to allocate 8192 bytes
Why did it fail, and how do I restart it and make sure it runs successfully without error?
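If the job stopped because the fixed list of 13 file names was used up, rather than because the disk itself filled, one sketch of a fix is to let Data Pump create pieces as needed with the %U substitution variable, and to resume rather than rerun the failed job by attaching to it (the job name appears in the export log; names below are placeholders).

DIRECTORY=expdp
FILESIZE=32212254720
DUMPFILE=expdp_schema_%U.dmp
LOGFILE=expdp_schema.log

expdp system ATTACH=SYS_EXPORT_SCHEMA_01
Export> ADD_FILE=expdp_extra_%U.dmp
Export> CONTINUE_CLIENT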
I have an export script which performs a normal export operation on the full database. Now I want to convert the same script to expdp (Data Pump). What changes do I need to take care of? As far as I know, I have to perform the following tasks:
1. Create a directory.
2. Change the command from exp to expdp, with the directory name specified (see the sketch after this list).
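A minimal sketch of those two steps, with placeholder directory path and file names:

CREATE DIRECTORY dp_dir AS '/u01/exports';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;

expdp system FULL=Y DIRECTORY=dp_dir DUMPFILE=full_%U.dmp LOGFILE=full_exp.log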
I have granted the CONNECT and RESOURCE privileges to a user 'TestU' and have created several synonyms in the schema using the public database link. What worries me is that the user 'TestU' can view the ALL_ and USER_ dictionary views, which gives him access to all public database links and all user-specific object definitions.
Is there a way to restrict the user to seeing only the synonyms and prevent him from looking at their definitions?
I am trying to remap the schema from a dump file to a new user, but my import fails and gives me an error.
-----------------------------------
Using username "oracle".
Last login: Tue Jan 17 17:41:30 2012 from 192.168.100.11
[oracle@cvs ~]$ dbstart
Processing Database instance "cvsdbm": log file /home/oracle/oracle/product/10.2.0/db_1/startup.log
[code]....
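Since the actual error is not shown above, only a generic sketch with placeholder names: a remap import where the directory object must already exist and, if the import itself does not create the target user, the user should be created beforehand.

impdp system DIRECTORY=DP_DIR DUMPFILE=source.dmp LOGFILE=imp_remap.log REMAP_SCHEMA=olduser:newuser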
There are plenty of synonyms, say 80, and a few tables, say 3, under the JACK schema. There are no records in any of the 3 tables, but the size of each table shows as 1 MB and the schema (JACK) size is 15 MB. Could it be the synonym size?
Note:
All these synonyms refer to objects in other schemas in the same database.
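Synonyms are dictionary-only objects with no segments, so a sketch of where the space actually sits is a query on the segments under the schema (owner taken from the post); empty tables typically still show their initial extent allocation.

SELECT segment_name, segment_type, ROUND(bytes/1024/1024,2) AS mb
FROM   dba_segments
WHERE  owner = 'JACK'
ORDER  BY bytes DESC;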
Suppose two users, A and B (with the CONNECT and RESOURCE roles), who work on the same project. How can B create synonyms for all of A's tables without asking the admin to do it? Of course, we assume that these users have the CREATE SYNONYM privilege. My problem is that B does not have access to A's USER_TABLES, so he cannot obtain the list of A's tables to create the synonyms.
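A sketch B could run himself: ALL_TABLES lists the tables of A that B already has privileges on, so it can be used to generate the CREATE SYNONYM statements (it will only show tables A has granted to B).

SELECT 'CREATE SYNONYM ' || table_name || ' FOR A.' || table_name || ';'
FROM   all_tables
WHERE  owner = 'A';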
How can I give the grants and privileges that are assigned to one schema (clerk) to a new schema (manager)?
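A sketch for the object privileges (role and system privileges would need similar queries against DBA_ROLE_PRIVS and DBA_SYS_PRIVS); it generates GRANT statements to run afterwards, with the schema names taken from the post.

SELECT 'GRANT ' || privilege || ' ON ' || owner || '.' || table_name || ' TO MANAGER;'
FROM   dba_tab_privs
WHERE  grantee = 'CLERK';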
How do I find the versions of the exp and imp utilities of the database server from the Windows command prompt?
Note: Currently I have the 10.2.0.10 Oracle software installed on my local machine.
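For what it's worth, both utilities print their release number in the banner, so running them with HELP=Y from the command prompt is enough to see the version:

exp help=y
imp help=y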