Export/Import/SQL Loader :: Can Grants Assigned To Role X Be Transferred To Role Y Via Datapump Import?
Oct 18, 2013
I have a user named 'Rob', and this user has been assigned the role 'MY_SRC_ROLE'. I developed a table under the rob schema and granted access to this table via the role: GRANT DELETE, INSERT, SELECT, UPDATE ON rob.emp TO MY_SRC_ROLE; I have 100 more users, and they have been granted this role 'MY_SRC_ROLE'. These 100 users can now access the emp table via role 'MY_SRC_ROLE' without any issues. Then I took a Data Pump export and performed a Data Pump import on the target server, which is also HP-UX with 11.2.0.3.
On the target server I have a user 'JACK' and a role called 'MY_WORK_ROLE'; 5000 users have been granted 'MY_WORK_ROLE' on this server. I used the remap tablespace and remap schema clauses in the Data Pump import script. Once I performed the import, due to the schema remap I can see that JACK now owns the table 'emp'; however, the grants are still not there. I searched Google and the Oracle documentation for a way to remap ROLE GRANTS as well while doing a Data Pump import, but I couldn't find supporting syntax. Can I assume Data Pump import is not capable of handling this particular scenario? I was able to do it by manipulating the SQLFILE and replacing the role name in it, but I am looking for a solution within Data Pump itself. How can grants assigned to role 'X' be transferred to role 'Y' via Data Pump import?
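impdp has no transform that remaps role names inside grants, so this scenario is not covered by Data Pump itself; the SQLFILE route described above is the supported way to get at the DDL. A hedged sketch that at least keeps the editing scriptable (directory, dump file, and script names are placeholders):

impdp system DIRECTORY=DP_DIR DUMPFILE=rob_schema.dmp INCLUDE=OBJECT_GRANT SQLFILE=grants.sql
# then swap the role name on the shell and run the result in SQL*Plus:
sed 's/MY_SRC_ROLE/MY_WORK_ROLE/g' grants.sql > grants_fixed.sql

SQLFILE only writes the GRANT statements into grants.sql in the directory object; nothing is executed until grants_fixed.sql is run manually.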
What is the exact use of the "str '<EORD>'" clause in a control file when populating data from a .dat file into a database table?
When I try this option, I am able to load only a single record into the database table. If I remove this clause from the control file, all records from the file get populated into the table. I am using a Linux server for importing data with SQL*Loader. Also, is there any difference in using this clause in a control file on a Linux server versus an AIX server? Here is the structure of my ctl file:
load data
infile 'data/EZMAIL/CDSWEB.EZMAIL.dat' "str '<EORD> '"
into table CDSWEB.EZMAIL
fields terminated by '#<EOFD>#'
trailing nullcols
(
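For context on str: it defines the logical record terminator, so SQL*Loader reads until it finds exactly that byte sequence and treats everything before it as one record. If the string does not match what is really in the .dat file byte for byte (note the trailing space inside '<EORD> ' above, and the different line endings that Linux/AIX/Windows transfers can introduce), the whole file can be parsed as a single record, which would explain loading only one row; any Linux-versus-AIX difference would then come from line endings in the file, not from SQL*Loader itself. A hedged sketch assuming each record actually ends with <EORD> followed by a newline:

load data
infile 'data/EZMAIL/CDSWEB.EZMAIL.dat' "str '<EORD>\n'"
into table CDSWEB.EZMAIL
fields terminated by '#<EOFD>#'
trailing nullcols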
Data Pump on Windows 2003 / R11.2. I have a batch file that creates a daily dump of a schema in the DATA_PUMP_DIR - however, it doesn't! The script is as follows:

REM Script to perform data pump export of the user01 schema and move to the
For some reason, when this is run from a remote server as a batch, it fails to create a file, although the output from the scripts has no errors apart from the move statements, and the expdp output is all good (it states that the file was created in the expected location). If the expdp command is run on the server itself, it is all good.
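One hedged explanation: expdp always creates the dump on the database server, in the path behind the directory object, no matter where the client runs. When the batch runs on a remote machine, a local MOVE finds no file even though the expdp log truthfully reports success. A sketch of the idea (share and path names are assumptions):

expdp user01/pwd@proddb schemas=user01 directory=DATA_PUMP_DIR dumpfile=user01.dmp logfile=user01.log
REM the dump is on the DB server, so the move must address the server's filesystem:
move \\dbserver\datapump\user01.dmp D:\backups\user01.dmp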
The database (11gR2) is located on a Linux server. A business application is installed on a Windows server with an Oracle client 11g. The application is able to start a Data Pump export, but as a matter of fact the dump files are always written to the Linux server. The directory object is defined as DATA_PUMP_DIR (which is the default directory).
Now we are supposed to change the Data Pump export in such a way that the dump files get written to the Windows server. Creating a new directory (e.g. c:\datapump) and then starting the Data Pump from the client always raises the errors:
ORA-39002: ...
ORA-39070: ...
ORA-29283: ...
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: ...
Is it possible at all to start a Data Pump export from a Windows client and write the dump files to the Windows server itself? Or are the dump files always written to the database server?
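To the last question: yes, the dump files are always written on the database server, because the Data Pump worker processes run inside the database; a directory like c:\datapump is resolved on the Linux host, where it does not exist, hence the ORA-29283 stack. A hedged workaround is to mount a share exported by the Windows server on the Linux host and point a directory object at it (mount point and names are assumptions):

-- on the Linux DB server, after mounting the Windows share at /mnt/winshare:
CREATE OR REPLACE DIRECTORY win_dp_dir AS '/mnt/winshare';
GRANT READ, WRITE ON DIRECTORY win_dp_dir TO appuser;

The only true client-side alternative is the legacy exp utility, which writes its file on the machine where it runs.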
Why do export/import require a temporary tablespace? Since export/import behave like DML, when would the temporary tablespace actually be needed by the Data Pump utility?
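A rough answer worth verifying: the rows themselves stream through buffers, but index builds during import sort their keys, and large sorts spill from PGA into temporary segments; that is the typical moment a Data Pump job touches TEMP. A hedged way to watch it while a job runs:

SELECT username, tablespace, segtype, blocks FROM v$tempseg_usage;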
But the import query is still running, without even showing any count of rows imported.
I already created the tablespace the table was in previously, before dropping it, but when I check the space of the tablespace, it is not being consumed either. One error I got previously while performing this task is:
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "CDR"."SYS_IMPORT_TABLE_03" successfully loaded/unloaded
Starting "CDR"."SYS_IMPORT_TABLE_03": cdr/********@tsiindia directory=TEST_DIR dumpfile=CAT_IN_DATA_042012.DMP tables=CAT_IN_DATA_042012 logfile=impdpCAT_IN_DATA_042012.log
[code]....
I checked streams_pool_size; it showed zero, so I set it to 48M, and after that:
SQL> show parameter streams_pool_size;

NAME              TYPE        VALUE
----------------- ----------- -----
streams_pool_size big integer 48M
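For reference, a minimal sketch of resizing the pool when an spfile is in use (Data Pump's queueing draws from the streams pool, so a fixed zero-size pool with no automatic SGA management can leave a job hanging at startup):

ALTER SYSTEM SET streams_pool_size = 48M SCOPE = BOTH;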
BANNER
----------------------------------------------------------------
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production

OS version:
Windows 7 64bit

I have a schema (scott) exported at schema level and imported under a different name (scott1). At regular intervals I need to import scott into scott1 without affecting existing records, i.e.:
1. Need to append newly created records.
2. Need to append updated records.
For the above requirement I did it in the following way:

expdp xxxx/******** schemas=SCOTT directory=dumpdir dumpfile=SCOTT_28-SEP-2012.dmp logfile=exp_SCOTT_28-SEP-2012.log

and imported in the following way:

impdp xxxx/******** AS SYSDBA REMAP_SCHEMA=SCOTT:SCOTT1 directory=DUMPDIR dumpfile=SCOTT_28-SEP-2012.dmp logfile=imp_SCOTT2_28-09-2012.log TRANSFORM=SEGMENT_ATTRIBUTES:n TABLE_EXISTS_ACTION=APPEND
The problem is I couldn't append the records to the existing tables; the log shows errors such as:
ORA-31684: Object type USER:"SCOTT1" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
[code].....
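ORA-31684 is only a warning that the object already existed; the rows can still be appended. For a pure data refresh, a hedged sketch is to skip metadata entirely (note that APPEND inserts every exported row again, so rows that already exist get duplicated; Data Pump has no merge/upsert for "updated records"):

impdp xxxx/******** REMAP_SCHEMA=SCOTT:SCOTT1 DIRECTORY=DUMPDIR DUMPFILE=SCOTT_28-SEP-2012.dmp CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND LOGFILE=imp_scott1_data.log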
Import: Release 11.2.0.3.0 - Production on Tue Apr 23 13:10:51 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "IMPDB"."SYS_IMPORT_TABLE_01": abc/******** directory=DATA_PUMP_DIR network_link=TESTAR logfile=net_import_proddev.log TABLES=impdb.abc parallel=12 REMAP_SCHEMA=IMPDB:ABC
AIX 6.1, 11.2.0.3. I have an expdp dump from prod to be imported into our test database. I imported it using impdp, but to my surprise the tables were imported while lots of indexes were not created, even though I used TRANSFORM=SEGMENT_ATTRIBUTES:N just to use the default USERS tablespace. How do I import the indexes separately, skipping the tables and other objects?
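Two hedged ways to make an indexes-only pass over the same dump (file and directory names are placeholders): run the import restricted to index objects, or have impdp write the DDL to a script for review first:

impdp system DIRECTORY=DP_DIR DUMPFILE=prod.dmp INCLUDE=INDEX LOGFILE=imp_indexes.log
impdp system DIRECTORY=DP_DIR DUMPFILE=prod.dmp INCLUDE=INDEX SQLFILE=create_indexes.sql

The second form executes nothing; it only writes the CREATE INDEX statements into create_indexes.sql in the directory object, where they can be checked and run manually.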
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the amount of data is quite large), exporting several tables. The table export goes fine, but I encounter the following error when I import the file (I tried data only as well as data + DDL).
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...= ORA-39001: argument value invalid ORA-39000: .... ORA-31619: ...
The file is in the right place, the data pump folder of the new database. The user is the same on both databases, and the database versions are similar.
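ORA-31619 means the file could not be read as a valid dump file, which often points at the transfer rather than the import. A hedged isolation step is to retry outside SQL Developer from the command line (names are placeholders), and to confirm the file was copied in binary mode (an FTP ASCII transfer is the classic way to corrupt a dump):

impdp user/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=tables.dmp LOGFILE=imp_test.log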
I am trying to migrate a table to a new table that has the field order changed and also has a new field added. My main question is whether it is possible to have Data Pump add values for the new field in the target table. For example:
- original table has fields a, b, d, c
- new table has fields b, c, d, a, e
I want to load the new table and also include adding values for field e. In this case, field e is a year field, so it should be loaded with '2012'. Does Data Pump have the ability to do this? Is reordering the fields going to cause me any problems? We are on Oracle version 11.2.0.3.
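Data Pump inserts by column name, so the different column order is harmless, but it cannot invent values for a column that was not exported; field e has to be filled after the load. A hedged sketch, using the placeholder names old_tab/new_tab for the two tables and assuming new_tab was pre-created:

impdp scott DIRECTORY=DP_DIR DUMPFILE=old_tab.dmp REMAP_TABLE=old_tab:new_tab CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND

-- then populate the new column:
UPDATE new_tab SET e = '2012';
COMMIT;

Defining e with DEFAULT '2012' before the import may also work, depending on the load path.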
When I do the import of each succeeding dump, I drop the existing schema ("SQL> drop user username cascade;") and import the dump with "impdp system ...". I would now like to import a dump into an existing instance so that only the data is imported, leaving the current packages and other metadata untouched and unchanged. Specifically (a sketch follows the list below):
1. Do I need to drop the user before the import, given the requirements above?
2. If I need to drop the user, what should the script be?
3. For the import itself, what parameters should I use?
4. What do I need to consider before doing the import?
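If only the rows should be replaced while packages and other metadata stay untouched, there is no need to drop the user at all. A minimal sketch under these requirements (schema, directory, and file names are placeholders):

impdp system DIRECTORY=DP_DIR DUMPFILE=schema.dmp SCHEMAS=appuser CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE LOGFILE=imp_data_only.log

CONTENT=DATA_ONLY skips all metadata, and TABLE_EXISTS_ACTION=TRUNCATE replaces the existing rows (use APPEND to keep them). The main things to check beforehand: enabled foreign keys can block the truncate, and triggers may fire during the load.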
Imagine you have 100 schemas backed up (expdp) in a dump file and you want to import just one schema from that dump file into a DB. You can specify just the one schema you want using the SCHEMAS parameter of impdp. But things are not as straightforward when you want to use REMAP_SCHEMA.
Here is my scenario: ===================
I took the expdp dump of schemas A and B in one go, so the dump file has objects from both A and B. The dump file name is schemas_AandB.dmp. Now I want to create schema C from A using the REMAP_SCHEMA parameter.
-- Putting each parameter in a separate line for readability
impdp PSTREF/PSTREF_123
DIRECTORY=ADET_EFX_DIR
DUMPFILE=schemas_AandB.dmp
LOGFILE=CreatingCfromA-Impdp.log
REMAP_SCHEMA=A:C

Everything goes fine. Schema C is created from schema A in the dump file.
But impdp tries to create schema B as well, because schema B was present in the dump file. Since schema B and its objects are already in the DB, I get the following errors:
ORA-31684: Object type USER:"B" already exists ORA-31684: Object type PROCEDURE:"B"."SP_CLEAREXPIREDSESSIONDATA" already exists ORA-31684: Object type PROCEDURE:"B"."SP_DELETESESSIONDATA" already exists ORA-31684: Object type PROCEDURE:"B"."SP_DELETESTATECONTEXTINFO" already exists
[code]...
Trying to stop schema B in the dump file from being imported, I also specified SCHEMAS. But I got the following error:

ORA-39065: unexpected master process exception in MAIN
ORA-12801: error signaled in parallel query server PZ99, instance oracth214:HEWRAC1 (1)
ORA-01460: unimplemented or unreasonable conversion requested

Maybe the REMAP_SCHEMA and SCHEMAS parameters won't work together.
Is there any way to prevent impdp from importing user B and its objects?
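For what it's worth, SCHEMAS and REMAP_SCHEMA are documented to work together, so the ORA-39065 above looks more like a bug or environment issue (note the RAC parallel-query reference) than a syntax problem. The combination, using the names above, would be:

impdp PSTREF/PSTREF_123 DIRECTORY=ADET_EFX_DIR DUMPFILE=schemas_AandB.dmp SCHEMAS=A REMAP_SCHEMA=A:C LOGFILE=CreatingCfromA-Impdp.log

If that keeps failing, hedged alternatives are EXCLUDE=SCHEMA:"='B'" or retrying with PARALLEL=1 to take the parallel servers out of the picture.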
I have taken an expdp dump from an 11g database running in a development environment. Now I want to import this dump into a 10g database running in a QA environment.
While taking the export from the 11g database I used this script, and the backup was successful:
expdp system TABLES=sss_exp_test.EXP_SB_HEADER_TMP VERSION=10.2 DIRECTORY=RMSDEV_IMP_DIR DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_expdp.log
When I try to import this dump into 10g, I get an error.
impdp system TABLES=sss_exp.EXP_SB_HEADER_TMP DIRECTORY=RMS_DATA_PUMP DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_impdp.log
Import: Release 10.2.0.5.0 - 64bit Production on Sunday, 14 October, 2012 19:58:53
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Password:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39040: Schema expression "SCHEMAS" must identify exactly one schema.
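A hedged reading of the ORA-39040: the dump contains schema sss_exp_test, but the import asks for sss_exp.EXP_SB_HEADER_TMP, so impdp cannot match the schema expression against the dump. Keeping the original schema in TABLES and remapping it might look like this:

impdp system TABLES=sss_exp_test.EXP_SB_HEADER_TMP REMAP_SCHEMA=sss_exp_test:sss_exp DIRECTORY=RMS_DATA_PUMP DUMPFILE=EXP_SB_HEADER_TMP.dmp LOGFILE=EXP_SB_HEADER_TMP_impdp.log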
When we do impdp with remap_table=a:b, does it also create/import the indexes on the new table b? If indexes do get imported/created on table b, what will their names be? (Different from the names of table a's indexes?)
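REMAP_TABLE renames only the table itself; dependent indexes are expected to be created with their original names (which can collide if the original table's indexes still exist in the same schema). A quick hedged check after a test import:

SELECT index_name, table_name FROM user_indexes WHERE table_name = 'B';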
Export and import of data in Oracle Forms... I have created 02 buttons, one for export; its trigger is like this:
declare
  alrt number;
  v_directory varchar2(200) := 'c:\backup'; -- change this if Windows is not installed on the C: drive
  path varchar2(100) := 'back_up' || to_char(sysdate, 'dd_mm_yyyy-hh24_mi_ss');
  v_exp varchar2(200) := 'exp hamada/hamada2013@orcl file=' || v_directory || '\' || path || '.dmp';
[code]....
This code is correct; it exports not only the data but also the creation of the table. For example, I do the export and everything is good; I find the .dmp in the backup folder. But when I delete all data from my app and try to import this .dmp, it shows an error telling me that the table PHONE is already created. Can I export just the data of PHONE without the table creation? Or how can I import just the data from this .dmp?
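With the classic exp/imp pair there is no way to leave the table DDL out of the dump itself, but imp can ignore the "table already exists" error and load only the rows. A minimal hedged sketch along the lines of the export command above (the timestamped file name is a placeholder):

imp hamada/hamada2013@orcl file=c:\backup\back_up<timestamp>.dmp ignore=y

ignore=y makes imp skip object-creation errors and insert the data into the existing tables.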
But what I noticed is that user HIEL has the CONNECT role but YONC does not (the result set is below). I have searched for information but did not find what I was looking for. I did read something about backward compatibility. I'm leaning towards that thought, since the other user YONC does not have this role.
Database: 11g R1
SQL> select * from dba_role_privs where grantee in ('YONC', 'HIEL') order by grantee;

GRANTEE   GRANTED_ROLE      ADM DEF
--------- ----------------- --- ---
YONC      WCAIMS_INQUIRY    NO  YES
YONC      ENDUSER           NO  YES
HIEL      WCAIMS_CSR_ADDR   NO  YES
How do I map the student_newId field to the student_id column in the STUDENT DB table, so that the new id gets inserted into the student_id column? How do I specify the mapping in the control file? I don't want to create a new column in the student table. In the control file I will specify the below. Is this the best approach? Do we have any other way?
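In a SQL*Loader control file, the names in the field list are database column names, so the field that carries student_newId in the data file is simply listed in its position and named student_id; no separate mapping clause is needed. A minimal hedged sketch (the data file name, delimiter, and second column are assumptions):

LOAD DATA
INFILE 'students.dat'
APPEND
INTO TABLE student
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
  student_id,   -- the feed's student_newId value lands in the student_id column
  student_name
)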
I am using the expdp command to export a table with the QUERY parameter specified, but I am unable to export the table based on the condition.
Ex:
EXPDP username/password dumpfile=employee.dmp logfile=emp.log directory=DATADIR_EXP TABLES=EMPLOYEE query=EMPLOYEE:"UPDATED_TIME >= '04-JUN-13' AND UPDATED_TIME >= '05-AUG-13'"

Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 3 GB
ORA-31693: Table data object "<username>"."EMPLOYEE" failed to load/unload and is being skipped due to error:
ORA-00933: SQL command not properly ended
Master table "<username>"."EMPLOYEE" successfully loaded/unloaded
...
Dump file set for <username>.SYS_EXPORT_TABLE_01 is: E:\IMPDP\employee.dmp
Job "<username>"."SYS_EXPORT_TABLE_01" completed with 1 error(s) at 12:34:45

Oracle 11g.
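The ORA-00933 is most likely because the QUERY value must be a full WHERE clause, including the WHERE keyword, and because the quotes tend to be eaten by the shell. A hedged sketch using a parameter file, which sidesteps the quoting problem (the range below is an assumption; the original has >= twice, which would select everything after 04-JUN-13):

Contents of emp_exp.par (hypothetical file name):
dumpfile=employee.dmp
logfile=emp.log
directory=DATADIR_EXP
tables=EMPLOYEE
query=EMPLOYEE:"WHERE UPDATED_TIME >= TO_DATE('04-JUN-13','DD-MON-RR') AND UPDATED_TIME <= TO_DATE('05-AUG-13','DD-MON-RR')"

Then run: expdp username/password parfile=emp_exp.par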
I am trying to export selective data from one of my prod database tables, but I am not succeeding. I have kept trying for the past 2 hours.
OS: Solaris SPARC
Oracle: 10g
Query: WHERE E3RECV_DT LIKE '201305%' (I need to export the data matching this query)
Below is the script I am using:
===============
exp E3USER@SGEBAPU2 statistics=none consistent=n buffer=100000000 file=exp_pipe_file TABLES=IFDATA query="WHERE E3RECV_DT LIKE '201305\%'" log=PGTB_IFDATA_conditional.log
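On Unix shells the double quotes and the % in the QUERY string are easily mangled before exp sees them; a parameter file avoids the escaping entirely. A hedged sketch (the parfile name is an assumption; inside a parfile the % wildcard needs no backslash):

Contents of exp_ifdata.par:
tables=IFDATA
query="WHERE E3RECV_DT LIKE '201305%'"
statistics=none
consistent=n
buffer=100000000
file=exp_pipe_file
log=PGTB_IFDATA_conditional.log

Then run: exp E3USER@SGEBAPU2 parfile=exp_ifdata.par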