Server Utilities :: Syntax For Exporting Only Procedures Of Particular User
Mar 7, 2011

What is the syntax for exporting only the procedures of a particular user?
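With Data Pump the export can be filtered by object type; a minimal sketch, assuming a directory object DATA_PUMP_DIR and a schema named SCOTT:

expdp system SCHEMAS=scott INCLUDE=PROCEDURE DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott_procs.dmp

The original exp utility has no object-type filter, so DBMS_METADATA.GET_DDL is the usual fallback there.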
I created an externally authenticated user in the database, and I can log in without a password using the syntax below:
SQL> connect / @TESTDB
Connected.
SQL> show user;
USER is "SCOTT"
This SCOTT user has proxy permission to connect as another DB user, PROXY_USER. Previously I logged in using the syntax below:
connect scott[proxy_user]/password_for_scott@TESTDB
So now, what syntax should be used for this externally authenticated user to log in as a proxy user?
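For an externally authenticated user the username and password portions are simply left empty in the proxy syntax; an untested sketch of what should work:

connect [proxy_user]/@TESTDB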
I am trying to export my database, and whenever I try to log in it gives ORA-01017: invalid username/password.
If I log in as system/manager it is accepted, but I am not able to export all of my database.
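A full database export needs the EXP_FULL_DATABASE privilege on the connecting account; a minimal sketch of a full export using SYSTEM, assuming its default password:

exp system/manager full=y file=full.dmp log=full.log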
I would like to know the process for exporting only the table structure of a database, without its actual content.
Note: I don't know how many tables are present in the DB.
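Suppressing row data gives structure only, and full mode covers every table without needing to know how many exist. Minimal sketches with exp and with Data Pump (directory object assumed):

exp system/manager full=y rows=n file=structure.dmp log=structure.log
expdp system full=y content=metadata_only directory=DATA_PUMP_DIR dumpfile=structure.dmp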
I am using a schema for which I need to take a metadata-only backup, using the exp utility:
exp shan/shan@shan file=/backup_dump/shan.dmp log=/backup_dump/shan.log owner=shan rows=n
but it returns the error below. I only have access to the user SHAN; our client cannot allow me to use the SYSTEM or SYSDBA accounts or grant any other required privileges. So is there any way to take a metadata backup of user SHAN from user SHAN itself?
EXP-00008: ORACLE error 942 encountered
ORA-00942: table or view does not exist
EXP-00024: Export views not installed, please notify your DBA
EXP-00000: Export terminated unsuccessfully
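EXP-00024 usually means the export dictionary views were never installed in that database, and only a privileged account can install them by running catexp.sql; the client's DBA would have to do something like the following once, after which the OWNER=shan ROWS=N export run as SHAN itself should normally be allowed:

SQL> connect / as sysdba
SQL> @?/rdbms/admin/catexp.sql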
I am exporting a table that is 3 GB in size and also partitioned, with the NOCOMPRESS option specified.
Now, when I export it with the COMPRESS=N option of the exp utility, it should take 3 GB on the target server. But will exporting it with COMPRESS=Y save some storage during import, or does the NOCOMPRESS option specified on the partition mean COMPRESS=Y has no effect, so that it will take 3 GB in both cases?
Is it true that whether you specify COMPRESS=N or COMPRESS=Y during export does not matter, and the size will always be 3 GB after import?
I would like to export few tables from the physical standby which is in read only mode.
I have tried both the exp and expdp methods. I could successfully export and import the tables from the physical standby using exp; unfortunately, expdp does not allow this from a read-only database.
Does this mean that we still have to use the exp feature instead of expdp ?
Note: I would expect a proper response from experts and no unwanted comments like "Contact Oracle support", "Paste the entire command here", "Read the manuals", or "Why am I exporting from the standby and not from the primary".
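For what it's worth, one commonly cited workaround is to run expdp against a read-write database and pull from the standby over a database link, since Data Pump must create its master table in the database it runs in, which a read-only standby cannot do. A hedged sketch, assuming a link named STDBY_LINK pointing at the standby:

expdp system directory=DATA_PUMP_DIR network_link=STDBY_LINK tables=scott.emp dumpfile=emp_stdby.dmp logfile=emp_stdby.log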
I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with fixed record length. There will be about 6 files, for a total of about 10 GB.
How can I export those tables in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
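If the views can be read with plain SQL, a SQL*Plus spool with RPAD padding is the usual low-tech route to fixed-length records; a minimal sketch with hypothetical view and column names:

set pagesize 0 linesize 200 feedback off heading off trimspool on
spool /tmp/extract1.txt
select rpad(nvl(col1, ' '), 20) || rpad(nvl(col2, ' '), 30) || rpad(nvl(col3, ' '), 50) from view1;
spool off

For 10 GB, running one such session per file in parallel is usually faster than a single spool; UTL_FILE also works but writes on the server side.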
I am taking an export using the CONSISTENT parameter. Theoretically I understand it; practically I couldn't see how it works.
For example: I am updating the TAB1 table under the SAMS user; the table has one lakh (100,000) records. While the update is running, I export with CONSISTENT=Y and with CONSISTENT=N:
exp sams/sams file=cons.dmp owner=sams consistent=y
exp sams/sams file=cons2.dmp owner=sams consistent=n
Then I imported both files into separate users (SAM and SAN). The updated data is not visible in either the SAM or the SAN user.
I want to know how it works in practice; I need a clear example of CONSISTENT=Y versus CONSISTENT=N.
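The difference only shows when other sessions commit changes while the export is running; a hedged sketch of a test that would make it visible:

-- session 1: commit a change while the exports below are in progress
update tab1 set col1 = col1 + 1;
commit;

-- session 2: take both exports during that activity
exp sams/sams file=cons.dmp owner=sams consistent=y
exp sams/sams file=cons2.dmp owner=sams consistent=n

With CONSISTENT=Y the whole export is one read-consistent snapshot as of its start, so the committed change is excluded from every table; with CONSISTENT=N each table is only consistent with itself, so a table read after the commit would include the change. An uncommitted update is never visible to exp either way, which is why the test described in the question shows no difference unless the change is committed mid-export.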
I am trying to export a partition of a table and import it to another database. I get the below error when I try to import.
ORA-14400: inserted partition key does not map to any partition
If I export the table (for that particular partition) and import it (after dropping the table) at the destination, the partitions and subpartitions are created without any problem.
The table is range partitioned and subpartitioned by list, so I had to perform the steps below to retain the other data in the destination table:
1. Drop the existing partition
2. Create the partition and sub partition, same as source
3. Execute imp
In fact, I had to perform step 2 because if I split the partition instead, the subpartitions get replicated into the new partition, which again throws the same error. Is there a better way of managing the partitions and subpartitions at the destination with exp/imp, so that I need not perform steps 1 and 2 manually?
Export /Import
==============
While exporting schemas, I couldn't get the dump files written to the exact location I specified. See the following query:
QUERY
=====
exp file=backupfile1.dmp,backupfile2.dmp,backupfile3.dmp owner=(order,purchase) filesize=5m

At the OS level, I found those dump files in the home directory:
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp
[oracle@localhost ~]$ pwd
/home/oracle
When I list the directory:
-rw-r--r-- 1 oracle oinstall 72 Jun 20 21:17 afiedt.buf
drwxr-xr-x 3 oracle oinstall 4096 Jun 17 10:07 Desktop
-rw-r--r-- 1 oracle oinstall 71 Jun 19 20:42 ed.hup
drwxr-xr-x 2 oracle oinstall 4096 Aug 6 19:38 backup
-rw-r--r-- 1 oracle oinstall 2826240 Aug 6 19:39 expdat.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp
The dump files go to the home path even though I mentioned an appropriate location.
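exp treats relative file names as relative to the current working directory, so either cd into the target directory first or give absolute paths; a hedged sketch:

exp file=/home/oracle/backup/backupfile1.dmp,/home/oracle/backup/backupfile2.dmp,/home/oracle/backup/backupfile3.dmp owner=(order,purchase) filesize=5m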
I'm trying to export a relatively large database, but it's a bit more complicated than that. For one schema I need a full export/import (data included).
For another 10 schemas I need them empty, with the exception of a table in some of them which needs to be exported/imported with all its data. Is it possible to do this with the Data Pump utilities (expdp, impdp)?
Afterwards I will be running some scripts to populate the DB instance with critical data / metadata.
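This looks feasible in three Data Pump passes; a hedged sketch with hypothetical schema and table names:

expdp system schemas=app_main directory=DATA_PUMP_DIR dumpfile=app_main.dmp
expdp system schemas=s01,s02,s03 content=metadata_only directory=DATA_PUMP_DIR dumpfile=empty_schemas.dmp
expdp system tables=s01.keep_tab,s03.keep_tab content=data_only directory=DATA_PUMP_DIR dumpfile=keep_data.dmp

The first pass carries the full schema with data, the second carries the ten schemas structure-only, and the third carries just the rows of the exceptional tables.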
I have taken a database backup using the exp command, and when I try to import it on another PC the foreign keys are not imported; it gives an error message that there is no matching unique or primary key for the column.
How can I take a backup that includes the primary keys?
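exp does export primary and foreign keys by default (CONSTRAINTS=Y); this error usually means a child table was imported without the parent table that owns the referenced key. A hedged sketch that exports the whole owner so parents and children travel together:

exp scott/tiger owner=scott constraints=y file=scott.dmp log=scott.log
imp scott/tiger file=scott.dmp fromuser=scott touser=scott ignore=y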
Suddenly my exports hang at 'exporting cluster definitions'. I have been using this database for the last 4 years and it never caused a problem or hung at this stage. Here I'm pasting my screen output; it is my production DB.
[oracle1@wbh_as1 smbshare]$ exp wb/wb
Export: Release 9.2.0.1.0 - Production on Thu Dec 23 00:02:44 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Enter array fetch buffer size: 4096 >
Export file: expdat.dmp > wb
(2)U(sers), or (3)T(ables): (2)U >
Export grants (yes/no): yes >
Export table data (yes/no): yes >
Compress extents (yes/no): yes >
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user WB
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user WB
About to export WB's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
I want to take an export of the table MESSAGE and filter it for the day of 17 JUL 2013 (just to limit the size). I used the following expdp command, but it's not working:
expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile=FA0001P_BG_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
But with a SELECT query I am able to retrieve the rows for that specific date:
select * from MESSAGE where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
Here is the command with the syntax error:
[oracle@orcl log]$ expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile= DB_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
-bash: syntax error near unexpected token `('
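The shell is consuming the parentheses and quotes before expdp ever sees them; putting QUERY in a parameter file is the usual fix. A hedged sketch, keeping the placeholder schema name from the question:

$ cat message.par
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=DB_16_08_2013.dmp
LOGFILE=FA0001P_BG_16_08_2013.log
TABLES=schema.MESSAGE
QUERY=schema.MESSAGE:"WHERE created_on BETWEEN TO_DATE('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') AND TO_DATE('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')"

$ expdp SYSTEM parfile=message.par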
I ran the following control file in SQL*Loader:
LOAD DATA
INFILE "gateway.csv"
truncate
INTO TABLE GATEWAY
Fields terminated by ","
Optionally enclosed by '"'
trailing nullcols
[code]....
and I got the following error:
zcyds891:/opt/oracle> sqlldr gwcem/gwcem@pfs control=gateway.ctl log=/tmp/ldr.log bad=/tmp/bad.log
SQL*Loader: Release 9.2.0.8.0 - Production on Tue Dec 7 05:07:59 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
SQL*Loader-350: Syntax error at line 12.
Expecting "," or ")", found "INTERGER".
GATEWAYPROTOCOL INTERGER,
^
The current OS is Solaris, SunOS 5.10.
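The message itself points at the problem: INTERGER is a misspelling of INTEGER, and SQL*Loader stops there. If the field arrives as plain digits in the CSV, INTEGER EXTERNAL is probably what was intended:

GATEWAYPROTOCOL INTEGER EXTERNAL,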
I tried to import into 11g a dump that was taken in Oracle 9i. The import starts but hangs after some time; to be exact, it checks only the character sets of the DBs and then hangs. Let me know if there are any specific procedures for importing a dump from 9i into 11g directly.
I have to copy all the procedures in the database to a local folder, with the extension (file type) of each procedure being ".proc".
I tried DBMS_METADATA, but as there are 300 procedures it is time-consuming, and I want a separate file for each one:
select dbms_metadata.get_ddl('PROCEDURE', u.object_name), u.object_name
from user_objects u
where u.object_type = 'PROCEDURE'
and u.object_name in ('P1');
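A hedged sketch of one way to get a separate .proc file per procedure: list the procedure names once, then spool each one's DDL in a small shell loop (credentials and output directory are hypothetical):

#!/bin/sh
# list every procedure owned by the user, then spool each to its own file
for P in $(sqlplus -s user/pass <<EOF
set pagesize 0 feedback off
select object_name from user_objects where object_type = 'PROCEDURE';
EOF
); do
sqlplus -s user/pass > /tmp/procs/$P.proc <<EOF2
set pagesize 0 linesize 200 long 1000000 feedback off trimspool on
select dbms_metadata.get_ddl('PROCEDURE', '$P') from dual;
EOF2
done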
Procedures, triggers and other objects had their characters changed: every accented character was replaced in the content of procedures and other objects after importing a schema for production approval.
Example:
EXCEPTION
WHEN OTHERS THEN
p_msgerro := p_msgerro ||
' Erro na selec?o da Unidade Administrativa. ' ||
The character (?) should be (ã).
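This is the classic NLS_LANG mismatch: if the client's NLS_LANG does not match the data during exp or imp, accented characters get converted to replacement characters. A hedged sketch with hypothetical credentials and file names, assuming the database character set is WE8ISO8859P1 (check NLS_DATABASE_PARAMETERS), with the same setting in effect for both the export and the import:

$ export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
$ imp system/manager file=schema.dmp fromuser=app touser=app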
I want to do a schema export from database A. There are hundreds of users under this schema. I have to import this schema into another database, say B. My questions are:
1) Do I need to pre-create only the schema user, or all the users under it?
2) Will the schema export include all the roles, procedures, packages, synonyms, functions and triggers?
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether exporting to tape is possible. If so, would the data be accessible if needed later?
I would like to know whether the user creation definition is exported in a user-mode export when the export is done with the DBA role. If it is not, does that mean we always need to pre-create the user before importing a dump created by a user-mode export?
I have exported the data of one user and am importing it into another schema on another server. The import works fine for quite a number of tables, but after some time it starts giving me the error below:
IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<
IMP-00008: unrecognized statement in the export file:
<ے
IMP-00008: unrecognized statement in the export file:
+A
IMP-00008: unrecognized statement in the export file:
[code]...
I'm trying to move a complete tablespace with all associated schemas/users and data.
The command that I'm using is as follows:
impdp DP_USER/DP_USER directory=datapumps network_link=backup tablespaces=SDE
DP_USER has the IMP_FULL_DATABASE and EXP_FULL_DATABASE roles, as does the target of the database link.
The error that I get is the following:
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39083: Object type TABLE:"SDE"."VERSION" failed to create with error:
ORA-01918: user 'SDE' does not exist
The SDE tablespace has objects from the SDE schema, but the SDE schema does not seem to be created along with the tablespace. How do I cause it to be created?
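Tablespace mode moves the objects in the tablespace but not the user definitions, so the schema must exist first. Two hedged options, with a made-up password:

-- option 1: pre-create the schema on the target, then re-run the impdp
create user SDE identified by some_password;
grant connect, resource to SDE;

-- option 2: schema mode instead of tablespace mode, which does create the user
impdp DP_USER/DP_USER directory=datapumps network_link=backup schemas=SDE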
I want to create a script that clones a database user under a new name. Just as with a normal export and import, I want to enter only the username of the existing user, the username of the new user to be created, and its password.
It should create the new user with all the roles, default roles and privileges of the old user.
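Data Pump with REMAP_SCHEMA does most of this: a schema-mode export taken by a privileged account carries the CREATE USER statement, grants and roles, and impdp recreates them under the new name. A hedged sketch with hypothetical names (the new user inherits the old password, so reset it afterwards):

expdp system schemas=olduser directory=DATA_PUMP_DIR dumpfile=olduser.dmp
impdp system directory=DATA_PUMP_DIR dumpfile=olduser.dmp remap_schema=olduser:newuser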
I have a question to clarify regarding user creation during export and import.
Will the user get created, along with roles and privileges, by default when using the impdp command?
I need to export a user with all its tables, but I need to export the data of only a few tables and omit the data of the rest.
For example, I don't want to export the data of audit tables with the AU prefix.
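EXCLUDE=TABLE_DATA with a name filter should do this: the DDL of every table is exported, but rows are skipped for matching table names. A hedged sketch as a parameter file (to avoid shell quoting), with a hypothetical schema name:

SCHEMAS=appuser
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=appuser.dmp
EXCLUDE=TABLE_DATA:"LIKE 'AU%'"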
We have a full export backup early every morning. Some tables' data has unfortunately been deleted, while the structure of these tables is intact. How do I import only some tables of a user from the daily full export backup? This has to be done immediately.
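A full export can be imported selectively with FROMUSER/TOUSER and a table list; since the table structures still exist, IGNORE=Y skips the creation errors and just loads the rows. A hedged sketch with hypothetical names:

imp system/manager file=daily_full.dmp fromuser=appuser touser=appuser tables=(t1,t2) ignore=y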
I need clarification on the scenario below:
1) DROP USER MK CASCADE;
2) Created the user
3) Created objects such as procedures and indexes, and granted privileges
4) Now I am performing the import as below:
impdp system/.... SCHEMAS=MK DIRECTORY=EXPBKUP DUMPFILE=ABC_Export.dmp LOGFILE=ABC_imp.log INCLUDE=TABLE TABLE_EXISTS_ACTION=REPLACE
But nothing is imported.
Is this a problem with the parameters "INCLUDE=TABLE TABLE_EXISTS_ACTION=REPLACE", given that the user is new?
I want to know if it is possible to run the IMP program without connecting as a password-authenticated database user, for example:

imp \'/ as sysdba\' file=f.dmp fromuser=u1 touser=u2 log=flog.log

so that scripts can run without passwords embedded in them.
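For ordinary imports, an OS-authenticated (externally identified) account gives the same password-free scripting without SYSDBA; a hedged sketch, assuming such an account exists with the needed privileges:

imp / file=f.dmp fromuser=u1 touser=u2 log=flog.log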