Server Utilities :: DBMS_DATAPUMP.METADATA_FILTER And Long Table List?
Jun 9, 2010
I have a problem with DBMS_DATAPUMP.METADATA_FILTER. Let's suppose that I need to export a huge list of tables (a, b, c, d, e, f, g, h, i, ...). Because the list is dynamic, I do NOT want to use

DBMS_DATAPUMP.metadata_filter (handle      => h1,
                               NAME        => 'NAME_EXPR',
                               VALUE       => 'IN (''a'', ''b'', ...)',
                               object_type => NULL);

so I tried a subquery instead:

DBMS_DATAPUMP.metadata_filter (handle      => h1,
                               NAME        => 'NAME_EXPR',
                               VALUE       => 'IN (SELECT a.export_object_name FROM my_export_table a, user_objects b WHERE a.export_object_name = b.object_name AND b.object_type = ''TABLE'')',
                               object_type => NULL);

but it results in an error:
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
ORA-00942: table or view does not exist
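The ORA-00942 suggests the Data Pump worker cannot resolve my_export_table when it evaluates the filter expression. One hedged workaround (a sketch, not a verified fix; my_export_table comes from the post above) is to resolve the list into a literal IN string in your own session before handing it to METADATA_FILTER:

DECLARE
  h1     NUMBER;
  l_list VARCHAR2(32767);
BEGIN
  -- Build a quoted, comma-separated literal list of table names in PL/SQL,
  -- so the filter no longer relies on a subquery evaluated by the worker.
  FOR r IN (SELECT a.export_object_name AS tname
              FROM my_export_table a, user_objects b
             WHERE a.export_object_name = b.object_name
               AND b.object_type = 'TABLE') LOOP
    l_list := l_list || ', ''' || r.tname || '''';
  END LOOP;
  l_list := LTRIM(l_list, ', ');

  h1 := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');

  -- Same NAME_EXPR filter as above, but with a resolved literal list.
  DBMS_DATAPUMP.METADATA_FILTER(handle => h1,
                                name   => 'NAME_EXPR',
                                value  => 'IN (' || l_list || ')');
  -- ... ADD_FILE / START_JOB etc. as in a normal DBMS_DATAPUMP job ...
END;
/

Note the VARCHAR2(32767) limit: for an extremely long table list the filter string itself may overflow and the job would need to be split.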
I have a requirement to load .dmp files into existing staging tables, and there is a package to load the ODS tables from staging. So I thought of using the DBMS_DATAPUMP utility to import the data from the .dmp files into the tables, and this needs to be automated.
--Create directory
CREATE OR REPLACE DIRECTORY test_dir AS 'C:\Test';

--Grant access to the user
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

--Script to import
DECLARE
  l_dp_handle1 NUMBER;
BEGIN
  l_dp_handle1 := dbms_datapump.OPEN(operation => 'IMPORT',
[code]...
Errors:

ERROR at line 1:
ORA-31634: job already exists
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 938
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4590
ORA-06512: at line 4
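ORA-31634 usually means a Data Pump job (or an orphaned master table) with the same name already exists. A minimal sketch of one way around it, assuming the IMP_STG prefix is our own invention, is to pass an explicit, unique job_name to DBMS_DATAPUMP.OPEN:

l_dp_handle1 := dbms_datapump.OPEN(
                  operation => 'IMPORT',
                  job_mode  => 'FULL',
                  job_name  => 'IMP_STG_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS'));

With a timestamped name, each automated run opens a fresh job instead of colliding with the previous one.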
expdp directory=xxx.dmp dumpfile=aaa.dmp logfile=xxx.log FULL=Y

Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 24.87 MB
Processing object type SCHEMA_EXPORT/USER
[code]...
Then my export hangs. I checked the alert log and found nothing, then killed the job and reran it, but the same thing happened; when I check the status it still says EXECUTING.
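When a job appears stuck in EXECUTING, one standard place to look is the DBA_DATAPUMP_JOBS dictionary view; a killed job can also leave behind an orphaned master table named after the job. A sketch (the table name in the DROP is just an example taken from the log above, verify before dropping anything):

SELECT owner_name, job_name, operation, state, attached_sessions
  FROM dba_datapump_jobs;

-- If the job is orphaned (no attached sessions, state not RUNNING),
-- dropping its master table removes the stale job entry.
DROP TABLE system.sys_export_full_01 PURGE;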
I'm currently busy with database consolidation, so I'm searching for a solution to generate some useful DDL to prepare the new target database before importing the application's data. This should include TABLESPACE DDL and all additional users with their grants.
So first I thought of developing a simple script which will create the CREATE TABLESPACE DDL, but with transformed datafile paths. But my script throws some errors and I don't understand why:
ORA-31604: invalid NAME parameter "STORAGE" for object type TABLESPACE in function SET_FILTER

The script begins:

DECLARE
  l_hObject NUMBER;
  l_Objddl  CLOB;
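In DBMS_METADATA, STORAGE is a transform parameter (SET_TRANSFORM_PARAM), not a filter name, which would explain SET_FILTER rejecting it. For path-transformed tablespace DDL, a simpler hedged sketch (the '/old/path/' and '/new/path/' strings are placeholders) is to fetch the DDL with DBMS_METADATA.GET_DDL and substitute the datafile paths as text:

DECLARE
  l_ddl CLOB;
BEGIN
  FOR t IN (SELECT tablespace_name
              FROM dba_tablespaces
             WHERE contents = 'PERMANENT') LOOP
    l_ddl := DBMS_METADATA.GET_DDL('TABLESPACE', t.tablespace_name);
    -- Remap the datafile location by plain text substitution.
    l_ddl := REPLACE(l_ddl, '/old/path/', '/new/path/');
    DBMS_OUTPUT.put_line(l_ddl);
  END LOOP;
END;
/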
But it returns the error below. I only have access to user shan; our client can't allow me to use SYSTEM or SYSDBA or any other schema, or grant any other required privileges. So is there any way to take a metadata backup of user shan from user shan?
EXP-00008: ORACLE error 942 encountered
ORA-00942: table or view does not exist
EXP-00024: Export views not installed, please notify your DBA
EXP-00000: Export terminated unsuccessfully
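If no extra grants are possible, one hedged alternative is to extract the DDL from inside the shan schema itself with DBMS_METADATA, which needs no privileges beyond the user's own objects. A sketch (note that some USER_OBJECTS types must be mapped to DBMS_METADATA type names, e.g. DATABASE LINK to DB_LINK, so the IN list below is deliberately narrow):

SET LONG 1000000 PAGESIZE 0 LINESIZE 200

SELECT DBMS_METADATA.GET_DDL(object_type, object_name)
  FROM user_objects
 WHERE object_type IN ('TABLE', 'INDEX', 'VIEW', 'SEQUENCE', 'PROCEDURE');

Spooling this from SQL*Plus gives a plain-text metadata backup of the schema.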
I got an assignment to create an Oracle 11g database. I will be provided the full Data Pump export dump of an Oracle 10g database on Linux, and I need to import it into the 11g database on Windows. I have no information about the tablespaces, users, etc. I have created the database with SYSTEM, SYSAUX, UNDOTBS, TEMP and USERS tablespaces.
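Since the tablespaces and users in the dump are unknown, one hedged preparatory step is Data Pump's SQLFILE mode, which writes out the DDL the import would execute without actually running it (directory and file names below are placeholders):

impdp system/password directory=DATA_PUMP_DIR dumpfile=full10g.dmp full=y sqlfile=precheck.sql

Reading precheck.sql then reveals the CREATE TABLESPACE and CREATE USER statements to prepare before the real import.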
As a part of our backup we used to export the production data every day using the original export utility, but from 11g the original export utility is desupported, and Data Pump doesn't support our XML objects. So is there any other way to export the full database, or any option to export XML objects using Data Pump?
We have three instances in the RAC. When I create a directory, is there a default instance that it gets created on? When I execute a Data Pump export on instance 1 it works, but on instances 2 and 3 it fails with an error relating to the directory.
Error:

ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
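The directory object itself is stored once in the database, but the operating-system path it points to must exist on whichever node the Data Pump job actually runs on. A sketch of the usual check, plus re-creating the directory against a path shared by all nodes ('/shared/dp' is a placeholder):

SELECT directory_name, directory_path FROM dba_directories;

-- Point the directory at storage visible to every RAC node.
CREATE OR REPLACE DIRECTORY dp_dir AS '/shared/dp';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;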
(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without rebuilding them from scratch?

(2) I am trying to replace 'exp' and 'imp' with the Data Pump utilities, but I am confused about the parameters to be used:

(a) Can I load both data and metadata at the same time (using the CONTENT=ALL option)?
(b) I am planning to implement this in two steps: first load only the metadata using CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE, then load the data (see the sketch below).
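A hedged sketch of the two-step plan with impdp (connection details, directory and file names are placeholders); step two uses CONTENT=DATA_ONLY with TABLE_EXISTS_ACTION=APPEND so the rows land in the tables created in step one:

impdp scott/tiger directory=DP_DIR dumpfile=exp.dmp content=metadata_only table_exists_action=replace
impdp scott/tiger directory=DP_DIR dumpfile=exp.dmp content=data_only table_exists_action=append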
I'm trying to deploy a schema using the Data Pump API. The user from which the schema gets deployed has the direct privilege CREATE USER (not through a role), but I got an insufficient-privileges error.
Processing object type SCHEMA_EXPORT/USER
ORA-31685: Object type USER:"SCOTT1" failed due to insufficient privileges. Failing sql is:
CREATE USER "SCOTT1" IDENTIFIED BY VALUES '4EBB0DDE3C79FE47' DEFAULT TABLESPACE "USERS" TEMPORARY TABLESPACE "TEMP2" PROFILE "APP_PROFILE"
But the user gets created successfully when I run the CREATE statement manually. I created the user manually and ran the deployment procedure again, and got the error below for ROLE_GRANTS.
Processing object type SCHEMA_EXPORT/ROLE_GRANT
ORA-39083: Object type ROLE_GRANT failed to create with error:
ORA-01932: ADMIN option not granted for role 'EXP_FULL_DATABASE'
Failing sql is:
GRANT "EXP_FULL_DATABASE" TO "SCOTT1"
The user has EXP_FULL_DATABASE with the ADMIN option and IMP_FULL_DATABASE with the ADMIN option as direct privileges. Which privileges does the user need to deploy the schema successfully?
I am using a Data Pump import over a database link to import an entire schema from another server, but it gives issues with constraints. I tried to first import only the metadata, then disable the constraints, import the data and re-enable the constraints, but in this case the temp tablespace keeps filling up and I run out of space. Is there any method to do a full import including constraints and indexes?
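For reference, a minimal one-pass network import sketch (the link, schema and directory names are placeholders); with NETWORK_LINK there is no intermediate dump file, and constraints and indexes are carried in the same single run:

impdp scott/tiger network_link=src_link schemas=app directory=DP_DIR logfile=net_imp.log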
I have noticed that the Data Pump estimate is 9.902 GB, but when I check the size of the .dmp file it shows 1.44 GB.
Export: Release 11.2.0.1.0 - Production on Fri Apr 5 02:00:05 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** dumpfile=expdp_LVGITRN_30_24_050413.dmp directory=DP_DIR logfile=expdp_LVGITRN_30_24_050413.log full=y exclude=statistics
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 9.902 GB
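The BLOCKS method estimates from allocated blocks, so free space inside segments inflates it well beyond the final dump size. A hedged alternative is estimating from optimizer statistics instead (file and directory names are placeholders):

expdp system/password full=y directory=DP_DIR dumpfile=full.dmp estimate=statistics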
Is it possible to filter each row in a SELECT statement based upon a list that you loop through? For instance, rather than a WHERE clause with an IN list that matches any of the values at once, I want to loop the main SELECT through each value of the "in" filter.
Eg:
CREATE TABLE TRANSAC (
  Item_id          NUMBER,
  Transaction_Date DATE,
  Category_id      NUMBER
);
You could then write
SELECT MAX(Transaction_Date) FROM TRANSAC WHERE Category_id = 141;

and then WHERE Category_id = 142, then 143, and so on.
I understand I could use an inline view instead of a WHERE clause, but the issue is that I already have the list of values I want to compare against.
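If the goal is one MAX per value in the list, a grouped query over the whole list avoids the loop entirely; a sketch against the table above:

SELECT Category_id, MAX(Transaction_Date)
  FROM TRANSAC
 WHERE Category_id IN (141, 142, 143)
 GROUP BY Category_id;

GROUP BY returns one row per category in a single pass, which is usually cheaper than issuing one query per value.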
I was cloning a schema user1 as user2 in the same database.
user1 had quota on 2 tablespaces user1_data and user1_index.
I created a user named user2.
I created tablespace user2_data only and granted user2 unlimited quota on that tablespace only (I did not grant him the 'resource' role or the unlimited tablespace privilege). Then I exported the user1 schema as follows:
During import I encountered the following errors for many constraints:
"ALTER TABLE "table2" ADD CONSTRAINT "constraint_name1" FOREIGN KEY ("CTR_ID") REFERENCES "table1" ("CTR_ID") ENABLE NOVALIDATE" IMP-00003: ORACLE error 2270 encountered ORA-02270: no matching unique or primary key for this column-list IMP-00017: following statement failed with ORACLE error 2270:
I found that this happened because the primary key of table1 was not created; the error for that was logged in the log file:
. . importing table "table1" 19441 rows imported
IMP-00015: following statement failed because the object already exists:
"ALTER TABLE "table1" ADD CONSTRAINT "T1_PK79" PRIMARY KEY ("CTR_"
"ID") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 STORAGE(INITIAL 65536 F"
"REELISTS 1 FREELIST GROUPS 1) TABLESPACE "USER1_INDEX" LOGGING ENABLE "
. . importing table "table5" 0 rows imported
However, I checked that T1_PK79 does not exist in the user2 schema, though it exists in the user1 schema. Neither the index for the primary key (T1_PK79) nor the table table1 existed in user2 before this import. Then what could be the reason that I am getting the error "IMP-00015: following statement failed because the object already exists"?
I assume the tablespace for the index would not be an issue here, as other indexes got created properly in the user2_index tablespace during this import.
I tried this twice, once with the user2 schema and then with a user3 schema as well (with a different tablespace), but the result is the same.

There were no users connected to the database during the export, and no background jobs were modifying any data in schema user1 while it was running.
I am trying to reorganise a tablespace with Enterprise Manager from manual to automatic, but script generation gives the following warning:
Quote:Reorganization includes a table with a LONG column. To support reorganization of tables with LONG columns that are greater than 32Kbytes the external procedure MGMT$REORG_MOVELONGCOMMAND must be configured properly. It has been determined this external procedure is not currently configured as expected. Configure SQL*Net appropriately to allow it to call the external procedure service process.
I made changes suggested by [URL] ..... in the section Using Reorganize Objects with LONG Columns, but the warning persists. I noticed however that $ORACLE_HOME/lib/libnmuc.so is an empty file (0 bytes).
Oracle Database 10g 10.1.0.2.0
Solaris (SunOS 5.9 Generic_117171-07 (64-bit))
Oracle home: /data/lun1/oracle/product/10.1.0/db_1

{ORACLE_HOME}/network/admin/tnsnames.ora:

# tnsnames.ora Network Configuration File: /usr/oracle/product/10.1.0/db_1/network/admin/tnsnames.ora
# Generated by Oracle configuration tools.
[code]....
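For reference, the classic external-procedure listener setup that the warning asks for looks roughly like this (a sketch using the stock names EXTPROC and PLSExtProc; keys, SIDs and paths must be adjusted to the local environment):

tnsnames.ora:

EXTPROC_CONNECTION_DATA =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC))
    (CONNECT_DATA = (SID = PLSExtProc))
  )

listener.ora:

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = PLSExtProc)
      (ORACLE_HOME = /data/lun1/oracle/product/10.1.0/db_1)
      (PROGRAM = extproc)
    )
  )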
Upon form load, a TList will be populated with predefined items. The behavior I am trying to achieve is to have a text item so the user can enter specific text, which will then filter the values in the TList.
I am not sure if Oracle has a simple solution for a problem that I have in retrieving data.
Conditions:
As per the rules we have to hold data for 7-10 years so that it can be searched at any given time. We don't want to purge data, and we want to keep the historic data separate so the request can be redirected based on the year.
I am looking for a solution similar to the Windows .ocx search: when an application requests an .ocx (ActiveX control), the search happens in C:\windows\system32 first, and only if the .ocx doesn't exist there is the entire system searched. Similarly, can a table be "partitioned" (just the term I am using, not necessarily an Oracle partition) so I can search the data from 2011, 2012 and 2013 first, and only if the searched data doesn't exist there search the other partition (data from 2003-2010)?
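Oracle's range partitioning gives exactly this "search the likely partition first" behavior through partition pruning; a hedged sketch (the table and column names are invented for illustration):

CREATE TABLE request_history (
  req_id   NUMBER,
  req_date DATE,
  payload  VARCHAR2(4000)
)
PARTITION BY RANGE (req_date) (
  PARTITION p_old  VALUES LESS THAN (TO_DATE('2011-01-01', 'YYYY-MM-DD')),
  PARTITION p_2011 VALUES LESS THAN (TO_DATE('2012-01-01', 'YYYY-MM-DD')),
  PARTITION p_2012 VALUES LESS THAN (TO_DATE('2013-01-01', 'YYYY-MM-DD')),
  PARTITION p_2013 VALUES LESS THAN (TO_DATE('2014-01-01', 'YYYY-MM-DD'))
);

-- A query with a date predicate touches only the matching partition:
SELECT * FROM request_history
 WHERE req_date >= TO_DATE('2013-01-01', 'YYYY-MM-DD');

Partition pruning means the optimizer reads only p_2013 here; the historic partitions stay untouched unless the predicate spans them.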
I am receiving two large export files from a vendor, so I have no control over the contents. I need to import these into our database. The two export files are very similar, except that one has slightly different columns in it. So, export file 1 may have a table:
COLUMN_A COLUMN_B COLUMN_C
The second file may have:
COLUMN_A COLUMN_B COLUMN_D
At the destination, I have a table that has:
COLUMN_A COLUMN_B COLUMN_C COLUMN_D
Is there a parameter that would let me interchangeably import either (or both) files into this destination table? This is my first attempt at Data Pump, but I know the original import has caused me issues here. I'm not sure if the same limitations exist. Will the missing columns cause it to fail?
I have 2 Oracle servers: one with SUSE Linux (Oracle 10.2.0.4.0) and one with Windows 2003 Server x64 (Oracle 11.2.0.1.0). I made a mistake by installing Oracle 11g x32 on an x64 server. Nevertheless it worked for about half a year; then my backups with Data Pump stopped working. I changed SGA_TARGET down to 1024M and Data Pump worked again. Now I want to retire the Windows server and want to import the Data Pump schemas into the Linux server.
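Importing into the older 10.2 server requires the dump to be written in a downward-compatible format, which expdp's VERSION parameter controls. A hedged sketch run on the Windows 11.2 server (schema, directory and file names are placeholders):

expdp system/password schemas=app version=10.2 directory=DP_DIR dumpfile=app_for_10g.dmp

The resulting file can then be imported with the 10.2 impdp on the Linux side.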
I've just installed an Oracle 11.2.0.1.0 64-bit server on my Windows 7 machine in order to play around. While attempting to run
exec dbms_stats.gather_table_stats('eii','v2x4e')
I get the following:

ORA-20000: Unable to analyze TABLE "EII"."V2X4E", insufficient privileges or does not exist
ORA-06512: at "SYS.DBMS_STATS", line 20327
ORA-06512: at "SYS.DBMS_STATS", line 20360
ORA-06512: at line 1
My initial Google searches indicate that I need the SELECT ANY TABLE and ANALYZE ANY privileges. I don't think that can be right/appropriate, but I've granted them anyway, to no avail.
Select * from user_tables
returns tables in the SYSTEM and SYSAUX tablespaces, but not my own schema/tablespace?
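USER_TABLES only lists objects owned by the connected user, so seeing SYSTEM/SYSAUX objects suggests the session is not connected as the schema that owns the table. A quick sanity check (V2X4E taken from the error above):

SELECT user FROM dual;

SELECT owner, table_name, tablespace_name
  FROM all_tables
 WHERE table_name = 'V2X4E';

If the OWNER differs from the connected user, gather the stats as that owner or qualify ownname in the DBMS_STATS call.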
How do I create a DBMS_SCHEDULER job for a particular instance? We have 3 instances and want to schedule a few jobs on instance 2 only; how can a job be tied to a particular instance?
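On 11g, DBMS_SCHEDULER jobs have an INSTANCE_ID attribute that pins a job to one RAC instance; a minimal sketch, assuming a job named MY_JOB already exists:

BEGIN
  DBMS_SCHEDULER.SET_ATTRIBUTE(
    name      => 'MY_JOB',        -- hypothetical existing job
    attribute => 'INSTANCE_ID',
    value     => 2);              -- run only on instance 2
END;
/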
I have scheduled a few jobs (scheduled for each day) which have not run today, although they were running successfully until yesterday. When I query dba_scheduler_jobs, I can see last_run_date as yesterday's date at 6am, while next_run_date is still showing today's date and time (it should show tomorrow's date/time).
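A first diagnostic sketch is to check whether the jobs are still enabled and what state the scheduler reports for them (the job names in the IN list are placeholders):

SELECT job_name, enabled, state, last_start_date, next_run_date
  FROM dba_scheduler_jobs
 WHERE job_name IN ('JOB1', 'JOB2');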
When performing test recoveries of a database we often wish to prevent database jobs from running application processes such as (a) sending emails; or (b) logging events to a central logging database. In order to accomplish this, we set the job_queue_processes to zero before opening the database.
This means that no dbms_job jobs run on the open instance, but it does not prevent dbms_scheduler jobs from running. How can we prevent any dbms_scheduler jobs from running?
I have looked at setting the MAX_JOB_SLAVE_PROCESSES attribute using DBMS_SCHEDULER.SET_SCHEDULER_ATTRIBUTE, but this cannot be set to zero.
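One scheduler-wide switch worth testing (hedged: behavior should be verified on your release) is the SCHEDULER_DISABLED attribute, which stops the scheduler from launching any jobs until it is cleared:

BEGIN
  DBMS_SCHEDULER.SET_SCHEDULER_ATTRIBUTE('SCHEDULER_DISABLED', 'TRUE');
END;
/

-- Re-enable after the test recovery:
BEGIN
  DBMS_SCHEDULER.SET_SCHEDULER_ATTRIBUTE('SCHEDULER_DISABLED', 'FALSE');
END;
/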
I have a requirement to import text files generated by the 3D modelling software Xsteel, in which it records all geometric information, and I want to import this information into an Oracle table.
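A common approach for delimited text files is an external table; a minimal sketch where the directory path, file name, delimiter and columns are all assumptions about the Xsteel output:

CREATE OR REPLACE DIRECTORY xsteel_dir AS '/data/xsteel';

CREATE TABLE xsteel_geometry_ext (
  part_name VARCHAR2(100),
  x_coord   NUMBER,
  y_coord   NUMBER,
  z_coord   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY xsteel_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('geometry.txt')
);

-- Load into a permanent table (xsteel_geometry is a hypothetical target):
INSERT INTO xsteel_geometry SELECT * FROM xsteel_geometry_ext;

The external table reads the file in place, so reloading after each new Xsteel export is just a matter of replacing geometry.txt.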