Server Utilities :: How To Create Parameter Files

May 20, 2011

How do I create parameter (.par) files for expdp/impdp?
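
A minimal sketch of what such a parameter file can look like (the directory, dump file and schema names here are only placeholders), saved as exp_schema.par and referenced with PARFILE:

directory=DATA_PUMP_DIR
dumpfile=app_owner.dmp
logfile=app_owner_exp.log
schemas=app_owner

expdp system parfile=exp_schema.par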

View 1 Replies


Server Utilities :: Importing Text Files?

Sep 16, 2011

I have a rather complicated process to import text files into my DB. I'm given thousands of files every day, separated by "," and with 80 fields each. With a bash script, I take the 45 fields I need and then split each file into x number of files, grouping the rows by three fields. Then I use SQL*Loader to insert them into the DB.

The problem is that now I must insert into two tables, and the "WHEN" clause doesn't allow the use of > and <.

To make things a little clearer, take this text file (already split and grouped and ready to be inserted):
...
1,1,135,1900,0,12,114,2011/08/25 17:19:00,135,...
1,1,135,1900,0,13,119,2011/08/25 17:19:00,136,...
1,1,135,1900,0,14,117,2011/08/25 17:19:00,137,...
1,1,135,1900,0,15,113,2011/08/25 17:19:00,138,...
1,1,135,1900,0,16,119,2011/08/25 17:19:00,139,...
...
When field 6 is higher than or equal to 14, it must go to table a. When field 6 is lower than 14, it must go to table b. I can't use external tables as I'm on a different server.
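
Since the SQL*Loader WHEN clause only supports = and <> comparisons, one common workaround (sketched below with hypothetical table and column names, assuming the split file shown above) is to load everything into a staging table and then route the rows with plain SQL:

sqlldr user/password control=stage.ctl data=split_file.txt log=stage.log

INSERT INTO table_a SELECT * FROM stage_tab WHERE field6 >= 14;
INSERT INTO table_b SELECT * FROM stage_tab WHERE field6 < 14;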

View 1 Replies View Related

Server Utilities :: Merge Two Control Files?

May 14, 2012

1) Is there a way to load two tables from two input files in one control file?

I have two control files.

LOAD DATA
INFILE 'C:\name.txt'
BADFILE 'C:\name.bad'
DISCARDFILE 'C:

[code]....

Can I load emp_tab and job_table using one control file and two input files, name.txt and job.txt?

2) Is there a way to pass the path as a parameter to the control file?

In the job below, when I execute sqlldr, can I pass C:/job.txt as an input instead of specifying it in the control file?
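
A sketch of one possible control file for the first question. Because every record from both INFILEs is evaluated against every INTO TABLE clause, this assumes each record carries a one-character type indicator in position 1; the column names below are hypothetical:

LOAD DATA
INFILE 'C:\name.txt'
INFILE 'C:\job.txt'
APPEND
INTO TABLE emp_tab
WHEN (1:1) = 'E'
FIELDS TERMINATED BY ','
(rec_type FILLER POSITION(1), empno, ename)
INTO TABLE job_table
WHEN (1:1) = 'J'
FIELDS TERMINATED BY ','
(rec_type FILLER POSITION(1), job_id, job_name)

For the second question, the data file does not have to be hard-coded: if the control file omits INFILE, the file can be supplied on the command line, for example sqlldr scott/tiger control=job.ctl data=C:/job.txt log=job.log (the user name and paths are illustrative).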

View 3 Replies View Related

Server Utilities :: Loading Multiple XML Files Using SQL*Loader?

May 9, 2011

I am trying to load multiple XML files into an Oracle DB using SQL*Loader. The filenames of the XML files start with a description and then numbers, where the numbers are different each time.

Here's my CTL file:

LOAD DATA
INFILE *
INTO TABLE XML_TABLE TRUNCATE
xmltype(XML_TABLE)
FIELDS
(

[code]....

I don't want to keep having to go into the ctl file and change the numbers of the XML file. Is there a way I could just load all .xml files that begin with 'description'? Like maybe:

description*.xml
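
If the SQL*Loader release in use does not accept wildcards in INFILE, a small shell loop around sqlldr does the job. A sketch, assuming a Unix shell and a control file without a hard-coded INFILE so the file name arrives via the DATA parameter (user name and control file name are placeholders):

for f in description*.xml; do
  sqlldr user/password control=xml_table.ctl data="$f" log="${f%.xml}.log"
done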

View 1 Replies View Related

Server Utilities :: Export Table In Multiple Files?

Jun 18, 2013

I am exporting a big table (many rows: about 3,000,000). Using the exp command, the error message returned is "expdat.dmp > EXP-00028: failed to open expdat.dmp for write". Is there a possibility to export this table into multiple files (split into parts)?
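
A sketch of the usual approach with the original exp utility: list several dump files in FILE and cap each piece with FILESIZE, so exp rolls over to the next file as each one fills (the user, table name and sizes here are illustrative):

exp scott/tiger tables=big_table log=exp_big.log filesize=1024M file=exp_big_1.dmp,exp_big_2.dmp,exp_big_3.dmp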

View 10 Replies View Related

Server Utilities :: Load Data Into More Tables From Many Files

Jan 20, 2012

I want to load data into several tables from many files, based on the first column value, which is a FILLER field. I am trying to test this scenario with two Oracle tables with similar definitions, loading one record into each table using the WHEN/POSITION keywords. For this, I added the first column as a reference column in the data, which I have in the ctl file itself.

The 1st table loads with the 1st record, but the 2nd record is not loading. Have I missed anything with the WHEN/POSITION keywords?

This is the error in log file for 2nd table(WD1):

Record 2: Rejected - Error on table WD1, column TAB.
ORA-01841: (full) year must be between -4713 and +9999, and not be 0

Table WD1:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
1 Row not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
[code]....
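
One common cause of exactly this pattern (the first INTO TABLE loads, the second rejects with a date error) is that, with delimited fields, a second INTO TABLE clause keeps scanning the record from where the first clause stopped unless its first field restarts at POSITION(1). A sketch with hypothetical table and column names and illustrative data:

LOAD DATA
INFILE *
INTO TABLE wd
WHEN (1:1) = '1'
FIELDS TERMINATED BY ','
(rec_id FILLER POSITION(1), tab DATE "YYYY/MM/DD HH24:MI:SS", val)
INTO TABLE wd1
WHEN (1:1) = '2'
FIELDS TERMINATED BY ','
(rec_id FILLER POSITION(1), tab DATE "YYYY/MM/DD HH24:MI:SS", val)
BEGINDATA
1,2011/08/25 17:19:00,100
2,2011/08/26 09:00:00,200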

View 9 Replies View Related

Server Utilities :: Import With Buffer Parameter

Jan 12, 2011

We have a daily syncing script which drops and imports one major schema.

When we do the import with the parameter BUFFER=209715200, it takes approximately 4 hours, and when we do it without the parameter it takes 6 hours.

So I would like some insight into the above scenario, with respect to how the BUFFER parameter affects the import process.

Also, to clarify: the import is done during freeze hours.
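
For reference, a sketch of the kind of invocation being discussed (user, schema and file names are placeholders). A larger BUFFER simply lets conventional-path imp insert more rows per array insert, which is why it shortens the run:

imp system/password file=schema.dmp log=imp_schema.log fromuser=app_owner touser=app_owner buffer=209715200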

View 2 Replies View Related

Server Utilities :: Cannot Access NLS Data Files Or Invalid Environment Specified

May 15, 2010

I exported an Oracle database from one server into a dmp file, then imported it into another Oracle database server. When I looked at the imported data, the columns which were storing German data showed rubbish characters.

Then I remembered that the database I exported from has its NLS language set to German. I executed this statement to set the NLS on the new server:

alter system set nls_lang=german SCOPE=SPFILE

But now my database is not starting, always giving me the error: cannot access NLS data files or invalid environment specified.

I also set NLS_LANG=german in the Solaris environment.
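
NLS_LANG is a client environment variable, not a database initialization parameter, so it should not go into the spfile; the error quoted (ORA-12705) is usually raised when NLS_LANG in the environment holds an incomplete or invalid value, and "german" alone is incomplete. A recovery sketch under that assumption (the full format is LANGUAGE_TERRITORY.CHARACTERSET):

export NLS_LANG=GERMAN_GERMANY.WE8ISO8859P1    # or simply: unset NLS_LANG
sqlplus / as sysdba
SQL> STARTUP

If the spfile really was modified, create a pfile from it (CREATE PFILE FROM SPFILE), delete the offending line, start the instance with that pfile, and then rebuild the spfile.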

View 1 Replies View Related

Server Utilities :: How To Export One Table To Few Separated Dump Files

Mar 24, 2010

I have a table with 9 regions. How can I export them into 9 separate dump files?
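
A sketch of one way to do it with the original exp utility: run one export per region and filter with the QUERY parameter, preferably from a parameter file so the quotes survive the shell (the table, column and user names are placeholders). Contents of region1.par:

userid=scott/tiger
tables=sales
file=sales_region1.dmp
log=sales_region1.log
query="where region_id = 1"

exp parfile=region1.par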

View 4 Replies View Related

Server Utilities :: Loading Multiple Excel Files To Different Tables

Mar 29, 2012

I have a bunch of data in 50 Excel files. I need to load these 50 files into 50 different tables, and I would like to do this in one script. I went through the forum to get this information; people suggested creating a shell script, listing the sqlldr command multiple times, etc.

Provide some clarity on what the best approach is. If it is through shell scripting, provide the shell script and instructions to execute it. I am new to shell scripting.
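
A hedged sketch of the shell-script route. SQL*Loader cannot read .xls directly, so this assumes each workbook has been saved as a CSV and that every file_NN.csv has a matching file_NN.ctl pointing at its target table; all names are placeholders:

#!/bin/sh
for ctl in *.ctl; do
  base=${ctl%.ctl}
  sqlldr user/password control="$ctl" data="$base.csv" log="$base.log"
done

Run it (for example as sh load_all.sh) from the directory holding the control and CSV files.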

View 5 Replies View Related

Server Utilities :: Inserting Targz Files Into Table From Specified Folder?

Sep 20, 2012

I used the search option but I didn't find an answer to my question. My problem is importing files into a table (2 columns: 1 NUMBER, 2 CLOB). I can't use SQL*Loader because I need to load 30,000 tar.gz files from one folder, and the IMPORT option in SQL Developer didn't work either.
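
One approach that avoids SQL*Loader entirely is a PL/SQL loop over a directory object using DBMS_LOB. A sketch, assuming the binary tar.gz content goes into a BLOB column (binary archives do not really belong in a CLOB) and that the file names are available in a helper table; the directory, table, sequence and helper-table names below are all hypothetical:

CREATE OR REPLACE DIRECTORY tgz_dir AS '/path/to/folder';

DECLARE
  v_bfile BFILE;
  v_blob  BLOB;
  v_dest  INTEGER;
  v_src   INTEGER;
BEGIN
  FOR f IN (SELECT filename FROM file_list) LOOP
    v_bfile := BFILENAME('TGZ_DIR', f.filename);
    INSERT INTO archive_tab (id, content)
    VALUES (archive_seq.NEXTVAL, EMPTY_BLOB())
    RETURNING content INTO v_blob;
    DBMS_LOB.FILEOPEN(v_bfile, DBMS_LOB.FILE_READONLY);
    v_dest := 1;
    v_src  := 1;
    DBMS_LOB.LOADBLOBFROMFILE(v_blob, v_bfile, DBMS_LOB.LOBMAXSIZE, v_dest, v_src);
    DBMS_LOB.FILECLOSE(v_bfile);
  END LOOP;
  COMMIT;
END;
/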

View 1 Replies View Related

Server Utilities :: Multiple Export / Import Dump Files?

Apr 25, 2011

I am trying to export/import a schema whose size is around 60 GB.

Export parfile goes like this..
file=expdmp1.dmp, expdmp2.dmp, expdmp3.dmp, expdmp4.dmp, expdmp5.dmp, expdmp6.dmp, expdmp7.dmp
filesize=10240M
log=explog.log
owner=owner1

Import parfile goes like this..

file=impdmp1.dmp, impdmp2.dmp, impdmp3.dmp, impdmp4.dmp, impdmp5.dmp, impdmp6.dmp, impdmp7.dmp
filesize=10240M
log=implog.log
fromuser=owner1
touser=owner2
ignore=y

I am going to run this on production, so I want to check it.

View 2 Replies View Related

Server Utilities :: IMPORT (Impdp) Multiple Dump Files

Aug 2, 2012

I have more than 100 dump files to import into my Oracle 11g database. I know how to import (impdp) dumps that share the same name pattern, but here all the dump file names are totally different (e.g. aa.dmp, bb.dmp, ...).
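
Arbitrary file names can simply be listed, comma-separated, in the DUMPFILE parameter of a parameter file, provided they all belong to the same export set; if they are 100 independent exports, run impdp once per file instead (directory and names below are placeholders). Contents of import_all.par:

directory=DATA_PUMP_DIR
dumpfile=aa.dmp,bb.dmp,cc.dmp
logfile=imp_all.log

impdp system parfile=import_all.par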

View 3 Replies View Related

Server Utilities :: Exporting Schema Using Consistent Parameter

Aug 5, 2012

I am taking an export using the CONSISTENT parameter. Theoretically I understand it; practically I couldn't understand how it works.

For example:

I am updating the tab1 table under the sams user. The table has one lakh (100,000) records.
While it is being updated, I run the export with consistent=y and consistent=n, i.e.

exp sams/sams file=cons.dmp owner=sams consistent=y
exp sams/sams file=cons2.dmp owner=sams consistent=n

Then both files are imported into separate users (sam, san).
The updated info is not visible in either the san or the sam user.

I want to know practically how it works. I need a clear example of using consistent=y versus consistent=n.

View 2 Replies View Related

Server Utilities :: ROWS Command Line Parameter

Jan 11, 2013

I am new to SQL*Loader and I would like to know the maximum number of ROWS that can be loaded into a conventional-path bind array when specifying the command-line parameter.
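
In conventional path the bind array must fit within BINDSIZE (and must hold at least one row), so the usable ROWS value is bounded by how many rows fit in the bind array rather than by a simple fixed constant. A sketch of tuning the two together (values purely illustrative):

sqlldr user/password control=load.ctl log=load.log rows=5000 bindsize=20000000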

View 2 Replies View Related

Server Utilities :: Exporting Schema Using Filesize Parameter?

Aug 6, 2012

Export /Import
==============

While exporting schemas,

I couldn't export the dump files to the exact location I meant. See the following query:

QUERY
=====

exp file=ackupfile1.dmp,ackupfile2.dmp,ackupfile3.dmp owner=(order,purchase) filesize=5m

At the OS level, I found those dump files in the home directory.

-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp
[oracle@localhost ~]$ pwd
/home/oracle

When I list the home directory:

rw-r--r-- 1 oracle oinstall 72 Jun 20 21:17 afiedt.buf
drwxr-xr-x 3 oracle oinstall 4096 Jun 17 10:07 Desktop
-rw-r--r-- 1 oracle oinstall 71 Jun 19 20:42 ed.hup
drwxr-xr-x 2 oracle oinstall 4096 Aug 6 19:38 backup
-rw-r--r-- 1 oracle oinstall 2826240 Aug 6 19:39 expdat.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile1.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile2.dmp
-rw-r--r-- 1 oracle oinstall 5242880 Aug 6 19:38 expfile3.dmp

The dump files go to the home path even though I mentioned the appropriate location.
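
exp writes relative file names into the current working directory, so the usual fix is either to cd into the target directory first or to give every dump file (and the log) an absolute path. A sketch with illustrative names:

exp system/password owner=(order,purchase) filesize=5m log=/home/oracle/backup/exp_op.log file=/home/oracle/backup/expfile1.dmp,/home/oracle/backup/expfile2.dmp,/home/oracle/backup/expfile3.dmp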

View 7 Replies View Related

Server Utilities :: Export Illegal Parameter Value In OCI Lob Function?

Jan 21, 2011

We are trying to export our production data. We got this error:

. . exporting table EA_BLOB
EXP-00056: ORACLE error 24801 encountered
ORA-24801: illegal parameter value in OCI lob function

How can we overcome this error?

View 2 Replies View Related

Server Utilities :: Load 780 CSV Files Into 12 Tables Created In Database - Sql Loader?

Jul 22, 2011

I have 780 (12*65) CSV files generated from 65 databases. Now I have to load these 780 CSV files into 12 tables created in my database for monitoring and reporting purposes. To call SQL*Loader I am planning to create 780 lines like below.

sqlldr abc@tns/pwd control='E:htmlctlhtml_broken_jobs_rpt.ctl' log='E:htmlreportloghtml_broken_jobs_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_db_size_rpt.ctl' log='E:htmlreportloghtml_db_size_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_fragmentation_rpt.ctl' log='E:htmlreportloghtml_fragmentation_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_index_stats_rpt.ctl' log='E:htmlreportloghtml_index_stats_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_invalid_object_rpt.ctl' log='E:htmlreportloghtml_invalid_object_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_long_running_queries_rpt.ctl' log='E:htmlreportloghtml_long_running_queries_rpt.log'

We know creating 780 control files is a difficult task, so I have created only 12 control files. Is there any mechanism in SQL*Loader to pass a variable (planned to be declared on the sqlldr line) to the INFILE clause, like below?

infile "E:htmlreportoutput&a_html_broken_jobs_rpt.csv"

Here a is the variable name; it will change once for every 12 CSV files.

or

Is there any other way to achieve this?
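
SQL*Loader does not substitute variables inside a control file, but the data file does not have to be named there at all: if the 12 control files omit INFILE, each CSV can be passed on the command line with the DATA parameter, so the same control file is reused for all 65 databases. A sketch in the Windows style of the commands above (the %CSV% variable is hypothetical):

sqlldr abc@tns/pwd control=html_broken_jobs_rpt.ctl data=%CSV% log=%CSV%.log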

View 8 Replies View Related

Server Utilities :: Syntax Error In Using Query Parameter In Expdp

Aug 17, 2013

I want to take an export of the table MESSAGE and filter it for the day of 17 JUL 2013 (just to limit the size). I used the following expdp command but it's not working.

expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile=FA0001P_BG_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')

But with a SELECT query I am able to retrieve the rows for the specific date.

select * from MESSAGE where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
Here is the command with syntax error.
[oracle@orcl log]$ expdp SYSTEM directory=DATA_PUMP_DIR dumpfile=DB_16_08_2013.dmp logfile= DB_16_08_2013.log TABLES=schema.MESSAGE QUERY=schema.MESSAGE:where created_on between to_date('17-July-13 00:00:00','DD-Mon-YY hh24:MI:SS') and to_date('17-July-13 23:59:00','DD-Mon-YY hh24:MI:SS')
-bash: syntax error near unexpected token `('
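
The bash error comes from the shell trying to interpret the parentheses and quotes, not from expdp itself; the usual fix is to move the QUERY into a parameter file. A sketch reusing the names from the command above. Contents of exp_message.par:

directory=DATA_PUMP_DIR
dumpfile=DB_16_08_2013.dmp
logfile=DB_16_08_2013.log
tables=schema.MESSAGE
query=schema.MESSAGE:"WHERE created_on BETWEEN TO_DATE('17-JUL-2013 00:00:00','DD-MON-YYYY HH24:MI:SS') AND TO_DATE('17-JUL-2013 23:59:59','DD-MON-YYYY HH24:MI:SS')"

expdp SYSTEM parfile=exp_message.par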

View 3 Replies View Related

Server Utilities :: Query Parameter - Export Subset Of Table Using RowID

Aug 15, 2012

I am exporting using the QUERY parameter. I am trying to export a subset of a table using ROWID.

SQL> select rowid , name from tab1;

ROWID NAME
------------------ ---------------
AAAM0rAAEAAAAGMAAA sam
AAAM0rAAEAAAAGMAAB sona
AAAM0rAAEAAAAGMAAC rose
AAAM0rAAEAAAAGMAAD chris
AAAM0rAAEAAAAGMAAE san
.................. ....
.................. ....

command given as

exp sam/sam tables=tab1 file=exprwid.dmp query="where ROWID='AAAM0rAAEAAAAGMAAA'" log=log1.log

Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)

About to export specified tables via Conventional Path ...
. . exporting table TAB1
EXP-00056: ORACLE error 911 encountered
ORA-00911: invalid character
Export terminated successfully with warnings.

How can I export this record?
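
The ORA-00911 is the shell stripping the quotes before exp sees them; escaping the quotes or using a parameter file usually resolves it. A sketch of the parameter-file form, reusing the names above. Contents of exp_rowid.par:

userid=sam/sam
tables=tab1
file=exprwid.dmp
log=log1.log
query="where ROWID = 'AAAM0rAAEAAAAGMAAA'"

exp parfile=exp_rowid.par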

View 4 Replies View Related

Server Utilities :: How To Create Sequence

Jul 16, 2011

Every time the sequence should start from 1. Can I know how to create such a sequence? For example: once I have uploaded 100 records, the sequence generates values from 1 to 100. The next time I upload another 100 records, the sequence generates (starts) from 200 to 300, but it should generate from 1 again. How? I have created it like this:

CREATE SEQUENCE XX_SEQUENCE
MINVALUE 1
MAXVALUE 99999999999
START WITH 1
INCREMENT BY 1
CYCLE
NOCACHE;
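
A sequence never rewinds on its own, so if each upload really must start at 1, the simplest approach is to reset the sequence just before the load, for example by dropping and recreating it (recent releases also allow ALTER SEQUENCE ... RESTART). A sketch:

DROP SEQUENCE xx_sequence;
CREATE SEQUENCE xx_sequence
MINVALUE 1
MAXVALUE 99999999999
START WITH 1
INCREMENT BY 1
NOCACHE;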

View 2 Replies View Related

Server Utilities :: Loading Multiple Input Files Into Multiple Tables

Jul 9, 2012

File :-3rivieres.export.ngf

NGFID;RECTYPE;RECNAME
25;7;POLES
PARENT
CHILD;1401;9845075;2020
817;8;SUPPORT
PARENT
CHILD

Required output:-

AREA_SRNO = 1
AREA_NAME = '3rivieres.export.ngf'

File :-mauri.export.ngf

NGFID;RECTYPE;RECNAME
257;7;POLES
PARENT
CHILD;1401;9845075;2020
8174;8;SUPPORT
PARENT
CHILD

Required output:-

AREA_SRNO = 2
AREA_NAME = 'mauri.export.ngf'....etc

CREATE TABLE NGF_REC_LINK
(
AREA_SRNO NUMBER(2),
AREA_NAME VARCHAR2(40),
NGFID NUMBER(20),
TABLENAME VARCHAR2(40),
PARENT VARCHAR2(200),
[code].......

Find the ctl file (ngf_test.ctl) and modify it as per my requirement.

View 6 Replies View Related

Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether importing to tape is possible. If so, would the data be accessible if needed later?

View 4 Replies View Related

Server Utilities :: Tool To Create CTL File?

Oct 25, 2011

Are there any GUI based tools that can auto generate a CTL file based off a CSV input? I'd love something like this since I have quite a few SQL*LDR projects coming up!

View 2 Replies View Related

Server Utilities :: How To Create External Table

Aug 10, 2012

I'm trying to create an external table, and I load my data without any problem, and everything is fine, but I get some behavior with one column for which I would like to know what's going on behind the scenes. OK, let's look at the example:

[*] Sample Data
Line 1:333 1111111112009100000000000080000000013450.33
Line 2:11111111111220091016000000004.48
Line 3:222222222 220091016000000004.48
Line 4:(This is a blank line left)

And this is my External Table Create Query:

CREATE TABLE EXT_TABLE_TEMP
(COL_A VARCHAR2(11),
COL_B VARCHAR2(1),
COL_C DATE,
COL_D NUMBER(12,2))
ORGANIZATION EXTERNAL

[code]....

As you can see, I can load my table with no problem, but I always get 3 lines (counting the last blank line) if I try LOAD WHEN COL_A != BLANKS. I don't know if it's a problem of the blank space left between the fixed-length fields, but if I do LOAD WHEN COL_B != BLANKS I get the correct result, 2 lines instead of 3. I want to know why (missing fields ...) and (reject rows ...) are not working.

Note: COL_A can be 9-11 characters long; if its length is 9, then 2 spaces are left before the next field...

View 4 Replies View Related

Server Utilities :: Unable To Create Public Synonym?

May 22, 2011

I have exported and imported a schema from one server to another. In the source schema, I have a public synonym. I do not know the name of that synonym. In the destination schema, the public synonym is missing. How do I create the public synonym which is missing in the destination database? In the source, I queried dba_synonyms, all_synonyms... but it returns no rows selected.
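
Public synonyms are owned by PUBLIC, so a schema-mode export/import does not carry them; if dba_synonyms returned no rows, check that the query was run as a privileged user and filtered on OWNER = 'PUBLIC'. A sketch for finding them in the source and recreating them in the destination (schema and object names are placeholders):

SELECT synonym_name, table_owner, table_name
FROM dba_synonyms
WHERE owner = 'PUBLIC'
AND table_owner = 'SOURCE_SCHEMA';

CREATE PUBLIC SYNONYM my_synonym FOR source_schema.my_table;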

View 3 Replies View Related

Server Utilities :: Unable To Create Control File?

Mar 8, 2011

I tried a lot to load data into a table from Excel (.csv) using SQL*Loader, but SQL*Loader doesn't accept the control file I created using Notepad (.ctl). Though I gave the file a .ctl extension, it is treated as a .txt file. Is there an alternate way to create it?
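
There is nothing special about the file itself; the usual culprit is Notepad silently appending .txt to the name. Saving it as "load.ctl" (with the surrounding quotes in the Save dialog), or renaming it afterwards, is normally enough. A sketch (file names are placeholders):

ren load.ctl.txt load.ctl
sqlldr scott/tiger control=load.ctl data=data.csv log=load.log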

View 3 Replies View Related

Server Utilities :: How To Create Control File To Insert Data In Our Database

Dec 21, 2011

How do I create a control file and load the data into our database through the command window using SQL*Loader? I have the table structure in my database and a .csv file on my desktop.
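
A minimal sketch of both pieces, with hypothetical table and column names and a hypothetical desktop path: the control file (emp.ctl) describes the CSV layout, and sqlldr is then run from the command window.

LOAD DATA
INFILE 'C:\Users\me\Desktop\emp.csv'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(empno, ename, hiredate DATE "DD/MM/YYYY", sal)

sqlldr scott/tiger control=C:\Users\me\Desktop\emp.ctl log=emp.log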

View 20 Replies View Related

Server Utilities :: SQL Loader - How To Dynamically Create Multiple Records Out Of One CSV Input Line

Jul 28, 2010

I need to load data from a CSV file where one of the CSV values determines how many records should be inserted.

Example of the input data:

KEYWORD;2;REC1_COL1_X,REC1_COL2_X;REC2_COL1_X;REC2_COL2_X
KEYWORD;3;REC1_COL1_Y;REC1_COL2_Y,REC2_COL1_Y;REC2_COL2_Y;REC3_COL1_Y;REC3_COL2_Y
KEYWORD;4;REC1_COL1_Z;REC1_COL2_Z,REC2_COL1_Z;REC2_COL2_Z;REC3_COL1_Z;REC3_COL2_Z,REC4_COL1_Z;REC4_COL2_Z
If the KEYWORD is found, then the next value determines how many value pairs will follow, and therefore how many rows should be created in the affected DB table.

As a result I hope to achieve this:

SELECT Column1, Column2 FROM testTable

REC1_COL1_X,REC1_COL2_X
REC2_COL1_X;REC2_COL2_X
REC1_COL1_Y;REC1_COL2_Y
REC2_COL1_Y;REC2_COL2_Y
REC3_COL1_Y;REC3_COL2_Y
REC1_COL1_Z;REC1_COL2_Z
REC2_COL1_Z;REC2_COL2_Z
REC3_COL1_Z;REC3_COL2_Z
REC4_COL1_Z;REC4_COL2_Z

I learned how to import data using Oracle SQL loader for cases where one input line more or less matches a (new) row in a DB table.
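
SQL*Loader cannot emit a variable number of rows per input line by itself, so one workable pattern is to load each raw line unparsed into a one-column staging table and then explode it with PL/SQL. A sketch, assuming a staging table raw_lines(line VARCHAR2(4000)) and that every value after the keyword and the pair count is ';'-separated (the names and the single delimiter are assumptions):

DECLARE
  v_pairs PLS_INTEGER;
BEGIN
  FOR r IN (SELECT line FROM raw_lines WHERE line LIKE 'KEYWORD;%') LOOP
    v_pairs := TO_NUMBER(REGEXP_SUBSTR(r.line, '[^;]+', 1, 2));
    FOR i IN 1 .. v_pairs LOOP
      INSERT INTO testtable (column1, column2)
      VALUES (REGEXP_SUBSTR(r.line, '[^;]+', 1, 2*i + 1),
              REGEXP_SUBSTR(r.line, '[^;]+', 1, 2*i + 2));
    END LOOP;
  END LOOP;
  COMMIT;
END;
/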

View 9 Replies View Related

Server Utilities :: ORA-01658 / Unable To Create INITIAL Extent For Segment In Tablespace USERS

Apr 4, 2013

While importing data I got this error in my log file, and I cannot import my data successfully.

In my log file I found errors like this:

ORA-01658: unable to create INITIAL extent for segment in tablespace USERS
IMP-00017: following statement failed with ORACLE error 1658:
IMP-00003: ORACLE error 1658 encountered
ORA-01658: unable to create INITIAL extent for segment in tablespace USERS
IMP-00017: following statement failed with ORACLE error 1658:

I am importing my data using the imp utility with this syntax:

C:\Users\Administrator>imp tiger/****@tcs file=E:\DUMP\ts.dmp log=E:\DUMP\ts.log fromuser=tiger121 touser=tiger statistics=none

This is my user tiger's default tablespace; it is autoextend on and locally managed, and I have enough space on my drive as well.
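
ORA-01658 means the USERS tablespace has no free extent big enough for the segment's INITIAL extent, even if the OS drive itself has space; checking the free space and growing (or autoextending) the datafile is the usual fix. A sketch (the datafile path and sizes are illustrative):

SELECT tablespace_name, SUM(bytes)/1024/1024 AS free_mb, MAX(bytes)/1024/1024 AS largest_mb
FROM dba_free_space
WHERE tablespace_name = 'USERS'
GROUP BY tablespace_name;

ALTER DATABASE DATAFILE 'E:\ORADATA\TCS\USERS01.DBF' RESIZE 2G;
-- or
ALTER DATABASE DATAFILE 'E:\ORADATA\TCS\USERS01.DBF' AUTOEXTEND ON NEXT 100M MAXSIZE UNLIMITED;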

View 21 Replies View Related






