SQL & PL/SQL :: Attach External File Using UTL_SMTP Package?
Feb 7, 2012. I want to attach an external file to an email using the UTL_SMTP package.
The file to attach will be present in a directory.
Also, how do I send an email to multiple recipients? Is there a delimiter that I have to use to separate the email addresses, or do I have to call utl_smtp.rcpt(c, email) multiple times?
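As far as I know there is no delimiter; UTL_SMTP takes one recipient per RCPT command, so the usual approach is to call utl_smtp.rcpt once per address. A minimal sketch, where the SMTP host and the addresses are made up:

DECLARE
  c        UTL_SMTP.CONNECTION;
  TYPE addr_list IS TABLE OF VARCHAR2(200);
  l_recips addr_list := addr_list('first@to.com', 'second@to.com');  -- hypothetical recipients
BEGIN
  c := UTL_SMTP.OPEN_CONNECTION('smtp.example.com', 25);  -- hypothetical SMTP host
  UTL_SMTP.HELO(c, 'example.com');
  UTL_SMTP.MAIL(c, 'mail@from.com');
  -- one RCPT call per recipient
  FOR i IN 1 .. l_recips.COUNT LOOP
    UTL_SMTP.RCPT(c, l_recips(i));
  END LOOP;
  UTL_SMTP.OPEN_DATA(c);
  UTL_SMTP.WRITE_DATA(c, 'To: first@to.com, second@to.com' || UTL_TCP.CRLF);
  UTL_SMTP.WRITE_DATA(c, 'Subject: test' || UTL_TCP.CRLF || UTL_TCP.CRLF);
  UTL_SMTP.WRITE_DATA(c, 'Hello');
  UTL_SMTP.CLOSE_DATA(c);
  UTL_SMTP.QUIT(c);
END;
/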
I tried to create the UTL_SMTP package using the '@?/rdbms/admin/utlmtp.sql' script, but no package body for UTL_SMTP is created; only the UTL_SMTP package specification is created.
I can send mail with UTL_MAIL using this:
CREATE OR REPLACE PROCEDURE send_mail_file_seti IS
BEGIN
UTL_MAIL.SEND_ATTACH_VARCHAR2(
sender => 'mail@from.com',
recipients => 'mail@to.com',
message => '<HTML><BODY>See attachment</BODY></HTML>',
[code]........
What I want to do is send mail attaching a file from the file system. For example, I have this file:
/oracle/pmp/file
I want to send it using a procedure like the one above.
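Since SEND_ATTACH_VARCHAR2 expects the attachment content inline, one option is to read the OS file into a RAW through a BFILE and call SEND_ATTACH_RAW instead. A minimal sketch, assuming a directory object PMP_DIR pointing at /oracle/pmp (a hypothetical name); note SEND_ATTACH_RAW is limited to roughly 32K, beyond which UTL_SMTP with base64-encoded chunks is the usual route:

CREATE OR REPLACE PROCEDURE send_mail_os_file IS
  l_bfile BFILE := BFILENAME('PMP_DIR', 'file');  -- hypothetical directory object
  l_raw   RAW(32767);
  l_len   PLS_INTEGER;
BEGIN
  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  l_len := DBMS_LOB.GETLENGTH(l_bfile);
  l_raw := DBMS_LOB.SUBSTR(l_bfile, LEAST(l_len, 32767), 1);  -- read the file content as RAW
  DBMS_LOB.CLOSE(l_bfile);

  UTL_MAIL.SEND_ATTACH_RAW(
    sender       => 'mail@from.com',
    recipients   => 'mail@to.com',
    message      => '<HTML><BODY>See attachment</BODY></HTML>',
    attachment   => l_raw,
    att_filename => 'file');
END;
/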
I have an .xls file which needs to be the last page of an RDF report. I would prefer not to type up the contents of the file and lay them out manually. Is there some way I can link this to the footer section, or any other alternative?
It's been some time since I've written any PL/SQL. I'm getting "Found 'CURSOR' Expecting: External Language" for line 76 when I try to compile:
1 create or replace PACKAGE pa_user_maint
2 AS
3
4
5 -- ------------------------------------------------------------------------------------
6 -- Exceptions
7 -- ------------------------------------------------------------------------------------
8
9 -- User Exists
10
11 ex_existingUser exception;
12 PRAGMA EXCEPTION_INIT (ex_existingUser, -01920);
13
[code].......
I have got a procedure that successfully creates an Oracle external table and populates it with the contents of a file. This works fine until I have a situation where one of the fields is a VARCHAR2(2) and I try to insert, say, a 5-character value. When this happens the record in question does not get populated in the external table (and rightly so), but I need to work out whether there is a discrepancy between the number of records in the file and the number of records that actually make it into the table, so I can inform the user that there is a problem.
I have attached the code that creates the external table and populates it.
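One way to detect the discrepancy (a sketch, not taken from the attached code) is to define a second external table over the same file with a single large VARCHAR2 column and no field parsing, so every line counts, and compare its row count with the real table's; counting the rows that land in the BADFILE would be an alternative:

-- raw_lines and my_external_table are hypothetical names;
-- raw_lines exposes each line of the file as one VARCHAR2(4000) value
DECLARE
  l_file_rows NUMBER;
  l_good_rows NUMBER;
BEGIN
  SELECT COUNT(*) INTO l_file_rows FROM raw_lines;
  SELECT COUNT(*) INTO l_good_rows FROM my_external_table;
  IF l_file_rows <> l_good_rows THEN
    DBMS_OUTPUT.PUT_LINE((l_file_rows - l_good_rows) || ' record(s) were rejected.');
  END IF;
END;
/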
In a control file, the code is as follows:
load data
infile 'C:\Documents and Settings\xxxxx\Desktop\abc.txt'
APPEND
PRESERVE BLANKS
INTO TABLE table1
[code]...
When I run the above control file in sqlldr, I get the following error:
Record 1: Rejected - Error on table table1, column column3.
ORA-01481: invalid number format model
In the table, the column3 data type is NUMBER(6,2): the column size in the table is 6, while the field for column3 in the control file is only 4 characters wide. Also, if possible, show me a couple of dummy records that work with the above control file, especially for column3, where a decimal number comes in the flat file.
For generating the flat file, for column3 I'm using LPAD(:value, 4, 0) in the SELECT query column list.
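For reference, here is a hedged control-file sketch; the positions, the second column and the sample records are made up, since the full field list is behind the [code] link. It maps the 4-character zero-padded field into the NUMBER(6,2) column with DECIMAL EXTERNAL and no explicit number format model (ORA-01481 normally points at an invalid format model somewhere in a field definition or SQL expression):

load data
infile 'C:\Documents and Settings\xxxxx\Desktop\abc.txt'
APPEND
PRESERVE BLANKS
INTO TABLE table1
(
  column1  POSITION(1:10)  CHAR,
  column2  POSITION(11:20) CHAR,
  column3  POSITION(21:24) DECIMAL EXTERNAL
)

Two hypothetical data records (column3 at positions 21-24, as produced by LPAD(:value,4,0)):
AAAAAAAAAABBBBBBBBBB0269
CCCCCCCCCCDDDDDDDDDD0006

Note that with no decimal point in the field, "0269" loads as 269, not 2.69; if fractional values are needed, either the flat file has to carry the decimal point, or the control file has to scale on load, e.g. column3 POSITION(21:24) DECIMAL EXTERNAL ":column3/100".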
We are encountering an issue with an Oracle external table. We get the following error when we load a particular file into this table:
ORA-12801: error signaled in parallel query server P000
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-30653: reject limit reached
ORA-06512: at "SYS.ORACLE_LOADER", line 52
We have narrowed this down to a text field in the file that contains the following text (text obfuscated):
XXXXXXXX ł XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
This is nominally 40 characters long, which matches the maximum field size, but it is being rejected because it appears the 'ł' character is causing Oracle to interpret the length as 41 characters instead. If I remove an X, the file loads without issue.
We tried this in a new schema we created where we added the same table and used the same file. There was no problem at all. The oracle database has the following settings:
NLS_CHARACTERSET        UTF8
NLS_NCHAR_CHARACTERSET  UTF8
NLS_LENGTH_SEMANTICS    BYTE
NLS_LANGUAGE            AMERICAN
The table is defined as follows:
CREATE TABLE XXXXXXXXXXXXXXXXXXXXXX.XXXXXXXXX_INTERFACE
(
XXXXXXXXXXXXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXXXXX VARCHAR2(40 CHAR),
[code]........
We tried adding the following attributes but they did not seem to make any difference:
CHARACTERSET UTF8
STRING SIZES ARE IN CHARACTERS
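For comparison, here is a hedged sketch of the kind of access parameters we would expect to handle multi-byte data with character semantics; the directory object, file name and field names are placeholders, and CHAR(40) sizes the data-file fields, which otherwise default to byte sizing:

CREATE TABLE xxx_interface (
  field1 VARCHAR2(40 CHAR),
  field2 VARCHAR2(40 CHAR)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir          -- hypothetical directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE CHARACTERSET UTF8
    STRING SIZES ARE IN CHARACTERS
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (field1 CHAR(40), field2 CHAR(40))
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;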
I have a simple external table
CREATE TABLE xyz
(
UNIQUE_ID VARCHAR2(255 BYTE),
FULL_BUSINESS_NAME VARCHAR2(255 BYTE),
ADDRESS_1 VARCHAR2(255 BYTE),
ADDRESS_2 VARCHAR2(255 BYTE),
CITY VARCHAR2(255 BYTE),
[code].......
I need to make the location parameterized so we don't have to hard-code the file name file1. Can I do this in Oracle, or do I need to do this in a shell script on the UNIX server?
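One option inside the database is to change the external table's LOCATION with dynamic SQL before each load; a sketch, where only the procedure name set_xyz_location is invented:

CREATE OR REPLACE PROCEDURE set_xyz_location (p_file_name IN VARCHAR2) IS
BEGIN
  -- repoint the external table at a different file in its directory
  EXECUTE IMMEDIATE 'ALTER TABLE xyz LOCATION (''' || p_file_name || ''')';
END;
/

-- usage
EXEC set_xyz_location('file2.csv');
SELECT COUNT(*) FROM xyz;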
My DB version: Oracle 11g. I have an empty CSV file, and I created an external table for it. When I run: select count(*) from externaltblname; it returns 1. It should return 0, right? In the definition I specified "SKIP 1", but it still returns 1. When I use this external table to load into a target table, it loads a single row with null values. How do I fix this?
We are using a hand-scanning machine for attendance; the machine saves data in a text file, and now I want to load the data into my Oracle-based payroll system.
The data is saved in this format:
31201009240928000100000002690001
31201009240933000100000000060001
as per my understanding
20100924 is date
0928 is time
269 is employee code
but I am unable to tell whether this is the IN or the OUT time.
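For what it's worth, here is a sketch of how the sample lines could be split with SUBSTR; the positions are only inferred from the two sample records, so treat them as assumptions (attendance_raw is a hypothetical staging table with one column, rec):

SELECT TO_DATE(SUBSTR(rec, 3, 8) || SUBSTR(rec, 11, 4), 'YYYYMMDDHH24MI') AS punch_time,
       TO_NUMBER(SUBSTR(rec, 19, 10))                                     AS emp_code,
       SUBSTR(rec, 15, 4)                                                 AS unknown_flag_1,  -- possibly the IN/OUT indicator
       SUBSTR(rec, 29, 4)                                                 AS unknown_flag_2
FROM   attendance_raw;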
I need to compare the data in a .csv file with the result of a SELECT query, or with any table's data.
Are there any possibilities in Oracle to fulfill this requirement?
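One common pattern is to expose the CSV as an external table and compare the two sets with MINUS; a sketch, where ext_csv, target_table and the column names are placeholders:

-- rows present in the CSV but missing from the table
SELECT col1, col2, col3 FROM ext_csv
MINUS
SELECT col1, col2, col3 FROM target_table;

-- rows present in the table but missing from the CSV
SELECT col1, col2, col3 FROM target_table
MINUS
SELECT col1, col2, col3 FROM ext_csv;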
Actually, what I am trying to do is extract data from tables and place it in an external text file. I wrote the following code:
FUNCTION
create or replace
FUNCTION dump_data ( p_query in varchar2,
p_separator in varchar2 ,
[Code].....
I need to load a CSV file using an external table.
Structure of External Table:
---------------------------
create table A (col1 varchar2(30), col3 varchar2(30), col5 varchar2(30));
CSV FILE:
-----------
col1,col2,col3,col4,col5
A,B,C,D,E
1,2,3,4,5
The table data should look like
COL1 COL3 COL5
A C E
1 3 5
I need to skip columns col2 and col4 of the CSV file.
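As far as I know, the ORACLE_LOADER driver simply ignores data-file fields that are described in the access parameters but not declared as table columns, so col2 and col4 can be skipped like this (ext_dir and data.csv are hypothetical names):

CREATE TABLE a (col1 VARCHAR2(30), col3 VARCHAR2(30), col5 VARCHAR2(30))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                                   -- skip the col1,col2,... header line
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (col1 CHAR(30), col2 CHAR(30), col3 CHAR(30), col4 CHAR(30), col5 CHAR(30))
  )
  LOCATION ('data.csv')
)
REJECT LIMIT UNLIMITED;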
Is it possible to trim the file name while loading into OWB through external tables?
For example, suppose I am trying to read a file which has a timestamp value appended to its name. In that case loading through the external table would give an error.
I have a question on the following, which gets defined for the bad file and the log file:
BADFILE 'bad_%a_%p.bad'
LOGFILE 'log_%a_%p.log'
What do %a and %p indicate? Also, if I wanted to get the values of %p and %a into a variable, how would I do it? I want to be able to append %p and %a to the variable below, but I am unsure how to achieve it:
l_badfile := file_nm || '.bad' ;
I want to store an Excel or MS Word document in the Oracle database. Is it possible to view that file from the database? If I export the full database, will it be included in the dump?
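A common way to hold such files is a BLOB column loaded from a directory object; below is a minimal sketch (DOC_DIR, the table and the file name are hypothetical). Viewing the file afterwards still needs a client or application that understands the Excel/Word format, and data stored in the table, BLOBs included, is part of a full export dump.

CREATE TABLE documents (id NUMBER PRIMARY KEY, file_name VARCHAR2(200), content BLOB);

DECLARE
  l_bfile BFILE := BFILENAME('DOC_DIR', 'report.xls');  -- hypothetical directory and file
  l_blob  BLOB;
BEGIN
  INSERT INTO documents (id, file_name, content)
  VALUES (1, 'report.xls', EMPTY_BLOB())
  RETURNING content INTO l_blob;

  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADFROMFILE(l_blob, l_bfile, DBMS_LOB.GETLENGTH(l_bfile));
  DBMS_LOB.CLOSE(l_bfile);
  COMMIT;
END;
/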
Using Oracle 10g, I currently create many external tables like so:
CREATE TABLE "XT_UNITS"
(
"Q1_2012" VARCHAR2(25 BYTE),
"Q2_2012" VARCHAR2(25 BYTE),
"Q3_2012" VARCHAR2(25 BYTE),
"Q4_2012" VARCHAR2(25 BYTE)
[code]....
Is there any way I can use one flat file (CSV) to populate many external tables?
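As far as I know, nothing stops several external tables from pointing at the same LOCATION file, each declaring only the columns it needs while the access parameters describe every field in the file; a rough sketch (directory object, file and field names are made up):

-- both tables read the same units.csv
CREATE TABLE xt_units_q1_q2 (q1_2012 VARCHAR2(25), q2_2012 VARCHAR2(25))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' MISSING FIELD VALUES ARE NULL
    (q1_2012 CHAR(25), q2_2012 CHAR(25), q3_2012 CHAR(25), q4_2012 CHAR(25)))
  LOCATION ('units.csv'));

CREATE TABLE xt_units_q3_q4 (q3_2012 VARCHAR2(25), q4_2012 VARCHAR2(25))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' MISSING FIELD VALUES ARE NULL
    (q1_2012 CHAR(25), q2_2012 CHAR(25), q3_2012 CHAR(25), q4_2012 CHAR(25)))
  LOCATION ('units.csv'));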
I'm trying to load a CSV file into an external table, and when I select from the table the result is 0 rows.
The log file has the following errors:
KUP-04021: field formatting error for field DEPTNO
KUP-04023: field start is after end of record
KUP-04101: record 1 rejected in file /usr/tmpclie.csv
error processing column EMPNO in row 2 for datafile /usr/tmpclie.csv
ORA-01722: invalid number
This is the script for the table:
create table emp_ext (
EMPNO NUMBER(4),
ENAME VARCHAR2(10),
JOB VARCHAR2(9),
MGR NUMBER(4),
HIREDATE DATE,
[code]....
And this is the CSV:
7369,SMITH,CLERK,7902,17-DEC-80,800,20
7499,ALLEN,SALESMAN,7698,20-FEB-81,1600,300,30
7521,WARD,SALESMAN,7698,22-FEB-81,1250,500,30
7566,JONES,MANAGER,7839,02-APR-81,2975,,20
7654,MARTIN,SALESMAN,7698,28-SEP-81,1250,1400,30
7698,BLAKE,MANAGER,7839,01-MAY-81,2850,,30
7782,CLARK,MANAGER,7839,09-JUN-81,2450,,10
[code]....
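It looks like the fields are comma-delimited with an optional trailing commission column, so here is a hedged sketch of access parameters that usually handle this kind of file; the directory object, file name and the date mask are assumptions, since the real definition is behind the [code] link above:

CREATE TABLE emp_ext (
  empno NUMBER(4), ename VARCHAR2(10), job VARCHAR2(9), mgr NUMBER(4),
  hiredate DATE, sal NUMBER(7,2), comm NUMBER(7,2), deptno NUMBER(2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir                       -- hypothetical directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (empno, ename, job, mgr,
     hiredate CHAR(11) DATE_FORMAT DATE MASK "DD-MON-RR",
     sal, comm, deptno)
  )
  LOCATION ('tmpclie.csv')
)
REJECT LIMIT UNLIMITED;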
How can I access this external file, "file:///C:/Users/RI/m_1.html", within an APEX page?
I created a page and a button for the link but am struggling with the above.
Below are the external table definition and the content of the CSV file; why does the first record fail?
0546-0*LB-CRP*16*"Tech", ZAO*29-DEC-2009*29-DEC-2010***A051453*RU*29-DEC-2009*0***21-MAY-2010*21-MAY-2010 --FAILED
0546-0*LB-CRP*16*ID"Tech", ZAO*29-DEC-2009*29-DEC-2010***A051453*RU*29-DEC-2009*0***21-MAY-2010*21-MAY-2010 -- SUCCESS
0546-0*LB-CRP*16*"Tech, ZAO"*29-DEC-2009*29-DEC-2010***A051453*RU*29-DEC-2009*0***21-MAY-2010*21-MAY-2010 -- SUCCESS
[code]....
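A guess, since the full definition is behind the [code] link: if the access parameters contain OPTIONALLY ENCLOSED BY '"', the first record fails because data follows the closing quote before the next '*' terminator, while the other two records parse cleanly. A fragment for comparison (field list omitted):

ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  FIELDS TERMINATED BY '*' OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
)
-- with this clause, "Tech", ZAO fails because data follows the closing quote before the
-- next '*' terminator, while ID"Tech", ZAO and "Tech, ZAO" both parse without error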
I have an external file (.csv or .txt) which contains millions of records. I want to upload it into the backend using Forms 6i/10g.
Using text_io.fopen I read it, and with a conventional FOR loop I insert the records into the table, but it takes a long time.
Is there any way, like using BULK COLLECT and FORALL in the backend, to insert the data into the table?
Is there any way to read the external file in one pass and insert it, to minimise insert time so the process becomes faster?
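One hedged option is to keep the file reading in the form but push the inserts to a server-side procedure that uses FORALL; a rough sketch of the server side (package, table and column names are made up), which the form would call once per batch of lines:

CREATE OR REPLACE PACKAGE bulk_load AS
  TYPE t_lines IS TABLE OF VARCHAR2(4000) INDEX BY PLS_INTEGER;
  PROCEDURE insert_rows (p_lines IN t_lines);
END bulk_load;
/
CREATE OR REPLACE PACKAGE BODY bulk_load AS
  PROCEDURE insert_rows (p_lines IN t_lines) IS
  BEGIN
    -- one round trip per batch, one bulk insert on the server
    FORALL i IN 1 .. p_lines.COUNT
      INSERT INTO staging_table (raw_line) VALUES (p_lines(i));
    COMMIT;
  END insert_rows;
END bulk_load;
/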
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
PL/SQL Release 10.2.0.1.0 - Production
CORE 10.2.0.1.0 Production
TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
NLSRTL Version 10.2.0.1.0 - Production
I am new to external tables, so I have tried the following commands:
create directory dir_1 as 'E:\ora_dirt';
grant read, write on directory dir_1 to HR;
select * from all_directories;
create table emp_ext
(emp_id number,
emp_name varchar2(30)
[code]...
Since I am not able to see DIR_1 in the E: drive, I haven't created the 'emp.dat' file, and on executing a select on the external table I get the expected error "ORA-29913: error in executing ODCIEXTTABLEOPEN callout ORA-29400: data cartridge error KUP-04043: table column not found in external source: EMP_ID".
How do I create that file in directory "DIR_1"?
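For what it's worth, CREATE DIRECTORY only registers the name inside the database; the folder named in the CREATE DIRECTORY statement has to be created in the operating system first. Once it exists, one way to produce emp.dat from inside the database is UTL_FILE, sketched here with made-up sample rows:

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('DIR_1', 'emp.dat', 'W');
  UTL_FILE.PUT_LINE(f, '1,Scott');   -- hypothetical sample data: emp_id,emp_name
  UTL_FILE.PUT_LINE(f, '2,Adams');
  UTL_FILE.FCLOSE(f);
END;
/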
I have another question to ask about packages.
I have a directory under the schema:
OWNER           SYS
DIRECTORY_NAME  UTL
DIRECTORY_PATH  c:\oracle\oradata\spmap1\utl
I have created an external table to read data into a table from a CSV file placed in "c:\oracle\oradata\spmap1\utl". The CSV file name is, say, "pildata.csv".
I have created a package to read data from the external table and insert it into a table.
INSERT INTO M_PILEINT SELECT
A.AREA AS "AREA",
A.SUB_FAC_DESC AS "SUB_FAC_DESC",
A.SCOPE_DETAIL AS "SCOPE_DETAIL",
A.MTO_ISSUE_DATE AS "MTO_ISSUE_DATE",
A.MTO_TAKE_BY AS "MTO_TAKE_BY",
A.COMMODITY_CODE AS "SECTION",
A.PILE_NAME AS "PILE_NAME"
FROM M_EXE_PILE A
(where M_EXE_PILE is the external table which reads from pildata.csv)
The package runs fine and the data is populated into M_PILEINT. Is there a way I can rename the CSV file (say to pildata_logxxxx.csv, something like that) from within the package? Whenever the package is run, it should copy the data from the external table and then rename the CSV file to something else.
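UTL_FILE.FRENAME can do the rename from inside the package, provided the UTL directory also has write access; a sketch (the timestamp suffix is just an example):

-- after the INSERT succeeds, rename the source file in the same directory
UTL_FILE.FRENAME(
  src_location  => 'UTL',
  src_filename  => 'pildata.csv',
  dest_location => 'UTL',
  dest_filename => 'pildata_log' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS') || '.csv',
  overwrite     => TRUE);

Bear in mind the external table still points at pildata.csv, so subsequent selects will fail until a new file with that name arrives.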
I want to use the UTL_FILE package to create an OS file. How do I resolve this error? Oracle 11g under XP.
SQL> create directory my_dir as 'c:\temp';
Directory created.
1 create or replace procedure test_1(md in varchar2)
2 is
3 file utl_file.file_type;
4 begin
5 file := utl_file.fopen(md,'abc.log','w');
6 utl_file.put_line(file,'EMPLOYE REPORT');
7 utl_file.fclose(file);
8* end;
SQL> /
Procedure created.
SQL> execute test_1('MY_DIR');
BEGIN test_1('MY_DIR'); END;
ERROR at line 1:
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-06512: at "SYS.UTL_FILE", line 98
ORA-06512: at "SYS.UTL_FILE", line 157
ORA-06512: at "SCOTT.TEST_1", line 5
ORA-06512: at line 1
Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - 64bit Production
PL/SQL Release 11.1.0.6.0 - Production
"CORE 11.1.0.6.0 Production"
Is there a way to read an Excel file using the UTL_FILE package?
Like our usual method:
l_utlfile := utl_file.fopen(p_dir, p_filename, 'R',2000);
p_filename is event.xls
Or do we need to convert that file to .csv or .txt ?
I need to create a .bat file which includes a query to run a package. I use PL/SQL Developer to develop the package; its username, password and database are user, pswd and db1 respectively. The query to run the package is:
SELECT
COLUMN1 AS "LAST NAME",
COLUMN2 AS "FIRST NAME",
COLUMN3 AS "LOCATION"
FROM TABLE(PKG.GET_SUM('09-NOV-2010','12-NOV-2010'))
What code should I write to create the .bat file?
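One hedged way to wire this up is a small SQL*Plus script that runs the query and spools the output, plus a .bat file that calls sqlplus; the script name, spool path and the plain-text password on the command line are all placeholders:

-- run_pkg.sql (hypothetical script name)
SET PAGESIZE 50000 LINESIZE 200 FEEDBACK OFF
SPOOL C:\temp\pkg_output.txt
SELECT
COLUMN1 AS "LAST NAME",
COLUMN2 AS "FIRST NAME",
COLUMN3 AS "LOCATION"
FROM TABLE(PKG.GET_SUM('09-NOV-2010','12-NOV-2010'));
SPOOL OFF
EXIT

:: run_pkg.bat (hypothetical batch file)
@echo off
sqlplus -s user/pswd@db1 @C:\scripts\run_pkg.sql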
I'm trying to load an XML file into a table using the dbms_xslprocessor.read2clob procedure. It loads a small file, but it throws READ ERROR for a big file. I have given READ and WRITE permission on the directory and also the DBA role to the user. I have also attached the file (change the file extension to xml).
Check the code given below:
CREATE TABLE loadxmlfile (xmldata CLOB);
CREATE TABLE loadxml (id NUMBER, xmldata XMLTYPE);
create or replace PROCEDURE load_xmlfile
[Code]...
Note: the reason I'm doing this is to remove the NON BREAKING SPACE (NBS) character from the XML file.
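If read2clob keeps failing on the large file, a possible alternative (a sketch, not the original procedure) is DBMS_LOB.LOADCLOBFROMFILE, which streams the BFILE into a CLOB and handles character-set conversion; XML_DIR, big_file.xml and the insert into loadxmlfile are placeholders based on the tables above:

DECLARE
  l_bfile    BFILE := BFILENAME('XML_DIR', 'big_file.xml');
  l_clob     CLOB;
  l_dest_off INTEGER := 1;
  l_src_off  INTEGER := 1;
  l_lang_ctx INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warning  INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(
    dest_lob     => l_clob,
    src_bfile    => l_bfile,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => l_dest_off,
    src_offset   => l_src_off,
    bfile_csid   => DBMS_LOB.DEFAULT_CSID,
    lang_context => l_lang_ctx,
    warning      => l_warning);
  DBMS_LOB.CLOSE(l_bfile);

  -- strip non-breaking spaces (U+00A0) before storing
  INSERT INTO loadxmlfile (xmldata) VALUES (REPLACE(l_clob, UNISTR('\00A0')));
  COMMIT;
END;
/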
I want to get all the column values in a table and save them into a text file. Besides UTL_FILE, is there any other method which will give better performance when writing to a text file?
Note that the data does exceed 32K.
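One pattern that copes with output larger than 32K is to build the result as a CLOB and write it with UTL_FILE.PUT in chunks, flushing between writes; a rough sketch (MY_DIR, output.txt and the table/columns are placeholders), assuming each individual line stays under the 32,767-byte line limit:

DECLARE
  l_clob  CLOB;
  l_file  UTL_FILE.FILE_TYPE;
  l_chunk PLS_INTEGER := 32000;                  -- stay under the 32,767-byte limit per write
  l_pos   PLS_INTEGER := 1;
  l_len   PLS_INTEGER;
BEGIN
  -- build the full output in a CLOB, one comma-separated line per row
  FOR r IN (SELECT col1, col2, col3 FROM my_table) LOOP   -- hypothetical table and columns
    l_clob := l_clob || r.col1 || ',' || r.col2 || ',' || r.col3 || CHR(10);
  END LOOP;

  -- write the CLOB out in chunks so no single write exceeds UTL_FILE's buffer limit
  l_len  := DBMS_LOB.GETLENGTH(l_clob);
  l_file := UTL_FILE.FOPEN('MY_DIR', 'output.txt', 'W', 32767);
  WHILE l_pos <= l_len LOOP
    UTL_FILE.PUT(l_file, DBMS_LOB.SUBSTR(l_clob, l_chunk, l_pos));
    UTL_FILE.FFLUSH(l_file);
    l_pos := l_pos + l_chunk;
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/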