Server Utilities :: Handling Errors In Batch File?
Aug 21, 2010
I get a flat file and I have set up a Control-M job so that the load runs at a particular time.
Initially my control file was as below:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
(A position(01:09) CHAR,
B position(11:12) CHAR,
C position(14:33) CHAR,
D position(37:50) char)
This was working fine and Control-M did not send a FAIL message. Later I had to change the control file due to a new requirement: I had to add a WHEN clause.
My control file after the modification is:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
WHEN A <> '10'
(A position(01:09) CHAR,
B position(11:12) CHAR,
C position(14:33) CHAR,
D position(37:50) CHAR)
Now Control-M is sending an error message after it runs the job. The error is just "Return 5"; that's all it gives.
I think it is due to errorlevel 1. The log file shows no records rejected due to data errors, so what is causing Control-M to send the FAIL message?
SQL*Loader is loading all the required records correctly.
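One thing worth knowing here: once a WHEN clause discards rows, SQL*Loader finishes with its warning exit code rather than 0, even though every wanted record loads, and Control-M treats any non-zero return as a failure. A minimal wrapper along these lines (the connect string, file names and the exact exit-code mapping are assumptions to verify against your platform's documentation) can translate the warning back into success before Control-M sees it:

@echo off
rem Hedged sketch: run the load, then map SQL*Loader's warning exit code to 0.
rem The connect string and file names below are placeholders.
sqlldr userid=scott/tiger control=abcd.ctl log=abcd.log

rem Commonly 0 = success, 2 = warning (e.g. rows discarded by WHEN), 1/3/4 = failure.
if %ERRORLEVEL% EQU 0 exit /b 0
if %ERRORLEVEL% EQU 2 exit /b 0
exit /b %ERRORLEVEL%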
I run this script through a batch file and it works. The problem is that the dump file is created with a name like C:\dmp_wed.dmp, and I want a date-based name like C:\dmp_18052011.dmp. How can I build that date format in a batch file?
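A sketch of one way to do it, assuming the Windows %date% variable expands as DD/MM/YYYY on this machine (it is locale dependent, so check echo %date% first); the export command line is only illustrative:

@echo off
rem Hedged sketch: build a DDMMYYYY stamp from %date% (locale dependent).
set stamp=%date:~0,2%%date:~3,2%%date:~6,4%
rem Placeholder export command - substitute the real one used by the script.
exp system/manager file=C:\dmp_%stamp%.dmp full=y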
I have another interesting SQL*Loader issue. I have created two prior topics, "Trapping SQL Loader summary counts" and "Using Sql Loader issue". This new issue is along the same lines, but unfortunately the same input csv file has a date field whose input format varies.
The 10g Oracle table looks like:
CREATE TABLE PTLIVE.MODULE_CSV_LOADS (
    MODULE_ID             NUMBER(20),
    MODULE_TAGNUMBER      VARCHAR2(100),
    MODULE_SERIAL_NUMBER  NUMBER(20,0),
[code]...
My initial control file looks like:
OPTIONS (SKIP=1)
load data
infile 'J:GrowerRound_ModulesCrop2011ProcessedBatch_2011Jul12_081326_746563.csv'
BADFILE 'J:GrowerRound_ModulesCrop2011LogsBatch_2011Jul12_081326_746563.bad'
[code]...
The input csv data file (for this example) contains 3 records. The first is a heading record that is skipped. The 2nd record is correct and the field entitled "GMT Date" contains a value of "16/05/2011". The 3rd record, however, has a date of "2011/5/18", which gets rejected by the above control file, as the SQL*Loader log output indicates...
Record 2: Rejected - Error on table PTLIVE.MODULE_CSV_LOADS, column DATEPICKED. ORA-01861: literal does not match format string
We will be receiving literally hundreds of these csv files. In testing I found that when I open the csv file (Excel is the default), Excel must be altering the display, because all the dates appear 100% okay. However, opening the same csv file with WordPad, I discovered the dates were in these two different formats. So the control file was attempting to concatenate the input csv "GMT Date" field (in one format) with the input "GMT_TIME" field to load the resulting value into the Oracle table column "DATEPICKED", but the differing date formats cause some records to be rejected.
It would be super if SQL*Loader could use an IF clause or an OR clause so the date could be loaded from more than one input format. We only expect the two formats DD/MM/YYYY and YYYY/MM/DD; however, there could be six different combinations in theory.
I thought about writing a function or procedure to analyse the input date and output a standard one to load into the Oracle column.
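A sketch of that idea, with hypothetical names throughout: a small function that tries the expected masks in turn, which the control file could then call as a SQL expression on the date field.

-- Hedged sketch: normalise a date string that may arrive as DD/MM/YYYY or YYYY/MM/DD.
-- The function name and the choice of masks are assumptions.
CREATE OR REPLACE FUNCTION to_date_any (p_str IN VARCHAR2) RETURN DATE IS
BEGIN
  RETURN TO_DATE(p_str, 'DD/MM/YYYY');
EXCEPTION
  WHEN OTHERS THEN
    -- first mask did not fit, retry with the second expected mask
    RETURN TO_DATE(p_str, 'YYYY/MM/DD');
END to_date_any;
/

In the control file the field clause could then use something like DATEPICKED "to_date_any(:DATEPICKED)" (again a hypothetical column reference), so each row is parsed with whichever mask fits.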
I am trying to update a table with a MERGE command using a host array, and I am getting a unique constraint violation. According to the Oracle documentation, "sqlerrd[2] holds the number of rows processed by the most recently executed SQL statement. However, if the SQL statement failed, the value of sqlca.sqlerrd[2] is undefined, with one exception. If the error occurred during an array operation, processing stops at the row that caused the error, so sqlca.sqlerrd[2] gives the number of rows processed successfully."
So when I catch the error and try to print out the offending row in the host array, it points to the incorrect row. If I change the MERGE command to just an INSERT, sqlerrd[2] now points correctly to the row in error. Is there an issue with MERGE in this respect? How can I get the correct row with MERGE?
I was asked to do an export/import of some schemas from 10g (Linux) to 11g (AIX) using the original export/import method. I did not consider the character set and started doing the export and import. While exporting, I get a "questionable statistics" error in the export log file. In the import log, I see an error like CREATE DATABASE LINK "xxxxxxxxxxxxx" CONNECT TO "xxxx" IDENTIFIED BY...
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
Is this error related to permissions at the OS level (Windows 7 in my case)? I manually created the folder 'DATA_PUMP_DIR' in the specified directory path. Although the folder shows as read-only on the General tab of its Properties, I am able to create files under 'DATA_PUMP_DIR'.
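A quick sanity check on the database side, as a sketch (the path and grantee below are placeholders): confirm that the DATA_PUMP_DIR directory object really points at the folder you created, and that the importing user can read and write it.

-- Hedged sketch: verify and, if needed, repoint the Data Pump directory object.
-- The path and grantee are placeholders.
SELECT directory_name, directory_path
  FROM dba_directories
 WHERE directory_name = 'DATA_PUMP_DIR';

CREATE OR REPLACE DIRECTORY data_pump_dir AS 'C:\app\oracle\admin\orcl\dpdump';
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO system;

Note that the Windows read-only attribute on a folder is largely cosmetic; what matters is that the account running the Oracle service has NTFS permissions on the path.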
I am getting the following errors while importing data.
SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-06550: line 2, column 1:
PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
ORA-06550: line 2, column 1:
[code]......
Target database machine OS: Red Hat Enterprise Linux Server release 5.4, 64-bit
Target database version/edition: 10.2.0.4.0 EE
Therefore, I did an export using Oracle Data Pump:
Export: Release 11.2.0.1.0 - Production on Tue Jun 11 13:13:28 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_04": system/******** directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 logfile=exp_PRODSCHEMAS_20130611.log VERSION=10.2.0.4.0
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 1.026 GB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
[code]....
Does it have to do with the VERSION parameter value? What can I check to investigate?
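For reference, the matching import on the 10.2.0.4 target would be along these lines (a sketch; the directory object and log file name on the target are assumptions), since the dump was written with VERSION=10.2.0.4.0 precisely so the older release can read it:

impdp system/<password> directory=DATA_PUMP_DIR dumpfile=exp_PRODSCHEMAS_20130611.dmp schemas=PRD_100,SHR_100 logfile=imp_PRODSCHEMAS_20130611.log

If that still fails, the import log on the target and DBA_DATAPUMP_JOBS are the first places to look.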
I am trying to import a full database export using Data Pump, and I get too many errors for objects that already exist. Attached is the log file and the steps I did so far.
ORA-31626: job does not exist
ORA-04063: package body "SYS.DBMS_INTERNAL_LOGSTDBY" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_INTERNAL_LOGSTDBY"
ORA-06512: at "SYS.KUPV$FT", line 949
ORA-04063: package body "SYS.DBMS_LOGREP_UTIL" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SYS.DBMS_LOGREP_UTIL"
The SPFILE was removed during testing, and I started the database after making some changes in the PFILE, and it works fine. But when I try to start the database using the SPFILE, it gives an error.
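If the SPFILE on disk is missing or stale, one common fix, shown here only as a sketch (the PFILE path is a placeholder), is to rebuild it from the PFILE that is known to work and then restart:

REM Hedged sketch, run as SYSDBA in SQL*Plus; the PFILE path is a placeholder.
SHUTDOWN IMMEDIATE
CREATE SPFILE FROM PFILE='C:\oracle\admin\orcl\pfile\init.ora';
STARTUP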
I am executing multiple PL/SQL files (.sql) with a single batch file. The batch file sql.bat has three sub-tasks to complete once it is run. The sql.bat is shown below:
@Echo off
CD C:\Report
echo Loadin tables from text file Report.txt
sqlplus security/password <C:\Report\loader_security.sql
echo Creating Security table
sqlplus security/password <C:\Report\creating_security_final.sql
echo Inserting text file Security table
sqlplus security/password <C:\Report\insert_security_final.sql
PAUSE
The sql.bat runs perfectly if I double-click it on its own. But if I call sql.bat from a different batch file, 'Final.bat', it throws the error below.
Error
-----------
Executing SQL commands and loading file into SQL tables
Loadin tables from text file Report.txt
Error 6 initializing SQL*Plus
SP2-0667: Message file sp1<lang>.msb not found
SP2-0750: You may need to set ORACLE_HOME to your Oracle software directory
[code].....
The Final.bat file calls other bat files too. It is as shown below.
CD C:ReportSecurity
echo Merging all Files
CALL merge.bat

CD C:ReportSecurity
echo Deleting old files
CALL del.bat

CD C:ReportSecurity
echo Executing SQL commands and loading file into SQL tables
CALL sql.bat
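For what it's worth, SP2-0667 and SP2-0750 usually mean SQL*Plus started without ORACLE_HOME in its environment, so it cannot find its message files. A sketch of the kind of change that addresses it (the Oracle home path is a placeholder): set the Oracle environment at the top of Final.bat so every CALLed batch file inherits it.

@echo off
rem Hedged sketch: give the child batch files an Oracle environment to inherit.
rem The Oracle home below is a placeholder - use the real one on this machine.
set ORACLE_HOME=C:\oracle\product\11.2.0\dbhome_1
set PATH=%ORACLE_HOME%\bin;%PATH%

CD C:ReportSecurity
echo Executing SQL commands and loading file into SQL tables
CALL sql.bat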
I have a .bat file on my client system which opens a web page when it is executed (after double-clicking on it). I want to execute the same batch file from my PL/SQL block, so that after my PL/SQL block runs, the .bat file executes and opens the same web page.
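One caveat first: PL/SQL runs on the database server, so anything it launches runs there, not on the client PC, and a page it opens will not be visible to the client user. With that said, a sketch of launching a batch file from PL/SQL on a Windows database server via an external scheduler job (the job name and paths below are hypothetical):

-- Hedged sketch: run a .bat from PL/SQL with an external DBMS_SCHEDULER job.
-- Job name and paths are placeholders; the .bat must exist on the DB server.
BEGIN
  DBMS_SCHEDULER.create_job(
    job_name            => 'RUN_OPEN_PAGE_BAT',
    job_type            => 'EXECUTABLE',
    job_action          => 'C:\WINDOWS\system32\cmd.exe',
    number_of_arguments => 2,
    enabled             => FALSE);
  DBMS_SCHEDULER.set_job_argument_value('RUN_OPEN_PAGE_BAT', 1, '/c');
  DBMS_SCHEDULER.set_job_argument_value('RUN_OPEN_PAGE_BAT', 2, 'C:\scripts\open_page.bat');
  DBMS_SCHEDULER.enable('RUN_OPEN_PAGE_BAT');
END;
/

The owner needs the CREATE JOB and CREATE EXTERNAL JOB privileges, and on older releases the OracleJobScheduler<SID> Windows service typically has to be running for EXECUTABLE jobs.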
I am trying to create a batch file which will be executed as a Windows scheduled task. This batch file will have a SQL*Plus script running an Oracle query. I can run this query from the command prompt with no problem.
When I double-click the .bat file, it gets the proper info. But when I call this .bat file through a form, it shows a blank screen. Why is this .bat file not running?
I am running Oracle Database 10g R2 on Windows 2003. I want to create a batch file to check whether the database is idle or not, and if it is idle, shut it down and start it up.
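A sketch of one way to structure that batch file. Here "idle" is assumed to mean no ACTIVE non-background sessions, and check_idle.sql and bounce.sql are hypothetical helper scripts described after the code:

@echo off
rem Hedged sketch: bounce the instance only when no user sessions are active.
rem ORACLE_SID, the helper scripts and the definition of "idle" are assumptions.
set ORACLE_SID=orcl

for /f "usebackq delims=" %%c in (`sqlplus -s / as sysdba @check_idle.sql`) do set ACTIVE=%%c

if "%ACTIVE%"=="0" (
    echo Database appears idle - restarting it
    sqlplus -s / as sysdba @bounce.sql
) else (
    echo %ACTIVE% active sessions found - leaving the database up
)

check_idle.sql would just set HEADING OFF, PAGESIZE 0 and FEEDBACK OFF, select TO_CHAR(COUNT(*)) from v$session where status = 'ACTIVE' and username is not null, then EXIT; bounce.sql would be a SHUTDOWN IMMEDIATE followed by STARTUP and EXIT.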
I have a particular piece of SQL code which works perfectly fine in SQL Developer. But if I run the same SQL code through a batch file, it does not get executed, and it does not throw an error either.
SQL code - clean_tables.sql
begin
  execute immediate 'drop table external_tables';
  execute immediate 'drop table security';
exception
  when others then null;
end;
Batch file - Clean.bat
set ORACLE_SID=orcl
set ORACLE_HOME=C:\oracle\product\11.2.0\dbhome_1
set PATH=C:\oracle\product\11.2.0\dbhome_1\BIN
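As pasted, Clean.bat only sets the environment and never actually invokes SQL*Plus (the rest of the file may have been trimmed), and clean_tables.sql has no slash or EXIT after the anonymous block, so even when SQL*Plus is started the block just sits in the buffer unexecuted. A sketch of the missing pieces, with a placeholder connect string and script path:

rem Hedged sketch (end of Clean.bat): actually run the script.
rem The connect string and script location are placeholders.
sqlplus -s scott/tiger@orcl @C:\scripts\clean_tables.sql

And clean_tables.sql itself should end with a line containing only / (which makes SQL*Plus execute the block) followed by EXIT; so control returns to the batch file.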
BEGIN
  dbms_scheduler.create_schedule(
    schedule_name   => 'RMAN_TICKER_STARTING',
    repeat_interval => 'FREQ=DAILY;BYHOUR=9;BYMINUTE=15,30,45,59',
    comments        => 'schedule to run daily');
  dbms_scheduler.create_program(
    program_name => 'TICKER_PROGRAME',
[code]....
It was created successfully, but when I execute it, it shows the error message below.
BEGIN dbms_scheduler.run_job('RMAN_TICKER_JOB', TRUE); END;

begin dbms_scheduler.run_job('RMAN_TICKER_JOB', TRUE); end;
Error at line 1:
ORA-27370: job slave failed to launch a job of type EXECUTABLE
ORA-27300: OS system dependent operation: accessing execution agent failed with status: 2
[code]....
But the CMD file does exist at the location "F:FEEDLGRTOOLSfeedlgr.cmd".
I have a requirement to read a flat text file (around 15,000 lines) residing at a client location from the DB server and write it into one cell of a table.
I tried UTL_FILE and DBMS_LOB, but I am not able to access the client location to read the file, as they read the path from an Oracle directory object.
E.g. my client machine is 192.168.1.1 and my DB server is on Unix, say 192.168.1.10. The file location is \\192.168.1.1\share\abc.txt, so I created one Oracle directory MY_DIR with DIRECTORY_PATH '\\192.168.1.1\share'. But neither UTL_FILE nor DBMS_LOB is able to access the file.
Error message:
Unable to process CLOB -22288 ~ ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
A few details for reference:
File location: \\192.168.1.1\share\abc.txt
Unix DB server: 192.168.1.10
Table: TEST (filename VARCHAR2(30), content CLOB)
Oracle directory: MYDIR
Directory_path: \\192.168.1.1\share
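The key constraint is that UTL_FILE and DBMS_LOB read through a directory object on the database server, so a Windows UNC path is only usable if the share is actually mounted on, and readable by, the Unix host running the database. Assuming the share is mounted there (the mount point /mnt/share below is an assumption), a sketch of loading the file into the CLOB column of the TEST table:

-- Hedged sketch: load a text file into a CLOB via a server-visible directory.
-- The mount point is an assumption; the TEST table is as described above.
CREATE OR REPLACE DIRECTORY my_dir AS '/mnt/share';

DECLARE
  l_bfile  BFILE := BFILENAME('MY_DIR', 'abc.txt');
  l_clob   CLOB;
  l_dest   INTEGER := 1;
  l_src    INTEGER := 1;
  l_ctx    INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn   INTEGER;
BEGIN
  INSERT INTO test (filename, content)
  VALUES ('abc.txt', EMPTY_CLOB())
  RETURNING content INTO l_clob;

  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(l_clob, l_bfile, DBMS_LOB.LOBMAXSIZE,
                            l_dest, l_src, DBMS_LOB.DEFAULT_CSID, l_ctx, l_warn);
  DBMS_LOB.FILECLOSE(l_bfile);
  COMMIT;
END;
/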
I am trying to create a silent install as part of a batch file that does other things afterwards. I'm running into the following issues:
1) I need the batch file to wait until the install is done before continuing. The trouble is that running setup with a response file opens a new window to run the install. I tried the wait command but it doesn't work right either. 2) Same issue with DBCA and a response file.
Is there any way to run the silent installs and have the batch file wait until the install is done and then continue? The less user intervention required, the better.
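A sketch of the pattern that usually handles this (response-file paths are placeholders, and the -waitforcompletion switch should be checked against the installer version in use): combine the OUI option that keeps setup.exe in the foreground with start /wait, and run DBCA the same way.

@echo off
rem Hedged sketch: block until each silent step finishes before continuing.
rem Response file paths are placeholders; verify -waitforcompletion in your OUI version.
start /wait setup.exe -silent -waitforcompletion -responseFile C:\install\db_install.rsp
start /wait dbca -silent -responseFile C:\install\dbca.rsp
echo Install and database creation finished - continuing with the rest of the batch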
How can I call a batch file existing on a network path from Oracle Forms 6i? I am using the following code in Forms 6i, which is supposed to FTP some text files to a UNIX box.
We are running Oracle 11.2.0.3 on a Windows 2008 R2 server, with an app server on the same machine. Our app has a search function that has recently started timing out on us, sometimes intermittently but often now. I've run a trace in our dev environment (code below) in an attempt to track down the issue, but I'm finding very little useful information in relation to the errors I see in the trace file below. My experience in deciphering trace files is nil, so how do I interpret these errors?