I have a .bat file on my client system which opens a web page when executed (by double-clicking it). I want to execute the same batch file from my PL/SQL block, so that after my PL/SQL block runs, that .bat file executes and opens the same web page.
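One option worth knowing about (a sketch, not a full answer): a DBMS_SCHEDULER external job can launch a .bat file, but it runs on the database server, not on the client machine, so it cannot open a page on the client's screen; for that, the client-side tool (e.g. Forms' HOST built-in) has to launch the file. The job name and path below are made up.

BEGIN
  DBMS_SCHEDULER.create_job (
    job_name   => 'OPEN_PAGE_JOB',              -- hypothetical job name
    job_type   => 'EXECUTABLE',
    job_action => 'C:\scripts\open_page.bat',   -- hypothetical path on the DB server
    enabled    => FALSE);
  DBMS_SCHEDULER.run_job ('OPEN_PAGE_JOB');     -- runs synchronously in this session
END;
/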
I have a batch file (.bat) where I get the user name, password and schema name as input. Based on that input I connect to SQL and run some update queries from the bat file itself. Now I have an issue: when the user gives wrong credentials, it asks for the credentials again.
Is there any way to check the Oracle credentials and report an invalid-credentials error from the bat file? A sample of my bat file is as follows.
@echo off
for /f %%i in ('sqlplus -s username/password@db @H:\test_db_connection.sql') do @set count=%%i
echo %count%
IF %count% EQU 1 ECHO "Database connection working fine"
IF %count% NEQ 1 ECHO "Not able to connect to database"
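One way to avoid the re-prompt and detect bad credentials (a sketch; username/password/db and the script path are placeholders): SQL*Plus's -L option attempts the logon only once instead of asking again, and WHENEVER SQLERROR in the script forces a nonzero exit code the batch file can test.

rem check_conn.bat - a sketch
rem H:\test_db_connection.sql is assumed to contain:
rem   WHENEVER SQLERROR EXIT FAILURE
rem   WHENEVER OSERROR EXIT FAILURE
rem   SELECT 1 FROM dual;
rem   EXIT SUCCESS
sqlplus -s -L username/password@db @H:\test_db_connection.sql >nul 2>&1
IF ERRORLEVEL 1 (
  ECHO Not able to connect to database - invalid credentials or DB down
) ELSE (
  ECHO Database connection working fine
)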
I am executing multiple PL/SQL files (.sql) with a single batch file. The batch file sql.bat has three SQL sub-tasks to complete once it runs. sql.bat is shown below.
@Echo off
CD C:\Report
echo Loading tables from text file Report.txt
sqlplus security/password <c:\Report\loader_security.sql
echo Creating Security table
sqlplus security/password <c:\Report\creating_security_final.sql
echo Inserting text file Security table
sqlplus security/password <c:\Report\insert_security_final.sql
PAUSE
sql.bat runs perfectly if I double-click it on its own. But if I call sql.bat from a different batch file, Final.bat, it throws the error below.
Error
-----------
Executing SQL commands and loading file into SQL tables
Loading tables from text file Report.txt
Error 6 initializing SQL*Plus
SP2-0667: Message file sp1<lang>.msb not found
SP2-0750: You may need to set ORACLE_HOME to your Oracle software directory
[code].....
The Final.bat file calls other bat files too. It is as shown below.
CD C:\Report\Security
echo Merging all Files
CALL merge.bat

CD C:\Report\Security
echo Deleting old files
CALL del.bat

CD C:\Report\Security
echo Executing SQL commands and loading file into SQL tables
CALL sql.bat
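SP2-0667/SP2-0750 usually mean the child process can't see ORACLE_HOME in its environment. A sketch of a likely fix, setting it explicitly at the top of Final.bat (or sql.bat); the path is an assumption and must point at your actual Oracle home:

rem at the top of Final.bat - adjust to your installation
set ORACLE_HOME=C:\oracle\product\10.2.0\db_1
set PATH=%ORACLE_HOME%\bin;%PATH%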
I am trying to create a batch file that will be executed by a Windows scheduled task. This batch file will run an Oracle query through a SQL*Plus script. I can run the query from the command prompt with no problem.
When I double-click the .bat file, it gets the proper info. But when I call this .bat file through a form, it shows a blank screen. Why is the .bat file not running?
I run this script through a batch file and it works. The problem is that the dump file is named in this format: C:\dmp_wed.dmp. I want a date format instead, like C:\dmp_18052011.dmp. How can I add a date format in a batch file?
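One way to build a DDMMYYYY stamp in a batch file (a sketch; the export command line is a placeholder). Parsing %date% is locale-dependent, so this uses WMIC, whose localdatetime output (yyyymmddHHMMSS...) is the same on every locale:

for /f %%i in ('wmic os get localdatetime ^| find "."') do set dt=%%i
set dumpfile=C:\dmp_%dt:~6,2%%dt:~4,2%%dt:~0,4%.dmp
rem e.g. C:\dmp_18052011.dmp on 18 May 2011
exp system/password@db file=%dumpfile% full=y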
I am running Oracle Database 10g R2 on Windows 2003. I want to create a batch file that checks whether the database is idle, and if it is idle, shuts it down and starts it up.
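A sketch of one approach (script paths and ORACLE_SID are placeholders; assumes OS authentication works for "/ as sysdba"). "Idle" is taken here as "no active user sessions other than our own check":

rem idle_restart.bat - a sketch
rem idle_check.sql (assumed) contains:
rem   set heading off feedback off pagesize 0
rem   select count(*) from v$session
rem    where type = 'USER' and status = 'ACTIVE'
rem      and audsid <> sys_context('userenv','sessionid');  -- exclude our own session
rem   exit
rem restart.sql (assumed) contains: shutdown immediate / startup / exit
set ORACLE_SID=orcl
for /f %%i in ('sqlplus -s "/ as sysdba" @C:\scripts\idle_check.sql') do set active=%%i
if %active% EQU 0 (
  echo No active user sessions - restarting
  sqlplus -s "/ as sysdba" @C:\scripts\restart.sql
)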
I have a piece of SQL code which works perfectly fine in SQL Developer. But if I run the same code through a batch file it does not get executed, and it does not throw an error either.
SQL code - clean_tables.sql
begin
  execute immediate 'drop table external_tables';
  execute immediate 'drop table security';
exception
  when others then null;
end;
Batch file - Clean.bat
set ORACLE_SID=orcl
set ORACLE_HOME=C:\oracle\product\11.2.0\dbhome_1
set PATH=C:\oracle\product\11.2.0\dbhome_1\BIN
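One likely cause (a guess, since the rest of Clean.bat isn't shown): SQL*Plus only executes a PL/SQL block from a script when the block is followed by a slash on its own line, and without an EXIT the session never returns control to the batch file, so nothing appears to happen and no error is shown. A sketch of the missing pieces (user/password are placeholders):

rem at the end of Clean.bat:
sqlplus -s user/password@orcl @clean_tables.sql

-- and at the end of clean_tables.sql, after "end;":
/
exit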
BEGIN
  dbms_scheduler.Create_schedule (schedule_name   => 'RMAN_TICKER_STARTING',
                                  repeat_interval => 'FREQ=DAILY;BYHOUR=9;BYMINUTE=15,30,45,59',
                                  comments        => 'schedule to run daily');
  dbms_scheduler.Create_program (program_name => 'TICKER_PROGRAME',
[code]....
It was created successfully, but when I execute it, it shows this error message:
BEGIN dbms_scheduler.run_job ('RMAN_TICKER_JOB', TRUE); END;

Error at line 1:
ORA-27370: job slave failed to launch a job of type EXECUTABLE
ORA-27300: OS system dependent operation: accessing execution agent failed with status: 2
[code]....
But I have the CMD file in the location "F:\FEEDLGR\TOOLS\feedlgr.cmd".
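Two things commonly behind this (hedged, since the full job definition is cut off above): OS status 2 is the Windows error "The system cannot find the file specified", and a .cmd file usually cannot be launched directly as an EXECUTABLE action; it has to go through cmd.exe. Also check that the OracleJobScheduler<SID> Windows service is enabled and started, since external jobs need it. A sketch (argument names are made up; the .cmd path is as I read it from the post):

BEGIN
  DBMS_SCHEDULER.create_program (
    program_name        => 'TICKER_PROGRAME',
    program_type        => 'EXECUTABLE',
    program_action      => 'C:\Windows\System32\cmd.exe',
    number_of_arguments => 2,
    enabled             => FALSE);
  DBMS_SCHEDULER.define_program_argument ('TICKER_PROGRAME', 1, 'arg1', 'VARCHAR2', '/c');
  DBMS_SCHEDULER.define_program_argument ('TICKER_PROGRAME', 2, 'arg2', 'VARCHAR2',
                                          'F:\FEEDLGR\TOOLS\feedlgr.cmd');
  DBMS_SCHEDULER.enable ('TICKER_PROGRAME');
END;
/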
I get a flat file, and I have set up a Control-M job so that it runs at a particular time.
Initially my control file was as below:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
(A position(01:09) CHAR,
 B position(11:12) CHAR,
 C position(14:33) CHAR,
 D position(37:50) char)
This was working fine; Control-M did not send a FAIL message. But later I had to change the .ctl file due to a requirement: I had to add a WHEN clause.
My code after modification is:
LOAD DATA
INFILE 'DFILEabcd.dat'
BADFILE 'BDabcd.bad'
REPLACE
INTO TABLE abcd_table
WHEN A <> '10'
(A position(01:09) CHAR,
 B position(11:12) CHAR,
 C position(14:33) CHAR,
 D position(37:50) char)
Now Control-M sends an error message after it runs the job. The error is "Return 5"; that is all it gives.

I think it is due to errorlevel 1. The log file says zero records were inserted due to data errors. So what is causing Control-M to send a FAIL message?

SQL*Loader is loading all the required records correctly.
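A probable explanation: rows filtered out by a WHEN clause are written to the discard file, and SQL*Loader exits with its "warning" code whenever any rows are rejected or discarded, even though the load itself is correct. Control-M treats any nonzero exit code as a failure. The exact warning value is platform-specific (commonly 2; check your platform's docs). A sketch of a wrapper that maps the warning to success (user/password and file names are placeholders):

rem run_load.bat - a sketch
sqlldr userid=user/password control=abcd.ctl log=abcd.log
IF %ERRORLEVEL% EQU 2 EXIT /B 0
EXIT /B %ERRORLEVEL%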
I am trying to create a silent install as part of a batch file that does other things afterwards. I'm running into the following issues:
1) I need the batch file to wait until the install is done before continuing. The trouble is that running setup with a response file opens a new window to run the install. I tried the wait command, but that doesn't work right either.
2) Same issue with DBCA and a response file.
Any way to run the silent installs and have the batch file wait until the install is done and then continue? The less user intervention required, the better.
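A sketch of what has worked for this kind of setup (response-file paths are placeholders): OUI has a -waitforcompletion switch that stops setup.exe from returning immediately, and combined with start /wait the batch file blocks until it finishes. dbca -silent runs in the console, so the batch should wait for it by itself:

rem silent_install.bat - a sketch
start /wait setup.exe -silent -waitforcompletion -responseFile C:\stage\db_install.rsp
dbca -silent -responseFile C:\stage\dbca.rsp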
How can I call a batch file on a network path from Oracle Forms 6i? I am using the following code in Forms 6i, which is supposed to FTP some text files to a UNIX box.
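Since the code itself didn't make it into the post, here is a sketch of the usual approach: the Forms HOST built-in shells out from the client, and NO_SCREEN suppresses the console window. The UNC path is hypothetical:

BEGIN
  HOST('cmd /c \\fileserver\share\ftp_files.bat', NO_SCREEN);
END;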
I have countries, sites and states tables (three in total) in a database (I have a user id and password to connect to it).

Every week I need to extract data from these tables into Excel files, and I need to save them on a shared drive for team use.

Currently I connect to the database every time, run the SQL queries, and manually export the latest data to Excel, saving the files in the G:\teamcommon folder with specific names.
The output format should be:

Excel (.xls) file names: countries.xls, sites.xls, states.xls
Server name: ap21
Output location: G:\teamcommon (G is a shared drive)

I heard that we could create a batch file to do this task, and also that we could use an Oracle procedure, but I am not sure which is the best option.
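One low-effort option (a sketch; user/password are placeholders, and the folder is as I read it from the post): SQL*Plus can spool HTML that Excel opens as a workbook when saved with an .xls extension (newer Excel versions show a format warning but still open it). The batch file can then be run by a Windows scheduled task:

rem weekly_export.bat - a sketch
sqlplus -s user/password@ap21 @C:\scripts\export_tables.sql

-- export_tables.sql (assumed location):
set markup html on spool on
spool G:\teamcommon\countries.xls
select * from countries;
spool off
spool G:\teamcommon\sites.xls
select * from sites;
spool off
spool G:\teamcommon\states.xls
select * from states;
spool off
exit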
When I call a stored procedure with input and output parameters from a batch file, I get the following message:
SQL*Plus: Release 10.2.0.1.0 - Production on Tue Oct 4 11:48:51 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Connected to:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

14
PROCEDURE SP_SELCT_EMPLOYEE_DATA (
  -- A_EMPLOYEE_ID IN VARCHAR2,
  -- A_JOB_ID IN EMPLOYEES.JOB_ID%TYPE,
  P_EMPLOYEE_ID   IN EMPLOYEES.EMPLOYEE_ID%TYPE,
  P_EMPLOYEE_NAME IN EMPLOYEES.EMPLOYEE_NAME%TYPE,
  P_EMAIL         IN EMPLOYEES.EMAIL%TYPE,
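The usual pattern for this (a sketch only; the procedure's signature is cut off above, so the OUT parameter P_RESULT / :v_result below is an invented stand-in for whatever the real OUT parameter is): pass the inputs as SQL*Plus positional arguments and capture the output in a bind variable, then PRINT it.

rem from the batch file:
sqlplus -s user/password@db @call_emp.sql 100 Smith smith@example.com

-- call_emp.sql - a sketch
VARIABLE v_result VARCHAR2(4000)
EXEC SP_SELCT_EMPLOYEE_DATA(P_EMPLOYEE_ID => '&1', P_EMPLOYEE_NAME => '&2', -
                            P_EMAIL => '&3', P_RESULT => :v_result)
PRINT v_result
EXIT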
I am trying to set up an incremental backup on my Windows-based server using RMAN commands in a batch file. When I run the batch file from the OS scheduler it works fine, but when I call the same batch file from my LOCAL desktop PC it throws the errors below.
D:\> \\3.193.211.19\sgdba\tmankp_acressit.bat
D:\> rman catalog rman/******@acressit target / cmdfile=E:\sgdba\tmankp_arch.rcv log=E:\sgdba\tmansit_arch_rman_backup.log
RMAN-00557: could not open MSGLOG "E:\sgdba
[code]....
Both single and double quotes (' or ") are accepted for a quoted-string.
Quotes are not required unless the string contains embedded white-space.
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-00556: could not open CMDFILE "E:\sgdba\tmankp_arch.rcv"
[code]....
* On my DB server I log in using my administrator account; on my PC I don't have an admin account.
* I have checked remote execution using a local user as well as the admin user.
* I have checked the permissions, and my local ID and EVERYONE have full permissions on that folder.
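A likely explanation (hedged, since the full command file isn't shown): launching the .bat over the UNC path runs everything on the desktop PC, so RMAN looks for E:\sgdba\... on the PC, where that drive doesn't exist. Either reference paths that are valid on the machine where RMAN actually runs, or trigger the job on the server itself. A sketch of the latter using a task predefined on the server (task name and credentials are hypothetical):

rem from the desktop PC - runs the backup ON the server
schtasks /run /s 3.193.211.19 /u administrator /p ****** /tn "RMAN_ARCH_BACKUP"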
I'm developing a new batch program to retrieve data from Oracle into Excel. Normally I insert the data into a single sheet. Is it possible to insert it into multiple sheets?

E.g. I need to insert data based on branch category: a different sheet for each branch, but still in a single workbook.
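Plain CSV/spool output can't carry multiple sheets, but Excel opens the SpreadsheetML XML format as a multi-worksheet workbook. A sketch writing one worksheet per branch from PL/SQL (the directory object EXPORT_DIR, table sales and its columns are all assumptions):

DECLARE
  f UTL_FILE.file_type;
BEGIN
  f := UTL_FILE.fopen('EXPORT_DIR', 'branches.xls', 'w');
  UTL_FILE.put_line(f, '<?xml version="1.0"?>');
  UTL_FILE.put_line(f, '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"');
  UTL_FILE.put_line(f, '          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">');
  FOR b IN (SELECT DISTINCT branch FROM sales ORDER BY branch) LOOP
    -- one worksheet per branch
    UTL_FILE.put_line(f, '<Worksheet ss:Name="' || b.branch || '"><Table>');
    FOR r IN (SELECT item, amount FROM sales WHERE branch = b.branch) LOOP
      UTL_FILE.put_line(f, '<Row><Cell><Data ss:Type="String">' || r.item ||
                           '</Data></Cell><Cell><Data ss:Type="Number">' || r.amount ||
                           '</Data></Cell></Row>');
    END LOOP;
    UTL_FILE.put_line(f, '</Table></Worksheet>');
  END LOOP;
  UTL_FILE.put_line(f, '</Workbook>');
  UTL_FILE.fclose(f);
END;
/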
In our application, we allow the user to upload data using an Excel sheet in the UI. We use a PHP script in the UI and SQL*Loader to load the data from the Excel sheet into temp_table.
The temp_table has a primary key.
My question is: is there any way to automatically assign a batch id to every upload in that table, so that we can easily extract the data by batch id? We are using Oracle 11g.
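One workable pattern (a sketch; the column and sequence names are assumptions): take one sequence value per upload and stamp it on every row of that load via a CONSTANT in the control file the PHP script generates.

-- one-time setup
ALTER TABLE temp_table ADD (batch_id NUMBER);
CREATE SEQUENCE upload_batch_seq;

-- per upload, the PHP script fetches the id first:
SELECT upload_batch_seq.NEXTVAL FROM dual;

-- and substitutes it (42 here) into the generated SQL*Loader control file:
-- ... INTO TABLE temp_table ( ..., batch_id CONSTANT '42' )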
Oracle 10g to Sybase 12.5 migration: how can an Oracle dump file be converted to a text/xls file, to be loaded into the Sybase database later through BCP?

I mean:
1. Export objects as a dump file from Oracle.
2. Is there any tool/process available that can convert this into a csv/txt/xls file?
3. These files can then be loaded into Sybase.
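For what it's worth: an export dump is a proprietary binary format, and to my knowledge nothing loads it into Sybase directly. The practical route is to skip the dump and spool each table to delimited text from SQL*Plus, then feed that to BCP. A sketch for one table (names are placeholders; for values containing commas, build the line with || concatenation and explicit quoting instead of colsep):

-- spool_csv.sql - a sketch
set pagesize 0 linesize 32767 feedback off heading off trimspool on
set colsep ','
spool emp.csv
select * from emp;
spool off
exit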
We are having major issues with the batch run. We are using an Oracle 11g DB. We run scripts to populate the tables and then call scripts to run the extractions. The issue is that each run of the SQL takes wildly inconsistent time. We have created indexes and gathered DB stats, then run the extractions. The SQL sometimes takes 10 minutes and sometimes takes hours. This is a major show-stopper for the project.
I'm looking for a query which returns the batches for which all the children are in 'A_STATUS', 'B_STATUS' or 'C_STATUS'. In this case I expect the query to return batches 2, 3 and 4.
create table batch (batchid number);
insert into batch values(1);
insert into batch values(2);
insert into batch values(3);
insert into batch values(4);
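The child table didn't make it into the post, so the sketch below assumes one like batch_child(batchid, status). The query keeps only batches with no child outside the three statuses (classic relational division):

-- assumed child table (not shown in the post):
-- create table batch_child (batchid number, status varchar2(10));

select b.batchid
  from batch b
  join batch_child c on c.batchid = b.batchid
 group by b.batchid
having count(case when c.status not in ('A_STATUS','B_STATUS','C_STATUS')
                  then 1 end) = 0;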
I have a table which has plenty of rows. In production, I would estimate it at 30 to 300 million rows. I need to update one column (flag) in all the rows created before a certain date. Just running:
UPDATE MyTable
   SET flag = 3
 WHERE created < to_date('2010-10-08 23:59:59', 'YYYY-MM-DD HH24:MI:SS');
COMMIT;
does not seem like a good idea: the single transaction's undo would become too big. I will write a PL/SQL script for this. The question is whether I should:
a) update each row separately, and commit after every 10,000 rows ( WHERE RowId = [rowId] ), or
b) update 10,000 rows at a time by rowid ranges ( WHERE rowId > [some_row_id] AND RowId < [some_row_id_2] )
In the latter example the some_row_ids would naturally be fetched first. The rowids come from a sequence. So which one would be more effective? I am not too familiar with PL/SQL, or Oracle for that matter.
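A common alternative to both (a sketch; note that if undo can accommodate it, one big UPDATE is usually fastest, and frequent commits risk ORA-01555): repeat a ROWNUM-limited update until nothing is left. The extra flag predicate makes it restartable:

-- a sketch: commit every 10,000 rows
BEGIN
  LOOP
    UPDATE MyTable
       SET flag = 3
     WHERE created < TO_DATE('2010-10-08 23:59:59', 'YYYY-MM-DD HH24:MI:SS')
       AND (flag != 3 OR flag IS NULL)   -- skip rows already done
       AND ROWNUM <= 10000;
    EXIT WHEN SQL%ROWCOUNT = 0;
    COMMIT;
  END LOOP;
  COMMIT;
END;
/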
I'm having an issue with stale optimizer statistics for some SQLs that are run in a batch process. The problem is that the process runs many times during the day - sometimes 20 to 30 times. And each time, the tables are updated, i.e. rows are inserted or deleted, etc.
So eventually the optimizer statistics for those tables become stale and the performance of the SQLs start to slow down (a lot). How best to gather the optimizer stats on the tables so they don't become stale when the batch process runs each time? The problem is that I also can't add/modify the code in the batch process because it is delivered by the vendor as is.
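If the vendor code can't be touched, one option is to gather stats from outside the batch, e.g. as a DBMS_SCHEDULER job scheduled between runs, or to lock statistics at values representing a full table. A sketch of the gather call (owner/table names are placeholders):

BEGIN
  DBMS_STATS.gather_table_stats(
    ownname       => 'APP_OWNER',
    tabname       => 'BATCH_TABLE',
    cascade       => TRUE,          -- include the indexes
    no_invalidate => FALSE);        -- let existing cursors re-parse with fresh stats
END;
/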
1. Make the jobname distinct, because it keeps giving me multiple entries for each jobname.
2. Add the start_time of SOD_start_data9_UAT1 to the end_time of fodba_MUAT1 to get the combined duration.
3. Concatenate the jobnames SOD_start_data9_UAT1 and fodba_MUAT1.
4. Generate the last seven days of batch run times.
5. Generate a report in .csv format and email it out.
6. I have access to SQL*Plus and PL/SQL Developer.
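The post doesn't show the underlying table, so here is only a starting-point sketch against an assumed log table job_log(jobname, start_time, end_time); adjust all names to the real source. Spooling it from SQL*Plus covers the .csv step:

set colsep ','
set pagesize 0 feedback off trimspool on
spool batch_times.csv
SELECT DISTINCT
       s.jobname || '+' || e.jobname                   AS combined_job,
       TO_CHAR(s.start_time, 'YYYY-MM-DD')             AS run_date,
       ROUND((e.end_time - s.start_time) * 24 * 60, 1) AS duration_minutes
  FROM job_log s
  JOIN job_log e
    ON TRUNC(e.start_time) = TRUNC(s.start_time)   -- assumes one run per day
 WHERE s.jobname = 'SOD_start_data9_UAT1'
   AND e.jobname = 'fodba_MUAT1'
   AND s.start_time > SYSDATE - 7
 ORDER BY run_date;
spool off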
We have a critical application batch that runs daily and has a 7.5-hour window, after which the application needs to come online. During peak batch times the truncates run very slowly, which slows the batch jobs considerably and pushes the batch beyond its window.
The wait events that show up when the truncates are running are:

local write wait
enq: RO - fast object reuse
enq: CR - block range reuse ckpt
db file parallel write
The ASH reports show that the top sessions are those executing DML (insert, merge, update) and DDL (create/alter index and truncate). They also show that the blocking sessions belong to the background processes CKPT and DBWR. Changes made to the DB configuration to address this:
1) We have increased the DBWR processes to 2.
2) Reduced the buffer cache size to 20G (from the original 30G).
3) Flushed the buffer cache before the batch begins, to reduce the load on DBWR during the batch peak time.
4) Set the parameter FILESYSTEMIO_OPTIONS to SETALL (from NONE).
5) Tuned the EVA (SAN storage) to improve its performance, by distributing the load evenly between the controllers, reducing the IO transfer block size, etc.
6) Suggested using the REUSE STORAGE clause to improve truncate performance.
All of these have brought a semblance of control, but the fact remains that the batch is generating more jobs (and hence more data volume) over time, as this is the peak season. This causes an inevitable increase in the number of sessions, all running IO-intensive DML and DDL.
Suggestions pending from our end:
1) Increase DBWR beyond 2. For this we need a hardware upgrade, since we have maxed out the number of DBWR processes that can be configured.
2) Implement asynchronous IO for DBWR, which on HP-UX requires moving to raw disks; hence we have suggested using ASM.
3) Tune the application to either reduce the IO generated or redistribute the jobs so that those with the heaviest loads don't run together.
Instead of truncating the tables, can we rename them and drop them later? Will this improve performance?
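A sketch of that rename-now, drop-later idea (table names are placeholders). It moves the object-reuse work out of the window, but note that indexes, grants, constraints and synonyms stay with the renamed table, so the replacement must be re-created with all of them:

ALTER TABLE stage_data RENAME TO stage_data_old;
CREATE TABLE stage_data AS SELECT * FROM stage_data_old WHERE 1 = 0;
-- later, off-peak:
DROP TABLE stage_data_old PURGE;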
I'm working on a self-assessment project regarding our tax returns. Currently it works like this: a lodged return generates a return number, but it is batched later. In the proposed change, they want the same process, whereby a return is still generated, but once 10 returns have been generated on the same screen, a batch is to be created and those 10 returns added to it. We are on Oracle 10g and work with Forms and Reports 10g and TOAD/SQL*Plus as tools, so I was thinking of changing it in Post-Query, but the suggestion is to add to the System Parameter table.
I want to run several scripts in batch, and I use autoRun.bat to call main.sql, which includes several scripts. If there is a PL/SQL error in a script, that script should stop running, but SQL*Plus should not exit. If the PL/SQL must exit, can it output the error messages to a file?
Please don't suggest "whenever sqlerror exit|continue...": exit quits SQL*Plus, and continue just runs the rest of the SQL, so it's not easy to know where the error happened.
autoRun.bat
---------------------------------------------------------------------------
sqlplus "sys/manager@ORADB as sysdba" @main.sql
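A sketch of one way to keep running while still pinpointing failures (log path and script names are placeholders): spool everything to one log and PROMPT a marker before each script, so SQL*Plus keeps going after an error but the log shows exactly which script produced it:

-- main.sql - a sketch
SPOOL C:\scripts\main_run.log
PROMPT ===== running script1.sql =====
@@script1.sql
PROMPT ===== running script2.sql =====
@@script2.sql
SPOOL OFF
EXIT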