SQL & PL/SQL :: Automate The Process Of Updating Sequence?
Jan 18, 2012
I have migrated a database from PostgreSQL to Oracle. All sequences were migrated with their default values (START WITH 1). I already have 213 entries in a table and want the sequence to begin at 214 for the next entry (i.e., replace with START WITH 214).
How can I automate updating the START WITH value of a sequence to the maximum key already in the table, every time I migrate data?
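One common workaround (a minimal sketch, assuming the table is T with key column ID and the sequence is T_SEQ; adjust the names to your schema) is to drop and recreate the sequence from MAX(id) + 1 after each migration:

DECLARE
    l_next NUMBER;
BEGIN
    -- next value = current maximum key + 1 (1 if the table is empty)
    SELECT NVL(MAX(id), 0) + 1 INTO l_next FROM t;
    EXECUTE IMMEDIATE 'DROP SEQUENCE t_seq';
    EXECUTE IMMEDIATE 'CREATE SEQUENCE t_seq START WITH ' || l_next;
END;
/

Note that dropping the sequence invalidates any grants and dependent code, so those would need to be re-created in the same script.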
I have created a trigger that will automatically insert the next number from the sequence into the id column.
create trigger test_trigger
before insert on test
for each row
begin
select test_seq.nextval into :new.id from dual;
end;
/
I am replacing a composite PK with a new single PK because the business rules changed. I am creating a sequence to fill the new column that will become the PK, but I need to populate it according to the values of the old composite PK.
I need to update ABC and set ABC_SERIAL = SEQ_ABC.NEXTVAL, but this must be done in the order of the old composite primary key, i.e. rows with COMP_PK1 = 1 are filled before rows with COMP_PK1 = 2, and so on.
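A minimal sketch of one way to do that, assuming the second key column is named COMP_PK2 (a guess; the post only names COMP_PK1): walk the rows in composite-key order and assign NEXTVAL one row at a time, so the sequence values follow the old ordering:

BEGIN
    FOR r IN (SELECT rowid AS rid
              FROM abc
              ORDER BY comp_pk1, comp_pk2)  -- old composite-PK order
    LOOP
        UPDATE abc
        SET abc_serial = seq_abc.NEXTVAL
        WHERE rowid = r.rid;
    END LOOP;
    COMMIT;
END;
/

A row-by-row loop is slower than a single set-based statement, but it is the simple way to guarantee that NEXTVAL is drawn in exactly the order of the old key.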
How can I automate pushing data from Oracle into Excel? I have a table "emp" in an Oracle database, and I need certain columns of emp (e.g. first name, last name, id) from that table in Excel.
So I need a script which, when scheduled, creates an Excel file in a particular location. I was told we have to create a directory from SQL, write a script using UTL_FILE, and then schedule it. The output in Excel should be the columns listed above.
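A sketch of the UTL_FILE approach, assuming a directory object EMP_DIR already points at the target folder and that the emp columns are named FIRST_NAME, LAST_NAME, and ID (the post only gives informal names); Excel opens the resulting .csv directly:

CREATE OR REPLACE PROCEDURE emp_to_csv IS
    l_file UTL_FILE.FILE_TYPE;
BEGIN
    l_file := UTL_FILE.FOPEN('EMP_DIR', 'emp.csv', 'w');
    UTL_FILE.PUT_LINE(l_file, 'FIRST_NAME,LAST_NAME,ID');  -- header row
    FOR r IN (SELECT first_name, last_name, id FROM emp) LOOP
        UTL_FILE.PUT_LINE(l_file, r.first_name || ',' || r.last_name || ',' || r.id);
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
END;
/

-- schedule it, e.g. daily at 06:00
BEGIN
    DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'EMP_CSV_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'EMP_TO_CSV',
        repeat_interval => 'FREQ=DAILY;BYHOUR=6',
        enabled         => TRUE);
END;
/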
I'm trying to create an install script that installs Discoverer 10g R2 with its needed patch and OPatches applied, without any user interaction. I've already created the necessary response files and a batch file to sequence them. The installer is meant to run when it is placed on a server with the main folder shared, and it does so flawlessly.
The user sees a DOS window kindly stating that he has to wait for the primary installer to finish before hitting Enter to start the patch installer. The problem I'm having is that, on slow networks, it takes a while for the primary installer of Discoverer 10g to show a window. Of course, the user isn't always patient enough to wait for it and hits Enter before the primary installer has appeared, causing the patch installer to start before Discoverer is completely installed.
Is there a way to avoid this? Or am I wrong to use a batch file to sequence this install? A second problem is the interaction needed while applying OPatches; can this be automated as well?
Here are the contents of my batch file:
net use x: /delete
net use X: \\servername\Oracle_cd\disco10gr2 /persistent:no
@ECHO off
cls
:start
We have two databases: a local one (localdb, user rakdb) and a remote one (remotedb, user rakdb). We need to keep the data of one table, om_item, in sync. Users insert data on a daily basis, and the user sends us the insert script every day to run on the local database to insert the new records. I managed to create a file which records all the inserts into one text file in one directory. Can we have a scheduler pick this text file from the specified folder and send it by mail using UTL_MAIL? (A sketch of the mail step follows the example below.)
CREATE TABLE ITEM (IT_CODE VARCHAR2(12), IT_NAME VARCHAR2(20));
INSERT INTO ITEM VALUES ('A','AAA');
CREATE OR REPLACE DIRECTORY MY_DIR AS 'C:\TEMP';
CREATE OR REPLACE PROCEDURE it_status
[Code]..
Procedure created.
EXEC it_status
HOST TYPE c:\temp\aaaa.txt
INSERT INTO ITEM (IT_CODE, IT_NAME) VALUES ('A','AAA');
COMMIT;
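A sketch of the mail step, assuming UTL_MAIL is installed and SMTP_OUT_SERVER is configured; the directory object INSERT_DIR and the file and address names are placeholders. The file is read with UTL_FILE and sent as a plain-text attachment (UTL_MAIL.SEND_ATTACH_VARCHAR2 caps the attachment at 32K; larger files would need UTL_SMTP or an external job):

DECLARE
    l_file  UTL_FILE.FILE_TYPE;
    l_line  VARCHAR2(4000);
    l_attch VARCHAR2(32767);
BEGIN
    l_file := UTL_FILE.FOPEN('INSERT_DIR', 'om_item_inserts.txt', 'r');
    BEGIN
        LOOP
            UTL_FILE.GET_LINE(l_file, l_line);
            l_attch := l_attch || l_line || CHR(10);
        END LOOP;
    EXCEPTION
        WHEN NO_DATA_FOUND THEN NULL;  -- end of file reached
    END;
    UTL_FILE.FCLOSE(l_file);

    UTL_MAIL.SEND_ATTACH_VARCHAR2(
        sender       => 'localdb@example.com',
        recipients   => 'dba@example.com',
        subject      => 'Daily OM_ITEM inserts',
        message      => 'Attached are today''s insert statements.',
        attachment   => l_attch,
        att_filename => 'om_item_inserts.txt');
END;
/

Wrapped in a stored procedure, this block can be scheduled with DBMS_SCHEDULER like any other job.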
I have a procedure that dynamically creates scripts to be executed; e.g., using DBMS_OUTPUT.PUT_LINE it generates scripts like the following:
Now, what I am really looking for is to explore options where we can spool the results into a file and run another procedure that executes all of these generated scripts.
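One option (a sketch, assuming the generator can be changed to insert its statements into a work table GENERATED_SQL(seq_no, stmt) instead of printing them with DBMS_OUTPUT) is to skip the spool file entirely and execute straight from the table:

BEGIN
    FOR r IN (SELECT stmt FROM generated_sql ORDER BY seq_no) LOOP
        EXECUTE IMMEDIATE r.stmt;  -- run each generated statement, in order
    END LOOP;
END;
/

Keeping the statements in a table also leaves an audit trail of exactly what was executed.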
I have to automate the TDPOSYNC utility, an IBM tool for Oracle backup. I tried the expect utility in a UNIX shell script, but for some reason that utility is not available on the production server. Now I have been asked to use PL/SQL to automate the same thing. I am facing some problems:
1. How do I call TDPOSYNC commands from PL/SQL?
2. How do I pass runtime input parameters to TDPOSYNC, like user/password, date range, etc.?
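PL/SQL cannot launch an OS binary directly, but a DBMS_SCHEDULER external job can; a sketch, where the tdposync path and the argument are placeholders (and note that if TDPOSYNC prompts interactively, it may still need a prepared input/response file rather than command-line arguments):

BEGIN
    DBMS_SCHEDULER.CREATE_JOB(
        job_name            => 'TDPOSYNC_JOB',
        job_type            => 'EXECUTABLE',
        job_action          => '/opt/tivoli/tsm/client/oracle/bin64/tdposync',  -- placeholder path
        number_of_arguments => 1,
        enabled             => FALSE);
    DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE('TDPOSYNC_JOB', 1, 'syncdb');  -- placeholder argument
    DBMS_SCHEDULER.ENABLE('TDPOSYNC_JOB');
END;
/

Credentials and the date range could be written into a file that tdposync reads, rather than passed on a command line where they would be visible.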
I would like to know how I can automate the export from production to test. I need direction on creating a process to import data from production (server A) into the test server (server B).
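One in-database route (a sketch, assuming a directory object DP_DIR on server A and a schema called APP; all names are placeholders) is the DBMS_DATAPUMP API, which can be scheduled like any other PL/SQL:

DECLARE
    h       NUMBER;
    l_state VARCHAR2(30);
BEGIN
    h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
    DBMS_DATAPUMP.ADD_FILE(h, 'app_prod.dmp', 'DP_DIR');
    DBMS_DATAPUMP.ADD_FILE(h, 'app_prod.log', 'DP_DIR',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP'')');
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.WAIT_FOR_JOB(h, l_state);
END;
/

The dump file is then copied to server B and loaded with a matching DBMS_DATAPUMP import (operation => 'IMPORT'), or with the impdp command line.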
I have been plagued by people logging into my database and making changes while a clone is in process. That said, I am looking to lock accounts and unlock them when I am done.
I envision my code looking something like this:
sqlplus -s / <<END
SET PAGESIZE 0
SET FEEDBACK OFF
SET VERIFY OFF
set heading off
spool /tmp/lockusers.sql
select 'alter user ' || username || ' account lock;'
from dba_users
where username not in (....) and not locked?;
spool off;
END

sqlplus -s / <<END
@/tmp/lockusers.sql
END
When it comes time to unlock the accounts, I want to unlock only those I previously locked, not all of them. Is there a query I can use that tells me when the accounts were locked, or some other way to go about this, so I don't unlock accounts that were already locked before my lock script ran?
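One approach (a sketch, assuming you can create a small bookkeeping table) is to record exactly which accounts your script locks, then unlock only those. As a cross-check, DBA_USERS also has a LOCK_DATE column you can compare against the time the lock script ran.

CREATE TABLE clone_locked_users (username VARCHAR2(30), locked_at DATE);

-- lock phase: remember each account we lock
BEGIN
    FOR u IN (SELECT username FROM dba_users
              WHERE account_status = 'OPEN'
                AND username NOT IN ('SYS', 'SYSTEM'))  -- extend the exclusion list as needed
    LOOP
        EXECUTE IMMEDIATE 'alter user ' || u.username || ' account lock';
        INSERT INTO clone_locked_users VALUES (u.username, SYSDATE);
    END LOOP;
    COMMIT;
END;
/

-- unlock phase: touch only the accounts we locked ourselves
BEGIN
    FOR u IN (SELECT username FROM clone_locked_users) LOOP
        EXECUTE IMMEDIATE 'alter user ' || u.username || ' account unlock';
    END LOOP;
    DELETE FROM clone_locked_users;
    COMMIT;
END;
/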
We are designing a three-tiered system (client, application/web server, database server) that will let clients, through a web interface, select a text file from the operating system and load it into an intermediate table (import database table). Many users will do this concurrently, and the data will load into a single table. The text files come in monthly for about 100 firms. No user can insert or update another user's data (there is a check-out system). There are about 30 to 40 users using the system for various functions, but 10 to 20 users may import data at one time. The files can have anywhere from 2,000 to 25,000 records at a record length of 398. I am concerned about having a good design strategy as well as decent performance.
Problems with each of the Oracle loaders:
1) External tables - cannot read text files on the application server (which is where they want the text files to go); secondly, you cannot create an instance of an external table, yet multiple users would be pointing the external table at different text files and loading at the same time.
2) SQL*Loader - is mainly an OS-level tool, and I am not sure how I could programmatically point it at a different text file each time a user wants to load. The client would have to have the ability, through code, to point SQL*Loader to the correct file name.
I had a creative approach and was wondering if this would work: I would like to use external tables like a connection pool. First, a scheduled OS job would move the files to the database server. I would create about 20 external tables with 20 different directory objects, plus a stored procedure for the user to call, passing in the file name and audit info as needed. I would use a "load lock pool" table (my invention) that records the name, or a code, for each external table in use. The procedure inserts this code into the load lock pool table while an external table is in use and deletes it when the load completes. The procedure would check, through a series of IF statements, whether a particular external table was in use; if it is (i.e., it exists in the load lock pool table), it checks the next one until it finds an external table that is free. So potentially, though not likely, 20 users at one time would be loading into the same target table. (A sketch of the checkout logic follows the questions below.) My questions:
1) Could Oracle handle this strategy? What do I need to consider, performance-wise, given the possibility of so many users loading into a single table at one time?
2) Does anyone have another strategy for doing this?
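A sketch of the checkout logic, assuming a pool table EXT_TAB_POOL(ext_table_name, in_use) seeded with the external table names; FOR UPDATE SKIP LOCKED lets concurrent sessions claim different free entries without waiting on each other:

CREATE TABLE ext_tab_pool (
    ext_table_name VARCHAR2(30) PRIMARY KEY,
    in_use         CHAR(1) DEFAULT 'N');

CREATE OR REPLACE FUNCTION checkout_ext_table RETURN VARCHAR2 IS
    CURSOR c_free IS
        SELECT ext_table_name
        FROM ext_tab_pool
        WHERE in_use = 'N'
        FOR UPDATE SKIP LOCKED;  -- rows claimed by other sessions are skipped
    l_name ext_tab_pool.ext_table_name%TYPE;
BEGIN
    OPEN c_free;
    FETCH c_free INTO l_name;    -- first free, unlocked pool entry
    IF c_free%NOTFOUND THEN
        CLOSE c_free;
        RETURN NULL;             -- pool exhausted; caller should wait and retry
    END IF;
    UPDATE ext_tab_pool SET in_use = 'Y' WHERE CURRENT OF c_free;
    CLOSE c_free;
    COMMIT;                      -- releases the row lock; in_use now marks it busy
    RETURN l_name;
END;
/

The load procedure would then ALTER the checked-out external table's LOCATION to the user's file, load from it, and set in_use back to 'N' when done.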
I have written the procedure below to dump table data to a .csv file. The problem is I have 20 tables holding data for 75 studies; every table has data for all 75 studies. I need to export the data from the 20 tables for each study, but this procedure requires me to run it 75 (studies) x 20 (tables) times. Is there a technique so that, instead of manually giving the table name and study name, it takes them from a text file (or table) where the 75 studies are defined? Or is there any better way?
create or replace procedure dump_table_to_csv1(p_tname    in varchar2,
                                               p_dir      in varchar2,
                                               p_filename in varchar2)
is
    l_output    utl_file.file_type;
    l_theCursor integer default dbms_sql.open_cursor;
[code]........
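One way to avoid the 75 x 20 manual runs (a sketch, assuming two driver tables you populate once, STUDY_LIST(study_name) and TABLE_LIST(table_name), and that the procedure gains a study filter or encodes the study in its query) is a double loop around the existing procedure:

BEGIN
    FOR s IN (SELECT study_name FROM study_list) LOOP
        FOR t IN (SELECT table_name FROM table_list) LOOP
            dump_table_to_csv1(
                p_tname    => t.table_name,
                p_dir      => 'MY_DIR',
                p_filename => s.study_name || '_' || t.table_name || '.csv');
        END LOOP;
    END LOOP;
END;
/

The driver tables themselves could be loaded once from a text file via SQL*Loader or an external table, if you prefer to maintain the study list outside the database.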
My databases are 11.1.0.7 and 11.2.0.3 with TDE tablespace encryption and ASM database storage. The wallet needs to be open for MRP to work in a physical standby database. I already have the solution for the primary instances to automate opening the wallet (e.g., a startup trigger on 11.1.0.7). However, I cannot find a solution to automate the wallet-open operation on the standby instances (to issue ALTER SYSTEM SET ENCRYPTION WALLET OPEN IDENTIFIED BY "...").
Manual operation every time a standby instance is started is not feasible.
I need an appropriate script to automate the Oracle databases on one server. This DB server has 6 instances. We have always done the startup and shutdown manually, although we have a reference script that does this, written for Oracle v7.3.4. We also want to include automatic start/stop of dbconsole so the instances can be accessed via OEM.
Is there any way to automate a job that executes several Oracle Reports at the end of every month and either emails them to a list of people or saves them to a location on a network drive?
At first, I started out with 2 avenues of thought:
1) The following link raised the option of Oracle Scheduler, but from my research, the Scheduler only runs database jobs, shell scripts, batch files, etc.:
[URL]......
2) I was thinking about creating a database job of some sort to call the reports, but from the following link I've determined that one cannot create and call a stored/database procedure to run Oracle Reports files. So there goes my second hope.
[URL].......
So, it appears that the scheduler won't meet my requirements and now I think I'm almost out of options.
My last resort is creating a database job that queries the required information and spools the data plus various formatting "stuff" into a .txt file. That .txt file would then be my report. This seems a little savage, and I would like to think that there's a way to automatically run a .REP against a database and save an electronic copy of the report somewhere.
Is it possible to automate the execution of Oracle Reports and then save the report in a particular format (xls/pdf/rtf/etc.)?
I am using Oracle 9i and am having trouble updating a table.
I get an ORA-00001 (unique constraint) error when executing the SQL below. I know the SQL is a little strange (it uses a unique-key column in the SET clause), but it worked on my Oracle server and did not work on the client's.
Why does this error occur, and why did it not occur on my PC?
[Update sql] (key is CD and SDATE)
Update TBL1 set CD = 'A',
I was just working on one of my SQL assignments from my database management course, and thus far this is the first one I just can't figure out. The question is:
Quote: Increase the credit limit of any customer who has any order that exceeds their credit limit. The new credit limit should be set to their maximum order amount plus $1,000. This must be done in 1 SQL statement.
The bolded part is what I'm having trouble with.
What I have thus far:
UPDATE Customers
SET CreditLimit = 1000 + (SELECT MAX(Amount)
                          FROM Orders, Customers
                          WHERE Cust = CustNum)
WHERE CustNum IN (SELECT Cust
                  FROM Orders
                  WHERE Cust = CustNum
                    AND CreditLimit < Amount);
So there are two tables I'll be working with: Customers (the table I'm updating) and Orders (the table where the order amount is found). With the code I have so far, it does seem to be updating the correct rows at the very least, but not with the correct values. It's updating the CreditLimit column with 1000 + the maximum amount in the entire Orders table, which is very close to what I want, but I want 1000 + the maximum amount for that specific customer.
CustNum is the primary key for the Customers table, and Cust is the foreign key that links them together.
(about the formatting, it looked much prettier in SQL Worksheet Plus)
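For what it's worth, a sketch of a corrected statement: the fix is to drop Customers from the inner FROM clause so that Cust = CustNum correlates with the row being updated rather than with a second, independent copy of Customers:

UPDATE Customers
SET CreditLimit = 1000 + (SELECT MAX(Amount)
                          FROM Orders
                          WHERE Cust = CustNum)  -- correlated to the outer row
WHERE CustNum IN (SELECT Cust
                  FROM Orders
                  WHERE Cust = CustNum
                    AND CreditLimit < Amount);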
insert into topdUIDXML select '<filter name="test" topologyUID="1">' from dual;
insert into topdUIDXML select '<filter name="test2" topologyUID="2">' from dual;
insert into topdUIDXML select '<filter name="ftest" topologyUID="3">' from dual;
insert into topdUIDXML select '<filter name="qtest" topologyUID="4">' from dual;
I am trying to find a way to update all of the rows of a column in a table EXCEPT for the very first row. I am not sure if this can be done in the SET clause itself. I have also thought about using an EXCEPTION in a stored procedure. For example, say I have the table listed below:
SQL> select * from example1;
CODE1 I_ID CODE2 J_ID NAME1 DATE1
----- ---- ----- ---- ----- --------
A     100  A     200  John  20111225
A     100  A     300  John  20111225
A     100  A     500  John  20111225
A     100  A     400  John  20111225
A     100  A     250  John  20111225
A     100  A     700  John  20111225
A     100  A     800  John  20111225
A     100  A     900  John  20111225
A     100  A     1000 John  20111225
A     100  A     1150 John  20111225
A     100  A     1275 John  20111225
A     100  A     3000 John  20111225
12 rows selected
I want to update the table so that, if there are more than 3 J_IDs on the table for the same I_ID, it sets all of the CODE1 and CODE2 values to 'C' except for the very first one. Such as:
SQL> select * from example2;
CODE1 I_ID CODE2 J_ID NAME1 DATE1
----- ---- ----- ---- ----- --------
A     100  A     200  John  20111225
C     100  C     300  John  20111225
C     100  C     500  John  20111225
C     100  C     400  John  20111225
C     100  C     250  John  20111225
C     100  C     700  John  20111225
C     100  C     800  John  20111225
C     100  C     900  John  20111225
C     100  C     1000 John  20111225
C     100  C     1150 John  20111225
C     100  C     1275 John  20111225
C     100  C     3000 John  20111225
12 rows selected
I have done some searches and haven't seen any results.
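A sketch of one way to do this in a single statement, assuming "the very first one" means the row with the lowest J_ID per I_ID (which matches the example, where J_ID 200 keeps its codes): rank the rows within each I_ID and update everything ranked after the first, but only in groups of more than 3:

MERGE INTO example1 t
USING (SELECT rowid AS rid,
              ROW_NUMBER() OVER (PARTITION BY i_id ORDER BY j_id) AS rn,
              COUNT(*)     OVER (PARTITION BY i_id)               AS cnt
       FROM example1) s
ON (t.rowid = s.rid)
WHEN MATCHED THEN UPDATE
    SET t.code1 = 'C',
        t.code2 = 'C'
    WHERE s.cnt > 3   -- only I_IDs with more than 3 J_IDs
      AND s.rn  > 1;  -- leave the first row untouched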
I have two tables:
CREATE TABLE repos (
    rep_key   VARCHAR(10)  NOT NULL,
    base_term VARCHAR(100) NOT NULL,
    blt_key   INTEGER      NOT NULL
[code]...
The gloss table has the unique set of BASE_TERM values found in repos. BLT_KEY will be the primary key in gloss and a foreign key in repos.
Data in the gloss table:
BLT_KEY  BASE_TERM
1        base1
2        base2
3        base3
Now I need to update the BLT_KEY in repos to the matching entries from gloss. Can I do that in an update-on-select statement? Like:
UPDATE repos
SET blt_key = (SELECT gloss.blt_key
               FROM repos, gloss
               WHERE repos.base_term = gloss.base_term);
This throws "subquery returns more than one row".
At the end of the update, the repos table should look like:
REP_KEY  BASE_TERM  BLT_KEY
Also, I need a single query that can update on select, as more than 90,000 records in repos must be updated; a two-step process would be too slow.
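A sketch of the single-statement form: make the subquery correlated to the repos row being updated (and drop repos from the subquery's FROM clause, which is what caused the "more than one row" error):

UPDATE repos r
SET r.blt_key = (SELECT g.blt_key
                 FROM gloss g
                 WHERE g.base_term = r.base_term);

Since gloss holds exactly one row per BASE_TERM, the subquery returns one value per repos row, and all 90,000+ rows are updated in a single pass.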
I would like to insert the same gross and net column values of IDs 7 to 16 into the rows with IDs 40 to 49, in the same order; therefore I would like to obtain the result that I describe below:
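A sketch of one way to express that, assuming a single table T with columns ID, GROSS, and NET (names are guesses), and that "the same order" means ID 7 maps to 40, 8 to 41, and so on (a constant offset of 33):

UPDATE t a
SET (a.gross, a.net) = (SELECT b.gross, b.net
                        FROM t b
                        WHERE b.id = a.id - 33)  -- 40 - 33 = 7, ..., 49 - 33 = 16
WHERE a.id BETWEEN 40 AND 49;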
On a tab page, the results of four different queries, each based on a stored procedure, should be displayed. At the moment, the queries are processed serially, by the statements:
I captured the process ID of the browser. I want to know whether the browser's process is dead or not after the browser is closed. How can I check the status of that process ID from Oracle Forms?