I need to read a huge number of rows, in the lakhs (hundreds of thousands), and populate them into a data block. Because the data volume is so large, I am never able to run the form; it hangs after some time. When I test with a few rows it works fine, so the coding itself is not the problem.
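A minimal sketch of one common mitigation, assuming the block is named EMP_BLOCK and the restricting condition is only an example (both hypothetical): stop Forms from fetching every row up front and narrow the query before it runs, for instance in a WHEN-NEW-FORM-INSTANCE trigger.

-- EMP_BLOCK and the WHERE condition are placeholders for the real names
Set_Block_Property('EMP_BLOCK', QUERY_ALL_RECORDS, PROPERTY_FALSE);  -- fetch rows on demand
Set_Block_Property('EMP_BLOCK', DEFAULT_WHERE, 'hire_date > SYSDATE - 365');
Go_Block('EMP_BLOCK');
Execute_Query;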
How can I count how many items are present in a particular data block in Oracle Forms 10g, and tell whether each one is a text item, display item, list item, etc.? Is there any method to do this?
I have written this in a WHEN-BUTTON-PRESSED trigger, but the problem is how to get the name of the next item in the block.
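A minimal WHEN-BUTTON-PRESSED sketch, assuming the cursor sits in the block to inspect: walk the items with Get_Block_Property(FIRST_ITEM) and Get_Item_Property(NEXTITEM), counting them and reporting each item's type.

DECLARE
  blk   VARCHAR2(80) := :System.Cursor_Block;
  itm   VARCHAR2(80);
  total NUMBER := 0;
BEGIN
  itm := Get_Block_Property(blk, FIRST_ITEM);                -- first item in the block
  WHILE itm IS NOT NULL LOOP
    total := total + 1;
    Message(blk || '.' || itm || ' is a ' ||
            Get_Item_Property(blk || '.' || itm, ITEM_TYPE));
    itm := Get_Item_Property(blk || '.' || itm, NEXTITEM);   -- next item name, NULL at the end
  END LOOP;
  Message('Items in block ' || blk || ': ' || total);
END;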
I have a multi-record data block field and a checkbox field that is based on a control block. My task is that when I check the checkbox, only one field should be enabled and the cursor should move to that field.
The item field belongs to the data block and the checkbox to the control block; when I check CHKBOX1, only ITEM31 on the current record should be enabled, and I should be able to change the value only in that field.
When I check CHKBOX1, my cursor should go to ITEM31, not ITEM32.
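A minimal WHEN-CHECKBOX-CHANGED sketch under these assumptions: the checkbox is CTRL.CHKBOX1 with 'Y' as its checked value, and the data-block field is DATA_BLK.ITEM31 (block names and the checked value are hypothetical). Item-instance properties limit the change to the current record only.

IF :CTRL.CHKBOX1 = 'Y' THEN
  -- enable ITEM31 on the current record only
  Set_Item_Instance_Property('DATA_BLK.ITEM31', CURRENT_RECORD, UPDATE_ALLOWED, PROPERTY_TRUE);
  Set_Item_Instance_Property('DATA_BLK.ITEM31', CURRENT_RECORD, NAVIGABLE,      PROPERTY_TRUE);
  Go_Item('DATA_BLK.ITEM31');   -- move the cursor to ITEM31, not ITEM32
ELSE
  Set_Item_Instance_Property('DATA_BLK.ITEM31', CURRENT_RECORD, UPDATE_ALLOWED, PROPERTY_FALSE);
  Set_Item_Instance_Property('DATA_BLK.ITEM31', CURRENT_RECORD, NAVIGABLE,      PROPERTY_FALSE);
END IF;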
A block shouldn't have rows from multiple tables... Is that true? I read in an OTN thread (I don't remember exactly which one) that a block can have data from multiple tables. If it can't, what does the table directory in a block signify?
I need to verify a huge number of records across two different databases. Basically, I want to check whether the same records exist in the other database's table or not. But since there are more than a billion records, how would I verify them?
Checking records one by one would be hectic and time-consuming. Is there any other option?
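A minimal set-based sketch, assuming a database link named remote_db and the same table name t1 on both sides (both names hypothetical); on 11g and later, the DBMS_COMPARISON package is another option for comparing large tables across a link.

-- Rows present locally but missing (or different) on the remote side:
SELECT * FROM t1
MINUS
SELECT * FROM t1@remote_db;

-- And the reverse direction:
SELECT * FROM t1@remote_db
MINUS
SELECT * FROM t1;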
I am working on a form that consists of two blocks, and I need to know the total number of records in the detail block. In this form's structure there are multiple detail entries against one master entry, and after moving to the next master entry the details of the previous master entry go for posting, which is why I am unable to use :SYSTEM.CURSOR_RECORD. Is there a function to retrieve the total number of records in the detail block?
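A minimal sketch for a WHEN-BUTTON-PRESSED trigger, assuming the detail block is named DETAIL_BLK (hypothetical): navigate to the last record of the block; its record number equals the number of records currently in the block.

DECLARE
  total NUMBER;
BEGIN
  Go_Block('DETAIL_BLK');
  Last_Record;                                -- jump to the last in-memory record
  total := TO_NUMBER(:System.Cursor_Record);  -- its number is the record count
  First_Record;                               -- return to the first record
  Message('Detail records: ' || total);
END;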
I have a form with a master-detail relationship (an invoicing form); the detail block is tabular, displaying up to 7 records.
Now my client wants to see a serial number alongside each record while feeding the data (including when inserting, editing, deleting, or clearing records), the way it appears in MS Access.
I tried to use the system variable :SYSTEM.CURSOR_RECORD, but this does not work for insert/edit/delete/clear record.
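A minimal sketch, assuming a non-database display item INV_DETAIL.SR_NO added to the tabular block (hypothetical name): number the rows as they are fetched and as they are created. After a delete or clear, the remaining rows keep their old numbers, so a small renumbering loop would still be needed for that case.

-- POST-QUERY trigger on the detail block: number each fetched record
:INV_DETAIL.SR_NO := :System.Trigger_Record;

-- WHEN-CREATE-RECORD trigger on the detail block: number a newly created record
:INV_DETAIL.SR_NO := :System.Cursor_Record;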
Here is my problem: I need to create some files in my own format (say 5,000 records each) from a huge data table (it may contain 5 million records), and I want this creation to be multi-threaded.
So how can I form queries efficiently to fetch records 1..5000, 5001..10000, and so on? I could write something like select * from table where rownum < 5000 and not exists (already fetched records), but that is not efficient.
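A minimal sketch of one common chunking pattern, assuming an ordering key column ID on a table named big_table (hypothetical names): each thread binds its own :start_rn/:end_rn range, so no thread has to exclude rows already fetched by the others.

SELECT *
FROM (
  SELECT t.*, ROW_NUMBER() OVER (ORDER BY t.id) AS rn
  FROM   big_table t
)
WHERE rn BETWEEN :start_rn AND :end_rn;   -- e.g. 1..5000, 5001..10000, ...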
Let us say I enter empno 10 (which is not in the database) in the FIND screen and press the FIND button; it shows 'QUERY CAUSED NO RECORDS'. Up to this point it works fine. But after this, if I press Ctrl+F11 in block A, it does not pull any records. Only in this case does it fail to pull records.
But if I enter something else in the FIND screen and it returns data, then pressing Ctrl+F11 pulls all records.
Why does it fail to pull records only when the first query returns nothing?
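Only a guess, since the FIND button's code isn't shown: one common cause is that the FIND logic sets the block's WHERE clause (DEFAULT_WHERE or ONETIME_WHERE) and never clears it after the empty result, so every later Ctrl+F11 still carries the old restriction. A minimal sketch of the reset, using the block name from the post:

-- After handling the 'QUERY CAUSED NO RECORDS' case, remove the restriction
Set_Block_Property('BLOCK_A', DEFAULT_WHERE, '');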
I have 2 questions; because they may be inter-related I am posting them in a single post. These queries are related to Oracle PL/SQL.
1. I am trying to increase the size of a field in a table which has almost 2 million records. The alteration runs for almost an hour and then rolls back. Is there a better way of doing it?
2. I have modified the size of a field in a table from VARCHAR2(10) to VARCHAR2(20). Now when I try to roll back the modification, it does not let me change the size from VARCHAR2(20) back to VARCHAR2(10). No data has been inserted after the modification.
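A couple of hedged notes as a sketch (my_table and my_col are hypothetical names): widening a VARCHAR2 column is a metadata-only change and should be near-instant regardless of row count, and DDL commits implicitly, so it cannot be undone with ROLLBACK. Shrinking the column back is only allowed when no stored value exceeds the new length, which can be checked first.

-- Widening: metadata-only, near-instant even with 2 million rows
ALTER TABLE my_table MODIFY (my_col VARCHAR2(20));

-- Before shrinking back, confirm no value is longer than the new limit
SELECT COUNT(*) FROM my_table WHERE LENGTH(my_col) > 10;

-- If the count is 0, the shrink is allowed
ALTER TABLE my_table MODIFY (my_col VARCHAR2(10));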
I'm currently working on a project which is to archive the old data and then purge the same data from the main table.
Here is a detailed description:
There are around 50-odd tables from which I need to archive the old data (matching certain filter conditions, not date-based). That means I have to store the data in a temp table; once it is stored in the temp table, I have to delete those rows from the main table. The temp table will later be exported and stored in an archive database (a separate database). These tables are very large. One of them is actually 250 GB in size, and all of these tables have many indexes built, both normal and bitmap. The 250 GB table has 40 million rows that need to be archived and purged out of a total of 540 million rows. On this table alone there are 50 bitmap indexes and 2 normal indexes. The table is partitioned on a date column, but that date column is not useful in identifying the old data. There are around 20 tables quite similar in size to the one described above; the rest are somewhat smaller.
We have to execute this activity over a weekend, which gives us about 48 hours to complete it. What are the best possible ways to handle this activity? Most importantly, we must be able to finish within the specified 48-hour window.
The solution we are now thinking of is:
1. Create the temp table: CREATE TABLE tmp_tbl AS SELECT * FROM main_table WHERE <<conditions identifying old data>>.
2. Once the temp table is created, take a copy of the definitions of the indexes that exist on the main table and then drop those indexes.
3. Execute a PL/SQL script to perform the bulk delete from the main table, committing every 100,000 rows (a sketch of this step follows the list).
4. Once the bulk delete is finished, recreate the indexes on the main table using the definitions copied in the earlier step.
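A minimal PL/SQL sketch of step 3, with main_table and the archive_flag = 'Y' predicate standing in for the real table and the conditions identifying the old data:

BEGIN
  LOOP
    DELETE FROM main_table
     WHERE archive_flag = 'Y'       -- placeholder for the old-data conditions
       AND ROWNUM <= 100000;
    EXIT WHEN SQL%ROWCOUNT = 0;     -- nothing left to delete
    COMMIT;                         -- commit every 100,000 rows
  END LOOP;
  COMMIT;
END;
/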
Our main worry is step 4. Considering the size of these tables and the number of indexes to be built, we are not sure how long the index re-creation will run for each table.
Depending on the possibilities we may have to split the activity into 2-3 phases spread across 2-3 weekends. Even then we are not sure whether we will be able to pull off this activity.
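For step 4, a hedged sketch of rebuilding one index faster with NOLOGGING and parallelism (index, table, and column names are placeholders); each index would be switched back to its normal attributes once the rebuild is done.

CREATE BITMAP INDEX bmp_idx_status ON main_table (status)
  NOLOGGING PARALLEL 8;

-- restore the usual attributes after the build
ALTER INDEX bmp_idx_status LOGGING;
ALTER INDEX bmp_idx_status NOPARALLEL;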
I need to change the precision of a column in an existing table. Statistics about the table:
* has over 130 columns
* more than 300 million records
* the column to modify is #121, which has data
* no primary key defined
Since the column has data, it is not possible to modify it with a simple ALTER.
The second option is to create a temp column in the same table, update it from the original, set the original to NULL, alter the original column, update it back from the temp column, and drop the temp column. This approach is very expensive and time-consuming.
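A minimal sketch of that temp-column approach expressed as SQL (big_tab, amount, amount_tmp, and the NUMBER(18,4) target are hypothetical); an alternative worth considering is DBMS_REDEFINITION, which restructures the table online and avoids the two full-table UPDATEs.

ALTER TABLE big_tab ADD (amount_tmp NUMBER(18,4));
UPDATE big_tab SET amount_tmp = amount, amount = NULL;   -- expensive on 300M rows
COMMIT;
ALTER TABLE big_tab MODIFY (amount NUMBER(18,4));        -- allowed once the column is empty
UPDATE big_tab SET amount = amount_tmp;                  -- expensive again
COMMIT;
ALTER TABLE big_tab DROP COLUMN amount_tmp;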
I am trying to delete 3 million records from a huge table that already contains 3 billion records.
This is hitting the performance of the DB and halting my users' other activities. Is there any easy way to delete such data quickly? I have tried a FORALL delete, but even that takes a lot of time.
I have a table which contains huge data, around 12 lakh (1.2 million) records. When I use the SUM function on ACCOUNTNAME and DOCDATE it gives a wrong value. Once I restart the server it gives the correct value; for one or two days it gives the correct value, and after that I get the same problem again. If I restart again it gives the correct value.
I use Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 64 bit server on Linux.
I have created a query block. Upon pressing a button, a WHERE clause in the block filters the records based on all alike items from two tables. My problem is that in one table there is no STRUCT information, while in the other there is STRUCT data; what I want is that even if I pass a parameter in the WHERE clause, all the records should still be filtered.
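The schema isn't shown, so this is only a guess at the intent: an outer join combined with an "is null or equals" test on the parameter keeps the rows from the table that has no STRUCT value. A minimal sketch with hypothetical table, column, and parameter names:

SELECT a.item_code, a.struct, b.qty
FROM   items a
LEFT OUTER JOIN stock b
       ON b.item_code = a.item_code
WHERE  (:p_struct IS NULL OR NVL(a.struct, :p_struct) = :p_struct);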
We are using an Oracle database in which we hold the company's records. The records must be available at any time, but over the years the database keeps growing. How should we handle the old data? All of the data is important, but if this goes on for a few more years we will need more and more disk space. Are there any efficient methodologies for handling the old data? For us, old means data that is 10 years old.
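A hedged sketch of one common approach, assuming the records carry a date column (all names hypothetical): range-partition the table by year, so 10-year-old partitions can be compressed, moved to cheaper storage, or exported and dropped as a unit without touching current data.

CREATE TABLE company_records (
  rec_id   NUMBER,
  rec_date DATE,
  payload  VARCHAR2(4000)
)
PARTITION BY RANGE (rec_date) (
  PARTITION p_2003 VALUES LESS THAN (DATE '2004-01-01'),
  PARTITION p_2004 VALUES LESS THAN (DATE '2005-01-01'),
  PARTITION p_max  VALUES LESS THAN (MAXVALUE)
);

-- Years later, an old partition can be archived and then removed in one step
ALTER TABLE company_records DROP PARTITION p_2003;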
I need to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with a fixed record length. There will be about 6 files, for a total of about 10 GB.
How can I export those tables in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or SPOOL?
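A minimal SQL*Plus spool sketch producing fixed-length records with RPAD/LPAD (the view and column names are hypothetical); UTL_FILE also works, but a spool avoids writing PL/SQL and keeps the output on the client side.

SET PAGESIZE 0 LINESIZE 200 TRIMSPOOL ON FEEDBACK OFF HEADING OFF
SPOOL extract_01.txt
SELECT RPAD(cust_name, 40)                          -- 40-character name field
    || LPAD(TO_CHAR(balance, 'FM9999999990'), 12)   -- 12-character numeric field
    || TO_CHAR(last_update, 'YYYYMMDD')             -- 8-character date field
FROM   v_customer_extract;
SPOOL OFF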
1. I want to create a form for students' attendance.
2. For this purpose I created a form with 2 data blocks: 1. a non-database data block (layout type: form) and 2. a database data block (layout type: tabular).
3. In the non-database data block I added 3 items; on the basis of these I fetch the records from the database and show them in my 2nd data block.
4. My problem is how to fetch the records from the database, show them in my form, and then save those records into my table using the database data block.
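A minimal sketch for step 4, written as a WHEN-BUTTON-PRESSED trigger on the control block; CTRL_BLK, ATTENDANCE_BLK, CLASS_ID, and ATT_DATE are hypothetical names standing in for the criteria items and the database block.

Set_Block_Property('ATTENDANCE_BLK', DEFAULT_WHERE,
       'class_id = ' || :CTRL_BLK.CLASS_ID
    || ' AND att_date = TO_DATE(''' || TO_CHAR(:CTRL_BLK.ATT_DATE, 'DD-MM-YYYY')
    || ''', ''DD-MM-YYYY'')');
Go_Block('ATTENDANCE_BLK');
Execute_Query;
-- After the records are edited, a plain COMMIT_FORM saves the database block
-- back to its base table; no manual INSERT/UPDATE code is needed.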
I want to use a cursor to get data from the database into a control block (Database Item = No). The data has a WHERE clause that depends on an item in another block.
This is my code:

declare
  cursor get_sol is
    select SOL_STEP, PROB_ID
    from   MI_SOLUTION
    where  PROB_ID = :MI_FORM_PROB.PROB_ID;
begin
  go_block('control');
......
When I use a WHEN-VALIDATE-ITEM trigger, this error is raised:
FRM-40737: Illegal restricted procedure NEXT_RECORD in WHEN-VALIDATE-ITEM
Is that the wrong trigger, and if so, how do I solve this? And in the case of EXECUTE_QUERY, in what trigger should I write the same code so the data is fetched when the user executes a query?
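NEXT_RECORD is a restricted built-in, so it cannot run from WHEN-VALIDATE-ITEM; moving the code to a trigger where navigation is allowed (for example KEY-NEXT-ITEM on the driving item, or a WHEN-BUTTON-PRESSED) avoids FRM-40737. A hedged completion of the posted code, assuming the CONTROL block has items SOL_STEP and PROB_ID:

DECLARE
  CURSOR get_sol IS
    SELECT sol_step, prob_id
    FROM   mi_solution
    WHERE  prob_id = :MI_FORM_PROB.PROB_ID;
BEGIN
  Go_Block('CONTROL');
  Clear_Block(No_Validate);        -- start from an empty control block
  FOR rec IN get_sol LOOP
    IF :CONTROL.SOL_STEP IS NOT NULL THEN
      Next_Record;                 -- restricted built-in: fine here,
    END IF;                        -- illegal in WHEN-VALIDATE-ITEM
    :CONTROL.SOL_STEP := rec.sol_step;
    :CONTROL.PROB_ID  := rec.prob_id;
  END LOOP;
  First_Record;
END;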
I have a control block in which the user enters two values. Next is a data block in which two items have the property 'Copy Value from Item' set to CONTROL_BLOCK.ITEM1 and CONTROL_BLOCK.ITEM2 respectively. If these items are made database items, will they be sent as parameters in addition to the default WHERE clause during EXECUTE_QUERY? When I check the last query it does not have these parameters, yet the records returned are already filtered by the control-block items.
I wonder if there is a simple technique to trap duplicates on a block's label item without committing the records (maybe on validate or on new record instance).
I have one procedure to check for duplicates, but I remember that Oracle provides something very smart to do this.
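A hedged sketch of one way to check the block's uncommitted records, run from a trigger where navigation is allowed (for example KEY-COMMIT or a button); DETAIL_BLK and its LABEL item are hypothetical names.

DECLARE
  labels    VARCHAR2(32767) := '|';
  saved_rec NUMBER := TO_NUMBER(:System.Cursor_Record);
BEGIN
  Go_Block('DETAIL_BLK');
  First_Record;
  LOOP
    IF INSTR(labels, '|' || :DETAIL_BLK.LABEL || '|') > 0 THEN
      Message('Duplicate label: ' || :DETAIL_BLK.LABEL);
      Go_Record(saved_rec);
      RAISE Form_Trigger_Failure;           -- stop before anything is committed
    END IF;
    labels := labels || :DETAIL_BLK.LABEL || '|';
    EXIT WHEN :System.Last_Record = 'TRUE';
    Next_Record;
  END LOOP;
  Go_Record(saved_rec);
END;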
I have two tables, EMP_DATA and EMP_COURSE. In EMP_DATA the primary key is EMP_NO, and in EMP_COURSE, EMP_NO is a foreign key. I designed a master block and a details block; the master block has a programmed list of values to show the employee, with its item set to Database Item = No, and the details block shows all the data. I want to save only what is in the details block. How can I do that?