SQL & PL/SQL :: Group By Gives Wrong Value In Huge Data Records?

Jun 18, 2012

I have a table that contains a huge amount of data, around 12 lakh (1.2 million) records. When I use the SUM function grouped by accountname and docdate, it returns a wrong value. Once I restart the server it returns the correct value, and for a day or two it stays correct, but after that the same problem returns. If I restart again, it is correct again.

I am using Oracle Database 10g Enterprise Edition Release 10.2.0.1.0, 64-bit, on a Linux server.
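A note for anyone hitting this: wrong GROUP BY results on 10.2.0.1 that disappear after a restart are a frequently cited symptom of the early-10gR2 HASH GROUP BY wrong-results bugs. As a diagnostic only (underscore parameters should only be changed under Oracle Support guidance, and the table/column names below are placeholders), you can test whether disabling hash aggregation changes the answer:

-- Diagnostic sketch: forces the pre-10gR2 SORT GROUP BY for this session.
ALTER SESSION SET "_gby_hash_aggregation_enabled" = FALSE;

-- Re-run the aggregate and compare with the previous result.
SELECT accountname, docdate, SUM(amount)   -- "amount" is a placeholder column
FROM   your_table                          -- placeholder table name
GROUP  BY accountname, docdate;

If the numbers now match, raise it with Oracle Support and look at patching beyond 10.2.0.1 rather than leaving the parameter set.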




Installation :: Wrong Initial Group For Oracle Linux Account

Oct 19, 2013

Before installing Oracle Database 11.2.0.1 on a Linux server (kernel 2.6.39-400.209.1.el6uek.x86_64), I created the following groups: oinstall, dba, oper, and asmadmin.

groupadd oinstall   # required, per training
groupadd dba        # required, per training
groupadd oper       # per training (groups asmdba and asmoper were also mentioned)
groupadd asmadmin   # optional, per training

I made a mistake when I created the Oracle user account: I created it with dba as the initial group, "useradd -g dba -G oinstall,oper,asmadmin oracle", instead of "useradd -g oinstall -G dba,oper,asmadmin oracle". I went ahead and installed the Oracle database anyway, and now I have concerns and questions. Should I use usermod to change the Oracle account's initial group to oinstall, or just leave it alone? If I now run "usermod -g oinstall -G dba,oper,asmadmin oracle", will it break anything? Is there any impact to the database?


Deleting Huge Records From Table?

Apr 29, 2013

Consider tables A, B, C, D, E and F, all having 100,000+ records. Tables B, C and D depend on table A (with foreign key constraints). When I delete records from all the tables, tables B, C and D each take at most 30-40 seconds, while table A takes 30-40 minutes. All the tables have indexes.

Method I have used:

1. Created Temp table

2. Then deleted all records from B, C, D, E and F matching the temp table, in batches of 500:

delete from B where exists (select 1 from temp where b.col1=temp.col1);

3. Why is it taking so much time to delete records from table A?
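A common cause of exactly this pattern: each row deleted from parent table A forces Oracle to check the child tables for referencing rows, and if the foreign key columns in B, C and D are not indexed, that check is a full scan of each child per deleted parent row. A sketch, assuming the FK column is col1 as in the delete above:

-- List foreign-key columns so you can check each one for an index.
SELECT cc.table_name, cc.column_name
FROM   user_constraints  k
JOIN   user_cons_columns cc ON cc.constraint_name = k.constraint_name
WHERE  k.constraint_type = 'R';

-- Index the FK column on each child table (repeat for C and D).
CREATE INDEX b_col1_fk_ix ON b (col1);

With the child FK columns indexed, the parent delete only has to probe an index per row instead of scanning three 100,000-row tables.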


Group Records With Less Than One Hour Separation And Count How Many Per Group

Nov 1, 2013

I'm trying to group sets of data based on time separations between records and then count how many records are in each group.

In the example below, I want to return the count for each group of data, so Group 1=5, Group 2=5 and Group 3=5

SELECT AREA_ID AS "AREA ID",
LOC_ID AS "LOCATION ID",
TEST_DATE AS "DATE",
TEST_TIME AS "TIME"
FROM MON_TEST_MASTER
WHERE AREA_ID =89
AND LOC_ID ='3015'
AND TEST_DATE ='10/19/1994';

[code]....

Group 1 = 8:00:22 to 8:41:22

Group 2 = 11:35:47 to 11:35:47

Group 3 = 15:13:46 to 15:13:46

Keep in mind the times will always change, and sometimes go over the one-hour mark, but no group will have more than a one-hour separation between consecutive records.
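This is the classic gaps-and-islands problem, and analytic functions solve it without procedural code: flag each row whose gap to the previous row exceeds one hour, turn the flags into group numbers with a running sum, then count per group. A sketch, assuming TEST_DATE and TEST_TIME can be combined into a single DATE value (the time format mask is an assumption; adjust it to how TEST_TIME is actually stored):

WITH t AS (
  SELECT TO_DATE(TO_CHAR(test_date, 'YYYYMMDD') || ' ' || test_time,
                 'YYYYMMDD HH24:MI:SS') AS ts        -- assumed time format
  FROM   mon_test_master
  WHERE  area_id = 89
  AND    loc_id  = '3015'
  AND    test_date = DATE '1994-10-19'
),
flagged AS (
  SELECT ts,
         CASE WHEN ts - LAG(ts) OVER (ORDER BY ts) > 1/24   -- gap > 1 hour
              THEN 1 ELSE 0 END AS new_grp
  FROM   t
),
grouped AS (
  SELECT ts, SUM(new_grp) OVER (ORDER BY ts) AS grp
  FROM   flagged
)
SELECT grp + 1  AS group_no,
       COUNT(*) AS records,
       MIN(ts)  AS group_start,
       MAX(ts)  AS group_end
FROM   grouped
GROUP  BY grp
ORDER  BY grp;

The running SUM works because every row that starts a new island contributes 1, so all rows of the same island share one group number.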


SQL & PL/SQL :: To Verify Huge Number Of Records In Two Different Databases

Jun 5, 2012

I need to verify a huge number of records across two different databases. Basically, I want to check whether the same records exist in the other database's table. But since there are more than a billion records, how would I verify them?

Checking records one by one would be hectic and time-consuming. Is there any other option?
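Two set-based approaches are worth considering, sketched below under the assumption that a database link (here called remote_db, hypothetical, as are all the table and column names) can be created between the two databases. A MINUS comparison is far cheaper than row-by-row lookups, and for billion-row tables a per-bucket checksum pass first means only mismatched buckets need a detailed diff. On 11g, the DBMS_COMPARISON package packages up this kind of work.

-- Rows present locally but missing (or different) remotely:
SELECT key_col, col1, col2 FROM big_table
MINUS
SELECT key_col, col1, col2 FROM big_table@remote_db;

-- Cheap first pass: run on both sides and compare bucket counts/checksums,
-- then MINUS only the buckets that disagree.
SELECT ORA_HASH(key_col, 63)                   AS bucket,
       COUNT(*)                                AS row_cnt,
       SUM(ORA_HASH(key_col || '|' || col1))   AS checksum
FROM   big_table
GROUP  BY ORA_HASH(key_col, 63);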


JDeveloper, Java & XML :: Load Huge XML File With Hundreds Of Records Into Oracle Table?

Jun 29, 2011

I need to load (using SQL*Loader) a huge XML file, with several hundred records, into an Oracle table. The XML file's schema is pretty simple, something like this:

<dataroot>
<record>
<companyname>LimitSoft S.A.</companyname>
<address>Street Number 1</address>

[code]...

I'm trying to use the help included in this link [URL]...

When they refer to a schema [URL]..., what should I use? I do not need to use the Oracle website to register anything, right?
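Right, registering a schema on the Oracle website is not needed; XML schema registration only matters for schema-based XMLType storage. For a structure this simple, one alternative sketch skips SQL*Loader entirely: read the file into an XMLType and shred it with XMLTABLE. The directory, file name and target table below are assumptions based on the snippet above:

-- Assumes: CREATE DIRECTORY xml_dir AS '/path/to/files';
-- and a target table company_tbl(companyname, address).
INSERT INTO company_tbl (companyname, address)
SELECT x.companyname, x.address
FROM   XMLTABLE(
         '/dataroot/record'
         PASSING XMLTYPE(BFILENAME('XML_DIR', 'records.xml'),
                         NLS_CHARSET_ID('AL32UTF8'))
         COLUMNS companyname VARCHAR2(100) PATH 'companyname',
                 address     VARCHAR2(100) PATH 'address'
       ) x;

Each /dataroot/record element becomes one row, and the COLUMNS clause maps the child elements to relational columns.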


Inserting Data In New Column Where Table Has Huge Data

May 26, 2013

I am trying to add a new column to a table and populate it from another column of the same table.

alter table POSITION add INT_MK_DATA_ID number(10,0) null;
update POSITION set INT_MK_DATA_ID = INST_MARKET_DATA_ID;
commit;

As there are a huge number of records in the POSITION table, this update is taking forever to execute.
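One pattern that usually behaves much better than a single huge update is to batch with interim commits, so each transaction's undo stays small; a minimal sketch (the 100,000 batch size is arbitrary):

BEGIN
  LOOP
    UPDATE position
       SET int_mk_data_id = inst_market_data_id
     WHERE int_mk_data_id IS NULL
       AND inst_market_data_id IS NOT NULL  -- avoid re-picking NULL sources
       AND ROWNUM <= 100000;
    EXIT WHEN SQL%ROWCOUNT = 0;
    COMMIT;
  END LOOP;
  COMMIT;
END;
/

If the database is 11g or later, also note that ALTER TABLE ... ADD (col ... DEFAULT ...) NOT NULL is a metadata-only operation, and DBMS_PARALLEL_EXECUTE can parallelise an update like this by ROWID chunks.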


PL/SQL :: Queries To Get Data In Batches From A Huge Data Table

Jan 3, 2013

Here is my problem: I need to create files in my own format (say 5,000 records each) from a huge data table (it may contain 5 million records), and I want this creation to be multi-threaded.

So how can I form queries that efficiently fetch records 1..5000, 5001..10000, and so on? I can write something like select * from table where rownum < 5000 and not exists (already fetched records), but that is not efficient.
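A sketch of an alternative that avoids the NOT EXISTS bookkeeping entirely (pk_col and big_table are assumed names): number the rows once with ROW_NUMBER and let each thread take its own contiguous slice. On 11g and later, DBMS_PARALLEL_EXECUTE can instead chunk the table by ROWID and hand the ranges out for you.

-- Each thread binds its own range: 1..5000, 5001..10000, ...
SELECT *
FROM  (SELECT t.*,
              ROW_NUMBER() OVER (ORDER BY pk_col) AS rn
       FROM   big_table t)
WHERE rn BETWEEN :lo AND :hi;

Because every thread orders by the same key, the slices are disjoint and together cover the whole table, with no shared state between threads.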


Modifying Fields With Huge Data

Aug 26, 2011

I have 2 questions; because they may be inter-related, I am posting them in a single post. These questions relate to Oracle PL/SQL.

1. I am trying to increase the size of a column in a table that has almost 2 million records. The ALTER runs for almost an hour and then rolls back. Is there a better way of doing it?

2. I have modified the size of a column in a table from VARCHAR2(10) to VARCHAR2(20). Now, when I try to roll back the modification, it will not let me change the size from VARCHAR2(20) back to VARCHAR2(10), even though no data has been inserted after the modification.
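For what it's worth: widening a VARCHAR2 is a metadata-only change and should be near-instantaneous, so an hour-long run that ends in a rollback usually means the ALTER was stuck waiting behind uncommitted DML on the table. And shrinking is refused with ORA-01441 unless every stored value fits the new length, so it is worth checking the actual data. A sketch with placeholder names (DDL_LOCK_TIMEOUT exists from 11g on):

-- Let the ALTER wait up to 30s for DML locks instead of hanging or failing.
ALTER SESSION SET ddl_lock_timeout = 30;
ALTER TABLE t MODIFY (col VARCHAR2(20));

-- Before shrinking back, confirm the data fits the smaller size:
SELECT MAX(LENGTHB(col)) FROM t;   -- must be <= 10 for the shrink to succeed
ALTER TABLE t MODIFY (col VARCHAR2(10));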



Archiving And Purging Data From Huge Tables?

Apr 22, 2013

I'm currently working on a project to archive old data and then purge the same data from the main tables.

Here is a detailed description:

There are around 50-odd tables from which I need to archive the old data (matching certain filter conditions, not date-based). That means storing the data in a temp table; once it is stored there, I have to delete those rows from the main table. The temp table will later be exported and stored in an archive database (a separate database). These tables are huge: one of them is 250 GB in size, and all of them carry many indexes, both normal and bitmap. The 250 GB table holds 540 million rows in total, of which 40 million need to be archived and purged; on this table alone there are 50 bitmap indexes and 2 normal indexes. It is partitioned on a date column, but that column is not useful for identifying the old data. Around 20 tables are quite similar in size to this one; the rest are somewhat smaller.

We have to execute this activity over a weekend, which gives us about 48 hours. What are the best possible ways to handle it? Most importantly, we must be able to complete the activity within the specified 48-hour window.

The solution what we are now thinking of is:

1. Create the temp table: Create tmp_tbl as select * from main_table where <<conditions identifying old data>>

2. Once the temp table is created, save copies of the index definitions that exist on the main table and then drop the indexes.

3. Execute a PL/SQL script to perform the bulk delete from the main table, committing every 100,000 rows.

4. Once the bulk delete is finished, recreate the indexes on the main table using the definitions saved earlier.

Our main worry is step 4: considering the size of these tables and the number of indexes to be built, we are not sure how long the index re-creation will run for each table.

Depending on how it goes, we may have to split the activity into 2-3 phases spread across 2-3 weekends. Even then we are not sure whether we will be able to pull off this activity.

The database we are using is Oracle 10g.
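One alternative worth benchmarking for the biggest tables, sketched below with placeholder names: deleting is the expensive direction, so for any table where a large fraction of rows leaves, copying the rows you keep (CTAS), swapping the tables, and rebuilding indexes in parallel can beat a bulk delete. Either way, dropping the bitmap indexes before the delete, exactly as planned in step 2, is essential, since bitmap index maintenance during bulk DML is extremely slow. Note the CTAS as written does not carry over partitioning, constraints or grants; those have to be re-created on the new table.

-- Keep-side CTAS (the commented condition mirrors step 1 of the plan):
CREATE TABLE main_table_keep NOLOGGING PARALLEL 8 AS
SELECT * FROM main_table
WHERE  NOT ( /* conditions identifying old data */ 1 = 0 );

-- Swap and rebuild:
-- DROP TABLE main_table;  RENAME main_table_keep TO main_table;
ALTER INDEX some_saved_ix REBUILD PARALLEL 8 NOLOGGING;  -- per saved index

For the smaller tables, the planned batched delete plus index rebuild is likely fine; timing both approaches on one mid-sized table first would tell you which way the 48-hour window leans.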


Datatype Modification For Table With Huge Data

Aug 11, 2011

Below is my requirement.

I need to change the precision of a column in an existing table. Statistics about the table:

* Over 130 columns
* More than 300 million records
* The column to modify is #121, and it contains data
* No primary key defined

Since the column has data, it is not possible to modify it with a simple ALTER.

A second option is to create a temp column in the same table, copy the data from the original column, null out the original, alter it, copy back from the temp column, and drop the temp column. This approach is very expensive and time-consuming.

Also, the column's position needs to be preserved as #121.
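A third option, sketched with placeholder names: CTAS the whole table with the column CAST to the new precision. Because the SELECT lists the columns in their current order, the modified column keeps its ordinal position (#121), and the double update of the temp-column approach is avoided. DBMS_REDEFINITION can do the same thing online if downtime is the constraint.

-- Casting col_121 in place in the select list keeps its ordinal position.
CREATE TABLE big_t_new NOLOGGING PARALLEL AS
SELECT col_1, col_2, /* ... columns 3 to 120 ... */
       CAST(col_121 AS NUMBER(12,4)) AS col_121,
       /* ... columns 122 to 129 ... */ col_130
FROM   big_t;

-- Then: drop big_t, rename big_t_new to big_t, and re-create the
-- indexes and grants on the new table.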


Bulk Deletes Of Data From Huge Table

Jun 13, 2013

I am trying to delete 3 million records of data from a huge table that already contains 3 billion records.

This is hurting DB performance and halting my users' other activities. Is there any easy way to delete such data fast? I have tried a FORALL delete, but even that takes a lot of time.
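If the database is 11g or later, DBMS_PARALLEL_EXECUTE is built for exactly this: it chunks the table by ROWID and runs the delete as many small parallel transactions, so undo stays modest and the rest of the workload keeps breathing. A sketch; the created_dt filter is a hypothetical stand-in for however the 3 million rows are actually identified:

BEGIN
  DBMS_PARALLEL_EXECUTE.create_task('purge_old_rows');
  DBMS_PARALLEL_EXECUTE.create_chunks_by_rowid(
    task_name   => 'purge_old_rows',
    table_owner => USER,
    table_name  => 'HUGE_TABLE',
    by_row      => TRUE,
    chunk_size  => 100000);
  DBMS_PARALLEL_EXECUTE.run_task(
    task_name      => 'purge_old_rows',
    sql_stmt       => 'DELETE FROM huge_table
                        WHERE ROWID BETWEEN :start_id AND :end_id
                          AND created_dt < DATE ''2013-01-01''',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);
  DBMS_PARALLEL_EXECUTE.drop_task('purge_old_rows');
END;
/

The :start_id and :end_id binds are filled in by the package for each ROWID chunk; parallel_level controls how many chunks run at once, so it can be tuned down if users are still affected.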


Server Utilities :: Exporting Huge Amount Of Data?

Jul 25, 2011

I have to extract a huge amount of data from a couple of views. The problem is that they want it in TXT files with a fixed record length. There will be about 6 files, for a total of about 10 GB.

How can I export those views in the fastest possible way? If I'm not mistaken, exp and expdp can't create TXT files, so do I really need to use UTL_FILE or spool?
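Right, exp/expdp write dump files, not flat text, so spool or UTL_FILE it is. A minimal SQL*Plus spool sketch (column names and widths are placeholders): with every field RPAD/LPAD-ed to a fixed width and LINESIZE set to the total record length, each line comes out the same length, and the six files can simply be spooled from six parallel sessions.

SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TERMOUT OFF TRIMSPOOL OFF LINESIZE 62
SPOOL /tmp/extract1.txt
SELECT RPAD(cust_name, 20)           -- 20-char field
    || RPAD(city,      30)           -- 30-char field
    || LPAD(TO_CHAR(amount), 12)     -- 12-char field => 62-byte records
FROM   some_view;
SPOOL OFF

TRIMSPOOL OFF keeps trailing blanks, which matters here: trimming them would shorten any record whose last field is space-padded.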


Forms :: Handle Huge Number Of Rows In Data Block

Jun 14, 2011

I need to read a huge number of rows, in the lakhs (hundreds of thousands), and populate them into a data block. Because of the huge volume I am never able to run the form; it hangs after some time. When I test with a few rows it works, so there is no problem in the coding.


Oracle 11g Express Edition - Load Huge Data Into Table

Nov 6, 2012

I am using Oracle 11g Express Edition. I have a .csv file of about 500 MB whose data needs to be uploaded into an Oracle table.

Which would be the best method to load the data into the table? The data is employee ticket history, which is quite large.

How do I do a mass upload of the data into the Oracle table?
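The two usual routes are SQL*Loader with direct path, or an external table plus a direct-path insert, which keeps everything inside SQL. A sketch of the latter; the directory path, column names and CSV layout are all assumptions to adapt:

CREATE OR REPLACE DIRECTORY csv_dir AS '/data/csv';

CREATE TABLE ticket_hist_ext (
  emp_id    NUMBER,
  ticket_no VARCHAR2(20),
  raised_dt DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    (emp_id, ticket_no,
     raised_dt CHAR DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('tickets.csv')
)
REJECT LIMIT UNLIMITED;

INSERT /*+ APPEND */ INTO ticket_history SELECT * FROM ticket_hist_ext;
COMMIT;

The external table reads the file in place, so a 500 MB CSV loads in one pass without a staging step; ticket_history is the assumed target table.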


SQL & PL/SQL :: Group By Matching Records

Dec 19, 2012

I have one table, employee, with 4 fields: emp_code, emp_locn, emp_job_code and emp_job_desc. The problem is that I am trying to prepare grouped reports based on location and emp_job_code, and there is duplication of data in emp_job_desc.

For example, job code E2 has two different job descriptions for two different employees, like E2-PAINTER-SPRAY and E2- PAINTER -SPRAY; another example is E1-rigger versus E2-RIGGER, and so on. Is there a method to match them together as one description?
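One sketch, using the column names from the post: normalise the description (strip spaces, uppercase) so the variants collide, then pick a single canonical value per job code with MIN:

SELECT emp_locn,
       emp_job_code,
       MIN(UPPER(REPLACE(emp_job_desc, ' '))) AS canonical_desc,
       COUNT(*)                               AS employees
FROM   employee
GROUP  BY emp_locn, emp_job_code;

With the spaces removed, E2-PAINTER-SPRAY and E2- PAINTER -SPRAY normalise to the same string, so each location/job-code group reports one description.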


SQL & PL/SQL :: Duplicate Records And Group By

Jun 22, 2010

My duplicate records have been detected by First Name, Last Name, Name, and City.

such as

select FirstName, LastName, Name, City, count(*) as num_of_duplicates
from the_table
group by FirstName, LastName, Name, City
having count(*) > 1;

That gives me the duplicate keys. Now I need all the columns of each duplicate record in the select, so I can see why these records are duplicates.
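An analytic count keeps every column of every row while still letting you filter down to the duplicated ones; a sketch against the same names:

SELECT *
FROM  (SELECT t.*,
              COUNT(*) OVER (PARTITION BY firstname, lastname, name, city)
                AS dup_cnt
       FROM   the_table t)
WHERE dup_cnt > 1
ORDER  BY firstname, lastname, name, city;

Unlike GROUP BY, the analytic COUNT does not collapse the rows, so each duplicate comes back in full alongside its group size.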


PL/SQL :: How To Extract Records Based On Group

Aug 31, 2012

I have a table with 2 columns, serialnumber and brand.

Each brand may have multiple serialnumbers.

I want to extract 10 serialnumbers for each brand.
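ROW_NUMBER does this directly; a sketch (the table name is assumed, and ordering by serialnumber is an arbitrary choice, so pick whichever 10 you actually want):

SELECT brand, serialnumber
FROM  (SELECT brand, serialnumber,
              ROW_NUMBER() OVER (PARTITION BY brand
                                 ORDER BY serialnumber) AS rn
       FROM   your_table)
WHERE rn <= 10;

The PARTITION BY restarts the numbering for each brand, so the outer filter keeps at most 10 rows per brand.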


SQL & PL/SQL :: Export Data To CSV - Wrong Diacritics?

Jun 20, 2013

I need to export data to a CSV file, but I have problems with diacritics. The simplified PL/SQL looks like:

declare
f utl_file.file_type;
cursor c1 is Select ACTIVITY_SUB_TYPE
from the_table;
begin

[code]...

After running the PL/SQL, the record in the CSV file looks like "Vypršanie skuš.lehoty kontakt", so there is a problem with the diacritics.
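UTL_FILE.FOPEN writes in the database character set, and the mangling typically happens when the file is then read in a different encoding. One sketch is to use the NCHAR variants, which write UTF-8, and then open the CSV as UTF-8 on the client side; the directory name and file name below are assumptions:

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  -- FOPEN_NCHAR / PUT_LINE_NCHAR write the file in UTF-8.
  f := UTL_FILE.FOPEN_NCHAR('EXPORT_DIR', 'activity.csv', 'w');
  FOR r IN (SELECT activity_sub_type FROM the_table) LOOP
    UTL_FILE.PUT_LINE_NCHAR(f, TO_NCHAR(r.activity_sub_type));
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/

If the characters are already wrong inside the database, the fix is upstream (client NLS_LANG at load time) rather than in the export.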


SQL & PL/SQL :: Difference In Number Of Records In GROUP BY And PARTITION BY

Feb 17, 2012

If I run the following query, I get 997 records by using GROUP BY.

SELECT c.ins_no, b.pd_date,a.project_id,
a.tech_no
FROM mis.tranche_balance a,
FMSRPT.fund_reporting_period b,
ods.proj_info_lookup c,
ods.institution d
WHERE a.su_date = b.pd_date
AND a.project_id = c.project_id
AND c.ins_no = d.ins_no
AND d.sif_code LIKE 'P%'
AND d.sif_code <> 'P-DA'
AND a.date_stamp >='01-JAN-2011'
AND pd_date='31-MAR-2011'
GROUP BY c.ins_no,
b.pd_date,
a.project_id,
a.tech_no;

I want to show the extra columns a.date_stamp and a.su_date in the output, so in the second query I used PARTITION BY instead, but then I got 1079 records.

SELECT c.ins_no, b.pd_date,a.date_stamp,a.su_date, a.project_id,
a.tech_no,
COUNT(*) OVER(PARTITION BY c.ins_no,
b.pd_date,
a.project_id,
a.tech_no)c
[code]....

Why did I get 1079 records? And how can I show the two extra columns in the output when they are not in the GROUP BY clause?
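That difference is expected: COUNT(*) OVER (PARTITION BY ...) does not collapse rows, it only annotates them, so 1079 is the number of underlying joined rows, while GROUP BY collapses those into 997 groups. To keep the 997 groups and still show the extra columns, aggregate them; and since the join has a.su_date = b.pd_date, su_date is single-valued per group anyway. A sketch on the original query:

SELECT c.ins_no, b.pd_date, a.project_id, a.tech_no,
       MAX(a.date_stamp) AS date_stamp,   -- or MIN, per the requirement
       MAX(a.su_date)    AS su_date
FROM   mis.tranche_balance a,
       fmsrpt.fund_reporting_period b,
       ods.proj_info_lookup c,
       ods.institution d
WHERE  a.su_date = b.pd_date
AND    a.project_id = c.project_id
AND    c.ins_no = d.ins_no
AND    d.sif_code LIKE 'P%'
AND    d.sif_code <> 'P-DA'
AND    a.date_stamp >= DATE '2011-01-01'
AND    b.pd_date = DATE '2011-03-31'
GROUP  BY c.ins_no, b.pd_date, a.project_id, a.tech_no;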


SQL & PL/SQL :: Extracted Data Is Displayed In Wrong Format

Feb 14, 2013

I use SQL to extract data from Quality Center (QC) to Excel. I have a field of type String. It contains the following values:

1) 1161, 1162, 1163
2) DHM, 162
3) DTH, 163
etc

But when I extract this to Excel, the data is displayed as:
1) 116111621163
2) DHM, 162
3) DTH, 163

The value in the first row is displayed without the commas. How can I extract the data exactly as it is in the field?


SQL & PL/SQL :: Select Query Like Operator Getting Wrong Data

Dec 25, 2012

I am using the following query with LIKE 'T_%'. I get 80 rows, and the very first table name doesn't even begin with 'T_'.

Why does a table name that does not start with 'T_' appear?

*********************************************************************
SELECT 'Truncate table epic500.'||table_name
FROM user_tables where table_name like 'T_%' order by table_name;
*********************************************************************
output:
Truncate table epic500.TEMP_ENC_DEL
Truncate table epic500.T_ACCOMMODATION_CODE
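The underscore in a LIKE pattern is itself a wildcard matching any single character, which is why TEMP_ENC_DEL matches 'T_%': T, then any one character, then anything. To match a literal underscore, escape it:

SELECT 'Truncate table epic500.' || table_name
FROM   user_tables
WHERE  table_name LIKE 'T\_%' ESCAPE '\'
ORDER  BY table_name;

With the escape in place, only names that genuinely start with the two characters T_ are returned.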


Reports & Discoverer :: Report Generating Wrong Data In Windows 7

May 7, 2012

I am having a problem with a report that generates wrong data on Windows 7 only. The same report works correctly on many Windows XP computers.


Data Guard :: Unable To Transfer Archivelog Wrong Pass File

Aug 2, 2010

I am creating a physical standby database with the RMAN DUPLICATE command from a 2-node RAC cluster. RMAN did all its work; now, when I try to start the MRP process on the physical standby database, I get the following errors:

------------------------------------------------------------
Check that the primary and standby are using a password file
and remote_login_passwordfile is set to SHARED or EXCLUSIVE,
and that the SYS password is same in the password files.
returning error ORA-16191
------------------------------------------------------------
ORA-16191: Primary log shipping client not logged on standby
------------------------------------------------------------

I copied the same password file from the primary to the standby and verified it many times, but I still get the same error.
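The usual checks for ORA-16191, sketched below: confirm on each side that SYS actually appears in the password file and that remote_login_passwordfile is EXCLUSIVE or SHARED. On a RAC primary, every instance needs the matching file, and the copied file must be renamed to the standby's own orapw<SID> name; a byte-identical copy under the wrong name is ignored.

-- Run on both primary and standby:
SELECT username, sysdba FROM v$pwfile_users;   -- SYS must be listed
SHOW PARAMETER remote_login_passwordfile       -- SQL*Plus; expect EXCLUSIVE

If SYS is missing on either side, recreate the file with the orapwd utility using the same SYS password, then restart log shipping.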


Server Utilities :: Geometric Data From Text To Table And Wrong CTL Upload Into Table

Jul 11, 2013

I have a requirement to import text files generated by the 3D modelling software Xsteel, in which it records all geometric information, and I want to import this information into an Oracle table.

CREATE TABLE dstv_head (
  wo_no        VARCHAR2(12),
  struct       VARCHAR2(12),
  rev_no       NUMBER,
  mark         VARCHAR2(12),
  pos          VARCHAR2(12),
  grade        VARCHAR2(12),
  qty          NUMBER,
  profile      VARCHAR2(24),
  type         VARCHAR2(12),
  len          NUMBER,
  width_web    NUMBER,
  width_bottom NUMBER,
  flange_thk   NUMBER,
  web_thk      NUMBER,
  radius       NUMBER,
  kgm          NUMBER,
  kgm1         NUMBER,
  kgm2         NUMBER,
  bevel_plus   NUMBER,
  bevel_minus  NUMBER,
  holes_yn     VARCHAR2(1),
  holes_v_yn   VARCHAR2(1),
  hole_x_dim   NUMBER,
  hole_y_dim   NUMBER,
  hole_dia     NUMBER,
  no_of_holes  NUMBER
);

-- Each value has to go into a specific field; for example, **9005.nc1 goes into the wo_no field and 1239401A goes under struct.

ST
** 9005.nc1    -- WO_NO
1239401A       -- STRUCT
1              -- REV_NO
9005           -- MARK
9005           -- POS
S275JR         -- GRADE
2              -- QTY
[code]....


SQL & PL/SQL :: How To Group By Data

Sep 8, 2011

My requirement is that data from a table, TABLEA, has to be provided as an overall view.

TABLEA
ID ENTITY REQ_FLG PAR_FLG EXT_FLG
CONV_1 ACCNT Y Y Y
CONV_1 PROD Y Y N
CONV_1 ADDR Y N N
CONV_2 DID Y N N
CONV_2 ORDER Y N N

Required to show the data in report as

ID Expand View_Report Populate ENTITY QRY_STATUS
CONV_1 Expand Report Populate
ACCNT Y Y Y
PROD Y Y Y
ADDR Y N N
CONV_2 Expand Report Populate
DID Y N N
ORDER Y N N

Where "Expand", "Report", "Populate" are provided as Hard coded values in query.

Sample Query.
SELECT ID
,'Expand' AS EXPAND
,'Report' AS VIEW_REPORT
, 'Populate / Reset' AS POP
, DECODE(MN_TBL.ENTITY,NULL,NULL,ENTITY) AS ENTITY
, REQ_FLG || ' ' || PAR_FLG || ' ' || EXT_FLG AS QRY_STATUS
FROM TABLEA
GROUP BY GROUPING SETS
((ID), (ENTITY, REQ_FLG , PAR_FLG , EXT_FLG ))
ORDER BY CONVERSION_ID, ENTITY

The above query works fine where a single ID is present:

ID Expand View_Report Populate ENTITY QRY_STATUS
CONV_1 Expand Report Populate
ACCNT Y Y Y
PROD Y Y Y
ADDR Y N N

But when more than one ID is present, the whole thing collapses.
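The collapse comes from the grouping sets themselves: the detail set (ENTITY, REQ_FLG, PAR_FLG, EXT_FLG) does not include ID, so once two IDs share entity/flag combinations, their detail rows are aggregated together across IDs. Including ID in the detail set keeps each ID's rows separate; a sketch:

SELECT id,
       'Expand'           AS expand,
       'Report'           AS view_report,
       'Populate / Reset' AS pop,
       entity,
       req_flg || ' ' || par_flg || ' ' || ext_flg AS qry_status
FROM   tablea
GROUP  BY GROUPING SETS ((id), (id, entity, req_flg, par_flg, ext_flg))
ORDER  BY id, entity NULLS FIRST;   -- summary row first within each ID

The (id)-only set still produces one header row per ID (with NULL entity), and NULLS FIRST keeps that header above its detail rows.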


PL/SQL :: Cannot Get Wanted Data With MAX And GROUP BY Function

Jun 15, 2012

I have a table like below:

COLUMN     TYPE
USER_ID     VARCHAR2 (10 Byte)
PROCESS_ID     VARCHAR2 (30 Byte)
END_TIME     DATE(STAMP)
TO_LOC     VARCHAR2 (12 Byte)
TO_LOC_TYPE     VARCHAR2 (15 Byte)
FROM_LOC      VARCHAR2 (12 Byte)
ITEM_ID     VARCHAR2 (25 Byte)
CASES     NUMBER (12,4)
LMS_UDA1      VARCHAR2 (250 Byte)
ZONE     VARCHAR2 (2 Byte)

I only want to get one record, with all the columns, and the only aggregate I have is MAX(END_TIME). But the other columns have different values, so when I use MAX(END_TIME) and GROUP BY USER_ID, PROCESS_ID, CASES, ..., the SQL does not give one record; it gives many records.
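GROUP BY cannot do this, because every selected column has to be a grouping key, and grouping by CASES and the rest keeps each distinct combination as its own row. To get the single whole row that carries the MAX(END_TIME), rank the rows instead; a sketch partitioned by USER_ID and PROCESS_ID (adjust the partition to whatever defines "one record", and the table name is assumed):

SELECT *
FROM  (SELECT t.*,
              ROW_NUMBER() OVER (PARTITION BY user_id, process_id
                                 ORDER BY end_time DESC) AS rn
       FROM   the_table t)
WHERE rn = 1;

Every column of the winning row comes through untouched, because nothing is aggregated; the latest row per partition is simply selected.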


SQL & PL/SQL :: Display Data Based On Group And Percentage?

Feb 12, 2012

I have the following requirement: I have to display the data based on the group and links, where the input given is a month. I have written the following code, which is good for displaying one group, but I want to display all the groups.
CREATE TABLE target_data
(
T_LINK VARCHAR2(50)
, t_mon varchar2(6)
, t_grp varchar2(30)
, t_views NUMBER

[code]...

When I run this, I get output as

t_grp count_80 goal
grp3 1 0.1

But I want the output as

t_grp count_80 goal
grp3 1 0.1
grp2 1 0.1
grp1 ... ...
grp0 ... ...


PL/SQL :: Retrieve Data By Using Both Group By And Order By Clauses?

Nov 16, 2012

I have a table named angdata77 with an attribute asigno. I want to retrieve data from angdata77 using both GROUP BY and ORDER BY clauses. For the total count I am using the query:

select asigno, count(*) from angdata77 group by asigno order by asigno;

Is there any other query for retrieving this data from angdata77?
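That query is the standard form. One alternative, if the per-value totals are wanted next to each row rather than collapsed into one row per value, is an analytic count, which needs no GROUP BY at all:

select asigno,
       count(*) over (partition by asigno) as total_per_asigno
from   angdata77
order  by asigno;

This returns every row of angdata77 with its group's total alongside, ordered the same way.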







