Building Data Warehouse From Single Flat File

May 2, 2012

I am trying to build a data warehouse for the Consumer Price Index, so I have downloaded data from the Bureau of Statistics. It is in Excel format, and since I am working with Oracle Warehouse Builder I have converted it to a .csv file so that I can use it as a data source.

Question 1: Is it practical to use a single .csv file as the source of data for a data warehouse?

Question 2: I have three dimension tables and a fact table. The dimensions are: Region (the data is organized by region, state, etc.), Consumer Goods and Services (the data is organized into groups of goods and services and goods/services types), and Time (year and month).

Now how am I going to do the mapping here? Is it possible to do a one-to-one mapping, since all the data required by the dimensions is located in the .csv file?
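
In OWB the usual route is to import the .csv as a flat-file source (or external table) and draw one mapping per dimension. Outside the tool, the equivalent plain-SQL approach looks roughly like the sketch below; every object name, the directory object CPI_DIR and the column list are assumptions, not the poster's actual schema.

CREATE TABLE cpi_ext (
  region_name  VARCHAR2(100),
  state_name   VARCHAR2(100),
  item_group   VARCHAR2(100),
  item_type    VARCHAR2(100),
  price_year   NUMBER(4),
  price_month  NUMBER(2),
  index_value  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY cpi_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('cpi.csv')
);

-- each dimension is then just a distinct projection of the same file
INSERT INTO region_dim (region_name, state_name)
SELECT DISTINCT region_name, state_name FROM cpi_ext;

So yes, a single flat file can feed several dimensions and the fact table; each mapping simply selects a different subset of its columns.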

View 4 Replies



SQLLDR With CLOB Data In Single Flat File?

Nov 24, 2012

I am migrating data from DB2 to Oracle. I used DB2 export to extract the data, specifying the lobsinfile clause. This put all the CLOB data into one file, so a typical record has a column with a reference to the CLOB data such as "OUTFILE.001.lob.0.2880/", where OUTFILE.001.lob is the name specified in the export command, 0 is the starting position in the file, and 2880 is the length of the first CLOB.

When I try to load this data using SQL*Loader I get a "file not found" error. Attached is a copy of the control file and the output from testing.

P.S. I can't use the DB2 option LOBSINSEPFILES, which creates a separate file for each CLOB, because the table has over 14 million rows, and creating 14 million files causes OS inode problems.

Attached File(s)
 sqlldr.txt ( 2.05K )
Number of downloads: 3
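
If SQL*Loader cannot be made to understand the DB2 LOB Location Specifier directly, one workaround is to load the scalar columns first and then pull each CLOB slice out of the single LOB file with DBMS_LOB, using the offset and length from the reference string. A sketch for one row only; the directory object LOB_DIR, staging table STG and the hard-coded offset/length are assumptions (in practice you would loop over the staged rows and parse the "OUTFILE.001.lob.<offset>.<length>/" string):

DECLARE
  l_bfile    BFILE   := BFILENAME('LOB_DIR', 'OUTFILE.001.lob');
  l_clob     CLOB;
  l_src_off  INTEGER := 0 + 1;   -- DB2 offsets are 0-based, DBMS_LOB is 1-based
  l_dst_off  INTEGER := 1;
  l_lang_ctx INTEGER := DBMS_LOB.default_lang_ctx;
  l_warn     INTEGER;
BEGIN
  DBMS_LOB.fileopen(l_bfile, DBMS_LOB.lob_readonly);
  INSERT INTO stg (id, txt) VALUES (1, EMPTY_CLOB())
  RETURNING txt INTO l_clob;
  DBMS_LOB.loadclobfromfile(
      dest_lob     => l_clob,
      src_bfile    => l_bfile,
      amount       => 2880,      -- length taken from the LLS reference
      dest_offset  => l_dst_off,
      src_offset   => l_src_off,
      bfile_csid   => DBMS_LOB.default_csid,
      lang_context => l_lang_ctx,
      warning      => l_warn);
  DBMS_LOB.fileclose(l_bfile);
  COMMIT;
END;
/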
 

View 5 Replies View Related

SQL & PL/SQL :: Export GB Data In Flat File

Nov 25, 2011

Database: Oracle 8i

My query is:

I have a large amount of data in one table (approx. 2 GB and above). I want to unload the data into one flat file.

When I use spool, it writes about half the data and the rest gets corrupted. On UNIX, I keep a .sql file and use it to export into a .dat file. How can I export this large data set into a flat file (.dat)? Is there any unload command that can be used on UNIX?
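
Corrupted or truncated spool output is often down to SQL*Plus formatting settings rather than the data volume. A sketch of the settings that usually matter for a large unload (the table name, column list and spool path are assumptions):

SET TERMOUT OFF ECHO OFF FEEDBACK OFF HEADING OFF VERIFY OFF
SET PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON TRIMOUT ON
SPOOL /tmp/big_table.dat
SELECT col1 || '|' || col2 || '|' || col3 FROM big_table;
SPOOL OFF
EXIT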

View 8 Replies View Related

Client Tools :: How To Spool Clob Data Into Flat File

Oct 11, 2011

How can I spool CLOB data into a flat file?
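
From SQL*Plus, CLOB columns can be spooled once the LONG settings are raised. A sketch (table, column and file names are assumptions):

SET LONG 2000000000
SET LONGCHUNKSIZE 32767
SET PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON HEADING OFF FEEDBACK OFF
SPOOL clob_out.txt
SELECT clob_col FROM my_table WHERE id = 1;
SPOOL OFF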

View 6 Replies View Related

SQL & PL/SQL :: Oracle Stored Procedure To Load Flat File Data

Oct 24, 2013

I need to create an Oracle stored procedure to read a flat file (pipe delimited) and load the data into an Oracle table. I believe the file has to be located in one of the paths registered in the DBA_DIRECTORIES table; or can it be anywhere on the local client machine?
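
With UTL_FILE the file must be visible to the database server through a directory object (listed in DBA_DIRECTORIES); a file that exists only on the client machine cannot be read this way. A minimal sketch, assuming a directory object DATA_DIR, a three-column staging table EMP_STG, and pipe-delimited lines such as 10|SMITH|CLERK (all of these are assumptions):

CREATE OR REPLACE PROCEDURE load_emp_stg (p_filename IN VARCHAR2) AS
  l_file UTL_FILE.file_type;
  l_line VARCHAR2(4000);
BEGIN
  l_file := UTL_FILE.fopen('DATA_DIR', p_filename, 'r', 4000);
  LOOP
    BEGIN
      UTL_FILE.get_line(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;   -- end of file
    END;
    INSERT INTO emp_stg (empno, ename, job)
    VALUES (TO_NUMBER(REGEXP_SUBSTR(l_line, '[^|]+', 1, 1)),
            REGEXP_SUBSTR(l_line, '[^|]+', 1, 2),
            REGEXP_SUBSTR(l_line, '[^|]+', 1, 3));
  END LOOP;
  UTL_FILE.fclose(l_file);
  COMMIT;
END;
/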

View 14 Replies View Related

PL/SQL :: Load Data From Flat File To Target Table With Scheduling In Loader

Sep 7, 2013

I have a requirement as follows. I need to load data into the target table every Saturday. My source file consists of data for several states, and every week I have to load one particular state's data into the target table. If in the first week I loaded AP data, then in the second week, on Saturday, Karnataka, and so on.

How can I schedule the data load for every Saturday, picking a different state value each time, automatically?
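
One possible shape (a sketch, not a full solution): put the per-state logic in a procedure, say LOAD_WEEKLY_STATE, that looks up the next state to process from a small control table and runs the load for it, then let DBMS_SCHEDULER call that procedure every Saturday. The procedure name, time of day and job name below are assumptions.

BEGIN
  DBMS_SCHEDULER.create_job(
    job_name        => 'WEEKLY_STATE_LOAD',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'LOAD_WEEKLY_STATE',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=WEEKLY; BYDAY=SAT; BYHOUR=2; BYMINUTE=0',
    enabled         => TRUE);
END;
/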

View 2 Replies View Related

Data Warehouse Optimization

Jun 3, 2013

My data warehouse application involves partitioned tables where indexes are initially unusable on the last partition and are only built when the next partition is created. Our users query this table through a tool that has an option "include not indexed data", which essentially tells the tool whether to include that last partition in the query. If this is checked and they are filtering on one of the indexed fields, there is the potential for an Oracle error stating it tried to use an unusable index, so our tool basically builds the query like this:

select ... from (
select ... from table where partition_key < (last usable partition key)
union
select /*+ NO_INDEX(table) */ ... from table where partition_key >= (last usable partition key)
)
where
index_field = :value

I have had a difficult time getting reasonable data to test this myself, so I'm asking the question here:

Is Oracle likely pushing that outer filter into the individual queries inside the UNION? If we were to move the index_field filter into each of the individual queries in the union, would it make a difference performance-wise?
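
One way to answer this empirically rather than in theory (all names below are hypothetical stand-ins for the real partitioned table) is to explain the tool-generated statement and check whether the index_field predicate appears in the "Predicate Information" of each UNION branch:

EXPLAIN PLAN FOR
SELECT *
FROM  (SELECT * FROM part_tab WHERE partition_key <  DATE '2013-06-01'
       UNION
       SELECT /*+ NO_INDEX(part_tab) */ *
       FROM   part_tab WHERE partition_key >= DATE '2013-06-01')
WHERE index_field = 'X';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);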

View 2 Replies View Related

Data Warehouse - How To Insert / Access Data In The Tables

May 30, 2011

I created a data warehouse in Oracle 10g with three dimensions and one cube; after that, it creates four tables. How do I use INSERT SQL statements to insert data into those tables, and how do I access the data afterwards?
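The generated tables are ordinary relational tables, so plain INSERT and SELECT statements work; the only rule is that each fact row must carry the keys of the dimension rows it refers to. A sketch with hypothetical table and column names (not the names OWB generated):

INSERT INTO time_dim (time_key, price_year, price_month)
VALUES (202001, 2020, 1);

INSERT INTO sales_fact (time_key, product_key, region_key, amount)
VALUES (202001, 10, 5, 1234.56);

SELECT t.price_year, t.price_month, f.amount
FROM   sales_fact f
JOIN   time_dim   t ON t.time_key = f.time_key;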

View 7 Replies View Related

Move Data From Physical StandBy To Data Warehouse?

Oct 1, 2012

We have this architecture:

OLTP DB --> OLTP DB (physical standby, Active Data Guard) --> data warehouse DB

We are only allowed to connect to the OLTP DB (physical standby, Active Data Guard) from the data warehouse DB. Is there any possibility of using some "native" Oracle method of data extraction (replication) from the OLTP DB (physical standby, Active Data Guard) to the data warehouse DB?

As far as I know we cannot create a materialized view log on the OLTP DB (physical standby, Active Data Guard) in order to do data replication, but maybe there are other ways?
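
Since an Active Data Guard standby is read-only, the simplest "native" option is a pull from the warehouse side over a database link pointing at the standby. A sketch only; the link name, credentials, TNS alias, table names and the incremental filter column are all assumptions:

CREATE DATABASE LINK stby_link
  CONNECT TO ro_user IDENTIFIED BY ro_password
  USING 'STBY_TNS_ALIAS';

INSERT /*+ APPEND */ INTO dw_stage_orders
SELECT *
FROM   orders@stby_link
WHERE  last_update_ts > (SELECT NVL(MAX(last_update_ts), DATE '1900-01-01')
                         FROM   dw_stage_orders);
COMMIT;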

View 3 Replies View Related

SQL & PL/SQL :: Clob To Flat File?

Nov 20, 2012

I just wanted to export a CLOB field to a .txt file, but the length of the CLOB field exceeds the 32767 limit, so only partial data is exported to the flat file. Is there any way to export all the data in the CLOB field, irrespective of its size or length?

The length of the CLOB is 301829.

l_file := UTL_FILE.fopen('LOCALDIR', '3.txt', 'w', 32767);
l_pos := 1;
WHILE l_pos <= DBMS_LOB.getlength(v_text_exp) LOOP
l_amount := 32767;                 -- maximum characters per read
DBMS_LOB.read (v_text_exp, l_amount, l_pos, l_buffer);
UTL_FILE.put(l_file, l_buffer);
UTL_FILE.fflush(l_file);           -- UTL_FILE buffers at most 32767 bytes between flushes
l_pos := l_pos + l_amount;
END LOOP;
UTL_FILE.fclose(l_file);

View 14 Replies View Related

SQL & PL/SQL :: Reading Flat File With Header?

Jul 3, 2012

I am working on loading data from a flat file into a table, and the validation conditions given are listed below. I checked the UTL_FILE built-in package but could not figure out how to identify the header record in the flat file. A sketch of one approach follows the list of rules below.

1. Skip the header, if any. The header is the first record, and starts with '000'
2. Skip the trailer, if any. The Trailer is the last record, and starts with '999'
3. Log an error, but continue if a line exceeds 512 characters
4. Log an error, but continue if a line is blank
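
A sketch that applies those four rules with UTL_FILE; the directory object DATA_DIR, the file name, and the log_error/process_line procedures are assumptions standing in for whatever logging and insert logic the real loader uses:

DECLARE
  l_file UTL_FILE.file_type := UTL_FILE.fopen('DATA_DIR', 'input.dat', 'r', 32767);
  l_line VARCHAR2(32767);
BEGIN
  LOOP
    BEGIN
      UTL_FILE.get_line(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;          -- end of file
    END;
    IF l_line LIKE '000%' OR l_line LIKE '999%' THEN
      NULL;                                  -- rules 1 and 2: skip header / trailer
    ELSIF l_line IS NULL THEN
      log_error('blank line');               -- rule 4: assumed logging procedure
    ELSIF LENGTH(l_line) > 512 THEN
      log_error('line longer than 512 characters');   -- rule 3
    ELSE
      process_line(l_line);                  -- assumed parse/insert procedure
    END IF;
  END LOOP;
  UTL_FILE.fclose(l_file);
END;
/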

View 5 Replies View Related

How To Calculate Sizes Of Archive / Redo In Data Warehouse DB

May 24, 2011

Before I begin, I want to clarify that I am a newbie in data warehouse administration. I need to know how to calculate the sizes of the archive logs and redo logs on a data warehouse DB, in order to do an initial sizing of the database at the disk level.

Is there a formula to calculate the size?
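
There is no single formula, since redo volume depends on the load pattern, but on an existing system of similar workload the daily archive volume can be measured directly and then used as the sizing input. A sketch (requires that the reference database is already in archivelog mode):

SELECT TRUNC(completion_time)                          AS day,
       ROUND(SUM(blocks * block_size) / 1024 / 1024)   AS archived_mb
FROM   v$archived_log
GROUP  BY TRUNC(completion_time)
ORDER  BY 1;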

View 1 Replies View Related

SQL & PL/SQL :: Producing Pipe Delimited Flat File

Oct 20, 2011

I have a task to code a procedure and function in SQL Developer that will extract data within a date range (Jan 1 to April 3) from a source (source_name: expenses) and produce a text file in pipe-delimited format.
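
A minimal sketch of the procedure half, assuming a directory object OUT_DIR, the column names on the expenses source, and date-range parameters (all assumptions):

CREATE OR REPLACE PROCEDURE export_expenses (
  p_from IN DATE,
  p_to   IN DATE) AS
  l_file UTL_FILE.file_type := UTL_FILE.fopen('OUT_DIR', 'expenses.txt', 'w', 32767);
BEGIN
  FOR r IN (SELECT expense_id, expense_date, amount
            FROM   expenses
            WHERE  expense_date BETWEEN p_from AND p_to) LOOP
    UTL_FILE.put_line(l_file,
      r.expense_id || '|' || TO_CHAR(r.expense_date, 'YYYY-MM-DD') || '|' || r.amount);
  END LOOP;
  UTL_FILE.fclose(l_file);
END;
/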

View 2 Replies View Related

Performance Tuning :: Best Disk Config For SME Scale Data Warehouse

May 15, 2013

We are working on a data warehouse (around 50 GB) architecture with the following acquired environment:

Single server X3650 M4, dual CPU (16 cores in total) with 48 GB RAM
Oracle standard 10g x64
Windows 2008 x64
128 SSD x 8
IBM ServeRAID M5110e SAS/SATA Controller

Due to budget concerns, we will be running the app server (BusinessObjects 4.0 with Tomcat) and the DB server on the same machine. We have a user base of around 30 people on the app server.

We intend to have external redundancy using the IBM RAID card in a RAID 10 configuration. I wonder what kind of disk configuration yields better performance if we only have write updates in the morning and 95% reads for the rest of the day?

RAID 1 for OS (128 SSD x 2, including DB log file)
RAID 10 for DB server (128 SSD x 6)

I have heard that ASM provides better disk management, but I wonder whether it increases performance in any way.

View 2 Replies View Related

SQL & PL/SQL :: Debugging Stored Procedure / Populate Data Warehouse Dimension

Nov 20, 2011

The following code is a stored procedure I plan to use to populate a data warehouse dimension using data from two OLTP tables which already exist in my database. Notice that in my cursor's SELECT statement, I calculate an attribute using SUBSTR and INSTR, and I also assign a true or false value to a flag using a CASE statement.

CREATE OR REPLACE PROCEDURE populate_product_dimension
AS
v_Count NUMBER := 0;
v_NumRecs NUMBER;
/*Declare a cursor on the following query which returns multiple rows of data from the product and price_hist tables*/
[code]....

In my mind, Product_Code is declared correctly in the Cursor declaration Select statement.
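
For reference, the kind of expression a cursor SELECT can compute directly (column names here are illustrative only, not the poster's actual product/price_hist columns):

SELECT product_code,
       SUBSTR(product_code, 1, INSTR(product_code, '-') - 1) AS product_family,
       CASE WHEN list_price > 100 THEN 'Y' ELSE 'N' END      AS premium_flag
FROM   product;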

View 10 Replies View Related

SQL & PL/SQL :: Unable To Load A Flat File Through Oracle Loader?

Sep 28, 2011

Issue: Unable to load a flat file through Oracle Loader

Below is the script that is being used:

drop table dl_fact_fac_data_xtern;
create table dl_fact_fac_data_xtern
(

[Code].....

After running this script, it reports that the table has been created; but once I issue a SELECT against the table I receive the following errors (a minimal external-table definition for comparison appears after the error listing):

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "data": expecting one of: "double-quoted-string, identifier, single-quoted-string"
KUP-01007: at line 10 column 11
ORA-06512: at "SYS.ORACLE_LOADER", line 19
29913. 00000 - "error in executing %s callout"
*Cause: The execution of the specified callout caused an error.
*Action: Examine the error messages and take appropriate action.
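
KUP-01005 means the ORACLE_LOADER driver could not parse the ACCESS PARAMETERS block (here it stumbled on the word "data" at line 10), usually because of a stray keyword, a missing quote, or clauses in the wrong order. For comparison, a minimal definition whose access parameters parse cleanly; the directory object, columns and file name are placeholders, not the poster's real ones:

CREATE TABLE dl_fact_fac_data_xtern (
  fac_id    NUMBER,
  fac_name  VARCHAR2(100),
  fac_value NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('fac_data.dat')
)
REJECT LIMIT UNLIMITED;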

View 2 Replies View Related

PL/SQL :: Create Multiple External Tables From Same Flat File?

Nov 28, 2012

Using Oracle 10g. I currently create many external tables like so:

CREATE TABLE "XT_UNITS"
(

"Q1_2012" VARCHAR2(25 BYTE),
"Q2_2012" VARCHAR2(25 BYTE),
"Q3_2012" VARCHAR2(25 BYTE),
"Q4_2012" VARCHAR2(25 BYTE)
[code]....

Is there any way I can use one flat file (csv) to populate many external tables?
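
Yes: several external tables can reference the same file in their LOCATION clause, and each table can expose its own column list or filter the records it sees with a LOAD WHEN clause in its access parameters. A sketch with an assumed directory object and file name:

CREATE TABLE xt_units (
  "Q1_2012" VARCHAR2(25 BYTE), "Q2_2012" VARCHAR2(25 BYTE),
  "Q3_2012" VARCHAR2(25 BYTE), "Q4_2012" VARCHAR2(25 BYTE))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('units.csv'));

CREATE TABLE xt_units_copy (
  "Q1_2012" VARCHAR2(25 BYTE), "Q2_2012" VARCHAR2(25 BYTE),
  "Q3_2012" VARCHAR2(25 BYTE), "Q4_2012" VARCHAR2(25 BYTE))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('units.csv'));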

View 4 Replies View Related

PL/SQL :: NLS Parameters - Exporting Couple Of Views Into Flat File

Jan 11, 2013

In one of my projects I am exporting a couple of views into a flat file. The export utility is generic and uses dynamic SQL to generate the flat file. We have a test environment and a production environment, and the code is the same on both. We noticed that the output differs between the environments, although it is supposed to be identical. If I export a view in production I get a record like this:

0020110107O0000000001|OTHER|07.01.11 08:06:00,296000|07.01.11 08:04:41,008000||0|0|EUR||NOT_FROZEN|MVOIP||IS_NORMAL_VERSION|MODIFIED|6863475590797607166|8648564326455689103|8011808169304472215|||CCP||||10000580||||DEKA|PS

In the test environment it will be like this:

0020110107O0000000001|OTHER|07-JAN-11 08.06.00.296000 AM|07-JAN-11 08.04.41.008000 AM||0|0|EUR||NOT_FROZEN|MVOIP||IS_NORMAL_VERSION|MODIFIED|6863475590797607166|8648564326455689103|8011808169304472215|||CCP||||10000580||||DEKA|PS

The code I am running does not change any settings explicitly. It looks like this and is run via EXECUTE IMMEDIATE:

DECLARE
v_sql         VARCHAR2 (32000);
v_sql_count   NUMBER             := 0;
v_error       VARCHAR2 (4000);
v_new_file    UTL_FILE.file_type;
BEGIN
[code]........
  
I also tried the following in production in order to make it match the test environment:

BEGIN
EXECUTE IMMEDIATE    'ALTER SESSION SET NLS_LANGUAGE = AMERICAN '
|| 'NLS_NUMERIC_CHARACTERS = ''.,'' '
|| 'NLS_TIMESTAMP_FORMAT = ''DD-MON-RR HH.MI.SSXFF AM''';
END;

This changes the formatting of the timestamp columns for almost all files. Almost: two of those files remain unchanged and still show the decimal separator from the old setting:

0020110107O0000000001|OTHER|07-JAN-11 08.06.00,296000 AM|07-JAN-11 08.04.41,008000 AM||0|0|EUR||NOT_FROZEN|MVOIP||IS_NORMAL_VERSION|MODIFIED|6863475590797607166|8648564326455689103|8011808169304472215|||CCP||||10000580||||DEKA|PS
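
An alternative that sidesteps session NLS settings altogether (a sketch; the view and column names are assumptions) is to have the generated SELECT format timestamp columns explicitly, so the flat file looks the same in every environment regardless of what ALTER SESSION did or did not take effect:

SELECT TO_CHAR(created_ts,
               'DD-MON-RR HH.MI.SS.FF6 AM',
               'NLS_DATE_LANGUAGE=AMERICAN') AS created_ts
FROM   some_view;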

[URL]......

[URL]......

[URL].....

[URL]......

View 2 Replies View Related

SQL & PL/SQL :: How To Load Flat File Into Database At Regular Interval Time

Apr 5, 2011

How can we load a flat file into a database at a regular time interval?
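
One common pattern (a sketch; the external table EXT_TAB, target table TARGET_TAB, job name and hourly frequency are all assumptions): expose the file through an external table and let DBMS_SCHEDULER run the insert on the chosen schedule.

BEGIN
  DBMS_SCHEDULER.create_job(
    job_name        => 'FLAT_FILE_LOAD',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN INSERT INTO target_tab SELECT * FROM ext_tab; COMMIT; END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=HOURLY; INTERVAL=1',
    enabled         => TRUE);
END;
/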

View 2 Replies View Related

SQL & PL/SQL :: Creating Store Procedure That Will Accept A Username From A Flat File?

Apr 8, 2012

I'm trying to create a stored procedure that will accept usernames from a flat file, but I don't know how to read the file into the stored procedure.

Below is a sample stored procedure I created on its own to add a user. It compiled okay, but when I execute it I get the error displayed below.

create or replace procedure addUsers(userNam in varchar2)
is
begin
EXECUTE IMMEDIATE 'CREATE USER ' || userNam || ' IDENTIFIED BY "pass1234"' ||
' DEFAULT TABLESPACE USERS QUOTA 1M ON USERS' ||
' PASSWORD EXPIRE';
end addUsers;
/

[code].....
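
For the file-reading part, a sketch of a driver block using UTL_FILE (the directory object USER_DIR and file name are assumptions; the owner of addUsers also needs CREATE USER granted directly, not via a role):

DECLARE
  l_file UTL_FILE.file_type := UTL_FILE.fopen('USER_DIR', 'users.txt', 'r');
  l_name VARCHAR2(128);
BEGIN
  LOOP
    BEGIN
      UTL_FILE.get_line(l_file, l_name);   -- one username per line
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;
    END;
    addUsers(TRIM(l_name));
  END LOOP;
  UTL_FILE.fclose(l_file);
END;
/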

View 21 Replies View Related

Server Utilities :: Import A Flat File Into Oracle And Update Another Table

Jul 5, 2013

I have a text file called ReturnedFile.txt. It is a comma-separated text file that contains records with two fields: Envelope and Date Returned.

At the same time, I have a table in Oracle called Manifest. This table contains the following fields:

Envelope

DateSentOut
DateReturned

I need to write something that imports ReturnedFile.txt into a temporary Oracle table named UploadTemp and then compares the Envelope field in UploadTemp with the Envelope field in Manifest. If it's a match, then the DateReturned field in Manifest needs to be updated with the DateReturned value from UploadTemp.

I've done this with SQL Server no problem, but I've been trying for two days to make this work with Oracle and I can't figure it out. I've been trying to use SQL*Loader, but I can't even get it to run properly on my machine.

I did create a Control file, saved as RetFile.ctl. Below is the contents of the CTL file:

LOAD DATA
INFILE 'C:OracleTestReturnedFile.txt'

APPEND
INTO TABLE UploadTemp
FIELDS TERMINATED BY ","
(
ENVELOPE,
DATERETURNED
)

If I could get SQL*Loader running, below is the code I came up with to import the text file and then to do the compare to the Manifest table and update as appropriate:

sqlldr UserJoe/Password123 CONTROL=C:OracleTestRetFile.ctl LOG=RetFile.log BAD=RetFile.bad

update Manifest m set m.DateReturned =
(select t.DateReturned
from UploadTemp t
where m.Envelope = t.Envelope)
where exists
(select 1 from UploadTemp t2
where m.Envelope = t2.Envelope)

That's all I got. As I said, I can't find a way to test it and I have no idea if it's even close.
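
Once the rows are in UploadTemp, the compare-and-update can also be done in a single statement with MERGE (a sketch of the same logic as the correlated update above):

MERGE INTO Manifest m
USING UploadTemp t
ON (m.Envelope = t.Envelope)
WHEN MATCHED THEN
  UPDATE SET m.DateReturned = t.DateReturned;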

View 2 Replies View Related

Server Utilities :: Flat File Tab Delimiter Import Messed Up With SQL Loader

Jan 4, 2012

I am trying to import the data below.

Flat File

PARENTCHILDALIAS
PLAN_PCOTDefaultPlanning Customer
1001_BTPCOTDefaultGeneral Planning Customer
2000_BTPCOTDefaultNational Account Planning Customer
3000_BTPCOTDefaultDistributor Planning Customer
3010_BTPCOTDefaultEducation Planning Customer
3020_BTPCOTDefaultResearch Planning Customer
OPT1_PCOTDefaultOption 1 Planning customer
OPT2_PCOTDefaultOption 2 Planning customer
OPT3_PCOTDefaultOption 3 Planning customer

The problem is when you try to import into a table which has the same columns. I skipped the first line when loading. The issue is that the second field is getting split into two columns; e.g. "Default" goes to CHILD and the remainder goes to ALIAS.

In fact there is a tab at the end of each line. How do I set the SQL*Loader settings correctly so that the value is populated in the CHILD column only?

OPTIONS ( SKIP=1)
LOAD DATA
INFILE 'FlatFile.txt'
BADFILE 'FlatFile.bad'
DISCARDFILE 'FlatFile.dsc'

INTO TABLE "table"
FIELDS TERMINATED BY X'9'
OPTIONALLY ENCLOSED BY "''" TRAILING NULLCOLS
(PARENT,
CHILD,
ALIAS CONSTANT '')

View 5 Replies View Related

Server Utilities :: Load Selected Records From Flat File Using SQL Loader

Nov 24, 2011

Load the selected records from the flat file using SQL*Loader.

I have a flat file that has 100 records; I want to load only the first 10 records from the file using SQL*Loader.
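
SQL*Loader can stop after a fixed number of records with the LOAD parameter, either in the control file OPTIONS clause or on the command line. A sketch (user, file and table names are placeholders):

OPTIONS (LOAD=10)
LOAD DATA
INFILE 'input.dat'
INTO TABLE target_tab
FIELDS TERMINATED BY ','
(col1, col2)

-- or, on the command line: sqlldr scott/tiger control=input.ctl load=10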

View 8 Replies View Related

PL/SQL :: Spool Data From Tables Into Flat Files

Oct 23, 2012

I am trying to spool data from tables into flat files. I am using the following scripts to accomplish it

1. A cmd file (Windows) that makes a call to a SQL file
2. The SQL file, which generates another query file at run time, depending on the table name passed to it
3. The run-time query file, which executes the final query and spools the data into a pipe-delimited .txt file

For e.g. :

Actual command passed C:Spool_utilityspool_utility TABLE_NAME

E.g. of the spool utility file :

@echo off

SET dbuser=XX@YY
SET dbpw=xxxx

echo %date% - %time% - Start > %1%log.txt

echo START

sqlplus -s %dbuser%/%dbpw% @spool_utility.sql %1>%1.txt

echo %date% - %time% - Done >> %1%log.txt
echo DONE

E.g. of the spool_utility.sql

set echo off
SET newpage 0
SET feedback off
SET linesize 32767
set pagesize 0
[code]........

The above generates a table_name.sql file with the actual table name at run time, which is then executed, and the output is written to the table_name.txt file.

This works perfectly fine. But the issue is that when someone passes a wrong table name, or there is an actual run-time error while executing the query, the error details themselves get written to the end of the spool file.

For example, if I do the following just to generate an error and execute it from the command line, the query generates an error and writes the error to the spool file, but at the command prompt where I executed the command I do not see any error and the process seems to have run perfectly well:

set xxx on xxx off as above

spool &1.sql;

Prompt Select * from &1 where rownum><10---this will cause the issue

spool off
set termout ON
@ &1

EXIT

Eg of spool file generated :

from     table_name WHERE rownum><=10 *
ERROR at line 62:
ORA-00936: missing expression

My question is: is there any way I can capture this run-time error and return it to my calling SQL script spool_utility.sql, then propagate it to the calling command file and do some tasks, e.g. removing the spool file and writing the actual error to a log file? Basically, any way to know at the OS calling level that the whole spooling operation was unsuccessful.
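
One common approach (a sketch only; it changes both scripts and keeps the existing file names): tell SQL*Plus to exit with a non-zero code on any error, then test that code in the calling .cmd file.

-- near the top of spool_utility.sql
WHENEVER SQLERROR EXIT SQL.SQLCODE
WHENEVER OSERROR  EXIT FAILURE

In the .cmd file, after the sqlplus call, ERRORLEVEL then reflects the failure:

sqlplus -s %dbuser%/%dbpw% @spool_utility.sql %1>%1.txt
if errorlevel 1 (
   del %1.txt
   echo %date% - %time% - FAILED >> %1%log.txt
)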

View 8 Replies View Related

Export/Import/SQL Loader :: Migrating Data From Solid Database To Oracle - Flat Files

Aug 14, 2012

I am migrating data from a Solid database to Oracle, using flat files to do it.

1.- I download the data to flat files from Solid
2.- I move the files to Oracle server
3.- I upload the data to Oracle

Now, I have done 90% of the database, but I have found some tables that have description columns, and in these descriptions the users enter line breaks, so when I try to upload the data to Oracle, SQL*Loader cannot handle these characters.

Example:

'25','0.','5.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'26','0.','2.','0.','0.','0.','0.','3.','0.','0.','0.','0.','0.','',''
'27','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'28','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'29','0.','38.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'30','0.','13.','0.','0.','0.','0.','0.','6.','0.','6.','0.','0.','|SE RECHAZA B20CS50SNW ^M
^M
SE RECHAZAN CINCO PZAS ^M
DOS MOD. HSC15I41EH,DOS MOD. HSK15I41EH |Agregó: 06/06/2009 12:22:50
|','DEV. A PROV.'
'31','0.','50.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'32','0.','9.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'33','0.','2.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''

How can I solve this ?
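
The usual workaround is to re-export the data with an explicit end-of-record marker that cannot appear in the descriptions, so the embedded newlines stop being treated as record boundaries, and then declare that marker in the INFILE clause with the "str" record format. A sketch; the marker string, file, table and columns are assumptions:

LOAD DATA
INFILE 'solid_export.dat' "str '|\n'"
INTO TABLE target_tab
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY "'"
(col1, col2, col3, description)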

View 3 Replies View Related

Oracle Warehouse Builder 11.2 - Deploy To New Repository?

Feb 24, 2012

I have a big problem with OWB 11.2.

1. I exported an MDL file from OWB 10.2.
2. I created an OWB repository on the server where I have installed my 11gR2 database.
3. I installed OWB 11gR2 on another machine.
4. I imported the MDL file into the new repository.

How do I deploy to my new repository?

View 2 Replies View Related

Building A Machine For Express

Sep 15, 2010

I'm about to build a machine whose primary focus will be running a 'hobby' app that uses Oracle Express as its DB. Some info on the DB: the main table has 30 fields and 500K records; all other tables add up to about 20% of that size.

I run huge queries that return most of the DB in about 200K rows with 50 fields. These queries contain many analytic ('partition by') functions such as RANK, PERCENT_RANK, etc. My old PC really struggles with this stuff, and I'm going to invest about 600 euro (excluding monitor etc.) in a new machine.

In general terms, what should I get? I'm thinking about the OS, hard drives (one for data, one for indexes?), a multi-core processor, and the type and size of RAM. Any other considerations I should have?

View 3 Replies View Related

Application Express :: Building App That Connects To DB2 / UDB?

Jul 30, 2012

I have no experience with APEX but have been browsing the documentation while considering its use as a web application/dashboard reporting utility.

However - I'm wondering if it allows for the ability to connect to DB2 as well as Oracle...

i.e. building a form that allows a user to 'pull' data for analysis from a DB2 instance and generate reporting results that would then be stored in an Oracle instance.

View 4 Replies View Related

Reports & Discoverer :: Building Across / Down Group Report?

Nov 25, 2012

I have an across-group report, but I can't create the design/layout I want (explained in more detail in the images linked below).

Current report when it's running:

[URL]

The final layout that I want:

[URL]

Environments:

- Oracle Developer Reports 10g R2
- Oracle Database 10gR2

View 2 Replies View Related

Server Utilities :: Datapump And Index Building

Jan 13, 2012

Currently we are using the exp and imp utilities to unload from production and load into the Dev server. While importing, we follow the steps below:

(1) Load only data [by specifying INDEXES=N in the par file]
(2) Unlock statistics
(3) Load indexes, other objects [by specifying ROWS=N]

After doing these steps, the data, indexes and other objects are all loaded. To verify the indexes, we check DBA_INDEXES.

DBA_INDEXES :
-------------
OWNER INDEX_NAME TABLE_NAME STATUS LAST_ANALYZED
----- ---------- ---------- ------ -------------
MYSCH CP_INDEX_1 CP_TABLE_1 VALID 14/JAN/12

Question :-

(1) Does the imp utility rebuild the indexes while loading data, or does it simply take the rows from the dump and load them into the test system without rebuilding the indexes from scratch?

(2) I am trying to replace exp and imp with the Data Pump utilities, but I am confused about the parameters to be used.

(a) Can I load both data and metadata at the same time (using the CONTENT=ALL option)?
(b) I am planning to implement this in two steps:

first load only metadata using - CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE

then, load data - CONTENT=DATA_ONLY.
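
For step (b), a sketch of the corresponding impdp calls (the directory object, dump file, log names and credentials are placeholders, and the dump must have been produced by expdp, not exp):

impdp myuser/password DIRECTORY=dp_dir DUMPFILE=prod.dmp LOGFILE=imp_meta.log CONTENT=METADATA_ONLY TABLE_EXISTS_ACTION=REPLACE

impdp myuser/password DIRECTORY=dp_dir DUMPFILE=prod.dmp LOGFILE=imp_data.log CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=APPEND

EXCLUDE=INDEX and EXCLUDE=STATISTICS are also available if indexes or statistics should be handled separately from the rest of the metadata.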

View 1 Replies View Related






