I have created a soft link to a data file orig_file.dat using the command ln -s orig_file.dat orig_file_link.dat. Is it now possible to create an external table using this linked file, orig_file_link.dat?
I am using the linked file to minimise space usage, because every month we get a different data file, and external table creation is done as a batch job that copies the data file to another file whenever a new one arrives and then creates the external table on the new file. So I need to check whether it is possible to use the linked file instead of copying the file.
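For what it's worth, the ORACLE_LOADER driver only sees the file name inside the directory object, so a symlink generally works as long as the operating-system user running the database can follow it to the target. A minimal sketch, assuming a directory object named EXT_DATA_DIR and an invented two-column layout:

CREATE TABLE orig_data_ext (
  record_date  DATE,
  record_value NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (record_date  CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD",
     record_value)
  )
  LOCATION ('orig_file_link.dat')  -- the symlink name from the post
)
REJECT LIMIT UNLIMITED;

Querying the table reads through the link, so each month you could re-point the link at the new data file instead of copying it.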
While creating the external table, how can I specify a particular decode condition for a date field? It comes in as '2099-99-99' and I want to change it to '2099-01-0001'. How can I translate it?
I already have this in the access parameters:
Incoming_DATE CHAR(20) DATE_FORMAT DATE MASK "YYYY/MM/DD"
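One hedged option: the ORACLE_LOADER access parameters have no DECODE, so you could load the raw value as a plain character field and do the translation in SQL when reading the external table. A sketch, assuming the external table is called my_ext, the raw column is incoming_date_raw defined as CHAR(20) with no date mask, and the sentinel should become 1 January 2099:

SELECT CASE
         WHEN incoming_date_raw = '2099-99-99'
         THEN DATE '2099-01-01'                      -- assumed replacement value
         ELSE TO_DATE(incoming_date_raw, 'YYYY-MM-DD')
       END AS incoming_date
FROM   my_ext;

Note the mask here uses dashes to match the sample value; the "YYYY/MM/DD" mask in the post would only fit slash-separated data.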
I made a directory and an external table pointing to a file in the created directory. It appears that using a UNC path in my directory won't work with my external table, although the UNC path points to the same server where the database server is located. Is my assumption correct?
create or replace directory DIVA_POK as '\tvm011gm_diva_dir$ronnendbga_pok'
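A hedged aside: on Windows the deciding factor is usually the account the Oracle service runs under, since the default LocalSystem account typically cannot authenticate to a network share, even one on the same machine, while a local path always works. A hypothetical comparison, path names invented for illustration:

-- Directory over a local path on the database server (usually works):
CREATE OR REPLACE DIRECTORY diva_pok AS 'D:\diva\dbga_pok';

-- Directory over a UNC share (only works if the Oracle service account
-- can reach the share; LocalSystem generally cannot):
-- CREATE OR REPLACE DIRECTORY diva_pok AS '\\fileserver\share\dbga_pok';

GRANT READ, WRITE ON DIRECTORY diva_pok TO some_user;  -- some_user is a placeholder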
I am using an external table with numbers, joining the external table to fetch data, and inserting into another table. I am altering the table within a procedure to pass the CSV file name as a parameter, as it will change each time. It is working fine, but in the case where my grouping id has two GIDs, the header from the CSV file is also inserted as a row into coupon_personlization_members.
I gave SKIP 1 in the external table, but for the second GID the header is inserted as a row. I don't want the header to be inserted as a row, but the header will be there in the CSV file.
Here is my procedure: CREATE OR REPLACE PROCEDURE proc_coupon_load( p_grouping_id NUMBER, p_file_name VARCHAR2)
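A hedged guess at the cause: ALTER TABLE ... ACCESS PARAMETERS replaces the entire clause, so if the procedure rewrites the parameters for each file and the rewritten text does not repeat SKIP 1, the header handling is silently lost from the second load onwards. A sketch that re-states SKIP 1 alongside each location change; object and column names are assumed:

CREATE OR REPLACE PROCEDURE proc_coupon_load(
  p_grouping_id NUMBER,
  p_file_name   VARCHAR2
) AS
BEGIN
  -- Repeat SKIP 1 every time the access parameters are rewritten.
  EXECUTE IMMEDIATE
    'ALTER TABLE coupon_ext ACCESS PARAMETERS (' ||
    'RECORDS DELIMITED BY NEWLINE SKIP 1 ' ||
    'FIELDS TERMINATED BY '','' OPTIONALLY ENCLOSED BY ''"'')';
  EXECUTE IMMEDIATE
    'ALTER TABLE coupon_ext LOCATION (''' || p_file_name || ''')';

  INSERT INTO coupon_personlization_members (grouping_id, member_id)
  SELECT p_grouping_id, e.member_id   -- column list is a placeholder
  FROM   coupon_ext e;
END proc_coupon_load;
/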
I am trying to create an external table. The table is getting created, but there is no data. Checking the log file, it says:

KUP-04021: field formatting error for field CUST_CODE
KUP-04026: field too long for datatype
KUP-04101: record 1 rejected in file /data/mmi_mig_data/PCOM_CUSTOMER_1.csv
KUP-04021: field formatting error for field CUST_CODE
KUP-04026: field too long for datatype
KUP-04101: record 2 rejected in file /data/mmi_mig_data/PCOM_CUSTOMER_1.csv

But as far as I can see, everything in the CSV file looks okay. External table:
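KUP-04026 usually means the field as parsed exceeds its declared length, which is more often a delimiter mismatch (the whole line lands in CUST_CODE) than bad data. A hedged sketch of access parameters that commonly resolves it; the sizes and field list are illustrative:

ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  BADFILE 'pcom_customer.bad'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
  (
    cust_code CHAR(100)  -- size the CHAR explicitly; an unterminated field
                         -- overflows the 255-byte default and raises KUP-04026
  )
)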
Anyway, I've loaded 5 .csv files through an external table, and afterwards I tried to delete them. But I get the error "Cannot delete 'filename': It is being used by another person or program". I closed SQL Developer and tried deleting them manually again, with the same result. After restarting, I managed to delete one .csv, but when I opened SQL Developer and tried deleting the other files, I couldn't.
The question is: can files that were used by external tables not be deleted while SQL Developer is running? The thing is that I've created a stored procedure that deletes the files, and obviously it can't work. So I would have to restart the computer and then delete the files every time I load a CSV file.
When I recreate the external table with the NOLOGGING keyword at the bottom of the syntax, it shows "operation not supported". Otherwise, are there any other points that I should look at to reduce the execution time of the above statement?
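NOLOGGING does not apply to external tables (they generate no redo to begin with), so the usual levers are on the read side and on the target insert. A hedged sketch, assuming the slow statement is an INSERT ... SELECT from the external table; the table names are placeholders:

ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND */ INTO target_tab          -- direct-path insert, minimal redo
SELECT /*+ PARALLEL(e, 4) */ *                -- parallel read of the external table
FROM   ext_tab e;

COMMIT;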
I just posted another topic where I heard about external tables, and I had a few questions concerning them. I thought it was best to create a new topic rather than continue on the other one...
I noticed that to create an external table the CTL is like this:

CREATE TABLE emp_load (FIELDS description)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_tab_dir
  ACCESS PARAMETERS (
    RECORDS FIXED 62
    FIELDS (employee_number CHAR(2),
[Code]...
1) This creates an external table, but is it possible to create a normal table in a CTL file? For physical tables, the table has to exist first, right?
2) If you create a view linked to 2 external tables and the CSV files are updated each day, will the external tables be updated automatically, and will the view be updated as well?
3) Couldn't there be any synchronisation problems?
4) What happens if someone runs a select (or queries the view) while the CSV file is being updated?
5) Is there any way to protect access to those tables/views while the CSVs are being updated?
6) Is it possible to create an index on this sort of table?
7) Is it possible to index a view?
8) Are external tables visible in a tool like SQL Developer?
I have the following situation: there is a directory named /dat/global/stock/, and inside it I will get files named differently, for example abcdef.112, dfgrt.2, ...
Here I want to load these files one by one into the external table and generate one more file based on some enrichment.
Step 1: take the first file and load it into the ext table. Step 2: enrichment. Step 3: file generation.
Now the problem I am facing is that I usually get about 1000 files in that particular directory, so I need to take the files one by one and put each into one more directory. How can I get the files one by one and generate the output file using the Oracle loader? (A sketch follows below.)
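A hedged sketch of the per-file loop: the file names come from a listing table (populated externally, e.g. by a shell script, since plain PL/SQL cannot list a directory without a Java or preprocessor helper), each name is swapped into the external table's LOCATION, and the processed file is then moved aside with UTL_FILE.FRENAME. All object names are invented:

DECLARE
  CURSOR c_files IS
    SELECT file_name FROM stock_file_list ORDER BY file_name;
BEGIN
  FOR r IN c_files LOOP
    -- Point the external table at the next file.
    EXECUTE IMMEDIATE
      'ALTER TABLE stock_ext LOCATION (''' || r.file_name || ''')';

    -- Step 2: enrichment into a real table.
    INSERT INTO stock_enriched
    SELECT e.*, SYSDATE FROM stock_ext e;

    -- Step 3: the generated file would be written here, e.g. with UTL_FILE.

    -- Move the processed file into the "done" directory object.
    UTL_FILE.FRENAME('STOCK_DIR', r.file_name,
                     'STOCK_DONE_DIR', r.file_name, TRUE);
    COMMIT;
  END LOOP;
END;
/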
So I'm here creating some tables, and I'm wondering if there's a guide where I can take a look at some constraint declarations. Mostly it's because I have some values in the tables that cannot be negative, so I need to set constraints forcing them to be positive.
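A minimal sketch of the usual tool for this, a CHECK constraint; the table and column names are invented:

CREATE TABLE account (
  account_id NUMBER PRIMARY KEY,
  balance    NUMBER NOT NULL,
  CONSTRAINT chk_balance_nonneg CHECK (balance >= 0)
);

-- The same rule can be added to an existing table:
-- ALTER TABLE account ADD CONSTRAINT chk_balance_nonneg CHECK (balance >= 0);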
I am trying to create a new user who will be able to delete from all of the tables that only I myself created. I created the user and granted access, but realized I may not have done it right.
Is it simply SQL> grant delete on <TABLESPACE> to <new user>;? Or do I need to specify the grant of delete on my tablespace to the new user differently?
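For the record, DELETE is an object privilege granted per table; there is no GRANT DELETE ON <tablespace>. A sketch that loops over your own tables, with new_user as a placeholder:

BEGIN
  FOR t IN (SELECT table_name FROM user_tables) LOOP
    EXECUTE IMMEDIATE
      'GRANT DELETE ON "' || t.table_name || '" TO new_user';
  END LOOP;
END;
/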
I am facing a problem while creating a program using Oracle Forms. I am using these two tables in the program:

SQL> desc electricity;
 Name             Null?    Type
 ---------------- -------- ------------
 ID               NOT NULL NUMBER
 NAME                      VARCHAR2(40)
 PREV_READING              NUMBER(20)
 PRESENT_REDAING           NUMBER(20)
 TOTAL_UNITS               NUMBER(30)
 AMOUNT                    NUMBER(30)
[code]....

What I want is that whenever the user writes a value in the PRESENT_REDAING column, it stores the value in the ID table. For example, if the current month is July, then when the user writes a value in the Present Reading column, it is stored in the JULY column of the ID table. I wrote the following trigger (KEY-COMMIT) on the PRESENT_REDAING column:
IF TO_CHAR(SYSDATE, 'MON') = 'JAN' THEN
  :ID.Jan := :electricity.PRESENT_REDAING;
ELSIF TO_CHAR(SYSDATE, 'MON') = 'FEB' THEN
  :ID.Feb := :electricity.PRESENT_REDAING;
ELSIF TO_CHAR(SYSDATE, 'MON') = 'MAR' THEN
  :ID.Mar := :electricity.PRESENT_REDAING;
But when I run my form it does not work and raises an unhandled exception. I think I need to change some properties of the PRESENT_REDAING column of the electricity table.
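A hedged consolidated version of the trigger body: the month abbreviation drives which item receives the reading, with NLS pinned so TO_CHAR always returns English names. Block and item names are taken from the post; a WHEN-VALIDATE-ITEM trigger on the PRESENT_REDAING item may fit "value was just typed" better than KEY-COMMIT:

DECLARE
  v_mon VARCHAR2(3) := TO_CHAR(SYSDATE, 'MON', 'NLS_DATE_LANGUAGE=ENGLISH');
BEGIN
  IF    v_mon = 'JAN' THEN :ID.Jan := :electricity.PRESENT_REDAING;
  ELSIF v_mon = 'FEB' THEN :ID.Feb := :electricity.PRESENT_REDAING;
  ELSIF v_mon = 'MAR' THEN :ID.Mar := :electricity.PRESENT_REDAING;
  -- ... repeat for the remaining months, or build the item name
  -- dynamically with the Forms COPY built-in:
  -- COPY(TO_CHAR(:electricity.PRESENT_REDAING), 'ID.' || v_mon);
  END IF;
END;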
"Create a view named vuPassFailRate that will show the pass rate and fail rates of each test."
I have a table named Test_ID containing the following columns:
TEST_ID TEST_NAME PASSING_GRADE
And another table named Test_History containing the following columns:
TEST_ID STUDENT_ID SCORE
I'm assuming I have to create an inline view, and to work out the pass/fail rates I need to do something along the lines of (for the pass rate): where SCORE is greater than or equal to PASSING_GRADE, and TEST_ID equals TEST_ID, divide by a count of SCORE and multiply by 100. I just can't work out the nested select statements, or how to build the formula from two columns in two tables.
I have been staring at this problem for so long now I can't see the wood for the trees.
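A hedged sketch of one way to write it without nested selects, using conditional aggregation over a join; it assumes the tables are named Test_ID and Test_History exactly as described:

CREATE OR REPLACE VIEW vuPassFailRate AS
SELECT t.test_id,
       t.test_name,
       ROUND(100 * SUM(CASE WHEN h.score >= t.passing_grade THEN 1 ELSE 0 END)
                 / COUNT(*), 2) AS pass_rate,
       ROUND(100 * SUM(CASE WHEN h.score <  t.passing_grade THEN 1 ELSE 0 END)
                 / COUNT(*), 2) AS fail_rate
FROM   test_id      t
JOIN   test_history h ON h.test_id = t.test_id
GROUP  BY t.test_id, t.test_name;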
I have a database on my Windows 2003 server with Oracle 9i installed, working fine. We use client PCs on the network to access the database on the server through our application program, which is built with Java Swing. I want to create a client application which will use the browser to access the database, only for generating reports.
I created a materialized view for one of the partitioned tables, but on viewing the mview capabilities it still shows PCT set to 'N'.

create materialized view MV_summary_SEC
refresh fast
start with sysdate next sysdate + 1/24
enable query rewrite
as [code]....

If I remove the subquery and create the mview like this, then PCT is enabled:

create materialized view MV_summary_SEC
refresh fast
start with sysdate next sysdate + 1/24
enable query rewrite
as
select PERIOD, SUM(SUM_WEB_HITS)
from summary, date_table
where PERIOD >= DATE_TABLE.CUR_DATE
group by PERIOD

Is it simply because Oracle doesn't support PCT if the definition contains subqueries?
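A hedged way to get Oracle's own explanation rather than guessing: DBMS_MVIEW.EXPLAIN_MVIEW records a message for every capability it disables, including the PCT ones. The MV_CAPABILITIES_TABLE it writes to is created by the utlxmv.sql script shipped with the database:

-- One-time setup of the output table:
-- @?/rdbms/admin/utlxmv.sql

EXEC DBMS_MVIEW.EXPLAIN_MVIEW('MV_SUMMARY_SEC');

SELECT capability_name, possible, msgtxt
FROM   mv_capabilities_table
WHERE  mvname = 'MV_SUMMARY_SEC'
AND    capability_name LIKE 'PCT%';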
I am trying to restore to a backup instance on a backup server. When I try to recreate the tables I keep getting ORA-01659: unable to allocate MINEXTENTS. The tablespaces and datafiles on both servers show as the same size in OEM.
I have dropped all tables, and OEM shows the tablespaces are empty. Then I run a script to recreate all tables. Most of the tables don't get created because their tablespace is full. After the script to recreate all tables runs, the main tablespaces are full, in fact fuller than on the production machine. I have also tried ALTER TABLESPACE xxx COALESCE; on each tablespace, right after dropping all tables and before recreating them, to reclaim free space. Why is it full? I've only dropped and created the tables; there shouldn't be any data in them yet.
ORA-01659: unable to allocate MINEXTENTS beyond 2 in tablespace PLUS_T... The backup instance was already there; all I did was drop the tables. Here's what I ran on prod to build a script to recreate the tables on the backup server. Got it off Burleson somewhere.
SELECT DBMS_METADATA.GET_DDL('TABLE',u.table_name) ||'; ' FROM USER_TABLES u;
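A hedged suspicion worth ruling out: from 10g onwards, DROP TABLE moves segments to the recycle bin rather than freeing them, so the space still counts against the tablespace until it is purged. A quick check and fix, assuming the recycle bin is the culprit:

-- How much dropped-segment space each tablespace is still holding:
SELECT ts_name, COUNT(*) AS objects, SUM(space) AS blocks
FROM   dba_recyclebin
GROUP  BY ts_name;

-- Free the space (needs SYSDBA), or drop with PURGE in the first place:
PURGE DBA_RECYCLEBIN;
-- DROP TABLE some_table PURGE;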
I have a view over base tables holding historical data for the previous 60 months (one table per month), combined with UNION ALL operators. Will creating indexes on those base tables improve performance for retrieving data, or will creating a primary key with DISABLE NOVALIDATE?
The view has around 8 million rows and is used as a fact table with 4 dimension tables. A DTS package on the MSSQL side refreshes an OLAP cube by retrieving data from these tables in Oracle.
I am writing a procedure that will be called from a Java wrapper.
The procedure does a lot of data manipulation, and in between I am creating a global temp table and saving the data into it for each request that's given as a parameter to the procedure. After all the processing I have to write the data from this global temp table into a physical table and at last drop the temp table.
Create or replace procedure proc_name as
begin
  update table........
  delete from ..........

  CREATE GLOBAL TEMPORARY TABLE TSAAG (
    supplier_id   numeric(10) not null,
    supplier_name varchar2(50) not null,
    contact_name  varchar2(50)
  )

  insert into............

  drop table TSAAG;
End;
Creating a global temp table inside a procedure is expensive.
Do we have anything like creating the table beforehand and using an instance of it inside the procedure?
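For what it's worth, that is exactly how global temporary tables are designed to be used: the GTT is a permanent dictionary object created once, and only its rows are private to each session or transaction, so the procedure needs no DDL at all. A hedged sketch, with placeholder source and target tables:

-- One-time DDL, run outside the procedure:
CREATE GLOBAL TEMPORARY TABLE tsaag (
  supplier_id   NUMBER(10)   NOT NULL,
  supplier_name VARCHAR2(50) NOT NULL,
  contact_name  VARCHAR2(50)
) ON COMMIT DELETE ROWS;  -- rows vanish at commit; PRESERVE ROWS keeps them
                          -- for the whole session instead

CREATE OR REPLACE PROCEDURE proc_name AS
BEGIN
  INSERT INTO tsaag (supplier_id, supplier_name, contact_name)
  SELECT supplier_id, supplier_name, contact_name
  FROM   staging_data;        -- placeholder source

  INSERT INTO final_table     -- placeholder physical target
  SELECT * FROM tsaag;

  COMMIT;                     -- also empties the GTT with ON COMMIT DELETE ROWS
END proc_name;
/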
I have got a procedure that successfully creates an Oracle external table and populates it with the contents of a file. This works fine until I have a situation where one of the fields is a VARCHAR2(2) and I try to insert, say, a 5-character value. When this happens, the record in question does not get populated in the external table (and rightly so), but I need a way to detect a discrepancy between the number of records in the file and the number of records that actually make it into the table, so I can inform the user that there is a problem.
I have attached the code that creates the external table and populates it.
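One hedged approach: let the access parameters write rejected records to a BADFILE, then count the rejects through a second external table that reads each bad record as one raw line. The names below are illustrative:

-- In the main external table's access parameters:
--   BADFILE ext_dir:'load.bad'

-- A one-column external table over the bad file:
CREATE TABLE load_bad_ext (
  raw_line VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS (raw_line CHAR(4000))
  )
  LOCATION ('load.bad')
)
REJECT LIMIT UNLIMITED;

-- Rejected count = rows in the bad file; compare with COUNT(*) on the
-- main external table to decide whether to warn the user.
SELECT COUNT(*) AS rejected FROM load_bad_ext;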
I'm experiencing some problems when trying to import a 50 MB XML file into an Oracle database. In this XML file I have data for several customers:
- Name - Address - Contacts
I also have several tables within the DB that would receive this information:
- Customer - CustomerAddress - CustomerContacts
The problem is that, with my XML transformation and the corresponding insertion into the database, I'm having a huge problem with the time it takes: more than 3 hours to insert over 180,000 records into those tables. What can I do to accelerate the process?
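A hedged direction: if the current code walks the document and inserts row by row, shredding it with XMLTABLE and loading with a set-based INSERT ... SELECT usually cuts hours down to minutes. A sketch with invented element names, staging table, and target columns:

INSERT /*+ APPEND */ INTO customer (name, city)
SELECT x.name, x.city
FROM   xml_stage s,                 -- table holding the document as XMLTYPE
       XMLTABLE('/customers/customer'
                PASSING s.doc
                COLUMNS name VARCHAR2(100) PATH 'name',
                        city VARCHAR2(100) PATH 'address/city') x;
COMMIT;

Committing once at the end (or in large batches) rather than per record also matters at this volume.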