I created a directory object and an external table pointing to a file in that directory. It appears that using a UNC path in the directory definition does not work with my external table, even though the UNC path points to the same server the database runs on. Is my assumption correct?
create or replace directory DIVA_POK as '\tvm011gm_diva_dir$ronnendbga_pok'
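For reference, a minimal sketch of the setup in question (the share and file names below are hypothetical): the path in a directory object is resolved on the database server by the operating-system account that runs the Oracle service, so on Windows that service account (often LocalSystem, which has no network credentials) must itself be able to reach the UNC share, or the access driver cannot open the file.
-- hypothetical UNC share; the Oracle *service* account needs rights to it
CREATE OR REPLACE DIRECTORY diva_pok AS '\\fileserver\pok_share\data';
CREATE TABLE diva_ext (
  rec VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY diva_pok
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('pok.csv')   -- hypothetical file name
);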
I am using an external table with numbers, joining the external table to fetch data and insert it into another table. I alter the table within a procedure to pass the CSV file name as a parameter, since it changes each time. It works fine, but when my grouping id has two GIDs, the header from the CSV file is also inserted as a row into coupon_personlization_members.
I specified SKIP 1 in the external table, but for the second GID the header is inserted as a row. I don't want the header loaded as data, though the header will stay in the CSV file.
Here is my procedure:
CREATE OR REPLACE PROCEDURE proc_coupon_load(p_grouping_id NUMBER, p_file_name VARCHAR2)
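A hedged guess at the cause, which your symptom matches: with the ORACLE_LOADER driver, SKIP counts records from the start of the load rather than once per file, so if two CSV files end up listed in LOCATION, only the first file's header is skipped and the second one comes through as data. A minimal sketch that points the table at one file per call, assuming a hypothetical external table coupon_ext:
CREATE OR REPLACE PROCEDURE proc_coupon_load(p_grouping_id NUMBER, p_file_name VARCHAR2) IS
BEGIN
  -- repoint the external table at a single file, so SKIP 1 drops this file's header
  EXECUTE IMMEDIATE 'ALTER TABLE coupon_ext LOCATION (''' || p_file_name || ''')';
  -- join/filter on p_grouping_id as in the real procedure
  INSERT INTO coupon_personlization_members
    SELECT * FROM coupon_ext;
END;
/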
I have created a soft link to a data file orig_file.dat using the command: ln -s orig_file.dat orig_file_link.dat
Now, is it possible to create an external table using this linked file, orig_file_link.dat?
I am using the linked file to minimise space usage: every month we get a different data file, and the external table creation runs as a batch job that copies the new data file to another file whenever one arrives, then creates the external table on the copy. So I need to check whether this is possible with the linked file instead of copying the file.
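For what it's worth, the access driver opens files through ordinary OS calls, so a symbolic link inside the directory the directory object points at is normally transparent to it, assuming the link target is readable by the oracle OS user. A sketch, with hypothetical paths, of repointing the link each month instead of copying:
-- shell step, run outside the database when the new monthly file arrives:
--   ln -sf /data/feeds/2024_05.dat /data/feeds/current.dat
CREATE TABLE monthly_ext (
  rec VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY feed_dir              -- points at /data/feeds
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE)
  LOCATION ('current.dat')                -- always the same link name
);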
I am trying to create an external table. The table gets created, but there is no data. Checking the log file, it says:
KUP-04021: field formatting error for field CUST_CODE
KUP-04026: field too long for datatype
KUP-04101: record 1 rejected in file /data/mmi_mig_data/PCOM_CUSTOMER_1.csv
KUP-04021: field formatting error for field CUST_CODE
KUP-04026: field too long for datatype
KUP-04101: record 2 rejected in file /data/mmi_mig_data/PCOM_CUSTOMER_1.csv
But as far as I can see, everything in the CSV file looks okay. External table:
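Whatever the exact DDL, one frequent cause of KUP-04026 on an early field is a delimiter or record-terminator mismatch, for instance a Windows CRLF file read with a bare-newline terminator, or the wrong field separator, which makes the driver read far past the field's declared length even though the CSV looks fine in an editor. A hedged sketch of access parameters that pin both down (directory and field names are hypothetical):
ACCESS PARAMETERS (
  RECORDS DELIMITED BY '\r\n'        -- file produced on Windows; use NEWLINE for Unix
  BADFILE ext_tab_dir:'pcom_customer.bad'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
  (
    cust_code CHAR(50)               -- widen the driver-side field if values can be long
    -- remaining fields here
  )
)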
Anyway, I've loaded five .csv files through an external table, and afterwards I tried to delete them.
But this error comes up: "Cannot delete 'filename': It is being used by another person or program".
I closed Oracle SQL Developer and tried deleting them manually again, and the result was the same.
After restarting, I managed to delete one .csv, but when I opened SQL Developer and tried deleting the other files, I couldn't.
The question is: can files that were used by external tables not be deleted while SQL Developer is running?
The thing is, I've created a stored procedure that deletes the files, and obviously it can't work. So I would have to restart the computer every time I load a CSV file before I can delete it.
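For what it's worth, the lock is usually held by the server process of whichever database session last queried the external table, so closing or killing that session should release the file without a reboot. If the cleanup is to stay inside the database, a sketch of a delete procedure, assuming a directory object EXT_TAB_DIR with WRITE granted on it:
CREATE OR REPLACE PROCEDURE drop_csv(p_file_name VARCHAR2) IS
BEGIN
  -- deletes through the database's own OS account; note this will still fail
  -- while a server process that queried the external table keeps the file open
  UTL_FILE.FREMOVE('EXT_TAB_DIR', p_file_name);
END;
/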
When I recreate the external table with the NOLOGGING keyword at the bottom of the syntax, it shows "operation not supported". Otherwise, are there any other points I should look at to reduce the execution time of the above statement?
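That error is expected: an external table has no database segment, so there is no redo to suppress and NOLOGGING does not apply to it. The usual levers sit on the reading and writing side instead; a sketch, with hypothetical table names:
ALTER TABLE my_ext PARALLEL 4;           -- parallel scan of the flat file
INSERT /*+ APPEND */ INTO target_table   -- direct-path insert into the target
SELECT * FROM my_ext;
COMMIT;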
I just posted another topic where I heard about external tables, and I have a few questions concerning them. I thought it was better to create a new topic than to continue on the other one...
I noticed that to create an external table, the CTL is like this:
CREATE TABLE emp_load (FIELDS description)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_tab_dir
  ACCESS PARAMETERS (
    RECORDS FIXED 62
    FIELDS (employee_number CHAR(2),
[Code]...
1) This creates an external table, but is it possible to create a normal table in a CTL file? For physical tables, the table has to exist first, right?
2) If you create a view over two external tables and the CSV files are updated each day, will the external tables be updated automatically, and the view as well? (See the sketch after this list.)
3) Can there be any synchronisation problems?
4) What happens if a SELECT runs (or someone queries the view) while the CSV file is being updated?
5) Is there any way to protect access to those tables/views while the CSVs are being updated?
6) Is it possible to create an index on this sort of table?
7) Is it possible to index a view?
8) Are external tables visible in a tool like SQL Developer?
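On questions 2 to 4: an external table stores no data of its own; every query re-reads the file as it is at that moment, so a view over two external tables always reflects the current CSVs with no refresh step, and a query that runs while a file is half-written simply sees the half-written contents (or errors out). A minimal sketch, with hypothetical columns:
CREATE OR REPLACE VIEW combined_v AS
SELECT id, val FROM ext_table_a
UNION ALL
SELECT id, val FROM ext_table_b;
-- every query of combined_v re-parses both CSV files at that moment;
-- and on question 6: no index can be created on an ORACLE_LOADER external table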
The following situation: I have a directory named /dat/global/stock/, and inside it I get files named differently, for example abcdef.112, dfgrt.2, ...
Here I want to load these files one by one into the external table and generate one more file based on some enrichment.
Step 1: take the first file and load it into the external table. Step 2: enrichment. Step 3: file generation.
The problem I am facing is that I usually get around 1000 files in that directory, so I need to pick the files up one by one and then put each into another directory. How can I get the files one by one and generate the output file using the Oracle loader?
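A sketch of one common pattern for this on 11g and later: let the PREPROCESSOR clause run a tiny shell script that lists the directory, read that listing through a one-column external table, then repoint the data external table at each file in turn. All object names, the script, and the stock_ext table are assumptions:
-- list_stock.sh simply runs: /bin/ls /dat/global/stock
CREATE TABLE dir_list_ext (file_name VARCHAR2(255))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stock_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR exec_dir:'list_stock.sh'
  )
  LOCATION ('dummy.txt')   -- must exist; its name is passed to (and ignored by) the script
);

BEGIN
  FOR r IN (SELECT file_name FROM dir_list_ext) LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE stock_ext LOCATION (''' || r.file_name || ''')';
    -- enrichment and file generation for this file go here, then move it
    -- to the processed directory with UTL_FILE.FRENAME
  END LOOP;
END;
/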
I have a procedure that successfully creates an Oracle external table and populates it with the contents of a file. This works fine until one of the fields is, say, a VARCHAR2(2) and I try to insert a 5-character value. When this happens, the record in question does not get populated in the external table (and rightly so), but I need to work out whether there is a discrepancy between the number of records in the file and the number of records that actually make it into the table, so I can inform the user that there is a problem.
I have attached the code that creates the external table and populates it.
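One hedged approach: have the access driver write its rejects to a BADFILE, then count them through a second one-column external table over that bad file; the file's record count then equals rows loaded plus rows rejected. Names below are hypothetical:
-- on the main external table: BADFILE ext_tab_dir:'load.bad' in the access
-- parameters, plus REJECT LIMIT UNLIMITED on the table itself
CREATE TABLE load_bad_ext (rec VARCHAR2(4000))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_tab_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE NOBADFILE NOLOGFILE)
  LOCATION ('load.bad')
);

SELECT COUNT(*) AS rejected_records FROM load_bad_ext;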
I'm experiencing some problems trying to import a 50 MB XML file into an Oracle database. The XML file holds several pieces of customer data:
- Name
- Address
- Contacts
I also have several tables within the DB that would receive this information:
- Customer
- CustomerAddress
- CustomerContacts
The problem is that, with my XML transformation and the corresponding inserts into the database, I have a huge time problem: it takes more than 3 hours to insert about 180,000 records into those tables. What can I do to accelerate the process?
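More than three hours for 180,000 rows usually points to row-by-row processing. If the document can be bound as an XMLTYPE, one set-based INSERT ... SELECT over XMLTABLE per target table tends to be dramatically faster. A sketch, in which the staging table, element names, and columns are all assumptions about your document:
INSERT INTO customer (customer_id, name)
SELECT x.customer_id, x.name
FROM   xml_stage s,
       XMLTABLE('/customers/customer'
                PASSING s.doc
                COLUMNS customer_id NUMBER        PATH '@id',
                        name        VARCHAR2(100) PATH 'name') x;
-- a second XMLTABLE over /customers/customer/address feeds CustomerAddress,
-- and likewise for CustomerContacts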
The problem, on Oracle 11gR2: I have to import data from a source database into an existing table, without truncating or dropping the target table in the target database.
We have found something called TABLE_EXISTS_ACTION=APPEND in impdp.
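That is indeed the right lever: TABLE_EXISTS_ACTION=APPEND loads the dump's rows into the existing table while leaving it, and its current contents, in place. A sketch of the call, with connection and file names made up:
impdp scott/tiger@targetdb tables=MY_TABLE directory=DP_DIR dumpfile=src.dmp logfile=imp.log table_exists_action=append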
I need to do some analysis and research to find out whether any of the 8000 tables I have in a spreadsheet, which are going to be split off into a separate database, are used by any of our PeopleSoft processes.
I'm assigned to Student Records identified by NTSR.
So for table AAP_ETHNIC_PMPT, I could do something like
SELECT * FROM PSSQLTEXTDEFN where SQLID like 'NTSR%' AND upper(sqltext) like '%PS_AAP_ETHNIC_PMPT%';
But how can I automate this and search the 8000 rows in column A2 of the spreadsheet? And what tables other than PSSQLTEXTDEFN or PSPROJECTITEM can I search for values of NTSR?
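One way to automate it, fittingly, is an external table: save the spreadsheet column as a CSV, expose it to the database, and drive the search with a join. A sketch, assuming a directory object and file name of your choosing, and assuming UPPER(sqltext) is usable in your environment as in your example (SQLTEXT is a LONG in many PeopleTools versions, which would need converting first):
CREATE TABLE table_list_ext (table_name VARCHAR2(30))
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_tab_dir
  ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY ',')
  LOCATION ('tables.csv')
);

SELECT t.table_name, s.sqlid
FROM   table_list_ext t, PSSQLTEXTDEFN s
WHERE  s.sqlid LIKE 'NTSR%'
AND    UPPER(s.sqltext) LIKE '%PS_' || UPPER(t.table_name) || '%';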
When I write a dump from an external table, accessing records from that dump works. But when I try to access other dumps (created through expdp), it gives an error. The logic I am following is below:
CREATE OR REPLACE DIRECTORY "DIR_GMS" AS 'D:Gopal_works est_env_files'
GRANT READ ON DIRECTORY dir_gms TO gopal; GRANT WRITE ON DIRECTORY dir_gms TO gopal;
New point: taking an export through expdp:
expdp hr/hr tables=EMPLOYEES directory=DIR_GMS dumpfile=HR_EMP.dmp logfile=expdpEMP.log
Then I created one EXTERNAL TABLE to access it.
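That behaviour is expected, as far as I know: the ORACLE_DATAPUMP access driver can only read dump files that were written by the access driver itself (via CREATE TABLE ... ORGANIZATION EXTERNAL ... AS SELECT), not files produced by the expdp utility, even though both are called dump files. A sketch of the round trip that does work:
-- write a dump with the access driver itself:
CREATE TABLE emp_unload
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY dir_gms
  LOCATION ('emp_ext.dmp')
)
AS SELECT employee_id, last_name FROM employees;

-- this file (unlike the expdp-produced HR_EMP.dmp) can be attached as an external table:
CREATE TABLE emp_read (
  employee_id NUMBER,
  last_name   VARCHAR2(25)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY dir_gms
  LOCATION ('emp_ext.dmp')
);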
function isnumeric(p_string in varchar2) return boolean as
  l_number number;
begin
  l_number := p_string;
  return TRUE;
exception
  when others then
    return FALSE;
end;
It does not recognize the function isnumeric. Is there any way I can do this through the external table?
I'm getting an error like the following:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "comma, date_format, defaultif, enclosed, (, ltrim, lrtrim, ldrtrim, notrim, nullif, optionally, ), rtrim, terminated"
KUP-01008: the bad identifier was: isnumeric
KUP-01007: at line 9 column 16
ORA-06512: at "SYS.ORACLE_LOADER", line 14
ORA-06512: at line 1
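The access-parameters clause only accepts the driver's own keywords, so a PL/SQL function such as isnumeric cannot appear there, which is exactly what KUP-01008 is reporting. A hedged alternative: load the field as plain text and do the numeric test on the way out. On 12.2+, TO_NUMBER ... DEFAULT NULL ON CONVERSION ERROR does it natively; on older versions, rewrite isnumeric to return 'Y'/'N' (SQL cannot call a BOOLEAN function) and filter with that. Sketch, with names invented:
-- in the external table, declare the column as character data: num_col CHAR(50)
INSERT INTO target_table (n)
SELECT TO_NUMBER(num_col DEFAULT NULL ON CONVERSION ERROR)   -- 12.2+
FROM   stage_ext
WHERE  TO_NUMBER(num_col DEFAULT NULL ON CONVERSION ERROR) IS NOT NULL;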
I have an external table. The table gets created successfully, but when I then try to access it, I get the following error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file /cmpnt/dev/test/ADE_TEMP_4992.log
The default directory is valid and does not have any issues. The IP address of my DB server and that of the server from which I connect to the DB are different; is that the issue? All other SQL queries work fine except this one.
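The clue may be the file name: that .log is the access driver's own log file, which it tries to create in the directory at open time; if it cannot be written, the query fails exactly like this, regardless of which client machine you connect from. A sketch of the two usual fixes, with hypothetical names:
-- either grant write on the directory so the driver can create its log:
GRANT READ, WRITE ON DIRECTORY ext_data_dir TO loader_user;
-- or suppress the driver's log and bad files in the access parameters:
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE NOLOGFILE NOBADFILE
  FIELDS TERMINATED BY ','
)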
I had a small query on external tables. I have a .csv file outside the database, and in this file one column is added every month. I need to create the external table dynamically, adding the new column each month.
How can a procedure be created for this requirement?
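A minimal sketch of one way to do it, with all names invented: keep the base columns fixed, pass the monthly columns in as a string, and rebuild the table with dynamic SQL.
CREATE OR REPLACE PROCEDURE rebuild_monthly_ext(p_col_list VARCHAR2) IS
  -- p_col_list e.g. 'jan_amt NUMBER, feb_amt NUMBER'
BEGIN
  BEGIN
    EXECUTE IMMEDIATE 'DROP TABLE monthly_ext';
  EXCEPTION
    WHEN OTHERS THEN NULL;   -- first run: table does not exist yet
  END;
  EXECUTE IMMEDIATE
    'CREATE TABLE monthly_ext (id NUMBER, ' || p_col_list || ')' ||
    ' ORGANIZATION EXTERNAL (TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_tab_dir' ||
    ' ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE FIELDS TERMINATED BY '','')' ||
    ' LOCATION (''monthly.csv''))';
END;
/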
When I create the external table directly, "select * from ext_schoolof" works. But when I use a procedure, it creates the external table, yet when I then try "select * from ext_schoolof", I get errors.
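A hedged pointer, since the exact errors aren't shown: CREATE TABLE ... ORGANIZATION EXTERNAL never touches the file, so a procedure's dynamic DDL can "succeed" even when its quoting has mangled the access parameters or location, and the damage only shows up at SELECT time. It is worth printing the statement before executing it and comparing it with the version that works by hand; the DDL below is only a stand-in:
DECLARE
  l_sql VARCHAR2(4000);
BEGIN
  -- build the DDL exactly as your procedure does; this is a placeholder
  l_sql := 'CREATE TABLE ext_schoolof (rec VARCHAR2(4000))'
        || ' ORGANIZATION EXTERNAL (TYPE ORACLE_LOADER DEFAULT DIRECTORY ext_dir'
        || ' ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE)'
        || ' LOCATION (''schoolof.csv''))';
  dbms_output.put_line(l_sql);   -- eyeball the generated statement
  EXECUTE IMMEDIATE l_sql;
END;
/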
I need access to a view running on another database server. I configured my DB link and tried to compile the view, but the following message occurs: ORA-12154: TNS: could not resolve the connect identifier specified.
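When ORA-12154 comes from a database link, the connect identifier is resolved in the tnsnames.ora of the server hosting the link, not of your client. Either add the alias there, or embed the whole descriptor so no lookup is needed; a sketch with made-up host, service, and credential names:
CREATE DATABASE LINK remote_lnk
  CONNECT TO remote_user IDENTIFIED BY remote_pwd
  USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=remotehost)(PORT=1521))
          (CONNECT_DATA=(SERVICE_NAME=remotedb)))';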
I have to clean up data from our tables (production environment) that contain millions of rows. The question is: apart from the partitioned-tables solution, what alternative solution does Oracle recommend?
Should we delete from these tables using a PL/SQL block with a cursor, or export and re-import the database, using the QUERY option of the Data Pump utility on the tables whose old rows we want to remove?
I have used both ways, and I have to admit that the Data Pump solution is much, much faster than the deletion, which suffers from disk I/O. The question again is which of these two methods is more reliable and less risky for the health of the database.
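A third option often recommended alongside those two, when the rows to keep are a small fraction of the table: copy the keepers out with CTAS and swap the tables, avoiding both the delete's redo/undo and a full export/import. A sketch, with the names and predicate invented:
CREATE TABLE big_table_keep NOLOGGING AS
  SELECT * FROM big_table WHERE created_date >= DATE '2012-01-01';
-- recreate indexes, constraints, and grants on big_table_keep, then:
DROP TABLE big_table;
ALTER TABLE big_table_keep RENAME TO big_table;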
I came across an implementation where data from DB2 tables is moved to Oracle tables, for a BI solution, using Oracle procedures called from MS SQL DTS packages running as scheduled jobs. Just being curious: can this be done using OWB or ODI rather than the above detour? I suppose some changes are made in those procedures before the data is loaded into the Oracle tables; can't this be done using OWB/ODI as well? And can it be scheduled as jobs using OWB/ODI?
load data
infile 'C:\Documents and Settings\xxxxx\Desktop\abc.txt'
APPEND
PRESERVE BLANKS
INTO TABLE table1
[code]...
When I run the above control file in SQL*Loader, I get the error: Record 1: Rejected - Error on table table1, column column3. ORA-01481: invalid number format model.
In the table, column3's datatype is NUMBER(6,2): the column size in the table is 6, while column3's position in the control file is only 4 characters wide. Also, if possible, let me know what data (two dummy records) would work with the above control file, especially for column3, where a decimal number comes in the flat file.
For generating the flat file, for column3 I'm using LPAD(:value,4,0) in the select query column list.
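ORA-01481 means a number format model did not match the data, which usually points at an explicit TO_NUMBER (or TO_CHAR) mask applied to column3 somewhere in the control file. A hedged sketch, with the position made up: read the field as character and let the default conversion handle the decimal point, plus two sample column3 values that fit a 4-wide field and NUMBER(6,2):
column3 POSITION(10:13) CHAR "TO_NUMBER(:column3)"
-- sample column3 values after LPAD(:value,4,0):  12.5  and  0.75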
SQL> select * from oldemp8;
select * from oldemp8
*
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "(": expecting one of: "comma, defaultif, nullif, )"
KUP-01007: at line 7 column 16
ORA-06512: at "SYS.ORACLE_LOADER", line 19
SQL>
What is the syntax error in the above command? I placed the notepad file properly. I have created external tables many times before, but I have never come across this type of error.
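A hedged reading of KUP-01005 "found (" at line 7: the FIELDS list in the access parameters takes the driver's own datatypes (CHAR, DECIMAL EXTERNAL, and so on), and pasting SQL column types such as NUMBER(7,2) there trips the parser at exactly that opening parenthesis. A sketch of the usual shape, with hypothetical columns matching a comma-separated file:
CREATE TABLE oldemp8 (
  empno NUMBER(4),
  ename VARCHAR2(10),
  sal   NUMBER(7,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (empno, ename, sal)   -- names only; SQL types like NUMBER(7,2) here cause the error
  )
  LOCATION ('emp.txt')
);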
While creating an external table, how can I specify a decode condition for a date field? When it comes in as '2099-99-99', I want to change it to '2099-01-0001'; how can I translate it?
I already have this in the access parameters:
Incoming_DATE CHAR(20) DATE_FORMAT DATE MASK "YYYY/MM/DD"
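Access parameters cannot apply a DECODE, so one hedged approach is to bring the field in as plain text and do the translation in SQL on the way out. In the sketch below the view and table names are invented, and DATE '2099-01-01' stands in for your target value:
-- in the access parameters, drop the DATE_FORMAT and keep: incoming_date CHAR(20)
CREATE OR REPLACE VIEW my_ext_v AS
SELECT CASE
         WHEN incoming_date = '2099-99-99'
         THEN DATE '2099-01-01'
         ELSE TO_DATE(incoming_date, 'YYYY-MM-DD')
       END AS incoming_date
FROM   my_ext;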