SQL & PL/SQL :: How To Insert BLOB Files Into Tables
Dec 4, 2011
I created a music database, and I'm having trouble inserting the audio, video, and lyrics (.doc) into their respective tables. I searched through the forums and found some example code, but I'm not sure how to modify it to fit my purposes.
What I need is a procedure that can insert a complete record into the track table (including an .mp3 file for each row), another that can insert a record into the lyrics table (including a .doc file for each row), and a procedure that can insert a single record into the Video table (including an .mv4 file).
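A minimal sketch of such a procedure, assuming a hypothetical track table with columns (track_id, title, audio BLOB) and a directory object MEDIA_DIR pointing at a folder on the database server that holds the .mp3 files; the lyrics and video procedures would follow the same pattern with their own tables and file extensions:

CREATE OR REPLACE DIRECTORY media_dir AS 'c:\media';   -- hypothetical server-side path

CREATE OR REPLACE PROCEDURE insert_track (
    p_track_id IN NUMBER,
    p_title    IN VARCHAR2,
    p_filename IN VARCHAR2
) AS
    v_bfile BFILE := BFILENAME('MEDIA_DIR', p_filename);
    v_blob  BLOB;
    v_dest  INTEGER := 1;
    v_src   INTEGER := 1;
BEGIN
    -- create the row with an empty LOB locator and get that locator back
    INSERT INTO track (track_id, title, audio)
    VALUES (p_track_id, p_title, EMPTY_BLOB())
    RETURNING audio INTO v_blob;

    -- open the OS file and copy its contents into the BLOB column
    DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
    DBMS_LOB.LOADBLOBFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile), v_dest, v_src);
    DBMS_LOB.CLOSE(v_bfile);
    COMMIT;
END insert_track;
/

It would be called as exec insert_track(1, 'Song Title', 'song.mp3'); the file has to be readable by the operating-system user that runs the database.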
I am trying to insert an image file into a BLOB column using the following code.
CREATE TABLE MY_IMAGE_TABLE (
    ID    NUMBER,
    NAME  VARCHAR2(20),
    IMAGE BLOB
);

CREATE OR REPLACE DIRECTORY MY_FILES AS '\dppdb-dev est';

GRANT ALL ON DIRECTORY MY_FILES TO PUBLIC;
[code]....
However, I am getting this error:
ORA-22288: file or LOB operation FILEOPEN failed
I've done some tracing in Toad, and it seems that the file I am trying to insert exists, yet I am unable to open it.
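ORA-22288 on FILEOPEN usually means the directory path or file is not visible to the database server itself (BFILEs are always opened server-side, never on the client), the operating-system user running Oracle lacks read permission, or the directory/file name passed to BFILENAME does not match exactly (the directory object name is normally upper case). A quick check, assuming a hypothetical file name my_image.jpg in the MY_FILES directory:

SET SERVEROUTPUT ON
DECLARE
    v_bfile BFILE := BFILENAME('MY_FILES', 'my_image.jpg');  -- hypothetical file name
BEGIN
    IF DBMS_LOB.FILEEXISTS(v_bfile) = 1 THEN
        DBMS_OUTPUT.PUT_LINE('The database server can see the file.');
    ELSE
        DBMS_OUTPUT.PUT_LINE('The database server cannot see the file - check the path, the file name case and the OS permissions.');
    END IF;
END;
/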
I have a table with a column of type BLOB. Now I want to create a procedure that will insert into that table, but I don't want to create a directory object. How can I solve this?
I want to insert a row like this:
insert into table x(image) values('d:photo est.jpg');
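A directory object only helps for files that sit on the database server. For a file on the client machine, one option (a sketch, with a hypothetical table x(id NUMBER, image BLOB) and a placeholder path) is SQL*Loader with a LOBFILE clause, run from the client, which needs no directory object at all:

LOAD DATA
INFILE *
APPEND
INTO TABLE x
FIELDS TERMINATED BY ','
( id,
  img_name FILLER CHAR(200),
  image    LOBFILE(img_name) TERMINATED BY EOF )
BEGINDATA
1,c:\temp\test.jpg

Other common client-side options are an application language (Java, .NET, etc.) binding the file contents to a BLOB parameter, or a GUI tool such as Toad or SQL Developer that can load a BLOB column from a local file.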
I want to store a PDF file in a database column of BLOB type. The PDF file is on the client system, not on the database server. Is there any way I can achieve this?
Here is the process of how I am trying to insert the PDF file into the Oracle database.
create or replace directory files as 'c:/welcome/';
(the physical directory exists on both the server and the client machine)
Create or replace PROCEDURE procloadMetaPdf (Filename IN VARCHAR2) is
    temp_blob     blob := empty_blob();
    location      BFILE;
    Bytes_To_Load Integer := 0;
    auto_Id       number;
Begin
[Code]...
The procedure is created successfully, but when executing
exec procloadMetaPdf('help.pdf');
it displays the following error:
ERROR at line 1:
ORA-22285: non-existent directory or file for FILEOPEN operation
ORA-06512: at "SYS.DBMS_LOB", line 605
ORA-06512: at "SCOTT.PROCLOADMETAPDF", line 14
ORA-06512: at line 1
(line 14 is: DBMS_LOB.OPEN(location, DBMS_LOB.LOB_READONLY);)
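ORA-22285 means the BFILE points at a directory or file the database server cannot see. Two things to check: the directory name passed to BFILENAME must match the name in the data dictionary (upper case unless it was created as a quoted identifier), and 'c:/welcome/' must exist on the database server machine with the file in it, because BFILEs are always read server-side; a copy on the client is not enough. Since only line 14 of the procedure is quoted, the following is just a sketch of how the open/load sequence could look, with a hypothetical target table pdf_store(id NUMBER, pdf_file BLOB):

DECLARE
    v_bfile BFILE := BFILENAME('FILES', 'help.pdf');   -- directory object name in upper case
    v_blob  BLOB;
    v_dest  INTEGER := 1;
    v_src   INTEGER := 1;
BEGIN
    -- create the row with an empty LOB locator, then fill it from the server-side file
    INSERT INTO pdf_store (id, pdf_file)
    VALUES (1, EMPTY_BLOB())
    RETURNING pdf_file INTO v_blob;

    DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
    DBMS_LOB.LOADBLOBFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile), v_dest, v_src);
    DBMS_LOB.CLOSE(v_bfile);
    COMMIT;
END;
/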
Is it possible to store very large files in Oracle tables, for example a 1-2 gigabyte video file or even more? In other words, is it possible to use Oracle as a file server to upload and store very large files?
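Yes. A BLOB column can hold far more than 2 GB (the documented limit is in the terabyte range, depending on the block size), so 1-2 GB video files are well within reach; from 11g onwards SecureFile LOBs are designed for exactly this kind of content. A minimal sketch with hypothetical table and column names:

CREATE TABLE media_files (
    id        NUMBER PRIMARY KEY,
    file_name VARCHAR2(200),
    content   BLOB
)
LOB (content) STORE AS SECUREFILE;

SecureFiles need an ASSM tablespace; on older versions or non-ASSM tablespaces a plain (BasicFile) LOB works too, just without the SecureFile features. The practical concerns are tablespace sizing, backup time, and network transfer rather than any hard limit on the LOB itself.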
Anyway, I've loaded 5 .csv files through an external table, and afterwards I tried to delete them.
But I get this error: "Cannot delete 'filename': It is being used by another person or program".
I closed Oracle SQL Developer and tried deleting them manually again, with the same result.
After restarting I deleted one .csv and it worked, but when I opened SQL Developer again and tried deleting the other files, I couldn't do it.
The question is: can files that were used by external tables not be deleted while SQL Developer is running?
The thing is that I've created a stored procedure that deletes the files, and obviously it can't work. So at the moment, every time I load a .csv file I have to restart the computer before I can delete it.
I am trying to spool data from tables into flat files. I am using the following scripts to accomplish this:
1. A cmd file (Windows) that makes a call to a SQL file.
2. The SQL file, which generates another query file at run time, depending upon the table name passed to it.
3. The run-time query file, which executes the final query and spools the data into a pipe-delimited .txt file.
For example, the actual command passed is:
C:Spool_utilityspool_utility TABLE_NAME
set echo off
SET newpage 0
SET feedback off
SET linesize 32767
set pagesize 0
[code]........
The above file generates a table_name.sql file with the actual table name at run time and gets executed and the output is written to the table_name.txt file.
This works perfectly fine. But the issue is that when someone passes a wrong table name, or there is an actual run-time error while executing the query, the error details themselves get written to the final spool file.
For example, if I do the following just to generate an error and execute it from the command line, the query fails and writes the error to the spool file, but at the command prompt where I executed the command I do not see any error and the process appears to have run perfectly well:
set xxx on xxx off as above
spool &1.sql;
Prompt Select * from &1 where rownum><10---this will cause the issue
spool off
set termout ON
@ &1
EXIT
Example of the spool file generated:
from table_name WHERE rownum><=10
*
ERROR at line 62:
ORA-00936: missing expression
My question is: is there any way I can capture this run-time error and return it to my calling SQL script spool_utility.sql, and then propagate it to the calling command file so I can do some tasks, e.g. removing the spool file and writing the actual error to a log file? Basically, any way to know at the OS calling level that the entire spooling operation was unsuccessful.
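One possible approach (a sketch, not the only way) is to make SQL*Plus itself abort with a non-zero exit code whenever the generated query fails, by putting WHENEVER directives at the top of spool_utility.sql; they remain in effect for the generated script run with @ &1 as well:

WHENEVER SQLERROR EXIT SQL.SQLCODE
WHENEVER OSERROR  EXIT FAILURE

The calling cmd file can then test ERRORLEVEL after the sqlplus call: if it is non-zero, delete the partially written table_name.txt and append the table name (and, if you also spool a separate error log, the error text) to a log file.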
I want to load data into multiple tables from many files, based on the first column value, which is a FILLER field. I am trying to test this scenario with two Oracle tables with similar definitions, loading one record into each table using the WHEN/POSITION keywords. For this, I added the first column as a reference column in the data, which I have in the control file itself.
The 1st table is loaded with the 1st record, but the 2nd record is not loading. Have I missed anything with the WHEN/POSITION keywords?
This is the error in the log file for the 2nd table (WD1):
Record 2: Rejected - Error on table WD1, column TAB. ORA-01841: (full) year must be between -4713 and +9999, and not be 0
Table WD1:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
1 Row not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
[code]....
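The control file isn't shown in full, but these symptoms (one row failing every WHEN clause, the other rejected with a date-conversion error) are typical of a multi-table delimited load in which the second INTO TABLE clause does not reset the scan: by default it continues reading fields where the first clause stopped, so its fields are shifted. Putting POSITION(1) on the first field of every INTO TABLE clause restarts each scan at the beginning of the record. A sketch with hypothetical column names and date format:

load data
infile *
into table WD
  truncate
  when tab = 'WD'
  fields terminated by ','
  ( tab   filler position(1) char,
    name  char,
    tdate date "YYYYMMDD" )
into table WD1
  truncate
  when tab = 'WD1'
  fields terminated by ','
  ( tab   filler position(1) char,
    name  char,
    tdate date "YYYYMMDD" )
begindata
WD,first row,20120801
WD1,second row,20120802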
I have a bunch of data in 50 Excel files. I need to load these 50 files into 50 different tables, and I would like to do this in one script. I went through the forum to get this information; people suggested creating a shell script, or listing the sqlldr command multiple times, etc.
Provide some clarity on what the best approach is. If it is through shell scripting, provide the shell script and instructions to execute it; I am new to shell scripting.
I have 780 (12*65) .csv files generated from 65 databases. Now I have to load these 780 .csv files into 12 tables created in my database for monitoring and reporting purposes. To call SQL*Loader, I am planning to create 780 lines like the one below.
We know creating 780 control files is a difficult task, so I have created only 12 control files. Is there any mechanism to pass a variable (I am planning to declare it on the sqlldr line) to the INFILE clause in SQL*Loader, like below?
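There is no substitution variable inside a control file, but the DATA parameter on the sqlldr command line overrides the first INFILE in the control file, so the 12 control files can be reused for all 780 files. A sketch with hypothetical file names and credentials:

sqlldr userid=scott/tiger control=tab01.ctl data=db01_tab01.csv log=db01_tab01.log bad=db01_tab01.bad
sqlldr userid=scott/tiger control=tab01.ctl data=db02_tab01.csv log=db02_tab01.log bad=db02_tab01.bad

The 780 command lines (or a loop that generates them) then only vary in the data, log, and bad file names.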
I have the following situation: there is a directory named /dat/global/stock/, and inside it I receive files with different names, for example abcdef.112, dfgrt.2, and so on.
I want to load these files one by one into an external table and generate one more file based on some enrichment.
Step 1: Take the first file and load it into the external table.
Step 2: Enrichment.
Step 3: File generation.
Now I am facing a problem: in that particular directory I usually get about 1000 files, so I need to pick the files up one by one and then move each to another directory. How can I process the files one by one and generate the output file using the Oracle loader?
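One way to walk through the files with a single external table is to repoint it at each file in turn with ALTER TABLE ... LOCATION, then run the enrichment and file-generation step before moving on. A sketch, assuming the external table is called stock_ext, the list of file names has already been loaded into a hypothetical driver table stock_file_queue (getting the OS directory listing into the database needs an external job, a preprocessor script, or a scheduler task), and process_stock_file is the hypothetical enrichment procedure:

BEGIN
    FOR f IN (SELECT file_name FROM stock_file_queue ORDER BY file_name) LOOP
        -- point the external table at the next file
        EXECUTE IMMEDIATE 'ALTER TABLE stock_ext LOCATION (''' || f.file_name || ''')';

        -- enrichment and output-file generation for this file
        process_stock_file(f.file_name);
    END LOOP;
END;
/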
I have 2 object tables, Location and Department; Department references a Location object. But this won't insert: I get "0 rows created".
CREATE OR REPLACE TYPE location_objtyp AS OBJECT (
    locationID NUMBER,
    name       VARCHAR2(48)
);
/
CREATE OR REPLACE TYPE dept_objtyp AS OBJECT (
    deptID         NUMBER,
    name           VARCHAR2(48),
    locationID_ref REF location_objtyp
);
[code]...
This does not insert and I get no error, only "0 rows created".
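"0 rows created" with no error is what an INSERT ... SELECT reports when the SELECT that fetches the REF returns no rows, typically because the location row being referenced doesn't exist yet or the WHERE clause doesn't match it. A sketch with hypothetical object table names, since the CREATE TABLE statements aren't quoted in the post:

CREATE TABLE location_objtab OF location_objtyp;
CREATE TABLE dept_objtab     OF dept_objtyp;

-- insert the parent location first
INSERT INTO location_objtab VALUES (location_objtyp(1, 'DALLAS'));

-- then insert the department, picking up the REF from the row that now exists
INSERT INTO dept_objtab
SELECT 10, 'SALES', REF(l)
FROM   location_objtab l
WHERE  l.locationID = 1;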
I have 100 records in table1, and we have 15 more tables without data. The issue is: how can I insert the table1 data (100 records) into the 15 different tables with a single SQL command?
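A single multi-table INSERT ALL statement can do this. A sketch assuming hypothetical columns (id, name) that exist in table1 and in each target table; the pattern simply repeats for all 15 INTO clauses:

INSERT ALL
    INTO table2 (id, name) VALUES (id, name)
    INTO table3 (id, name) VALUES (id, name)
    INTO table4 (id, name) VALUES (id, name)
    -- ... one INTO clause per remaining target table ...
SELECT id, name
FROM   table1;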
I have a form which has three detail sections. I want it so that when I press SAVE, it inserts data into two tables, then runs the specific code, then inserts data into the other two tables.
I am using Developer 6i and couldn't find the proper trigger or related approach.
I am trying to insert records into multiple tables. I know how to view data using joins, but I don't understand how to insert records into multiple tables using a join. I searched the net but didn't find much. I have also tried to write some code, but it is not working. I have seen examples on different websites where people use SELECT inside an INSERT statement for joining. What is the correct syntax to INSERT a record into multiple tables?
Insert into library_users, library_users_info (library_users.username, library_users.password, library_users_info.address, library_users_info.phone_no) VALUES (...)
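There is no INSERT syntax that targets two joined tables at once the way the statement above tries to; either each table gets its own INSERT (within one transaction), or a multi-table INSERT ALL supplies both tables from one query. A sketch with hypothetical values, assuming username is what links the two tables:

INSERT ALL
    INTO library_users      (username, password)           VALUES (username, password)
    INTO library_users_info (username, address, phone_no)  VALUES (username, address, phone_no)
SELECT 'jsmith'     AS username,
       'secret'     AS password,
       '12 Main St' AS address,
       '555-0100'   AS phone_no
FROM   dual;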
We get data from our customers, which we load into temporary tables. The goal is to consolidate this data into one single table.
Following are the rules:
1) The final table should have all the columns from all the tables; if there are common column(s), add only one column with that name.
2) The join would be based on all the common columns.
3) If there is a common row, we merge the rows into one (for example, the row with DOMAIN = ACME.COM).
4) There could be 'N' number of tables.
The following is realistic sample data:
1) T1/T2/T3 have the sample data, which covers most of our test cases.
2) We are expected to transform the data from T1/T2/T3 as depicted in table T4.
3) We might have more than 3 tables in our production environment, so the query should work for N tables.
4) I have given an explanation of how each row should be derived to be inserted into T4.
5) The only information we have to work with is the TABLE_NAME(s) and their metadata from USER_TAB_COLUMNS.
DROP TABLE T1; DROP TABLE T2; DROP TABLE T3; DROP TABLE T4;
[code].....
Explanation for each row:
row1) This row comes from T1 and T2 (not T3, because HOSTNAME would not match)
row2) This row comes from T1 and T3 (not T2, because HOSTNAME would not match)
row3) This row comes from T1 and T3
row4) This row comes from T2 and T3
row5) This row comes from T3
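Since the number of tables and their columns are only known at run time, the consolidation query has to be generated dynamically from USER_TAB_COLUMNS. As a starting point, here is a small sketch that derives the merged column list for T4 (the table names are hard-coded here for illustration; in practice they would come from whatever driver list names the N tables). The same dictionary loop can be extended to build the FULL OUTER JOIN conditions on the common columns and the final INSERT INTO T4, executed with EXECUTE IMMEDIATE:

SET SERVEROUTPUT ON
DECLARE
    v_cols VARCHAR2(4000);
BEGIN
    -- every distinct column name across the source tables appears once in T4
    FOR c IN (SELECT DISTINCT column_name
              FROM   user_tab_columns
              WHERE  table_name IN ('T1', 'T2', 'T3')
              ORDER  BY column_name) LOOP
        v_cols := v_cols || CASE WHEN v_cols IS NOT NULL THEN ', ' END || c.column_name;
    END LOOP;
    DBMS_OUTPUT.PUT_LINE('Merged column list for T4: ' || v_cols);
END;
/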
How can I search in nested tables (e.g. pr_travel_date_range, pr_bo_arr) using the SQL below and insert the result into a new nested table (e.g. g_splited_range_arr)?
Here are the DDL and DML SQL statements (don't worry about the NUMBER(8)):
CREATE OR REPLACE TYPE DATE_RANGE IS OBJECT (
    start_date NUMBER( 8 ),
    end_date   NUMBER( 8 )
);
CREATE OR REPLACE TYPE DATE_RANGE_ARR IS TABLE OF DATE_RANGE;

DECLARE
    g_splited_range_arr DATE_RANGE_ARR := DATE_RANGE_ARR( );
[code]...
Or can I create a VIEW with nested-table parameters in it, so I can simply call
SELECT * BULK COLLECT INTO g_splited_range_arr FROM view_split_date(g_travel_range,g_bo_arr);
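A view cannot take parameters, but a function that returns the nested table type can be queried with TABLE() and gives the same effect. A sketch (the splitting logic itself is only hinted at with a placeholder loop, since it isn't shown in the post):

CREATE OR REPLACE FUNCTION split_date_ranges (
    p_travel_range IN DATE_RANGE_ARR,
    p_bo_arr       IN DATE_RANGE_ARR
) RETURN DATE_RANGE_ARR IS
    v_result DATE_RANGE_ARR := DATE_RANGE_ARR();
BEGIN
    -- placeholder: the real splitting of p_travel_range against p_bo_arr goes here
    FOR i IN 1 .. p_travel_range.COUNT LOOP
        v_result.EXTEND;
        v_result(v_result.LAST) := p_travel_range(i);
    END LOOP;
    RETURN v_result;
END split_date_ranges;
/

Inside PL/SQL it can then be used much like the parameterised view you describe:

SELECT VALUE(t)
BULK COLLECT INTO g_splited_range_arr
FROM   TABLE(split_date_ranges(g_travel_range, g_bo_arr)) t;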
I'm writing a procedure which updates or inserts data in multiple tables. Selected fields of 10 tables need to be updated or inserted. For this I created a table which comprises the fields related to all 10 tables, and then wrote the procedure. In it I create a cursor which reads the data from the newly created table containing the fields of the 10 tables, and then I write UPDATE and INSERT statements one by one for all 10 tables.
Sample procedure below:
-------------------------------------------
Create or replace procedure p_proc as
    spidm spriden.spriden_pidm%type;
    cursor mycur is select * from mytable;
begin
    for rec in mycur
[code]......
----------
Note: I created the table on my server because the data is coming from a different server. They upload the data into that table; from there I pick it up and update the tables. Is updating or inserting data into the different tables one by one the correct approach?
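Row-by-row update-or-insert logic in a cursor loop works, but a single MERGE per target table is usually simpler and faster, and it removes the need to code the UPDATE and the INSERT separately. A sketch for one of the 10 tables, with hypothetical table and column names (the pattern repeats per target table):

MERGE INTO target_table1 t
USING (SELECT id, col_a, col_b FROM my_staging_table) s
ON    (t.id = s.id)
WHEN MATCHED THEN
    UPDATE SET t.col_a = s.col_a,
               t.col_b = s.col_b
WHEN NOT MATCHED THEN
    INSERT (id, col_a, col_b)
    VALUES (s.id, s.col_a, s.col_b);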
I am trying to insert rows into two tables using SQL*Loader.
I have two tables in the database:
SQL> desc name
 Name                 Null?    Type
 -------------------- -------- ------------
 ID                            NUMBER
 NAME                          VARCHAR2(20)
 BD                            DATE

SQL> desc name3
 Name                 Null?    Type
 -------------------- -------- ------------
 ID                            NUMBER
 NAME                          VARCHAR2(20)
 BD                            DATE
I created the control file as follows:
[oracle@DBTEST sqldri]$ cat datafile.ctl
options (direct=true)
load data
INFILE *
into table name
truncate
when id='1'
[code]....
When I run SQL*Loader as
[oracle@DBTEST sqldri]$ sqlldr hr/hr control=/u01/sqldri/datafile.ctl

SQL*Loader: Release 10.2.0.1.0 - Production on Tue Aug 7 23:30:07 2012
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Load completed - logical record count 2.
no rows are inserted. The log file contains these entries:
[oracle@DBTEST sqldri]$ cat datafile.log

SQL*Loader: Release 10.2.0.1.0 - Production on Tue Aug 7 23:30:07 2012
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Control File: /u01/sqldri/datafile.ctl
Data File:    /u01/sqldri/datafile.ctl
Bad File:     /u01/sqldri/datafile.bad
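The rest of the control file isn't shown, but "Load completed - logical record count 2" with no rows loaded usually means the records were read and then discarded, most often because every WHEN clause failed. With two INTO TABLE clauses on delimited data, the second clause continues scanning where the first one stopped unless its first field carries POSITION(1), so its id field never matches '2'. A sketch of a complete control file along those lines, with hypothetical data values and date format:

options (direct=true)
load data
infile *
into table name
  truncate
  when id = '1'
  fields terminated by ','
  ( id position(1), name, bd date "DD-MON-YYYY" )
into table name3
  truncate
  when id = '2'
  fields terminated by ','
  ( id position(1), name, bd date "DD-MON-YYYY" )
begindata
1,Alice,01-JAN-1990
2,Bob,02-FEB-1991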