Upload Very Large Files In Tables?

Nov 2, 2008

Is it possible to upload very large files into Oracle tables, for example a 1-2 gigabyte video file or even more? In other words, can Oracle be used as a file server for uploading and storing very large files?
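
The size limits are generous: a single BLOB can hold far more than 2 GB (the ceiling is measured in terabytes, depending on block size), so 1-2 GB video files fit comfortably. A minimal sketch, assuming 11g or later and hypothetical names; on earlier releases the same DDL without the SECUREFILE keyword gives a BASICFILE LOB:

CREATE TABLE media_store (
  media_id   NUMBER PRIMARY KEY,
  file_name  VARCHAR2(255),
  content    BLOB                      -- the video file itself
)
LOB (content) STORE AS SECUREFILE (
  TABLESPACE media_ts                  -- hypothetical dedicated LOB tablespace
  NOCACHE LOGGING
);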

View 2 Replies



SQL & PL/SQL :: How To Upload PDF Or JPG Files From Table To Computer

Feb 4, 2012

I have this table structure: create table file (id number, media_file blob). How do I get the PDF or JPG files stored in this table out to the computer, for example to C:\myfiles?
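
A hedged sketch of one common approach: read the BLOB in chunks with DBMS_LOB.READ and write it out with UTL_FILE.PUT_RAW. Note that UTL_FILE writes on the database server, so MY_FILES must be an Oracle directory object pointing at a folder on that machine (the directory name and output file name below are illustrative):

DECLARE
  v_blob   BLOB;
  f        UTL_FILE.FILE_TYPE;
  v_buffer RAW(32767);
  v_amount PLS_INTEGER := 32767;
  v_pos    PLS_INTEGER := 1;
  v_len    PLS_INTEGER;
BEGIN
  SELECT media_file INTO v_blob FROM file WHERE id = 1;
  v_len := DBMS_LOB.GETLENGTH(v_blob);
  f := UTL_FILE.FOPEN('MY_FILES', 'doc_1.pdf', 'wb', 32767);  -- 'wb' = write binary
  WHILE v_pos <= v_len LOOP
    DBMS_LOB.READ(v_blob, v_amount, v_pos, v_buffer);
    UTL_FILE.PUT_RAW(f, v_buffer, TRUE);
    v_pos := v_pos + v_amount;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/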

View 3 Replies View Related

Maintain Large Tables / Cleanup Data From Our Tables

May 18, 2011

I have to clean up data from our tables (production environment) that contain millions of rows. Apart from the partitioned-tables solution, what alternative approach does Oracle recommend?

Should we delete the rows with a cursor-driven PL/SQL block, or export and re-import the database, using the QUERY option of the Data Pump utility for the tables where we want to remove the old rows?

I have used both ways and have to admit that the Data Pump solution is much, much faster than the deletion, which suffers from disk I/O. The question, again, is which of these two methods is more reliable and less risky for the health of the database.
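
For reference, a minimal sketch of the PL/SQL variant, deleting in batches so undo stays bounded (the table, column, and retention period are hypothetical):

BEGIN
  LOOP
    DELETE FROM big_table
     WHERE created_dt < ADD_MONTHS(SYSDATE, -36)   -- purge rows older than 3 years
       AND ROWNUM <= 10000;                        -- limit each batch
    EXIT WHEN SQL%ROWCOUNT = 0;
    COMMIT;                                        -- commit per batch
  END LOOP;
  COMMIT;
END;
/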

View 5 Replies View Related

Application Express :: Upload CSV Files To Table Using 3.2 Version

Jul 30, 2013

Using APEX version 3.2, I need to upload .CSV files into a table (T_UPLOAD).

View 11 Replies View Related

Forms :: Upload Files From Client To Application Server Using 10g

Oct 3, 2010

I have used webutil_file_transfer.Client_To_AS_with_progress to upload files from the client to the application server with Forms 10g. However, now I want to save the file on the file system rather than upload it into the database as a BLOB. I mean I want to save the file from the client to a folder available on the database server. I was wondering how to do this, since there is hardly any documentation available on WEBUTIL.
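
For reference, a hedged sketch of the call as typically used (the parameter order is assumed from the standard WebUtil package and not verified here). The second argument is an OS path on the application server, so the file never enters the database; writing to a database-server folder only works if that folder is also visible from the application server, for example via a shared mount:

DECLARE
  l_ok BOOLEAN;
BEGIN
  -- client file, server-side target path, progress-bar title and subtitle
  -- (positional notation; exact parameter names are not verified here)
  l_ok := webutil_file_transfer.Client_To_AS_with_progress(
            :blk.client_path,                 -- e.g. C:\temp\report.pdf on the client
            '/u01/app/uploads/report.pdf',    -- path on the application server
            'Uploading file',
            'Please wait');
  IF NOT l_ok THEN
    message('Upload failed');
  END IF;
END;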

View 6 Replies View Related

Server Administration :: Large Trace Files Generated In Udump?

Oct 24, 2011

We have Oracle 10g (10.2.0.4) RAC on AIX 5.3, with 3 RAC instances on each node. Since Oct 9th, one of the instances on node 1 has been generating very large trace files in udump; the largest trace file takes up 3 GB, while the $ORACLE_BASE directory is only about 15 GB, so every so often we have to delete trace files to release the space.

Here is a part of alert log when this issue happen:

Sun Oct 9 23:18:15 2011
Errors in file /oracle/app/oracle/admin/bzywk/udump/bzywk1_ora_3166258.trc:
ORA-00600: internal error code, arguments: [17087], [0x70000010DF9F580], [], [], [], [], [], []
Sun Oct 9 23:18:16 2011
Trace dumping is performing id=[cdmp_20111009231816]

[code]....

I checked some of the trace files and found that they all contain a huge amount of process information.

View 17 Replies View Related

Application Express :: Upload From Excel For Several Tables

Jan 7, 2013

I am pretty new to APEX and was hoping to accomplish this with native functionality.

Using the latest APEX, there is nice functionality that builds an upload wizard for you for a single table. I would like a single wizard where I can pick a table and then map my fields for insert/update.

View 2 Replies View Related

SQL & PL/SQL :: Unpivoting Large Tables

Feb 6, 2012

I have a 27 million row table in the following format:

MEDCLM_MTH_SUM_KEY PRIMARY_DIAG_CD DIAG_CD2 DIAG_CD3 DIAG_CD4 DIAG_CD5 DIAG_CD6 DIAG_CD7 DIAG_CD8 DIAG_CD9 DIAG_CD10
2212990780 5552 78907 53170 5368
2231127242 V5481 7812 71595 4019 2761 2859 496 V4364 30501

I need to unpivot this data to get it to look like this:

MEDCLM_MTH_SUM_KEY DIAG_CD_LEVEL DIAG_CD
2212990780 PRIMARY_DIAG_CD 5552
2212990780 DIAG_CD2 78907
2212990780 DIAG_CD3 53170
[code]...

I was wondering if there was a quicker, more efficient way to do this.
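
On 11g and later the UNPIVOT operator does this in a single pass and skips NULL diagnosis columns by default; a hedged sketch (the source table name is hypothetical and the column list is abbreviated):

SELECT medclm_mth_sum_key, diag_cd_level, diag_cd
FROM   medclm_mth_sum                               -- hypothetical source table name
UNPIVOT (diag_cd FOR diag_cd_level IN (
          primary_diag_cd AS 'PRIMARY_DIAG_CD',
          diag_cd2        AS 'DIAG_CD2',
          diag_cd3        AS 'DIAG_CD3',
          diag_cd4        AS 'DIAG_CD4'             -- ... continue through DIAG_CD10
        ));

On earlier releases, a cross join to an eleven-row dimension table plus DECODE achieves the same result, but usually with more I/O.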

View 3 Replies View Related

SQL & PL/SQL :: Large Tables / Remove Unnecessary Columns

Mar 14, 2011

I have a large table with 450 columns, of which we use only about 170, and our DB block size is 8K. The DBA informed us that row chaining is happening in the database. My question is: if we only have data in 170 columns, why is row chaining happening?

The DBA advised us to remove the unnecessary columns. Do those empty columns have any impact on the chaining? And if we increase the DB block size to 32K, will that resolve the issue?
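
One way to measure how much chaining (or row migration) is actually happening is ANALYZE ... LIST CHAINED ROWS; the CHAINED_ROWS table comes from $ORACLE_HOME/rdbms/admin/utlchain.sql (the analyzed table name below is hypothetical):

ANALYZE TABLE my_wide_table LIST CHAINED ROWS INTO chained_rows;

SELECT COUNT(*) AS chained_or_migrated_rows
FROM   chained_rows
WHERE  table_name = 'MY_WIDE_TABLE';

Note also that Oracle stores any row with more than 255 columns in multiple row pieces regardless of how many columns actually hold data, which can itself show up as chaining on a 450-column table.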

View 4 Replies View Related

Server Utilities :: Data Pump For Exporting And Importing Extremely Large Data Files

Sep 24, 2010

I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether exporting to tape is possible, and if so, whether the data would be accessible if needed later.

View 4 Replies View Related

Takes Long Time To Drop Tables With Large Numbers Of Partitions

Jul 17, 2013

11.2.0.3. This is for a build; we are still in development, so there is no risk of data loss. As part of the build, I drop the user, re-create it, and re-create the objects, which lets us test the build all the way through. It's our process. This user has some tables with several thousand partitions. I ran a 10046 trace and Oracle is using PL/SQL loops to run DML against the data dictionary. Is there any way to speed this up? I am going to turn off the recycle bin during the build and turn it back on afterwards; anything else I can do? Right now I just issue 'drop user cascade'. Part of it is the weak hardware we have in the development environment. It takes about 20 minutes just to run through this part of the script (the script has a lot more pieces than this), and we do fairly frequent builds. I can't change the build process; my only option is to try to make this run a little faster.
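
One hedged idea (the schema name is hypothetical): since the recycle bin will be off anyway, dropping the heavily partitioned tables explicitly with PURGE before the DROP USER CASCADE turns the bulk of the dictionary work into plain DROP TABLE statements, which are easier to time and can be spread across sessions:

BEGIN
  FOR t IN (SELECT table_name
              FROM dba_tables
             WHERE owner = 'BUILD_USR') LOOP        -- hypothetical build schema
    EXECUTE IMMEDIATE
      'DROP TABLE build_usr."' || t.table_name || '" PURGE';
  END LOOP;
END;
/
DROP USER build_usr CASCADE;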

View 3 Replies View Related

Performance Tuning :: Create Small Functional Indexes For Special Cases In Very Large Tables

Apr 5, 2012

Create small functional indexes for special cases in very large tables.

When a column holds one value in 99% of the records and other values that have to be searched for, it is possible to create an index that maps the common value to NULL. The index will be small and rebuilds will be fast.

Example

create index vh_tst_decode_ind_if1 on vh_tst_decode_ind
(decode(S,'I','I',null),style)

The index can be made more selective by adding a second decoded column, which helps when the key is updated and there are many matching records that would otherwise create more levels in the b-tree.

create index vh_tst_decode_ind_if3 on vh_tst_decode_ind
(decode(S,'I','I',null),
decode(S,'I',style,null)
)

The records can then be accessed like this:

SQL> select --+ index(vh_tst_decode_ind vh_tst_decode_ind_if3)
2 style ,count(*)
3 from vh_tst_decode_ind
4 where
5 decode(S,'I','I',null)='I'
6 group by style
7 ;

[code]....

View 2 Replies View Related

SQL & PL/SQL :: Insert From 3 Different Files In 3 Tables

Jun 12, 2013

I am trying to insert from 3 different files into 3 tables, and I am finding that the same id is being inserted for every row.

DECLARE

F UTL_FILE.FILE_TYPE;
V_LINE VARCHAR2 (1000);
V_Name varchar2(512 char);
V_ParentUID raw(16);
V_Path varchar2(1024 char);
[code]...
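
A common cause of every row getting the same id is computing the id once before the read loop; a hedged sketch (the directory, file, table, and sequence names are hypothetical) that fetches a new sequence value for each line read:

DECLARE
  f      UTL_FILE.FILE_TYPE;
  v_line VARCHAR2(1000);
BEGIN
  f := UTL_FILE.FOPEN('MY_DIR', 'file1.txt', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, v_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;                -- end of file
    END;
    INSERT INTO target_table (id, name)
    VALUES (file_row_seq.NEXTVAL, v_line);         -- NEXTVAL is evaluated per row
  END LOOP;
  UTL_FILE.FCLOSE(f);
  COMMIT;
END;
/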

View 5 Replies View Related

SQL & PL/SQL :: Can't Delete Files That Were Used On External Tables

May 11, 2011

I'm not sure if this should go on this topic.

Anyway, I've loaded 5 .csv files through an external table, and after doing so I tried to delete them.

But this error comes up: "Cannot delete 'filename': It is being used by another person or program".

I closed SQL Developer and tried deleting them manually again, and the result was the same.

After restarting I tried deleting one .csv and it worked, but once I opened SQL Developer and tried deleting the other files, I couldn't.

The question is: is it impossible to delete files that were used by external tables while SQL Developer is running?

The thing is that I've created a stored procedure that deletes the files, and obviously it can't work this way. So I would have to restart the computer every time I load a csv file before I could delete it.

View 6 Replies View Related

SQL & PL/SQL :: How To Insert BLOB Files Into Tables

Dec 4, 2011

I created a music database, and I'm having trouble inserting the audio, video, and lyrics (.doc) into their respective tables. I searched through the forums and found some example code, but I'm not sure how to modify it to fit my purposes.

What I need is a procedure that can insert a complete record into the track table (including an .mp3 file for each row), one that can insert a record into the lyrics table (including a .doc file for each row), and a procedure that can insert a single record into the Video table (including an .mv4 file).

Here's the DDL:

CREATE TABLE Artist(
artist_id number(9),
artist_name varchar(30),
country char(2),
num_albums number(3),
num_songs number(5)
);

[Code]....
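
A hedged sketch for the audio case, assuming a TRACK table with (track_id NUMBER, track_name VARCHAR2, audio BLOB) and an Oracle directory object MEDIA_DIR pointing at the folder holding the .mp3 files; the lyrics (.doc) and video (.mv4) procedures would look the same with their own tables and columns:

CREATE OR REPLACE PROCEDURE insert_track (
  p_track_id   IN NUMBER,
  p_track_name IN VARCHAR2,
  p_file_name  IN VARCHAR2
) AS
  v_bfile BFILE := BFILENAME('MEDIA_DIR', p_file_name);
  v_blob  BLOB;
BEGIN
  -- create the row with an empty BLOB and grab its locator
  INSERT INTO track (track_id, track_name, audio)
  VALUES (p_track_id, p_track_name, EMPTY_BLOB())
  RETURNING audio INTO v_blob;

  -- copy the OS file into the BLOB column
  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADFROMFILE(v_blob, v_bfile, DBMS_LOB.GETLENGTH(v_bfile));
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END insert_track;
/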

View 39 Replies View Related

SQL & PL/SQL :: Insert Data In Three Tables From Three CSV Files Simultaneously?

Jun 12, 2013

I am trying to insert data in three tables from three csv files simultaneously. This is what I have so far:

---insert all data from three csv files
DECLARE
--zenobject
F UTL_FILE.FILE_TYPE;

[Code]....

View 5 Replies View Related

PL/SQL :: Spool Data From Tables Into Flat Files

Oct 23, 2012

I am trying to spool data from tables into flat files. I am using the following scripts to accomplish it:

1. A cmd file (Windows) that makes a call to a SQL file
2. The SQL file, which generates another query file at run time, depending upon the table name passed to it
3. The run-time query file, which executes the final query and spools the data into a pipe-delimited txt file

For example:

Actual command passed: C:Spool_utilityspool_utility TABLE_NAME

E.g. of the spool utility file :

@echo off

SET dbuser=XX@YY
SET dbpw=xxxx

echo %date% - %time% - Start > %1%log.txt

echo START

sqlplus -s %dbuser%/%dbpw% @spool_utility.sql %1>%1.txt

echo %date% - %time% - Done >> %1%log.txt
echo DONE

E.g. of the spool_utility.sql

set echo off
SET newpage 0
SET feedback off
SET linesize 32767
set pagesize 0
[code]........

The above generates a table_name.sql file with the actual table name at run time, executes it, and writes the output to the table_name.txt file.

This works perfectly fine. The issue is that when someone passes a wrong table name, or there is an actual run-time error while executing the query, the error details themselves get written to the end of the spool file.

For example, if I do the following just to generate an error and execute it from the command line, the query fails and writes the error to the spool file, but at the command prompt where I executed the command I do not see any error and the process appears to have run perfectly well:

set xxx on xxx off as above

spool &1.sql;

Prompt Select * from &1 where rownum><10---this will cause the issue

spool off
set termout ON
@ &1

EXIT

Eg of spool file generated :

from     table_name WHERE rownum><=10 *
ERROR at line 62:
ORA-00936: missing expression

My question is: is there any way I can capture this runtime error, return it to my calling SQL script spool_utility.sql, and then propagate it to the calling command file so that I can do some cleanup, e.g. removing the spool file and writing the actual error to a log file? Basically, is there any way to know at the OS calling level that the entire spooling operation was unsuccessful?
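
One approach that fits this setup (a sketch, to be adapted to the real scripts): put WHENEVER directives at the top of spool_utility.sql, ahead of the @ &1 call, so that any ORA- or OS error makes SQL*Plus exit with a non-zero return code instead of carrying on:

WHENEVER SQLERROR EXIT SQL.SQLCODE
WHENEVER OSERROR  EXIT FAILURE

The calling .cmd file can then test ERRORLEVEL right after the sqlplus line and, when it is non-zero, delete the half-written spool file and write the failure to the log.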

View 8 Replies View Related

Server Utilities :: Load Data Into More Tables From Many Files

Jan 20, 2012

I want to load data into multiple tables from many files, based on the first column value, which is a FILLER field. I am trying to test this scenario with two Oracle tables that have similar definitions, loading one record into each table using the WHEN/POSITION keywords. For this, I added the first column as a reference column in the data, which I have in the ctl file itself.

The 1st table loads its record, but the 2nd record is not loading. Have I missed anything with the WHEN/POSITION keywords?

This is the error in log file for 2nd table(WD1):

Record 2: Rejected - Error on table WD1, column TAB.
ORA-01841: (full) year must be between -4713 and +9999, and not be 0

Table WD1:
0 Rows successfully loaded.
1 Row not loaded due to data errors.
1 Row not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
[code]....
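
For comparison, a hedged control-file sketch (field names, positions, and the date mask are illustrative, not taken from the actual file). The usual trap with several INTO TABLE clauses is that field scanning in later clauses continues from where the previous clause stopped unless the fields are anchored with POSITION, which shifts every column and typically produces errors such as the ORA-01841 above:

LOAD DATA
INFILE 'multi.dat'
APPEND
INTO TABLE wd
  WHEN (1:1) = '1'
  (rec_ind  POSITION(1:1)  CHAR,
   tab      POSITION(2:11) DATE "YYYY-MM-DD")
INTO TABLE wd1
  WHEN (1:1) = '2'
  (rec_ind  POSITION(1:1)  CHAR,      -- anchor back to column 1 for this table too
   tab      POSITION(2:11) DATE "YYYY-MM-DD")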

View 9 Replies View Related

Server Utilities :: Loading Multiple Excel Files To Different Tables

Mar 29, 2012

I have a bunch of data in 50 Excel files. I need to load these 50 files into 50 different tables and would like to do this in one script. I went through the forum to get this information; people suggested creating a shell script, or listing the sqlldr command multiple times, etc.

Provide some clarity on what the best approach is. If it is through shell scripting, provide the shell script and instructions to execute it; I am new to shell scripting.

View 5 Replies View Related

Server Utilities :: Load 780 CSV Files Into 12 Tables Created In Database - Sql Loader?

Jul 22, 2011

I have 780 (12*65) csv files generated from 65 databases. Now I have to load these 780 csv files into 12 tables created in my database for monitoring and reporting purposes. To call SQL*Loader I am planning to create 780 lines like the ones below.

sqlldr abc@tns/pwd control='E:htmlctlhtml_broken_jobs_rpt.ctl' log='E:htmlreportloghtml_broken_jobs_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_db_size_rpt.ctl' log='E:htmlreportloghtml_db_size_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_fragmentation_rpt.ctl' log='E:htmlreportloghtml_fragmentation_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_index_stats_rpt.ctl' log='E:htmlreportloghtml_index_stats_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_invalid_object_rpt.ctl' log='E:htmlreportloghtml_invalid_object_rpt.log'
sqlldr abc@tns/pwd control='E:htmlctlhtml_long_running_queries_rpt.ctl' log='E:htmlreportloghtml_long_running_queries_rpt.log'

We know that creating 780 control files is a difficult task, so I have created only 12 control files. Is there any mechanism in SQL*Loader to pass a variable (planning to declare it on the sqlldr line) to the INFILE clause, like below?

infile "E:htmlreportoutput&a_html_broken_jobs_rpt.csv"

Here a is the variable name; it will change once every 12 csv files.

or

Is there any other way to achieve this?
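
One option that avoids a variable in the control file altogether (a sketch, with paths shortened to placeholders): the DATA command-line parameter of sqlldr overrides the INFILE clause, so each of the 12 control files can be reused for all 65 databases by pointing DATA at a different csv on every call:

sqlldr abc@tns/pwd control=html_broken_jobs_rpt.ctl data=db01_html_broken_jobs_rpt.csv log=db01_html_broken_jobs_rpt.log
sqlldr abc@tns/pwd control=html_broken_jobs_rpt.ctl data=db02_html_broken_jobs_rpt.csv log=db02_html_broken_jobs_rpt.log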

View 8 Replies View Related

Export/Import/SQL Loader :: External Tables Loading Multiple Files From Directory One By One

Oct 4, 2013

The situation is the following: I have a directory named /dat/global/stock/, and inside it I will get files named differently, for example abcdef.112, dfgrt.2, and so on.

Here I want to load these files one by one into an external table and generate one more file based on some enrichment.

Step 1: Take the first file and load it into the external table.
Step 2: Enrichment.
Step 3: File generation.

The problem I am facing is that I usually get about 1000 files in that particular directory, so I need to pick the files up one by one and then put each one into another directory. How can I get the files one by one and generate the output file using the Oracle loader?
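
A hedged sketch of the loading half: an external table's LOCATION clause can be re-pointed with ALTER TABLE, so a PL/SQL loop can process the files one at a time (the external table, staging table, and the table listing the file names are hypothetical; getting the list of 1000 file names into the database, e.g. via a job that writes a directory listing, is a separate step):

BEGIN
  FOR f IN (SELECT file_name FROM stock_file_list ORDER BY file_name) LOOP
    EXECUTE IMMEDIATE
      'ALTER TABLE stock_ext LOCATION (''' || f.file_name || ''')';
    INSERT INTO stock_stage
      SELECT * FROM stock_ext;        -- enrichment logic would go here
    COMMIT;
  END LOOP;
END;
/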

View 4 Replies View Related

Server Utilities :: Loading Multiple Input Files Into Multiple Tables

Jul 9, 2012

NGFID;RECTYPE;RECNAME
25;7;POLES
PARENT
CHILD;1401;9845075;2020
817;8;SUPPORT
PARENT
CHILD

Required output:-

AREA_SRNO = 1
AREA_NAME = '3rivieres.export.ngf'

File :-mauri.export.ngf

NGFID;RECTYPE;RECNAME
257;7;POLES
PARENT
CHILD;1401;9845075;2020
8174;8;SUPPORT
PARENT
CHILD

Required output:-

AREA_SRNO = 2
AREA_NAME = 'mauri.export.ngf'....etc

CREATE TABLE NGF_REC_LINK
(
AREA_SRNO NUMBER(2),
AREA_NAME VARCHAR2(40),
NGFID NUMBER(20),
TABLENAME VARCHAR2(40),
PARENT VARCHAR2(200),
[code].......

Please look at the ctl file (ngf_test.ctl) and modify it as per my requirement.

View 6 Replies View Related

Precompilers, OCI & OCCI :: Make Files That Compile PC Files In Unix

Jul 31, 2008

I have written makefiles that compile .pc files on Unix. This was for several projects that use an oralib source code directory. Just running proc on one target .pc file works fine on Unix. I am trying to use proc (Oracle 10.2.0) on Windows and I keep getting:

Quote: unable to open include file
#include <stdio.h>
and other C library headers.

I am doing all development under Cygwin; this way I can write a makefile just like under Unix instead of using nmake. All C library headers are in /usr/include. When I run proc on Solaris like this:

proc program.pc
No problems, and I do get program.c.

However, on Windows I get the previous error message. I have tried proc include=/user/include program.pc and proc include=/user/include parse=full program.pc, but I still get the same error message.
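
One workaround often suggested for this message (an assumption here, not verified on this particular Cygwin setup): precompile with parse=none, so Pro*C only parses the EXEC SQL sections and no longer needs to open the C system headers; host variables then have to be declared inside a DECLARE SECTION:

proc parse=none program.pc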

View 3 Replies View Related

Know If Database Files Are On File System Or Raw Files?

Oct 11, 2012

How can I know whether the database files are on a file system or on raw devices?
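
A quick check is to look at the file names the instance reports: ordinary OS paths indicate file-system storage, names like /dev/raw/... or /dev/rdsk/... indicate raw devices, and a +DISKGROUP prefix indicates ASM:

SELECT name   FROM v$datafile
UNION ALL
SELECT member FROM v$logfile
UNION ALL
SELECT name   FROM v$controlfile;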

View 4 Replies View Related

Compare Two Dump Files And Their Respective Log Files

Jun 20, 2012

I want to compare two dump files and their respective log files to see the differences. Is there any way to do this?

View 8 Replies View Related

Upload Excel To 9i

Feb 15, 2004

I'm working with an Oracle 9i database for my project. My questions are:

1) Can we upload MS Excel (.xls) files into an Oracle table? If yes, is it a direct upload, or do we have to convert them to .csv or another format first?

2) What tools are there to upload MS Excel into Oracle 9i?

View 5 Replies View Related

Upload XML Data To Table

Oct 1, 2009

I need to upload data from an XML file into an Oracle database table. I have looked at several websites where the solutions load the entire XML file into a LOB column inside the table, but I only want the data extracted from the XML file.

Would SQL*Loader be the best option?
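
If the file does not need to be kept, one option on 10g and later is to parse it in-flight with XMLTABLE and insert only the extracted values; a hedged sketch assuming a hypothetical structure <emps><emp><id>...</id><name>...</name></emp></emps> and a directory object XML_DIR:

INSERT INTO emp_stage (id, name)
SELECT x.id, x.name
FROM   XMLTABLE('/emps/emp'
         PASSING XMLTYPE(BFILENAME('XML_DIR', 'emps.xml'),
                         NLS_CHARSET_ID('AL32UTF8'))
         COLUMNS
           id   NUMBER        PATH 'id',
           name VARCHAR2(50)  PATH 'name') x;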

View 1 Replies View Related

SQL & PL/SQL :: BLOB Bulk Upload

May 11, 2011

Is it possible to bulk upload files into an Oracle table's BLOB column using PL/SQL code?

The process I know of does one file at a time as of now.

I have more than 10,000 files to upload, and this will be a one-off process.

View 3 Replies View Related

Forms :: Upload File In Ebs 11i?

Oct 8, 2013

oracle forms 6i
oracle ebs 11.5.10
database 9i

I have created a new custom application in my EBS 11i, and I want to upload a file in our custom application and save it on the file system of my Linux server.

How do I attach my file? I want to save it not as a BLOB but just upload it to my server and download it again afterwards.

View 2 Replies View Related

SQL & PL/SQL :: Upload Word File

Aug 9, 2010

I want to upload a Word file to the database. I wrote the following code:

create or replace directory MY_FILES as 'C:Uploadfolder';

Create table emp1(
EMPNO number(30),
EMPNAMe varchar2(10),
RESUME BLOB);

[Code]....

ERROR at line 1:
ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
ORA-06512: at "SYS.DBMS_LOB", line 523
ORA-06512: at line 10

View 32 Replies View Related






