SQL & PL/SQL :: Importing XML Files Into Oracle Database
Apr 25, 2010
I have a big problem that came up lately: importing XML files into an Oracle database. I have extracted a whole PostgreSQL database into XML files - 236 tables, one XML file for every table - and now I am about to import them into Oracle tables. First of all, I would like to point out that I already have the structure of all the tables in the Oracle database; the files only carry the data (records) that need to be imported.
I've been trying to get this working for over a week and can't make any real progress. I will show you an example:
Below is one of my attempts to import data from the file "ps_sprawozdania.xml" into the Oracle table "ps_sprawozdnaia". Here are two records from the XML file to show you its structure:
<NewDataSet>
<Cust>
<miesiac>7</miesiac>
<umowa_rok>2008</umowa_rok>
<umowa_nr>051/210412/01/000/08</umowa_nr>
<nr_korekty>0</nr_korekty>
<nazwa>Sprawozdanie z realizacji umowy nr 051/210412/01/000/08 za miesiąc Lipiec</nazwa>
[code]....
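Since the original attempt is not shown above, here is a minimal sketch of one way to load records like these with XMLTABLE. It assumes the file is exposed through a directory object (XML_DIR is a placeholder), that the target table and column names match the element names in the sample, and it lists only the columns visible above; the real table may need adjusting.

-- Sketch only: directory object, column names and datatypes are assumptions.
INSERT INTO ps_sprawozdania (miesiac, umowa_rok, umowa_nr, nr_korekty, nazwa)
SELECT x.miesiac, x.umowa_rok, x.umowa_nr, x.nr_korekty, x.nazwa
FROM   XMLTABLE('/NewDataSet/Cust'
         PASSING XMLTYPE(BFILENAME('XML_DIR', 'ps_sprawozdania.xml'),
                         NLS_CHARSET_ID('AL32UTF8'))
         COLUMNS
           miesiac    NUMBER         PATH 'miesiac',
           umowa_rok  NUMBER         PATH 'umowa_rok',
           umowa_nr   VARCHAR2(30)   PATH 'umowa_nr',
           nr_korekty NUMBER         PATH 'nr_korekty',
           nazwa      VARCHAR2(200)  PATH 'nazwa') x;
COMMIT;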
There is a report that is generated every day in .csv format that I want to load into the database. I want to completely automate this process, and I don't want to use the load function available in APEX. I use APEX version 3.2.
What I mean by automation is that I will create the CSV file and automatically move it to a location from which APEX can access the data. I would then write a procedure to fetch the content directly from the CSV file and write it to the database. What I need is the location from which APEX can access the data directly.
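Because APEX code runs inside the database, the "location" can be any folder on the database server that an Oracle directory object points to; the procedure then reads the file server-side. A minimal sketch, using an external table so the scheduled procedure only has to run an INSERT ... SELECT; the path, file name, column list and target table are all placeholders:

-- Sketch: path, file name, columns and target table are assumptions.
CREATE OR REPLACE DIRECTORY report_dir AS '/u01/app/reports';

CREATE TABLE daily_report_ext (
  report_date VARCHAR2(20),
  item_code   VARCHAR2(30),
  amount      NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY report_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('daily_report.csv')
)
REJECT LIMIT UNLIMITED;

-- The procedure (or a DBMS_SCHEDULER job) can then simply do:
INSERT INTO daily_report (report_date, item_code, amount)
SELECT TO_DATE(report_date, 'YYYY-MM-DD'), item_code, amount
FROM   daily_report_ext;
COMMIT;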
Actually I am using Oracle 9i on the Windows operating system. Now I want to move my entire Oracle database to Oracle 11g on Linux. What are the correct steps to do this? I am not a DBA.
I have a rather complicated process to import text files into my DB. I'm given thousands of files every day, comma-separated and with 80 fields each. With a bash script, I take the 45 fields I need and then split each file into a number of files, grouping the rows by three fields. Then I use SQL*Loader to insert them into the DB.
The problem is that now I must insert into two tables, and the "WHEN" clause doesn't allow the use of > and <.
To make things a little clearer, take this text file (already split, grouped, and ready to be inserted):

...
1,1,135,1900,0,12,114,2011/08/25 17:19:00,135,...
1,1,135,1900,0,13,119,2011/08/25 17:19:00,136,...
1,1,135,1900,0,14,117,2011/08/25 17:19:00,137,...
1,1,135,1900,0,15,113,2011/08/25 17:19:00,138,...
1,1,135,1900,0,16,119,2011/08/25 17:19:00,139,...
...

When field 6 is greater than or equal to 14, the row must go to table a. When field 6 is lower than 14, it must go to table b. I can't use external tables, as I'm on a different server.
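One workaround that keeps SQL*Loader and avoids external tables: load the file as-is into a single staging table (no WHEN clause needed), then split the rows with a multitable INSERT, where the full set of comparison operators is available. A sketch, in which the staging table, target tables and column names are all placeholders:

-- Sketch: stg_rows is the staging table SQL*Loader fills; field6 is the sixth field.
INSERT ALL
  WHEN field6 >= 14 THEN INTO table_a (c1, c2, c3, c4, c5, c6)
                         VALUES (field1, field2, field3, field4, field5, field6)
  WHEN field6 <  14 THEN INTO table_b (c1, c2, c3, c4, c5, c6)
                         VALUES (field1, field2, field3, field4, field5, field6)
SELECT field1, field2, field3, field4, field5, field6   -- remaining fields omitted
FROM   stg_rows;
COMMIT;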
1) I have 5 exported dump files.
2) All 5 dump files were taken at different time periods.
3) Many of those dump files contain the same partition records.

e.g.:
Dump 1: 01-06-2010 to 31-11-2010
Dump 2: 01-09-2010 to 31-12-2010

4) Now I want to import all of that partitioned data into a single table, without any duplication.
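One possible approach (a sketch; table and key names are assumptions): import each dump into a staging table first - for example with Data Pump's TABLE_EXISTS_ACTION=APPEND, or a REMAP_TABLE into a staging copy - and then move rows into the final table while skipping keys that are already present:

-- Sketch: assumes the partition key plus a business key identify a row uniquely.
INSERT INTO target_table
SELECT *
FROM   staging_table s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   target_table t
         WHERE  t.business_key   = s.business_key
         AND    t.partition_date = s.partition_date
       );
COMMIT;

A unique constraint on the target table's key columns is a useful extra safety net against duplicates slipping through.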
I am considering all of the capabilities and benefits of using Data Pump for exporting and importing extremely large data files. I would like to know whether importing to tape is possible. If so, would the data be accessible later if needed?
Here I am explaining how I am trying to insert a PDF file into the Oracle database.
create or replace directory files as 'c:/welcome/';
(The physical directory has also been created on the system, on both the server and the client machine.)
Create or replace PROCEDURE procloadMetaPdf (Filename IN VARCHAR2) is
  temp_blob     blob := empty_blob();
  location      BFILE;
  Bytes_To_Load Integer := 0;
  auto_Id       number;
Begin
[Code]...
The procedure is created successfully, but when executing
exec procloadMetaPdf('help.pdf');
it displays the following error:
ERROR at line 1:
ORA-22285: non-existent directory or file for FILEOPEN operation
ORA-06512: at "SYS.DBMS_LOB", line 605
ORA-06512: at "SCOTT.PROCLOADMETAPDF", line 14
ORA-06512: at line 1
(Line 14 is: DBMS_LOB.OPEN(location, DBMS_LOB.LOB_READONLY).)
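ORA-22285 on DBMS_LOB.OPEN of a BFILE usually means the BFILENAME call does not match an existing directory object and file on the database server: the directory name is case-sensitive (a directory created as "files" is stored as FILES unless quoted), and the file must exist on the server, not on the client. Since the body of the procedure is not shown, here is a minimal sketch of the whole load using DBMS_LOB.LOADBLOBFROMFILE; the table pdf_docs and the sequence pdf_docs_seq are assumptions:

-- Sketch: assumes a table pdf_docs(id NUMBER, filename VARCHAR2(100), content BLOB)
-- and the directory object FILES created above.
CREATE OR REPLACE PROCEDURE procloadMetaPdf (p_filename IN VARCHAR2) IS
  v_bfile       BFILE := BFILENAME('FILES', p_filename);  -- must match the directory name's case
  v_blob        BLOB;
  v_dest_offset INTEGER := 1;
  v_src_offset  INTEGER := 1;
BEGIN
  INSERT INTO pdf_docs (id, filename, content)
  VALUES (pdf_docs_seq.NEXTVAL, p_filename, EMPTY_BLOB())
  RETURNING content INTO v_blob;

  DBMS_LOB.OPEN(v_bfile, DBMS_LOB.LOB_READONLY);
  DBMS_LOB.LOADBLOBFROMFILE(dest_lob    => v_blob,
                            src_bfile   => v_bfile,
                            amount      => DBMS_LOB.GETLENGTH(v_bfile),
                            dest_offset => v_dest_offset,
                            src_offset  => v_src_offset);
  DBMS_LOB.CLOSE(v_bfile);
  COMMIT;
END procloadMetaPdf;
/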
I would like to ask if you know of a built-in I can use to transfer an Excel file from our Unix box into a table in the Oracle database. Right now we are using webutil_file_transfer.Client_To_DB_with_progress in Forms Developer, but I need this to run as an automatic process that uploads from Unix into Oracle directly, without using Forms.
I am trying to set up my environment to prepare for hands-on DBA exercises from an OCM handbook for 10g - OCM: Oracle Database 10g Administrator Certified Master Exam Guide.
I need access to the following files: gc_10205_part1of2.zip and gc_10205_part2of2.zip for 32-bit Linux.
Apparently the above, along with other Grid software, was once obtainable from the link below, but sometime between last year and this one Oracle decided to remove this and other software from public access. URL.....
I doubt Oracle will hear my plea - this is for educational purposes only - but I only want access to the software.
I am migrating data from a Solid database to Oracle, using flat files to do it:

1. I download the data from Solid to flat files.
2. I move the files to the Oracle server.
3. I upload the data into Oracle.

Now, I have done 90% of the database, but I have found some tables that have description columns, and in these descriptions the users type line breaks (Enter), so when I try to upload the data, Oracle SQL*Loader cannot handle these characters.
Example:
'25','0.','5.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'26','0.','2.','0.','0.','0.','0.','3.','0.','0.','0.','0.','0.','',''
'27','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'28','0.','1.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'29','0.','38.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'30','0.','13.','0.','0.','0.','0.','0.','6.','0.','6.','0.','0.','|SE RECHAZA B20CS50SNW ^M ^M SE RECHAZAN CINCO PZAS ^M DOS MOD. HSC15I41EH,DOS MOD. HSK15I41EH |Agregó: 06/06/2009 12:22:50 |','DEV. A PROV.'
'31','0.','50.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'32','0.','9.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
'33','0.','2.','0.','0.','0.','0.','0.','0.','0.','0.','0.','0.','',''
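The root cause is that SQL*Loader treats every newline as the end of a record, while the description fields contain user-entered line breaks (the ^M carriage returns visible above). Two common workarounds: strip or replace the stray CR/LF characters in the bash step before loading, or give the loader an explicit end-of-record marker. Since the files are already moved to the Oracle server, an external table is also an option; the sketch below assumes the bash script appends the literal string <EOR> to every logical record, and that DATA_DIR, the column list and the file name are placeholders:

-- Sketch only: record marker, directory, columns and file name are assumptions.
CREATE TABLE solid_stage_ext (
  col1        VARCHAR2(10),
  col2        VARCHAR2(10),
  -- ... remaining columns ...
  descripcion VARCHAR2(4000),
  estado      VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY '<EOR>'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY "'"
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('solid_export.txt')
)
REJECT LIMIT UNLIMITED;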
The process started, but after some time it gave 1200 errors. Is it due to the different database name, or is it because I did not create the tablespace in the destination database?
I am using Oracle 11g R2. I want to import the DB statistics, but I am getting an exception when I execute the command DBMS_STATS.IMPORT_SCHEMA_STATS ('user1','STATS_INFO', '','', TRUE, FALSE).
The error is:
ORA-20000: no statistics are imported
ORA-06512: at "SYS.DBMS_STATS", line 10603
ORA-06512: at line 1
The privileges 'ANALYZE ANY' and 'ANALYZE ANY DICTIONARY' have already been granted to the user. I also executed the command as SYS, but the error still occurs.
The same command executes successfully in Oracle 10g. Is there any difference in importing statistics between Oracle 10g and 11g?
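One difference that commonly produces exactly this ORA-20000 is the format of the statistics table itself: a stats table created with CREATE_STAT_TABLE under 10g has an older layout, and in 11g it generally has to be upgraded before IMPORT_SCHEMA_STATS will read it. A sketch, reusing the names from the post:

BEGIN
  -- bring the statistics table created under 10g up to the 11g format
  DBMS_STATS.UPGRADE_STAT_TABLE(ownname => 'user1', stattab => 'STATS_INFO');

  -- then retry the original import call
  DBMS_STATS.IMPORT_SCHEMA_STATS('user1', 'STATS_INFO', '', '', TRUE, FALSE);
END;
/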
I have received a dump which I need to import into a newly created schema. There is a particular table with more than 4 million rows, while the other tables have hardly a few thousand rows each.
I want to import it in such a way that only 1000 rows get imported for this table while the other tables are not affected. Is there a way to do it?
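If the dump is a Data Pump export, the impdp QUERY parameter can restrict the rows imported for a single table; the same thing can be done from PL/SQL with DBMS_DATAPUMP, sketched below. The job name, dump file name, directory object and table name are all placeholders:

-- Sketch: imports the whole dump, but restricts BIG_TABLE to about 1000 rows.
DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'IMP_SAMPLE_JOB');
  DBMS_DATAPUMP.ADD_FILE(h, 'export.dmp', 'DATA_PUMP_DIR');

  -- apply the row limit only to the 4-million-row table
  DBMS_DATAPUMP.DATA_FILTER(handle     => h,
                            name       => 'SUBQUERY',
                            value      => 'WHERE ROWNUM <= 1000',
                            table_name => 'BIG_TABLE');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/

With the classic imp utility there is no per-table row filter, so the big table would have to be handled separately from the rest of the import.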
While trying to import a schema using Data Pump, I am facing the following issue - UDI-00018 - the Import utility version cannot be more recent than the Data Pump server. Following is the version information of the source and target DBs and the utilities:
Source DB server : 10.1.0.2.0 Export utility : 10.1.0.2.0 Import utility : 10.1.0.2.0
Target DB server : 10.1.0.2.0 Export utility : 10.2.0.1.0 Import utility : 10.2.0.1.0
We have a customer who was backing up his data by copying the Oracle XE data folder (all the .dbf files) to a backup drive. His server crashed, and he formatted it and reinstalled the 11g database from scratch.
Now he needs to re-attach the .dbf files - is there a way?
I am an absolute neophyte at this. I'm fairly good at SQL, but I have not administered a database before.
I am trying to take a backup from our production database and restore it to a fresh install of Oracle 10.2 on a Windows 2003 server using RMAN - but I am not getting very far. Software versions are all the same. I have access to the manuals, but it seems like every link I follow adds more questions than it answers.
I have gone through the wizard and set up oracle on the new box - that all seemed to go smoothly. In order for this to work, do I need to set up the database with the same DBID as the one in the backup files? Or is there a way to import the data from the backup and have it build the database from those files?
I'm not sure I'm even asking the right questions. Do any of y'all know of a tutorial that would walk me through this process step by step?
I have a requirement. My front end is Oracle Application. If any user deletes data from a front-end screen, a log file should be generated and saved in a folder on my server. The log file records which code was deleted, the user name of whoever deleted it, the time, and the deletion status.
Now my requirement is to develop a report to display the log file data, but I don't have a database table to retrieve the data from - the data only exists in the log files. How can I transfer the log files into a DB table?
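If the log files live in a folder the database server can see, they can be exposed through a directory object and read either with an external table or line by line with UTL_FILE. A sketch of the UTL_FILE variant; the directory object, file name and target table are placeholders:

-- Sketch: assumes CREATE DIRECTORY log_dir AS '/path/to/logs' has been run and a
-- table delete_log(log_line VARCHAR2(4000)) exists to receive the raw lines.
CREATE OR REPLACE PROCEDURE load_delete_log (p_filename IN VARCHAR2) IS
  v_file UTL_FILE.FILE_TYPE;
  v_line VARCHAR2(4000);
BEGIN
  v_file := UTL_FILE.FOPEN('LOG_DIR', p_filename, 'R', 4000);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(v_file, v_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file
    END;
    INSERT INTO delete_log (log_line) VALUES (v_line);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
  COMMIT;
END load_delete_log;
/

The report can then parse the user name, time and status out of log_line, or the INSERT can be extended to split each line into separate columns.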
load data
infile 'trlc.csv'
replace into table trlc
fields TERMINATED BY '|'
TRAILING NULLCOLS
(est_no, right_no, maj_auth, weight, idm_ht, c_date, P_tkt)
The rows get inserted successfully, but the result sets are different. For example, when I run 'select len(weight) from trlc;' in SQL Server, I get the length as 0, but when I run the equivalent select in the Oracle database, I get the length as 1. Also, the result set varies for the query below:
select * from trlc where weight=' ';
(SQL Server returns 1 row but Oracle returns no rows)
Do I need to specify any conversion code for the weight field so that it accepts the ' ' value?
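Part of the difference is that Oracle stores a zero-length string as NULL (unlike SQL Server), and NULL never satisfies an equality test, so weight = ' ' only matches rows that really contain a space. A couple of quick checks, plus one way to make the query behave like the SQL Server version (a sketch, no schema changes assumed):

SELECT COUNT(*) FROM trlc WHERE weight IS NULL;   -- rows loaded as empty/NULL
SELECT COUNT(*) FROM trlc WHERE weight = ' ';     -- rows holding a literal space

-- treat NULL and a single space the same way:
SELECT * FROM trlc WHERE NVL(weight, ' ') = ' ';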
We're currently in a situation where the primary database server's filesystem size does not match the standby database server's.
The standby database filesystem is almost 100% utilized, and we suggested moving some of the datafiles first to avoid threshold alerts and archive gaps.
Now, if we're gonna move datafiles on the physical standby, I believe the process would be stop managed redo apply -> shutdown standby -> OS move -> startup mount -> alter rename -> start managed redo apply. Is this correct? If not, how?
Also, would it have an effect if the controlfiles of primary and standby do not match because of the movement?
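That sequence is broadly right; the one extra detail is that STANDBY_FILE_MANAGEMENT must be set to MANUAL while the rename runs, otherwise ALTER DATABASE RENAME FILE is rejected on a standby. A sketch of the statements on the standby side (paths are placeholders):

-- on the standby; do the OS-level move while the instance is down
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
SHUTDOWN IMMEDIATE
-- move the datafile(s) at the OS level here
STARTUP MOUNT
ALTER SYSTEM SET STANDBY_FILE_MANAGEMENT = MANUAL;
ALTER DATABASE RENAME FILE '/old/path/users01.dbf' TO '/new/path/users01.dbf';
ALTER SYSTEM SET STANDBY_FILE_MANAGEMENT = AUTO;
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;

As for the controlfiles: the rename only updates the standby's own controlfile. The primary and standby controlfiles do not have to record the same datafile paths, as long as files created on the primary in the future are still mapped correctly on the standby (for example via DB_FILE_NAME_CONVERT or OMF).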
I have the following situation: a customer sent me a group of files that he said were a "backup" of an old database that is no longer online (he doesn't use our software anymore). However, he needs to recover the data in this database. The person at the company who generated these files doesn't work there anymore, so we don't have any of the following information:
- database version, connection details, service names, user names, or tablespace names.
We were just told that these files were recorded on a DVD and that it is the only remaining source. The files he sent are the following:
2) Created a source directory --> DR1
3) Created a target directory --> DR2
4) Created a DB link: CREATE DATABASE LINK LINK1 CONNECT TO <username of the remote DB> IDENTIFIED BY <password> USING '<remote DB name taken from the TNS file>'.
5) On the local server I had written the below command.
create or replace procedure proc1 is
  cursor c1 is
    select recid, substr(name,37) "ARCH_FILE"
    from v$archived_log;
  var1 c1%rowtype;
begin
  open c1;
  fetch c1 into var1;
[code]...
It is working, but it is not copying the files from the local server to the remote one.
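A cursor over v$archived_log only lists the file names; it does not copy anything by itself. For actually pushing files to the remote database over the link, DBMS_FILE_TRANSFER.PUT_FILE is the usual tool. A sketch reusing the objects named above (DR1, DR2, LINK1); note that DR2 must be a directory object defined in the remote database, and DBMS_FILE_TRANSFER only copies files whose size is a multiple of 512 bytes, which archived logs normally are:

-- Sketch: copies each archived log listed in v$archived_log from the local DR1
-- directory to the DR2 directory on the database reached through LINK1.
-- (In practice you would also track which logs have already been copied.)
BEGIN
  FOR r IN (SELECT SUBSTR(name, 37) AS arch_file
            FROM   v$archived_log
            WHERE  deleted = 'NO') LOOP
    DBMS_FILE_TRANSFER.PUT_FILE(
      source_directory_object      => 'DR1',
      source_file_name             => r.arch_file,
      destination_directory_object => 'DR2',
      destination_file_name        => r.arch_file,
      destination_database         => 'LINK1');
  END LOOP;
END;
/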
We have a host for restore purposes. We execute monthly or quarterly restore operations to verify that we are able to restore a subset of the data in the given amount of time, or for other purposes. I have an automated script to clone the database, but to clone the database we need to remove the old database from ASM before starting the operation. I want to remove only the database files and keep the configuration (such as oratab, the network files, the OCR and so on).
Question: is there an easy way to remove these files without connecting to ASM or using DBCA?
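One option that fits this (a sketch, run against the database being thrown away rather than against the ASM instance): the DROP DATABASE statement, or RMAN's DROP DATABASE command, removes the datafiles, online redo logs and controlfiles listed in the controlfile - including ones stored in ASM - while leaving oratab, the network configuration and the OCR untouched:

-- Connect as SYSDBA to the old database itself, not to ASM.
SHUTDOWN IMMEDIATE
STARTUP MOUNT
-- DROP DATABASE requires an exclusive mount with restricted session enabled
ALTER SYSTEM ENABLE RESTRICTED SESSION;
DROP DATABASE;

Archived logs and backups are not removed by the SQL statement; RMAN's DROP DATABASE INCLUDING BACKUPS would cover those as well.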