SQL & PL/SQL :: Import Data From Csv To Temp Then Alter Data As It Goes Into Another Table?
May 30, 2012
I need to import some data from .csv files. There is one file each day, so I want them to be automatically imported into the DB. This is the format it comes in:
So filename is the actual name of the .csv file that the row came from, reading_id is date, num1, num2 and meterid combined, and the remaining fields come from temp_table.
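A minimal sketch of one common way to do this, offered under assumptions: an external table reads the daily .csv and an INSERT ... SELECT builds reading_id while copying the rows into the target table. Only filename, date, num1, num2, meterid and the temp/target split come from the post; the directory path, table names and column types below are illustrative.

CREATE OR REPLACE DIRECTORY csv_dir AS '/data/csv_in';   -- assumed OS path

CREATE TABLE temp_readings_ext (
  read_date VARCHAR2(10),
  num1      NUMBER,
  num2      NUMBER,
  meterid   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('readings_20120530.csv')   -- repoint at each day's file with ALTER TABLE ... LOCATION (...)
);

INSERT INTO readings (filename, reading_id, read_date, num1, num2, meterid)
SELECT 'readings_20120530.csv',
       read_date || num1 || num2 || meterid,   -- reading_id = date, num1, num2, meterid combined
       TO_DATE(read_date, 'YYYY-MM-DD'),
       num1, num2, meterid
FROM temp_readings_ext;

A DBMS_SCHEDULER job (or a shell script calling SQL*Plus) can repoint the LOCATION and run the INSERT once per day.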
I need a DDL trigger so that ALTER commands changing a table's structure are captured in a separate metadata table (META).
CREATE OR REPLACE TRIGGER meta_alter
AFTER ALTER ON SCHEMA
BEGIN
  UPDATE meta
     SET column_name = :new
   WHERE table_name = ora_dict_obj_name
     AND column_name = :old;
END;
/
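For what it's worth, a DDL trigger cannot see :new/:old values; it only gets event attributes such as the object name. A hedged sketch that logs every ALTER into a metadata table (the META_LOG columns are assumptions):

CREATE TABLE meta_log (
  obj_owner  VARCHAR2(128),
  obj_name   VARCHAR2(128),
  ddl_event  VARCHAR2(30),
  changed_on DATE
);

CREATE OR REPLACE TRIGGER meta_alter
AFTER ALTER ON SCHEMA
BEGIN
  -- ora_dict_obj_owner / ora_dict_obj_name / ora_sysevent are the standard event attributes
  INSERT INTO meta_log (obj_owner, obj_name, ddl_event, changed_on)
  VALUES (ora_dict_obj_owner, ora_dict_obj_name, ora_sysevent, SYSDATE);
END;
/

The old and new column names themselves are not exposed to the trigger; to record those you would have to compare snapshots of user_tab_columns before and after the ALTER, or parse the statement text from ora_sql_txt.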
The META table contains the table name and column name. I attached the table data in a text file.
Is it possible to hold the data from a SELECT statement in Oracle without a temp table, materialized view, or view?
My DBA does not grant access to create temp tables, but we are selecting the records with three different SQL statements.
Example: inserting into the temp table
a) insert into temp select empno, ename, sal from emp where sal > 4000
b) insert into temp select empno, ename, sal from emp where dept = 40
c) insert into temp select empno, ename, sal from emp where comm is null
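A sketch of one way to hold the combined result without any temp table, using a WITH clause (subquery factoring); table and column names follow the example above:

WITH combined AS (
  SELECT empno, ename, sal FROM emp WHERE sal > 4000
  UNION
  SELECT empno, ename, sal FROM emp WHERE dept = 40
  UNION
  SELECT empno, ename, sal FROM emp WHERE comm IS NULL
)
SELECT * FROM combined;

UNION removes rows that satisfy more than one condition; use UNION ALL instead if you want exactly what the three inserts would have put into the temp table.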
Export and import of data in Oracle Forms... I have created two buttons, one for export; its trigger is like this:
declare
  alrt        number;
  v_directory varchar2(200) := 'c:\backup';  -- even if C: is not the drive Windows is installed on
  path        varchar2(100) := 'back_up' || to_char(sysdate, 'dd_mm_yyyy-hh24_mi_ss');
  v_exp       varchar2(200) := 'exp hamada/hamada2013@orcl file=' || v_directory || '\' || path || '.dmp';
[code]....
This code works, but it exports not only the data but also the table definitions. For example, I run the export and everything is fine so far: I find the .dmp file in the backup folder. But when I delete all data from my app and try to import this .dmp, it shows an error telling me that the table PHONE already exists. Can I export just the data of PHONE and not the table definition plus data? Or how can I import just the data from this .dmp?
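One hedged option with the classic imp utility is IGNORE=Y, which skips the "table already exists" error and inserts the rows into the existing table. The connection string and paths below just follow the example above, and the timestamped file name is a placeholder:

imp hamada/hamada2013@orcl file=c:\backup\back_up_<timestamp>.dmp ignore=y tables=phone

Classic exp always writes the CREATE TABLE statements into the dump, so filtering happens on the import side.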
I'm currently moving an IOT from one database to another using expdp/impdp. The IOT is non-partitioned and about 100 GB in size, containing roughly 1.1 billion rows.
The dumpfile contains nothing but the IOT. I'm importing with no special parameters and no pre-created IOT, just an ordinary dumpfile import (impdp username/password dumpfile=impdp:iot.dmp nologfile=y).
During the import I got "unable to extend TEMP" errors from impdp:
ORA-39171: Job is experiencing a resumable wait.
ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
ORA-39171: Job is experiencing a resumable wait.
ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
I had to add two additional files to my TEMP tablespace (96 GB of temp in total) before the import could finish.
Is this temp usage to be expected when importing IOTs?
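It is at least plausible: loading an IOT means building its primary-key B-tree, and sorting roughly 1.1 billion rows into key order will typically spill to TEMP. A hedged query to watch the usage while the import runs (the 8 KB block size is an assumption; adjust for your db_block_size):

SELECT tablespace, segtype,
       ROUND(SUM(blocks) * 8 / 1024 / 1024, 1) AS approx_gb   -- blocks * 8 KB, expressed in GB
FROM v$tempseg_usage
GROUP BY tablespace, segtype;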
I am trying to use the exp/imp utilities through cmd; exp and imp finish successfully according to the message at the end, but the data is not imported into the target user.
Microsoft Windows [Version 6.1.7600] Copyright (c) 2009 Microsoft Corporation. All rights reserved.
C:\Users\Neetesh>exp
Export: Release 10.2.0.1.0 - Production on Thu Jul 12 14:18:04 2012
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: scott/tiger@localdb
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Enter array fetch buffer size: 4096 >
Export file: EXPDAT.DMP > d:/scott_data
(2)U(sers), or (3)T(ables): (2)U > t
Export table data (yes/no): yes > y
Compress extents (yes/no): yes > n
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
About to export specified tables via Conventional Path ...
Table(T) or Partition(T:P) to be exported: (RETURN to quit) >
Export terminated successfully without warnings.
C:\Users\Neetesh>imp
Import: Release 10.2.0.1.0 - Production on Thu Jul 12 14:20:09 2012
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Username: localaepuser/flair22@localdb
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production With the Partitioning, OLAP, Data Mining and Real Application Testing options
Import file: EXPDAT.DMP > d:/scott_data
Enter insert buffer size (minimum is 8192) 30720>
Export file created by EXPORT:V10.02.01 via conventional path
Warning: the objects were exported by SCOTT, not by you
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
List contents of import file only (yes/no): no > y
Import entire export file (yes/no): no > y
importing SCOTT's objects into LOCALAEPUSER
Import terminated successfully without warnings.
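From the transcript it looks like two prompts went wrong: the "Table(T) or Partition(T:P) to be exported" prompt was answered with RETURN (quit), so no tables were written to the dump, and "List contents of import file only" was answered y, so imp only listed the file instead of loading it. A hedged non-interactive sketch that avoids the prompts entirely (emp and dept are example table names; the credentials follow the transcript):

exp scott/tiger@localdb file=d:\scott_data.dmp tables=(emp,dept)
imp localaepuser/flair22@localdb file=d:\scott_data.dmp fromuser=scott touser=localaepuser ignore=y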
How can I import data from an Excel sheet (on the server) into the Oracle DB?
Explanation: I need to automate this work. Whenever I get an Excel sheet with table values, I need to import the table values into the Oracle DB automatically. I need an immediate solution for this.
I have a question regarding importing data into a partitioned table. Let me make myself clearer with an example:
I have a one-month table that contains 30 partitions, one partition per day, on one machine, say machine A. On another machine, say machine B, I create the same table with the same script that is on machine A. I load data for the 1st to the 15th of the month into the table on machine A and the data for days 15 to 30 into the table on machine B. At the end I want to import the data from the partitioned table on machine B, that is the 15th to the 30th, into the machine A table. I just want to know whether the data will be properly imported or whether I need to specify something.
I take the export partition-wise (15th to 30th), 15 partition dumps, and import them into the machine A table. Is it possible to import day-wise partitions from the 15th to the 30th into a partitioned table that already contains data in the 1st to 15th partitions?
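A hedged Data Pump sketch of the partition-level transfer (table, partition, directory and credential names are placeholders); table_exists_action=append lets the rows land in a table that already holds the 1st-15th partitions:

expdp user/pass@machineB directory=DATA_PUMP_DIR dumpfile=p16_30.dmp tables=month_tab:p16,month_tab:p17
impdp user/pass@machineA directory=DATA_PUMP_DIR dumpfile=p16_30.dmp tables=month_tab:p16,month_tab:p17 table_exists_action=append

The remaining partitions (p18 up to p30) are listed in the tables clause the same way.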
I'm trying to import a table of data (character set CL8ISO8859P5) into another database (character set AL32UTF8) using the exp/imp utility. After the import, all Cyrillic text is corrupted!
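A common cause is NLS_LANG not matching the source character set during the export, so the data is converted incorrectly before it even reaches the dump. A hedged sketch for a Windows client (user, table and file names are placeholders); imp then converts CL8ISO8859P5 to the target AL32UTF8 itself:

set NLS_LANG=AMERICAN_AMERICA.CL8ISO8859P5
exp user/password@sourcedb file=cyrillic.dmp tables=mytable
set NLS_LANG=AMERICAN_AMERICA.CL8ISO8859P5
imp user/password@targetdb file=cyrillic.dmp ignore=y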
I'm not able to understand what's wrong with the code. I am trying to import data into a table using a CSV file. I exported the data (CSV) from the interactive report and I am just trying to insert that same data into the table through a process. When I tried to do so, it throws a NO_DATA_FOUND error and the file is not getting inserted into the wwv_flow_files table.
But when I removed the data for the comments field from the CSV file and then tried importing the file, the process worked. I don't understand what the problem with the code is.
I have a sample app setup in my workspace for this weird problem.
[URL]
Workspace details:
CSV file with comments field and data in it - when trying to import - throws an error message NO_DATA_FOUND
CSV file with comments field and without data in it - tried importing - this worked
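For reference, a NO_DATA_FOUND in this kind of process usually comes either from the SELECT ... INTO against wwv_flow_files (the upload never arrived there) or from the parsing loop losing track of fields when the free-text comments column contains commas or line breaks. A minimal, hedged sketch of the lookup step (the :P1_FILE item name is an assumption):

DECLARE
  l_blob BLOB;
BEGIN
  SELECT blob_content
    INTO l_blob
    FROM wwv_flow_files
   WHERE name = :P1_FILE;   -- raises NO_DATA_FOUND if the upload is not in wwv_flow_files
  -- parse l_blob line by line here, allowing for quoted commas in the comments column
END;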
I am trying to transfer data from one database to another through Data Pump via SQL Developer (the data volume is quite large), exporting several tables. The table export works fine, but I encounter the following error when I import the file (I tried data only as well as data + DDL).
"Exception: ORA-39001: argument value invalid dbms_datapump.get_status(64...= ORA-39001: argument value invalid ORA-39000: .... ORA-31619: ...
The file is in the right place, the Data Pump directory of the new database. The user is the same on both databases and the database versions are similar.
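Two hedged sanity checks that are often worth running on the target side: confirm which OS path the directory object really points to, and ask impdp to translate the dump into a SQL file without loading anything, which fails fast if the file is unreadable or was damaged in transfer (directory, file and user names are placeholders):

SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';

impdp system/password directory=DATA_PUMP_DIR dumpfile=mydump.dmp sqlfile=ddl_check.sql

If the dump was copied over with FTP in ASCII mode, re-copying it in binary mode is a common fix for ORA-31619.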
When I import each subsequent dump, I drop the existing schema ("SQL> drop user username cascade;") and import the dump with "impdp system ....". I would now like to import a dump into an existing instance, but import the data only, leaving the current packages and other metadata on that instance untouched and unchanged.
1. Do I need to drop the user before the import if my requirements are as above?
2. If I need to drop the user, what should the script be?
3. For the import itself, what parameters should I use (see the sketch after this list)?
4. What do I need to consider before doing the import?
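A hedged sketch for point 3: with content=data_only nothing but rows is imported, so the existing packages and other metadata are left alone and there is no need to drop the user first; table_exists_action decides whether the rows are appended to or replace the current data (schema, directory and file names are placeholders):

impdp system/password directory=DATA_PUMP_DIR dumpfile=mydump.dmp schemas=username content=data_only table_exists_action=truncate

Before running it, check that constraints and triggers on the target tables will tolerate the reload, and that there is enough undo and temp space for the volume involved.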
I performed a switchover test of my Exadata databases last night. Both databases are running 11.2.0.2 (BP7) on top of GI of the same version. I'm using Data Guard Broker to administer the Data Guard configuration.
I have, as you'd expect, standby_file_management set to AUTO, so any file changes/additions/deletions made on Primary should be applied to Standby as well. And they have been, until last night.
When I had switched over to running Primary on the Standby site, I got this error message:
Tue Jun 04 22:27:12 2013
Errors in file /u01/app/oracle/diag/rdbms/exdw1pdg/exdw1pdg1/trace/exdw1pdg1_ora_26630.trc:
ORA-25153: Temporary Tablespace is Empty
I checked, and my two temp tablespaces existed but had no files in them. These files are 200 GB and 448 GB in size, so you'd think you'd notice them going missing. This wasn't by any means the first time we switched over (and, yes, I did create temp files for Standby when I built it and first switched over).
We've switched over to Standby multiple times and have even run a whole day's processing against it, and we haven't seen this. Ultimately, it wasn't a big deal, because I just created a tempfile for each of the tablespaces and off we went. Nothing in MOS seems to mention something like this. Basically, it looks like the switchover process decided to eat my tempfiles but keep my temp tablespace definition. Odd.
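For anyone hitting the same thing, a quick sketch of the check and the fix described above (the tablespace name TEMP and the ASM disk group are assumptions; the sizes come from the post):

SELECT tablespace_name, file_name, ROUND(bytes / 1024 / 1024 / 1024) AS gb
FROM dba_temp_files;

ALTER TABLESPACE temp ADD TEMPFILE '+DATA' SIZE 200G;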
I have an Oracle autobackup of the controlfile and spfile. I am trying to restore the production database from this backup onto a new server with a different directory structure.
I have completed the following steps in the restore process.
1. Creating a pfile from spfile
2. Changing the location of cdump, udump, bdump and the control files (there are three)
3. Then creating a spfile from this pfile
I am stuck on the next step, where I have to rename all the datafiles and the tempfile and restore the database. The following is what I did.
After mounting the database in RMAN, I tried to run the following.
RMAN> run
2> {
3> allocate channel c1 device type disk
4> ;
5> @/home/oracle/rman_scripts/newloc.rman
6> SET NEWNAME FOR DATAFILE 1 TO '/u02/oradata/dorian/system01.dbf';
7> SET NEWNAME FOR DATAFILE 2 TO '/u02/oradata/dorian/undotbs01.dbf';
8> SET NEWNAME FOR DATAFILE 3 TO '/u02/oradata/dorian/sysaux01.dbf';
9> SET NEWNAME FOR DATAFILE 4 TO '/u02/oradata/dorian/users01.dbf';
[code]....
I am getting the following error message.
Starting restore at 25-JAN-2012 21:26:48
released channel: c1
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of restore command at 01/25/2012 21:26:49
RMAN-06026: some targets not found - aborting restore
RMAN-06100: no channel to restore a backup or copy of datafile 30
RMAN-06100: no channel to restore a backup or copy of datafile 29
RMAN-06100: no channel to restore a backup or copy of datafile 28
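For comparison, a hedged sketch of the usual restore-with-rename structure: every datafile gets a SET NEWNAME inside the same run block that allocates the channel and performs the restore, followed by SWITCH so the controlfile picks up the new paths (the /u02/oradata/dorian paths follow the script above; everything else is illustrative):

run {
  allocate channel c1 device type disk;
  set newname for datafile 1 to '/u02/oradata/dorian/system01.dbf';
  set newname for datafile 2 to '/u02/oradata/dorian/undotbs01.dbf';
  set newname for datafile 3 to '/u02/oradata/dorian/sysaux01.dbf';
  set newname for datafile 4 to '/u02/oradata/dorian/users01.dbf';
  # ... repeat SET NEWNAME for every remaining datafile (5 up to 30) ...
  set newname for tempfile 1 to '/u02/oradata/dorian/temp01.dbf';
  restore database;
  switch datafile all;
  switch tempfile all;
  recover database;
}

The RMAN-06100 lines also suggest the backup pieces for those datafiles were not visible to the allocated disk channel; if they sit under a different path on the new server, cataloging them first (CATALOG START WITH '...') may be needed.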
In my WHEN-BUTTON-PRESSED trigger, I want the form to first delete all the previous data and then populate the data from the table; that's why I used CLEAR_BLOCK first, but CLEAR_BLOCK is not working here. My code is given below.
go_block('show');
clear_block(NO_VALIDATE);
declare
  cursor c1 is select * from qtr_demand order by 1;
begin
[code]..........
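A hedged sketch of the pattern the trigger seems to be aiming for, with the block cleared and then refilled row by row; the block name 'show' and the qtr_demand cursor come from the snippet above, while the item and column names are assumptions:

go_block('show');
clear_block(no_validate);
declare
  cursor c1 is select * from qtr_demand order by 1;
begin
  for rec in c1 loop
    create_record;              -- new empty row in block SHOW
    :show.item1 := rec.col1;    -- assumed item / column names
    :show.item2 := rec.col2;
  end loop;
  first_record;
end;

If the block is a database block based on qtr_demand, simply calling go_block('show') followed by execute_query clears and repopulates it in one step and is usually the cleaner choice.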
Let us say I enter empno 10 (which is not in the database) in the FIND screen and press the FIND button; it shows 'QUERY CAUSED NO RECORDS'. Up to this point it's working fine. But after this, if I press Ctrl+F11 in block A, it's not pulling records. Only in this case does it not pull records.
But if I enter something else in the FIND screen and it returns data, then pressing Ctrl+F11 pulls all records.
Why does it fail to pull records only in the first case, when the query finds no data?
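One common cause, offered as a hedged guess: the FIND button narrows block A's DEFAULT_WHERE (or leaves a value in an item used for the query), and when the query returns no rows that restriction is never cleared, so the next Ctrl+F11 still searches for empno 10. A sketch of resetting it after the query (block name A comes from the post; the rest is assumed):

set_block_property('A', default_where, 'empno = ' || :find.empno);
go_block('A');
execute_query;
-- clear the restriction again so a later Ctrl+F11 returns all rows
set_block_property('A', default_where, '');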
I have written Java code that reads 2 million values of a particular column from a CSV file and stores them in a set. There is a table in the Oracle database that contains 10 million records for that column. Now I want to form a SQL query that selects the values of that column which are in the CSV file but not in the database table. For example:
If I take the CSV file name as employee.csv, it has a column called employee_name whose records are as follows
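A hedged sketch of the comparison once the CSV values have been loaded into a staging table (only employee.csv and employee_name come from the post; the staging and target table names are assumptions):

SELECT employee_name FROM csv_employee_stage
MINUS
SELECT employee_name FROM employees;

A NOT EXISTS anti-join does the same job and lets you return other columns of the staging row; either way, loading the 2 million CSV values into a staging (or external) table and letting the database do the comparison is usually far faster than checking them one by one from Java.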