I have a specific requirement. Currently in our system a SQL script (containing 50 SQL SELECT statements) is automated through TOAD Data Analyst, and all the output is stored in a single Excel file across multiple worksheets. All these SQL statements run sequentially. Now my requirement is:
1) All the SQL queries should be executed in the database in parallel (see the sketch after this list).
2) All the output of the SELECT statements should be stored in a single Excel file.
3) Any client is fine, but Toad is preferred.
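A minimal sketch of the parallel side, assuming each of the 50 SELECTs is wrapped in a hypothetical procedure export_query_to_csv(n) that writes query n to its own file; DBMS_SCHEDULER then runs the 50 jobs concurrently inside the database, and the resulting files would still have to be merged into one workbook afterwards (the merge step is not shown):

-- Hypothetical sketch: export_query_to_csv is a placeholder procedure, one per query;
-- the scheduler runs the 50 jobs concurrently in the database.
BEGIN
  FOR i IN 1 .. 50 LOOP
    DBMS_SCHEDULER.CREATE_JOB(
      job_name   => 'EXPORT_JOB_' || i,
      job_type   => 'PLSQL_BLOCK',
      job_action => 'BEGIN export_query_to_csv(' || i || '); END;',
      enabled    => TRUE,            -- start immediately
      auto_drop  => TRUE);           -- drop the job once it completes
  END LOOP;
END;
/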
I have to export the results of a query to an Excel spreadsheet. Easy enough; however, I have to do it for each group, and I want to save each result in a file whose name contains the group number.
This is the query. Let's say that I have the groups in the variable RegionID: 11213, 21345 and 6537.
@export on;
@export set filename="C:\11213\IP_claims_11213_.xls";
select * from mytable where RegionID=11213;
How can I make it so that I do not have to manually change the folder name in the path and the WHERE clause for each RegionID?
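One way, sketched under the assumption that the @export commands above can be generated like any other script text: spool a driver script that writes one filename/SELECT pair per RegionID, then run it (paths, table name and the @export syntax are taken from the question):

-- Hypothetical sketch: build one export block per RegionID, then run the generated script.
set heading off
set feedback off
set verify off
spool run_exports.sql
select '@export on;' || chr(10) ||
       '@export set filename="C:\' || RegionID || '\IP_claims_' || RegionID || '_.xls";' || chr(10) ||
       'select * from mytable where RegionID=' || RegionID || ';'
  from (select distinct RegionID from mytable);
spool off
@run_exports.sql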
I have a problem when I try to export data from a table which contains a nested table, using Toad.
When Toad generates the file, in the column corresponding to the nested table, Toad just writes (DATASET) instead of the data contained in the nested table.
Here is an example:
INSERT INTO SSD_REV_S ( REV_ID, REV_TAB, REV_TS, REV_USER, REV_LOG )
VALUES ( TO_DATE('07/30/2007 12:00:00 AM', 'MM/DD/YYYY HH:MI:SS AM'),
         (DATASET),
         TO_TIMESTAMP('4/3/2009 11:20:51.000000 AM','fmMMfm/fmDDfm/YYYY fmHH12fm:MI:SS.FF AM'),
         'operator',
         'Add EVENT');
REV_TAB is my nested table
Is there a way to export data from a table which contains a nested table, as a list of INSERT statements, so I can move the data to a different database schema? I can also use a different client tool.
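If the client will not expand the collection itself, one workaround (a sketch, assuming the element type of REV_TAB has scalar attributes) is to flatten the nested table with TABLE() and export that result instead, turning its rows into plain INSERTs for the target schema:

-- Flatten the nested table: each element of REV_TAB becomes its own row,
-- joined to the parent key, so any client can export it as INSERT statements.
SELECT r.REV_ID,
       t.*                 -- attributes of the nested table's element type
FROM   SSD_REV_S r,
       TABLE(r.REV_TAB) t;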
I would like to use the SPOOL command to export data for other purposes within the application. We would like to use a tab delimiter to separate the fields, but the client wants to know if the text datatype fields can be wrapped in double quotes along with the tab delimiter.
SQL>create table test (id number(2), first_name varchar2(15), last_name varchar2(15),var_no number(4), type varchar2(1),type_no number(12));
Table created.
SQL> insert into test values(1,'mary','ross',132,'S',12);
1 row created.
SQL> insert into test values(3,'Sue','Bill',432,'S',12);
1 row created.
I tried the below spool command to use tab delimiters for all the fields, but I am not sure how to wrap only the text fields in double quotes. I would also like to have the column names in the first row, but I don't seem to get the full column names in the CSV file.
set echo off
set feedback off
set linesize 1000
set pagesize 4000
set trim on
set headsep off
set colsep '' (a tab was used between the quotes)
spool test.csv
select id, first_name, last_name, var_no, type, type_no from test;
spool off
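One way to get both the tab delimiter and the quoting, sketched with chr(9) for the tab and chr(34) for the double quote so nothing depends on colsep; the header row is selected from dual so the full column names appear on the first line:

set echo off
set feedback off
set pagesize 0
set linesize 1000
set trimspool on
spool test.csv
-- header row with the full column names
select 'ID'     || chr(9) || 'FIRST_NAME' || chr(9) || 'LAST_NAME' || chr(9) ||
       'VAR_NO' || chr(9) || 'TYPE'       || chr(9) || 'TYPE_NO'
  from dual;
-- data rows: chr(9) = tab delimiter, chr(34) = double quote around the text columns
select id
       || chr(9) || chr(34) || first_name || chr(34)
       || chr(9) || chr(34) || last_name  || chr(34)
       || chr(9) || var_no
       || chr(9) || chr(34) || type || chr(34)
       || chr(9) || type_no
  from test;
spool off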
I want to export Forms data into an Excel sheet. For that I am using Client_Ole2. I have attached the WebUtil object library and PL/SQL library, but still I cannot export data from the Form to an Excel sheet.
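For comparison, a minimal CLIENT_OLE2 sketch, assuming WebUtil is fully configured for the form (webutil.olb subclassed, webutil.pll attached, and the WebUtil jar/virtual directory set up on the application server); the block and item names such as :EMP.ENAME are placeholders:

DECLARE
  app  CLIENT_OLE2.OBJ_TYPE;
  wbs  CLIENT_OLE2.OBJ_TYPE;
  wb   CLIENT_OLE2.OBJ_TYPE;
  ws   CLIENT_OLE2.OBJ_TYPE;
  cell CLIENT_OLE2.OBJ_TYPE;
  args CLIENT_OLE2.LIST_TYPE;
BEGIN
  -- Excel runs on the user's PC because CLIENT_OLE2 executes client-side
  app := CLIENT_OLE2.CREATE_OBJ('Excel.Application');
  wbs := CLIENT_OLE2.GET_OBJ_PROPERTY(app, 'Workbooks');
  wb  := CLIENT_OLE2.INVOKE_OBJ(wbs, 'Add');
  ws  := CLIENT_OLE2.GET_OBJ_PROPERTY(app, 'ActiveSheet');

  -- write one value into cell (1,1); :EMP.ENAME is a placeholder item
  args := CLIENT_OLE2.CREATE_ARGLIST;
  CLIENT_OLE2.ADD_ARG(args, 1);
  CLIENT_OLE2.ADD_ARG(args, 1);
  cell := CLIENT_OLE2.GET_OBJ_PROPERTY(ws, 'Cells', args);
  CLIENT_OLE2.DESTROY_ARGLIST(args);
  CLIENT_OLE2.SET_PROPERTY(cell, 'Value', :EMP.ENAME);

  CLIENT_OLE2.SET_PROPERTY(app, 'Visible', 1);   -- 1 = True

  -- release the OLE handles
  CLIENT_OLE2.RELEASE_OBJ(cell);
  CLIENT_OLE2.RELEASE_OBJ(ws);
  CLIENT_OLE2.RELEASE_OBJ(wb);
  CLIENT_OLE2.RELEASE_OBJ(wbs);
  CLIENT_OLE2.RELEASE_OBJ(app);
END;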
Anyway, I do not have much experience with databases (with MS Excel I do), but I have been able to resolve most of the problems by myself. I am using a TNS connection to the database.
I want to import data from SAP into the database, but we are having some problems with the connection and so on. Anyway, the only way to get the data into the database is to key it in manually or to import it. I have been using the import function (I had a lot of trouble with date formats) and it worked: I selected "Import Data" and the corresponding file, and was then able to see a preview of the data. It worked and I was able to import data from an Excel file into the database.
But an error occurred: some lines had not been copied, so I wanted to re-import them. From that point on, however, I have not been able to open any Excel file anymore. When I select the file, no preview is shown and the file seems to be empty. I cannot select tabs of the Excel file or anything like that.
I restarted the computer, established a new connection to the database and even reinstalled my client. It seems that some setting has changed. I am still able to import the data from *.csv files, but this is not as convenient as using Excel.
I'm working in SQL Developer. My table contains 40 columns and around 4 to 5 lakh records.
When I try to export the results into an Excel or text file, SQL Developer hangs. If the result is less than 2 lakh records, it copies fine.
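A workaround worth trying (a sketch, assuming a reasonably recent SQL Developer; the table name and path are placeholders): run the query through the script engine (F5, "Run Script") with the /*csv*/ formatting hint and spool straight to a file, so the 4-5 lakh rows never have to be rendered in the data grid:

-- Run with "Run Script" (F5), not "Run Statement", so the spool is honoured.
set feedback off
spool C:\exports\big_table.csv
select /*csv*/ * from big_table;   -- /*csv*/ makes the script output CSV-formatted
spool off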
I want to load data from an Excel file into a database table in Oracle. I am using Oracle 11g, and the Excel file has 3 columns as compared to 5 columns in the destination table. I also want to generate sequential numbers for the table.
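A sketch of one way to do it, assuming the three Excel columns are first loaded into a staging table (for example with the client's import wizard or SQL*Loader); all table, column and sequence names below are placeholders:

-- Sequence for the generated key
CREATE SEQUENCE dest_seq START WITH 1 INCREMENT BY 1;

-- Fill the 5-column destination: generated key + the 3 spreadsheet columns,
-- and NULL (or a default) for the column that has no source in the file.
INSERT INTO dest_table (id, col_a, col_b, col_c, col_d)
SELECT dest_seq.NEXTVAL,
       s.col_a,
       s.col_b,
       s.col_c,
       NULL
FROM   staging_table s;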
Here is one way to create an Excel file from an Oracle SQL query and prevent Excel from displaying large numbers in scientific notation (exponential notation).
set feedback off
set verify off
set heading off
spool c:\excel_test.xls
select 'PO_NUMBER'||chr(9)||'VENDOR_NUMBER' from dual
union
select '=PROPER('||po_number||')'||chr(9)||'=PROPER('||vendor_number||')'||chr(9)
  from invoices
 where rownum < 12
 order by 1 desc;
spool off
Note that PO_NUMBER is 16 characters, VENDOR_NUMBER is 15 characters in invoices table.
I have a problem with exporting data from SQL to Excel format. My question: can I export data directly from SQL to Excel, or do I have to write the SQL statement in a form/report and then export it to Excel from the form?
I wanted to load some data (selected rows) from one system to another.
From the source (production) system in Toad, I selected * from the employee table where the registration date is not today.
From the data grid I exported the rows using Save As (in INSERT statement format) to employee.sql.
The SQL file (employee.sql) has the value 2,60716197237375E17 for an employee; in the source (production) system this value is 260716197237375278. Later I loaded the SQL file into the dev system at the Linux prompt with:
@employee.sql
I retrieved the data after loading using TO_CHAR:
select to_char(emp_code) from employee_t;
The value I got is: 260716197237375000
whereas my expectation is: 260716197237375278
Is there any configuration required on the Linux system to convert the data correctly while loading it into the database using @employee.sql?
To conclude:
Exported value : 260716197237375278 (but exported as 2,60716197237375E17)
After import the value is: 260716197237375000 instead of 260716197237375278
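For what it's worth, this looks less like a Linux setting than an export-format issue: 260716197237375278 has 18 significant digits, and once the export writes it as the float 2,60716197237375E17 the last digits are already gone before the file ever reaches the dev system. A hedged workaround is to force the column to text in the query that feeds the export, so the generated INSERT file carries every digit (table and column names as in the question):

-- Export this result (TO_CHAR keeps all 18 digits as text) instead of the raw column.
select to_char(emp_code) as emp_code
  from employee;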
I am trying to export data into Excel (Office 2007 and above, .xlsx) using the calls mentioned below, but I am getting an error. I am able to upload data and open the document in IE successfully using the Office 2003 format (.xls).
The following call is made in the package, and it works for the .xls format:
OWA_UTIL.MIME_HEADER('application/vnd.ms-excel');
The following are the calls made in the package for .xlsx, and it is not working, even though I can see that the document gets opened in .xlsx format.
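A hedged guess at the issue: the .xlsx container has its own MIME type, so reusing 'application/vnd.ms-excel' for an .xlsx file can make the browser or Excel reject or mangle it. A sketch of a header block for .xlsx (the filename is a placeholder):

OWA_UTIL.MIME_HEADER('application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', FALSE);
-- suggest a filename with the matching extension
HTP.P('Content-Disposition: attachment; filename="report.xlsx"');
OWA_UTIL.HTTP_HEADER_CLOSE;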
If I use DESFORMAT=DELIMITED, then I get unformatted data, and sometimes I get less data compared to the PDF output.
There are about 25 interrelated queries.
How can I get the output in Excel from an RDF?
In Reports 6i the functionality was working when we ran reports, because reports ran in Previewer mode; but in 10g the report output comes out in PDF format.
I need to export master data in Excel sheets to our database, and we use Toad too. How can I export the data with the use of macros in Excel? How can I export data from Excel to Oracle?
We have two data blocks, Earnings and Deductions. We need to export these to Excel side by side in a single sheet [imagine your payslip format].
If we use normal TEXT_IO we are not able to get the result we want, so we tried using a package called export2excel and achieved what we wanted. The form works perfectly in the client/server setup. When we move the same form to our Unix application server, it does not work.
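One likely cause, sketched below: in a three-tier deployment (Unix application server), plain TEXT_IO and OLE2 execute on the server, not on the user's PC, so a package that produced a local file under client/server now writes (or fails) on the server instead. The WebUtil client-side equivalents are a hedged alternative; the path, block and item names below are placeholders:

DECLARE
  f CLIENT_TEXT_IO.FILE_TYPE;
BEGIN
  -- CLIENT_TEXT_IO writes on the user's machine (requires WebUtil to be configured)
  f := CLIENT_TEXT_IO.FOPEN('C:\temp\payslip.csv', 'W');
  -- one earnings row and one deductions row side by side, payslip style
  CLIENT_TEXT_IO.PUT_LINE(f,
      :EARNINGS.DESCRIPTION || ',' || :EARNINGS.AMOUNT || ',' ||
      :DEDUCTIONS.DESCRIPTION || ',' || :DEDUCTIONS.AMOUNT);
  CLIENT_TEXT_IO.FCLOSE(f);
END;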
I want to use the saved Toad usernames and passwords in SQL Developer. Is there any way to use the same usernames and passwords without recreating a new connection for each user?