I am receiving errors when trying to load the control file. The errors are as follows:
SQL*Loader-500: Unable to open file (homework.ctl)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
My control file is located directly in the C drive (C:\homework.ctl). The control file contains the following:
LOAD DATA
INFILE 'c:\country.dat'
APPEND
INTO TABLE homework
WHEN (month = 'April')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(country, month, day)
The command I am entering is:
sqlldr system/password control=homework.ctl
I've tried c:\homework.ctl, 'c:\homework.ctl', and placing the file in the BIN folder of Oracle.
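A minimal sketch of what usually resolves SQL*Loader-500/553 here, assuming the file really is C:\homework.ctl: Notepad often saves such a file as homework.ctl.txt, and sqlldr resolves a bare control= name against the current directory rather than C:\, so verify the real name and pass the full path:

cd C:\
dir homework.ctl*
sqlldr system/password control=C:\homework.ctl log=C:\homework.log

If dir shows homework.ctl.txt, rename it (ren homework.ctl.txt homework.ctl) and retry.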
I'm getting an error when trying to use the new Data Pump Export/Import utility.
I am able to create a directory using SQL*Plus, and I get the "Directory created" message, but no directory actually gets created on the server.
SQL> CREATE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
Directory created.

But I don't see the directory created on the server.
Then on the server:
C:\Documents and Settings\Administrator>expdp ******/****** FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log

Export: Release 10.2.0.1.0 - Production on Wednesday, 01 November, 2006 1:51:55
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 475
ORA-29283: invalid file operation
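CREATE DIRECTORY only records a name-to-path mapping in the data dictionary; it never touches the filesystem, which is why "Directory created" is consistent with no folder appearing. The OS directory must exist before expdp can open its log file there. A minimal sketch of the usual fix, using the path from the post (the GRANT matters only when exporting as a non-DBA user):

REM on the server, create the physical folder first
mkdir C:\Inetpub\datafile\datapump

-- then in SQL*Plus
CREATE OR REPLACE DIRECTORY datapump AS 'C:\Inetpub\datafile\datapump';
GRANT READ, WRITE ON DIRECTORY datapump TO system;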
And whether I run this from my PL/SQL process (Pro*C) or from the command line, it returns:
TKPROF: Release 10.2.0.4.0 - Production on Thu Jan 6 11:04:38 2011
Copyright (c) 1982, 2007, Oracle. All rights reserved.
could not open trace file /u00/app/oracle/admin/DB/udump/BITN1234.trc
I have already (from my Pro*C code and from the command line)...
- made sure that the file is in the directory.
- run this from the udump directory, where the .trc file is... didn't work.
- run this from the udump directory, specifying the complete path explicitly in the tkprof line anyway (redundant... I know)... didn't work.
- tried to copy the file to another directory in order to run tkprof, and it returns:
cp: BITN1234.trc: The file access permissions do not allow the specified
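The cp failure is the giveaway: this is a filesystem permissions problem, not a TKPROF one. Trace files in udump are written by the oracle OS user, typically mode 640, so another account can neither copy them nor run tkprof against them. A minimal sketch of two common fixes (the first is a one-off, run as the oracle software owner or root; the second relies on an undocumented parameter and needs an instance restart, so treat it as a judgment call):

chmod o+r /u00/app/oracle/admin/DB/udump/BITN1234.trc

-- or, to make future trace files readable by everyone:
ALTER SYSTEM SET "_trace_files_public" = TRUE SCOPE=SPFILE;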
Import: Release 11.2.0.1.0 - Production on Tue May 7 16:51:47 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Username: sys as sysdba
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
[oracle@oracledbserver product]$
This is Oracle 11g hosted on a Eucalyptus cloud instance.
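ORA-39088 means the DUMPFILE or LOGFILE value itself contained a path. Data Pump accepts only bare file names; the location must come from a directory object. A minimal sketch, with a hypothetical /u01/dumps path and directory name dp_dir:

-- in SQL*Plus, as a DBA
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/dumps';
GRANT READ, WRITE ON DIRECTORY dp_dir TO system;

-- then pass bare file names and the directory object on the command line
impdp system DIRECTORY=dp_dir DUMPFILE=expdata.dmp LOGFILE=impdata.log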
I have tried repeatedly to load data into a table from Excel (.csv) using SQL*Loader, but SQL*Loader will not accept the control file I create in Notepad. Although I give the file name a .ctl extension, it ends up as a .txt file. Is there an alternate way to create it?
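This is almost certainly Notepad appending .txt (and Windows hiding known extensions, so a file really named load.ctl.txt still displays as load.ctl). A minimal sketch of the workaround, assuming the file should be C:\load.ctl: in Notepad's Save As dialog, type the name in quotes as "C:\load.ctl", or fix it afterwards from the command prompt:

dir C:\load.*
ren C:\load.ctl.txt load.ctl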
We get lots of email alerts reporting SQL*Loader failures (the data is actually loading), and I want to prevent all these alerts from firing. We have a SQL*Loader script that fails regularly with the error below, yet the data does end up in the tables, so a subsequent run must be succeeding. The log files are cleared out quite quickly, so it is difficult to track the errors. Why is there no file name, just a bare .dat reference, in the error log file?
Below is the shell script. I do not have much scripting experience, so I am unable to see how I can alter this... could I add some kind of exclusive-lock check, to see whether I actually have access to the file before SQL*Loader tries to load it? (See the sketch after the log below.)
value used for ROWS parameter changed from 64 to 63
SQL*Loader-500: Unable to open file (/e2e_ms_xfer/cent01/.dat)
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
This is the full error log file:

SQL*Loader: Release 11.2.0.3.0 - Production on Sat Jun 15 12:17:38 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.

Control File:  /tmp/e2e_load_ms_raw_coda.ctl
Data File:     /e2e_ms_xfer/cent01/.dat
Bad File:      /tmp/e2e_load_ms_raw_coda.bad
Discard File:  /tmp/e2e_load_ms_raw_coda.dsc (Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array:     64 rows, maximum of 256000 bytes
Continuation:   none specified
Path used:      Conventional

Table MS_RAW_CODA, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name                 Position   Len  Term Encl Datatype
--------------------------- ---------- ---- ---- ---- ---------
CODA_RECORD                 FIRST      4000           CHARACTER
Terminator string: [code]...
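The missing file name is the clue: the data file in the log is /e2e_ms_xfer/cent01/.dat, meaning the shell variable holding the base name was empty when sqlldr ran (most likely the job fired before any file had arrived). Rather than a lock check, a simple existence guard before calling sqlldr stops both the failure and the alert; a sketch with hypothetical variable names, since the script itself is not shown:

#!/bin/sh
DATAFILE="/e2e_ms_xfer/cent01/${FILENAME}.dat"   # FILENAME is set by the caller

# skip the load (and the email alert) when no readable file is there to load
if [ -z "${FILENAME}" ] || [ ! -r "${DATAFILE}" ]; then
    echo "no data file found, skipping load" >&2
    exit 0
fi

sqlldr userid=user/pass control=/tmp/e2e_load_ms_raw_coda.ctl data="${DATAFILE}"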
I have created the stored procedure below and am calling it in a WHEN-BUTTON-PRESSED trigger. The problem is that when I cancel the file selection in the file-open dialogue box, it raises an exception.
PROCEDURE test IS
  buffer_lines     client_text_io.file_type;
  v_outputstr      VARCHAR2 (32767);
  p_delimiter      VARCHAR2 (10) := '","';
  v_transaction_no VARCHAR2 (10);
BEGIN
[code]...
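When the user presses Cancel, GET_FILE_NAME returns NULL, and handing that NULL to CLIENT_TEXT_IO.FOPEN is what raises the exception. A minimal guard sketch, assuming the file name comes from GET_FILE_NAME (the procedure body above is truncated, so the exact call is an assumption):

DECLARE
  v_file VARCHAR2 (260);
BEGIN
  v_file := GET_FILE_NAME (file_filter => 'CSV Files (*.csv)|*.csv|');
  IF v_file IS NULL THEN
    RETURN;  -- user cancelled; do not try to open anything
  END IF;
  -- safe to open here:
  -- buffer_lines := client_text_io.fopen (v_file, 'R');
END;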
I don't know if I can repeat it. It occurred while converting a database to archivelog mode, after completing a Data Pump export/import upgrade from 11.1.0.7. I fixed it like this:

juno> startup mount;
ORACLE instance started.
Total System Global Area 1.7108E+10 bytes
Fixed Size                  2175440 bytes
Variable Size            8858373680 bytes
Database Buffers         8220835840 bytes
Redo Buffers               26886144 bytes
Database mounted.
juno> alter database archivelog;
alter database archivelog
ERROR at line 1:
ORA-19821: an intentionally corrupt log file was found

juno> recover database until cancel;
Media recovery complete.
juno> alter database open resetlogs;
Database altered.
juno> shu immediate;
Database closed.
Database dismounted.
ORACLE instance shut down.
juno> startup mount;
ORACLE instance started.
[code]...
I am using Oracle 10g R2. Somehow the control file got corrupted and the database will not open, and there is no backup of the control file. Now I need to open the database without recreating it.
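With no binary or trace backup at all, the remaining route is to rebuild the control file by hand with CREATE CONTROLFILE, which requires knowing every datafile and redo log path. A heavily simplified sketch with hypothetical names and paths (the real statement must list all datafiles; RESETLOGS is needed if the online logs are unusable):

STARTUP NOMOUNT;
CREATE CONTROLFILE REUSE DATABASE "ORCL" RESETLOGS NOARCHIVELOG
  LOGFILE
    GROUP 1 ('/u01/oradata/orcl/redo01.log') SIZE 50M,
    GROUP 2 ('/u01/oradata/orcl/redo02.log') SIZE 50M
  DATAFILE
    '/u01/oradata/orcl/system01.dbf',
    '/u01/oradata/orcl/sysaux01.dbf',
    '/u01/oradata/orcl/users01.dbf';
RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL;
ALTER DATABASE OPEN RESETLOGS;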
When I try to install Oracle 10g (10.1.0.2) 64-bit on Windows Server 2003 Enterprise Edition 64-bit Service Pack 1, and likewise on Windows Server 2003 Standard Edition 64-bit on a PC, I get the error below:
the image file E: is valid, but it is for a machine type other than current machine
I am not able to manually delete a one-month-old archived log file on Windows. The file has no standby information in the v$archived_log view of the primary database, the sequence has already been applied to the standby database, and the status shows as DELETED in v$archived_log. When I delete the file manually, I get an error saying another program or person is using it.
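On Windows the ARCn background process (or a backup agent) often still holds an open handle on the file, which is what produces the "in use by another person or program" error. Deleting through RMAN instead of Explorer releases the file properly and keeps the repository consistent; a minimal sketch assuming a 30-day cutoff:

RMAN> CONNECT TARGET /
RMAN> CROSSCHECK ARCHIVELOG ALL;
RMAN> DELETE ARCHIVELOG ALL COMPLETED BEFORE 'SYSDATE-30';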
I have a file a.txt, which is the parent file, and another one, a child file called b.txt. The two files are linked by a common field called "id". The interesting part is that the child file has multiple layer names associated with each id (we only know that in b.txt each id can have at most 3 layers).
Both need to be loaded into a table called PARENT_TBL.
So PARENT_TBL looks like:

ID   NAME   SUBJECT   LAYER   LAYERNO
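Since the column layouts and delimiters are not given, the following is only a sketch of one common approach: stage each flat file as an external table and join on id (the directory object, column names, and comma delimiters are all assumptions):

-- staging tables over the two flat files; data_dir is a directory object
CREATE TABLE a_ext (id NUMBER, name VARCHAR2(50), subject VARCHAR2(50))
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (FIELDS TERMINATED BY ',')
    LOCATION ('a.txt'));

CREATE TABLE b_ext (id NUMBER, layer VARCHAR2(50), layerno NUMBER)
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (FIELDS TERMINATED BY ',')
    LOCATION ('b.txt'));

-- one row per id/layer pair, up to 3 per id
INSERT INTO parent_tbl (id, name, subject, layer, layerno)
SELECT a.id, a.name, a.subject, b.layer, b.layerno
FROM a_ext a JOIN b_ext b ON b.id = a.id;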
Upgrading from 10.1.0.2 to 10.1.0.5. Enterprise Manager requires the 'newest' version of the Oracle JDBC driver. I downloaded what I believe to be the correct file (classes12.jar). I'm unclear what to do with it; my reading has pointed me in the following direction:
1) copy to c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin
2) extract
Here is the problem... I tried:
1) just clicking on it (nothing)
2) c:\program files\java\jre1.6.0_03\bin\javaw -jar classes12.jar
Error: Failed to load Main-Class manifest attribute from c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin\classes12.jar
Is my location correct? I've been hunting everywhere... making no progress.
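classes12.jar is a plain library jar, not an executable one, so it has no Main-Class manifest entry; the javaw -jar error is expected, and double-clicking will never run it. If the instructions call for extracting it into that directory, the jar tool does it (a sketch assuming a JDK's jar.exe is on the PATH; for classpath use, the jar can also simply be left intact and referenced as-is):

cd c:\oracle\product\10.1.0\db_1\jre\1.4.1\bin
jar xf classes12.jar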
I have a requirement to read a flat text file (around 15,000 lines) residing at a client location from the DB server and write it into one cell of a table.
I tried UTL_FILE and DBMS_LOB, but I am not able to access the client location to read the file, since the path is resolved through an Oracle directory object.
e.g. my client machine is 192.168.1.1 and my DB server is on Unix, say 192.168.1.10. The file location is \\192.168.1.1\share\abc.txt, so I created one Oracle directory MY_DIR with DIRECTORY_PATH '\\192.168.1.1\share'. But neither UTL_FILE nor DBMS_LOB can access the file.
Error Message:
Unable to process CLOB -22288 ~ ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
A few details for reference:
File location: \\192.168.1.1\share\abc.txt
Unix DB server: 192.168.1.10
Table: TEST (filename VARCHAR2(30), content CLOB)
Oracle directory: MY_DIR
Directory path: '\\192.168.1.1\share'
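UTL_FILE and DBMS_LOB always open files through the database server's own filesystem, and a Windows UNC path like \\192.168.1.1\share means nothing to a Unix kernel, hence the FILEOPEN failure. The share has to be mounted on the DB server first and the directory object pointed at the mount point. A sketch with a hypothetical mount point /mnt/clientshare (the mount command and options vary by Unix flavor; run as root):

# on the Unix DB server: mount the Windows share via CIFS
mkdir -p /mnt/clientshare
mount -t cifs //192.168.1.1/share /mnt/clientshare -o ro,username=guest

-- then re-point the directory object at the local mount
CREATE OR REPLACE DIRECTORY my_dir AS '/mnt/clientshare';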
Are there any GUI-based tools that can auto-generate a CTL file from a CSV input? I'd love something like this, since I have quite a few SQL*Loader projects coming up!
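For reference, what such a tool would emit for a simple CSV is short enough to template by hand; a minimal sketch for a hypothetical table emp(empno, ename, sal) fed from emp.csv:

LOAD DATA
INFILE 'emp.csv'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(empno, ename, sal)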
I am trying to run the imp command from the DOS prompt. Here is the error I got:
C:\Documents and Settings\sairamm\My Documents\artms\exp_AUDT2_04292012>imp sairamm/mypassword@aws fromuser=AUDT2 touser=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log BUFFER=10000000 GRANTS=y

Import: Release 9.2.0.1.0 - Production on Mon Apr 30 14:23:29 2012
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
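The version mismatch in the banner is the likely cause: a 9.2.0.1 imp client cannot read a dump written by a newer exp, and can never read a Data Pump (expdp) dump. If the file came from expdp, the import must use impdp instead; a sketch assuming a directory object dp_dir already exists on the target:

impdp sairamm/mypassword@aws DIRECTORY=dp_dir DUMPFILE=exp_AUDT2_04292012.dmp LOGFILE=imp_AUDT2_04292012.log REMAP_SCHEMA=AUDT2:SAIRAMM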
Is it possible to determine whether a dump file was created with Data Pump export or with the conventional export utility just by looking at the file? If yes, how?
The reason I ask: conventional export and Data Pump export both create dump files with the same .dmp extension, so to avoid confusion during import I want to determine which method created the file.
This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know which method created the file).
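The file header gives it away: a conventional exp dump starts with a human-readable banner of the form EXPORT:Vnn.nn.nn, while a Data Pump dump does not. On any Unix box a quick look at the first bytes is enough (file name hypothetical):

head -c 64 mystery.dmp | strings
# conventional exp prints something like: EXPORT:V11.02.00
# a Data Pump file shows no such banner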
I run this script through a batch file and it works. The problem is that the dump file comes out with this format: C:\dmp_wed.dmp. I want a date format instead, like C:\dmp_18052011.dmp. How can I add date formatting in a batch file?
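One way in a Windows batch file is to slice %date%; the offsets below assume a dd/mm/yyyy locale, so check echo %date% first and shift them if your regional settings differ:

@echo off
rem build ddmmyyyy from %date% (locale-dependent offsets)
set dd=%date:~0,2%
set mm=%date:~3,2%
set yyyy=%date:~6,4%
set DUMPFILE=C:\dmp_%dd%%mm%%yyyy%.dmp
echo %DUMPFILE%
rem exp system/password file=%DUMPFILE% full=y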