Server Utilities :: How To Access Local File Residing At Client From DB Server
Nov 23, 2011
I have a requirement to read a flat text file (around 15,000 lines) residing at a client location from the DB server and write it into a single cell of a table.
I tried UTL_FILE and DBMS_LOB, but I am not able to access the client location to read the file, since both read their path from an Oracle directory object.
e.g.
My client machine is 192.168.1.1 and my DB server is a Unix box, say 192.168.1.10.
The file location is: \\192.168.1.1\share\abc.txt
So I created one Oracle directory, MY_DIR, having DIRECTORY_PATH '\\192.168.1.1\share'.
But neither UTL_FILE nor DBMS_LOB is able to access the file.
Error Message:
-------------
Unable to process CLOB -22288 ~ ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
Few Details for reference:
-------------------------
File Location: \\192.168.1.1\share\abc.txt
Unix DB Server location: 192.168.1.10
Table: Test (filename VARCHAR2(30), content CLOB)
Oracle Dir: MY_DIR
Directory_Path: \\192.168.1.1\share
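The UNC path fails because UTL_FILE and DBMS_LOB run on the database server, so the directory path must be visible to the oracle OS user on the Unix server itself, not on the client. A minimal sketch, assuming the Windows share has first been mounted on the Unix server (the mount point /mnt/share is a placeholder):

CREATE OR REPLACE DIRECTORY my_dir AS '/mnt/share';

DECLARE
  l_bfile       BFILE := BFILENAME('MY_DIR', 'abc.txt');
  l_clob        CLOB;
  l_dest_offset INTEGER := 1;
  l_src_offset  INTEGER := 1;
  l_lang_ctx    INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warning     INTEGER;
BEGIN
  -- create the row first, then fill its CLOB locator from the server-side file
  INSERT INTO test (filename, content)
  VALUES ('abc.txt', EMPTY_CLOB())
  RETURNING content INTO l_clob;

  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(
    dest_lob     => l_clob,
    src_bfile    => l_bfile,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => l_dest_offset,
    src_offset   => l_src_offset,
    bfile_csid   => DBMS_LOB.DEFAULT_CSID,
    lang_context => l_lang_ctx,
    warning      => l_warning);
  DBMS_LOB.FILECLOSE(l_bfile);
  COMMIT;
END;
/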
I have one .mdb (Microsoft Access database) file and it has some tables in it. I loaded it once using Toad, but now I have to load it frequently into the database. Is it possible to use an external table, so I can access those tables with a SELECT statement?
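An external table cannot read a .mdb file directly; the ORACLE_LOADER driver only parses flat files. A sketch under that assumption: export the Access table to CSV on a schedule, then define an external table over the CSV (the column list and file name here are hypothetical):

CREATE TABLE access_data_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('access_export.csv')
)
REJECT LIMIT UNLIMITED;

Each time the CSV is re-exported to the same location, a plain SELECT against access_data_ext sees the new contents; no reload step is needed.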
I am working on Database 11g. I need to download a file to the local machine after the file is created in the database directory using UTL_FILE. I am able to generate the file but am not aware how to copy or download it to the local machine using PL/SQL. I have done the same in Forms using the webutil PLL.
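PL/SQL executes on the server and cannot write to a client disk directly; that is the part WebUtil does on the client. One workaround (a sketch, assuming SQL*Plus on the client) is to expose the server-side file through a query and spool it locally:

-- server side: return a directory file as a CLOB
CREATE OR REPLACE FUNCTION file_to_clob(p_dir VARCHAR2, p_file VARCHAR2)
  RETURN CLOB
IS
  l_bfile BFILE := BFILENAME(p_dir, p_file);
  l_clob  CLOB;
  l_dest  INTEGER := 1;
  l_src   INTEGER := 1;
  l_ctx   INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warn  INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(l_clob, l_bfile, DBMS_LOB.LOBMAXSIZE,
                            l_dest, l_src, DBMS_LOB.DEFAULT_CSID, l_ctx, l_warn);
  DBMS_LOB.FILECLOSE(l_bfile);
  RETURN l_clob;
END;
/

-- client side, in SQL*Plus (the local path is a placeholder):
-- SET PAGESIZE 0 LONG 200000000 LONGCHUNKSIZE 32767 TRIMSPOOL ON HEADING OFF FEEDBACK OFF
-- SPOOL c:\temp\report.txt
-- SELECT file_to_clob('MY_DIR', 'report.txt') FROM dual;
-- SPOOL OFF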
I have a question regarding a difference of character sets while taking an export (logical backup) of a database directly on the server (Linux RHEL 2.1 AS) versus on a client (a Windows XP Professional machine, where only an Oracle 9i client is installed). On the server it seems to be fine and okay, but on the client node I get the following error for almost all tables.
EXP-00091: Exporting questionable statistics.
My questions are:
[1] Does it create any sort of problem if, later on, I import the data which was taken from the client node?
[2] Why is there a (marginal) difference in the dump (.dmp) file size?
[3] Is there any way to overcome it, or is it just the natural behaviour, i.e. not a problem?
[4] If I am using LONG or BLOB as the datatype for some of my tables, do they have any problem if I proceed like above?
Additional information about character sets:
On the server node:
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
On the client node:
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses US7ASCII character set (possible charset conversion)
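EXP-00091 is raised when the character set in the client's NLS_LANG does not match the database character set, so exp flags the exported optimizer statistics as questionable. The usual fix is to check the database character set and set NLS_LANG on the client to match before running exp; a sketch:

-- find the database character sets
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');

-- then, on the Windows client, before exp (value shown is only an example):
--   set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1

The import itself still works; questionable statistics can simply be recomputed on the target after import.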
We are on Oracle 11.2.0.2 on Solaris 10. I don't have access to the DB server but connect to the DB from the client side using SQL*Plus. Though I have the DBA privilege (at the Oracle level), I have no access to the DB server at the OS level. I also don't have access to the Enterprise Manager console, where such information is available.
I want to set up monitoring so that I get a mail when free space falls below some threshold, say when a tablespace is 90% full. Is it possible to send mail from a PL/SQL script, for example?
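Yes. A minimal sketch, assuming UTL_MAIL is installed and the smtp_out_server parameter points at a reachable SMTP host; the addresses and the 90% threshold are placeholders:

BEGIN
  FOR t IN (SELECT tablespace_name, ROUND(used_percent, 1) AS pct
              FROM dba_tablespace_usage_metrics
             WHERE used_percent > 90)
  LOOP
    UTL_MAIL.SEND(
      sender     => 'db-monitor@example.com',
      recipients => 'dba-team@example.com',
      subject    => 'Tablespace ' || t.tablespace_name || ' is ' || t.pct || '% full',
      message    => 'Please add space to tablespace ' || t.tablespace_name || '.');
  END LOOP;
END;
/

Wrapped in a DBMS_SCHEDULER job that runs every few minutes, this gives threshold alerting with no OS-level access at all.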
I exported an Oracle database from one server into a dmp file. Then I imported it into another Oracle database server. When I looked at the imported data, the columns which were storing German data showed rubbish characters.
Then I remembered that the database from which I exported has its NLS language set to German. I executed this statement to set the NLS on the new server:
alter system set nls_lang=german SCOPE=SPFILE
But now my database is not starting, always giving me the error: cannot access NLS data files or invalid environment specified.
I also set NLS_LANG=german in the Solaris environment.
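"Cannot access NLS data files or invalid environment specified" is ORA-12705, and it usually comes from an invalid NLS_LANG value: NLS_LANG is a client environment variable, not an initialization parameter, and a valid value has the form LANGUAGE_TERRITORY.CHARACTERSET, for example GERMAN_GERMANY.WE8ISO8859P1 (the character set here is only an example; use the database's own). A recovery sketch: fix or unset NLS_LANG in the Solaris environment, then remove the invalid entry from the spfile and restart:

-- connected AS SYSDBA to the (idle) instance:
CREATE PFILE='/tmp/init_fix.ora' FROM SPFILE;
-- edit /tmp/init_fix.ora and delete the nls_lang line, then:
CREATE SPFILE FROM PFILE='/tmp/init_fix.ora';
STARTUP;

Note that the rubbish characters themselves are normally addressed by setting NLS_LANG correctly in the environment before running exp/imp, not by altering the server.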
I am using Oracle 9i Release 2 and Forms 6i. I have a question. I have stored all the fmx files under one folder on a single server called "MAINSERVER", and the client systems call the forms only from there. I am calling forms with the call_form function like this: call_form('\\mainserver\forms\employee.fmx', no_hide).
Here mainserver is the server name, forms is the folder name, and employee is the form name in fmx format. My question is: when I call a form from the server within the network there is no problem, but when I call a form from outside the LAN, i.e. through the internet via a static IP, the forms are not called. I tried it this way: call_form('\\198.52.67.66\FORMS\employee.fmx', no_hide). Here I have specified the static IP and the forms folder, but it is not working.
I get the error below when I select count(*) on the external table created below.
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-00554: error encountered while parsing access parameters
KUP-01005: syntax error: found "identifier": expecting one of: "binary_double,
[code].....
SQL> select * from v$instance;
INSTANCE_NUMBER INSTANCE_NAME HOST_NAME VERSION STARTUP_T STATUS PAR THREAD# ARCHIVE LOG_SWITCH_WAIT
I have a file which is placed at a certain location on a Unix server. I want to write code which picks up that file and emails it to certain recipients.
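A sketch using UTL_MAIL's attachment variant, assuming the file is plain text, under 32 KB (the VARCHAR2 limit of this API), and readable through an existing directory object; all names are placeholders:

DECLARE
  l_file UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(32767);
  l_body VARCHAR2(32767);
BEGIN
  -- read the server-side file into a buffer
  l_file := UTL_FILE.FOPEN('MY_DIR', 'report.txt', 'r', 32767);
  BEGIN
    LOOP
      UTL_FILE.GET_LINE(l_file, l_line);
      l_body := l_body || l_line || CHR(10);
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN NULL;  -- end of file reached
  END;
  UTL_FILE.FCLOSE(l_file);

  UTL_MAIL.SEND_ATTACH_VARCHAR2(
    sender       => 'db@example.com',
    recipients   => 'user@example.com',
    subject      => 'Requested file',
    message      => 'File attached.',
    attachment   => l_body,
    att_inline   => FALSE,
    att_filename => 'report.txt');
END;
/

For larger or binary files, UTL_SMTP with hand-built MIME parts (or an OS mailer invoked from a scheduler job) is the usual route instead.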
Is it also possible to use the EXPORT and IMPORT utilities from a client machine? I want to give these utilities to one of my developers without allowing him to sit in front of my Oracle server.
I'm installing a new application-testing server; I have installed the 11g R2 Instant Client and the SQL*Plus client.
When I'm trying to run an expdp command, I get this:
'expdp' is not recognized as an internal or external command
Now, I understand this is because I don't have the bin directory of a client installation in my OS PATH. My question is: which client exactly do I need for using the Data Pump utility, and where do I download it?
I've found lots of posts throughout the web from people that had issues with defining $ORACLE_HOME/bin in the $PATH, or had a client incompatibility issue, but no answer to my specific question.
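The Instant Client does not ship the expdp/impdp binaries; they come with a full client installation (the "Administrator" install type) or with the server software. A workaround that needs nothing beyond SQL*Plus is to drive the Data Pump API from PL/SQL; a sketch, with the schema and file names as placeholders (note the dump file is written on the server, here into DATA_PUMP_DIR):

DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'scott_exp.dmp', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(l_handle, 'scott_exp.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(l_handle, 'SCHEMA_EXPR', q'[IN ('SCOTT')]');
  DBMS_DATAPUMP.START_JOB(l_handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);  -- blocks until the job finishes
END;
/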
I have a file a.txt, which is the parent file, and another one, the child, called b.txt. Both files are linked together by a common field called "id". The interesting part: the child file has multiple layer names associated with the ids (we only know that in b.txt each id can have at most 3 layers).
They need to get loaded into a table called PARENT_TBL, which looks like: PARENT_TBL (ID, NAME, SUBJECT, LAYER, LAYERNO).
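One way to do this (a sketch, assuming both files are comma-delimited and have already been exposed as external tables A_EXT and B_EXT; the column meanings are guesses from the description):

INSERT INTO parent_tbl (id, name, subject, layer, layerno)
SELECT a.id,
       a.name,
       a.subject,
       b.layer,
       ROW_NUMBER() OVER (PARTITION BY b.id ORDER BY b.layer) AS layerno
  FROM a_ext a
  JOIN b_ext b ON b.id = a.id;

The analytic ROW_NUMBER numbers the (at most 3) layers per id, which is what the LAYERNO column seems to ask for.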
When I try to download a file from the client to the server, I found the following error: /oas/product/j2ee/OC4J_BI_Forms/applications/formsapp/formsweb/WEB-INF/lib/frmsrv.jar.
Are there any GUI-based tools that can auto-generate a CTL file from a CSV input? I'd love something like this, since I have quite a few SQL*LDR projects coming up!
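For reference, the control file such a tool would emit is short enough to template by hand; a minimal sketch for a comma-separated file with one header row (table, column and file names are placeholders):

OPTIONS (SKIP=1)
LOAD DATA
INFILE 'input.csv'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(col1, col2, col3)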
Import: Release 11.2.0.1.0 - Production on Tue May 7 16:51:47 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Username: sys as sysdba Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
[oracle@oracledbserver product]$
This is Oracle 11g hosted on a Eucalyptus cloud instance.
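ORA-39088 means the DUMPFILE (or LOGFILE) value itself contained a path; with Data Pump, the path must come from a directory object instead. A sketch of the usual fix (the directory name, OS path and file names are placeholders):

CREATE OR REPLACE DIRECTORY dump_dir AS '/u01/app/oracle/dumps';
-- grant access if importing as a non-privileged user:
GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;
-- then invoke: impdp ... DIRECTORY=dump_dir DUMPFILE=expdat.dmp LOGFILE=imp.log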
I am trying to run the imp command from the DOS prompt. Here is the error I got:
C:\Documents and Settings\sairamm\My Documents\artmsexp_AUDT2_04292012>imp sairamm/mypassword@aws fromuser=AUDT2 toUSER=SAIRAMM file=exp_AUDT2_04292012.dmp log=imp_AUDT2_04292012.log BUFFER=10000000 GRANTS=y
Import: Release 9.2.0.1.0 - Production on Mon Apr 30 14:23:29 2012
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Data Mining, Oracle Database Vault and Real Application Testing options
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
Is it possible to determine whether a dump file was created using Data Pump export or the normal export method just by looking at the dump file? If yes, how?
Why I am asking such a question is that normal export and Data Pump export both create a dump file with the same extension, filename.dmp. So to avoid confusion during import, I want to determine by what method the dump file was created.
This would also be useful in the scenario where a customer sends me only the dump file and asks me to import it into a target database (the customer may not know by what method the dump file was created).
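Yes. Apart from peeking at the header (a classic exp file starts with a readable "EXPORT:" banner, while a Data Pump file does not), DBMS_DATAPUMP can classify the file for you; a sketch, assuming the dump sits in an existing directory object such as DATA_PUMP_DIR:

SET SERVEROUTPUT ON

DECLARE
  l_info     ku$_dumpfile_info;
  l_filetype NUMBER;
BEGIN
  DBMS_DATAPUMP.GET_DUMPFILE_INFO(
    filename   => 'mystery.dmp',
    directory  => 'DATA_PUMP_DIR',
    info_table => l_info,
    filetype   => l_filetype);
  DBMS_OUTPUT.PUT_LINE(
    CASE l_filetype
      WHEN 1 THEN 'Data Pump export file'
      WHEN 2 THEN 'Original (classic) export file'
      ELSE 'Unknown file type: ' || l_filetype
    END);
END;
/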
I run this script through a batch file and it's working. The problem is the dump file shows with this format: C:\dmp_wed.dmp. I want a date format instead, like C:\dmp_18052011.dmp.
How can I add the date format in the batch file?
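A sketch of the usual batch-file trick: slice the Windows %date% variable into day, month and year (the offsets below assume a dd/mm/yyyy regional date format and will need adjusting; user, password and paths are placeholders):

rem build ddmmyyyy from %date%, e.g. 18/05/2011 -> 18052011
set today=%date:~0,2%%date:~3,2%%date:~6,4%
exp scott/tiger@orcl file=C:\dmp_%today%.dmp log=C:\exp_%today%.log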
I have a daily hot backup using the expdp command on Oracle 10g R2 installed on a Linux server. I'm trying to move this dump file to another directory on a Windows Server 2003 machine over the network, using an FTP script which will run automatically after the export process finishes.
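A sketch of such a script on the Linux side, using a here-document so it runs unattended (host, credentials and paths are placeholders; in practice the password would live in a .netrc file rather than in the script):

#!/bin/sh
# push the finished dump to the Windows server via FTP
ftp -n 192.168.1.20 <<EOF
user ftpuser ftppass
binary
put /u01/dumps/daily_exp.dmp daily_exp.dmp
bye
EOF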
select sum(bytes)/(1024*1024*1024) "GB" from dba_segments where owner='JACK';
The above SELECT query gives an output of 15 GB as the schema size. When I perform an export of the same schema, the generated dump file size is 2 GB. What is the difference between the two scenarios, i.e. how can there be such a variation in file size?
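The gap is expected: DBA_SEGMENTS reports allocated space, which includes indexes, LOB segments and free blocks below the high-water mark, while export writes out only the table row data (plus DDL). A sketch to see where the 15 GB actually sits:

SELECT segment_type, ROUND(SUM(bytes)/1024/1024/1024, 2) AS gb
  FROM dba_segments
 WHERE owner = 'JACK'
 GROUP BY segment_type
 ORDER BY gb DESC;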