Precompilers, OCI & OCCI :: Data Lost After Adding Null Terminator For VARCHAR Data Received From Database
Dec 13, 2010
We are migrating a Pro*C application as described below.
Old Env: UNIX
Old DB: Oracle 8i
New Env: Linux
New DB: Oracle 11g
The new modules compile successfully in the Linux environment, but we are facing issues writing the output of a VARCHAR datatype to a file.
Find below an extract of the code.
EXEC SQL BEGIN DECLARE SECTION;
varchar mcolmnvarchar[4];
EXEC SQL END DECLARE SECTION;

EXEC SQL DECLARE crs CURSOR FOR
SELECT NVL(colmn,' ') FROM table1;

memset(mcolmnvarchar.arr, '\0', 4); // Was added only for the Linux migration; not present in the Unix env.
EXEC SQL FETCH crs INTO :mcolmnvarchar;
cout << "Data at Stage one" << mcolmnvarchar.arr << endl;
mcolmnvarchar.arr[mcolmnvarchar.len] = '\0';
cout << "Data at Stage two" << mcolmnvarchar.arr << endl;
fprintf(fptr, "%-4s", mcolmnvarchar.arr);
The above code works absolutely fine in the Unix environment with Oracle 8i, but in the Linux environment with Oracle 11g it does not work. There are no compilation or runtime errors. "Data at Stage one" prints the database output properly, but after the null-terminator line the "Data at Stage two" statement prints no value. The value is lost after the null terminator is added.
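Since the full program isn't shown, here is only a defensive sketch (names are taken from the extract above; the column is assumed to be at most 4 bytes wide) of one way to keep the terminator inside the buffer: size the VARCHAR one byte beyond the widest value and clamp len before terminating. It is a suggestion to compare against, not a confirmed explanation of the 11g behaviour.

#include <stdio.h>
#include <string.h>

EXEC SQL INCLUDE sqlca;

EXEC SQL BEGIN DECLARE SECTION;
varchar mcolmnvarchar[5];                 /* 4 data bytes + room for '\0' */
EXEC SQL END DECLARE SECTION;

void fetch_one(FILE *fptr)
{
    EXEC SQL DECLARE crs CURSOR FOR SELECT NVL(colmn, ' ') FROM table1;
    EXEC SQL OPEN crs;

    memset(mcolmnvarchar.arr, 0, sizeof(mcolmnvarchar.arr));
    EXEC SQL FETCH crs INTO :mcolmnvarchar;

    if (mcolmnvarchar.len >= sizeof(mcolmnvarchar.arr))   /* never write past the end */
        mcolmnvarchar.len = sizeof(mcolmnvarchar.arr) - 1;
    mcolmnvarchar.arr[mcolmnvarchar.len] = '\0';

    fprintf(fptr, "%-4s", (char *)mcolmnvarchar.arr);

    EXEC SQL CLOSE crs;
}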
I use a cursor to select records from a database table into a C structure as follows...
{ int iLoop = 0; int iResult = 0;
[Code]....
The otc_multiplier field is NULL, as is the to_date sometimes. However, when I output the records later, the entries where to_date is NULL come out fine (no value), but otc_multiplier is output as 0.0 using...
// this is output later in another function using the following:
sprintf(newRecord, "%.1f", daServiceRecs->itemMultiplier);
If using C structures in this manner, what is the method for ensuring that numeric values are set to NULL when required?
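Since the structure definition is hidden behind [Code]...., the sketch below only illustrates the usual Pro*C answer, indicator variables: a parallel indicator structure travels with the fetch, and -1 in an indicator member means the corresponding column was NULL. The cursor, structure and member names are placeholders.

#include <stdio.h>

EXEC SQL INCLUDE sqlca;

EXEC SQL BEGIN DECLARE SECTION;
struct service_rec { double itemMultiplier; char toDate[11]; };
struct service_ind { short  itemMultiplier; short toDate;    };
struct service_rec rec;
struct service_ind rec_ind;
EXEC SQL END DECLARE SECTION;

void format_multiplier(char *newRecord)
{
    EXEC SQL FETCH svc_cursor INTO :rec INDICATOR :rec_ind;

    if (rec_ind.itemMultiplier == -1)
        newRecord[0] = '\0';                 /* column was NULL: print nothing, not 0.0 */
    else
        sprintf(newRecord, "%.1f", rec.itemMultiplier);
}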
I am seeing some trailing characters in the column when we insert a BLOB. I am doing SQLBindParameter with SQL_C_BINARY as the value (C) type and SQL_LONGVARBINARY as the parameter (SQL) type. Do you see any problem with this? I get this problem when running the Oracle 11g client on a Windows 2008 Server 64-bit machine. When the same set of queries is fired from a Windows 2003 Server 32-bit machine with the Oracle 10g client, it works fine and no trailing characters get inserted.
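Without the actual bind call it is hard to say, but one thing worth comparing against is the length argument: on 64-bit Windows SQLLEN is 8 bytes, so passing the length through an int/long variable (or leaving the StrLen_or_IndPtr target uninitialised) is a common source of stray trailing bytes that never shows up in 32-bit builds. A minimal sketch of an explicit-length binary bind, with handle setup assumed to exist already:

#include <windows.h>
#include <sql.h>
#include <sqlext.h>

SQLRETURN bind_blob(SQLHSTMT hstmt, unsigned char *data, SQLLEN dataLen)
{
    static SQLLEN lenOrInd;            /* must remain valid until SQLExecute */
    lenOrInd = dataLen;                /* exact byte count, never SQL_NTS for binary data */

    return SQLBindParameter(hstmt,
                            1,                     /* parameter number  */
                            SQL_PARAM_INPUT,
                            SQL_C_BINARY,          /* C (value) type    */
                            SQL_LONGVARBINARY,     /* SQL type          */
                            (SQLULEN)dataLen,      /* column size       */
                            0,                     /* decimal digits    */
                            data,                  /* value pointer     */
                            dataLen,               /* buffer length     */
                            &lenOrInd);            /* StrLen_or_IndPtr  */
}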
I want to insert XML data into my (Oracle 11g Release 1) XMLType table using OCCI. I'm getting
ORA-01461: can bind a LONG value only for insert into a LONG column
My XML data size is around 1.5 to 2 MB. I have also tried calling setMaxParamSize before the setString method, but I still get the same exception.
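Since setMaxParamSize() did not help, the workaround usually suggested instead of setString() for data in the megabyte range is a LOB bind or the OCCI stream interface. The sketch below uses the stream route; the table name, helper signature and chunk size are assumptions, and whether XMLType(:1) accepts a streamed bind directly can depend on the release, so treat it as a starting point rather than a verified fix.

#include <occi.h>
#include <algorithm>
#include <string>
using namespace oracle::occi;

void insertLargeXml(Connection *conn, const std::string &xml)
{
    Statement *stmt = conn->createStatement(
        "INSERT INTO xmltab VALUES (XMLType(:1))");

    // Announce that parameter 1 arrives as streamed character data.
    stmt->setCharacterStreamMode(1, static_cast<unsigned int>(xml.size()));
    stmt->executeUpdate();

    if (stmt->status() == Statement::NEEDS_STREAM_DATA) {
        Stream *out = stmt->getStream(1);
        std::string buf(xml);                              // writeBuffer wants a non-const char*
        std::string::size_type off = 0;
        const std::string::size_type chunk = 32768;
        while (off < buf.size()) {
            std::string::size_type n = std::min(chunk, buf.size() - off);
            if (off + n < buf.size())
                out->writeBuffer(&buf[off], static_cast<unsigned int>(n));
            else
                out->writeLastBuffer(&buf[off], static_cast<unsigned int>(n));
            off += n;
        }
        stmt->closeStream(out);
    }

    conn->commit();
    conn->terminateStatement(stmt);
}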
I am trying to load an input flat file (with French data in it) into the database using Pro*C (not using SQL*Loader because of some validations). I read the file line by line, populate a structure, and then process it.
The input file is encoded in WE8ISO8859P1, and I want the records to be populated into a table, so I set NLS_LANG=French_France.WE8ISO8859P1 and ran the Pro*C program. I used character host variables in the insert query, populated from the structure read earlier.
The problem I am facing is that when I print the values before the insert, I can see the correct data.
For example, the printed value of the variable is "pas de donné " and strlen is 14. The target table field is of VARCHAR2 type (name VARCHAR2(20 CHAR)).
But after the insert I can see only a truncated value in the database, i.e. "pas de donn"; the length in the table is 11.
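One thing that commonly bites in this situation (offered only as a guess, since the table definition and database character set are not shown) is byte versus character sizing: if the database character set is multibyte, "é" occupies more than one byte after conversion, so a host variable sized for the old single-byte length can silently truncate. A sketch with the buffer sized in bytes and placeholder names:

#include <string.h>

EXEC SQL INCLUDE sqlca;

EXEC SQL BEGIN DECLARE SECTION;
char h_name[81];            /* 20 characters * up to 4 bytes each + terminator */
EXEC SQL END DECLARE SECTION;

void insert_name(const char *name_from_file)
{
    memset(h_name, 0, sizeof(h_name));
    strncpy(h_name, name_from_file, sizeof(h_name) - 1);
    EXEC SQL INSERT INTO target_table (name) VALUES (:h_name);
}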
I'm new to SQL Developer and am using the wizard to import a dataset with a non-standard row terminator. The row terminator is essentially two spaces back to back.
In SQL Server I specified the following in the SQL Server wizard and it worked:
I need to load a file whose fields are separated by '|^|' and whose records end with '||*||'.
So in my ctl file what do I mention? FIELDS TERMINATED BY '|^|'? What should I say for the record termination? Should I still mention TRAILING NULLCOLS in my ctl file?
Sample data file: Name|^|Age|^|city||*|| john|^|33|^|||*|| james|^||^|nyc||*|| ken|^|44|^| washington||*||
The fields are properly terminated with |^| and the records are terminated with ||*||. Is it true that a file with |^| as the field terminator cannot be loaded with sqlldr?
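Multi-character terminators like these are generally accepted by sqlldr: the field terminator goes into FIELDS TERMINATED BY and the record terminator into the "str" clause of INFILE. A rough control-file sketch (file, table and column names are made up; put '\n' inside the str clause as well if every ||*|| is followed by a newline):

LOAD DATA
INFILE 'sample.dat' "str '||*||'"    -- the str clause sets the record terminator
INTO TABLE people
FIELDS TERMINATED BY '|^|'
TRAILING NULLCOLS
(name, age, city)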
I used an example from the demo directory to connect to the database, the "ansidyn1" example, but I cannot connect to my DB. I have just modified oracle_connect in the sample code (in ansidyn1.pc), as follows:
I want to write a program using OCI to log in to an Oracle database. The Advanced Security Option for the Oracle server is set to Kerberos authentication. Is it possible to log in to the Oracle database through the olog() function?
An initial ticket is required in Kerberos authentication. Is it also needed during the login from my program?
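A sketch (not a tested program) of the OCI8-style calls usually pointed to for externally authenticated logins such as Kerberos, as an alternative to the old OCI7 olog() entry point: no username or password is supplied, and OCI_CRED_EXT tells the server to use the external layer, so a valid initial ticket (obtained beforehand with kinit/okinit) is indeed required. "mydb" is a placeholder alias.

#include <oci.h>
#include <string.h>

int kerberos_logon(OCIEnv **envhp, OCISvcCtx **svchp, OCIError **errhp)
{
    OCIServer  *srvhp = NULL;
    OCISession *usrhp = NULL;

    if (OCIEnvCreate(envhp, OCI_DEFAULT, NULL, NULL, NULL, NULL, 0, NULL) != OCI_SUCCESS)
        return -1;
    OCIHandleAlloc(*envhp, (void **)errhp,  OCI_HTYPE_ERROR,   0, NULL);
    OCIHandleAlloc(*envhp, (void **)&srvhp, OCI_HTYPE_SERVER,  0, NULL);
    OCIHandleAlloc(*envhp, (void **)svchp,  OCI_HTYPE_SVCCTX,  0, NULL);
    OCIHandleAlloc(*envhp, (void **)&usrhp, OCI_HTYPE_SESSION, 0, NULL);

    if (OCIServerAttach(srvhp, *errhp, (OraText *)"mydb",
                        (sb4)strlen("mydb"), OCI_DEFAULT) != OCI_SUCCESS)
        return -1;
    OCIAttrSet(*svchp, OCI_HTYPE_SVCCTX, srvhp, 0, OCI_ATTR_SERVER, *errhp);

    /* No OCI_ATTR_USERNAME / OCI_ATTR_PASSWORD: credentials come from the external layer. */
    if (OCISessionBegin(*svchp, *errhp, usrhp, OCI_CRED_EXT, OCI_DEFAULT) != OCI_SUCCESS)
        return -1;
    OCIAttrSet(*svchp, OCI_HTYPE_SVCCTX, usrhp, 0, OCI_ATTR_SESSION, *errhp);
    return 0;
}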
I am connecting to an Oracle DB with the Instant Client 11.01.06. I can connect to the DB via other tools like sqldbx or SqlTools 1.5, which also use OCI. So now I tried it with C++ using Visual Studio 2005 SP1.
where tnsservice-name is the name I specified in the tnsnames.ora and which I used with the other tools, which work fine -> same error. My username is only 4 characters long, so I thought it might have something to do with the string. So I tried:
Now it returned the error "TNS: connect descriptor too long". It can't have anything to do with the .ora file, since it worked fine for the other tools; plus it didn't matter whether I renamed the .ora file or removed it from the TNS_ADMIN directory. So I searched a little more for a solution and saw an example where the last argument is missing. So I tried
It returns "TNS: protocol adapter error", which I can somehow understand, since the adapter doesn't know which DB to connect to. I also tried to use some frameworks like soci or otl but was never able to compile them (link errors, unknown data types, etc.).
Previously we had 32-bit C++. Now we have migrated it to 64-bit, and our C++ programs interact with an Oracle 10g DB. Our programs were working fine with 32-bit, but after migrating to 64-bit we are facing a problem with one program which does a FETCH (EXEC SQL FETCH SUBP1 INTO :new TabRec;) from the Oracle DB. That is, we exit from a for loop in the C++ program when we get NOT FOUND (sqlca.sqlcode = 1403) on executing the FETCH statement.
The sqlcode generated for the NOT FOUND scenario is 1403. But after moving to 64-bit C++, we do not see sqlcode 1403; instead we see a different code, 7124089117159473.
As the sqlcode is not 1403, our program does not exit from the for loop and goes into an infinite loop. Am I missing anything that keeps me from getting the exact sqlcode?
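Hard to say without the code, but two things usually checked here are that every .pc file was re-precompiled with the 64-bit precompiler and that the program includes the sqlca header shipped with that client rather than a stale local copy, since a layout mismatch in struct sqlca produces exactly this kind of garbage value. Below is a small sketch that also side-steps the manual comparison by letting WHENEVER NOT FOUND end the loop (cursor and column names are placeholders):

#include <stdio.h>

EXEC SQL INCLUDE sqlca;   /* the header shipped with the 64-bit precompiler */

void fetch_all(void)
{
    EXEC SQL BEGIN DECLARE SECTION;
    int id;
    EXEC SQL END DECLARE SECTION;

    EXEC SQL DECLARE subp1 CURSOR FOR SELECT id FROM some_table;
    EXEC SQL OPEN subp1;

    EXEC SQL WHENEVER NOT FOUND DO break;   /* leaves the loop when sqlcode is 1403 */
    for (;;) {
        EXEC SQL FETCH subp1 INTO :id;
        printf("fetched %d\n", id);
    }
    EXEC SQL WHENEVER NOT FOUND CONTINUE;
    EXEC SQL CLOSE subp1;
}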
I am using Pro*C/C++ Release 10.2.0.2 on HP-UX, but the Pro*C/C++ application was written long back, during Oracle release 8. Now we are facing a problem where an EXEC SQL SELECT query does not fetch the record even though the record is present in the database.
This does not happen every time. The problem starts only after heavy use of the Unix process.
For every request, the Unix process fetches the record, updates it at the end, and goes into wait mode to get the next request. Say after 50 requests, the process returns a 'no rows found' error.
It starts working fine again only after restarting the process, and the problem starts again after the 50th or 60th request. We are facing this problem only after the upgrade to 10g.
I need to configure a simple standby database. I have followed this [URL]...-guard-setup-11gr2.php tutorial to do that. The problem is that the primary DB cannot log on to the standby DB. Information is provided below.
Primary DB: CentOS 6.4, Oracle 11gR2, ORACLE_SID=primdb1
SQL> SELECT MESSAGE FROM V$DATAGUARD_STATUS;
MESSAGE
--------------------------------------------------------------------------------
ARC0: Archival started
ARC1: Archival started
ARC2: Archival started
ARC1: Becoming the 'no FAL' ARCH
ARC1: Becoming the 'no SRL' ARCH
ARC2: Becoming the heartbeat ARCH
ARC1: Beginning to archive thread 1 sequence 31 (336165-356856)
Error 12514 received logging on to the standby
PING[ARC2]: Heartbeat failed to connect to standby 'stbydb1'. Error is 12514.
ARC1: Completed archiving thread 1 sequence 31 (336165-
While trying some failover and switchover scenarios on my primary and standby DB configuration, I suddenly lost my standby DB box completely due to a hardware failure, so now I am not able to recover that box; it is an almost non-recoverable hardware failure. My primary DB is still up and running well. When I check the alert logs of the primary DB it has some errors related to the standby DB: because of the configuration it tries to sync the standby DB, which it cannot connect to anymore. So my main question is how to get rid of those errors / stop those errors on the primary DB. Which parameters do I have to turn off in the primary DB? Herewith I am giving some of the initSID parameters of the primary DB and a snap of the alert log from the primary DB.
My primary DB is in MAXIMUM AVAILABILITY mode. A snap from the alert log:
Errors in file /ora9isoft/odb/OH1/admin/wbdata/bdump/wbdata_lgwr_1248.trc:
ORA-12541: TNS:no listener
******************************************************************
LGWR: Setting 'active' archival for destination LOG_ARCHIVE_DEST_2
******************************************************************
Creating archive destination LOG_ARCHIVE_DEST_2: 'wbdata.wbhouse'
LGWR: Transmitting activation ID a043c21d
[code]....
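Assuming the lost standby was the LOG_ARCHIVE_DEST_2 destination shown above, a commonly used way to quiet the primary until a new standby exists is to defer that destination and drop the protection mode back from MAXIMUM AVAILABILITY. Adjust the destination number to your configuration, update the initSID.ora as well if no spfile is in use, and note that changing the protection mode may require a bounce on older releases.

ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2 = DEFER;
ALTER DATABASE SET STANDBY DATABASE TO MAXIMIZE PERFORMANCE;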
We are planning to upgrade a database from 9.2.0.8.0 to 10.2.0.4.0. We have a lot of Pro*C programs precompiled using the 9.2.0.8.0 (and most likely even 9.2.0.7.0) precompiler.
We want to know if we could upgrade the database without having to recompile all the programs. We have tested this approach against some of our programs. Most of them executed fine, but in 2 cases we are getting "ORA-01001 invalid cursor". I suspect that the cause is that precompiler version 9 is not supported against 10g databases, but I am not sure.
Would it be a better option to upgrade the precompiler/client as the first step and the DB as the second step (and would that be supported)? We definitely don't want to upgrade both the precompiler/client and the database in one go - that would be too risky.
Why is it not excluding '0-5' and '25-30'? How should I write the code to exclude these, and is there any function in Oracle to check whether a column value is numeric and print it?
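Since the query itself isn't shown, the statement below is only a generic illustration of a numeric test in Oracle (10g or later): REGEXP_LIKE keeps purely numeric strings, so range-style values such as '0-5' or '25-30' are excluded. The table and column names are invented.

SELECT col
FROM   some_table
WHERE  REGEXP_LIKE(col, '^[0-9]+$');   -- keeps purely numeric strings only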
I am trying to give back data from a stored procedure written in C. I registered the function as follows:

create or replace procedure version(versioninfo OUT clob) as
external name "version"
library myLib
language c
with context
parameters (context, versioninfo, versioninfo INDICATOR SB4);

It compiles fine. The function being called looks like this:
If I execute the procedure with SQL Developer by pressing "play", it gets executed but there is no result. If I try to execute it from an anonymous block, it results in ORA-22275 instead of doing anything.
declare
  res clob;
begin
  -- the following doesn't work much
  --dbms_lob.createtemporary(res,true);
  version(res);
  dbms_output.put_line(res);
end;
Actually I have two questions: 1.) Why does Oracle give me the error? In my opinion all requirements mentioned in the error description are met. 2.) Why is there no output when executing the function via SQL Developer? Is the usage of OCILobWrite wrong?
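On question 1, the commented-out dbms_lob.createtemporary line stands out: ORA-22275 means "invalid LOB locator", and passing a locator that was never initialised on the calling side is a common way to hit it with external procedures. A variant of the same block worth trying (a suggestion only, not a verified fix for this library):

declare
  res clob;
begin
  dbms_lob.createtemporary(res, true);   -- give the parameter a valid locator first
  version(res);
  dbms_output.put_line(res);
  dbms_lob.freetemporary(res);
end;
/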
I am trying to describe a stored procedure (STP) in a package, but it gives me an error.
For example, in package ABC suppose there is an STP XYZ; when I try to describe the ABC.XYZ function it gives me error code 4043 and the error message "object XYZ.ABC does not exist".
I need to compile a Pro*C program, say prog.pc. I have Oracle 10g on my system. Since I am new to Pro*C programming, guide me on the steps to compile the program with the Oracle Pro*C precompiler.
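Since the exact environment isn't given, this is only a rough outline of the usual two steps on a Unix-style system with Oracle 10g: precompile the .pc file with proc, then compile and link against the client library. Paths and options are placeholders to adapt to your installation.

proc iname=prog.pc code=ansi_c
cc -c prog.c -I$ORACLE_HOME/precomp/public
cc -o prog prog.o -L$ORACLE_HOME/lib -lclntsh

# the demo makefile shipped with the precompiler wraps the same steps:
# make -f $ORACLE_HOME/precomp/demo/proc/demo_proc.mk build EXE=prog OBJS=prog.o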
In my OCI applications, if I get a NUMBER column that is within the range of an int, I can use value = *(int *)field.data; to get the value, but if the column size is larger than 10 digits this code no longer works. How can I get the value?
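Since the framework behind field.data isn't shown, the sketch below only illustrates the general idea at the OCI level: define the NUMBER column as text instead of a 4-byte int and convert with strtoll, so values wider than 10 digits survive. The statement is assumed to be prepared and executed already, and the names are placeholders.

#include <oci.h>
#include <stdlib.h>

/* Fetch column 1 of an already-executed SELECT as text and convert to 64 bits. */
long long fetch_big_number(OCIStmt *stmthp, OCIError *errhp)
{
    char       numbuf[42] = {0};   /* room for 38 digits plus sign and terminator */
    sb2        ind = 0;
    OCIDefine *defp = NULL;

    OCIDefineByPos(stmthp, &defp, errhp, 1,
                   numbuf, (sb4)sizeof(numbuf), SQLT_STR,
                   &ind, NULL, NULL, OCI_DEFAULT);
    OCIStmtFetch2(stmthp, errhp, 1, OCI_FETCH_NEXT, 0, OCI_DEFAULT);

    return (ind == -1) ? 0 : strtoll(numbuf, NULL, 10);
}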
I am trying to call a procedure from Pro*C. The procedure has many parameters and I do not need to supply all of them when I call it. Is there a way to do this the same way as in PL/SQL?
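One way that is often used from Pro*C is to wrap the call in an embedded PL/SQL block and use named parameter notation, so the procedure's DEFAULT values fill in whatever is not passed. The package, procedure and parameter names below are placeholders.

EXEC SQL BEGIN DECLARE SECTION;
int  emp_id = 100;
char note[41] = "updated from Pro*C";
EXEC SQL END DECLARE SECTION;

EXEC SQL EXECUTE
  BEGIN
    my_pkg.update_employee(p_id => :emp_id, p_note => :note);   -- other parameters keep their defaults
  END;
END-EXEC;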