A select query of integer data returns incorrect results when running pyodbc 3.0.7 with Python 2.7 on RHEL 6 against a Sybase database. I do not experience this issue with pyodbc 3.0.3.
Here is a basic example of incorrect results for int data:
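(The snippet below is a representative sketch rather than a verbatim transcript; c is assumed to be a pyodbc cursor, and the output values simply follow the constant offset described in the next sentence.)
c.execute("select 0, 1, 2")
print c.fetchone()
>> (210453397504, 210453397505, 210453397506)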
All of the integers consistently have 210453397504 (49 * 2^32) added to the correct value. This remains true until the value reaches 2^31 (2147483648), at which point results are correctly converted to the Python long type.
When I explicitly cast the data to numeric, it returns correctly:
c.execute("select cast(0 as numeric), cast(1 as numeric), cast(2 as numeric)")
print c.fetchone()
>> (Decimal('0'), Decimal('1'), Decimal('2'))
Likewise, if I explicitly add a decimal point to the values, the correct values are returned as Decimal instances:
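(Representative sketch; the exact literals and scale of the returned Decimals may differ.)
c.execute("select 0.0, 1.0, 2.0")
print c.fetchone()
>> (Decimal('0.0'), Decimal('1.0'), Decimal('2.0'))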
However, when I try to cast the data to int, I get the same incorrect results as I do by default:
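(Again a representative sketch; the offset shown matches the 210453397504 described above.)
c.execute("select cast(0 as int), cast(1 as int), cast(2 as int)")
print c.fetchone()
>> (210453397504, 210453397505, 210453397506)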
A couple of other users have reported the same issue on code.google.com:
https://code.google.com/p/pyodbc/issues/detail?id=360