Currently the binary serialization format (BINARY_OUTPUT) is not compatible across machines with different word sizes. This is documented, but should be fixed.
The format may be endianness-dependent as well.
One place that causes the machine dependence is the bigint serialization (operator<<(std::ostream &out, const bigint<n> &b) and operator>>(std::istream &in, bigint<n> &b) in src/algebra/fields/bigint.tcc), which simply copies the internal array of mp_limb_t limbs verbatim. This is easily fixed.
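A minimal sketch of a word-size- and endianness-independent encoding: write a fixed number of little-endian bytes derived from the bit length, rather than copying raw limbs. (This is not the libsnark code; limb_t stands in for GMP's mp_limb_t, and the function names are hypothetical.)

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for GMP's mp_limb_t; the real width is chosen at build time.
using limb_t = uint64_t;

// Serialize the low `nbits` bits of a little-endian limb array into a
// fixed-length little-endian byte string. The output depends only on
// `nbits`, not on the host's limb width or byte order.
// Assumes the stored value actually fits in `nbits` bits.
std::vector<uint8_t> serialize_bigint(const limb_t *limbs, size_t nlimbs, size_t nbits)
{
    const size_t limb_bits = sizeof(limb_t) * 8;
    std::vector<uint8_t> out((nbits + 7) / 8, 0);
    for (size_t i = 0; i < out.size(); ++i)
    {
        const size_t limb_idx = (i * 8) / limb_bits;
        const size_t shift = (i * 8) % limb_bits;
        if (limb_idx < nlimbs)
            out[i] = static_cast<uint8_t>(limbs[limb_idx] >> shift);
    }
    return out;
}

// Inverse: rebuild the (zero-padded) limb array from the byte string.
std::vector<limb_t> deserialize_bigint(const std::vector<uint8_t> &bytes, size_t nlimbs)
{
    const size_t limb_bits = sizeof(limb_t) * 8;
    std::vector<limb_t> limbs(nlimbs, 0);
    for (size_t i = 0; i < bytes.size(); ++i)
    {
        const size_t limb_idx = (i * 8) / limb_bits;
        const size_t shift = (i * 8) % limb_bits;
        if (limb_idx < nlimbs)
            limbs[limb_idx] |= static_cast<limb_t>(bytes[i]) << shift;
    }
    return limbs;
}
```

Two machines with different limb widths would then produce and accept identical byte streams, since the byte loop never exposes the limb boundary.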
Are there additional places?
There is an easy-to-fix issue with lengths: it may be the case that 32 * (number of 32-bit limbs) != 64 * (number of 64-bit limbs), so one needs to trim the upper padding bits appropriately. The same applies to endianness.
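A concrete instance of the length mismatch, as a small sketch (the function names are illustrative, not from the codebase): a 96-bit value occupies three 32-bit limbs (exactly 96 bits) but two 64-bit limbs (128 bits), so a raw limb copy carries 32 extra padding bits on the 64-bit side.

```cpp
#include <cassert>
#include <cstddef>

// Limbs needed to hold `nbits` bits with `limb_bits`-bit limbs.
constexpr size_t limbs_needed(size_t nbits, size_t limb_bits)
{
    return (nbits + limb_bits - 1) / limb_bits;
}

// Padding bits beyond the value itself for a given limb width.
constexpr size_t padding_bits(size_t nbits, size_t limb_bits)
{
    return limbs_needed(nbits, limb_bits) * limb_bits - nbits;
}
```

Note that for some bit lengths (e.g. 254 bits, common for pairing-friendly curves) the total storage happens to coincide (8 * 32 == 4 * 64 == 256), which can mask the bug in testing.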
Another issue is the Montgomery form, which uses an auxiliary modulus that is word-size dependent. It is very plausible that there is a relatively cheap conversion (especially since we probably only care about 32- and 64-bit architectures), but we haven't looked into it.
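To illustrate the dependence: in Montgomery representation a field element a is stored as a*R mod p, where R = 2^(limb_bits * num_limbs), so when the total limb storage differs across word sizes, R differs and the stored words differ. One plausible cheap fix is to reduce out of Montgomery form on write and convert back on read. Below is a toy sketch with a 32-bit modulus and R = 2^32; the modulus P and all names here are illustrative assumptions, not libsnark code.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical example modulus: 2^31 - 1, an odd prime.
const uint32_t P = 0x7fffffffu;

// -P^{-1} mod 2^32, via Newton iteration (valid for any odd P).
uint32_t neg_inv_p()
{
    uint32_t inv = P;                 // correct to 3 low bits for odd P
    for (int i = 0; i < 5; ++i)
        inv *= 2u - P * inv;          // each step doubles the correct bits
    return ~inv + 1u;                 // -inv mod 2^32
}

// Montgomery reduction: returns T * 2^{-32} mod P, for T < P * 2^32.
uint32_t redc(uint64_t T)
{
    const uint32_t np = neg_inv_p();
    const uint32_t m = static_cast<uint32_t>(T) * np;        // mod 2^32
    uint64_t t = (T + static_cast<uint64_t>(m) * P) >> 32;   // exact division
    if (t >= P) t -= P;
    return static_cast<uint32_t>(t);
}

// Convert into / out of Montgomery form (R = 2^32).
uint32_t to_mont(uint32_t a)   { return static_cast<uint32_t>((static_cast<uint64_t>(a) << 32) % P); }
uint32_t from_mont(uint32_t a) { return redc(a); }
```

Serializing from_mont(x) instead of the raw words makes the on-disk value independent of R, at the cost of one reduction per element on each side.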