A Rabbit "Through the Mirror" full implementation, with support for ReLUs. This release includes:

- Basic random sampling using `dabits` in MAMBA.
- VHDL circuits and Bristol Fashion files.
- Constant-round implementations of the Rabbit and Catrina comparison protocols from "Through the Mirror".
- A `relu` implementation, and a complementing library.
- Easy-to-use test files, to verify your installation/configuration is correct.

This is, to the best of our knowledge, the first implementation of any support for the `dabits` instruction in MAMBA.
Requirements:

- `numpy` 1.16 or above (it is used exclusively to test, which in this context means, execute `test_relu.mpc`).
- Download and configure `tii-mpclib`.
That's it. All 6 comparison modes (class `Mode`) are included in `rabbit_lib.py`:

- `rabbit_slack`: it receives `sint` and utilizes slack in conjunction with classic `rabbit`. It returns a `sint` containing either 0 or 1.
- `rabbit_list`: it receives `sint` and utilizes a rejection list in conjunction with classic `rabbit`. It returns a `sint` containing either 0 or 1.
- `rabbit_fp`: it receives `sint` and assumes a prime close to a power of 2, in conjunction with classic `rabbit`. The method requires you to define whether the approximation is from below or above. It returns a `sint` containing either 0 or 1.
- `rabbit_conv`: it receives `sint` and 2^k domains composed bitwise, as proposed by `rabbit`. It returns a `sint` containing either 0 or 1.
- `rabbit_less_than`: it receives `sint` and utilizes a Boolean less-than circuit. It returns a `sint` containing either 0 or 1.
- `dabits_ltz`: it receives `sint` and utilizes slack generated via `dabits`, following Catrina and De Hoogh. It returns a `sint` containing either 0 or 1.
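All six modes realize the same less-than-zero predicate; they differ only in the underlying protocol. As a plaintext reference (a hypothetical helper for intuition, not part of `rabbit_lib.py`), the value each mode secret-shares can be sketched as:

```python
def ltz_reference(x):
    """Plaintext sketch of the predicate the six comparison modes compute.

    Each mode returns a secret-shared sint holding 1 if x < 0 and 0
    otherwise; only the protocol (slack, rejection list, power-of-2
    prime, bitwise composition, circuit, or dabits) differs.
    """
    return 1 if x < 0 else 0
```

In the library the input and output are `sint` shares rather than plain integers, but the opened result always matches this predicate.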
You can now run the test file and check that all tests are green (not red).
Parallel Truncation with ReLU. We observe that many machine learning functions, such as neural networks, are constructed as a series of multiplication or convolution operations, followed by the application of ReLU. To leverage this structure effectively, we merge a set of sequential truncations with the ReLU operation, leading to enhanced efficiency. To achieve this, we introduced an additional comparison mode in `rabbit_lib.py`:

- `dabits_trunc_ltz`: it receives `sint` along with a public int `batch`, and reuses the mask both for the truncation of the `batch` number of previous multiplications and for the `dabits_ltz` comparison. It returns two `sint` values: one represents the truncated input, and the other is the comparison bit.
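A plaintext sketch of the fused operation may help; note this is our own reference model, not the library's implementation, and it assumes fixed-point values with `f` fractional bits accumulated over `batch` pending multiplications:

```python
def trunc_ltz_reference(x, f, batch):
    """Plaintext sketch of the pair dabits_trunc_ltz secret-shares.

    Assumption (ours, not from the library): after `batch` sequential
    fixed-point multiplications, x carries batch * f surplus fractional
    bits, which the fused truncation removes while the reused mask also
    yields the sign bit.
    """
    shift = batch * f
    # round toward zero so negative inputs truncate symmetrically
    truncated = x >> shift if x >= 0 else -((-x) >> shift)
    ltz = 1 if x < 0 else 0
    return truncated, ltz
```

The efficiency gain comes from generating a single mask that serves both the truncation and the comparison, instead of masking twice.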
NOTE: The facades have been built in a way that allows for the vectorization of inputs. This feature is extremely important when compiling without optimizations, which is a common issue in Machine Learning.

We included facade methods that can be parametrized with the desired rabbit mode. This is true for simple LTZ tests and for ReLUs:
- `rabbit`: in `rabbit_lib.py`. It receives `sfix` inputs and a comparison mode. It returns the comparison using the specified mode.
- `rabbit_sint`: in `rabbit_lib.py`. It receives `sint` inputs and a comparison mode. It returns the comparison using the specified mode.
- `relu`: in `relu_lib.py`. It receives a `sint` input, a relu mode and a gradient mode (optional). It returns a `relu_response` object using the specified mode.
- `relu_sfix`: in `relu_lib.py`. It receives a `sfix` input, a relu mode and a gradient mode (optional). It returns a `relu_response` object using the specified mode.
- `relu_2d`: in `relu_lib.py`. It receives either a `sint` or a `sfix` matrix (2-dimensional vector) input and a gradient mode (optional). It returns a `relu_response` matrix (2-dimensional vector) object using the specified mode.
- `relu_3d`: in `relu_lib.py`. It receives either a `sint` or a `sfix` matrix (3-dimensional vector) input and a gradient mode (optional). It returns a `relu_response` matrix (3-dimensional vector) object using the specified mode.
- `trunc_LTZ`: in `rabbit_lib.py`. It receives a `sint` input, a public parameter `batch` and a comparison mode. It returns the batch-truncated input along with the comparison result using the specified mode.
- `relu_trunc`: in `relu_lib.py`. It receives a `sint` input, a relu mode, a public parameter `batch` and a gradient mode (optional). It performs ReLU over the batch-truncated input and returns a `relu_response` object using the specified mode.
- `relu_trunc_sfix`: in `relu_lib.py`. It receives a `sfix` input, a relu mode, a public parameter `batch` and a gradient mode (optional). It performs ReLU over the batch-truncated input and returns a `relu_response` object using the specified mode.
- `relu_trunc_2d` (not vectorized): in `relu_lib.py`. It receives either a `sint` or a `sfix` matrix (2-dimensional vector) input, a public parameter `batch` and a gradient mode (optional). It performs ReLU over the batch-truncated input matrix and returns a `relu_response` matrix (2-dimensional vector) object using the specified mode.
- `relu_trunc_3d` (not vectorized): in `relu_lib.py`. It receives either a `sint` or a `sfix` matrix (3-dimensional vector) input, a public parameter `batch` and a gradient mode (optional). It performs ReLU over the batch-truncated input matrix and returns a `relu_response` matrix (3-dimensional vector) object using the specified mode.
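For intuition, the value/gradient pair the ReLU facades compute over secret shares can be sketched in plaintext as follows. The exact contents of `relu_response` are an assumption on our part; field encoding and vectorization are omitted:

```python
def relu_reference(x):
    """Plaintext sketch of a ReLU evaluation with its gradient bit.

    Assumed relu_response contents (hypothetical): the rectified value
    max(x, 0) and the gradient, i.e. the comparison bit reused as
    ReLU's derivative.
    """
    gradient = 1 if x > 0 else 0  # derivative of ReLU at x
    value = x * gradient          # equals max(x, 0)
    return value, gradient
```

This also shows why ReLU pairs naturally with the comparison modes above: the gradient bit is exactly the output of an LTZ-style test, so one comparison serves both the forward value and the backward gradient.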
The library can be configured to use a different base `Circuit` by default, or any `Garbling` online/offline mode. It provides 2 global variables for that purpose, namely (with their actual values):

```
DEFAULT_CIRCUIT = Circuit.ONLY_ANDS
DEFAULT_GARBLING = Garbling.ONLINE_GARBLING
```

Depending on the function, you can always override the library default and use a specific configuration when the function is invoked. These kinds of configurations are a bit more complicated and are recommended for expert users. You can check `rabbit_lib.py` for more details.
If you decide to use a different circuit, you have 2 options:

- Change the default. This would affect all executions of the functions above.
- Parametrize the circuit id. You can do this by using the `Circuit` class on the native methods that use circuits, for example:

```
a = sint(5)
c = dabits_LTZ(a, circuit=Circuit.ANDS_XORS)
```

Please be advised that you cannot directly parametrize the circuit on the `facade` functions. Hence, we believe this kind of invocation is better left for advanced users.
NOTE: Finally, you can also parametrize the comparison-related `facade`s with a specific `rabbit_lib` mode as follows:

```
a = sint(5)
b = sint(1)
c = dabits_LTZ(a, b, mode=Mode.RABBIT_LIST)
```

On the other hand, the `relu_lib`-related `facade`s cannot be parametrized this way. They use `Mode.dabits_LTZ` by default.
If you have questions please contact any of the authors. Current repo maintainer is: Abdelrahaman ALY.
- Abdelrahaman ALY
- Victor SUCASAS
- Kashif NAWAZ
- Eugenio SALAZAR
- Ajith SURESH