This looks like the GPU running out of memory. I encountered a similar problem: when I set downsample_factor=1.0, I get this error:
MemoryError: Error allocating 4305920000 bytes of device memory (out of memory).
Apply node that caused the error: GpuAllocEmpty(Shape_i{0}.0, Shape_i{0}.0, Elemwise{Composite{((((i0 + i1) - i2) // i3) + i3)}}[(0, 0)].0, Elemwise{Composite{((((i0 + i1) - i2) // i3) + i3)}}[(0, 0)].0)
Toposort index: 188
Inputs types: [TensorType(int64, scalar), TensorType(int64, scalar), TensorType(int64, scalar), TensorType(int64, scalar)]
Inputs shapes: [(), (), (), ()]
Inputs strides: [(), (), (), ()]
Inputs values: [array(10000), array(32), array(58), array(58)]
Outputs clients: [[GpuDnnConv{algo='small', inplace=True}(GpuContiguous.0, GpuContiguous.0, GpuAllocEmpty.0, GpuDnnConvDesc{border_mode='valid', subsample=(1, 1), conv_mode='conv'}.0, Constant{1.0}, Constant{0.0})]]
HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
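For anyone who wants to follow the hints above, the flags can be passed through the `THEANO_FLAGS` environment variable (a configuration sketch; the script name here is a placeholder for whatever you run the notebook code as):

```shell
# Disable most optimizations to get a back-trace of where the node was created
THEANO_FLAGS='optimizer=fast_compile' python your_script.py

# If that is not enough, turn optimizations off entirely
THEANO_FLAGS='optimizer=None' python your_script.py

# Get a debugprint and storage-map footprint of the failing apply node
THEANO_FLAGS='exception_verbosity=high' python your_script.py
```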
Because I'm using a GTX 970M in my laptop, the failing allocation of 4305920000 bytes (roughly 4 GB) clearly exceeds my GPU's memory capacity (3 GB).
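The number lines up with the shape reported in "Inputs values" above: a float32 tensor of shape (10000, 32, 58, 58) needs exactly that many bytes (assuming 4 bytes per element, which is what the GPU convolution uses):

```python
# Shape reported by GpuAllocEmpty in "Inputs values": batch, channels, height, width
batch, channels, height, width = 10000, 32, 58, 58
bytes_per_float32 = 4

total_bytes = batch * channels * height * width * bytes_per_float32
print(total_bytes)            # 4305920000, matching the MemoryError
print(total_bytes / 1024**3)  # ~4.01 GiB, more than a 3 GB card can hold
```

Since the batch dimension (10000) dominates the allocation, evaluating in smaller minibatches should bring the allocation well under 3 GB.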
I hit this while running the Spatial Transformer Network example (https://github.com/Lasagne/Recipes/blob/master/examples/spatial_transformer_network.ipynb) with downsample_factor=1.0. Can anyone reproduce the same issue?