DepthAi: Access to NN FP16 output layer causes JVM dump #1155
Comments
Could you try to use …
@saudet Perfect, this seems to work. What's the background here?
JavaCPP doesn't allocate memory for new …
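As a general illustration of that point, here is a generic JavaCPP sketch (not the specific call suggested above): FloatPointer and asBuffer() are standard JavaCPP API, and the size is simply the value from this issue.

import java.nio.FloatBuffer;
import org.bytedeco.javacpp.FloatPointer;

public class JavaCppAllocationSketch {
    public static void main(String[] args) {
        // Explicitly allocate native memory for 2688 floats; the buffer view below
        // is only valid because this allocation exists and stays alive.
        FloatPointer data = new FloatPointer(2688);
        FloatBuffer view = data.asBuffer();
        System.out.println(view.capacity());   // 2688
        view.put(0, 1.0f);                     // safe: backed by real native memory

        // By contrast, a buffer view over memory that was never allocated on the
        // native side (or has already been freed) can still report a plausible
        // capacity, yet reading or writing through it may crash the JVM.
    }
}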
Thanks for your explanation. I'll close the issue.
I've somewhat improved the mappings for this in commit 0dca099 such that we now get a …
Running a neural network via Java on DepthAI works fine as long as I do not try to access FP16 data.
Example
NNData seg = nn.getNNData();
FloatBuffer output_0 = seg.getLayerFp16("output_0");
float value = output_0.get();
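For context, a fuller read of that buffer would normally look something like the sketch below; getNNData() and getLayerFp16() are the calls from the snippet above, the rest is standard java.nio, and the layer name and expected size are taken from this report. Any of the reads at the end is the kind of access that brings down the JVM here.

// Sketch only: assumes `nn` already provides a populated NNData, as in the snippet above.
NNData seg = nn.getNNData();
FloatBuffer output_0 = seg.getLayerFp16("output_0");

// Metadata queries return plausible values...
System.out.println(output_0.isDirect());    // direct buffer over native memory
System.out.println(output_0.capacity());    // expected: 2688 floats (10752 bytes)

// ...but actually reading the data crashes the JVM.
float[] values = new float[output_0.remaining()];
output_0.get(values);                        // access like this triggers the dump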
Any access to the data of this buffer fails and causes a dump of the JVM. The limit and capacity of the buffer seem to be correct at 10752 bytes (shown as a DirectByteBuffer), since the output layer holds 2688 FP16 values returned as 4-byte floats (2688 * 4 = 10752).
Could it be that the FP16 -> Java float conversion is not supported properly?
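On that last question: the binding is expected to hand back already-converted 4-byte floats, so none of this should be needed in practice, but for reference, decoding a raw IEEE 754 half-precision (FP16) value into a Java float by hand looks roughly like this (illustrative sketch only):

public class Fp16 {
    // Decode one IEEE 754 half-precision value (16 bits) into a 32-bit Java float.
    static float halfToFloat(short h) {
        int sign = (h >> 15) & 0x1;
        int exp  = (h >> 10) & 0x1f;
        int mant = h & 0x3ff;
        if (exp == 0x1f) {                    // infinity or NaN
            return Float.intBitsToFloat((sign << 31) | 0x7f800000 | (mant << 13));
        }
        if (exp == 0) {
            if (mant == 0) {                  // signed zero
                return sign == 0 ? 0.0f : -0.0f;
            }
            while ((mant & 0x400) == 0) {     // subnormal: normalize the mantissa
                mant <<= 1;
                exp--;
            }
            exp++;
            mant &= 0x3ff;
        }
        // Re-bias the exponent (15 -> 127) and widen the mantissa (10 -> 23 bits).
        return Float.intBitsToFloat((sign << 31) | ((exp + 112) << 23) | (mant << 13));
    }

    public static void main(String[] args) {
        System.out.println(halfToFloat((short) 0x3C00));  // 1.0
        System.out.println(halfToFloat((short) 0xC000));  // -2.0
    }
}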