My journey so far into javacv / javacpp on the jetson TX2 / Opencv / Deeplearning4j / libnd4j / Nd4j / openblas etc.. and at the end my remaining issue ;) #1021
@Spawn32 Thanks for posting! So, there's a couple of separate things that we are going to need here:
/cc @supertick
I've put the arm64 work on hold for now, after running some OpenCV benchmarks on Pi3 32-bit vs 64-bit ARM and finding 64-bit a bit slower (and it looked like a lack of NEON optimisation might be making both builds significantly slower than x86). So, right now the Pi doesn't look like a viable platform. I guess if someone wants to have a crack at it, the arm sections might give most of the template - a lot could work just by switching to aarch64
@vb216 That's too bad :( Where would the cross-compilation toolchain be for 64-bit builds?
I think it would be this package: gcc-aarch64-linux-gnu
So, nothing like https://github.com/raspberrypi/tools? That's annoying...
Ah, I see what you mean. Not that I've seen, and I'd be surprised if there was anything from the Raspberry Pi side - there isn't an official 64-bit OS for Pi platforms (nor any commitment to provide one anytime soon), so I think there's not much demand for cross compiling there. Modifying the native build section could be another approach, and building on the target device? That wouldn't help get the snapshots repo updated, though, I guess.
Looks like that's what people do for Jetson too:
@saudet I will commit my changes to the build scripts as soon as I redo everything on the 1.0.0 BETA release. Just hoping someone has the time to help out with a new FrameGrabber, it's a bit beyond my skills, I am new to the javacv world ;)
Short update: javacv / javacpp OpenCV with a YOLO filter is now running great on the TX2. I still need to compile a small missing module that DL4J needs and will test it tomorrow... And big thanks to @kwatters for fixing the frame grabber :)
Trying to compile javacv for the NVIDIA Jetson TX2 too. The compilation command I'm using is:
Error thrown is:
Steps I had to perform in the dockcross image to get this far:
I haven't seen any issues like that with OpenCV 4.0.0 on any of the other platforms, but try with the 1.4.3 release tag instead. Maybe that'll work better.
OK, it took me a while to test that through. Up until 1.3, arm64 is not supported, so I didn't test those. With 1.4, the build runs
[INFO] g++ -I/work/opencv/cppbuild/linux-arm64/include/ -I/work/opencv/target/classes/org/bytedeco/javacpp/include/ -I/usr/lib/jvm/java-7-openjdk-amd64/include -I/usr/lib/jvm/java-7-openjdk-amd64/include/
and then says "skipping incompatible ...", but the file exists and is a valid arm64 lib file:
opencv/cppbuild/linux-arm64/lib/libopencv_imgproc.so.3.4.0
Same result with 1.4.1, 1.4.2, and 1.4.3.
@AlessioM Does it do that
I'm having the same trouble with dockcross-linux-x86, at least if I use it like in this. I get
However, I've spent more time than I care to admit on that now, so I'm cutting my losses; I rewrote my code to use OpenCV 2.4.9, which can be installed from the repository on the Jetson TX2.
Right, it looks like OpenCV can't locate your Java installation in that environment. Anyway, we need someone to come up with a build environment that works to cross-compile for linux-arm64; that's what this issue is for. FWIW, I think the closest to it would be linux-ppc64le, so I'd recommend starting with that: https://github.com/bytedeco/javacpp-presets/blob/master/ci/install-ppc.sh
I just got one of those new Jetson Nano boards. $99... not too shabby. Anyway, I'm starting down the path of building javacpp on it natively. The first issue I came across was with the OpenBLAS build: it didn't recognize the architecture linux-arm64... I'll post more updates as I work through the rest of the build.
No difference, the idea is to use linux-arm64 because it's a simpler name. If you see linux-aarch64 coming up anywhere, let's get that fixed.
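For anyone unsure which name JavaCPP actually looks for at runtime, here is a minimal sketch (assuming the org.bytedeco.javacpp.Loader API; the class name is just for illustration) that prints the resolved platform string:

import org.bytedeco.javacpp.Loader;

public class PlatformCheck {
    public static void main(String[] args) {
        // Loader normalizes os.name/os.arch, so on a Jetson TX2 this should
        // print "linux-arm64" rather than "linux-aarch64".
        System.out.println(Loader.getPlatform());
    }
}

Running this on the target board is a quick way to confirm which -Djavacpp.platform value the build artifacts need to match.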
Heh, that was quick... So, for OpenBLAS in the cppbuild.sh, I just added another branch for linux-aarch64 and one for linux-arm64. I used the default gcc/gfortran "aarch64-linux-gnu-gcc-7" and "aarch64-linux-gnu-gfortran-7" as the CC and FC. I assume BINARY should be set to 64 (for bits?), and I didn't know what to set "TARGET" to, so I just arbitrarily chose ARMV8.
Add whatever pattern for the architecture here: https://github.com/bytedeco/javacpp-presets/blob/master/cppbuild.sh#L36
And we also need to make sure it gets translated in the
BTW, @vb216, do you think you'd like to start working on this? The new Jetson Nano goes for not much more than a beefy Raspberry Pi these days!
I came close to buying one a couple of weeks ago when I saw them with a 10% discount! Need to think of a good excuse to justify the spend ;)
OK, we now have builds for OpenCV, FFmpeg, OpenBLAS, FFTW, GSL, Leptonica, Tesseract, and others. Please try them out and let me know if there are any issues before the release later next month: http://bytedeco.org/builds/ Thanks! For CUDA, Jetson doesn't use the same version as the other platforms, so I'm not too sure how to best support this, but that's going to be tracked at bytedeco/javacpp-presets#735.
Version 1.5.1 has now been released with all the builds! Enjoy!
Hi,
I have seen other users struggling to get all these pieces compiled for the Jetson TX2 and thought I would share my findings and results so far.
I started out by involving myself in the myrobotlab.org Nixie project, setting a goal of getting it running on the powerful Jetson TX2 platform (and, later this year, the new Jetson Xavier). Thanks to all the nice people over at MRL, like @supertick, @kwatters, and the rest of the brilliant minds over there... :)
I started out with the OpenCV module since it does not depend on a lot of other modules. After compiling it for linux-aarch64 and scratching my head for a while over why javacv would not pick it up, I realized that it needs to be compiled for linux-arm64.
The good thing was, it worked: I can set up a webcam (Logitech 922) and it works like it should, and if I start an RTSP server on the TX2 (using one of my three CSI-connected Raspberry Pi V2 cams / IMX219), I can connect to it without any problem with OpenCV.
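For reference, here is a minimal javacv sketch of the two cases that already work, a USB webcam by device index and an RTSP stream by URL, both through OpenCVFrameGrabber (the class name and the RTSP address below are just placeholders for illustration):

import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.OpenCVFrameGrabber;

public class Tx2GrabDemo {
    public static void main(String[] args) throws FrameGrabber.Exception {
        // Webcam case: device index 0 (the USB cam).
        // RTSP case: pass the URL instead, e.g.
        //   new OpenCVFrameGrabber("rtsp://127.0.0.1:8554/test");  // placeholder address
        FrameGrabber grabber = new OpenCVFrameGrabber(0);
        grabber.start();
        CanvasFrame canvas = new CanvasFrame("TX2 preview");
        Frame frame;
        while (canvas.isVisible() && (frame = grabber.grab()) != null) {
            canvas.showImage(frame);
        }
        grabber.stop();
        canvas.dispose();
    }
}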
And here's where my problem comes in: I need to connect directly to the CSI cams to get the lowest latency and delay possible, and this is not possible at the moment with OpenCV's frame grabber. After having a talk with RidgeRun (I ported the IMX219 driver source code I got from them onto the current firmware release on the TX2), they told me they were pretty sure this is because CSI cams normally output only the "Bayer" format, while OpenCV needs YUV.
This conversion is normally done by the nvcamerasrc "daemon" running on the TX2; here are two examples of how it's done in "native" OpenCV:
https://github.com/jetsonhacks/buildOpenCVTX2/blob/master/Examples/cannyDetection.py
https://github.com/jetsonhacks/buildOpenCVTX2/blob/master/Examples/gstreamer_view.cpp
Here is another example taken from:
https://gist.github.com/jkjung-avt/86b60a7723b97da19f7bfa3cb7d2690e
import cv2

def open_cam_onboard(width, height):
# On versions of L4T prior to 28.1, add 'flip-method=2' into gst_str
gst_str = ('nvcamerasrc ! '
'video/x-raw(memory:NVMM), '
'width=(int)2592, height=(int)1458, '
'format=(string)I420, framerate=(fraction)30/1 ! '
'nvvidconv ! '
'video/x-raw, width=(int){}, height=(int){}, '
'format=(string)BGRx ! '
'videoconvert ! appsink').format(width, height)
return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
And this is an example of a typical string I use to bring up the camera in a window from the terminal:
DISPLAY=:0 gst-launch-1.0 nvcamerasrc sensor-id=0 contrast=1 wbmode=3 saturation=1 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvegltransform ! nveglglessink -e
So, "nvcamerasrc" is very important since it does the conversion from bayer to yuv, and it also run's of the cuda cores (fast) .
If i use the "file" function from opencv's framgrabber and give it: /dev/video0 i can by using dmesg -wH see it powering up the correct cam, starting a stream and beginning to adjust the gain, 5-10 seconds later it will power off the cam.
If i use opencv's framgrabber and just tells it to select cam 0 or 1 it will power the cam, but nothing more happens.
So I would really appreciate it if someone could help out with this part, modifying or adding a new frame grabber for us present and future Jetson users. :)
@saudet tipped me off about using the FFmpeg grabber, but since there is no support at all (as far as I know?) for native FFmpeg on the TX2, this would probably be a lot of work.
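To make the request concrete, here is a rough sketch of what I would like to be able to do from the Java side: hand the nvcamerasrc pipeline string to a grabber, the same way the Python example above hands it to cv2.VideoCapture. This is only an illustration of the desired behavior, not something that works today: it assumes OpenCV was built with GStreamer support and that the grabber forwards the string untouched to the underlying capture, which is exactly the part that is missing.

import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.bytedeco.javacv.OpenCVFrameGrabber;

public class CsiCamSketch {
    public static void main(String[] args) throws FrameGrabber.Exception {
        // Same pipeline as the Python example: nvcamerasrc does the Bayer -> YUV
        // conversion, nvvidconv + videoconvert hand BGRx frames to the appsink.
        String pipeline = "nvcamerasrc sensor-id=0 ! "
                + "video/x-raw(memory:NVMM), width=(int)2592, height=(int)1458, "
                + "format=(string)I420, framerate=(fraction)30/1 ! "
                + "nvvidconv ! video/x-raw, width=(int)1280, height=(int)720, "
                + "format=(string)BGRx ! videoconvert ! appsink";

        // Assumption: the string reaches an OpenCV capture built with GStreamer support.
        FrameGrabber grabber = new OpenCVFrameGrabber(pipeline);
        grabber.start();
        CanvasFrame canvas = new CanvasFrame("CSI cam");
        Frame frame;
        while (canvas.isVisible() && (frame = grabber.grab()) != null) {
            canvas.showImage(frame);
        }
        grabber.stop();
        canvas.dispose();
    }
}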
Building OpenCV / OpenBLAS / libnd4j / ND4J / CUDA / Deeplearning4j, etc...
Actually, I found it much easier than I first thought it would be. Keep in mind that I build natively on my TX2; it takes a bit more time, but there is no fiddling with cross compilers... :)
Most modules will build nicely if you just add the linux-arm64 platform, point to its compiler, and change the needed flags.
Inside the module directory you want to build, add linux-arm64 to the .sh files, and do the same in the /platform/pom.xml.
Then, in the module directory, do:
mvn clean install -Djavacpp.platform=linux-arm64 -DskipTests -Dmaven.javadoc.skip=true
libnd4j needs to be built with:
./buildnativeoperations.sh -a native
./buildnativeoperations.sh -a native -c cuda -cc 62
After libnd4j is built, export its path like this:
LIBND4J_HOME=/media/nvidia/970EVO/Nixie_tmp/libnd4j   # in my case...
export LIBND4J_HOME
The only module left for me to build is Deeplearning4j. It should be an easy fix, but I had an error compiling the DataVec dependency; at that point I returned to my OpenCV frame grabber problem, since I need vision to test the rest anyway :)
Since I compiled the "SNAPSHOT" tag, I will need to redo everything, because I need the "1.0.0 BETA" tag for now.
If I find I forgot something or left something out while I redo everything on the 1.0.0 BETA tag, I will edit / update this issue.
I really hope someone can help out with the frame grabber issue... :)