Issues Installing TF Serving on Jetson TX2 #832
Comments
Full error if it helps: ERROR: /home/nvidia/serving/tensorflow_serving/model_servers/BUILD:205:1: Linking of rule '//tensorflow_serving/model_servers:tensorflow_model_server' failed (Exit 1): gcc failed: error executing command
My workaround: in the cc_library section (line 27 in my case), make the replacement; then it will successfully build on Jetson.
@booglerz - Hi, is this still an issue? Did the workaround help you resolve it?
A workaround was provided and this issue has been awaiting a response for more than 7 days, so I am closing it.
The workaround helped me solve the issue. Thanks.
Add fix for aarch64 described here: tensorflow/serving#832 (comment)
Noting this as a duplicate of #1277 for prioritization
Hi @mrodozov, is this pulled into the main tensorflow repo?
Hello,
Since our TF models heavily utilize unsupported TF layers, converting our TF model to a UFF in TensorRT does not seem feasible. Instead, we were thinking of getting TensorFlow Serving working on the Jetson, to act as a mini server for model inference.
Has anyone done this yet, or know of people who have? I've seen examples of installing TensorFlow on the Jetson so I assumed it might be possible to install TensorFlow Serving as well.
However, I run into issues building TF Serving with Bazel, and have exhausted my ability to narrow down the problem.
So far I have:
- Installed all prerequisites
- Installed Bazel
- Cloned TF Serving and attempted to build it from source.
I run into an error similar to the memory issues I've seen around the forums/GitHub pages (see below), and have tried to limit the resources used during the build, but nothing works (e.g., bazel build --jobs 1 --local_resources 1024,1.0,1.0 --verbose_failures tensorflow_serving/...).
The error I keep getting is:
Linking of rule '//tensorflow_serving/model_servers:tensorflow_model_server' failed (Exit 1).
bazel-out/local-opt/bin/external/aws/_objs/aws/external/aws/aws-cpp-sdk-core/source/client/ClientConfiguration.o:ClientConfiguration.cpp:function Aws::Client::ComputeUserAgentString(): error: undefined reference to 'Aws::OSVersionInfo::ComputeOSVersionString[abi:cxx11]()'
collect2: error: ld returned 1 exit status
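For context on what this error means, here is a minimal, self-contained reproduction (hypothetical code, not the actual AWS SDK sources): a symbol whose declaration is visible but whose definition was never compiled into any linked object fails in exactly this way, which is likely what happens when the SDK's platform-specific source files are left out of the aarch64 build.

```shell
# Sketch: reproduce an "undefined reference" link error in miniature.
# The names below mimic the AWS SDK symbol but are hypothetical.
cat > demo.cpp <<'EOF'
namespace Aws { namespace OSVersionInfo {
// Declaration only; the source file that defines this was never compiled,
// analogous to a platform-specific .cpp being excluded on aarch64.
const char* ComputeOSVersionString();
}} // namespace Aws::OSVersionInfo

int main() {
    // The call compiles fine; the failure only appears at link time.
    return Aws::OSVersionInfo::ComputeOSVersionString() != nullptr;
}
EOF
g++ demo.cpp -o demo 2> link.log || echo "link failed"
grep "undefined reference" link.log
```

Compilation succeeds, but ld cannot resolve the symbol and aborts, just like the collect2 failure above; the fix in this thread amounts to making sure the defining source file is actually part of the cc_library's srcs.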
Does anyone have experience attempting / successfully installing TensorFlow Serving on a Jetson?
Any clue why my build is failing?