Description
Hello,
Since our TF models heavily use unsupported TF layers, converting our TF model to UFF for TensorRT does not seem feasible. Instead, we are thinking of getting TensorFlow Serving working on the Jetson to act as a mini server for model inference.
Has anyone done this yet, or does anyone know of people who have? I've seen examples of installing TensorFlow on the Jetson, so I assumed it might be possible to install TensorFlow Serving as well.
However, I run into issues building TF Serving with Bazel, and I have exhausted my ability to narrow down the problem.
So far I have:
- installed all prerequisites
- installed Bazel
- cloned TF Serving and attempted to build it from source (roughly the commands sketched below)
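The details are from memory, so treat the exact package list and versions as approximate:

```sh
# Prerequisites (approximate; the exact package set may differ):
sudo apt-get update
sudo apt-get install -y build-essential curl git python-dev python-pip

# Bazel was installed from a release binary (version from memory).

# Clone TF Serving with its submodules and try a full build:
git clone --recurse-submodules https://github.com/tensorflow/serving.git
cd serving
bazel build tensorflow_serving/...
```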
I run into an issue similar to the memory-related failures I've seen around the forums and GitHub pages (see the error below), and I have tried to constrain the resources used during the build, but nothing has worked.
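The exact invocation was the following; as I understand it, `--local_resources 1024,1.0,1.0` tells Bazel to assume only 1024 MB of RAM, one CPU core, and one unit of I/O capability:

```sh
bazel build --jobs 1 --local_resources 1024,1.0,1.0 --verbose_failures tensorflow_serving/...
```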
The error I keep getting is:
```
Linking of rule '//tensorflow_serving/model_servers:tensorflow_model_server' failed (Exit 1).
bazel-out/local-opt/bin/external/aws/_objs/aws/external/aws/aws-cpp-sdk-core/source/client/ClientConfiguration.o:ClientConfiguration.cpp:function Aws::Client::ComputeUserAgentString(): error: undefined reference to 'Aws::OSVersionInfo::ComputeOSVersionString[abi:cxx11]()'
collect2: error: ld returned 1 exit status
```
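In case it helps narrow things down: the missing symbol looks like it comes from the AWS SDK's platform-specific sources (somewhere under `aws-cpp-sdk-core/source/platform/`), so my working theory is that the aarch64 build is not compiling those files at all. This sketch is how I have been trying to confirm that; the object-file paths are guesses based on the error above:

```sh
# Look for any compiled platform objects from the AWS SDK (paths are guesses):
find bazel-out -path '*aws*' -name '*.o' | grep -i platform

# Check which AWS object files, if any, actually define the missing symbol:
for f in $(find bazel-out -path '*aws*' -name '*.o'); do
  nm -C --defined-only "$f" 2>/dev/null | grep -q 'ComputeOSVersionString' \
    && echo "defined in: $f"
done
```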
Does anyone have experience attempting (or successfully completing) an installation of TensorFlow Serving on a Jetson?
Any clue why my build is failing?