213

I'm eager to start using Google's new TensorFlow library in C++. The website and docs are unclear about how to build the project's C++ API, and I don't know where to start.

Can someone with more experience share a guide to using TensorFlow's C++ API?

Deduplicator
  • 44,692
  • 7
  • 66
  • 118
theideasmith
  • 2,835
  • 2
  • 13
  • 20
  • 5
    +1 for your question. Any chance to install/compile on Windows ? Website shows only Linux/Mac . A guide to have bazel run is needed. This example could be a good starting point to learn: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android – alrama Nov 13 '15 at 15:39
  • 1
    This question still doesn't have an answer. How to install just the TensorFlow C++ API libraries has no guide to it, and the accepted answer does not give any guidance on how to do that, even through any of the multiple provided links. – iantonuk Dec 19 '17 at 09:29
  • 1
    For Windows, I found [this question](https://stackoverflow.com/questions/41070330/is-it-possible-to-use-tensorflow-c-api-on-windows) and its accepted answer most helpful. By building the example trainer project, you build the entire TensorFlow project as a static library, then link to it. You can make your own projects and link TensorFlow the same way. – omatai Feb 20 '18 at 21:24

13 Answers

56

To get started, you should download the source code from Github, by following the instructions here (you'll need Bazel and a recent version of GCC).

The C++ API (and the backend of the system) is in tensorflow/core. Right now, only the C++ Session interface, and the C API are being supported. You can use either of these to execute TensorFlow graphs that have been built using the Python API and serialized to a GraphDef protocol buffer. There is also an experimental feature for building graphs in C++, but this is currently not quite as full-featured as the Python API (e.g. no support for auto-differentiation at present). You can see an example program that builds a small graph in C++ here.
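
To make that concrete, here is a minimal sketch (mine, not taken from the docs) of loading a graph serialized from Python and running it through the C++ Session interface. The file name graph.pb and the node names "input" and "output" are placeholders for whatever your own Python code exported:

#include <iostream>
#include <memory>
#include <vector>

#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/core/public/session.h"

int main() {
  // Read the GraphDef that was serialized from Python.
  tensorflow::GraphDef graph_def;
  tensorflow::Status status = tensorflow::ReadBinaryProto(
      tensorflow::Env::Default(), "graph.pb", &graph_def);
  if (!status.ok()) { std::cerr << status.ToString() << "\n"; return 1; }

  // Create a session and register the graph with it.
  std::unique_ptr<tensorflow::Session> session(
      tensorflow::NewSession(tensorflow::SessionOptions()));
  status = session->Create(graph_def);
  if (!status.ok()) { std::cerr << status.ToString() << "\n"; return 1; }

  // Feed a scalar float into "input" and fetch "output".
  tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape());
  input.scalar<float>()() = 1.0f;
  std::vector<tensorflow::Tensor> outputs;
  status = session->Run({{"input", input}}, {"output"}, {}, &outputs);
  if (!status.ok()) { std::cerr << status.ToString() << "\n"; return 1; }

  std::cout << outputs[0].DebugString() << "\n";
  session->Close();
  return 0;
}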

The second part of the C++ API is the API for adding a new OpKernel, which is the class containing implementations of numerical kernels for CPU and GPU. There are numerous examples of how to build these in tensorflow/core/kernels, as well as a tutorial for adding a new op in C++.
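
For a flavour of what that looks like, here is a sketch of a trivial op and its CPU kernel, following the pattern of the linked tutorial (the ZeroOut op is the tutorial's own example, reproduced from memory, so treat it as illustrative rather than authoritative):

#include "tensorflow/core/framework/op.h"
#include "tensorflow/core/framework/op_kernel.h"

using namespace tensorflow;

// Declare the op's interface: one int32 input, one int32 output.
REGISTER_OP("ZeroOut")
    .Input("to_zero: int32")
    .Output("zeroed: int32");

// CPU kernel: copies the first element and zeroes the rest.
class ZeroOutOp : public OpKernel {
 public:
  explicit ZeroOutOp(OpKernelConstruction* context) : OpKernel(context) {}

  void Compute(OpKernelContext* context) override {
    const Tensor& input_tensor = context->input(0);
    auto input = input_tensor.flat<int32>();

    Tensor* output_tensor = nullptr;
    OP_REQUIRES_OK(context, context->allocate_output(0, input_tensor.shape(),
                                                     &output_tensor));
    auto output = output_tensor->flat<int32>();

    const int N = input.size();
    for (int i = 1; i < N; i++) output(i) = 0;
    if (N > 0) output(0) = input(0);
  }
};

REGISTER_KERNEL_BUILDER(Name("ZeroOut").Device(DEVICE_CPU), ZeroOutOp);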

mrry
  • 125,488
  • 26
  • 399
  • 400
  • 8
    No installation instructions for C++ is shown https://www.tensorflow.org/install/, but there are example programs shown https://www.tensorflow.org/api_guides/cc/guide that clearly is using C++ api. How exactly did you install C++ for Tensorflow? – user3667089 Jun 05 '17 at 23:39
  • @user3667089 The location of the installation procedure is now located at https://www.tensorflow.org/install/install_sources – Dwight Jun 27 '17 at 20:07
  • 7
    @Dwight I saw that page before but I don't see any info about C++ – user3667089 Jun 27 '17 at 20:24
  • 2
  • @user3667089 The headers, after the installation procedure above, will be located within the dist-packages folder of the python distribution you choose during the installation procedure(such as /usr/local/lib/python2.7/dist-packages). In that folder there will be a folder tensorflow/include, which will have all the headers. You'll need to do a little bit of work for making sure whatever you are building has that on its include path. I personally use CMAKE, so am trudging through [this](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/public/session.h). – Dwight Jun 28 '17 at 18:15
  • Hi mrry, It's been 2 years since your answers. I checked out tensorflow c++ API reference. I think It is now possible to build graph with tensorflow. But I don't find equivalent API to dataset/dataset provider which I think is crucial for speeding up training. Do you have information on this? – scott huang Dec 13 '17 at 07:56
  • 7
    This is not a real answer up to this date. It starts with "To get started" and then links no relevant info in a place that people looking guidance here would already looked up. It then fails to provide next step, changing subject. – iantonuk Dec 19 '17 at 09:35
  • 1
    @Dwight how do you build the .so file? – pooya13 Nov 24 '20 at 02:52
31

To add to @mrry's post, I put together a tutorial that explains how to load a TensorFlow graph with the C++ API. It's very minimal and should help you understand how all of the pieces fit together. Here's the meat of it:

Requirements:

  • Bazel installed
  • Clone TensorFlow repo

Folder structure:

  • tensorflow/tensorflow/|project name|/
  • tensorflow/tensorflow/|project name|/|project name|.cc (e.g. https://gist.github.com/jimfleming/4202e529042c401b17b7)
  • tensorflow/tensorflow/|project name|/BUILD

BUILD:

cc_binary(
    name = "<project name>",
    srcs = ["<project name>.cc"],
    deps = [
        "//tensorflow/core:tensorflow",
    ]
)
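
If you just want something that compiles, a |project name|.cc along the lines of the following is enough to verify the setup (a rough sketch only; the gist linked above shows a fuller example that actually loads and runs a graph):

#include <iostream>

#include "tensorflow/core/public/session.h"

int main() {
  // Creating (and closing) a session is enough to prove that the
  // binary links against the TensorFlow core library.
  tensorflow::Session* session = nullptr;
  tensorflow::Status status =
      tensorflow::NewSession(tensorflow::SessionOptions(), &session);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }
  std::cout << "Session created successfully" << std::endl;
  session->Close();
  delete session;
  return 0;
}

You can then build it from the repository root with something like bazel build //tensorflow/<project name>:<project name>.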

Two caveats for which there are probably workarounds:

  • Right now, building things needs to happen within the TensorFlow repo.
  • The compiled binary is huge (103MB).

https://medium.com/@jimfleming/loading-a-tensorflow-graph-with-the-c-api-4caaff88463f

JohnAllen
  • 7,317
  • 9
  • 41
  • 65
Jim
  • 650
  • 1
  • 6
  • 12
  • 1
    Hello Jim. is this tutorial still the best/easiest way to compile a c++ project with TF? Or is there an easier way now as you predict at the end of your post? – Sander Apr 14 '16 at 21:41
  • 3
    I believe there is now a built-in build rule. I submitted a PR for it a while back. I'm not sure about the caveats. I would expect the first to remain as it's a result of Bazel, not TF. The second could likely be improved upon. – Jim Apr 16 '16 at 00:02
  • I followed that tutorial, but when running `./loader` I get an error: `Not found: models/train.pb`. – 9th Dimension Jul 06 '16 at 18:44
  • 4
    Is there now way to have your project outside of the TensorFlow source code directory? – Seanny123 May 24 '17 at 03:29
  • yep, how to make it oustide given you have shared .so library of tensorflow? – Xyz May 26 '17 at 17:36
  • I have built `libtensorflow.so` but still have linking problems. – Ziyuan Aug 09 '18 at 14:58
  • I was able to build and use tensorflow successfully in a separate project by following the tutorial from https://tuatini.me/building-tensorflow-as-a-standalone-project/ the only disadvantage IMHO is that you still have to copy a lot of headers to your project as tensorflow downloads its own version of eigen and libprotobuf – rkachach Aug 29 '18 at 13:34
  • 1
    @Jim any improved method available to do inference in C/C++ at this point of time? – Sathyamoorthy R Feb 06 '19 at 10:21
24

If you are thinking of using the Tensorflow C++ API in a standalone package, you will probably need tensorflow_cc.so (there is also a C API version, tensorflow.so). To build the C++ version you can use:

bazel build -c opt //tensorflow:libtensorflow_cc.so

Note 1: If you want to add intrinsics support, you can add these flags: --copt=-msse4.2 --copt=-mavx

Note 2: If you are planning to use OpenCV in your project as well, there is an issue when using both libs together (tensorflow issue), and you should use --config=monolithic.

After building the library, you need to add it to your project. To do that, you can include these paths:

tensorflow
tensorflow/bazel-tensorflow/external/eigen_archive
tensorflow/bazel-tensorflow/external/protobuf_archive/src
tensorflow/bazel-genfiles

And link the library to your project:

tensorflow/bazel-bin/tensorflow/libtensorflow_framework.so (unused if you build with --config=monolithic)
tensorflow/bazel-bin/tensorflow/libtensorflow_cc.so

And when building your project, you should also specify to your compiler that you are going to use the C++11 standard.

Side note: paths are relative to TensorFlow version 1.5 (you may need to check whether anything has changed in your version).

Also, this link helped me a lot in finding all this information: link

Renan Wille
  • 298
  • 2
  • 8
  • 1
    I needed this additional include path to build with version 1.11: `tensorflow/bazel-tensorflow/external/com_google_absl` – Noah_S Oct 23 '18 at 20:55
19

First, after installing protobuf and eigen, build TensorFlow:

./configure
bazel build //tensorflow:libtensorflow_cc.so

Then copy the following include headers to /usr/local/include and the dynamic shared library to /usr/local/lib:

mkdir /usr/local/include/tf
cp -r bazel-genfiles/ /usr/local/include/tf/
cp -r tensorflow /usr/local/include/tf/
cp -r third_party /usr/local/include/tf/
cp -r bazel-bin/libtensorflow_cc.so /usr/local/lib/

Lastly, compile using an example:

g++ -std=c++11 -o tf_example \
-I/usr/local/include/tf \
-I/usr/local/include/eigen3 \
-g -Wall -D_DEBUG -Wshadow -Wno-sign-compare -w  \
-L/usr/local/lib \
`pkg-config --cflags --libs protobuf` -ltensorflow_cc tf_example.cpp
Monster
  • 488
  • 3
  • 12
lababidi
  • 2,654
  • 1
  • 22
  • 14
  • I believe it is not necessary to install protobuf and eigen. The bazel workspace configuration includes rules to download and build those components. – 4dan Dec 13 '17 at 01:01
  • 2
    finally, the crazy OFFICIAL build guide at https://www.tensorflow.org/install/source is for building pip module, tks for the build option "tensorflow:libtensorflow_cc.so", it's not even documented on tensorflow.org – Dee Mar 26 '19 at 09:38
  • @lababidi what c++ dependencies should be before the 'bazel build' command? i'm facing issue that the build fails after an hour, this is hard to test build again and again – Dee Mar 27 '19 at 02:19
16

One alternative to using the Tensorflow C++ API that I found is cppflow.

It's a lightweight C++ wrapper around the Tensorflow C API. You get very small executables, and it links against the already-compiled libtensorflow.so file. There are also usage examples, and you use CMake instead of Bazel.

Bersan
  • 1,032
  • 1
  • 17
  • 28
  • 1
    This alternative doesn't include all the features of TensorFlow C++ API. For example, using C API one can't reduce the number of threads generated by TensorFlow to 1. You can find more details on the problem using https://stackoverflow.com/questions/60206113/how-to-stop-tensorflow-from-multi-threading and https://stackoverflow.com/questions/45063535/change-number-of-threads-for-tensorflow-inference-with-c-api and – fisakhan Aug 22 '20 at 20:03
14

If you wish to avoid both building your projects with Bazel and generating a large binary, I have assembled a repository showing how to use the TensorFlow C++ library with CMake. You can find it here. The general ideas are as follows:

  • Clone the TensorFlow repository.
  • Add a build rule to tensorflow/BUILD (the provided ones do not include all of the C++ functionality).
  • Build the TensorFlow shared library.
  • Install specific versions of Eigen and Protobuf, or add them as external dependencies.
  • Configure your CMake project to use the TensorFlow library.
cjweeks
  • 267
  • 1
  • 4
  • 8
10

If you don't mind using CMake, there is also the tensorflow_cc project, which builds and installs the TF C++ API for you, along with convenient CMake targets you can link against. The project README contains an example and Dockerfiles you can easily follow.

Floop
  • 451
  • 4
  • 10
  • It is successfully working on ubuntu but it has some issues in CentOS. The problem is with the downloading of tensorflow zip files using curl/wget during building. – fisakhan Sep 10 '20 at 19:54
8

You can use this shell script to install (most of) its dependencies, clone, build, compile and get all the necessary files into the ../src/includes folder:

https://github.com/node-tensorflow/node-tensorflow/blob/master/tools/install.sh

Ivan Seidel
  • 2,394
  • 5
  • 32
  • 49
7

If you don't want to build Tensorflow yourself and your operating system is Debian or Ubuntu, you can download prebuilt packages with the Tensorflow C/C++ libraries. This distribution can be used for C/C++ inference with CPU; GPU support is not included:

https://github.com/kecsap/tensorflow_cpp_packaging/releases

There are instructions on how to freeze a checkpoint in Tensorflow (TFLearn) and load this model for inference with the C/C++ API:

https://github.com/kecsap/tensorflow_cpp_packaging/blob/master/README.md

Beware: I am the developer of this Github project.

kecsap
  • 350
  • 3
  • 8
5

I use a hack/workaround to avoid having to build the whole TF library myself, which saves time (it's set up in 3 minutes), disk space, dev dependencies, and the size of the resulting binary. It's officially unsupported, but it works well if you just want to quickly jump in.

Install TF through pip (pip install tensorflow or pip install tensorflow-gpu). Then find its library _pywrap_tensorflow.so (TF 0.* - 1.0) or _pywrap_tensorflow_internal.so (TF 1.1+). In my case (Ubuntu) it's located at /usr/local/lib/python2.7/dist-packages/tensorflow/python/_pywrap_tensorflow.so. Then create a symlink to this library called lib_pywrap_tensorflow.so somewhere where your build system finds it (e.g. /usr/lib/local). The prefix lib is important! You can also give it another lib*.so name - if you call it libtensorflow.so, you may get better compatibility with other programs written to work with TF.

Then create a C++ project as you are used to (CMake, Make, Bazel, whatever you like).

And then you're ready to link against this library to have TF available for your projects (you also have to link against the python2.7 libraries)! In CMake, for example, you just add target_link_libraries(target _pywrap_tensorflow python2.7).

The C++ header files are located around this library, e.g. in /usr/local/lib/python2.7/dist-packages/tensorflow/include/.

Once again: this way is officially unsupported and you may run into various issues. The library seems to be statically linked against e.g. protobuf, so you may run into odd link-time or run-time issues. But I am able to load a stored graph, restore the weights and run inference, which is IMO the most wanted functionality in C++.

Martin Pecka
  • 2,953
  • 1
  • 31
  • 40
  • I couldn't get this to work. I got a bunch of link time errors about undefined references to python stuff like: `undefined reference to 'PyType_IsSubtype'` – 0xcaff Jun 01 '17 at 23:01
  • Oh, thanks for pointing it out... You must also link against the `python2.7` library... I'll edit the post accordingly. – Martin Pecka Jun 02 '17 at 00:10
  • @MartinPecka I tried this on Raspbian Buster with the armv7l (Raspberry PI 2). The latest Python 2.7 and 3.7 wheels available are for 1.14.0, but I'm targeting 2.0.0. Thanks anyway, I upvoted your hack. – Daisuke Aramaki Dec 12 '19 at 21:37
5

The answers above are good enough to show how to build the library, but collecting the headers is still tricky. Here I share the little script I use to copy the necessary headers.

SOURCE is the first parameter: the TensorFlow source (build) directory.
DST is the second parameter: the include directory that holds the collected headers (e.g. in CMake, include_directories(./collected_headers_here)).

#!/bin/bash

SOURCE=$1
DST=$2
echo "-- target dir is $DST"
echo "-- source dir is $SOURCE"

if [[ -e $DST ]];then
    echo "clean $DST"
    rm -rf $DST
    mkdir $DST
fi


# 1. copy the source code c++ api needs
mkdir -p $DST/tensorflow
cp -r $SOURCE/tensorflow/core $DST/tensorflow
cp -r $SOURCE/tensorflow/cc $DST/tensorflow
cp -r $SOURCE/tensorflow/c $DST/tensorflow

# 2. copy the generated code, put them back to
# the right directories along side the source code
if [[ -e $SOURCE/bazel-genfiles/tensorflow ]];then
    prefix="$SOURCE/bazel-genfiles/tensorflow"
    from=$(expr $(echo -n $prefix | wc -m) + 1)

    # eg. compiled protobuf files
    find $SOURCE/bazel-genfiles/tensorflow -type f | while read line;do
        #echo "procese file --> $line"
        line_len=$(echo -n $line | wc -m)
        filename=$(echo $line | rev | cut -d'/' -f1 | rev )
        filename_len=$(echo -n $filename | wc -m)
        to=$(expr $line_len - $filename_len)

        target_dir=$(echo $line | cut -c$from-$to)
        #echo "[$filename] copy $line $DST/tensorflow/$target_dir"
        cp $line $DST/tensorflow/$target_dir
    done
fi


# 3. copy third party files. Why?
# In the tf source code, you can see #include "third_party/...", so you need it
cp -r $SOURCE/third_party $DST

# 4. these headers are enough for me now.
# if your compiler complains missing headers, maybe you can find it in bazel-tensorflow/external
cp -RLf $SOURCE/bazel-tensorflow/external/eigen_archive/Eigen $DST
cp -RLf $SOURCE/bazel-tensorflow/external/eigen_archive/unsupported $DST
cp -RLf $SOURCE/bazel-tensorflow/external/protobuf_archive/src/google $DST
cp -RLf $SOURCE/bazel-tensorflow/external/com_google_absl/absl $DST
hakunami
  • 2,351
  • 4
  • 31
  • 50
  • 1
    this was really helpful snippet, there was an issue while creating a directory, so I had to add this `mkdir -p $DST/tensorflow$target_dir` before `cp $line $DST/tensorflow/$target_dir` – user969068 Nov 16 '19 at 12:23
  • @hakunami [I made a gist out of this script](https://gist.github.com/dtsmith2001/18bf4d2d9b7f832d53fe4a060772ed0f). Let me know what you think. If you want to make your own gist, I'll remove mine and clone yours. – Daisuke Aramaki Dec 12 '19 at 20:59
  • 1
    I quit after building tensorflow from source multiple times. Every time the problem was missing header files. Them https://github.com/FloopCZ/tensorflow_cc solved my problem. – fisakhan Sep 10 '20 at 19:59
2

Tensorflow itself only provides very basic examples of the C++ API.
Here is a good resource that includes examples of datasets, RNNs, LSTMs, CNNs and more:
tensorflow c++ examples

1

We now provide a pre-built library and a Docker image for easy installation and usage of the TensorFlow C++ API at https://github.com/ika-rwth-aachen/libtensorflow_cc

  1. We provide the pre-built libtensorflow_cc.so including accompanying headers as a one-command-install deb-package.
  2. We provide a pre-built Docker image based on the official TensorFlow Docker image. Our Docker image has both TensorFlow Python and TensorFlow C++ installed.

Try it out yourself by running the example application:

git clone https://github.com/ika-rwth-aachen/libtensorflow_cc.git && \
cd libtensorflow_cc && \
docker run --rm \
    --volume $(pwd)/example:/example \
    --workdir /example \
    rwthika/tensorflow-cc:latest \
        ./build-and-run.sh

While we currently only support x86_64 machines running Ubuntu, this could easily be extended to other OSes and platforms in the future. With a few exceptions, all TensorFlow versions from 2.0.0 through 2.9.2 are available, with 2.10.0 coming soon.

If you want to use the TensorFlow C++ API to load, inspect, and run saved models and frozen graphs in C++, we suggest that you also check out our helper library tensorflow_cpp.

lerei
  • 103
  • 7
  • rpm / rhel 'd be nice - as in lib and hdrs. (lot of friction to use C++ version vs others) – Bob Jun 03 '23 at 10:38