Build Tensorflow 1.14 C++ DLL for Windows

Btw if you enjoy my tutorial, I always appreciate endorsements on my LinkedIn: https://www.linkedin.com/in/ashleytharp/
Questions can go to ashley.tharp@gmail.com. I currently check my email and attempt to assist mostly on weekends.

Relevant Github Issue Here: https://github.com/tensorflow/tensorflow/issues/23542 with my contributions. If you are unable to use my method, there is another popular method explained in that Github issue.

Building the Tensorflow Source code on Windows in C++ with GPU support

Python version: 3.6.8
Tensorflow Version: 1.14.0
OS: Windows 10
Cuda Version: 10.2

Step 0: Check your hardware

Before we even start, we need to make sure you are running on the correct hardware: Tensorflow GPU builds require a CUDA-capable NVidia graphics card. See the screencap below from the Tensorflow GPU documentation for the exact requirements:

[Screencap: hardware requirements from the Tensorflow GPU documentation]

Step 1: Install NVidia Graphics Card Driver

You need to install the NVidia graphics driver for your card. The page for NVidia Graphics card drivers is https://www.nvidia.com/Download/index.aspx
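Once the driver is installed you can sanity-check it from a cmd prompt; nvidia-smi ships with the driver and should print the driver version and the GPU it sees:

nvidia-smi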

Step 2: Install Cuda for Windows

The documentation is here: https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/
Go here: https://developer.nvidia.com/cuda-downloads and download the NVidia Cuda Toolkit.
You will also need cuDNN (the configure step later in this guide finds cuDNN 7): download it from https://developer.nvidia.com/cudnn and copy its bin, include, and lib files into the matching folders of your CUDA installation.
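You can confirm the toolkit is installed and on your PATH (the installer normally adds it for you) with:

nvcc --version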

Step 3: Install Python for Windows

Download page: https://www.python.org/downloads/windows/
Make sure you set your path variables. If you cannot run python --version from a cmd terminal, your environment variables are not set correctly yet.

Then install the Python packages the Tensorflow build needs:

pip3 install six numpy wheel
pip3 install keras_applications==1.0.6 --no-deps
pip3 install keras_preprocessing==1.0.5 --no-deps

Step 4: Install Bazel

The documentation is on this page: https://docs.bazel.build/versions/master/install-windows.html
Follow all the steps on that page and install all the prerequisites. Note that this build was done with Bazel 0.24.1 (you can see the version in the configure output in Step 9); a much newer Bazel may not build Tensorflow 1.14.

If you then see this when you try to run bazel from cmd:

'bazel' is not recognized as an internal or external command,
operable program or batch file.

the Bazel binary is not on your PATH yet. Add it, for example:

set PATH=%PATH%;<path to the Bazel binary>
set PATH=%PATH%;C:\Program Files\Bazel

Step 5: Configure Bazel to Build C++ on Windows

Follow this documentation: https://docs.bazel.build/versions/master/windows.html#build-c-with-msvc
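If Bazel picks up the wrong Visual C++ installation (or none at all), that page describes the BAZEL_VC environment variable for pointing Bazel at the VC directory explicitly. For example (this path assumes a default Visual Studio 2017 Build Tools install; adjust it to match your machine):

set BAZEL_VC=C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC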

Step 6: Install MSYS

I get this really helpful error message on first run:

$ bazel help
ERROR: bash.exe not found on PATH
Bazel on Windows requires MSYS2 Bash, but we could not find it.
If you do not have it installed, you can install MSYS2 from
http://repo.msys2.org/distrib/msys2-x86_64-latest.exe
If you already have it installed but Bazel cannot find it,
set BAZEL_SH environment variable to its location:
set BAZEL_SH=c:\path\to\msys2\usr\bin\bash.exe
[bazel ERROR src/main/cpp/blaze_util_windows.cc:1463] bash.exe not found on PATH
[bazel INFO src/main/cpp/blaze_util_windows.cc:1478] BAZEL_SH detection took 1 msec, found
Once MSYS2 is installed (and BAZEL_SH is set, if Bazel still cannot find bash.exe on its own), bazel help works:

bazel help
Extracting Bazel installation...
WARNING: --batch mode is deprecated. Please instead explicitly shut down your Bazel server using the command "bazel shutdown".
[bazel release 0.24.1]
Usage: bazel <command> <options> ...
Available commands:
...
...
Finally, from the MSYS2 shell, install the packages the Tensorflow build scripts need:

pacman -S git patch unzip

Step 7: Install Visual Studio Build Tools 2017

From the Visual Studio downloads page:

  1. Select Redistributables and Build Tools.
  2. Download and install Microsoft Build Tools 2019.

Step 8: Clone the Tensorflow source code

Original Documentation: https://www.tensorflow.org/install/source_windows#download_the_tensorflow_source_code

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout r1.14

Step 9: Configure the Build using configure.py

Original Documentation: https://www.tensorflow.org/install/source_windows
Run python ./configure.py at the root of your source tree:

$ python configure.py
WARNING: --batch mode is deprecated. Please instead explicitly shut down your Bazel server using the command "bazel shutdown".
You have bazel 0.24.1 installed.
Please specify the location of python. [Default is C:\Users\Username\AppData\Local\Programs\Python\Python36\python.exe]:
Found possible Python library paths:
C:\Users\Username\AppData\Local\Programs\Python\Python36\lib\site-packages
Please input the desired Python library path to use. Default is [C:\Users\Username\AppData\Local\Programs\Python\Python36\lib\site-packages]
Do you wish to build TensorFlow with XLA JIT support? [y/N]:
No XLA JIT support will be enabled for TensorFlow.
Do you wish to build TensorFlow with ROCm support? [y/N]:
No ROCm support will be enabled for TensorFlow.
Do you wish to build TensorFlow with CUDA support? [y/N]: Y
CUDA support will be enabled for TensorFlow.
Found CUDA 10.0 in:
C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.0/lib/x64
C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.0/include
Found cuDNN 7 in:
C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.0/lib
C:/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v10.0/include

Step 10: Build the dll

Note: don't build the pip package target; that is what the official docs describe, but it will not give you a C++ DLL:

bazel build //tensorflow/tools/pip_package:build_pip_package

Instead, build the DLL itself:

bazel build --config=cuda tensorflow:tensorflow.dll

If the build dies while downloading external dependencies, you may see something like the error below; this kind of fetch failure is often transient (a failed download) and re-running the same build command usually gets past it:

ERROR: An error occurred during the fetch of repository 'eigen_archive':

Step 11: Build the .lib

bazel build --config=cuda tensorflow:tensorflow.lib
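The .lib built here is the import library you link your program against; the tensorflow.dll from Step 10 still has to be available at runtime (next to your .exe or on your PATH). Assuming the default Bazel output layout, both artifacts end up under bazel-bin\tensorflow, which you can check with:

dir bazel-bin\tensorflow\tensorflow.dll
dir bazel-bin\tensorflow\tensorflow.lib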

Step 12: Link your .lib into your Windows project for testing

You may do this in Visual Studio or Qt Creator, for example. Add the path to your .lib file to the project's linker settings, add the Tensorflow include directories, and check that a small C++ program calling a Tensorflow function compiles and links.
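With MSVC you can also pull the import library in from source rather than from the project settings. This is just a minimal sketch, assuming tensorflow.lib from Step 11 is somewhere on the linker's library search path and the Tensorflow include directories (see Steps 13 and 14) are already set up:

#include <memory>

#include "tensorflow/core/public/session.h"

// Assumption: tensorflow.lib from Step 11 is on the linker's library search path.
#pragma comment(lib, "tensorflow.lib")

int main() {
  // If this compiles, links, and runs, the .lib/.dll pair is usable.
  std::unique_ptr<tensorflow::Session> session(
      tensorflow::NewSession(tensorflow::SessionOptions()));
  return session != nullptr ? 0 : 1;
}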

Step 13: Add Header source if necessary

On my build I had trouble with the headers for these dependencies, so I downloaded their source code separately and added them to my project's include paths.
Clone these libraries from GitHub or download the source from these three places (a sketch of the resulting compiler include paths follows the list):

https://github.com/protocolbuffers/protobuf/releases/tag/v3.7.0 
https://github.com/abseil/abseil-cpp
http://eigen.tuxfamily.org/index.php?title=Main_Page
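For reference, this is roughly the include-path setup the test program in Step 14 ends up needing when built straight from the command line. All of the C:\src\... paths are hypothetical placeholders for wherever you cloned or unpacked things, and bazel-genfiles is where Bazel writes the generated protobuf headers:

cl /std:c++14 /EHsc test.cpp tensorflow.lib ^
  /I C:\src\tensorflow ^
  /I C:\src\tensorflow\bazel-genfiles ^
  /I C:\src\protobuf\src ^
  /I C:\src\eigen ^
  /I C:\src\abseil-cpp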

Step 14: Identify Missing Symbols:

You have built your .lib and .dll files. Now you should make a small C++ project to test them. Probably one of the first things you will want to do is some very basic Tensorflow setup code like this:

#include "tensorflow/cc/ops/standard_ops.h"
#include <tensorflow/core/framework/graph.pb.h>
#include "tensorflow/core/graph/default_device.h"
#include "tensorflow/core/graph/graph_def_builder.h"
#include "tensorflow/core/lib/core/threadpool.h"
#include "tensorflow/core/lib/strings/stringprintf.h"
#include "tensorflow/core/platform/init_main.h"
#include "tensorflow/core/platform/logging.h"
#include "tensorflow/core/platform/types.h"
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/protobuf/meta_graph.pb.h"
#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/framework/tensor.h"
using namespace tensorflow;int main(int argc, char *argv[]) {
// Create a Session running TensorFlow locally in process.
std::unique_ptr<tensorflow::Session> session(tensorflow::NewSession({}));
return 0;
}
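If that links, a slightly fuller variant exercises the other symbols discussed below: the SessionOptions constructor and the two-argument NewSession overload, whose Status return value lets you confirm the session was actually created. This is just a sketch along the same lines as the snippet above:

#include <iostream>
#include <memory>

#include "tensorflow/core/public/session.h"

int main(int argc, char *argv[]) {
  tensorflow::SessionOptions options;   // defaults: local, in-process runtime
  tensorflow::Session* raw_session = nullptr;

  // This overload returns a Status that reports why creation failed, if it does.
  tensorflow::Status status = tensorflow::NewSession(options, &raw_session);
  if (!status.ok()) {
    std::cerr << "NewSession failed: " << status.ToString() << std::endl;
    return 1;
  }

  std::unique_ptr<tensorflow::Session> session(raw_session);
  std::cout << "Session created OK" << std::endl;
  return 0;
}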
When you try to compile and link this test program, you will most likely hit unresolved external symbol errors for Tensorflow functions such as NewSession and the SessionOptions constructor, because those symbols are not exported from the DLL by default. The excerpts below show how that was resolved here: TF_EXPORT was added to the relevant declarations in the Tensorflow headers, and the .dll and .lib were then rebuilt. First, from tensorflow/core/public/session.h (abbreviated; the TF_EXPORT lines are the additions):
/// A Session allows concurrent calls to Run(), though a Session must
/// be created / extended by a single thread.
///
/// Only one thread must call Close(), and Close() must only be called
/// after all other calls to Run() have returned.
class Session {
 public:
  TF_EXPORT Session();
  virtual ~Session();

  /// \brief Create the graph to be used for the session.
  ///
  /// Returns an error if this session has already been created with a
  /// graph. To re-use the session with a different graph, the caller
  /// must Close() the session first.
  virtual Status Create(const GraphDef& graph) = 0;
  ...
};

/// Later in the same file, outside the class:
TF_EXPORT Status NewSession(const SessionOptions& options, Session** out_session);
TF_EXPORT Session* NewSession(const SessionOptions& options);
And this excerpt is from tensorflow/core/public/session_options.h, again with TF_EXPORT added:

/* Copyright 2015 The TensorFlow Authors. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
==============================================================================*/
#ifndef TENSORFLOW_PUBLIC_SESSION_OPTIONS_H_
#define TENSORFLOW_PUBLIC_SESSION_OPTIONS_H_

#include <string>

#include "tensorflow/core/platform/types.h"
#include "tensorflow/core/protobuf/config.pb.h"
#include "tensorflow/core/platform/macros.h"

namespace tensorflow {

class Env;

/// Configuration information for a Session.
TF_EXPORT struct SessionOptions {
  /// The environment to use.
  Env* env;

  /// \brief The TensorFlow runtime to connect to.
  ///
  /// If 'target' is empty or unspecified, the local TensorFlow runtime
  /// implementation will be used. Otherwise, the TensorFlow engine
  /// defined by 'target' will be used to perform all computations.
  ///
  /// "target" can be either a single entry or a comma separated list
  /// of entries. Each entry is a resolvable address of the
  /// following format:
  ///   local
  ///   ip:port
  ///   host:port
  ///   ... other system-specific formats to identify tasks and jobs ...
  ///
  /// NOTE: at the moment 'local' maps to an in-process service-based
  /// runtime.
  ///
  /// Upon creation, a single session affines itself to one of the
  /// remote processes, with possible load balancing choices when the
  /// "target" resolves to a list of possible processes.
  ///
  /// If the session disconnects from the remote process during its
  /// lifetime, session calls may fail immediately.
  string target;

  /// Configuration options.
  ConfigProto config;

  TF_EXPORT SessionOptions();
};

}  // end namespace tensorflow

#endif  // TENSORFLOW_PUBLIC_SESSION_OPTIONS_H_
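After rebuilding, a quick way to check whether a given symbol actually made it into the DLL is dumpbin (run from a Visual Studio developer command prompt; the path below assumes the default bazel-bin output location):

dumpbin /exports bazel-bin\tensorflow\tensorflow.dll | findstr NewSession

If the symbol you need still does not show up in the output, it is not exported, and the linker will keep reporting it as unresolved.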

On collaborating with me to get your Windows Tensorflow build working, or suggesting edits to this document

You can email me at ashley.tharp@gmail.com. I try to check my email and clear my inbox every day. I will have the most time to work on collaborations on weekends, as this is a side project for me.

XR Developer at HookBang