Loading and Running a TensorFlow Model in C++

Source: Internet | Editor: 程序博客网 | Date: 2024/06/05 20:06
1. TensorFlow provides a C++ API for building a graph, but it is far less complete than the Python API; many features are simply unavailable in C++. An alternative approach is to use the C++ API to load a pre-trained graph and run it standalone or embedded in another application. Documentation on loading a graph from C++ is sparse, so this post presents a simple example.
References:
https://medium.com/jim-fleming/loading-a-tensorflow-graph-with-the-c-api-4caaff88463f
https://vimsky.com/article/3600.html
2. Prerequisites
Install Bazel (Google's build tool, used here to compile the project).
Download and unpack the TensorFlow source:
git clone --recursive https://github.com/tensorflow/tensorflow
The protobuf tools, used to generate the .pb file.
3. Write Python code to train the model and export a .pb file
The offline Python model and its training follow http://blog.csdn.net/rockingdingo/article/details/75452711
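The linked post can be distilled into a short export script. The following is a minimal sketch only, assuming TensorFlow's v1 graph API (`tf.compat.v1`) and a toy linear-softmax model; the node names `inputs/X_input` and `outputs/Softmax` are chosen to match the loader shown later, and the shapes (9 features, 2 classes) are placeholders for your own model.

```python
# Sketch of step 3: build a toy graph, freeze its variables into
# constants, and serialize the GraphDef as model.pb.
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # v1 graph mode

g = tf1.Graph()
with g.as_default():
    with tf1.name_scope("inputs"):
        # Node name becomes "inputs/X_input"
        x = tf1.placeholder(tf.float32, shape=[None, 9], name="X_input")
    w = tf1.Variable(tf1.zeros([9, 2]), name="weights")
    b = tf1.Variable(tf1.zeros([2]), name="bias")
    logits = tf1.matmul(x, w) + b
    with tf1.name_scope("outputs"):
        # Node name becomes "outputs/Softmax"
        y = tf1.nn.softmax(logits, name="Softmax")

    with tf1.Session() as sess:
        sess.run(tf1.global_variables_initializer())
        # Bake the current variable values into constants so the C++
        # side only needs the GraphDef, then write it out as model.pb.
        frozen = tf1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ["outputs/Softmax"])
        with open("model.pb", "wb") as f:
            f.write(frozen.SerializeToString())
```

In a real project the training loop from the linked post would run before freezing; the freezing and serialization steps are the part the C++ loader depends on.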
4. Directory layout
tensorflow/tensorflow/|project name|/
tensorflow/tensorflow/|project name|/|project name|.cc (e.g. https://gist.github.com/jimfleming/4202e529042c401b17b7)
tensorflow/tensorflow/|project name|/BUILD
The .cc file follows the form of the linked example.
The BUILD file takes the following form:
cc_binary(
    name = "<project name>",
    srcs = ["<project name>.cc"],
    deps = [
        "//tensorflow/core:tensorflow",
    ],
)
5. Build and run
1) In the tensorflow/ root directory:
./configure
This configures the TensorFlow build options (needed only the first time; it does not have to be repeated for later builds).
2) In the tensorflow/tensorflow/|project name|/ directory:
bazel build :<project name>
(i.e. the name given by the name field in the BUILD file)
3) In the tensorflow/bazel-bin/tensorflow/|project name|/ directory:
Copy the generated .pb file there, then run the binary:
./<project name>

A concrete example:
tensorflow/tensorflow/my_loader/
tensorflow/tensorflow/my_loader/loader.cc
tensorflow/tensorflow/my_loader/BUILD

BUILD:
cc_binary(
    name = "loader",
    srcs = ["loader.cc"],
    deps = [
        "//tensorflow/core:tensorflow",
    ],
)

loader.cc:
#include <iostream>
#include <utility>
#include <vector>

#include "tensorflow/core/public/session.h"
#include "tensorflow/core/platform/env.h"

using namespace tensorflow;

int main(int argc, char* argv[]) {
  // Initialize a TensorFlow session.
  Session* session;
  Status status = NewSession(SessionOptions(), &session);
  if (!status.ok()) {
    std::cout << status.ToString() << "\n";
    return 1;
  }

  // Read in the protobuf graph we exported.
  GraphDef graph_def;
  status = ReadBinaryProto(Env::Default(), "model.pb", &graph_def);
  if (!status.ok()) {
    std::cout << status.ToString() << "\n";
    return 1;
  }

  // Add the graph to the session.
  status = session->Create(graph_def);
  if (!status.ok()) {
    std::cout << status.ToString() << "\n";
    return 1;
  }

  // Set up the input: a 1x9 float tensor fed to the graph node
  // "inputs/X_input".
  Tensor tensor_in_a(DT_FLOAT, TensorShape({1, 9}));
  tensor_in_a.matrix<float>().setValues(
      {{0.f, 1.f, 181.f, 5450.f, 0.f, 8.f, 8.f, 0.f, 1.f}});
  std::vector<std::pair<string, tensorflow::Tensor>> inputs = {
      {"inputs/X_input", tensor_in_a},
  };

  // The session fills in the outputs; no need to pre-allocate them.
  std::vector<tensorflow::Tensor> outputs;

  // Run the session, evaluating the "outputs/Softmax" operation from the graph.
  status = session->Run(inputs, {"outputs/Softmax"}, {}, &outputs);
  if (!status.ok()) {
    std::cout << status.ToString() << "\n";
    return 1;
  }

  // Print the results (a 1x2 softmax output).
  std::cout << outputs[0].DebugString() << std::endl;
  std::cout << outputs[0].matrix<float>()(0, 0) << ' '
            << outputs[0].matrix<float>()(0, 1) << std::endl;

  // Free any resources used by the session.
  session->Close();
  return 0;
}
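The feed and fetch names hard-coded in loader.cc ("inputs/X_input", "outputs/Softmax") must match the node names in the exported graph exactly, or session->Run will fail with a "not found" status. One way to verify them is to list the operations stored in the .pb file from Python. This is a small helper sketch; the `list_nodes` function name is my own, and it only assumes a frozen GraphDef file such as the model.pb used above:

```python
# List the (op type, node name) pairs in a serialized GraphDef so the
# feed/fetch names used by the C++ loader can be double-checked.
from tensorflow.core.framework import graph_pb2

def list_nodes(pb_path):
    graph_def = graph_pb2.GraphDef()
    with open(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    return [(node.op, node.name) for node in graph_def.node]
```

For example, `for op, name in list_nodes("model.pb"): print(op, name)` prints every operation in the graph; the placeholder you feed and the op you fetch should both appear in the listing.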

The classes and methods used in the .cc code are documented in the official API reference and in the TensorFlow source headers.
See also http://blog.csdn.net/badmushroom/article/details/78720582