Learning the python-caffe interface (Solving in Python with LeNet)
Source: Internet · Editor: 程序博客网 · 2024/06/13 14:23
Official reference notebook:
http://nbviewer.jupyter.org/github/BVLC/caffe/blob/master/examples/01-learning-lenet.ipynb
```python
# Inspect the feature-map shapes: solver.net.blobs is an OrderedDict
# mapping layer name -> Blob, and blob.data holds the feature maps.
[(k, v.data.shape) for k, v in solver.net.blobs.items()]
# [('data', (64, 1, 28, 28)), ('label', (64,)), ('conv1', (64, 20, 24, 24)),
#  ('pool1', (64, 20, 12, 12)), ('conv2', (64, 50, 8, 8)), ('pool2', (64, 50, 4, 4)),
#  ('fc1', (64, 500)), ('score', (64, 10)), ('loss', ())]

# Likewise, solver.net.params is an OrderedDict; params[name][0] holds
# the weights and params[name][1] the biases.
[(k, v[0].data.shape) for k, v in solver.net.params.items()]
# [('conv1', (20, 1, 5, 5)), ('conv2', (50, 20, 5, 5)), ('fc1', (500, 800)), ('score', (10, 500))]
```
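The blob shapes above follow directly from convolution/pooling arithmetic. A minimal sketch (pure Python, no Caffe required; the helper names are my own, assuming LeNet's 5x5 "valid" convolutions with stride 1 and 2x2 max pooling with stride 2):

```python
# Shape arithmetic behind the blob shapes listed above.
def conv_out(size, kernel=5, stride=1):
    # "valid" convolution: output = (input - kernel) / stride + 1
    return (size - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    return (size - kernel) // stride + 1

h = 28                 # MNIST input height/width
h = conv_out(h)        # conv1: 28 -> 24
h = pool_out(h)        # pool1: 24 -> 12
h = conv_out(h)        # conv2: 12 -> 8
h = pool_out(h)        # pool2: 8 -> 4
print('pool2 spatial size:', h)           # -> 4
print('fc1 input size:', 50 * h * h)      # 50 channels * 4 * 4 = 800,
                                          # matching ('fc1', (500, 800))
```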
Visualizing the filter weights or filter gradients:

```python
solver.step(1)
imshow(solver.net.params['conv1'][0].diff[:, 0]
       .reshape(4, 5, 5, 5)
       .transpose(0, 2, 1, 3)
       .reshape(4*5, 5*5), cmap='gray')
axis('off')
```
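The reshape/transpose chain above tiles the 20 conv1 gradient maps (each 5x5) into a single 4x5 grid image. A standalone NumPy sketch of the trick, with synthetic data standing in for `solver.net.params['conv1'][0].diff[:, 0]`:

```python
import numpy as np

# 20 filters of 5x5, with distinct values so we can verify tile placement.
filters = np.arange(20 * 5 * 5).reshape(20, 5, 5)

# (20, 5, 5) -> (4, 5, 5, 5): 4 row-blocks of 5 filters each.
# transpose(0, 2, 1, 3) reorders axes to (row-block, pixel-row, col-block, pixel-col),
# so the final reshape yields a (4*5, 5*5) = (20, 25) image: a 4x5 grid of 5x5 tiles.
grid = filters.reshape(4, 5, 5, 5).transpose(0, 2, 1, 3).reshape(4 * 5, 5 * 5)

# Tile at row-block r, column-block c is filter r*5 + c:
assert np.array_equal(grid[0:5, 0:5], filters[0])      # top-left tile = filter 0
assert np.array_equal(grid[5:10, 10:15], filters[7])   # row 1, col 2 = filter 1*5+2
```

Without the transpose, the plain reshape would interleave pixel rows from different filters instead of keeping each 5x5 tile intact.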
Observations:

1. Visualizing the input data in Python shows that, during training and testing, images are fed into Caffe sequentially rather than as randomly drawn samples. So when preparing your own training set, shuffle the data first, then build the LMDB.
2. How does the test phase share the weights learned during training? Inside the step function, Step() calls TestAll(), and TestAll() --> Test() --> ShareTrainedLayersWith(net_.get()).
3. In solver.step(1), the argument 1 is the number of iterations. The test net is always run on the first iteration, so step(1) triggers both a test pass and a training pass.
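The shuffle recommended in observation 1 can be sketched as follows. Here `images` and `labels` are hypothetical stand-ins for your dataset, and the actual lmdb/Datum writing code is omitted; the point is to apply one permutation to both arrays so image/label pairs stay aligned:

```python
import numpy as np

rng = np.random.RandomState(0)   # fixed seed for reproducibility
# Toy stand-in data: 10 fake 28x28 "images" and their labels.
images = (np.arange(10 * 28 * 28, dtype=np.uint8) % 255).reshape(10, 28, 28)
labels = np.arange(10)

perm = rng.permutation(len(labels))           # one shared permutation...
images, labels = images[perm], labels[perm]   # ...keeps pairs aligned

print(labels)  # order is now random, but each image still matches its label
# ...then write the shuffled (image, label) pairs into the LMDB as usual.
```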
```cpp
template <typename Dtype>
void Solver<Dtype>::Step(int iters) {
  const int start_iter = iter_;
  const int stop_iter = iter_ + iters;
  int average_loss = this->param_.average_loss();
  losses_.clear();
  smoothed_loss_ = 0;

  while (iter_ < stop_iter) {
    // zero-init the params
    net_->ClearParamDiffs();  // allocates the diff of each param blob in Net and zeroes it
    if (param_.test_interval() && iter_ % param_.test_interval() == 0
        && (iter_ > 0 || param_.test_initialization())
        && Caffe::root_solver()) {
      TestAll();  // --> Test() --> ShareTrainedLayersWith(net_.get())
      if (requested_early_exit_) {
        // Break out of the while loop because stop was requested while testing.
        break;
      }
    }

    for (int i = 0; i < callbacks_.size(); ++i) {
      callbacks_[i]->on_start();
    }
    const bool display = param_.display() && iter_ % param_.display() == 0;
    net_->set_debug_info(display && param_.debug_info());
    // accumulate the loss and gradient
    Dtype loss = 0;
    for (int i = 0; i < param_.iter_size(); ++i) {
      loss += net_->ForwardBackward();
    }
    loss /= param_.iter_size();
    // average the loss across iterations for smoothed reporting
    UpdateSmoothedLoss(loss, start_iter, average_loss);
    if (display) {
      // prints "Iteration N, loss = ..."
      LOG_IF(INFO, Caffe::root_solver()) << "Iteration " << iter_
          << ", loss = " << smoothed_loss_;
      const vector<Blob<Dtype>*>& result = net_->output_blobs();
      int score_index = 0;
      for (int j = 0; j < result.size(); ++j) {
        const Dtype* result_vec = result[j]->cpu_data();
        const string& output_name =
            net_->blob_names()[net_->output_blob_indices()[j]];
        const Dtype loss_weight =
            net_->blob_loss_weights()[net_->output_blob_indices()[j]];
        for (int k = 0; k < result[j]->count(); ++k) {
          // prints the "Train net output #..." lines
          ostringstream loss_msg_stream;
          if (loss_weight) {
            loss_msg_stream << " (* " << loss_weight
                            << " = " << loss_weight * result_vec[k] << " loss)";
          }
          LOG_IF(INFO, Caffe::root_solver()) << "    Train net output #"
              << score_index++ << ": " << output_name << " = "
              << result_vec[k] << loss_msg_stream.str();
        }
      }
    }  // if (display)
    for (int i = 0; i < callbacks_.size(); ++i) {
      callbacks_[i]->on_gradients_ready();
    }
    ApplyUpdate();  // prints "Iteration N, lr = ..."

    // Increment the internal iter_ counter -- its value should always indicate
    // the number of times the weights have been updated.
    ++iter_;

    SolverAction::Enum request = GetRequestedAction();
    // Save a snapshot if needed.
    if ((param_.snapshot() && iter_ % param_.snapshot() == 0
         && Caffe::root_solver())
        || (request == SolverAction::SNAPSHOT)) {
      Snapshot();
    }
    if (SolverAction::STOP == request) {
      requested_early_exit_ = true;
      // Break out of training loop.
      break;
    }
  }  // end while
}  // end Step()
```
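The `UpdateSmoothedLoss(loss, start_iter, average_loss)` call above is why the displayed loss is a running mean over the last `average_loss` iterations, kept in a small circular buffer. A Python sketch mirroring that logic (class name and test values are my own):

```python
class SmoothedLoss:
    """Running mean of the last `average_loss` losses, as in Solver::Step."""
    def __init__(self, average_loss):
        self.average_loss = average_loss
        self.losses = []          # circular buffer of recent losses
        self.smoothed = 0.0

    def update(self, loss, step):  # step plays the role of iter_ - start_iter
        if len(self.losses) < self.average_loss:
            # Buffer still filling: incremental mean over all losses so far.
            self.losses.append(loss)
            n = len(self.losses)
            self.smoothed = (self.smoothed * (n - 1) + loss) / n
        else:
            # Buffer full: replace the oldest entry and adjust the mean.
            idx = step % self.average_loss
            self.smoothed += (loss - self.losses[idx]) / self.average_loss
            self.losses[idx] = loss
        return self.smoothed

s = SmoothedLoss(average_loss=3)
for step, loss in enumerate([4.0, 2.0, 3.0, 9.0]):
    print(step, s.update(loss, step))
# After step 2 the smoothed loss is mean(4, 2, 3) = 3.0; step 3 evicts the
# oldest entry (4.0) for 9.0, giving mean(9, 2, 3) = 4.666...
```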
Results analysis