Caffe basics 13: Testing the AlexNet model bvlc_reference_caffenet
1. Preparation
- The Caffe files for the AlexNet model are all available on GitHub: https://github.com/BVLC/caffe/tree/master/models/bvlc_reference_caffenet
- After downloading this pre-trained model, we test how well it classifies.
- The directory /home/username/caffe/examples/images contains a few pictures of cats; we use one of them as the model's input to test classification.
2. Model test code
- For the earlier grayscale-image classification test we used Caffe's bundled classify.py with a few small modifications. This time we create a testModel.py file to run the classification; its code is as follows:
```python
import numpy as np
import sys

caffe_root = '/home/terrence/caffe/'
sys.path.insert(0, caffe_root + 'python')
import caffe  # must come after the sys.path insert so pycaffe is found

caffe.set_device(0)
caffe.set_mode_gpu()

modelDef = '/home/terrence/caffe/models/bvlc_reference_caffenet/deploy.prototxt'
modelWeights = '/home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel'

net = caffe.Net(modelDef,      # model structure
                modelWeights,  # trained weights
                caffe.TEST)

# per-channel (BGR) mean
mu = np.load(caffe_root + 'python/caffe/imagenet/ilsvrc_2012_mean.npy')
mu = mu.mean(1).mean(1)
print 'mean subtracted values:', zip('BGR', mu)

transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))     # h*w*c -> c*h*w
transformer.set_mean('data', mu)
transformer.set_raw_scale('data', 255)           # [0, 1] -> [0, 255]
transformer.set_channel_swap('data', (2, 1, 0))  # RGB -> BGR

net.blobs['data'].reshape(1,         # batch size
                          3,         # 3 channels
                          227, 227)  # image size

image = caffe.io.load_image(caffe_root + 'examples/images/cat.jpg')
transformedImage = transformer.preprocess('data', image)
net.blobs['data'].data[...] = transformedImage

output = net.forward()
outputProb = output['prob'][0]
print 'predicted class is: ', outputProb.argmax()
```
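The Transformer settings above are easy to misread, so here is a pure-NumPy sketch of the preprocessing they perform on an H*W*C RGB image with values in [0, 1] (an assumed equivalent for illustration only — the real caffe.io.Transformer also handles resizing, and applies the steps in its own internal order, which gives the same result here since scaling commutes with transposing and channel swapping):

```python
import numpy as np

def preprocess(image, mu):
    """Mimic the Transformer steps on an H*W*C RGB image in [0, 1].

    mu is the per-channel mean in BGR order, as in the script above.
    """
    data = image * 255.0               # raw_scale: [0, 1] -> [0, 255]
    data = data.transpose(2, 0, 1)     # h*w*c -> c*h*w
    data = data[::-1, :, :]            # channel_swap: RGB -> BGR
    data = data - mu.reshape(3, 1, 1)  # subtract per-channel mean
    return data

# tiny smoke test: a 2x2 image with constant R=0.1, G=0.5, B=0.9 planes
img = np.ones((2, 2, 3)) * np.array([0.1, 0.5, 0.9])
mu = np.array([104.0, 116.7, 122.7])  # approximate BGR means
out = preprocess(img, mu)             # shape (3, 2, 2), channel 0 is B
```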
- Before running the script, check that every file it references actually exists at the given path; the script only runs cleanly when they are all present.
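That check can be automated with a few lines of standard-library Python (a hypothetical helper, not part of testModel.py; adjust the paths to your own install):

```python
import os

def missing_files(paths):
    """Return the subset of paths that do not exist on disk."""
    return [p for p in paths if not os.path.exists(p)]

# the files testModel.py depends on
required = [
    '/home/terrence/caffe/models/bvlc_reference_caffenet/deploy.prototxt',
    '/home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel',
    '/home/terrence/caffe/python/caffe/imagenet/ilsvrc_2012_mean.npy',
    '/home/terrence/caffe/examples/images/cat.jpg',
]
for p in missing_files(required):
    print('missing: ' + p)
```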
3. Running the classification
- Run `python testModel.py`; the terminal prints the following:
```
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0921 15:31:58.679814 3654 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0921 15:31:58.679846 3654 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0921 15:31:58.679864 3654 _caffe.cpp:142] Net('/home/terrence/caffe/models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel')
I0921 15:31:58.681190 3654 net.cpp:51] Initializing net from parameters:
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape { dim: 10 dim: 3 dim: 227 dim: 227 } }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param { num_output: 96 kernel_size: 11 stride: 4 }
}
[... layer definitions for relu1 through fc8 omitted ...]
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I0921 15:31:58.681818 3654 layer_factory.hpp:77] Creating layer data
I0921 15:31:58.681843 3654 net.cpp:84] Creating Layer data
I0921 15:31:58.681849 3654 net.cpp:380] data -> data
I0921 15:31:58.688185 3654 net.cpp:122] Setting up data
I0921 15:31:58.688215 3654 net.cpp:129] Top shape: 10 3 227 227 (1545870)
I0921 15:31:58.688235 3654 net.cpp:137] Memory required for data: 6183480
I0921 15:31:58.688242 3654 layer_factory.hpp:77] Creating layer conv1
I0921 15:31:58.688259 3654 net.cpp:84] Creating Layer conv1
I0921 15:31:58.688266 3654 net.cpp:406] conv1 <- data
I0921 15:31:58.688285 3654 net.cpp:380] conv1 -> conv1
I0921 15:31:58.688968 3654 net.cpp:122] Setting up conv1
I0921 15:31:58.688983 3654 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I0921 15:31:58.688988 3654 net.cpp:137] Memory required for data: 17799480
[... analogous setup messages for relu1 through fc8 omitted ...]
I0921 15:31:58.785930 3654 layer_factory.hpp:77] Creating layer prob
I0921 15:31:58.785957 3654 net.cpp:84] Creating Layer prob
I0921 15:31:58.785964 3654 net.cpp:406] prob <- fc8
I0921 15:31:58.785971 3654 net.cpp:380] prob -> prob
I0921 15:31:58.786042 3654 net.cpp:122] Setting up prob
I0921 15:31:58.786051 3654 net.cpp:129] Top shape: 10 1000 (10000)
I0921 15:31:58.786056 3654 net.cpp:137] Memory required for data: 68681400
I0921 15:31:58.786098 3654 net.cpp:200] prob does not need backward computation.
[... "does not need backward computation" messages for fc8 down to data omitted ...]
I0921 15:31:58.786242 3654 net.cpp:242] This network produces output prob
I0921 15:31:58.786253 3654 net.cpp:255] Network initialization done.
I0921 15:31:59.048005 3654 upgrade_proto.cpp:44] Attempting to upgrade input file specified using deprecated transformation parameters: /home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel
I0921 15:31:59.048066 3654 upgrade_proto.cpp:47] Successfully upgraded file specified using deprecated data transformation parameters.
W0921 15:31:59.048084 3654 upgrade_proto.cpp:49] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0921 15:31:59.048104 3654 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/terrence/caffe_case/bvlc_reference_caffenet.caffemodel
I0921 15:31:59.177397 3654 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0921 15:31:59.226982 3654 net.cpp:744] Ignoring source layer loss
mean subtracted values: [('B', 104.0069879317889), ('G', 116.66876761696767), ('R', 122.6789143406786)]
predicted class is:  281
```
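The "Top shape" and "Memory required for data" numbers in the log are easy to check by hand: each top blob holds batch*channels*height*width float32 values (4 bytes each), and the memory counter accumulates blob sizes layer by layer. Note the batch size here is 10, the default in deploy.prototxt; the net is built before the script reshapes the input to batch 1. For the data layer:

```python
# data blob per deploy.prototxt: batch 10, 3 channels, 227x227
elements = 10 * 3 * 227 * 227
# element count printed in "Top shape: 10 3 227 227 (1545870)"
assert elements == 1545870
# float32 = 4 bytes -> "Memory required for data: 6183480" after the data layer
assert elements * 4 == 6183480
```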
- The log shows the setup for the entire forward pass, followed by the final classification result: class 281, which in the ILSVRC2012 label list corresponds to "tabby, tabby cat" — so the cat picture was classified correctly.
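argmax only reports the single best class. A small helper (hypothetical, not part of testModel.py) extracts the top-5 predictions from the same outputProb vector; each index can then be looked up in Caffe's data/ilsvrc12/synset_words.txt label file (fetched by data/ilsvrc12/get_ilsvrc_aux.sh):

```python
import numpy as np

def top_k(prob, k=5):
    """Return (class_index, probability) pairs for the k highest entries, best first."""
    order = np.argsort(prob)[::-1][:k]
    return [(int(i), float(prob[i])) for i in order]
```

With the real outputProb, the first pair would be (281, ...); the label for each index is the corresponding line of synset_words.txt.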