How the neural network sim function works internally, and how to avoid its pitfalls


https://cn.mathworks.com/matlabcentral/answers/14590-neural-network-sim-net-input-gives-crazy-results

Surprising as it may be, the output of the network is correct. So why is there such a huge difference from your (otherwise correct) hand computation of the network's math?

Well, the network normalizes the input data before passing it through the layers and then transforms the output back. It does this with the function mapminmax. You can find the corresponding processing functions in your network object, and if you configure the network not to use them, you will obtain the same results as your hand computation:

net.inputs{1}.processFcns
net.outputs{2}.processFcns
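
As a minimal sketch of what "making the network not use them" could look like in practice (net2 is an illustrative copy introduced here, not something from the original thread, so treat this as an assumption rather than the answer's own code):

% Sketch: work on a copy so the trained network keeps its settings,
% then clear the input and output processing functions.
net2 = net;
net2.inputs{1}.processFcns  = {};
net2.outputs{2}.processFcns = {};
% sim(net2, imp) should now match the raw matrix math shown below.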

However, I don't recommend doing this. It is a good idea to normalize your data before presenting it to the network; otherwise the weights can grow too large during training.
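
If you do handle normalization yourself instead of relying on the network's built-in processing, a minimal sketch could look like the following (x, t, and xnew are illustrative variable names for inputs, targets, and new data with one column per sample; they are not from the original thread):

% Sketch: manual normalization with mapminmax (one column = one sample).
[xn, xs] = mapminmax(x);                 % scale inputs to [-1, 1], keep settings xs
[tn, ts] = mapminmax(t);                 % scale targets the same way
% ... train the network on xn and tn ...
xnew_n = mapminmax('apply', xnew, xs);   % reuse the same settings for new data
y      = mapminmax('reverse', sim(net, xnew_n), ts);   % map outputs back to original units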

In order to follow the math of the network yourself, you can do the following (here B1 and B2 denote the layer biases, net.b{1} and net.b{2}):

imp2 = mapminmax('apply', imp, net.inputs{1}.processSettings{3});          % normalize the input the same way the network does
OutLayer1 = tansig(net.IW{1}*imp2 + B1);                                   % hidden layer: tansig(IW*x + b1)
OutLayer2 = purelin(net.LW{2}*OutLayer1 + B2);                             % output layer: purelin(LW*a1 + b2)
y2 = mapminmax('reverse', OutLayer2, net.outputs{2}.processSettings{2});   % undo the output normalization

Now your plot of y1 and y2 should be the same.
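
To check this numerically rather than by eye, a quick sketch (assuming y1 is the output of sim on the original input, as in the original question):

% Sketch: compare the network's own output with the manual forward pass.
y1 = sim(net, imp);          % the network applies mapminmax internally
max(abs(y1(:) - y2(:)))      % should be on the order of floating-point round-off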

