Sizing a typical neural network: how many hidden neurons are needed?

Source: Internet | Editor: 程序博客网 | Date: 2024/04/29 21:51

There's one additional rule of thumb that helps for supervised learning problems. The upper bound on the number of hidden neurons that won't result in over-fitting is:

    Nh = Ns / (α · (Ni + No))

Nh = number of hidden neurons.
Ni = number of input neurons.
No = number of output neurons.
Ns = number of samples in the training data set.
α = an arbitrary scaling factor, usually 2–10.

Others recommend setting α to a value between 5 and 10, but I find a value of 2 will often work without overfitting. As explained by this excellent NN Design text, you want to limit the number of free parameters in your model (its degree, or number of nonzero weights) to a small portion of the degrees of freedom in your data. The degrees of freedom in your data are the number of samples times the degrees of freedom (dimensions) in each sample, or Ns(Ni + No) (assuming they're all independent). So α is a way to indicate how general you want your model to be, or how much you want to prevent overfitting.
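As a minimal sketch of the rule of thumb above (assuming the upper bound Nh = Ns / (α(Ni + No)); the function name and integer division are my own choices for illustration):

```python
def max_hidden_neurons(n_samples, n_input, n_output, alpha=2):
    """Rule-of-thumb upper bound on hidden neurons that shouldn't
    over-fit: Nh = Ns / (alpha * (Ni + No)).
    Integer division is used since a neuron count must be whole."""
    return n_samples // (alpha * (n_input + n_output))

# Example: 10,000 training samples, 10 inputs, 1 output, alpha = 2
print(max_hidden_neurons(10_000, 10, 1, alpha=2))  # -> 454
```

With the conservative α = 10, the same data set would cap the hidden layer at 90 neurons instead, illustrating how α trades model capacity against over-fitting risk.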

For an automated procedure you'd start with an alpha of 2 (twice as many degrees of freedom in your training data as your model) and work your way up to 10 if the error for training data is significantly smaller than for the cross-validation data set.
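That automated procedure might be sketched like this (a non-authoritative sketch: `train_and_eval` is a hypothetical callback standing in for whatever training/cross-validation routine you use, and the 0.9 stopping threshold is an arbitrary example, not part of the original rule):

```python
def choose_alpha(n_samples, n_input, n_output, train_and_eval,
                 alphas=(2, 3, 5, 7, 10)):
    """Grow alpha from 2 toward 10 (shrinking the hidden layer) while
    training error stays significantly below cross-validation error,
    i.e. while the model still looks over-fit.

    `train_and_eval(n_hidden)` is a hypothetical helper that trains a
    network with `n_hidden` hidden neurons and returns
    (train_error, cv_error)."""
    chosen = alphas[0]
    for alpha in alphas:
        n_hidden = n_samples // (alpha * (n_input + n_output))
        train_err, cv_err = train_and_eval(n_hidden)
        chosen = alpha
        if train_err >= 0.9 * cv_err:  # gap has closed: stop shrinking
            break
    return chosen
```

The loop stops at the first α where training error is no longer much smaller than cross-validation error, matching the stopping criterion described above.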

In other words: gradually increase α from 2 up to 10, as long as the error keeps dropping effectively as you increase it.

From: https://stats.stackexchange.com/questions/181/how-to-choose-the-number-of-hidden-layers-and-nodes-in-a-feedforward-neural-netw
