Neural Network Version2
Source: Internet · Editor: 程序博客网 · Time: 2024/06/05 05:53
% A demo of a BP (backpropagation) neural network
format long

% Sigmoid activation function (elementwise)
f = @(x) 1./(1+exp(-x));

% Learning rate
Eta = 0.5;

% Training sample and target output
x = [0.05, 0.10];
y = [0.01, 0.99];

% Bias inputs and bias weights, one per layer
b  = [1, 1, 1];
bw = [0.35, 0.6, 0.35];

temp = size(x);
m = temp(1); % number of rows
n = temp(2); % number of columns

% Weights: row k holds the four weights feeding layer k
w = [0.15, 0.20, 0.25, 0.30;
     0.40, 0.45, 0.50, 0.55;
     0.40, 0.45, 0.50, 0.55];

% Gradient of the loss with respect to each weight
gradient = zeros(3,4);

% Outputs of the neurons in the two hidden layers
hout = zeros(2,2);

% Outputs of the neurons in the output layer
oout = [1; 1];

% Loop counter; with this condition the loop runs one iteration
diff = 1;
while (diff <= 1)
    % Forward pass (input layer -> hidden layer 1)
    neth1 = w(1,1)*x(1) + w(1,2)*x(2) + b(1)*bw(1);
    hout(1,1) = f( neth1 );
    hout(1,2) = f( w(1,3)*x(1) + w(1,4)*x(2) + b(1)*bw(1) );

    % (hidden layer 1 -> hidden layer 2)
    hout(2,1) = f( w(2,1)*hout(1,1) + w(2,2)*hout(1,2) + b(2)*bw(2) );
    hout(2,2) = f( w(2,3)*hout(1,1) + w(2,4)*hout(1,2) + b(2)*bw(2) );

    % (hidden layer 2 -> output layer)
    oout(1) = f( w(3,1)*hout(2,1) + w(3,2)*hout(2,2) + b(3)*bw(3) );
    oout(2) = f( w(3,3)*hout(2,1) + w(3,4)*hout(2,2) + b(3)*bw(3) );

    % Error terms (deltas) of the output layer
    deltaO = zeros(2,4);
    deltaO(1,4) = ( oout(1) - y(1) ) * oout(1) * (1 - oout(1));
    deltaO(2,4) = ( oout(2) - y(2) ) * oout(2) * (1 - oout(2));

    % Partial derivatives for the weights from hidden layer 2 to the output layer
    gradient(3,1) = deltaO(1,4) * hout(2,1);
    gradient(3,2) = deltaO(1,4) * hout(2,2);
    gradient(3,3) = deltaO(2,4) * hout(2,1);
    gradient(3,4) = deltaO(2,4) * hout(2,2);

    % Deltas of hidden layer 2, back-propagated through the output weights w(3,:)
    deltaO(1,3) = ( deltaO(1,4)*w(3,1) + deltaO(2,4)*w(3,3) ) * hout(2,1)*(1 - hout(2,1));
    deltaO(2,3) = ( deltaO(1,4)*w(3,2) + deltaO(2,4)*w(3,4) ) * hout(2,2)*(1 - hout(2,2));

    % Partial derivatives for the weights from hidden layer 1 to hidden layer 2
    gradient(2,1) = deltaO(1,3) * hout(1,1);
    gradient(2,2) = deltaO(1,3) * hout(1,2);
    gradient(2,3) = deltaO(2,3) * hout(1,1);
    gradient(2,4) = deltaO(2,3) * hout(1,2);

    % Deltas of hidden layer 1, back-propagated through w(2,:)
    deltaO(1,2) = ( deltaO(1,3)*w(2,1) + deltaO(2,3)*w(2,3) ) * hout(1,1)*(1 - hout(1,1));
    deltaO(2,2) = ( deltaO(1,3)*w(2,2) + deltaO(2,3)*w(2,4) ) * hout(1,2)*(1 - hout(1,2));

    % Partial derivatives for the weights from the input layer to hidden layer 1
    gradient(1,1) = deltaO(1,2) * x(1);
    gradient(1,2) = deltaO(1,2) * x(2);
    gradient(1,3) = deltaO(2,2) * x(1);
    gradient(1,4) = deltaO(2,2) * x(2);

    % Gradient-descent update
    w = w - Eta * gradient

    % Advance the loop counter
    diff = diff + 1;
end
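For readers who prefer vectorized code, the same 2-2-2-2 network (sigmoid activations, one shared bias per layer, squared-error loss) can be sketched in NumPy. This is my own translation, not part of the original demo; the names `W`, `acts`, and `delta` are assumptions. Each 1x4 row of the MATLAB weight matrix becomes a 2x2 matrix whose row i holds the weights into neuron i of that layer.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Same initial values as the MATLAB demo.
x   = np.array([0.05, 0.10])   # input sample
y   = np.array([0.01, 0.99])   # target output
eta = 0.5                      # learning rate
bw  = np.array([0.35, 0.60, 0.35])  # one bias weight per layer

# W[k][i, j]: weight from input j to neuron i of layer k.
W = [
    np.array([[0.15, 0.20], [0.25, 0.30]]),
    np.array([[0.40, 0.45], [0.50, 0.55]]),
    np.array([[0.40, 0.45], [0.50, 0.55]]),
]

# Forward pass: keep every layer's activation for the backward pass.
acts = [x]
for k in range(3):
    acts.append(sigmoid(W[k] @ acts[k] + bw[k]))

# Backward pass: output-layer delta, then propagate layer by layer.
delta = (acts[-1] - y) * acts[-1] * (1 - acts[-1])
for k in (2, 1, 0):
    grad = np.outer(delta, acts[k])            # dE/dW[k]
    if k > 0:
        # Propagate the delta through the (pre-update) weights.
        delta = (W[k].T @ delta) * acts[k] * (1 - acts[k])
    W[k] = W[k] - eta * grad                   # gradient-descent step
```

Because the backward loop propagates `delta` before overwriting `W[k]`, one pass of this sketch performs the same single gradient-descent step as the MATLAB loop above.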