Lecture 5: Reducing Infinitely Many Hypotheses to Finitely Many


Learning is PAC-possible if we have enough statistical data and a finite number of hypotheses M.

When M is infinite, we need to establish a finite quantity that replaces M.

The growth function m_H(N) is the effective number of hypotheses on N points, and it is always at most 2^N. If for some k we have m_H(k) < 2^k (i.e., no set of k inputs can be shattered), we call k a break point for H.
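In symbols (a sketch using the standard notation from this course):

```latex
m_{\mathcal{H}}(N) \;=\; \max_{x_1,\dots,x_N}\,
\bigl|\{\, (h(x_1),\dots,h(x_N)) : h \in \mathcal{H} \,\}\bigr| \;\le\; 2^N,
\qquad
k \text{ is a break point} \iff m_{\mathcal{H}}(k) < 2^k .
```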

positive rays: break point at 2
positive intervals: break point at 3
convex sets: no break point
2D perceptrons: break point at 4
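As a quick sanity check (not from the original post), the positive-rays case can be verified by brute force: enumerate every threshold position relative to N points on the line and count the distinct labelings. The helper name `dichotomies_positive_rays` is mine, introduced for illustration.

```python
def dichotomies_positive_rays(n):
    """Count the distinct labelings that positive rays h(x) = sign(x - a)
    can produce on n distinct points on the real line."""
    points = list(range(n))                        # n distinct sorted points 0..n-1
    thresholds = [i - 0.5 for i in range(n + 1)]   # one threshold per gap, plus both ends
    labelings = {tuple(1 if x > a else -1 for x in points) for a in thresholds}
    return len(labelings)

# m_H(N) = N + 1 for positive rays, so the first N with m_H(N) < 2^N is N = 2:
for n in range(1, 5):
    print(n, dichotomies_positive_rays(n), 2 ** n)
```

The count comes out to N + 1, which first falls below 2^N at N = 2, matching the claimed break point.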

So learning is possible again once we can replace M with a finite effective number of hypotheses m_H(N).
