One-class Classification (Introduction)
In contrast with normal classification problems, where one tries to distinguish between two (or more) classes of objects, one-class classification tries to describe a single class of objects and distinguish it from all other possible objects. The figure below shows an example data set representing a set of apples and pears. The classifier separates the two classes well, but the outlier in the lower right corner will nevertheless be classified as a pear. A one-class classifier should instead be trained to reject this object and label it as an outlier.
One-class classification can be applied to a range of problems. It can be used for:
1. Novelty detection (e.g. machine condition monitoring, where faults should be detected),
2. Outlier detection (for more confident classification, as in the example above),
3. Badly balanced data (e.g. classification in medical data with poorly sampled classes),
4. Data set comparison (to avoid retraining classifiers for comparable data).
Often just the probability density of the training set is estimated: when a new object falls below some density threshold, it is considered an outlier and rejected. We propose a method that does not rely on density estimation. The method is inspired by the Support Vector Machine of V. Vapnik. It computes a spherically shaped decision boundary with minimal volume around the training set of objects. This requirement (together with the constraint that all objects lie within the sphere) yields a description in which the sphere is determined by just a few objects from the training set, called support objects. Instead of storing the complete training set, only this much smaller set of support objects has to be stored. The spherical description can be made more flexible by introducing kernel functions, analogous to Support Vector Machines. When a Gaussian kernel is used (with an extra free parameter s), solutions ranging from a Parzen density estimate to the original spherical description are obtained. A procedure for choosing an appropriate value of s is also given, such that a tight description can be obtained for all types of data.
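The density-based baseline mentioned above can be sketched in a few lines: estimate the training-set density with a Gaussian Parzen window and reject objects whose estimated density falls below a threshold. This is a minimal one-dimensional illustration; the training values, bandwidth `h`, and threshold are hypothetical choices, not taken from the original text.

```python
import math

def parzen_density(x, train, h=1.0):
    """Gaussian Parzen-window estimate of the density at x (1-D)."""
    n = len(train)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in train)

# Training set: samples from the target class (hypothetical values).
train = [1.0, 1.2, 0.8, 1.1, 0.9, 1.3]

# Reject new objects whose estimated density falls below a threshold.
threshold = 0.05

def is_outlier(x):
    return parzen_density(x, train) < threshold

print(is_outlier(1.0))   # near the training data: accepted
print(is_outlier(10.0))  # far from the training data: rejected as an outlier
```

Note the drawback the text alludes to: this approach needs the complete training set at test time, whereas the spherical description keeps only the support objects.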
These tools can now answer other questions. They can solve classification problems in which the classes are very poorly balanced (or in which one of the classes is absent altogether). This occurs in medical applications, where the population of normal, healthy people is far larger than the abnormal population. They also open the possibility of indicating whether a test set is sufficiently similar to the training set.
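As a practical sketch of the kernelized spherical description, one can use scikit-learn's `OneClassSVM`, whose RBF-kernel formulation is closely related to the spherical data description discussed above (the `gamma` parameter plays the role of the Gaussian width s). The data and parameter values here are hypothetical, chosen only for illustration.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Hypothetical target class: 200 points from a 2-D Gaussian.
rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# RBF kernel: gamma controls the kernel width, nu bounds the
# fraction of training objects allowed outside the boundary.
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.05)
clf.fit(X_train)

# Only the support objects define the boundary; the rest of the
# training set does not need to be stored.
print(len(clf.support_vectors_), "support objects out of", len(X_train))

# predict() returns +1 for target objects and -1 for outliers.
X_test = np.array([[0.0, 0.0], [6.0, 6.0]])
print(clf.predict(X_test))
```

The point near the training data is accepted, while the distant point is rejected as an outlier, matching the behavior described in the text.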