Andrew Ng's Machine Learning: My Missed-Question Notes (continuously updated)
Source: Internet | Editor: 程序博客网 | Time: 2024/06/05 07:22
Week 2
Which of the following are reasons for using feature scaling?
A. It speeds up gradient descent by making it require fewer iterations to get to a good solution.
B. It speeds up gradient descent by making each iteration of gradient descent less expensive to compute.
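For reference: feature scaling helps because it reshapes the cost contours so gradient descent converges in fewer iterations; it does not make any single iteration cheaper to compute. A minimal NumPy sketch of the mean-normalization recipe from the Week 2 lectures (the sample matrix below is made up for illustration):

```python
import numpy as np

def feature_scale(X):
    """Mean normalization: x_j := (x_j - mu_j) / s_j for each column j."""
    mu = X.mean(axis=0)
    s = X.std(axis=0)
    return (X - mu) / s

# Two features on very different scales, e.g. house size vs. bedrooms.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])

X_scaled = feature_scale(X)
print(X_scaled.mean(axis=0))  # each column now has mean ~0
print(X_scaled.std(axis=0))   # each column now has std ~1
```

After scaling, both features vary over a comparable range, so the cost surface is closer to round and a single learning rate works well for every parameter.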
Week 3
2. Suppose you have the following training set, and fit a logistic regression classifier h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2).
Which of the following are true? Check all that apply.
A. Adding polynomial features (e.g., instead using h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2 + θ_3 x_1^2 + θ_4 x_1 x_2 + θ_5 x_2^2)) could increase how well we can fit the training data.
B. At the optimal value of θ (e.g., found by fminunc), we will have J(θ) ≥ 0.
C. Adding polynomial features (e.g., instead using h_θ(x) = g(θ_0 + θ_1 x_1 + θ_2 x_2 + θ_3 x_1^2 + θ_4 x_1 x_2 + θ_5 x_2^2)) would increase J(θ) because we are now summing over more terms.
D. If we train gradient descent for enough iterations, for some examples x^(i) in the training set it is possible to obtain h_θ(x^(i)) > 1.
Answer: A and B.
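The two rejected options can be checked numerically: the sigmoid output lies strictly inside (0, 1), so h_θ(x) can never exceed 1 no matter how long gradient descent runs, and the logistic cost averages nonnegative terms, so J(θ) ≥ 0 at any θ. A small NumPy sketch (the z and y values below are arbitrary illustration data):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Even for large-magnitude inputs, the output stays strictly inside (0, 1).
z = np.array([-30.0, -1.0, 0.0, 1.0, 30.0])
h = sigmoid(z)
print(h)

# Each term -y*log(h) - (1-y)*log(1-h) is nonnegative, so J(theta) >= 0.
y = np.array([0.0, 0.0, 1.0, 1.0, 1.0])
cost = -(y * np.log(h) + (1 - y) * np.log(1 - h)).mean()
print(cost)
```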
3. For logistic regression, the gradient is given by ∂J(θ)/∂θ_j = (1/m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) x_j^(i). Which of these is a correct gradient descent update for logistic regression with a learning rate of α? Check all that apply.
Answer: A and C.
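A vectorized sketch of this gradient and the corresponding simultaneous descent update, θ := θ − α·∇J(θ). The tiny dataset, learning rate, and iteration count below are made up for illustration and are not from the quiz:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient(theta, X, y):
    """Vectorized gradient: (1/m) * X^T (g(X theta) - y)."""
    m = len(y)
    return X.T @ (sigmoid(X @ theta) - y) / m

def gd_step(theta, X, y, alpha):
    """Simultaneous update of all parameters: theta := theta - alpha * grad."""
    return theta - alpha * gradient(theta, X, y)

# Tiny example: first column of X is the intercept term x_0 = 1.
X = np.array([[1.0, 0.5],
              [1.0, -1.5],
              [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])

theta = np.zeros(2)
for _ in range(1000):
    theta = gd_step(theta, X, y, alpha=0.1)

# The data is linearly separable, so the fitted model classifies it correctly.
print(sigmoid(X @ theta) >= 0.5)
```

The key point the wrong options test is that every θ_j must be updated simultaneously using the same (pre-update) θ, which the vectorized form handles automatically.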
1.You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply.
Introducing regularization to the model always results in equal or better performance on examples not in the training set.
Introducing regularization to the model always results in equal or better performance on the training set.
Adding a new feature to the model always results in equal or better performance on the training set.
Adding many new features to the model helps prevent overfitting on the training set.
Answer: C.
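A quick way to see why C holds: any parameter vector for the smaller model can be embedded in the larger model with the new feature's weight set to zero, giving exactly the same training cost, so the larger model's optimum can only be equal or better on the training set. (Regularization, by contrast, can worsen the training fit, which is why B is false.) A minimal NumPy sketch; the data and θ values are made up for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Unregularized logistic cost J(theta)."""
    h = sigmoid(X @ theta)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h)).mean()

X = np.array([[1.0, 0.5],
              [1.0, -1.5],
              [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])
theta = np.array([0.2, 1.0])  # any parameters for the two-feature model

# Add a new feature (here x_1^2) and keep its weight at zero:
X_ext = np.hstack([X, (X[:, 1] ** 2)[:, None]])
theta_ext = np.append(theta, 0.0)

# Identical training cost, so optimizing the larger model can only
# match or improve it.
print(cost(theta, X, y), cost(theta_ext, X_ext, y))
```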