Stanford CS231n: Convolutional Neural Networks for Visual Recognition

[Interactive demo: a small ConvNet classifying images into categories such as truck, car, cat, ship, and horse, running live in your browser]

Course Description

Computer Vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization and detection. Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. This course is a deep dive into the details of deep learning architectures, with a focus on learning end-to-end models for these tasks, particularly image classification. During the 10-week course, students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision. The final assignment will involve training a multi-million-parameter convolutional neural network and applying it to the largest image classification dataset (ImageNet). We will focus on teaching how to set up the problem of image recognition, the learning algorithms (e.g. backpropagation), and practical engineering tricks for training and fine-tuning the networks, and will guide students through hands-on assignments and a final course project. Much of the background material for this course will be drawn from the ImageNet Challenge.

Course Instructors

Fei-Fei Li
 
Andrej Karpathy
Justin Johnson

Teaching Assistants

Serena Yeung
 
Subhasis Das
 
Song Han
 
Albert Haque
 
Bharath Ramsundar
Hieu Pham
 
Irwan Bello
 
Namrata Anand
 
Lane McIntosh
 
Catherine Dong
Kyle Griswold
 

Class Time and Location

Winter quarter (January - March, 2016).
Lecture: Monday, Wednesday 3:00-4:20
Bishop Auditorium in Lathrop Building (map)

Office Hours

Mon 9-11am in Gates 392 with Albert
Mon 1-3pm in Fairchild D202 with Lane
Mon 6-7pm in Gates 260 with Andrej
Tue 10:25-11:25 in Huang (basement) with Song
Tue 10:30-12:30 in Huang (basement) with Kyle
Tue 5-7pm in Gates B24A with Namrata
Wed 10-12pm in Gates 498 with Serena
Wed 12-2pm in Gates 359 with Subhasis
Wed 7-8pm in Gates 259 with Justin
Thr 10:25-11:25am in Huang (basement) with Song
Thr 3:30-5:30pm in Huang B007 with Irwan
Thr 6-8pm in Gates 260 with Catherine
Thr 1-3pm in Clark S361 with Bharath
Fri 2-4pm in Gates B24A with Hieu

Grading Policy

Assignment #1: 15%
Assignment #2: 15%
Assignment #3: 15%
Midterm: 15%
Final Project: 40%

Course Discussions

Stanford students: Piazza 
Online discussions for non-Stanford students: Reddit on r/cs231n 
Our Twitter account: @cs231n

Assignment Details

See the Assignment Page for more details on how to hand in your assignments.

Course Project Details

See the Project Page for more details on the course project.

Prerequisites

  • Proficiency in Python, high-level familiarity with C/C++
    All class assignments will be in Python (and use numpy) (we provide a tutorial here for those who aren't as familiar with Python), but some of the deep learning libraries we may look at later in the class are written in C++. If you have a lot of programming experience but in a different language (e.g. C/C++/Matlab/Javascript) you will probably be fine.
  • College Calculus, Linear Algebra (e.g. MATH 19 or 41, MATH 51)
    You should be comfortable taking derivatives and understanding matrix vector operations and notation.
  • Basic Probability and Statistics (e.g. CS 109 or other stats course)
    You should know the basics of probability: Gaussian distributions, mean, standard deviation, etc.
  • Equivalent knowledge of CS229 (Machine Learning)
    We will be formulating cost functions, taking derivatives and performing optimization with gradient descent.
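As a quick self-check against the prerequisites above, here is a minimal sketch in Python/numpy that formulates a cost function, takes its derivative, and performs optimization with gradient descent on a toy least-squares problem. This is illustrative only; the data, cost, and hyperparameters are made up for this example and are not from the course assignments.

```python
import numpy as np

# Toy least-squares cost: L(w) = ||Xw - y||^2 / (2N)
# Analytic gradient:      dL/dw = X^T (Xw - y) / N
np.random.seed(0)
X = np.random.randn(100, 3)           # 100 examples, 3 features
true_w = np.array([1.0, -2.0, 0.5])   # made-up ground-truth weights
y = X.dot(true_w)                     # noiseless targets

w = np.zeros(3)                       # initial weights
learning_rate = 0.1
for step in range(500):
    residual = X.dot(w) - y           # matrix-vector product
    grad = X.T.dot(residual) / len(y) # gradient of the cost w.r.t. w
    w -= learning_rate * grad         # gradient descent update

print(w)                              # converges toward true_w
```

If the matrix-vector products and the gradient derivation here feel comfortable, you have the needed background; the Python/numpy tutorial linked above covers the numpy idioms in more depth.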

FAQ

Is this the first time this class is offered?
This class was first offered in Winter 2015, and has been slightly tweaked for the current Winter 2016 offering. The class is designed to introduce students to deep learning in the context of Computer Vision. We will place a particular emphasis on Convolutional Neural Networks, which are a class of deep learning models that have recently given dramatic improvements in various visual recognition tasks. You can read more about it in this recent New York Times article.
Can I follow along from the outside?
We'd be happy if you join us! We plan to make the course materials widely available: The assignments, course notes, lecture videos and slides will be available online. We won't be able to give you course credit.
Can I take this course on a credit/no credit basis?
Yes. Credit will be given to those who would have otherwise earned a C- or above.
Can I audit or sit in?
In general we are very open to sitting-in guests if you are a member of the Stanford community (registered student, staff, and/or faculty). Out of courtesy, we would appreciate that you first email us or talk to the instructor after the first class you attend. If the class is too full and we're running out of space, we would ask that you please allow registered students to attend.
Can I work in groups for the Final Project?
Yes, in groups of up to two people.
I have a question about the class. What is the best way to reach the course staff?
Stanford students please use an internal class forum on Piazza so that other students may benefit from your questions and our answers. If you have a personal matter, email us at the class mailing list cs231n-winter1516-staff@lists.stanford.edu.
Can I combine the Final Project with another course?
Yes, you may. There are a couple of courses concurrently offered with CS231n that are natural choices, such as CS231a (Computer Vision, by Prof. Silvio Savarese) and CS228 (Graphical Models, by Prof. Stefano Ermon). If you are taking some combination of these classes, please speak to the instructors to receive permission to combine the Final Project assignments.


These notes accompany the Stanford CS class CS231n: Convolutional Neural Networks for Visual Recognition. 
For questions/concerns/bug reports, contact Justin Johnson regarding the assignments, or Andrej Karpathy regarding the course notes. You can also submit a pull request directly to our git repo. 
We encourage the use of the hypothes.is extension to annotate comments and discuss these notes inline.
Winter 2016 Assignments
Assignment #1: Image Classification, kNN, SVM, Softmax, Neural Network
Assignment #2: Fully-Connected Nets, Batch Normalization, Dropout, Convolutional Nets
Assignment #3: Recurrent Neural Networks, Image Captioning, Image Gradients, DeepDream
Module 0: Preparation
Python / Numpy Tutorial
IPython Notebook Tutorial
Terminal.com Tutorial
AWS Tutorial
Module 1: Neural Networks
Image Classification: Data-driven Approach, k-Nearest Neighbor, train/val/test splits
L1/L2 distances, hyperparameter search, cross-validation
Linear classification: Support Vector Machine, Softmax
parametric approach, bias trick, hinge loss, cross-entropy loss, L2 regularization, web demo
Optimization: Stochastic Gradient Descent
optimization landscapes, local search, learning rate, analytic/numerical gradient
Backpropagation, Intuitions
chain rule interpretation, real-valued circuits, patterns in gradient flow
Neural Networks Part 1: Setting up the Architecture
model of a biological neuron, activation functions, neural net architecture, representational power
Neural Networks Part 2: Setting up the Data and the Loss
preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
Neural Networks Part 3: Learning and Evaluation
gradient checks, sanity checks, babysitting the learning process, momentum (+nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
Putting it together: Minimal Neural Network Case Study
minimal 2D toy data example
Module 2: Convolutional Neural Networks
Convolutional Neural Networks: Architectures, Convolution / Pooling Layers
layers, spatial arrangement, layer patterns, layer sizing patterns, AlexNet/ZFNet/VGGNet case studies, computational considerations
Understanding and Visualizing Convolutional Neural Networks
tSNE embeddings, deconvnets, data gradients, fooling ConvNets, human comparisons
Transfer Learning and Fine-tuning Convolutional Neural Networks



from: http://vision.stanford.edu/teaching/cs231n/index.html
http://cs231n.github.io/