Exploring Sparsity in Recurrent Neural Networks
(Submitted on 17 Apr 2017)
Recurrent Neural Networks (RNNs) are widely used to solve a variety of problems, and as the quantity of data and the amount of available compute have increased, so have model sizes. The number of parameters in recent state-of-the-art networks makes them hard to deploy, especially on mobile phones and embedded devices. The challenge is due to both the size of the model and the time it takes to evaluate it. In order to deploy these RNNs efficiently, we propose a technique to reduce the parameters of a network by pruning weights during the initial training of the network. At the end of training, the parameters of the network are sparse while accuracy remains close to that of the original dense network. The network size is reduced by 8x and the time required to train the model remains constant. Additionally, we can prune a larger dense network to achieve better-than-baseline performance while still reducing the total number of parameters significantly. Pruning RNNs reduces the size of the model and can also yield significant inference speed-ups using sparse matrix multiplies. Benchmarks show that with our technique, model size can be reduced by 90% and speed-up is around 2x to 7x.
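The core idea described above, magnitude-based pruning whose sparsity target ramps up during training, can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the linear ramp, the `start_step`/`end_step`/`final_sparsity` parameters, and the `prune_step` helper are assumptions for demonstration; the paper uses its own threshold schedule and hyperparameters.

```python
import numpy as np

def prune_step(weights, step, start_step, end_step, final_sparsity=0.9):
    """Zero out the smallest-magnitude weights, with a sparsity target
    that ramps linearly from 0 to final_sparsity between start_step and
    end_step. Hypothetical sketch; the paper's schedule differs."""
    if step < start_step:
        return weights, 0.0
    frac = min(1.0, (step - start_step) / (end_step - start_step))
    target_sparsity = final_sparsity * frac
    k = int(target_sparsity * weights.size)  # number of weights to prune
    if k == 0:
        return weights, 0.0
    # Threshold = magnitude of the k-th smallest weight; prune at or below it.
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, 1.0 - mask.mean()

# Example: a recurrent weight matrix pruned to 90% sparsity by the end
# of the ramp. In training, this would run every few hundred steps and
# the surviving weights would keep being updated by SGD.
w = np.random.default_rng(0).normal(size=(100, 100))
w, sparsity = prune_step(w, step=1000, start_step=100, end_step=1000)
```

After the ramp completes, roughly 10% of the weights remain nonzero, which is where the reported ~90% model-size reduction and the opportunity for sparse matrix multiply come from.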
Submission history
From: Sharan Narang — [v1] Mon, 17 Apr 2017 20:42:05 GMT (259kb, D)