Resource-Constrained Deep Learning: Challenges and Practices

Cited by: 10
Authors
吴建鑫 (Jianxin Wu)
高斌斌 (Bin-Bin Gao)
魏秀参 (Xiu-Shen Wei)
罗建豪 (Jian-Hao Luo)
Affiliation
[1] National Key Laboratory for Novel Software Technology, Nanjing University
Keywords
deep learning; resource-constrained; data resources; labeling resources; computational resources
DOI
Not available
CLC number
TP181 [Automated Reasoning and Machine Learning]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Deep learning has made remarkable progress in recent years. However, deep learning models consume large amounts of computation-related resources, and the learning process requires large quantities of data and labels. A current hot topic in the field is therefore to reduce this appetite for computational and data resources, that is, to study resource-constrained deep learning. This paper first analyzes deep learning's demand for resources and the challenges this demand creates, then briefly reviews current research progress under three kinds of constraints, namely limited data, limited labels, and limited computational resources, and illustrates each direction in more detail with our own research practice in computer vision.
Pages: 501-510
Number of pages: 10
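
As a concrete illustration of the computation-resource-constrained direction mentioned in the abstract, the short sketch below performs generic magnitude-based (L1-norm) filter pruning on a single convolutional layer. It is not the method proposed by the authors; it assumes PyTorch, and the layer sizes, keep ratio, and function names are hypothetical.

import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    # L1 norm of each output filter; smaller norm = less important filter.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    # Keep only the filters with the largest L1 norms and return a smaller Conv2d.
    scores = l1_filter_scores(conv)
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.topk(scores, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep_idx])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep_idx])
    return pruned

# Hypothetical usage: halve the 128 filters of a single 3x3 convolution.
conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
smaller = prune_conv_filters(conv, keep_ratio=0.5)
print(tuple(conv.weight.shape), "->", tuple(smaller.weight.shape))  # (128, 64, 3, 3) -> (64, 64, 3, 3)

In a full network, the input channels of the following convolution (and any batch-normalization parameters) would also have to be shrunk to match the removed filters, and the pruned model is normally fine-tuned to recover accuracy.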