Today's Paper Recommendation (with Download Link)

AMiner will release its 3D Printing Research Report next Wednesday.

The report will then be available for direct download from the menu of our WeChat official account.

Stay tuned.

This paper comes from the Proceedings of the Thirteenth EuroSys Conference.

Title:

Improving the Expressiveness of Deep Learning Frameworks with Recursion

Authors:

Eunji Jeong, Joo Seong Jeong, Soojeong Kim, Gyeong-In Yu, Byung-Gon Chun

Abstract

Recursive neural networks have widely been used by researchers to handle applications with recursively or hierarchically structured data. However, embedded control flow deep learning frameworks such as TensorFlow, Theano, Caffe2, and MXNet fail to efficiently represent and execute such neural networks, due to lack of support for recursion. In this paper, we add recursion to the programming model of existing frameworks by complementing their design with recursive execution of dataflow graphs as well as additional APIs for recursive definitions. Unlike iterative implementations, which can only understand the topological index of each node in recursive data structures, our recursive implementation is able to exploit the recursive relationships between nodes for efficient execution based on parallel computation. We present an implementation on TensorFlow and evaluation results with various recursive neural network models, showing that our recursive implementation not only conveys the recursive nature of recursive neural networks better than other implementations, but also uses given resources more effectively to reduce training and inference time.
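
To make the contrast in the abstract concrete, here is a minimal, framework-agnostic Python sketch of the two styles it describes: an iterative evaluation that flattens a tree-structured computation into a topological order, versus a recursive evaluation whose control flow mirrors the data structure, so independent subtrees can be dispatched concurrently. The TreeNode type, the combine() cell, and the thread-pool parallelism are illustrative assumptions, not the TensorFlow API proposed in the paper.

```python
# Illustrative sketch only: contrasts iterative (topological-order) evaluation
# of a tree-structured model with recursive evaluation that can run the root's
# child subtrees in parallel. Not the paper's proposed TensorFlow API.

from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TreeNode:
    value: float = 0.0                      # leaf embedding (toy: a scalar)
    children: List["TreeNode"] = field(default_factory=list)


def combine(child_values: List[float]) -> float:
    # Toy composition function standing in for a recursive NN cell
    # (e.g. a TreeLSTM cell combining its children's hidden states).
    return sum(child_values) / len(child_values) + 1.0


def eval_iterative(root: TreeNode) -> float:
    # "Unrolled" style: flatten the tree into a topological order and process
    # nodes one by one; the recursive structure is only implicit in the order.
    order: List[TreeNode] = []
    stack = [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(node.children)
    results = {}
    for node in reversed(order):            # children before parents
        if node.children:
            results[id(node)] = combine([results[id(c)] for c in node.children])
        else:
            results[id(node)] = node.value
    return results[id(root)]


def eval_recursive(root: TreeNode, pool: Optional[ThreadPoolExecutor] = None) -> float:
    # Recursive style: the computation mirrors the data structure; the root's
    # child subtrees are independent, so they can be submitted to a thread
    # pool and evaluated concurrently.
    if not root.children:
        return root.value
    if pool is None:
        child_values = [eval_recursive(c) for c in root.children]
    else:
        futures = [pool.submit(eval_recursive, c) for c in root.children]
        child_values = [f.result() for f in futures]
    return combine(child_values)


if __name__ == "__main__":
    def leaf(v: float) -> TreeNode:
        return TreeNode(value=v)

    tree = TreeNode(children=[TreeNode(children=[leaf(1.0), leaf(2.0)]),
                              TreeNode(children=[leaf(3.0), leaf(4.0)])])
    assert eval_iterative(tree) == eval_recursive(tree)
    with ThreadPoolExecutor(max_workers=4) as pool:
        print(eval_recursive(tree, pool))   # child subtrees dispatched in parallel
```

In this toy setting both functions compute the same result; the point is that the recursive form exposes the independence of sibling subtrees directly, which is the property the paper exploits for parallel execution inside a dataflow framework.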


Paper download link: https://arxiv.org/pdf/1809.00832.pdf
