- Title
AERO: A 1.28 MOP/s/LUT Reconfigurable Inference Processor for Recurrent Neural Networks in a Resource-Limited FPGA.
- Authors
Kim, Jinwon; Kim, Jiho; Kim, Tae-Hwan
- Abstract
This study presents AERO, a resource-efficient reconfigurable inference processor for recurrent neural networks (RNNs). AERO is programmable to perform inference on RNN models of various types. It was designed based on an instruction-set architecture specialized for processing the primitive vector operations that compose the dataflows of RNN models. A versatile vector-processing unit (VPU) is incorporated to perform every vector operation and achieve high resource efficiency. To keep resource usage low, multiplication in the VPU is carried out on the basis of an approximation scheme. In addition, the activation functions are realized with reduced tables. We developed a prototype inference system based on AERO using a resource-limited field-programmable gate array, on which the functionality of AERO was verified extensively for inference tasks on several RNN models of different types. The resource efficiency of AERO was found to be as high as 1.28 MOP/s/LUT, which is 1.3 times higher than the previous state-of-the-art result.
- Subjects
RECURRENT neural networks; FIELD programmable gate arrays; GATE array circuits
- Publication
Electronics (2079-9292), 2021, Vol. 10, Issue 11, p. 1249
- ISSN
2079-9292
- Publication type
Article
- DOI
10.3390/electronics10111249