- Title
Fast LSTM by dynamic decomposition on cloud and distributed systems.
- Authors
You, Yang; He, Yuxiong; Rajbhandari, Samyam; Wang, Wenhan; Hsieh, Cho-Jui; Keutzer, Kurt; Demmel, James
- Abstract
Long short-term memory (LSTM) is a powerful deep learning technique that has been widely used in many real-world data-mining applications such as language modeling and machine translation. In this paper, we aim to minimize the latency of LSTM inference on cloud systems without losing accuracy. If an LSTM model does not fit in cache, the latency due to data movement will likely be greater than that due to computation; in this case, we reduce model parameters. If, as in most applications we consider, the LSTM models fit in the cache of cloud server processors, we focus on reducing the number of floating point operations, which has a corresponding linear impact on the latency of the inference calculation. Thus, our system dynamically reduces either model parameters or flops, depending on which most impacts latency. Our inference system is based on singular value decomposition and canonical polyadic decomposition, and it is both accurate and low latency. We evaluate our system on models from a series of real-world applications, including language modeling, computer vision, question answering, and sentiment analysis. Users of our system can start from either pre-trained models or from scratch. Our system achieves a 15× average speedup across six real-world applications without losing inference accuracy. We also design and implement a distributed optimization system with dynamic decomposition, which can significantly reduce energy cost and accelerate training.
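To illustrate the flop-reduction idea the abstract describes, here is a minimal sketch (not the authors' code) of truncated SVD applied to a single LSTM-style weight matrix: replacing a dense matrix-vector product with two thin factors cuts the operation count roughly from 2nm to 2r(n+m). The matrix sizes and the rank r are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, r = 512, 512, 64           # hidden size, input size, assumed rank
W = rng.standard_normal((n, m))  # one gate's weight matrix (illustrative)
x = rng.standard_normal(m)       # a single input vector

# Truncated SVD: W ≈ A @ B with A (n x r) and B (r x m).
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]
B = Vt[:r, :]

# Flop counts for one matrix-vector product.
dense_flops = 2 * n * m          # W @ x
lowrank_flops = 2 * r * (n + m)  # B @ x, then A @ (B @ x)
print(f"flops: {dense_flops} -> {lowrank_flops}")

y_dense = W @ x                  # original product
y_lowrank = A @ (B @ x)          # factored product, same output shape
```

At r = 64 this is roughly a 4× reduction in flops for this layer; the paper's system chooses between this kind of decomposition and parameter reduction dynamically, based on whether the model fits in cache.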
- Subjects
SINGULAR value decomposition; CACHE memory; COMPUTER vision; SENTIMENT analysis; PROGRAMMING languages; DEEP learning; MATHEMATICAL optimization
- Publication
Knowledge & Information Systems, 2020, Vol 62, Issue 11, p4169
- ISSN
0219-1377
- Publication type
Article
- DOI
10.1007/s10115-020-01487-8