Journal of Advances in Developmental Research
E-ISSN: 0976-4844
Demystifying Deep Learning Compiler Optimizations for Training and Inference
Author(s) | Vishakha Agrawal |
---|---|
Country | United States |
Abstract | Deep learning has achieved tremendous success in recent years, powering many artificial intelligence applications. However, deep learning models are computationally intensive to train, requiring massive amounts of data and compute resources. Once trained, models must be deployed for inference to make predictions on new data, and the hardware used for training often differs from the hardware used for inference. Deep learning compilers have transformed the field by optimizing the performance of deep learning models across diverse hardware platforms. In the current research landscape, however, there is a notable absence of comprehensive studies that differentiate compiler optimizations and methodologies for training from those for inference. This paper provides a detailed description of deep learning compiler optimizations, addressing training and inference separately, and investigates the challenges, opportunities, and design considerations for compilers targeting each phase. |
Keywords | Training, Inference, Optimization, ASIC, Fusion, Quantization, Mixed Precision, Dynamic Batching, Pruning |
Published In | Volume 12, Issue 2, July-December 2021 |
Published On | 2021-09-08 |
Cite This | Demystifying Deep Learning Compiler Optimizations for Training and Inference - Vishakha Agrawal - IJAIDR Volume 12, Issue 2, July-December 2021. DOI 10.5281/zenodo.14551855 |
DOI | https://doi.org/10.5281/zenodo.14551855 |
Short DOI | https://doi.org/g8wtgq |
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.