    Online Resource
    [San Rafael, California] : Morgan & Claypool Publishers
    UID:
    gbv_1745277862
    Format: 1 online resource (xx, 245 pages), illustrations
    ISBN: 9781681739670
    Series Statement: Synthesis lectures on computer architecture #53
    Content: Preface -- Acknowledgments -- Introduction -- Building Blocks -- Models and Applications -- Training a Model -- Distributed Training -- Reducing the Model Size -- Hardware -- Compiler Optimizations -- Frameworks and Compilers -- Opportunities and Challenges -- Bibliography -- Author's Biography.
    Content: This book describes deep learning systems: the algorithms, compilers, and processor components needed to efficiently train and deploy deep learning models for commercial applications. The exponential growth in computational power is slowing at a time when the amount of compute consumed by state-of-the-art deep learning (DL) workloads is rapidly growing. Model size, serving latency, and power constraints are a significant challenge in the deployment of DL models for many applications. Therefore, it is imperative to codesign algorithms, compilers, and hardware to accelerate advances in this field with holistic system-level and algorithm solutions that improve performance, power, and efficiency. Advancing DL systems generally involves three types of engineers: (1) data scientists who utilize and develop DL algorithms in partnership with domain experts, such as medical, economic, or climate scientists; (2) hardware designers who develop specialized hardware to accelerate the components in DL models; and (3) performance and compiler engineers who optimize software to run more efficiently on given hardware. Hardware engineers should be aware of the characteristics and components of production and academic models likely to be adopted by industry to guide design decisions impacting future hardware. Data scientists should be aware of deployment-platform constraints when designing models. Performance engineers should support optimizations across diverse models, libraries, and hardware targets. The purpose of this book is to provide a solid understanding of (1) the design, training, and applications of DL algorithms in industry; (2) the compiler techniques to map deep learning code to hardware targets; and (3) the critical hardware features that accelerate DL systems. This book aims to facilitate co-innovation for the advancement of DL systems. It is written for engineers working in one or more of these areas who seek to understand the entire system stack in order to better collaborate with engineers working in other parts of it. The book details advancements and the adoption of DL models in industry, explains the training and deployment process, describes the essential hardware architectural features needed for today's and future models, and details advances in DL compilers to efficiently execute algorithms across various hardware targets. Unique to this book are the holistic exposition of the entire DL system stack, the emphasis on commercial applications, and the practical techniques to design models and accelerate their performance. The author is fortunate to work with hardware, software, data science, and research teams across many high-technology companies with hyperscale data centers. These companies employ many of the examples and methods provided throughout the book.
    Additional Edition: ISBN 9781681739663
    Additional Edition: ISBN 9781681739687
    Additional Edition: Also available as print edition, ISBN 9783031000669
    Additional Edition: Also available as print edition, ISBN 9783031006418
    Additional Edition: Also available as print edition, ISBN 9783031028977
    Language: English