In:
ACM SIGPLAN Notices, Association for Computing Machinery (ACM), Vol. 53, No. 9 (2020-04-07), pp. 79-92
Abstract:
Many modern application domains crucially rely on tensor operations. The optimization of programs that operate on tensors poses difficulties that are not adequately addressed by existing languages and tools. Frameworks such as TensorFlow offer good abstractions for tensor operations, but target a specific domain, i.e., machine learning, and their optimization strategies cannot easily be adjusted to other domains. General-purpose optimization tools such as Pluto and existing meta-languages offer more flexibility in applying optimizations but lack abstractions for tensors. This work closes the gap between domain-specific tensor languages and general-purpose optimization tools by proposing the Tensor optimizations Meta-Language (TeML). TeML offers high-level abstractions for both tensor operations and loop transformations, and enables flexible composition of transformations into effective optimization paths. This compositionality is built into TeML's design, as our formal language specification will reveal. We also show that TeML can express tensor computations as comfortably as TensorFlow and that it can reproduce Pluto's optimization paths. Thus, optimized programs generated by TeML execute at least as fast as the corresponding Pluto programs. In addition, TeML enables optimization paths that often outperform Pluto.
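To illustrate the kind of loop transformation the abstract refers to, the following is a minimal Python sketch of loop tiling applied to a matrix multiply. This is illustrative only and does not use TeML's actual syntax; the function names and the tile size parameter are assumptions made for the example.

```python
def matmul(A, B, n):
    # Naive triple loop: C[i][j] = sum over k of A[i][k] * B[k][j]
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

def matmul_tiled(A, B, n, t):
    # Same computation after a "tile" transformation on the i and j loops:
    # the iteration space is traversed in t-by-t blocks to improve locality.
    # Composing further transformations (e.g. interchange, unrolling) on top
    # of this nest is the kind of optimization path a meta-language expresses.
    C = [[0] * n for _ in range(n)]
    for ii in range(0, n, t):
        for jj in range(0, n, t):
            for i in range(ii, min(ii + t, n)):
                for j in range(jj, min(jj + t, n)):
                    for k in range(n):
                        C[i][j] += A[i][k] * B[k][j]
    return C
```

Both variants compute the same result; only the loop structure (and hence the memory-access order) differs, which is exactly the kind of semantics-preserving rewrite an optimization path composes.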
Type of Medium:
Online Resource
ISSN:
0362-1340, 1558-1160
DOI:
10.1145/3393934.3278131
Language:
English
Publisher:
Association for Computing Machinery (ACM)
Publication Date:
2020