Format:
1 online resource (1 PDF (xvii, 141 pages)), illustrations (some color)
Edition:
Also available in print
ISBN:
9781681739649
Series Statement:
Synthesis Lectures on Artificial Intelligence and Machine Learning #46
Content:
1. Introduction -- 1.1. What is a graph? -- 1.2. Machine learning on graphs
2. Background and traditional approaches -- 2.1. Graph statistics and kernel methods -- 2.2. Neighborhood overlap detection -- 2.3. Graph Laplacians and spectral methods -- 2.4. Toward learned representations
part I. Node embeddings. 3. Neighborhood reconstruction methods -- 3.1. An encoder-decoder perspective -- 3.2. Factorization-based approaches -- 3.3. Random walk embeddings -- 3.4. Limitations of shallow embeddings
4. Multi-relational data and knowledge graphs -- 4.1. Reconstructing multi-relational data -- 4.2. Loss functions -- 4.3. Multi-relational decoders
part II. Graph neural networks. 5. The graph neural network model -- 5.1. Neural message passing -- 5.2. Generalized neighborhood aggregation -- 5.3. Generalized update methods -- 5.4. Edge features and multi-relational GNNs -- 5.5. Graph pooling -- 5.6. Generalized message passing
6. Graph neural networks in practice -- 6.1. Applications and loss functions -- 6.2. Efficiency concerns and node sampling -- 6.3. Parameter sharing and regularization
7. Theoretical motivations -- 7.1. GNNs and graph convolutions -- 7.2. GNNs and probabilistic graphical models -- 7.3. GNNs and graph isomorphism
part III. Generative graph models. 8. Traditional graph generation approaches -- 8.1. Overview of traditional approaches -- 8.2. Erdős-Rényi model -- 8.3. Stochastic block models -- 8.4. Preferential attachment -- 8.5. Traditional applications
9. Deep generative models -- 9.1. Variational autoencoder approaches -- 9.2. Adversarial approaches -- 9.3. Autoregressive methods -- 9.4. Evaluating graph generation -- 9.5. Molecule generation.
Summary:
Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive biases into deep learning architectures is crucial for creating systems that can learn, reason, and generalize from this kind of data. Recent years have seen a surge in research on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation. These advances in graph representation learning have led to new state-of-the-art results in numerous domains, including chemical synthesis, 3D vision, recommender systems, question answering, and social network analysis. This book provides a synthesis and overview of graph representation learning. It begins with a discussion of the goals of graph representation learning as well as key methodological foundations in graph theory and network analysis. Following this, the book introduces and reviews methods for learning node embeddings, including random-walk-based methods and applications to knowledge graphs. It then provides a technical synthesis and introduction to the highly successful graph neural network (GNN) formalism, which has become a dominant and fast-growing paradigm for deep learning with graph data. The book concludes with a synthesis of recent advancements in deep generative models for graphs--a nascent but quickly growing subset of graph representation learning.
Note:
Part of: Synthesis digital library of engineering and computer science
Includes bibliographical references (pages 131-140)
Compendex
INSPEC
Google scholar
Google book search
Also available in print.
Mode of access: World Wide Web.
System requirements: Adobe Acrobat Reader.
Additional Edition:
Also available as a print edition, ISBN 9781681739632
Additional Edition:
ISBN 9781681739656
Language:
English
Keywords:
Electronic books