Format:
1 online resource (xiii, 111 pages), illustrations
Edition:
Also available in print
ISBN:
9781627052023
Series Statement:
Synthesis lectures on artificial intelligence and machine learning # 29
Content:
While labeled data is expensive to prepare, ever-increasing amounts of unlabeled data are becoming widely available. In order to adapt to this phenomenon, several semi-supervised learning (SSL) algorithms, which learn from labeled as well as unlabeled data, have been developed. In a separate line of work, researchers have started to realize that graphs provide a natural way to represent data in a variety of domains. Graph-based SSL algorithms, which bring together these two lines of work, have been shown to outperform the state of the art in many applications in speech processing, computer vision, natural language processing, and other areas of Artificial Intelligence. Recognizing this promising and emerging area of research, this synthesis lecture focuses on graph-based SSL algorithms (e.g., label propagation methods). Our hope is that after reading this book, the reader will walk away with the following: (1) an in-depth knowledge of the current state of the art in graph-based SSL algorithms, and the ability to implement them; (2) the ability to decide on the suitability of graph-based SSL methods for a problem; and (3) familiarity with different applications where graph-based SSL methods have been successfully applied.
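To illustrate the family of methods the abstract refers to, the following is a minimal sketch of iterative label propagation on a graph (in the spirit of the transductive methods surveyed in Chapter 3); it is not the book's own implementation, and the chain-graph example data is hypothetical.

```python
import numpy as np

def label_propagation(W, Y, labeled, n_iter=100):
    """Propagate label distributions over a graph.

    W       : (n, n) symmetric nonnegative edge-weight matrix
    Y       : (n, c) seed label distributions (zero rows for unlabeled nodes)
    labeled : boolean mask of seed-labeled nodes, clamped every iteration
    """
    # Row-normalize W into a transition matrix P
    P = W / W.sum(axis=1, keepdims=True)
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = P @ F                # each node averages its neighbors' labels
        F[labeled] = Y[labeled]  # clamp the seed labels
    return F

# Chain graph 0-1-2-3; node 0 is seeded with class 0, node 3 with class 1
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.zeros((4, 2))
Y[0, 0] = 1.0
Y[3, 1] = 1.0
labeled = np.array([True, False, False, True])

F = label_propagation(W, Y, labeled)
print(F.argmax(axis=1))  # nodes near each seed inherit that seed's label
```

On this chain the propagated scores converge to (2/3, 1/3) for node 1 and (1/3, 2/3) for node 2, so the unlabeled nodes take the label of their nearer seed.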
Content:
1. Introduction -- 1.1 Unsupervised learning -- 1.2 Supervised learning -- 1.3 Semi-supervised learning (SSL) -- 1.4 Graph-based semi-supervised learning -- 1.4.1 Inductive vs. transductive SSL -- 1.5 Book organization --
Content:
2. Graph construction -- 2.1 Problem statement -- 2.2 Task-independent graph construction -- 2.2.1 K-nearest neighbor (k-NN) and ε-neighborhood methods -- 2.2.2 Graph construction using b-matching -- 2.2.3 Graph construction using local reconstruction -- 2.3 Task-dependent graph construction -- 2.3.1 Inference-driven metric learning (IDML) -- 2.3.2 Graph kernels by spectral transform -- 2.4 Conclusion --
Content:
3. Learning and inference -- 3.1 Seed supervision -- 3.2 Transductive methods -- 3.2.1 Graph cut -- 3.2.2 Gaussian random fields (GRF) -- 3.2.3 Local and global consistency (LGC) -- 3.2.4 Adsorption -- 3.2.5 Modified adsorption (MAD) -- 3.2.6 Quadratic criteria (QC) -- 3.2.7 Transduction with confidence (TACO) -- 3.2.8 Information regularization -- 3.2.9 Measure propagation -- 3.3 Inductive methods -- 3.3.1 Manifold regularization -- 3.4 Results on benchmark SSL data sets -- 3.5 Conclusions --
Content:
4. Scalability -- 4.1 Large-scale graph construction -- 4.1.1 Approximate nearest neighbor -- 4.1.2 Other methods -- 4.2 Large-scale inference -- 4.2.1 Graph partitioning -- 4.2.2 Inference -- 4.3 Scaling to large number of labels -- 4.4 Conclusions --
Content:
5. Applications -- 5.1 Text classification -- 5.2 Phone classification -- 5.3 Part-of-speech tagging -- 5.4 Class-instance acquisition -- 5.5 Knowledge base alignment -- 5.6 Conclusion --
Content:
6. Future work -- 6.1 Graph construction -- 6.2 Learning & inference -- 6.3 Scalability --
Content:
A. Notations -- B. Solving modified adsorption (MAD) objective -- C. Alternating minimization -- D. Software -- D.1. Junto label propagation toolkit -- Bibliography -- Authors' biographies -- Index
Note:
Abstract freely available; full-text restricted to subscribers or individual document purchasers
,
Includes bibliographical references (pages 97-108) and index
,
Part of: Synthesis digital library of engineering and computer science
,
Series from website
,
Also available in print.
,
Mode of access: World Wide Web.
,
System requirements: Adobe Acrobat Reader.
Additional Edition:
ISBN 9781627052016
Additional Edition:
Print version: Graph-Based Semi-Supervised Learning
Language:
English
DOI:
10.2200/S00590ED1V01Y201408AIM029