| Tag | Ind1 | Ind2 | Content |
|---|---|---|---|
| 000 | | | 06141nam a2200697 i 4500 |
| 001 | | | 6813505 |
| 003 | | | IEEE |
| 005 | | | 20200413152854.0 |
| 006 | | | m eo d |
| 007 | | | cr cn \|\|\|m\|\|\|a |
| 008 | | | 090708s2009 caua foab 001 0 eng d |
| 020 | | | _a9781598295481 (electronic bk.) |
| 020 | | | _z9781598295474 (pbk.) |
| 024 | 7 | | _a10.2200/S00196ED1V01Y200906AIM006 _2doi |
| 035 | | | _a(CaBNVSL)gtp00534961 |
| 035 | | | _a(OCoLC)428541480 |
| 040 | | | _aCaBNVSL _cCaBNVSL _dCaBNVSL |
| 050 | | 4 | _aQ325.75 _b.Z485 2009 |
| 082 | 0 | 4 | _a006.31 _222 |
| 100 | 1 | | _aZhu, Xiaojin. |
| 245 | 1 | 0 | _aIntroduction to semi-supervised learning _h[electronic resource] / _cXiaojin Zhu and Andrew B. Goldberg. |
| 260 | | | _aSan Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : _bMorgan & Claypool Publishers, _cc2009. |
| 300 | | | _a1 electronic text (xi, 116 p. : ill.) : _bdigital file. |
| 490 | 1 | | _aSynthesis lectures on artificial intelligence and machine learning, _x1939-4616 ; _v# 6 |
| 538 | | | _aMode of access: World Wide Web. |
| 538 | | | _aSystem requirements: Adobe Acrobat reader. |
| 500 | | | _aPart of: Synthesis digital library of engineering and computer science. |
| 500 | | | _aSeries from website. |
| 504 | | | _aIncludes bibliographical references (p. 95-112) and index. |
| 505 | 0 | | _aIntroduction to statistical machine learning -- The data -- Unsupervised learning -- Supervised learning -- Overview of semi-supervised learning -- Learning from both labeled and unlabeled data -- How is semi-supervised learning possible -- Inductive vs. transductive semi-supervised learning -- Caveats -- Self-training models -- Mixture models and EM -- Mixture models for supervised classification -- Mixture models for semi-supervised classification -- Optimization with the EM algorithm -- The assumptions of mixture models -- Other issues in generative models -- Cluster-then-label methods -- Co-training -- Two views of an instance -- Co-training -- The assumptions of co-training -- Multiview learning -- Graph-based semi-supervised learning -- Unlabeled data as stepping stones -- The graph -- Mincut -- Harmonic function -- Manifold regularization -- The assumption of graph-based methods -- Semi-supervised support vector machines -- Support vector machines -- Semi-supervised support vector machines -- Entropy regularization -- The assumption of S3VMs and entropy regularization -- Human semi-supervised learning -- From machine learning to cognitive science -- Study one: humans learn from unlabeled test data -- Study two: presence of human semi-supervised learning in a simple task -- Study three: absence of human semi-supervised learning in a complex task -- Discussions -- Theory and outlook -- A simple PAC bound for supervised learning -- A simple PAC bound for semi-supervised learning -- Future directions of semi-supervised learning -- Basic mathematical reference -- Semi-supervised learning software -- Symbols -- Biography. |
| 506 | 1 | | _aAbstract freely available; full-text restricted to subscribers or individual document purchasers. |
| 510 | 0 | | _aCompendex |
| 510 | 0 | | _aINSPEC |
| 510 | 0 | | _aGoogle Scholar |
| 510 | 0 | | _aGoogle Book Search |
| 520 | 3 | | _aSemi-supervised learning is a learning paradigm concerned with the study of how computers and natural systems such as humans learn in the presence of both labeled and unlabeled data. Traditionally, learning has been studied either in the unsupervised paradigm (e.g., clustering, outlier detection) where all the data is unlabeled, or in the supervised paradigm (e.g., classification, regression) where all the data is labeled. The goal of semi-supervised learning is to understand how combining labeled and unlabeled data may change the learning behavior, and design algorithms that take advantage of such a combination. Semi-supervised learning is of great interest in machine learning and data mining because it can use readily available unlabeled data to improve supervised learning tasks when the labeled data is scarce or expensive. Semi-supervised learning also shows potential as a quantitative tool to understand human category learning, where most of the input is self-evidently unlabeled. In this introductory book, we present some popular semi-supervised learning models, including self-training, mixture models, co-training and multiview learning, graph-based methods, and semi-supervised support vector machines. For each model, we discuss its basic mathematical formulation. The success of semi-supervised learning depends critically on some underlying assumptions. We emphasize the assumptions made by each model and give counterexamples when appropriate to demonstrate the limitations of the different models. In addition, we discuss semi-supervised learning for cognitive psychology. Finally, we give a computational learning theoretic perspective on semi-supervised learning, and we conclude the book with a brief discussion of open questions in the field. |
| 530 | | | _aAlso available in print. |
| 588 | | | _aTitle from PDF t.p. (viewed on July 8, 2009). |
| 650 | | 0 | _aSupervised learning (Machine learning) |
| 650 | | 0 | _aSupport vector machines. |
| 690 | | | _aSemi-supervised learning |
| 690 | | | _aTransductive learning |
| 690 | | | _aSelf-training |
| 690 | | | _aGaussian mixture model |
| 690 | | | _aExpectation maximization (EM) |
| 690 | | | _aCluster-then-label |
| 690 | | | _aCo-training |
| 690 | | | _aMultiview learning |
| 690 | | | _aMincut |
| 690 | | | _aHarmonic function |
| 690 | | | _aLabel propagation |
| 690 | | | _aManifold regularization |
| 690 | | | _aSemi-supervised support vector machines (S3VM) |
| 690 | | | _aTransductive support vector machines (TSVM) |
| 690 | | | _aEntropy regularization |
| 690 | | | _aHuman semi-supervised learning |
| 700 | 1 | | _aGoldberg, Andrew B. |
| 730 | 0 | | _aSynthesis digital library of engineering and computer science. |
| 830 | | 0 | _aSynthesis lectures on artificial intelligence and machine learning, _x1939-4616 ; _v# 6. |
| 856 | 4 | 2 | _3Abstract with links to resource _uhttp://ieeexplore.ieee.org/servlet/opac?bknumber=6813505 |
| 999 | | | _c561690 _d561690 |