An Introduction to Machine Learning

Authors
Publisher Springer, Berlin
Publication date
Number of pages 348
Format hardcover book
Language English
ISBN 9783319639123
Categories Data Mining

Book description

This textbook presents fundamental machine learning concepts in an easy-to-understand manner by providing practical advice, using straightforward examples, and offering engaging discussions of relevant applications. The main topics include Bayesian classifiers, nearest-neighbor classifiers, linear and polynomial classifiers, decision trees, neural networks, and support vector machines. Later chapters show how to combine these simple tools by way of "boosting," how to exploit them in more complicated domains, and how to deal with a range of advanced practical issues. One chapter is dedicated to the popular genetic algorithms.

This revised edition contains three entirely new chapters on critical topics regarding the pragmatic application of machine learning in industry. The chapters examine multi-label domains, unsupervised learning and its use in deep learning, and logical approaches to induction. Numerous chapters have been expanded, and the presentation of the material has been enhanced. The book contains many new exercises, numerous solved examples, thought-provoking experiments, and computer assignments for independent work.

Table of Contents

1  A Simple Machine-Learning Task
1.1  Training Sets and Classifiers
1.2  Minor Digression: Hill-Climbing Search
1.3  Hill Climbing in Machine Learning
1.4  The Induced Classifier's Performance
1.5  Some Difficulties with Available Data
1.6  Summary and Historical Remarks
1.7  Solidify Your Knowledge

2  Probabilities: Bayesian Classifiers
2.1  The Single-Attribute Case
2.2  Vectors of Discrete Attributes
2.3  Probabilities of Rare Events: Exploiting the Expert's Intuition
2.4  How to Handle Continuous Attributes
2.5  Gaussian "Bell" Function: A Standard pdf
2.6  Approximating PDFs with Sets of Gaussians
2.7  Summary and Historical Remarks
2.8  Solidify Your Knowledge

3  Similarities: Nearest-Neighbor Classifiers
3.1  The k-Nearest-Neighbor Rule
3.2  Measuring Similarity
3.3  Irrelevant Attributes and Scaling Problems
3.4  Performance Considerations
3.5  Weighted Nearest Neighbors
3.6  Removing Dangerous Examples
3.7  Removing Redundant Examples
3.8  Summary and Historical Remarks
3.9  Solidify Your Knowledge

4  Inter-Class Boundaries: Linear and Polynomial Classifiers
4.1  The Essence
4.2  The Additive Rule: Perceptron Learning
4.3  The Multiplicative Rule: WINNOW
4.4  Domains with More than Two Classes
4.5  Polynomial Classifiers
4.6  Specific Aspects of Polynomial Classifiers
4.7  Numerical Domains and Support Vector Machines
4.8  Summary and Historical Remarks
4.9  Solidify Your Knowledge

5  Artificial Neural Networks
5.1  Multilayer Perceptrons as Classifiers
5.2  Neural Network's Error
5.3  Backpropagation of Error
5.4  Special Aspects of Multilayer Perceptrons
5.5  Architectural Issues
5.6  Radial Basis Function Networks
5.7  Summary and Historical Remarks
5.8  Solidify Your Knowledge

6  Decision Trees
6.1  Decision Trees
6.2  Induction of Decision Trees
6.3  How Much Information Does an Attribute Convey?
6.4  Binary Split of a Numeric Attribute
6.5  Pruning
6.6  Converting the Decision Tree into Rules
6.7  Summary and Historical Remarks
6.8  Solidify Your Knowledge

7  Computational Learning Theory
7.1  PAC Learning
7.2  Examples of PAC Learnability
7.3  Some Practical and Theoretical Consequences
7.4  VC-Dimension and Learnability
7.5  Summary and Historical Remarks
7.6  Exercises and Thought Experiments

8  A Few Instructive Applications
8.1  Character Recognition
8.2  Oil-Spill Recognition
8.3  Sleep Classification
8.4  Brain-Computer Interface
8.5  Medical Diagnosis
8.6  Text Classification
8.7  Summary and Historical Remarks
8.8  Exercises and Thought Experiments

9  Induction of Voting Assemblies
9.1  Bagging
9.2  Schapire's Boosting
9.3  Adaboost: Practical Version of Boosting
9.4  Variations on the Boosting Theme
9.5  Cost-Saving Benefits of the Approach
9.6  Summary and Historical Remarks
9.7  Solidify Your Knowledge

10  Some Practical Aspects to Know About
10.1  A Learner's Bias
10.2  Imbalanced Training Sets
10.3  Context-Dependent Domains
10.4  Unknown Attribute Values
10.5  Attribute Selection
10.6  Miscellaneous
10.7  Summary and Historical Remarks
10.8  Solidify Your Knowledge

11  Performance Evaluation
11.1  Basic Performance Criteria
11.2  Precision and Recall
11.3  Other Ways to Measure Performance
11.4  Learning Curves and Computational Costs
11.5  Methodologies of Experimental Evaluation
11.6  Summary and Historical Remarks
11.7  Solidify Your Knowledge

12  Statistical Significance
12.1  Sampling a Population
12.2  Benefiting from the Normal Distribution
12.3  Confidence Intervals
12.4  Statistical Evaluation of a Classifier
12.5  Another Kind of Statistical Evaluation
12.6  Comparing Machine-Learning Techniques
12.7  Summary and Historical Remarks
12.8  Solidify Your Knowledge

13  Induction in Multi-Label Domains
13.1  Classical Machine Learning in Multi-Label Domains
13.2  Treating Each Class Separately: Binary Relevance
13.3  Classifier Chains
13.4  Another Possibility: Stacking
13.5  A Note on Hierarchically Ordered Classes
13.6  Aggregating the Classes
13.7  Criteria for Performance Evaluation
13.8  Summary and Historical Remarks
13.9  Solidify Your Knowledge

14  Unsupervised Learning
14.1  Cluster Analysis
14.2  A Simple Algorithm: k-Means
14.3  More Advanced Versions of k-Means
14.4  Hierarchical Aggregation
14.5  Self-Organizing Feature Maps: Introduction
14.6  Some Important Details
14.7  Why Feature Maps?
14.8  Summary and Historical Remarks
14.9  Solidify Your Knowledge

15  Classifiers in the Form of Rulesets
15.1  A Class Described By Rules
15.2  Inducing Rulesets by Sequential Covering
15.3  Predicates and Recursion
15.4  More Advanced Search Operators
15.5  Summary and Historical Remarks
15.6  Solidify Your Knowledge

16  The Genetic Algorithm
16.1  The Baseline Genetic Algorithm
16.2  Implementing the Individual Modules
16.3  Why it Works
16.4  The Danger of Premature Degeneration
16.5  Other Genetic Operators
16.6  Some Advanced Versions
16.7  Selections in k-NN Classifiers
16.8  Summary and Historical Remarks
16.9  Solidify Your Knowledge

17  Reinforcement Learning
17.1  How to Choose the Most Rewarding Action
17.2  States and Actions in a Game
17.3  The SARSA Approach
17.4  Summary and Historical Remarks
17.5  Solidify Your Knowledge

Index
