Statistical Learning with Sparsity: The Lasso and Generalizations

Authors Trevor Hastie, Robert Tibshirani, Martin Wainwright
Publisher Taylor & Francis Inc
Publication date 07/05/2015
Pages 367
Version hardback
Readership level General/trade
Language English
ISBN 9781498712163
Categories Probability & statistics
$141.48 (with VAT)
628.95 PLN / €134.85 / £117.06

Book description

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
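The coordinate descent algorithm mentioned above admits a very short implementation: each coordinate is updated by applying the soft-thresholding operator to a partial residual. The following is a minimal illustrative sketch in NumPy, not code from the book (which points readers to R packages such as glmnet); the function names and the synthetic example are the sketch's own.

```python
# Minimal sketch of the lasso fit by cyclic coordinate descent.
# Illustrative only; real software (e.g. glmnet) adds warm starts,
# convergence checks, and screening rules.
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the solution of the univariate lasso problem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Minimize (1/(2n))||y - X b||^2 + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                      # residual y - X beta (beta starts at 0)
    col_ms = (X ** 2).sum(axis=0) / n # per-column mean squares
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]    # partial residual excluding coordinate j
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_ms[j]
            r -= X[:, j] * beta[j]
    return beta

# Sparse recovery with more features than observations (p > n)
rng = np.random.default_rng(0)
n, p = 50, 100
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]      # only 3 nonzero coefficients
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat = lasso_cd(X, y, lam=0.1)
print(np.sum(np.abs(beta_hat) > 1e-6))  # number of nonzero coefficients
```

Even though p exceeds n, the ℓ1 penalty drives most coefficients exactly to zero, so the estimate is sparse and interpretable, which is the central theme of the book.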

In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.

"The authors study and analyze methods using the sparsity property of some statistical models in order to recover the underlying signal in a dataset. They focus on the Lasso technique as an alternative to the standard least-squares method."
-Zentralblatt MATH 1319


"The book includes all the major branches of statistical learning. For each topic, the authors first give a concise introduction of the basic problem, evaluate conventional methods, pointing out their deficiencies, and then introduce a method based on sparsity. Thus, the book has the potential to be the standard textbook on the topic."
-Anand Panangadan, California State University, Fullerton

"It always first discusses regularized models based on equations, followed by example applications, before ending with a bibliography section detailing the historical development of the given method. Software recommendations (mostly open source R packages) are typically provided either in the main part or bibliography section of each chapter. And each chapter concludes with a set of selected exercises meant to deepen the gained knowledge on the given subject, which of course is of great help for teachers of statistics. For these reasons, we congratulate the authors of Statistical Learning with Sparsity and recommend the book to all statistically-inclined readers from intermediate to expert levels. In addition, it is worth pointing out that even for non-statisticians, the book is able to demonstrate, based on numerous real-world examples, the power of regularization."
-Ivan Kondofersky and Fabian J. Theis, Institute for Computational Biology


Table of contents

Introduction

The Lasso for Linear Models

Introduction

The Lasso Estimator

Cross-Validation and Inference

Computation of the Lasso Solution

Degrees of Freedom

Uniqueness of the Lasso Solutions

A Glimpse at the Theory

The Nonnegative Garrote

ℓq Penalties and Bayes Estimates

Some Perspective

Generalized Linear Models

Introduction

Logistic Regression

Multiclass Logistic Regression

Log-Linear Models and the Poisson GLM

Cox Proportional Hazards Models

Support Vector Machines

Computational Details and glmnet

Generalizations of the Lasso Penalty

Introduction

The Elastic Net

The Group Lasso

Sparse Additive Models and the Group Lasso

The Fused Lasso

Nonconvex Penalties

Optimization Methods

Introduction

Convex Optimality Conditions

Gradient Descent

Coordinate Descent

A Simulation Study

Least Angle Regression

Alternating Direction Method of Multipliers

Minorization-Maximization Algorithms

Biconvexity and Alternating Minimization

Screening Rules

Statistical Inference

The Bayesian Lasso

The Bootstrap

Post-Selection Inference for the Lasso

Inference via a Debiased Lasso

Other Proposals for Post-Selection Inference

Matrix Decompositions, Approximations, and Completion

Introduction

The Singular Value Decomposition

Missing Data and Matrix Completion

Reduced-Rank Regression

A General Matrix Regression Framework

Penalized Matrix Decomposition

Additive Matrix Decomposition

Sparse Multivariate Methods

Introduction

Sparse Principal Components Analysis

Sparse Canonical Correlation Analysis

Sparse Linear Discriminant Analysis

Sparse Clustering

Graphs and Model Selection

Introduction

Basics of Graphical Models

Graph Selection via Penalized Likelihood

Graph Selection via Conditional Inference

Graphical Models with Hidden Variables

Signal Approximation and Compressed Sensing

Introduction

Signals and Sparse Representations

Random Projection and Approximation

Equivalence between ℓ0 and ℓ1 Recovery

Theoretical Results for the Lasso

Introduction

Bounds on Lasso ℓ2-Error

Bounds on Prediction Error

Support Recovery in Linear Regression

Beyond the Basic Lasso

Bibliography

Author Index

Index
Bibliographic Notes and Exercises appear at the end of each chapter.
