Statistical Learning Theory and Stochastic Optimization [electronic resource] : École d'Été de Probabilités de Saint-Flour XXXI - 2001 /
Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
Main Authors: | Catoni, Olivier., Picard, Jean. (editor) |
---|---|
Format: | Text (library) |
Language: | eng |
Published: | Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2004 |
Subjects: | Mathematics., Artificial intelligence., Information theory., Numerical analysis., Mathematical optimization., Probabilities., Statistics., Probability Theory and Stochastic Processes., Statistical Theory and Methods., Optimization., Artificial Intelligence (incl. Robotics)., Information and Communication, Circuits., Numerical Analysis. |
Online Access: | http://dx.doi.org/10.1007/b99352 |