000 03615nam a22003735i 4500
001 279392
003 MX-SnUAN
005 20160429153944.0
007 cr nn 008mamaa
008 150903s2009 xxu| o |||| 0|eng d
020 _a9780387848167
_99780387848167
024 7 _a10.1007/9780387848167
_2doi
035 _avtls000333065
039 9 _a201509030223
_bVLOAD
_c201404122337
_dVLOAD
_c201404092117
_dVLOAD
_y201402041101
_zstaff
040 _aMX-SnUAN
_bspa
_cMX-SnUAN
_erda
050 4 _aQA75.5-76.95
100 1 _aEmmert-Streib, Frank.
_eeditor.
_9303690
245 1 0 _aInformation Theory and Statistical Learning /
_cedited by Frank Emmert-Streib, Matthias Dehmer.
264 1 _aBoston, MA :
_bSpringer US,
_c2009.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
500 _aSpringer eBooks
505 0 _aAlgorithmic Probability: Theory and Applications -- Model Selection and Testing by the MDL Principle -- Normalized Information Distance -- The Application of Data Compression-Based Distances to Biological Sequences -- MIC: Mutual Information Based Hierarchical Clustering -- A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information -- Information Approach to Blind Source Separation and Deconvolution -- Causality in Time Series: Its Detection and Quantification by Means of Information Theory -- Information Theoretic Learning and Kernel Methods -- Information-Theoretic Causal Power -- Information Flows in Complex Networks -- Models of Information Processing in the Sensorimotor Loop -- Information Divergence Geometry and the Application to Statistical Machine Learning -- Model Selection and Information Criterion -- Extreme Physical Information as a Principle of Universal Stability -- Entropy and Cloning Methods for Combinatorial Optimization, Sampling and Counting Using the Gibbs Sampler.
520 _aInformation Theory and Statistical Learning presents theoretical and practical results on information-theoretic methods used in the context of statistical learning. The book offers a comprehensive overview of the wide range of methods that have been developed in a multitude of contexts, with each chapter written by an expert in the field. It is intended for an interdisciplinary readership working in machine learning, applied statistics, artificial intelligence, biostatistics, computational biology, bioinformatics, web mining, or related disciplines. Advance Praise for Information Theory and Statistical Learning: "A new epoch has arrived for information sciences to integrate various disciplines such as information theory, machine learning, statistical inference, data mining, model selection etc. I am enthusiastic about recommending the present book to researchers and students, because it summarizes most of these new emerging subjects and methods, which are otherwise scattered in many places." -- Shun-ichi Amari, RIKEN Brain Science Institute, Professor-Emeritus at the University of Tokyo
590 _aAccess from outside UANL requires a remote-access password.
700 1 _aDehmer, Matthias.
_eeditor.
_9303691
710 2 _aSpringerLink (Online service)
_9299170
776 0 8 _iPrint edition:
_z9780387848150
856 4 0 _uhttp://remoto.dgb.uanl.mx/login?url=http://dx.doi.org/10.1007/978-0-387-84816-7
_zConnect to Springer E-Books (External access requires prior authentication at Biblioteca Digital UANL)
942 _c14
999 _c279392
_d279392