INFORMATION THEORY, INFERENCE AND LEARNING ALGORITHMS

This textbook offers comprehensive coverage of Shannon's theory of information as well as the theory of neural networks and probabilistic data modeling. Shannon's source coding theorem and noisy-channel coding theorem are explained and proved. Accompanying these theoretical results are descriptions of practical data compression systems, including the Huffman coding algorithm and the less well-known arithmetic coding algorithm. The treatment of neural networks is approached from two perspectives: on the one hand, the information-theoretic capabilities of some neural network algorithms are examined; on the other, neural networks are motivated as statistical models. With many examples and exercises, this book is ideal as a course text for students, or as a resource for researchers who need to work with neural networks or state-of-the-art error-correcting codes.
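To illustrate the Huffman coding algorithm mentioned in the description, here is a minimal sketch in Python (this example is not from the book itself; the function name and tie-breaking details are illustrative choices):

```python
import heapq

def huffman_code(text):
    """Build a prefix-free binary code from symbol frequencies (Huffman's algorithm)."""
    freq = {}
    for sym in text:
        freq[sym] = freq.get(sym, 0) + 1
    # Heap entries are (weight, tiebreak, tree); a tree is either a symbol
    # or a (left, right) pair. The tiebreak integer avoids comparing trees.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol alphabet
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lightest subtrees into one node.
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    # Walk the finished tree: 0 for the left branch, 1 for the right.
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

For example, `huffman_code("abracadabra")` assigns the shortest codeword to the most frequent symbol `a`, and no codeword is a prefix of another.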
Publisher: CAMBRIDGE UNIVERSITY PRESS
ISBN: 0521642981
ISBN13: 9780521642989
Edition: 1st Edition - 2002
Pages: 550
Binding: HARDCOVER
Price: R$ 461.20, or 4 interest-free installments of R$ 115.30