An Introduction to Neural Data Compression

Price: $134.43 USD
The goal of data compression is to reduce the number of bits needed to represent useful information. Neural compression, also known as learned compression, is the application of neural networks and related machine learning techniques to this task. Recent advances in statistical machine learning have opened up new possibilities for data compression, allowing compression algorithms to be learned end-to-end from data using powerful generative models such as normalizing flows, variational autoencoders, diffusion probabilistic models, and generative adversarial networks.

This monograph serves as an entry point for machine learning researchers interested in compression. It reviews the necessary background in information theory (e.g., entropy coding, rate-distortion theory) and computer vision (e.g., image quality assessment, perceptual metrics), and provides a curated guide through the essential ideas and methods in the literature thus far. Rather than surveying the vast literature, it focuses on the essential concepts and methods in neural compression, with a reader in mind who is versed in machine learning but not necessarily data compression.

Authors: Yibo Yang, Stephan Mandt, Lucas Theis
Publisher: Now Publishers
Published: 04/25/2023
Pages: 100
Binding Type: Paperback
Weight: 0.34 lbs
Size: 9.21 x 6.14 x 0.21 inches (H x W x D)
ISBN: 9781638281740