Improving Latent Quantization of Learned Image Compression with Gradient Scaling

Heming Sun*, Lu Yu, Jiro Katto

*Corresponding author for this work

Research output: Conference contribution

Abstract

Learned image compression (LIC) has shown superior compression ability. Quantization is an inevitable stage that generates the quantized latent for entropy coding. To solve the non-differentiability of quantization in the training phase, many differentiable approximate quantization methods have been proposed. However, in most previous methods, the derivative of the quantized latent with respect to the non-quantized latent is set to one. As a result, the quantization error between the non-quantized and quantized latent is not taken into account in gradient descent. To address this issue, we exploit gradient scaling to scale the gradient of the non-quantized latent in back-propagation. Experimental results show that our method outperforms recent LIC quantization methods.
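The contrast the abstract draws can be sketched in NumPy: a standard straight-through estimator (STE) passes the gradient through rounding unchanged, while a gradient-scaling variant modulates that gradient using the quantization error. The linear rule `1 + alpha * |error|` below is an illustrative assumption, not the paper's actual scaling function.

```python
import numpy as np

def quantize_ste(y):
    """Straight-through quantization: round in the forward pass,
    pretend the derivative is 1 in the backward pass."""
    y_hat = np.round(y)
    grad = np.ones_like(y)  # d(y_hat)/d(y) approximated as 1 everywhere
    return y_hat, grad

def quantize_gradient_scaled(y, alpha=1.0):
    """Hypothetical gradient-scaling variant: same forward rounding,
    but the pass-through gradient grows with the quantization error.
    The scaling rule here is an assumption for illustration only."""
    y_hat = np.round(y)
    err = np.abs(y_hat - y)      # per-element quantization error in [0, 0.5]
    grad = 1.0 + alpha * err     # assumed rule: larger error -> larger gradient
    return y_hat, grad
```

With this rule, latent elements that sit far from an integer (large quantization error) receive a larger gradient, so the optimizer is pushed to reduce exactly the error that STE ignores.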

Original language: English
Host publication title: 2022 IEEE International Conference on Visual Communications and Image Processing, VCIP 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9781665475921
DOI
Publication status: Published - 2022
Event: 2022 IEEE International Conference on Visual Communications and Image Processing, VCIP 2022 - Suzhou, China
Duration: 13 Dec 2022 - 16 Dec 2022

Publication series

Name: 2022 IEEE International Conference on Visual Communications and Image Processing, VCIP 2022

Conference

Conference: 2022 IEEE International Conference on Visual Communications and Image Processing, VCIP 2022
Country/Territory: China
City: Suzhou
Period: 22/12/13 - 22/12/16

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Signal Processing
