Communications of Discrete Information Models with Best 1:1 Mean Codeword Lengths and Suitable Codes

Authors

  • Retneer Sharma, Om Parkash, Rakesh Kumar, Vikramjeet Singh

DOI:

https://doi.org/10.17762/msea.v71i4.1245

Abstract

The well-recognized relevance of discrete information models to countless information processing systems underscores their practical importance. Attainable codeword lengths and their lower bounds likewise remain an active subject of research. Coding theory, in turn, deals with the variety of codes available in the information theory literature, including uniquely decipherable codes, instantaneous codes, possible codes, suitable codes and the best 1:1 codes. The primary objective of this study is to extend the literature on these codes and to illustrate the interplay between discrete entropic, divergence and inaccuracy models. The foremost intention of our communication is to establish the correspondence between information theoretic entropic models and the best 1:1 code, yielding fruitful results. Additionally, we discuss the contributions of discrete divergence and inaccuracy models to the development of suitable codes.
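As a brief illustrative sketch (not drawn from the paper itself), the mean codeword length of the best 1:1 binary code can be computed under the standard construction that assigns the i-th most probable symbol the i-th shortest nonempty binary string, whose length is ⌊log₂(i+1)⌋. The function names and the example distribution below are our own assumptions for illustration.

```python
import math


def best_one_to_one_length(probs):
    """Mean codeword length of the best 1:1 (non-singular) binary code.

    Assumes the standard construction: sort probabilities in decreasing
    order and give the i-th most probable symbol the i-th shortest
    nonempty binary string, which has length floor(log2(i + 1)).
    """
    p_sorted = sorted(probs, reverse=True)
    return sum(p * math.floor(math.log2(i + 1))
               for i, p in enumerate(p_sorted, start=1))


def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


if __name__ == "__main__":
    p = [0.5, 0.25, 0.125, 0.125]
    print("H(P)       =", entropy(p))                 # 1.75 bits
    print("L_{1:1}(P) =", best_one_to_one_length(p))  # 1.25 bits
```

In this example the best 1:1 mean codeword length falls below the entropy, which is possible only because a 1:1 code need not be uniquely decipherable, unlike the instantaneous codes bounded below by H(P).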

Published

2022-11-23

How to Cite

Retneer Sharma, Om Parkash, Rakesh Kumar, Vikramjeet Singh. (2022). Communications of Discrete Information Models with Best 1:1 Mean Codeword Lengths and Suitable Codes. Mathematical Statistician and Engineering Applications, 71(4), 6572–6596. https://doi.org/10.17762/msea.v71i4.1245

Issue

Vol. 71 No. 4 (2022)

Section

Articles