{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,10]],"date-time":"2026-03-10T21:54:17Z","timestamp":1773179657140,"version":"3.50.1"},"reference-count":74,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2021,2,5]],"date-time":"2021-02-05T00:00:00Z","timestamp":1612483200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) Through Gallager\u2019s E0 functions (with and without cost constraints); (2) large deviations form, in terms of conditional relative entropy and mutual information; (3) through the \u03b1-mutual information and the Augustin\u2013Csisz\u00e1r mutual information of order \u03b1 derived from the R\u00e9nyi divergence. While a fairly complete picture has emerged in the absence of cost constraints, there have remained gaps in the interrelationships between the three approaches in the general case of cost-constrained encoding. Furthermore, no systematic approach has been proposed to solve the attendant optimization problems by exploiting the specific structure of the information functions. This paper closes those gaps and proposes a simple method to maximize Augustin\u2013Csisz\u00e1r mutual information of order \u03b1 under cost constraints by means of the maximization of the \u03b1-mutual information subject to an exponential average constraint.<\/jats:p>","DOI":"10.3390\/e23020199","type":"journal-article","created":{"date-parts":[[2021,2,7]],"date-time":"2021-02-07T14:04:13Z","timestamp":1612706653000},"page":"199","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":19,"title":["Error Exponents and \u03b1-Mutual Information"],"prefix":"10.3390","volume":"23","author":[{"given":"Sergio","family":"Verd\u00fa","sequence":"first","affiliation":[{"name":"Independent Researcher, Princeton, NJ 08540, USA"}]}],"member":"1968","published-online":{"date-parts":[[2021,2,5]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A Mathematical Theory of Communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"60","DOI":"10.1002\/j.1538-7305.1950.tb00933.x","article-title":"Communication in the Presence of Noise\u2013Probability of Error for Two Encoding Schemes","volume":"29","author":"Rice","year":"1950","journal-title":"Bell Syst. Tech. J."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"611","DOI":"10.1002\/j.1538-7305.1959.tb03905.x","article-title":"Probability of Error for Optimal Codes in a Gaussian Channel","volume":"38","author":"Shannon","year":"1959","journal-title":"Bell Syst. Tech. J."},{"key":"ref_4","first-page":"37","article-title":"Coding for Noisy Channels","volume":"4","author":"Elias","year":"1955","journal-title":"IRE Conv. Rec."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"13","DOI":"10.1109\/TIT.1955.1055131","article-title":"Error Bounds in Noisy Channels without Memory","volume":"1","author":"Feinstein","year":"1955","journal-title":"IRE Trans. Inf. Theory"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"6","DOI":"10.1016\/S0019-9958(57)90039-6","article-title":"Certain Results in Coding Theory for Noisy Channels","volume":"1","author":"Shannon","year":"1957","journal-title":"Inf. Control"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Fano, R.M. (1961). Transmission of Information, Wiley.","DOI":"10.1063\/1.3057290"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1109\/TIT.1965.1053730","article-title":"A Simple Derivation of the Coding Theorem and Some Applications","volume":"11","author":"Gallager","year":"1965","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_9","unstructured":"Gallager, R.G. (1968). Information Theory and Reliable Communication, Wiley."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"65","DOI":"10.1016\/S0019-9958(67)90052-6","article-title":"Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels, I","volume":"10","author":"Shannon","year":"1967","journal-title":"Inf. Control"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"522","DOI":"10.1016\/S0019-9958(67)91200-4","article-title":"Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels, II","volume":"10","author":"Shannon","year":"1967","journal-title":"Inf. Control"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"270","DOI":"10.1137\/1107027","article-title":"Asymptotic Estimates of the Error Probability for Transmission of Messages over a Discrete Memoryless Communication Channel with a Symmetric Transition Probability Matrix","volume":"7","author":"Dobrushin","year":"1962","journal-title":"Theory Probab. Appl."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"208","DOI":"10.1137\/1107020","article-title":"Optimal Binary Codes for Low Rates of Information Transmission","volume":"7","author":"Dobrushin","year":"1962","journal-title":"Theory Probab. Appl."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1214\/aoms\/1177729694","article-title":"On Information and Sufficiency","volume":"22","author":"Kullback","year":"1951","journal-title":"Ann. Math. Stat."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"5","DOI":"10.1109\/TIT.1981.1056281","article-title":"Graph Decomposition: A New Key to Coding Theorems","volume":"27","year":"1981","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"2568","DOI":"10.1109\/TIT.2002.800480","article-title":"Random codes: Minimum Distances and Error Exponents","volume":"48","author":"Barg","year":"2002","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1561\/0100000009","article-title":"Performance Analysis of Linear Codes under Maximum-likelihood Decoding: A Tutorial","volume":"3","author":"Sason","year":"2006","journal-title":"Found. Trends Commun. Inf. Theory"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"1945","DOI":"10.1109\/18.868471","article-title":"A New Upper Bound on the Reliability Function of the Gaussian Channel","volume":"46","author":"Ashikhmin","year":"2000","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1561\/0100000008","article-title":"Reliability Criteria in Information Theory and in Statistical Hypothesis Testing","volume":"4","author":"Haroutunian","year":"2007","journal-title":"Found. Trends Commun. Inf. Theory"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"4449","DOI":"10.1109\/TIT.2014.2322033","article-title":"Expurgated Random-coding Ensembles: Exponents, Refinements, and Connections","volume":"60","author":"Scarlett","year":"2014","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Somekh-Baruch, A., Scarlett, J., and Guill\u00e9n i F\u00e0bregas, A. (2019, January 7\u201312). A Recursive Cost-Constrained Construction that Attains the Expurgated Exponent. Proceedings of the 2019 IEEE International Symposium on Information Theory, Paris, France.","DOI":"10.1109\/ISIT.2019.8849522"},{"key":"ref_22","first-page":"29","article-title":"Estimates of the Exponent of the Error Probability for a Semicontinuous Memoryless Channel","volume":"4","author":"Haroutunian","year":"1968","journal-title":"Probl. Inf. Transm."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"405","DOI":"10.1109\/TIT.1974.1055254","article-title":"Hypothesis Testing and Information Theory","volume":"20","author":"Blahut","year":"1974","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_24","unstructured":"Csisz\u00e1r, I., and K\u00f6rner, J. (1981). Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic."},{"key":"ref_25","unstructured":"Neyman, J. (1961). On Measures of Information and Entropy. Berkeley Symposium on Mathematical Statistics and Probability, University of California Press."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"423","DOI":"10.1016\/S0019-9958(65)90332-3","article-title":"A Coding Theorem and R\u00e9nyi\u2019s Entropy","volume":"8","author":"Campbell","year":"1965","journal-title":"Inf. Control"},{"key":"ref_27","unstructured":"Arimoto, S. (1975). Information Measures and Capacity of Order \u03b1 for Discrete Memoryless Channels. Topics in Information Theory, Bolyai."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"4","DOI":"10.1109\/TIT.2017.2757496","article-title":"Arimoto-R\u00e9nyi conditional entropy and Bayesian M-ary hypothesis testing","volume":"64","author":"Sason","year":"2018","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_29","unstructured":"Fano, R.M. (1953). Class Notes for Course 6.574: Statistical Theory of Information, Massachusetts Institute of Technology."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"191","DOI":"10.1007\/BF02018661","article-title":"A Class of Measures of Informativity of Observation Channels","volume":"2","year":"1972","journal-title":"Period. Mat. Hung."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"149","DOI":"10.1007\/BF00537520","article-title":"Information Radius","volume":"14","author":"Sibson","year":"1969","journal-title":"Z. Wahrscheinlichkeitstheorie Und Verw. Geb."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1109\/18.370121","article-title":"Generalized Cutoff Rates and R\u00e9nyi\u2019s Information Measures","volume":"41","year":"1995","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"665","DOI":"10.1109\/TIT.1976.1055640","article-title":"Computation of Random Coding Exponent Functions","volume":"22","author":"Arimoto","year":"1976","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"1515","DOI":"10.1109\/LSP.2020.3018661","article-title":"Chebyshev Center Computation on Probability Simplex with \u03b1-divergence Measure","volume":"27","author":"Candan","year":"2020","journal-title":"IEEE Signal Process. Lett."},{"key":"ref_35","first-page":"9","article-title":"Random Coding Bounds for Discrete Memoryless Channels","volume":"18","author":"Poltyrev","year":"1982","journal-title":"Probl. Inf. Transm."},{"key":"ref_36","unstructured":"Augustin, U. (1978). Noisy Channels. [Ph.D. Thesis, Universit\u00e4t Erlangen-N\u00fcrnberg]."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1064","DOI":"10.1109\/TIT.2017.2776900","article-title":"Operational Interpretation of R\u00e9nyi Information Measures via Composite Hypothesis Testing against Product and Markov Distributions","volume":"64","author":"Tomamichel","year":"2018","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_38","unstructured":"Polyanskiy, Y., and Verd\u00fa, S. (October, January 29). Arimoto Channel Coding Converse and R\u00e9nyi Divergence. Proceedings of the 48th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Shayevitz, O. (August, January 31). On R\u00e9nyi Measures and Hypothesis Testing. Proceedings of the 2011 IEEE International Symposium on Information Theory, St. Petersburg, Russia.","DOI":"10.1109\/ISIT.2011.6034266"},{"key":"ref_40","unstructured":"Verd\u00fa, S. (2015, January 1\u20136). \u03b1-Mutual Information. Proceedings of the 2015 Information Theory and Applications Workshop (ITA), San Diego, CA, USA."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Ho, S.W., and Verd\u00fa, S. (2015, January 15\u201319). Convexity\/Concavity of R\u00e9nyi Entropy and \u03b1-Mutual Information. Proceedings of the 2015 IEEE International Symposium on Information Theory, Hong Kong, China.","DOI":"10.1109\/ISIT.2015.7282554"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"841","DOI":"10.1109\/TIT.2018.2861002","article-title":"The R\u00e9nyi Capacity and Center","volume":"65","author":"Nakiboglu","year":"2019","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_43","unstructured":"Nakiboglu, B. (2018). The Augustin Capacity and Center. arXiv."},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Dalai, M. (2017). Some Remarks on Classical and Classical-Quantum Sphere Packing Bounds: R\u00e9nyi vs. Kullback\u2013Leibler. Entropy, 19.","DOI":"10.3390\/e19070355"},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"Cai, C., and Verd\u00fa, S. (2019). Conditional R\u00e9nyi Divergence Saddlepoint and the Maximization of \u03b1-Mutual Information. Entropy, 21.","DOI":"10.3390\/e21100969"},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"V\u00e1zquez-Vilar, G., Martinez, A., and Guill\u00e9n i F\u00e0bregas, A. (2015, January 15\u201319). A Derivation of the Cost-constrained Sphere-Packing Exponent. Proceedings of the 2015 IEEE International Symposium on Information Theory, Hong Kong, China.","DOI":"10.1109\/ISIT.2015.7282591"},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"1449","DOI":"10.1109\/18.21284","article-title":"Capacity and Error Exponent for the Direct Detection Photon Channel","volume":"34","author":"Wyner","year":"1988","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Csisz\u00e1r, I., and K\u00f6rner, J. (2011). Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge University Press. [2nd ed.].","DOI":"10.1017\/CBO9780511921889"},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"441","DOI":"10.1007\/BF02024507","article-title":"On Measures of Dependence","volume":"10","year":"1959","journal-title":"Acta Math. Hung."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"3797","DOI":"10.1109\/TIT.2014.2320500","article-title":"R\u00e9nyi Divergence and Kullback-Leibler Divergence","volume":"60","year":"2014","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"1474","DOI":"10.1109\/TIT.2003.810633","article-title":"Information Projections Revisited","volume":"49","year":"2003","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_52","first-page":"299","article-title":"Information-type Measures of Difference of Probability Distributions and Indirect Observations","volume":"2","year":"1967","journal-title":"Stud. Sci. Math. Hung."},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"816","DOI":"10.1109\/TIT.2018.2882547","article-title":"The Sphere Packing Bound via Augustin\u2019s Method","volume":"65","author":"Nakiboglu","year":"2019","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_54","doi-asserted-by":"crossref","first-page":"299","DOI":"10.1134\/S003294601904001X","article-title":"The Augustin Capacity and Center","volume":"55","author":"Nakiboglu","year":"2019","journal-title":"Probl. Inf. Transm."},{"key":"ref_55","doi-asserted-by":"crossref","unstructured":"V\u00e1zquez-Vilar, G. (2019). Error Probability Bounds for Gaussian Channels under Maximal and Average Power Constraints. arXiv.","DOI":"10.1109\/ISIT.2019.8849543"},{"key":"ref_56","first-page":"1","article-title":"Geometrische Deutung einiger Ergebnisse bei der Berechnung der Kanalkapazit\u00e4t","volume":"10","author":"Shannon","year":"1957","journal-title":"Nachrichtentechnische Z."},{"key":"ref_57","doi-asserted-by":"crossref","first-page":"1147","DOI":"10.1109\/18.335960","article-title":"A General Formula for Channel Capacity","volume":"40","author":"Han","year":"1994","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"101","DOI":"10.1016\/1385-7258(74)90000-6","article-title":"On the Shannon Capacity of an Arbitrary Channel","volume":"77","author":"Kemperman","year":"1974","journal-title":"K. Ned. Akad. Van Wet. Indag. Math."},{"key":"ref_59","unstructured":"Aubin, J.P. (1979). Mathematical Methods of Game and Economic Theory, North-Holland."},{"key":"ref_60","unstructured":"Luenberger, D.G. (1969). Optimization by Vector Space Methods, Wiley."},{"key":"ref_61","doi-asserted-by":"crossref","first-page":"1147","DOI":"10.1109\/TIT.2003.810631","article-title":"To Code, or Not to Code: Lossy Source\u2013Channel Communication Revisited","volume":"49","author":"Gastpar","year":"2003","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_62","doi-asserted-by":"crossref","first-page":"357","DOI":"10.1109\/TIT.1973.1055007","article-title":"On the Converse to the Coding Theorem for Discrete Memoryless Channels","volume":"19","author":"Arimoto","year":"1973","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"23","DOI":"10.1109\/TIT.2015.2504100","article-title":"On the R\u00e9nyi Divergence, Joint Range of Relative Entropies, Measures and a Channel Coding Theorem","volume":"62","author":"Sason","year":"2016","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_64","first-page":"5603","article-title":"Constant Compositions in the Sphere Packing Bound for Classical-quantum Channels","volume":"63","author":"Dalai","year":"2017","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_65","doi-asserted-by":"crossref","first-page":"201","DOI":"10.1134\/S0032946020030011","article-title":"The Sphere Packing Bound for Memoryless Channels","volume":"56","author":"Nakiboglu","year":"2020","journal-title":"Probl. Inf. Transm."},{"key":"ref_66","doi-asserted-by":"crossref","first-page":"8027","DOI":"10.1109\/TIT.2013.2283794","article-title":"Lower Bounds on the Probability of Error for Classical and Classical-quantum Channels","volume":"59","author":"Dalai","year":"2013","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_67","doi-asserted-by":"crossref","first-page":"8","DOI":"10.1109\/TIT.1956.1056798","article-title":"The Zero Error Capacity of a Noisy Channel","volume":"2","author":"Shannon","year":"1956","journal-title":"IRE Trans. Inf. Theory"},{"key":"ref_68","doi-asserted-by":"crossref","first-page":"259","DOI":"10.1109\/18.272494","article-title":"Relations Between Entropy and Error Probability","volume":"40","author":"Feder","year":"1994","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_69","doi-asserted-by":"crossref","first-page":"152","DOI":"10.1109\/TCOM.1979.1094267","article-title":"Signal Design for the Amplitude-limited Gaussian Channel by Error Bound Optimization","volume":"27","author":"Einarsson","year":"1979","journal-title":"IEEE Trans. Commun."},{"key":"ref_70","doi-asserted-by":"crossref","first-page":"4","DOI":"10.1109\/18.481773","article-title":"Bits through Queues","volume":"42","author":"Anantharam","year":"1996","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_71","first-page":"86","article-title":"The Exponential Distribution in Information Theory","volume":"32","year":"1996","journal-title":"Probl. Inf. Transm."},{"key":"ref_72","doi-asserted-by":"crossref","first-page":"1681","DOI":"10.1109\/TIT.2002.1003846","article-title":"On the Reliability Exponent of the Exponential Timing Channel","volume":"48","author":"Arikan","year":"1996","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_73","doi-asserted-by":"crossref","first-page":"2307","DOI":"10.1109\/TIT.2010.2043769","article-title":"Channel Coding Rate in the Finite Blocklength Regime","volume":"56","author":"Polyanskiy","year":"2010","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_74","unstructured":"Royden, H.L., and Fitzpatrick, P. (2010). Real Analysis, Prentice Hall. [4th ed.]."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/23\/2\/199\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T05:20:19Z","timestamp":1760160019000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/23\/2\/199"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,2,5]]},"references-count":74,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2021,2]]}},"alternative-id":["e23020199"],"URL":"https:\/\/doi.org\/10.3390\/e23020199","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,2,5]]}}}