Belavkin, Roman V. (2012)
Publisher: Springer
Languages: English
Types: Article
Subjects: Mathematics - Optimization and Control, Mathematical Physics, Statistics - Machine Learning, Computer Science - Computational Complexity, Computer Science - Information Theory, Mathematics - Functional Analysis
Abstract: We study optimal solutions to an abstract optimization problem for measures, which is a generalization of classical variational problems in information theory and statistical physics. In the classical problems, information and relative entropy are defined using the Kullback-Leibler divergence, and for this reason optimal measures belong to a one-parameter exponential family. Measures within such a family have the property of mutual absolute continuity. Here we show that this property characterizes other families of optimal positive measures if a functional representing information has a strictly convex dual. Mutual absolute continuity of optimal probability measures allows us to strictly separate deterministic and non-deterministic Markov transition kernels, which play an important role in theories of decisions, estimation, control, communication and computation. We show that deterministic transitions are strictly sub-optimal, unless the information resource with a strictly convex dual is unconstrained. For illustration, we construct an example where, unlike non-deterministic kernels, any deterministic kernel either has negatively infinite expected utility (unbounded expected error) or communicates infinite information.
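
For orientation, here is a minimal sketch of the classical variational problem that the abstract generalizes, assuming the standard Kullback-Leibler setup; the notation (utility u, prior measure q, multiplier beta) is illustrative and is not taken from the paper itself:

\[
\max_{p}\ \mathbb{E}_{p}[u]
\quad\text{subject to}\quad
D_{\mathrm{KL}}(p\,\|\,q)=\int \ln\!\frac{dp}{dq}\,dp \;\le\; \lambda .
\]

Solving with a Lagrange multiplier \(\beta \ge 0\) gives the one-parameter exponential (Gibbs) family

\[
dp_{\beta}=\frac{e^{\beta u}}{Z(\beta)}\,dq,
\qquad
Z(\beta)=\int e^{\beta u}\,dq .
\]

Because \(e^{\beta u}>0\), every member with finite \(\beta\) is mutually absolutely continuous with \(q\), and hence with every other member of the family; a deterministic (Dirac) measure arises only in the limit \(\beta\to\infty\), i.e. when the information constraint is removed. This is consistent with, though much narrower than, the paper's claim that deterministic kernels are strictly sub-optimal whenever the information resource is constrained.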