28–29 May 2026
Granada, Spain
Europe/Madrid timezone

Connections between information measures and activation functions

Not scheduled
20m

Spring Meeting

Speaker

Giulia Pisano (Università degli Studi di Salerno)

Description

In this work, we introduce a generalized measure of uncertainty, the cumulative information $\psi$-measure, in order to provide a unified perspective on the uncertainty framework. It is a variability measure that reduces to several well-known information measures for appropriate choices of the function $\psi$. In particular, cumulative versions of the Shannon and Tsallis entropies, Gini’s dispersion indices, cumulative information generating functions, and additional new classes of uncertainty measures arise as special cases. We also investigate a suitable relative version, referred to as the relative cumulative information $\psi$-measure. We establish fundamental properties of the proposed measures, including monotonicity properties, bounds, and a set of related inequalities, and we derive covariance-based and quantile-based representations. Illustrative examples clarify the behaviour of the measures under different model assumptions and choices of $\psi$. A notable feature of the proposed approach is the natural connection between the function $\psi$ and certain increasing activation functions commonly employed in neural networks. For instance, the function $\psi$ associated with the cumulative paired Shannon entropy can be derived from the sigmoid activation function, and conversely. Motivated by this link, we introduce the generalized logistic-linear family of activation functions, which includes classical models such as the sigmoid and linear activations as particular cases within a parametric structure. Finally, we develop an application motivated by a neural network scenario to demonstrate the flexibility and novelty of the proposed framework. The results highlight the potential of the cumulative information $\psi$-measure as a bridge between information theory and modern neural networks.
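The sigmoid link mentioned above can be sketched numerically. The following is only an illustrative sketch (the function names and the exponential example are our own, not taken from the submitted work): it takes $\psi(u) = -u \log u - (1-u) \log(1-u)$, so that integrating $\psi(F(x))$ yields the cumulative paired Shannon entropy, and checks that the derivative $\psi'(u) = \log\bigl((1-u)/u\bigr)$ is inverted by a sigmoid-type map.

```python
import numpy as np

def psi_paired_shannon(u):
    """psi(u) = -u log u - (1 - u) log(1 - u), the binary (paired) Shannon entropy."""
    u = np.clip(u, 1e-12, 1 - 1e-12)  # guard against log(0)
    return -u * np.log(u) - (1 - u) * np.log(1 - u)

def cumulative_psi_measure(cdf, a, b, n=200_000):
    """Trapezoidal approximation of the integral of psi(F(x)) over [a, b]."""
    x = np.linspace(a, b, n)
    y = psi_paired_shannon(cdf(x))
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Example: standard exponential distribution, F(x) = 1 - exp(-x).
# Its cumulative paired Shannon entropy is (pi^2/6 - 1) + 1 = pi^2/6.
F = lambda x: 1.0 - np.exp(-x)
cpe = cumulative_psi_measure(F, 0.0, 50.0)  # close to pi^2 / 6 ~ 1.6449

# The sigmoid link: psi'(u) = log((1 - u) / u), so u is recovered by
# the sigmoid-type map u = 1 / (1 + exp(psi'(u))).
u, eps = 0.3, 1e-6
dpsi = (psi_paired_shannon(u + eps) - psi_paired_shannon(u - eps)) / (2 * eps)
u_back = 1.0 / (1.0 + np.exp(dpsi))  # recovers u = 0.3
```

Here the exponential case splits into the cumulative entropy part $-\int F \log F \, dx = \pi^2/6 - 1$ and the cumulative residual entropy part $-\int \bar{F} \log \bar{F} \, dx = 1$, so the numerical value can be checked against $\pi^2/6$.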

References:
M. Capaldo, A. Di Crescenzo, and A. Meoli, “Cumulative information generating function and generalized Gini functions,” Metrika, vol. 87, pp. 775–803, 2024. DOI: 10.1007/s00184-023-00931-3.

M. Capaldo, A. Di Crescenzo, and G. Pisano, “Information measures and activation functions,” submitted, 2026.

A. Di Crescenzo and M. Longobardi, “On cumulative entropies,” Journal of Statistical Planning and Inference, vol. 139, no. 12, pp. 4072–4087, 2009. DOI: 10.1016/j.jspi.2009.05.038.

M. Rao, Y. Chen, B. C. Vemuri, and F. Wang, “Cumulative residual entropy: A new measure of information,” IEEE Transactions on Information Theory, vol. 50, no. 6, pp. 1220–1228, 2004. DOI: 10.1109/TIT.2004.828057.

Primary authors

Prof. Antonio Di Crescenzo (Università degli Studi di Salerno)
Giulia Pisano (Università degli Studi di Salerno)
Dr Marco Capaldo (RWTH Aachen University)

Presentation materials

There are no materials yet.