Hartley function

The Hartley function is a measure of uncertainty introduced by Ralph Hartley in 1928. If a sample is picked uniformly at random from a finite set A, the information revealed once the outcome is known is given by the Hartley function

$$H_0(A) = \log_b |A|,$$

where $|A|$ denotes the cardinality of A.

If the base of the logarithm is 2, then the unit of uncertainty is the shannon (more commonly known as the bit). If it is the natural logarithm, then the unit is the nat. Hartley used a base-ten logarithm, and with this base the unit of information is called the hartley (also known as the ban or dit) in his honor. The Hartley function itself is also known as the Hartley entropy or max-entropy.
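A minimal Python sketch of this definition may help; the function name `hartley` is our own illustrative choice, and the `base` argument selects among the three units just mentioned:

```python
import math

def hartley(cardinality: int, base: float = 2.0) -> float:
    """Hartley function H_0(A) = log_b |A| for a finite set of the given size.

    base=2 yields shannons (bits), base=math.e yields nats,
    and base=10 yields hartleys (bans/dits).
    """
    if cardinality < 1:
        raise ValueError("cardinality must be a positive integer")
    return math.log(cardinality, base)

print(hartley(16, 2))       # 4.0 shannons: 16 = 2**4
print(hartley(16, math.e))  # ≈ 2.77 nats
print(hartley(16, 10))      # ≈ 1.20 hartleys
```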

Hartley function, Shannon entropy, and Rényi entropy

The Hartley function coincides with the Shannon entropy (as well as with the Rényi entropies of all orders) in the case of a uniform probability distribution. It is a special case of the Rényi entropy since:

$$H_0(X) = \frac{1}{1-0} \log \sum_{i=1}^{|\mathcal{X}|} p_i^0 = \log |\mathcal{X}|.$$

But it can also be viewed as a primitive construction, since, as emphasized by Kolmogorov and Rényi, the Hartley function can be defined without introducing any notions of probability (see Uncertainty and information by George J. Klir, p. 423).
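For a uniform distribution this coincidence is easy to verify numerically. The following self-contained Python sketch does so; the helper names `shannon_entropy` and `renyi_entropy` are our own, not a library API:

```python
import math

def shannon_entropy(p, base=2.0):
    """Shannon entropy -sum p_i log_b p_i; zero-probability terms contribute 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def renyi_entropy(p, alpha, base=2.0):
    """Rényi entropy (1/(1-alpha)) log_b sum p_i**alpha, for alpha != 1."""
    return math.log(sum(pi**alpha for pi in p), base) / (1.0 - alpha)

n = 8
uniform = [1.0 / n] * n

print(shannon_entropy(uniform))           # ≈ 3.0, i.e. log2(8)
for alpha in (0.0, 0.5, 2.0):
    print(renyi_entropy(uniform, alpha))  # each ≈ 3.0 as well
```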

Characterization of the Hartley function

The Hartley function only depends on the number of elements in a set, and hence can be viewed as a function on natural numbers. Rényi showed that the Hartley function in base 2 is the only function mapping natural numbers to real numbers that satisfies

  1. $H(mn) = H(m) + H(n)$ (additivity)
  2. $H(m) \leq H(m+1)$ (monotonicity)
  3. $H(2) = 1$ (normalization)

Condition 1 says that the uncertainty of the Cartesian product of two finite sets A and B is the sum of uncertainties of A and B. Condition 2 says that a larger set has larger uncertainty.
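The snippet below is a finite numerical spot-check of these three conditions for $H(n) = \log_2 n$ on a small range; it is a sanity check under our own choice of test range, not a proof:

```python
import math

H = math.log2  # the base-2 Hartley function on natural numbers

for m in range(1, 50):
    for n in range(1, 50):
        assert math.isclose(H(m * n), H(m) + H(n))  # additivity
    assert H(m) <= H(m + 1)                         # monotonicity
assert H(2) == 1.0                                  # normalization
print("all three conditions hold on the tested range")
```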

Derivation of the Hartley function

We want to show that the Hartley function, $\log_2 n$, is the only function mapping natural numbers to real numbers that satisfies

  1. $H(mn) = H(m) + H(n)$ (additivity)
  2. $H(m) \leq H(m+1)$ (monotonicity)
  3. $H(2) = 1$ (normalization)

Let $f$ be a function on positive integers that satisfies the above three properties. From the additive property, we can show that for any positive integers $n$ and $k$,

$$f(n^k) = k f(n).$$

Let $a$, $b$, and $t$ be positive integers with $a \geq 2$. There is a unique integer $s$ determined by

$$a^s \leq b^t < a^{s+1}. \qquad (1)$$
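As an aside, this bracketing exponent $s$ is easy to compute exactly with integer arithmetic; the sketch below is illustrative, and the function name `bracketing_exponent` is our own:

```python
def bracketing_exponent(a: int, b: int, t: int) -> int:
    """Return the unique integer s with a**s <= b**t < a**(s+1).

    Exact integer arithmetic avoids floating-point rounding at the
    boundaries. Assumes a >= 2 and b, t >= 1.
    """
    target = b**t
    s = 0
    while a ** (s + 1) <= target:
        s += 1
    return s

# Example: for a = 2, b = 3, t = 5 we have 3**5 = 243,
# and 2**7 = 128 <= 243 < 256 = 2**8, so s = 7.
print(bracketing_exponent(2, 3, 5))  # 7
```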

Therefore, taking base-2 logarithms of (1),

$$s \log_2 a \leq t \log_2 b \leq (s+1) \log_2 a,$$

and

$$\frac{s}{t} \leq \frac{\log_2 b}{\log_2 a} \leq \frac{s+1}{t}.$$

On the other hand, by monotonicity,

$$f(a^s) \leq f(b^t) \leq f(a^{s+1}).$$

Applying the identity $f(n^k) = k f(n)$, one gets

$$s f(a) \leq t f(b) \leq (s+1) f(a),$$

and

$$\frac{s}{t} \leq \frac{f(b)}{f(a)} \leq \frac{s+1}{t}.$$

Hence, since both $\frac{f(b)}{f(a)}$ and $\frac{\log_2 b}{\log_2 a}$ lie in the interval $\left[\frac{s}{t}, \frac{s+1}{t}\right]$, whose length is $\frac{1}{t}$,

$$\left| \frac{f(b)}{f(a)} - \frac{\log_2 b}{\log_2 a} \right| \leq \frac{1}{t}.$$

Since $t$ can be arbitrarily large, the difference on the left-hand side of the above inequality must be zero,

$$\frac{f(b)}{f(a)} = \frac{\log_2 b}{\log_2 a}.$$

So,

$$f(a) = \mu \log_2 a$$

for some constant $\mu$, which must equal 1 by the normalization property; hence $f(n) = \log_2 n$, as claimed.
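To make the squeeze concrete, here is a small self-contained Python check (an illustration of the argument, not part of the proof) showing the bounds $s/t$ and $(s+1)/t$ closing in on $\log_2 b / \log_2 a$ as $t$ grows, for the sample choice $a = 2$, $b = 3$:

```python
import math

a, b = 2, 3
ratio = math.log2(b) / math.log2(a)  # the quantity being squeezed, log2(3) here

for t in (1, 10, 100, 1000):
    # unique s with a**s <= b**t < a**(s+1), computed exactly
    target = b**t
    s = 0
    while a ** (s + 1) <= target:
        s += 1
    print(f"t={t:5d}  {s/t:.6f} <= {ratio:.6f} <= {(s+1)/t:.6f}")
```

The printed lower and upper bounds differ by exactly $1/t$, so they pin down the ratio to any desired precision as $t$ increases.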

See also

Rényi entropy
Min-entropy

References

This article incorporates material from "Hartley function" on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.

This article incorporates material from "Derivation of Hartley function" on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.
