If $X=\{x_1,x_2,\dots,x_n\}$ are assigned probabilities $p(x_i)$, then the entropy is defined as
$$\sum_{i=1}^n p(x_i)\cdot(-\log p(x_i)).$$
One may call $I(x_i)=-\log p(x_i)$ the information associated with $x_i$ and consider the above an expectation value. In some systems it makes sense to view $p$ as the rate of occurrence of $x_i$, and then a low $p(x_i)$ corresponds to a large "value of your surprise" $I(x_i)$ whenever $x_i$ happens. It's also worth noting that if $p$ is a constant function, we get a Boltzmann-like situation.
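As a minimal numeric sketch (my own illustration, not part of the original question), the expectation reading of the entropy, and the Boltzmann-like constant case, look like this in Python:

```python
import math

def surprisal(p):
    """Information I(x) = -log p(x) of an outcome with probability p, in nats."""
    return -math.log(p)

def entropy(probs):
    """Shannon entropy as the expectation of the surprisal: sum of p_i * (-log p_i)."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# Constant ("Boltzmann-like") case: p_i = 1/n for all i, so the entropy reduces to log n.
n = 4
uniform = [1.0 / n] * n
print(entropy(uniform))  # equals log(4)
```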
Question: Now I wonder, given $|X|>1$, how I can interpret, for a fixed index $j$, a single term $p(x_j)\cdot(-\log p(x_j))$. What does this "$x_j$-th contribution to the entropy" or "price" represent? What is $p\cdot\log(p)$ if there are also other probabilities?

Thoughts: It's zero if $p$ is one or zero. In the first case, the surprise of something that will occur with certainty is zero, and in the second case it will never occur and hence costs nothing. Now
$$(-p\cdot\log(p))' = \log\frac{1}{p} - 1.$$
With respect to $p$, the function $-p\cdot\log(p)$ has a maximum which, oddly, is at the same time a fixed point, namely $\tfrac{1}{e}=0.368\dots$. That is to say, the maximal contribution of a single term $p(x_j)\cdot(-\log p(x_j))$ to the entropy will arise if for some $x_j$ you have $p(x_j)\approx 37\%$.
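This maximum, and the fixed-point coincidence $-\tfrac{1}{e}\log\tfrac{1}{e}=\tfrac{1}{e}$, are easy to check numerically (again just an illustration of mine):

```python
import math

def h(p):
    """A single entropy term h(p) = -p * log(p), natural log."""
    return -p * math.log(p)

# Scan a fine grid of probabilities and find where the contribution peaks.
ps = [i / 100000 for i in range(1, 100000)]
p_star = max(ps, key=h)

print(p_star)          # close to 1/e ≈ 0.36788
print(h(1 / math.e))   # the peak value is itself 1/e: the fixed point
```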
My question arose when someone asked me what the meaning of $x^x$ having a minimum at $x_0=\tfrac{1}{e}$ is. This is naturally $e^{x\log(x)}$, and I gave an example about signal transfer. The extremum is the individual contribution with maximal entropy, and I wanted to argue that, after optimization of the encoding/minimization of the entropy, events that happen with a probability $p(x_j)\approx 37\%$ of the time will in total be the "most boring for you to send": they occur relatively often, and the optimal length of their encoding might not be too short. But I lack an interpretation of the individual entropy contribution, to see if this idea makes sense, or what a better reading of it is.
It also relates to those information units, e.g. the nat. One over $e$ is the extremal point whether you work base $e$ (with the natural log) or with $\log_2$, and $-\log_2(\tfrac{1}{e})=\tfrac{1}{\ln(2)}$.
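The base-independence of the extremal point comes down to $-p\log_2(p) = \tfrac{-p\ln(p)}{\ln(2)}$: changing units from nats to bits only rescales the curve by a constant factor, so the location of the peak is unchanged. A quick check (my sketch, not from the original post):

```python
import math

p = 1 / math.e
# Contribution of one term at p = 1/e, in nats and in bits.
nats = -p * math.log(p)    # equals 1/e
bits = -p * math.log2(p)   # equals 1/(e * ln 2), i.e. nats / ln(2)

print(nats, bits, nats / math.log(2))
```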
edit: Related: I just stumbled upon $\tfrac{1}{e}$ as a probability: the 37% stopping rule.
This post imported from StackExchange Physics at 2015-02-15 11:55 (UTC), posted by SE-user NikolajK