The main criticism: The maximum entropy principle works (i.e., gives a
correct description of a physical system) if and only if
(i) the knowledge of the observer is of a very special kind, consisting
precisely of the expectation values of at least those
extensive quantities that are important for a thermodynamic description
of the system in question, and
(ii) the prior is chosen correctly, consistent with the well-known principles of statistical mechanics.
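
For reference, the standard maximum entropy recipe (a textbook summary, with the prior made explicit as weights $q_n$): one maximizes
$$S = -k_B \sum_n p_n \ln\frac{p_n}{q_n}$$
subject to $\sum_n p_n = 1$ and $\sum_n p_n X_j(n) = \bar X_j$ for the chosen observables $X_j$, which yields the generalized Gibbs state
$$p_n = Z^{-1} q_n\, e^{-\sum_j \lambda_j X_j(n)}, \qquad Z = \sum_n q_n\, e^{-\sum_j \lambda_j X_j(n)}.$$
Note that both the prior $q_n$ and the list of observables $X_j$ enter as inputs.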
If one gets the prior wrong (e.g., forgets correct Boltzmann counting), the entropy of mixing doesn't come out correctly, even though everything else is done as usual.
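
Concretely, this is the Gibbs paradox (the standard computation, assuming a classical monatomic ideal gas with thermal de Broglie wavelength $\lambda$): without the $1/N!$ of correct Boltzmann counting, one finds $S(N,V) = Nk_B[\ln(V/\lambda^3) + 3/2]$, so removing a partition between two identical samples (each with $N$ particles at the same $V$ and $T$) produces a spurious entropy of mixing
$$\Delta S = S(2N, 2V) - 2\,S(N, V) = 2Nk_B\ln 2 > 0,$$
whereas the correctly counted (Sackur-Tetrode) entropy $S(N,V) = Nk_B[\ln(V/(N\lambda^3)) + 5/2]$ gives $\Delta S = 0$, as it must for identical gases.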
If one gets the set of macroobservables
wrong - e.g., $H^2$ in place of $H$, or only the total energy when in
fact a spatially resolved energy density is required for an
adequate (nonequilibrium) description - then one gets a meaningless
theory, inconsistent with observation.
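
To see why in the first example (a one-line check, assuming two noninteracting subsystems with $H = H_1 + H_2$): constraining $\langle H^2\rangle$ instead of $\langle H\rangle$ yields
$$\rho = Z^{-1} e^{-\lambda H^2} = Z^{-1} e^{-\lambda (H_1 + H_2)^2} \neq Z_1^{-1} e^{-\lambda H_1^2} \otimes Z_2^{-1} e^{-\lambda H_2^2},$$
because of the cross term $H_1 H_2$ in the exponent; independent subsystems come out spuriously correlated, while the canonical $e^{-\beta H}$ factorizes as it should.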
Thus, essentially, Jaynes uses as input what should be a result -
namely the correct set of relevant variables and the correct prior to use.
It is a "derivation" that presupposes the facts; indeed,
it was not presented until almost a century after the birth of statistical
mechanics.
I discuss the shortcomings of Jaynes' approach to statistical mechanics
(and a number of related problems)
in Section 10.7 of my book
Classical and Quantum Mechanics via Lie algebras,
and in various articles in my
theoretical physics FAQ:
- (in Chapter A3)
  - What about the subjective interpretation of probabilities?
  - Incomplete knowledge and statistics
  - Entropy and knowledge
  - The role of the ergodic hypothesis
- (in Chapter A6)
  - Entropy and missing information
  - Ignorance in statistical mechanics