You can use Boltzmann's H-theorem to compute the entropy increase or decrease with time.
Suppose you have a collision term $C(f)$ for the probability distribution function $f$, which obeys e.g. $\frac{Df}{Dt} = C(f)$. Let $s = \ln f$ (it generates the entropy density); multiply the collision term by $s$ and integrate over the phase space $\Sigma$. Then you have ($\langle\,\cdot\,\rangle$ denotes the average)
$$\frac{D\langle s\rangle}{Dt} = \int_\Sigma d\sigma\, s\, C(f). \tag{*}$$
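For the spatially homogeneous case, where $Df/Dt = \partial f/\partial t$, here is a minimal sketch of how (*) comes about, assuming the average is the one weighted by $f$ (so $\langle s\rangle = \int_\Sigma d\sigma\, f \ln f$) and that collisions conserve the particle number, $\int_\Sigma d\sigma\, C(f) = 0$:
$$\frac{D\langle s\rangle}{Dt} = \int_\Sigma d\sigma\, \frac{\partial}{\partial t}\big(f \ln f\big) = \int_\Sigma d\sigma\, (1 + \ln f)\, C(f) = \int_\Sigma d\sigma\, s\, C(f),$$
where the extra $1$ drops out because of particle-number conservation.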
Suppose that there exists an equilibrium probability density $f_0$ with $C(f_0) = 0$ (no entropy production). The distribution function depends on all the phase-space variables $x_i$. You can then expand the non-equilibrium probability density in terms of the equilibrium density by the series expansion
$$f = \Big(1 + \big(\langle x_i\rangle - \langle x_i\rangle_0\big)\,\frac{\partial}{\partial x_i} + \big(\langle x_i x_j\rangle - \langle x_j\rangle\langle x_i\rangle_0 - \langle x_j\rangle_0\langle x_i\rangle - \langle x_i x_j\rangle_0\big)\,\frac{\partial^2}{\partial x_i\,\partial x_j} + \dots\Big) f_0$$
(summation convention is used).
Here $\langle\,\cdot\,\rangle_0$ denotes averaging with the equilibrium distribution function; these equilibrium moments are known as well. You can convince yourself that this expansion holds by taking various moments of it and using the fact that the integral of a total derivative vanishes.
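For instance, a minimal check of the zeroth moment, assuming both densities are normalized to one and vanish at the boundary of $\Sigma$: integrating the expansion over phase space, every derivative term is the integral of a total derivative and therefore vanishes, so
$$\int_\Sigma d\sigma\, f = \int_\Sigma d\sigma\, f_0 = 1,$$
i.e. the normalization is preserved order by order.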
Substituting this expansion into equation (*) gives you an expansion of the entropy production rate in the nonequilibrium moments $\langle x_i\rangle, \langle x_i x_j\rangle, \dots$ That gives a connection between the entropy change and other quantities that are easy to measure.
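As a toy numerical illustration of that connection (not part of the argument above: it assumes a BGK-type relaxation collision term $C(f) = -(f - f_0)/\tau$ in a single phase-space variable and a Gaussian equilibrium density), you can evaluate the right-hand side of (*) on a grid and see how it depends on the measurable nonequilibrium first moment $\langle x\rangle - \langle x\rangle_0$:

```python
import numpy as np

# Toy 1D "phase space" grid
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gaussian(x, mean, var):
    return np.exp(-(x - mean) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

f0 = gaussian(x, mean=0.0, var=1.0)   # equilibrium density, so C(f0) = 0
tau = 1.0                             # relaxation time of the assumed BGK model

for mean in (0.0, 0.2, 0.5, 1.0):
    f = gaussian(x, mean, var=1.0)    # displaced non-equilibrium density
    C = -(f - f0) / tau               # assumed collision term C(f)
    s = np.log(f)                     # s = ln f
    rate = np.sum(s * C) * dx         # right-hand side of (*)
    dm = np.sum(x * (f - f0)) * dx    # nonequilibrium first moment <x> - <x>_0
    print(f"<x>-<x>_0 = {dm:4.2f}   D<s>/Dt = {rate: .4f}   -dm^2/(2 tau) = {-dm**2 / (2.0 * tau): .4f}")
```

For this particular toy model the last two columns coincide: the rate is $-(\langle x\rangle - \langle x\rangle_0)^2/(2\tau)$, i.e. the entropy production is quadratic in a directly measurable nonequilibrium moment, which is the kind of relation the expansion above produces in general.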