For asymptotically flat spaces, a black hole can never be in thermal equilibrium with the rest of spacetime. If it were, the rest of spacetime would be filled with radiation at the Hawking temperature, and the resulting nonzero stress-energy tensor everywhere is inconsistent with the assumption of asymptotic flatness.
So, assuming spacetime sufficiently far away from the black hole is vacuumlike (i.e. no cosmic microwave background radiation), we have a temperature gradient ranging from the Hawking temperature right outside the event horizon to zero very far away.
But once you accept a temperature gradient, the temperature of the black hole no longer tells us anything about the temperature of spacetime far away from it. The Hawking temperature may be as high as the Planck temperature (for a hole near the Planck mass), but far away, the temperature still approaches zero.
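For concreteness, the standard Schwarzschild result (in ordinary units) is
$$T_H = \frac{\hbar c^3}{8\pi G M k_B} \approx 6\times10^{-8}\,\mathrm{K}\;\frac{M_\odot}{M},$$
so $T_H$ only becomes Planckian as $M$ shrinks toward the Planck mass; for a macroscopic hole the temperature is tiny even right outside the horizon, and it falls toward zero far away in any case.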
AdS black holes are an entirely different matter. I know this question isn't about AdS black holes, but they're instructive because for them we can have thermal equilibrium everywhere. The reason is a warp factor for clock rates (a gravitational redshift) which rescales the temperature as measured by a local observer relative to the temperature as measured by a distant observer. So, even at thermal equilibrium, the local temperature falls off exponentially in proper distance beyond the AdS radius as we move away from the black hole, and any backreaction remains finite because of this warp factor.
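A minimal quantitative sketch, assuming the Tolman redshift law in global AdS$_4$ with $f(r) = 1 + r^2/L^2$ outside the hole ($L$ being the AdS radius; the black hole's mass term is neglected at large $r$): the locally measured temperature satisfies $T_{\mathrm{loc}}(r)\sqrt{f(r)} = T_0$, so
$$T_{\mathrm{loc}}(r) = \frac{T_0}{\sqrt{1+r^2/L^2}} = \frac{T_0}{\cosh(\rho/L)} \approx 2\,T_0\,e^{-\rho/L} \quad (\rho \gg L),$$
where $\rho = L\,\mathrm{arcsinh}(r/L)$ is the proper distance from the center. The exponential falloff in proper distance is what keeps the total thermal energy, and hence the backreaction, finite.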
There are two cases to consider here:
- the black hole radius is smaller than the AdS radius
- the black hole radius is larger than the AdS radius
For the latter case, as the size of the black hole goes up, so does the temperature, with no upper limit. This means the black hole has a positive heat capacity and can remain in stable thermal equilibrium with its environment.
For the former case, the temperature goes down as the size of the black hole goes up, giving a negative heat capacity. So, even if we start off with an "equilibrium" state, it is unstable. If the black hole expands, it becomes cooler than its environment, so it absorbs a net amount of radiation and expands even further, past the AdS radius, after which its temperature starts to rise again and it eventually settles into the stable thermal state of the latter case. (If it shrinks instead, it becomes hotter than its environment, radiates faster than it absorbs, and evaporates away entirely.)
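The U-shape can be made explicit with a short worked formula, assuming AdS$_4$-Schwarzschild with metric function $f(r) = 1 - 2GM/r + r^2/L^2$ and horizon radius $r_+$ (defined by $f(r_+) = 0$):
$$T(r_+) = \frac{f'(r_+)}{4\pi} = \frac{1}{4\pi}\left(\frac{1}{r_+} + \frac{3r_+}{L^2}\right),$$
which falls like the flat-space result $1/(4\pi r_+)$ for $r_+ \ll L$ and rises like $3r_+/(4\pi L^2)$ for $r_+ \gg L$, with a minimum $T_{\min} = \sqrt{3}/(2\pi L)$ at $r_+ = L/\sqrt{3}$.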
With this U-shaped temperature relation as a function of black hole size, we can see it's not possible to have black hole temperatures below the AdS scale, $k \sim 1/L$. Instead, below that temperature there is a phase transition (the Hawking–Page transition) to a thermal state with no black holes.