One can pinpoint the technical error in LQG explicitly:

To recall, the starting point of LQG is to encode the Riemannian metric in terms of the parallel transport of the affine connection that it induces. This parallel transport is an assignment, to each smooth curve in the manifold between points \(x\) and \(y\), of a linear isomorphism \(T_x X \to T_y X\) between the tangent spaces over these points.
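In symbols (the notation here is mine, not a fixed convention of the LQG literature): writing \(\nabla\) for the affine connection on the manifold \(X\), parallel transport assigns

\[ \big(\gamma \colon x \longrightarrow y\big) \;\longmapsto\; \Big( \mathrm{tra}_\nabla(\gamma) \,\colon\, T_x X \xrightarrow{\;\simeq\;} T_y X \Big) \,, \]

where \(\mathrm{tra}_\nabla(\gamma)\) is obtained by solving the ordinary differential equation \(\nabla_{\dot\gamma} v = 0\) for vector fields \(v\) along \(\gamma\).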

This assignment is itself smooth, as a function on the smooth space of smooth curves, suitably defined. Moreover, it satisfies the evident functoriality conditions, in that it respects composition of paths and identity paths.
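Concretely (still in the illustrative notation introduced above), the functoriality conditions say that for composable smooth paths \(\gamma_1 \colon x \to y\) and \(\gamma_2 \colon y \to z\):

\[ \mathrm{tra}_\nabla(\gamma_2 \circ \gamma_1) \;=\; \mathrm{tra}_\nabla(\gamma_2) \circ \mathrm{tra}_\nabla(\gamma_1) \,, \qquad \mathrm{tra}_\nabla(\mathrm{id}_x) \;=\; \mathrm{id}_{T_x X} \,. \]

In other words, parallel transport is a functor from a path groupoid of \(X\) to the groupoid of vector spaces and linear isomorphisms.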

It is a theorem that smooth (affine) connections on smooth manifolds are indeed equivalent to such smooth functorial assignments of parallel transport isomorphisms to smooth curves. This theorem goes back to Barrett, who considered the case that all paths are loops. The general case is discussed in arxiv.org/abs/0705.0452, following a suggestion by John Baez.

So far so good. The idea of LQG is now to use this equivalence to equivalently regard the configuration space of gravity as a space of parallel transport/holonomy assignments to paths (in particular to loops, whence the name "LQG").

But in the next step of the LQG program, the smoothness condition on these parallel transport assignments is dropped. Instead, what is considered are general functions from paths to group elements, which are required neither to be smooth nor even to be continuous, hence are plain set-theoretic functions. In the LQG literature these assignments are called "generalized connections". It is this space of "generalized connections" which is then quantized.
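Schematically (this shorthand is mine, not uniform notation in the literature), the contrast is between

\[ \mathcal{A} \;\simeq\; \mathrm{Hom}_{\mathrm{smooth}}\big(\mathcal{P}_1(X),\, \mathbf{B}G\big) \qquad \text{and} \qquad \overline{\mathcal{A}} \;=\; \mathrm{Hom}_{\mathrm{bare}}\big(\mathcal{P}_1(X),\, \mathbf{B}G\big) \,, \]

where \(\mathcal{P}_1(X)\) denotes a path groupoid of \(X\), \(\mathbf{B}G\) the one-object groupoid of the structure group \(G\), and the subscripts indicate smooth functors versus arbitrary set-theoretic functors. The smoothness constraint on the left is exactly what ties the assignment back to an actual Lie-algebra-valued 1-form on \(X\); on the right it is discarded.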

The trouble is that there is no relation left between "generalized connections" and the actual (smooth) affine connections of Riemannian geometry. The passage from smooth to "generalized" connections is an ad hoc step that is not justified by any established rule of quantization. It effectively changes the nature of the system that is being quantized.

Removing the smoothness and even the continuity condition on the assignment of parallel transport to paths loses all contact with how the points in the original spacetime manifold "cohere", as it were, smoothly or even continuously. The passage to "generalized connections" amounts to regarding spacetime as just a dust of disconnected points.

Much of the apparent discretization subsequently found in the LQG quantization is but an artifact of this dustification. Since it is unclear what the "generalized connections" have to do with actual Riemannian geometry (and implausible that they have much to do with it at all), it is little surprise that a key problem LQG faces is to recover smooth spacetime geometry in some limit of the resulting quantization. This is due to the dustification of spacetime that happens even before quantization is applied.

When we were discussing this problem a few years back, awareness grew in the LQG community that the step to "generalized connections" is far from being part of a "conservative quantization", as it used to be advertised. As a result, some members of the community started to investigate the effect of applying analogous non-standard steps to the quantization of very simple physical systems, for which the correct quantization is well understood. For instance, when applied to the free particle, one obtains the same non-separable Hilbert spaces that appear in LQG, and which are not part of any (other) quantization scheme. Ashtekar tried to make sense of this in terms of a concept he called "shadow states" ( https://arxiv.org/abs/gr-qc/0207106 ). But the examples considered only seemed to show how very different this shadowy world is from anything seen elsewhere.

Some authors argued that it is all right to radically change the rules of quantization when it comes to gravity, since after all gravity is special. That may be true. But what is troubling is that there is little to no motivation for the non-standard step from actual connections to "generalized connections" beyond the fact that it admits a naive quantization.