
  Can black holes be created on a miniature scale?

+ 4 like - 0 dislike
4441 views

A black hole is so powerful that it sucks everything into itself. So is it possible that mini black holes can be created? If not, then we could have actively disproved the rumors spread during the LHC experiment.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user funtime
asked Mar 19, 2012 in Phenomenology by funtime (20 points) [ no revision ]
To create a mini black hole (of about a Planck mass) you would need a linear accelerator roughly the size of a galaxy, so these stupid rumors were ridiculous ;-) ...

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Dilaton
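To put a rough number on that estimate (assuming an accelerating gradient of order 100 MV/m, comparable to what proposed linear colliders aim for; the point is only the order of magnitude):

```python
# Back-of-envelope: how long a linear accelerator would need to be to reach
# the Planck energy, assuming a (generous) accelerating gradient of ~100 MV/m.

PLANCK_ENERGY_GEV = 1.22e19          # Planck energy ~ 1.22e19 GeV
GRADIENT_EV_PER_M = 100e6            # assumed gradient: 100 MV/m = 1e8 eV/m

planck_energy_ev = PLANCK_ENERGY_GEV * 1e9
length_m = planck_energy_ev / GRADIENT_EV_PER_M          # metres of accelerator

LIGHT_YEAR_M = 9.46e15
MILKY_WAY_DIAMETER_M = 1e5 * LIGHT_YEAR_M                # ~100,000 light years

print(f"required length: {length_m:.1e} m")
print(f"               = {length_m / LIGHT_YEAR_M:.1e} light years")
print(f"               ~ {length_m / MILKY_WAY_DIAMETER_M:.1f} Milky Way diameters")
```

This comes out at tens of thousands of light years of accelerator, i.e. a galactic-scale machine.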
Every rumor of danger from a man-made accelerator is trivially disproved by noting that the cosmic ray flux would have triggered any available disaster long, long ago. That said, quantum black holes are expected to dissipate by the Hawking process faster than they accumulate mass in terrestrial environments, so they would not represent a disaster and are not ruled out by the anthropic argument about cosmic rays.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user dmckee
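The evaporation-versus-accretion point can be made quantitative with the semiclassical lifetime $t \sim 5120\,\pi\, G^2 M^3 / (\hbar c^4)$. The sketch below (only indicative near the Planck mass, where the semiclassical formula breaks down; the 10 TeV mass is a hypothetical accelerator-scale example) shows such a hole would be gone long before it could even cross a proton, let alone accrete anything.

```python
import math

# Semiclassical Hawking evaporation time t ~ 5120*pi*G^2*M^3/(hbar*c^4),
# compared with the time to cross a proton at the speed of light.

G    = 6.674e-11      # m^3 kg^-1 s^-2
HBAR = 1.055e-34      # J s
C    = 2.998e8        # m/s
EV_TO_KG = 1.783e-36  # mass of 1 eV/c^2 in kg

def hawking_lifetime(mass_kg):
    """Semiclassical evaporation time in seconds."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

planck_mass_kg = 2.18e-8             # ~1.2e19 GeV
lhc_scale_kg   = 10e12 * EV_TO_KG    # a hypothetical ~10 TeV black hole

print(f"Planck-mass hole : {hawking_lifetime(planck_mass_kg):.1e} s")
print(f"10 TeV hole      : {hawking_lifetime(lhc_scale_kg):.1e} s")
print(f"time to cross a proton at c: {1e-15 / C:.1e} s")
```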
The notion that a black hole is "so powerful [as] to suck everything into itself" is itself a major misconception. Gravity remains what it is everywhere else, and black holes do not reach out to "suck in" everything out there.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Randall Schulz

2 Answers

+ 6 like - 0 dislike

The rumors that you could make black holes in particle accelerators are due to an obviously impossible model proposed by Arkani-Hamed, Dimopoulos, and Dvali, also supported by Lisa Randall and Raman Sundrum, which became very popular and highly cited due to an unfortunate spate of wishful thinking and delusional model building among phenomenologically minded string theorists in the last decade. This idea is called "large extra dimensions", and the rise and fall of large extra dimensions was physics' equivalent of the Iraqi WMDs: indefensible groupthink which biased a political community. A political community is what string theory had been forced to become, just to scrape by through the dark ages of the 1980s and 1990s.

During these dark ages, despite a large amount of resistance, string theory underwent a scientific revolution with hardly any precedent, a revolution so extensive it can only be compared to Galileo and Copernicus. The conceptual framework of strings was extended to more general structures: the D-branes of Polchinski and collaborators, and the orbifolds of Dixon and others (all this work was intensely communal and collaborative, and it is difficult to pick out individuals--- the names are just handles into the literature). These structures acquired a full interpretation with Susskind's explicit recognition of the central place that 't Hooft's holographic principle plays in string theory, and the holographic principle itself revived, clarified, and modernized the Chew/Mandelstam S-matrix program, which flourished in the 1960s but was lost in the dark ages.

These two ideas were related, in that the holographic principle explained the mysterious structure of the world-volume theories, while the D-branes provided new examples of objects with world-volume theories to describe. Within a few short years, M-theory, Matrix theory, and AdS/CFT solved the central unsolved problem of physics, which, by all rights, we should never have been able to solve: they gave a complete, consistent, non-perturbative description of quantum mechanical model universes with General Relativistic gravity. It is no longer correct to say that we do not know how to unify quantum mechanics and General Relativity--- we know how, for certain AdS spaces, for 11-dimensional Matrix theory in at least some domains, and for many other cases where the universe is asymptotically cold (meaning not surrounded by a thermal horizon, as de Sitter space is). Unfortunately our own universe is not in this class, at least not yet: it looks like it is headed towards a de Sitter phase, and we know it was de Sitter in the past.

Anyway, the solution to the cold-universe problem of quantum gravity solved the in-principle problem of a theory of everything correctly and persuasively. Further, the consistency relations were obviously so stringent that only 1990s string theory could possibly obey them--- they required that different theories, each formulated in a different number of world-volume dimensions ranging from 0 to 4 (and more, if you allow little strings), each describe the same emergent local physics in a different number of spatial dimensions! This idea is commonplace today, but one must always remember that it is, on its face, impossible. While it is not a mathematical proof, there is no way in heck that anything other than strings is going to do this.

So the problem of the theory of everything, was, to a large extent, solved in the 1990s, with the modern formulation of string theory.

You would think this would lead to a revolution in particle physics. Unfortunately, these ideas do not directly connect with particles--- they are theoretical constructions inspired by, and linked more closely to, gravitational physics. But the mathematical structures were in large part those of particle physics--- the world-volume theories of branes resemble the gauge theories of particle physics.

So there was tremendous pressure to make a direct link between these developments and particle physics, and there were very few direct clues for how to do this, because the scale of quantum gravity is so remote. When the Planck scale is big, as it is in our universe, there are strong constraints on stuffing crap into a string vacuum, because there isn't a lot of structure you can support in a tiny compactified space. But people were considering crazy constructions with brane-stacks all the time, and people wanted a way to turn this into particle physics, and they couldn't, because string theory is so tightly constrained.

In this political situation, ADD proposed their model. Their idea was to throw away the requirement that the Planck energy is big, and instead, make the extra dimensions big. This immediately frees you up to do whatever the hell you want, because you can stuff anything into large extra dimensions, so it allowed all the string folks to start making particle models using the new tools. This was a nice exercise in using the new tools, so you get a billion citations. But it is crap as physics, because there are good reasons to believe the Planck scale is big.
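As a rough illustration of the trade-off (assuming a fundamental scale of about 1 TeV and dropping all order-one factors), the standard ADD relation $M_{Pl}^2 \sim M_*^{n+2} R^n$ can simply be evaluated to see how large the $n$ compact dimensions have to be:

```python
# Rough sketch of the ADD relation M_Pl^2 ~ M_*^(n+2) R^n (order-one
# factors dropped), assuming a fundamental gravity scale M_* ~ 1 TeV.

M_PL_GEV = 1.22e19        # four-dimensional Planck mass, GeV
M_STAR_GEV = 1.0e3        # assumed fundamental scale, ~1 TeV
HBARC_GEV_M = 1.973e-16   # conversion: 1 GeV^-1 = 1.973e-16 m

for n in (1, 2, 3, 6):
    R_gev_inv = (M_PL_GEV**2 / M_STAR_GEV**(n + 2)) ** (1.0 / n)   # R in GeV^-1
    print(f"n = {n}: R ~ {R_gev_inv * HBARC_GEV_M:.1e} m")

# n = 1 gives a solar-system-sized dimension (excluded outright);
# n = 2 gives R of order a millimetre; n = 6 gives tens of femtometres.
```

The millimetre-scale answer for $n = 2$ is why sub-millimetre tests of Newtonian gravity suddenly became interesting around 2000.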

These reasons are all based on the fact that the standard model is renormalizable, and the standard model works. If the Planck scale is small, there is no reason the effective theory should be so perfectly close to renormalizable, so each unobserved non-renormalizable correction is a strike against a large-extra-dimensions scenario. Here are the ones everyone thought of immediately when ADD came out:

  • Proton decay: the proton is only stabilized by an accidental symmetry. So if the Planck scale is about 1 TeV, you would expect instantaneous proton decay from non-renormalizable interactions.
  • Neutrino masses: The neutrino mass scale is 0.01 eV, which is correctly predicted from the non-renormalizable suppression of the dimension-5 two-Higgs, two-neutrino operator (rough numbers for this and for proton decay are sketched after this list).
  • Precision electroweak corrections: You would wreck the muon magnetic moment at the third decimal place, and the electron's at the sixth. These are measured much better than that.
  • Coupling unification: The strong, SU(2), and U(1) couplings approximately unify at the GUT scale. This fails when the running changes, as it must in extra-dimension theories.
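For the proton-decay and neutrino-mass points, the arithmetic is just dimensional analysis with the cutoff $M$ of the non-renormalizable operators set either to a TeV or to the traditional GUT-ish scale. The sketch below drops all couplings and order-one coefficients, so it is only good for orders of magnitude:

```python
# Dimensional-analysis estimates (O(1) couplings, all coefficients dropped)
# showing why a generic ~TeV cutoff for new physics is so badly constrained.

GEV_TO_SEC = 6.58e-25      # hbar in GeV*s: time corresponding to 1 GeV^-1
SEC_PER_YEAR = 3.15e7
V_HIGGS_GEV = 246.0        # electroweak scale

def neutrino_mass_ev(cutoff_gev):
    """Dimension-5 operator (LH)(LH)/M  ->  m_nu ~ v^2 / M."""
    return (V_HIGGS_GEV**2 / cutoff_gev) * 1e9    # GeV -> eV

def proton_lifetime_years(cutoff_gev, m_p_gev=0.938):
    """Dimension-6 operator qqql/M^2  ->  tau_p ~ M^4 / m_p^5."""
    tau_gev_inv = cutoff_gev**4 / m_p_gev**5      # lifetime in GeV^-1
    return tau_gev_inv * GEV_TO_SEC / SEC_PER_YEAR

for M in (1e3, 1e16):      # "quantum gravity at a TeV" vs. a GUT-scale cutoff
    print(f"M = {M:.0e} GeV: m_nu ~ {neutrino_mass_ev(M):.1e} eV, "
          f"tau_p ~ {proton_lifetime_years(M):.1e} yr")
# Observed: m_nu ~ 0.01-0.1 eV, tau_p > ~1e33 yr.
```

With $M \sim 1$ TeV the proton decays essentially instantly and the "neutrino mass" comes out at tens of GeV; with $M \sim 10^{16}$ GeV both land in the observed ballpark, which is the point of the list above.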

But then, instead of the theory disappearing, something indefensible happened. People started writing bogus theories which claimed to explain these phenomena, and other people took them seriously. I was flabbergasted that otherwise intelligent people were taking this obvious garbage seriously. Further, you couldn't get a job unless you were studying this garbage, so basically only clowns and frauds were getting positions in the US. This is depressing to see, and it makes one want to be a biologist.

The bogus papers claimed the following:

  • Proton decay: This can be suppressed by putting quarks and leptons on different branes which are far apart in the large extra dimensions. This doesn't work, because you have anomalies and SU(2) instantons linking quarks and leptons, so you need SU(2)/U(1) fields delocalized and running around crazily to even pretend to make this work.
  • Neutrino masses: These can be suppressed if all the masses come from a right-handed partner which is delocalized in the extra dimensions. Of course, this doesn't work, because there is no reason to postulate a right-handed partner when you can just put in a dimension-5 non-renormalizable term by hand without a partner. In order to actually suppress neutrino masses for real, you have to do Kakushadze-style fantasy models where you suppress 10 orders by hand using discrete symmetries, and only these 10 orders, because you need to keep the little bit of observed mass. The neutrino mass by itself is enough to kill this theory, and any other where the new-physics scale is not $10^8$ GeV or higher, and realistically around $10^{16}$ GeV, which is what everyone always thought it was.
  • Running couplings: People made up just-so stories about how the running can speed up and still make the couplings meet. Without reading these papers (I didn't) you can just see that this is crap--- the unification is based on logarithmic corrections which are sensitive to the details of stuff well past a TeV.
  • Precision electroweak: Particles past 1 TeV generally make tiny, negligible corrections to the magnetic moment, so people just assumed that quantum gravity at a TeV would do the same. This is complete nonsense, because those particles have extra constraints from their own renormalizable interactions, and their loop structure can sometimes force them to give small corrections, just because of some accidental renormalization property. A generic quantum gravity regime will generate a Pauli term for the electron and muon which is only suppressed by the ratio of their mass to 1 TeV, which is not enough.

There are other obvious strikes against this theory from other non-renormalizable corrections. These are not in the literature; I thought of them while writing the negative parts of the Wikipedia page on this:

  • Strong CP: Each extra non-renormalizable pure-strong-interaction term will break strong CP in a different way, which can't be fixed by axions. The constraints on strong CP violation are insane, so you need a way to fix this.
  • W and Z corrections: These would get non-renormalizable condensate interactions (they are close to the Planck scale), wrecking the standard model mass/coupling relation, which is measured to at least 5 digits of precision.
  • Quark-lepton direct coupling: You could make leptons turn into quarks via a direct contact interaction, which is dimension 6. Besides not being observed, if I remember what I worked out years ago correctly, this would give neutrinos a pion-condensate mass, separate from the Higgs-mechanism mass, which would be many orders of magnitude bigger than the observed neutrino mass.

This stuff, although it might seem convoluted to an outsider, is so elementary to a particle physicist that I didn't think it even merited a publication; it is the kind of thing you get on referee reports. Some people tried to publish comments like this about large extra dimensions, but nobody listened to them, and when they failed to penetrate phenomenological string theorists' cloud of delusion, this also served as a warning to others not to repeat the criticism, but to go along with the fraud. Needless to say, this must never happen again.

I have to point out that despite this sorry episode, the physics literature is still by far the most intellectually honest literature in academia; it is still a beacon of honesty compared to fields like linguistics and philosophy, where wishful thinking and fraudulent research are the rule, not the exception (this was especially true during those recent dark ages).

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
answered Mar 20, 2012 by Ron Maimon (7,730 points) [ no revision ]
Most voted comments
@RonMaimon - does the Goldberger-Wise mechanism not address stabilizing the extra dimension in a natural way?

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user DJBunk
@DJBunk: I didn't see that paper in the day--- I saw similar papers with twisting scalars for the ADD model. The twisting scalars just hid the unnaturalness, but in this case of Goldberger-Wise, they have a sensible mechanism: they are using the warp factor to enforce a large size. The problem I have is that these mathematical shenanigans are really ad-hoc, and in warped scenarios you need a physical picture to know when you are fine-tuning (because you can fool yourself by using unnatural log-parameters). The physical picture makes it clear Randall-Sundrum is unstable to collapse...

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
The scalar field trick assumes that the values of the scalar field are clamped on the branes with potentials which give a gradient, and indeed, when this happens, the thing stabilizes as they say. But if the branes are actual branes, black-hole-type branes, I don't see how the scalar field can be clamped. This is the usual problem that the models are ad-hoc, not string constructions (but it's a good paper). There are ordinary gravitational arguments that compactifications like this are unstable, so clamping scalars is usually unnatural. But you probably can do it in this ad-hoc way...

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
You have to remember that a lot of people were fooled by this, so I know that there are clever arguments made in this subfield. Arkani-Hamed and Schmaltz gave a proton stabilization trick that sort of works (at the cost of electroweak running). But fixing the non-renormalizable corrections is insurmountable, and people just had absolutely no respect for how stringent these are, because they aren't so stringent in non-string modifications at a TeV, where you can easily ramp up the dimension of the corrections using the constraint that the high-energy theory is renormalizable. Not so for gravity.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
@RonMaimon - thanks for taking the time for the thorough comments- much appreciated!

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user DJBunk
Most recent comments
Do you think my history is wrong, or that there is some unstated sociological motivation (particle physicists wanting to impress string theorists, I don't know) at work? By not mentioning the hierarchy problem, are you saying that you think this is not a pressing issue in particle physics, or not as pressing as people make it? I ask because if it is indeed a pressing issue in particle physics, then it makes sense to solve this first, and patch up those models so they agree with other observations. But if you don't think it's pressing, why not?

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user DJBunk
@DJBunk: The hierarchy problem was the stated motivation, but the other motivation was that it allowed brane-stack constructions for the standard model without any constraints. The day the theory was published (I was around) it was evident that this theory leads to violations of every renormalizability constraint: proton decay, neutrino masses, ridiculous strong interaction phenomena. My advisor advised me to work on this, and I just couldn't. All the competent young people were the same. Kakushadze, instead of explicitly criticizing it, worked on it disingenuously. But they all left science.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
+ 6 like - 0 dislike

The search is on at the running LHC experiments for signatures of black holes from large extra dimensions. Despite what @RonMaimon claims in his answer, experimentalists are not convinced that the probability that some of the models which predict large extra dimensions are right is zero.

A search for microscopic black hole production and decay in pp collisions at a center-of-mass energy of 7 TeV has been conducted by the CMS Collaboration at the LHC, using a data sample corresponding to an integrated luminosity of 35 pb$^{-1}$.
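For a sense of the numbers behind such searches: the production cross section in these scenarios is usually estimated geometrically, $\sigma \sim \pi r_H^2$. Taking the horizon radius crudely as $1/M_*$ for an assumed $M_* \sim 1$ TeV, and dropping all order-one and $n$-dependent factors, the parton-level cross section comes out at the nanobarn level, which is why even the first 35 pb$^{-1}$ of LHC data could already say something:

```python
import math

# Naive geometric estimate of the parton-level black hole production cross
# section: sigma ~ pi * r_H^2, with r_H ~ 1/M_* (the exact (4+n)-dimensional
# horizon radius carries n-dependent O(1) factors that are dropped here).

GEV2_TO_PB = 3.89e8       # 1 GeV^-2 = 0.389 mb = 3.89e8 pb

M_STAR_GEV = 1.0e3        # assumed fundamental gravity scale ~ 1 TeV
r_h = 1.0 / M_STAR_GEV    # crude horizon radius, GeV^-1
sigma_pb = math.pi * r_h**2 * GEV2_TO_PB

print(f"sigma ~ {sigma_pb:.0f} pb")   # ~ a nanobarn, very large by LHC standards
```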

answered Mar 20, 2012 by anna v (2,005 points) [ no revision ]
Experiments can't rule out these claims with the same certainty that theorists can, because the theorists have so many parameters to tune in these models that they can get whatever answer they want. The problem is naturalness, which is the main theory killer in physics, and naturalness is a theoretical constraint.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user Ron Maimon
@RonMaimon Data trumps theory every single time. If an enhancement compatible with the thermodynamic decay into jets and particles were observed at the LHC, simplicity and complexity of the theory would be reexamined.

This post imported from StackExchange Physics at 2015-03-29 04:23 (UTC), posted by SE-user anna v

user contributions licensed under cc by-sa 3.0 with attribution required
