
Foundations of quantum physics II. The thermal interpretation

Originality: + 2 - 0 | Accuracy: + 2 - 0 | Score: 4.18 | 4176 views

Referee this paper: arXiv:1902.10779 by Arnold Neumaier


Abstract: This paper presents the thermal interpretation of quantum physics. The insight from Part I of this series that Born's rule has its limitations -- hence cannot be the foundation of quantum physics -- opens the way for an alternative interpretation: the thermal interpretation of quantum physics. It gives new foundations that connect quantum physics (including quantum mechanics, statistical mechanics, quantum field theory and their applications) to experiment. The thermal interpretation resolves the problems of the foundations of  quantum physics revealed in the critique from Part I. It improves the traditional foundations in several respects:

  • The thermal interpretation reflects the actual practice of quantum physics, especially regarding its macroscopic implications.
  • The thermal interpretation gives a fair account of the interpretational differences between quantum mechanics and quantum field theory.
  • The thermal interpretation gives a natural, realistic meaning to the standard formalism of quantum mechanics and quantum field theory in a single world, without introducing additional hidden variables.
  • The thermal interpretation is independent of the measurement problem. The latter becomes a precise problem in statistical mechanics rather than a fuzzy and problematic notion in the foundations. Details will be discussed in Part III.
requested Mar 1, 2019 by Arnold Neumaier (15787 points)
summarized by Arnold Neumaier
paper authored Feb 26, 2019 to quant-ph by Arnold Neumaier
edited Mar 1, 2019 by Arnold Neumaier

    2 Reviews

    + 2 like - 0 dislike

    The previous paper presented the motivation for a new interpretation of quantum mechanics whose framework includes the applications of quantum mechanics and actual macroscopic measurements. Accordingly, the thermal interpretation discussed in this work is intended to bridge the gap between the formal core of quantum mechanics on the one hand and its applications and macroscopic measurements on the other. The name of the interpretation reflects the fact that real-world measurements are done in a thermal environment at a certain temperature.


    To describe quantum mechanics, the thermal interpretation uses, as usual, a Hilbert space, the field content of the fundamental forces, and unitary representations of the Heisenberg, Galilei or Poincaré group. The q-expectations follow the deterministic dynamics of the so-called Ehrenfest picture. The Ehrenfest picture is obtained by introducing a Lie operator that represents the Poisson bracket in the classical case and the commutator in the quantum case, together with a unified notation for the Liouville integral and the quantum trace. The dynamics of the q-expectation of a quantity can then be shown to be given by the Ehrenfest equations. The thermal interpretation considers two kinds of uncertainty: one that can in principle be resolved, and a conceptual uncertainty. The quantum uncertainty is assumed to be of the same conceptual kind as the uncertainty in the position of macroscopic objects. It can be shown that the thermal interpretation fulfills the requirements for good foundations outlined in the previous paper.
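
    For concreteness, the unified formulation can be sketched as follows (my notation and sign conventions, not a quotation from the paper): with $\langle A\rangle := \int \rho A$ denoting either the Liouville integral (classical) or the trace $\mathrm{Tr}\,\rho A$ (quantum), and a Lie operator
    $$\mathcal{L}_H A := \begin{cases}\{A,H\} & \text{(classical, Poisson bracket)},\\ \tfrac{i}{\hbar}[H,A] & \text{(quantum, commutator)},\end{cases}$$
    the Ehrenfest equation for a quantity $A$ without explicit time dependence reads
    $$\frac{d}{dt}\langle A\rangle = \langle \mathcal{L}_H A\rangle.$$
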
    To explain how probabilities and statistics are treated in the thermal interpretation, the classical formalism for probability notions is introduced by means of expectations, along the lines of Whittle. Due to the weak law of large numbers, quantities become arbitrarily significant with increasing sample size. Deterministic reasoning is then appropriate for all sufficiently significant quantities. Statistical reasoning is necessary for noisy quantities, and requires that these quantities are sufficiently similar and sufficiently independent to ensure that their mean is significant. Measuring probabilities by relative frequencies gives, in principle, arbitrarily accurate results due to the law of large numbers. Generally, a stochastic description of a deterministic system is obtained by applying a reduced or coarse-grained description.
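
    A minimal statement of the law of large numbers invoked here (standard notation, not quoted from the paper): for $N$ uncorrelated quantities $A_1,\dots,A_N$ with common expectation $\mu$ and uncertainty $\sigma$, the mean satisfies
    $$\bar A := \frac{1}{N}\sum_{k=1}^{N} A_k,\qquad \langle\bar A\rangle = \mu,\qquad \sigma(\bar A) = \frac{\sigma}{\sqrt N}\to 0 \quad (N\to\infty),$$
    so the mean becomes arbitrarily significant as the sample grows; relative frequencies are the special case where each $A_k$ takes only the values 0 and 1.
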
    In rigorous terms, quantum fields are operator-valued distributions. The quantities in quantum field theory are smeared fields, described by local space-time integrals against appropriate test functions. In the thermal interpretation, the observables of QFT are the q-expectations of these smeared quantum fields and their correlations. The dynamics of these q-expectations is given by generalized covariant Ehrenfest equations in the Ehrenfest picture and by generalized covariant von Neumann equations in the Schrödinger picture. Depending on the specific situation, the Ehrenfest equations are equivalent to hydrodynamic or Boltzmann-like equations, or to thermodynamic equations of state in thermal equilibrium. Generally, in the thermal interpretation everything is based on the expectations of the smeared quantum fields, including the universe as a whole. Three different notions of relativistic causality are introduced. Particularly new is the extended causality for correlated systems with spatially separated parts. This new notion of causality, together with the idea of calling entangled entities an extended system (or object), is used to resolve some apparent paradoxes in Bell-type entanglement experiments. The apparent faster-than-light propagation of information is resolved in terms of the propagation of conditional information.
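
    Schematically (my notation; the choice of test function is an assumption for illustration), a smeared field and its q-expectation take the form
    $$\Phi(f) := \int d^4x\, f(x)\,\phi(x),\qquad \langle\Phi(f)\rangle := \mathrm{Tr}\bigl(\rho\,\Phi(f)\bigr),$$
    with $f$ a smooth test function concentrated on the space-time region that is probed; the q-expectations of such smeared fields, and correlations built from them, are what the thermal interpretation treats as the beables of QFT.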

    The thermal interpretation stretches quantum mechanics, which is usually considered to be a theory of the microscopic regime, far beyond its original purpose. It is explicitly meant to be an IOE (interpretation of everything), and indeed it seems to me to be some kind of merger of theoretical, experimental, and applied quantum mechanics. In the thermal interpretation, macroscopic and microscopic considerations are mixed in a rather unusual way. By trying to describe physics at all scales at once, it seems to reject the Wilsonian concept of effective descriptions of physics that are valid only at specific scales. As the thermal interpretation is based on the Ehrenfest picture, where the q-expectations calculated by tracing over the statistical operator are the dynamical variables, the distinction between the uncertainty of a single quantum system (conventionally seen as a limitation to the predictability or determinism of quantum mechanics) and the uncertainty involved in actually doing the measurements is eliminated. Instead, the thermal interpretation "fills" the quantum uncertainty by extending the microscopic objects considered. The conventional quantum uncertainty, as well as the microscopic physics, is in some sense "hidden" in the density operator of the thermal interpretation.
    Calling correlated objects in quantum mechanics (spatially) extended objects in a literal sense is highly unusual. Quantum entanglement is just correlation, which by definition does not provide any causal relationship between the entangled objects.
    Statistical correlations due to the preparation of the entangled parts of the system have nothing to do with the dynamical propagation of information. For me personally, that is all there is to Bell-type and other entanglement experiments, and I don't see the need for new constructs such as conditional information in this context.

    To me it seems that the thermal interpretation is motivated by a strong preference for what can be observed at everyday human scales, whereas the reality of everything else (such as fundamental particles, or physics in astronomical/cosmological regimes that took place before humans were present or takes place too far away) is questioned. However, most theoretical physicists regard the movement of science away from what can be directly observed at human-accessible scales towards smaller and larger regimes (by theoretical extrapolation or complicated indirect measurements) as legitimate scientific progress.
    For those reasons it seems that the thermal interpretation is more appropriately an interpretation of (experimental) statistical mechanics or thermodynamics than of theoretical microscopic quantum mechanics.

    reviewed Mar 6, 2019 by Dilaton (6,240 points)
    edited Mar 12, 2019 by Dilaton

    Thanks for the review; here are some comments:

    Paragraph 2: The equations of hydrodynamics (and the other equations mentioned) are dissipative, hence not equivalent to the conservative Ehrenfest equations; they arise from the latter only through coarse-graining (dropping from the dynamics as irrelevant all multipoint correlations).

    Paragraph 3: The Wilsonian picture of quantum field theory does not conflict with the thermal interpretation. Effective field theories are simply coarse-grained approximations to more fundamental theories, obtained by integrating out the energies above the desired scale. 

    + 2 like - 0 dislike

    This paper treats the core of the thermal interpretation. It is the second in a series of three papers presenting a fully worked out version of Neumaier's thermal interpretation. A fourth paper that "summarizes the main features and adds intuitive explanations and new technical developments" was added later to the series. A fifth paper added later also claims to be part of the series, but is only indirectly related to the thermal interpretation. The book "Coherent Quantum Physics – A Reinterpretation of the Tradition" presents the material from these and more papers in a restructured form together with additional material. The most recent official description of the thermal interpretation of quantum physics is given in Section 9.2 of that book.

    Presentation of material

    The presentation is easy to read, and contains many remarks and observations that are spot on both practically and philosophically. They are so consistently spot on that I started to read Neumaier's old material just to learn whether he was this spot on right from the beginning. In particular, I browsed a preliminary German version from 2007 (http://arnold-neumaier.at/physfaq/therm/ThermDeutsch.txt) and a paper which contains the beginnings:
        A. Neumaier,
        Ensembles and experiments in classical and quantum physics,
        Int. J. Mod. Phys. B 17 (2003), 2937-2980.
        quant-ph/0303047
        http://arnold-neumaier.at/papers/physpapers.html#ensembles
        http://arnold-neumaier.at/ms/ensembles.pdf
    The old material also seems spot on, but it is not as smooth to read. Instead of chaining "convenient" truths together, it asks difficult questions and gives "inconvenient" answers. For example (translated here from the German):

    --------------------------------------------
    S33. What becomes of the superposition principle?
    --------------------------------------------

    In the traditional von Neumann analysis of the measurement process, one simplifies radically (which is where the problems arise) by treating measurements as reduction to eigenvalues, and then analysing more general situations with the help of the superposition principle.

    In the thermal interpretation this is a little bit more complicated. For when one repeats an experiment, the state of the rest of the world has already changed, and one therefore no longer has exactly the same situation.

    Only the same on average. That makes all the difference. You cannot superpose entire universes. In any case, I don't see how that should be prepared. There is only _one_ state in the thermal interpretation, that of the entire universe. Everything else is derivatives.

    The superposition principle only applies to systems that are so small that they can be produced in practically any number and manipulated within this universe. Macroscopic systems are definitely no longer one of them!

    This limitation brings down Wigner's classic argument
        J.A. Wheeler and W. H. Zurek (eds.),
        Quantum theory and measurement.
        Princeton Univ. Press, Princeton 1983,
        Chapter II.2, esp. pp. 285-288.
        (see also the entry ''Does decoherence solve the
        measurement problem?'' in my theoretical physics FAQ
        at http://arnold-neumaier.at/physics-faq.txt)
    which proves the incompatibility of unrestricted unitarity, the unrestricted superposition principle and the collapse of the state during a measurement.

    We will look at this in detail in the next entry, using the measurement of a single spin as an example.

    After browsing the old material, I am no longer sure whether the easy readability of the current presentation is really a virtue. On the other hand, there is the fact that I was able to work completely through the new presentation. I noticed that I could just read page after page, and the subsections and sections would be finished before it got difficult. There are, however, the many references to the literature, and trying to read those (or to look up some of the concepts used on Wikipedia) is slow and tiring. (At least I did work through the relevant concepts on Wikipedia after having read about 2/3 of the new presentation (i.e. the book/the entire series), and it was hard for me.) But I never managed to work through the old material (I first tried in 2014), and even now I only managed to browse it.

    (Section "1 Introduction" ends with the following remark: "The bulk of this paper is intended to be nontechnical and understandable for a wide audience being familiar with some traditional quantum mechanics. [...] However, quite a number of remarks are addressed to experts and then refer to technical aspects explained in the references given." My impressions above confirm that this remark contains some truth.)

    Because I quoted one difficult question and Neumaier's "inconvenient" answer above, I will give my (also "inconvenient") opinion later. The question also came up in discussions of Neumaier's papers, and I will give links demonstrating this (and also that my "inconvenient" opinion already occurred in those discussions). But let me now return to the paper itself, and some of its topics.

    Probability via expectation (non-ensemble interpretation)

    This paper contains the non-ensemble interpretation of q-expectations, which also applies to single systems, not just to ensembles. This was what initially interested me, and why I knew I had to schedule the time to read Neumaier's paper(s). This part is discussed in subsection "2.3 Uncertainty" (~2 pages), subsection "3.3 Deterministic and stochastic aspects" (~3 pages) and subsection "3.6 The stochastic description of a deterministic system" (~2 pages). It is well written. I especially liked the eight "important examples of statistical models for deterministic situations with increasingly random appearance," with an explicit reference for each example to a paper or book where it is discussed in more detail. However, since section "3 Thermal interpretation of statistics and probability" (~14 pages) is part of a paper on the interpretation of quantum mechanics (~40 pages), the question arises how this non-ensemble interpretation can contribute to good foundations.
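
    The basic quantity behind these subsections is the standard uncertainty (my notation, not a quotation from the paper)
    $$\sigma_A := \sqrt{\langle (A-\langle A\rangle)^2\rangle} = \sqrt{\langle A^2\rangle - \langle A\rangle^2},$$
    which the non-ensemble reading takes as an intrinsic, objective uncertainty of the q-expectation $\langle A\rangle$ of a single system, of the same kind as the uncertainty in the position of an extended macroscopic object, rather than as the spread of outcomes over an ensemble of identically prepared systems.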

    There is a bullet point in the section "5 Conclusion" saying that the thermal interpretation "applies both to single quantum objects (like a quantum dot, a neutron star, or the universe) and to statistical populations". The non-ensemble interpretation also allows one to use the same concept/interpretation in both classical and quantum mechanics. (I can't find this explicitly mentioned in the paper, but this answer would fit well. In the next paper, it is conjectured that randomness, too, is the same concept (and has the same origin) in both classical and quantum mechanics.) And section "2.5 Formal definition of the thermal interpretation" adds the answer "The thermal interpretation avoids both the philosophically problematic notion of probability, and the anthropomorphic notions of knowledge and measurement."

    However, an important answer is missing: his non-ensemble interpretation does not run into problems with "counterfactual definiteness". This is where previous attempts to base an interpretation on the objective reality of q-expectations remained incomplete. David Mermin, in "What is quantum mechanics trying to tell us?", explains (in section "IX. Absence of Correlata") why his "Ithaca interpretation of quantum mechanics" (IIQM), which insists that "only correlations between subsystems have objective reality", is forced to be a theory of "correlations without correlata":
      "The correlata cannot all have physical reality because in spite of the existence of all subsystem joint distributions and of unique marginal distributions for individual subsystems, it is impossible to construct, in the standard way, a full and mutually consistent set of conditional distributions from the joint and individual subsystem distributions."

    Mermin explicitly acknowledges that this forces his interpretation to remain incomplete:

    This problem — how to make sense of correlations without correlata — brings us up against two major puzzles:
      (1) How is probability to be understood as an intrinsic objective feature of the physical world, rather than merely as a tactical device for coping with our ignorance? How is one to make sense of fundamental, irreducible correlation?
      (2) [...]
    I propose to set aside both of these puzzles.

    The old initial version of the "consistent histories interpretation" also explicitly acknowledged that it remained incomplete. However, it addressed the problems with "counterfactual definiteness" in a completely different way, a very formal and "calculational" one. In section "XI. Comments on other approaches," Mermin explains this as follows:
       "The consistent histories interpretation of quantum mechanics applies to time-dependent as well as equal-time correlations. In contrast to the IIQM, consistent historians are not at all shy about dealing with the correlata that underly a given set of correlations. They gain this interpretive flexibility by insisting that any talk about either correlations or correlata must be restricted to sets of observables singled out by certain quite stringent consistency conditions."

    Core of the thermal interpretation

    Subsection "2.2 Properties" discusses the ontological status of the thermal interpretation, making precise the concept of properties of a quantum system. Those are:

    (S1) The state of a system (at a given time) encodes everything that can be said about the system, and nothing else.
    (S2) Every property of a subsystem is also a property of the whole system.
    (S3) The state of a system determines the state of all its subsystems.
    (CC) Callen’s criterion: Operationally, a system is in a given state if its properties are consistently described by the theory for this state.

    This is based on discussions from Part I. There it was shown that "If the state of every composite quantum system contains all information that can be known about this system, it cannot be a pure state in general." The density operator was defended as more fundamental than pure states, for example: "... the deficiency always has the same root – the treatment of the density operator as representing a state of incomplete knowledge, a statistical mixture of pure states – ..." Even the state of the "one single world" is not assumed to be pure.
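
    The standard fact behind the quoted claim (a textbook example, not taken from the paper): the reduced state of a subsystem of an entangled pure state is mixed. For the two-qubit state
    $$|\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr),\qquad \rho_A = \mathrm{Tr}_B\,|\psi\rangle\langle\psi| = \tfrac{1}{2}\bigl(|0\rangle\langle 0| + |1\rangle\langle 1|\bigr),$$
    one has $\rho_A^2 \neq \rho_A$, so the subsystem state is not pure; if subsystem states are to be states in the same sense as the state of the whole, density operators have to be taken as fundamental.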

    The states of a system in the thermal interpretation are encoded by density operators. However, a state itself is better thought of as the collection of all q-expectations determined by that density operator. Section "4.1 Beables and observability in quantum field theory" states this as follows:
      "According to the thermal interpretation, there is nothing in quantum field theory apart from q-expectations of the fields and q-correlations. The quantities accessible to an observer are those q-expectations and q-correlations whose arguments are restricted to the observer’s world tube. More precisely, what we can observe is contained in the least oscillating contributions to these q-expectations and q-correlations. The spatial and temporal high frequency part is unobservable due to the limited resolution of our instruments."

    This quote is relevant for a number of reasons. The q-expectations have a spatial and temporal dependence (as parameters). The q-correlations depend on even more than one spatial and temporal parameter. (This is how I interpret the difference between q-expectations and q-correlations in this quote.) We are given an explicit reason why some q-expectations are not observable (because their high-frequency dependence on the spatial and temporal parameters exceeds the resolution limits of our instruments). And this quote shows that the thermal interpretation applies to quantum field theory just as it applies to quantum mechanics, and also to classical statistical mechanics. This is also written explicitly in section "5 Conclusion":
      "The thermal interpretation of quantum physics (including quantum mechanics, statistical mechanics and quantum field theory) is an interpretation of everything. It allows a consistent quantum description of the universe from the smallest to the largest levels of modeling, including its classical aspects."

    Superdeterminism rescues "Many Worlds minus the Many Worlds"

    There are many other nice and interesting things I could (and would want to) write about this paper specifically and the thermal interpretation in general. But this review is already quite long now, and I still need to express my promised "inconvenient" opinion on Neumaier's difficult question together with relevant links. Scott Aaronson described the difficult question as making sense of "Many Worlds minus the Many Worlds" in an old comment (2011):

    Tim Maudlin #6: The view that I take Banks to be defending here is actually one I’ve found extremely common among physicists, so maybe it would be worth philosophers trying to understand it sympathetically and seeing how much sense they can make of it. I like to think of this view as “Many Worlds minus the Many Worlds” — i.e., many worlds without calling it that, or even acknowledging a need to discuss that apparent implication of what you’re saying.

    My "inconvenient" opinion is that Neumaier's "inconvenient" answer implicitly invokes (a valid form of emergent) superdeterminism, but still can't prevent Many Worlds completely. His answer only seems to succeed to prevent Many Worlds for our world today, but doesn't seem to exclude the possibility that the world initially splitted many times before our current macroscopic world emerged. Here is the translation of the relevant part of Neumaier's "inconvenient" answer:

    You cannot superpose entire universes. In any case, I don't see how that should be prepared. There is only _one_ state in the thermal interpretation, that of the entire universe. Everything else is derivatives.

    The superposition principle only applies to systems that are so small that they can be produced in practically any number and manipulated within this universe. Macroscopic systems are definitely no longer one of them!

    This limitation brings down Wigner's classic argument, which proves the incompatibility of unrestricted unitarity, the unrestricted superposition principle and the collapse of the state during a measurement.

    The implicit superdeterminism in this argument is that whenever we prepare a small system and measure it, the state of the measurement device together with the rest of the universe will be such that the measurement device ends up in a valid macroscopic state (i.e. not a superposition, neither a coherent nor an incoherent one). It is a valid form of emergent superdeterminism, because the macroscopic observables emerged such that they will never encounter superpositions arising from the evolution of the _one_ state of the universe.

    This is less troublesome than invoking superdeterminism to get locally realistic deterministic results that emulate the results of weird quantum superpositions, which is how the superdeterminism loophole of the Bell test experiments is normally understood. In discussions of Neumaier's papers, this comment by charters (Apr 22, 2019) was not the first to mention superdeterminism, but it explained well why it is relevant. When charters wrote later (May 15, 2019) "I think we eventually clarified that the TI is not superdeterministic", it only meant that superdeterminism in the sense in which it is normally understood, as a loophole of the Bell tests, did not apply.

    This answer by Neumaier (May 8, 2019) in another thread shows how the debate with charters evolved. I didn't find the comment where they resolved their dispute, but when eloheim later tried to restart the dispute,  Neumaier wrote (May 14, 2019): "The measurement problem is the problem of why there are unique and discrete outcomes for single quantum systems although the wave function produces only superpositions of (measurement,detector state) pairs. This problem is solved by the TI; see Subsection 5.1 of Part III and Section 3 of Part IV." And Subsection 5.1 of Part III says: "In the thermal interpretation, the traditional difficulty to show that there is always a unique outcome is trivially solved since by definition, the outcome of reading a macroscopic quantity is its expectation value, with negligible uncertainty."

    I think this quote from Subsection 5.1 of Part III nicely illustrates how Neumaier has polished away his old difficult questions and "inconvenient" answers. But if he wants to convince people like Tim Maudlin or Scott Aaronson that his thermal interpretation must be taken seriously, I guess acknowledging the implications of what he once wrote would have a better chance of success. (Just like S. Hossenfelder and T.N. Palmer in Rethinking Superdeterminism directly address objections raised by Tim Maudlin and Mateus Araújo in section "4.4 The Tobacco Company Syndrome".)

    reviewed Oct 23, 2020 by Thomas Klimpel (280 points)

    Because my review was too long, I removed the following two paragraphs:

    The formulation of QM in section "2.1 The Ehrenfest picture of quantum mechanics" via (6), (7), and (8) shows another interesting advantage of using the collection of q-expectations as the state instead of the density operator. That presentation unifies the Schrödinger, Heisenberg, and Dirac pictures, while the density operator itself is different in each picture. That presentation even unifies classical and quantum mechanics.

    However, that unification may be treacherous. It doesn't just include classical mechanics, but also classical mechanics with epistemic uncertainty about the state of the system. But in classical mechanics, there is a clear way to distinguish a state with epistemic uncertainty from a state without. In quantum mechanics, people tried resorting to pure states to achieve this distinction. But the thermal interpretation explicitly denies pure states this privilege, and explains well why it is important to deny pure states any special status.
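
    For illustration (standard notation, not taken from the paper): in classical mechanics a state without epistemic uncertainty is a phase-space point, $\rho(q,p) = \delta(q-q_0)\,\delta(p-p_0)$, for which every observable satisfies $\langle A\rangle = A(q_0,p_0)$ and $\sigma_A = 0$; epistemic uncertainty corresponds to a smeared $\rho$. No analogous criterion singles out "uncertainty-free" density operators in quantum mechanics, since no quantum state has $\sigma_A = 0$ for all observables $A$.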

    It felt like the least important part of the review at the time, basically a parenthetical remark. But when physicists googled Arnold Neumaier after I told them my opinion about his thermal interpretation, I felt the urge to tell them that they should start by trying to appreciate the beauty and depth of (6), (7), and (8). That beauty goes deeper than my parenthetical remark. Note for example that those equations don't contain i. (How could they, given that they unify classical and quantum mechanics?)

    I only noticed this after reading, in "How Is Quantum Field Theory Possible?" by Sunny Y. Auyang (section "§ 12. The Form of Observation and the Reality of Quantum States", subsection "The Irreducible Complex Nature of Quantum States"), that "Quantum characteristics are irreducibly complex, they cannot be decomposed into real and imaginary parts, i appears explicitly in the Schrodinger equation and other fundamental relations. ... I do not mean that the symbol i must be present in all formulations of quantum mechanics. There are many mathematical ways of expressing the same thing."

    After appreciating the beauty, the difficult next step would be to understand why this is still not enough. Even though i no longer occurs explicitly in the equations and all beables are real-valued, complex numbers continue to play a key role behind the scenes. How physical are those real-valued beables? I once wrote: "I admit that it is often easier to compute with the vector potential instead of the actual fields. But the actual fields are measurable, at least in principle, while the vector potential is not." Are those real-valued beables more like the actual fields than like the vector potential? They still share properties with the vector potential, i.e. some gauge freedom is still left. Only the complex phase which explained the gauge freedom is now more hidden.

    Interestingly, (6), (7), and (8) were not even present in the old material.
