
How does Landauer's principle apply in quantum (and, more generally, reversible) computing?


I understand that a reversible computer does not dissipate heat through Landauer's principle whilst running: the memory state at all times is a bijective function of the state at any other time.

However, I have been thinking about what happens when a reversible computer initializes. Consider the state of the physical system that the memory is built from just before power-up and initialization. By dint of the reversibility of the underlying microscopic physical laws, this state stays encoded in the overall system's state even when it is effectively "wiped out" as the computer initializes and replaces it with a state representing the initialized memory (set to, say, all noughts).

So it seems to me that if $M$ bits is the maximum memory a reversible algorithm will need to call on throughout its working, then, by the reasoning of Landauer's principle, we shall ultimately need to do work $M\,k\,T\,\log 2$ to "throw the excess entropy out of the initialized system".
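For concreteness, here is that arithmetic as a minimal sketch (the 1 GiB memory size and room temperature are illustrative assumptions, not figures from any particular machine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_erasure_work(bits, temperature_kelvin):
    """Minimum work (J) to erase `bits` bits at temperature T: bits * kT * log 2."""
    return bits * k_B * temperature_kelvin * math.log(2)

# Illustrative example: wiping M = 1 GiB of initialized memory at room temperature.
M = 8 * 2**30  # bits
print(landauer_erasure_work(M, 300.0))  # ~2.5e-11 J -- negligible at classical scales
```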

Question 1: Is my reasoning so far right? If not, please say why.

Now, specializing to quantum computers, this seems to imply some enormous initialization energy figures. Suppose we have a system with $N$ qubits, so the quantum state space has $2^N$ basis states. Suppose further that, for the sake of argument, the physics and engineering of the system are such that the system state throughout the running of the system only assumes "digitized" superpositions, i.e. sums of the form:

$$\frac{1}{\sqrt{\cal N}}\sum_{j=1}^{2^N} x_j \,\left|p_{1,j},p_{2,j},\cdots,p_{N,j}\right\rangle$$

where $x_j,\;p_{k,j}\in\{0,1\}$ and ${\cal N}$ is the appropriate normalization. To encode the beginning state that is wiped out at power-up and initialization, it seems to me that the work behested by Landauer's principle is $2^N\,k\,T\,\log 2$. This figure reaches the equivalent of 6000 kg of mass-energy (about humanity's yearly energy consumption) at around 140 qubits, assuming we build our computer in deep space to take advantage of, say, a 10 K working temperature.
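As a numerical sanity check on those figures (a sketch; the 10 K temperature is from the text, while the value of roughly $6\times10^{20}$ J for humanity's yearly energy consumption is my assumption):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
c = 2.998e8           # speed of light, m/s
T = 10.0              # assumed deep-space operating temperature, K
YEARLY_ENERGY = 6e20  # rough figure for humanity's yearly energy use, J (assumption)

for N in (100, 120, 140, 160):
    E = 2**N * k_B * T * math.log(2)  # Landauer cost if every basis state counts
    print(f"N={N}: E={E:.2e} J, mass-equivalent={E / c**2:.2e} kg, "
          f"fraction of yearly use={E / YEARLY_ENERGY:.2e}")
# The cost crosses humanity's yearly energy use between N = 140 and N = 145.
```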

Question 2: Given that we could build a quantum computer with 140 qubits in "digital" superpositions as above, do we indeed need such initialization energies?

One can see where arguments like this might go. For example, Paul Davies thinks that similar complexity calculations limit the size of future quantum computers, because their complexity (information content) will have to respect the Bekenstein bound: P. C. W. Davies, "The implications of a holographic universe for quantum information science and the nature of physical law", Fluctuation and Noise Letters 7, no. 4 (2007); see also http://arxiv.org/abs/quant-ph/0703041.

Davies points out that it is the Kolmogorov complexity that will be relevant, and so takes this as an indication that only certain "small" subspaces of the full quantum state space spanned by a high number of qubits will be accessible to real quantum computers. Likewise, in my example, I assumed this kind of limitation in the form of the "digitization" of the superposition weights, but I assumed that all of the qubits could be superposed independently. Maybe there would necessarily be correlations between the superposition coefficients in real quantum computers.

I think we would likewise hit the Landauer constraint as I reason above, but at a considerably lower number of qubits.

Last Question: Am I applying Landauer's principle to the quantum computer in the right way? Why do my arguments fail, if they do?

This post imported from StackExchange Physics at 2014-03-30 15:15 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance
asked Nov 2, 2013 in Theoretical Physics by WetSavannaAnimal (485 points) [ no revision ]
Interestingly, I found a paper ("unfortunately" in French), here, which states that in the case of a hybrid optomechanical system (see fig. 2, page 7, and fig. 5, page 10), the work necessary to initialize a qubit is proportional to the Rabi frequency, which plays the role of a temperature (formula 23, page 12).

This post imported from StackExchange Physics at 2014-03-30 15:15 (UCT), posted by SE-user Trimok
@Trimok That works fine for me! How very interesting indeed, thanks heaps.

This post imported from StackExchange Physics at 2014-03-30 15:15 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance
Yes, no, and no. Initializing one qubit dissipates $kT$ of energy, and thus initializing $N$ qubits dissipates an energy of $NkT$. (Note that if the energy did not scale linearly with the number of qubits, this would likely give rise to all kinds of contradictions!) This is closely related to the question of whether $N$ qubits "contain" $N$ bits or $2^N$ bits of information (and typically $N$ is the more appropriate answer); e.g., arxiv.org/abs/quant-ph/0507242 contains some arguments about that.

This post imported from StackExchange Physics at 2014-03-30 15:15 (UCT), posted by SE-user Norbert Schuch
... you might also want to check arxiv.org/abs/1306.4352.

This post imported from StackExchange Physics at 2014-03-30 15:15 (UCT), posted by SE-user Norbert Schuch

@Norbert Schuch: A system composed of $N$ qubits is described by a $2^N$-dimensional space; in other words, it has $2^N$ distinct/independent quantum states, exactly the same as $N$ classical bits, which have $2^N$ states. I suspect a confusion between the number of states and the amount of information (which grows only logarithmically).
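In code, the distinction this comment draws between state count and information content (a trivial sketch):

```python
import math

N = 140
num_states = 2**N                  # distinct basis states of N qubits (or N bits)
info_bits = math.log2(num_states)  # information content grows only logarithmically
print(f"{num_states:.3e} states, {info_bits} bits")  # 1.394e+42 states, 140.0 bits
```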

1 Answer


What you are describing is the Landauer principle for initializing a classical computer which simulates a given quantum computer. The number of bits required for the classical description grows exponentially, and the resources for simulating such a superposition become unmanageable already at a hundred qubits, as you said, and infeasible for the whole universe to supply at even less than a thousand qubits.

But if nature is fundamentally quantum, the quantum computer is not initializing a classical computer sitting underneath; rather, it obeys a quantum Landauer principle, which says that to initialize a set of $N$ qubits you need $N\log 2$ of entropy dumped into the environment. The quantum state is only exponentially large when the information inside it is described on a classical computer.

To understand the quantum Landauer principle, note that the entropy difference for $N$ qubits between a pure state and the mixed state in which all values are equally likely is $N\log 2$, as can be computed easily from the definition (because the density matrix is diagonal). This quantum entropy is the statistical cost of derandomizing the qubits. The cost you are computing is unphysical: it is the cost of running the hypothetical classical computer that the aliens who set up the matrix built to simulate our quantum reality. If our reality is really quantum, and quantum computers work, these aliens have access to resources exponentially larger than anything that could fit inside our universe.
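That entropy difference can be checked directly from the definition $S(\rho) = -\mathrm{Tr}\,\rho\log_2\rho$; a minimal numpy sketch (small $N$ only, since the density matrix is $2^N \times 2^N$):

```python
import numpy as np

def von_neumann_entropy_bits(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # discard zeros: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

N = 4
dim = 2**N
pure = np.zeros((dim, dim)); pure[0, 0] = 1.0  # |00...0><00...0|
mixed = np.eye(dim) / dim                      # all 2^N values equally likely

print(von_neumann_entropy_bits(pure))   # 0.0
print(von_neumann_entropy_bits(mixed))  # 4.0 bits == N, i.e. N log 2 in nats
```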

answered Aug 12, 2014 by Ron Maimon (7,730 points) [ no revision ]

Thanks Ron, I do believe you are right. Certainly the von Neumann entropy argument makes sense; I need to think on this a little.

user contributions licensed under cc by-sa 3.0 with attribution required
