Long overdue update 2/14/2007: The quantum computing result described below remains unproven (though not disproven). It is true that homogeneous mean-field spin models (i.e., ones where every spin is coupled to every other and every coupling has the same strength) are characterized at criticality by an entanglement entropy that increases only logarithmically with the number of spins [see, for example, this article by Latorre, et al.]. It is also true that systems whose entanglement entropy is O(log N) should be efficiently simulable classically [the seminal paper is this one by Vidal]. However, it remains unclear whether inhomogeneous mean-field models (i.e., ones where every spin is connected to every other, and the coupling strengths have some nonzero mean but are not all the same) also exhibit logarithmic entanglement entropy scaling. Most pertinently, the method described below can't resolve the issue: showing that the ground state energy of an inhomogeneous mean-field model is given exactly in the thermodynamic limit by a separable Ansatz does not mean that the model's true ground state is separable. Silly me. 😦 On the bright side, the calculation of the probability that at least two people in a room of N share a birthday is definitely correct. 🙂

This whiteboard reflects the two things about which I was thinking today.

[Image: whiteboard2.jpg]

[Click the picture for a full-size 1600 x 1200 JPEG.]

[The left hand side] Writing up a talk I gave at the Quantum Computation and Many-Body Systems (QCMBS) conference in Key West, FL in early February. (The conference website can be found here. The conference program page has downloadable PDFs of the presentations that were given. Mine is here.) What you see on the board is a formal power series expansion of the Gibbs free energy of a general, pairwise coupled quantum spin system around the noninteracting case. The notable aspect is that if there are C nonzero couplings in the system, then at every order in the expansion there are just C nonzero terms, and they are such that the magnitude of the nth-order correction scales as CJ^n, where J is the typical coupling constant. For systems where, as the number of spins N goes to infinity,

(1) every spin is still connected to a finite fraction of all the other spins and
(2) the ratio of the mean coupling between any 2 spins to the standard deviation of the coupling between any 2 spins does not vanish,

J must be of order 1/N so as to have a finite free energy per spin. In such a case, only the first 2 terms in the expansion are extensive, and thus spin-spin correlations (be they classical or quantum) do not matter for the Gibbs free energy per spin at any temperature (and thus the Helmholtz free energy per spin at any temperature and thus the ground state energy per spin at zero temperature). This asymptotic lack of quantum entanglement in the ground state implies that a wide class of adiabatic quantum algorithms cannot provide any speedup over classical algorithms for NP-complete graph theory problems on graphs where each vertex is connected to a finite fraction of all the others.
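
To spell out the power counting behind the extensivity claim, here's a sketch in LaTeX (my own notation: G_n denotes the nth-order correction; C and J are as defined above):

    % Power counting for the nth-order correction to the Gibbs free energy.
    % G_n is my own shorthand; C and J are as in the text. (Requires amsmath.)
    \[
      |G_n| \sim C J^n, \qquad
      C \sim N^2 \ \text{(each spin couples to a finite fraction of all $N$)}, \qquad
      J \sim \frac{1}{N}
      \;\Longrightarrow\;
      |G_n| \sim N^{2-n}.
    \]
    % Hence the noninteracting term (order N) and the first-order correction
    % |G_1| ~ N are extensive, while |G_n| ~ N^{2-n} for n >= 2 is subextensive
    % and drops out of the free energy per spin as N -> infinity.

That is, the "first 2 terms" above are the noninteracting term and the first-order correction, both of order N; everything else is o(N) and vanishes from the per-spin free energy in the thermodynamic limit.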

[The right hand side] Recalling, after being embarrassingly confused along with two of my friends, how to do that old problem "In a room with N people, what's the probability that at least 2 of them have the same birthday?", and thus explaining why there's a large chance even in a room of, say, 30 people (much larger than the woefully naive estimate of 30/365).
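
For the record, here's the standard computation (a minimal sketch in Python, assuming 365 equally likely birthdays and ignoring leap years):

    def p_shared_birthday(n: int) -> float:
        """P(at least two of n people share a birthday)."""
        p_all_distinct = 1.0
        for k in range(n):
            # Person k+1 must avoid the k birthdays already taken.
            p_all_distinct *= (365 - k) / 365
        return 1.0 - p_all_distinct

    for n in (10, 23, 30, 50):
        print(f"N = {n:2d}: P = {p_shared_birthday(n):.3f}")
    # N = 10: P = 0.117;  N = 23: P = 0.507 (the famous crossover past 1/2);
    # N = 30: P = 0.706;  N = 50: P = 0.970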


Scientists at Cornell have made small self-replicating robots.

I notice that web traffic for this here blog of mine is pretty piddly.  Hence, taking a cue from every darn newscast in America, I've decided to institute a "News You Can Use" segment and inaugurate it with those perennial American obsessions of diet, exercise, and health.

But unlike all that filler you see on newscasts that alternates between the genuinely informative, the barely informative, and the overhyped and scare-mongering, I assure you, gentle reader, that my "News You Can Use" segments will always be genuinely informative.  (I’ll confine my barely informative, overhyped, and scare-mongering posts to my rants on current affairs.)

So without further ado, let me point you to the most comprehensive collection I know of relatively layperson-friendly scientific review articles on diet, exercise, and health, covering issues important to both the sedentary and the athletic among us:

Position Stands of the American College of Sports Medicine

[Image: "Mike" hydrogen bomb test, Marshall Islands, 1952]

[U.S. Air Force photo via The Encyclopedia Britannica Online]

Fifty-two years ago the first hydrogen bomb was tested. The device was codenamed "Mike." I don't know whether this was some sort of allusion to Michael, the Archangel oft depicted as wielding God's terrible swift sword to cast Lucifer out of Heaven, and whose very name ("Who is like God?!" in Hebrew) was the faithful angels' battle cry against their fallen brethren. Whether it was or not, I find it sad that humanity has found far more success in emulating God's wrath than His mercy.

In some sort of crude sense, which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin, and this is a knowledge which they cannot lose.

J. Robert Oppenheimer

#1 Idea: Continuing to blog when you have a PhD qualifying oral exam in 5 days.

Hence, the blog's going to be on hiatus until Thursday, April 29th or so, when I'll start tackling two big topics: (1) what a grand centrist compromise between liberals and conservatives might look like, and (2) a common-sense guide to contemplating interpretations of quantum mechanics without going insane.

If you want to do some background reading on these topics…

************************

(For Topic 1 – On Radical Centrist Reform of Government)

My starting point is the proposal of Matthew Miller, which can be found on his blog and in his book The 2% Solution. The book gets its name from the notion that by returning the size of government to 22% of GDP (the average during the Reagan-Bush years) rather than the 20% of GDP now, we could implement a comprehensive domestic policy including universal health care, lifting the working poor out of poverty, comprehensive education reform, and public financing of political campaigns. If all that sounds really lefty, note that the execution of all these ideas is done in ways conservatives should consider "market-friendly" and "freedom-promoting".

My main contention will be that for a real grand centrist bargain between liberals and conservatives, a bit more is going to have to be on the table. Notably, major tax reform is going to have to be on the table, which is something Miller neglects. (NB: I said tax reform, i.e., changes to make tax law simpler, more enforceable, and less prone to promulgating perverse incentives, and thus make the economy more efficient, which we really can’t afford not to do anymore… not to be confused with tax cuts, which at this point we sadly can’t afford to do anymore). For ideas, consider:

Joel Slemrod and Jon Bakija, Taxing Ourselves: A Citizen's Guide to the Great Debate over Tax Reform, 2nd edition (Cambridge, MA: MIT Press, 2001)

The various works of Dale Jorgenson, Professor of Economics at Harvard, on "Efficient Taxation of Income" (e.g., the following, listed in order from most to least difficult):

"Efficient Taxation of Income," with Kun-Young Yun (November 15, 2002) [a long, technical journal paper]

PDF slides presented to the Office of Tax Analysis, August 2, 2002 [a technical presentation]

"A Smarter Type of Tax," Financial Times, June 19, 2002 [an op-ed piece]

"Efficient Taxation of Income," Harvard Magazine, March-April 2003, Volume 105, Number 4 [a bit of Harvard alumni propaganda]

************************

(For Topic 2 – The Fundamental Nature of Reality or The Lack Thereof)

For laypersons:

John Polkinghorne's stunningly concise and remarkably non-misleading popularization Quantum Theory in Oxford University Press's "Very Short Introduction" series, which is full-text viewable at Amazon.com

For technical readers with positivist proclivities:

Are you the type of guy or gal who thinks the following?

I'll ascribe objective reality to a system's wavefunction with respect to a given observable only if, when…

1) … I have a single rendition of the system, and…

2) … I don’t have full a priori knowledge of the system’s preparation, yet…

3) … I can still somehow determine what the wavefunction is without changing it in the process

(I impose this final condition since if it’s inevitably changed, how could I know I really had it and my measurement wasn’t a fluke? I want to be able to repeat my measurements to my heart’s content until I satisfy my inner Bayesian conscience—no matter how neurotically exacting it may be.)

Then you probably should take a look at Quantum Measurement of a Single System by Orly Alter and Yoshihisa Yamamoto, which is also full-text viewable at Amazon.com, because it strongly argues that quantum mechanics will never satisfy your criteria for objective reality. The book proves that quantum mechanics as it's currently known will not allow you to obtain *any* information about a system's wavefunction without changing the wavefunction *unless* you have full a priori knowledge of the system's preparation. (For example, say you're told that a system has been prepared in a definite energy eigenstate, but you don't know the full Hamiltonian of the system. There's no way you can determine what the eigenstate is from a single copy of the system without changing the state, no matter how slowly (adiabatically) you perform your manipulations. Or, say you're given a spin and told it has a definite orientation, but you're not told what it is. There's no way to determine the orientation of a single spin without changing it unless you already know what axis it lies on.)
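
To make the spin example concrete, here's a toy Monte Carlo of my own (a sketch, not anything from the book): a single measurement of a spin pointing along a uniformly random unknown axis lets you guess the state with an average fidelity of only 2/3, and it leaves the spin along the measurement axis rather than the original one.

    import numpy as np

    rng = np.random.default_rng(0)
    trials = 200_000
    # Unknown spin axis: cos(theta) uniform on [-1, 1] gives a random direction.
    cos_theta = rng.uniform(-1.0, 1.0, trials)
    p_up = (1.0 + cos_theta) / 2.0            # Born rule for a z-axis measurement
    got_up = rng.random(trials) < p_up        # one projective measurement per copy
    # Best guess given one outcome: the +z state on "up", the -z state on "down".
    fidelity = np.where(got_up, (1 + cos_theta) / 2, (1 - cos_theta) / 2)
    print(fidelity.mean())  # ~ 0.667: one copy yields only partial information,
                            # and the spin now lies along +/-z, not its original axis.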

For technical readers fond of noumenal notions:

The following is the most straightforward and sensible thing I've come across about the "Many Worlds Interpretation" of quantum mechanics (which, in case you haven't noticed, underlies the joke that gives this blog its name):

Max Tegmark, "The interpretation of quantum mechanics: many worlds or many words?", Fortschr. Phys. 46, 855-862 (1998) [downloadable Postscript file]

(His webpage devoted to Many Worlds http://www.hep.upenn.edu/~max/everett.html is worth a look too.)

For mathematically minded readers who want to codify a logical system that would encompass all possible interpretations of quantum mechanics that do not involve wavefunction collapse:

Does the following sound like a good idea to you?

"… quantum mechanics is not about physical systems that exhibit a peculiar and elusive ontology, but rather about physical systems with a non-Boolean property structure. The problem is then how to make sense of a quantum world in which the properties of systems 'fit together' in a non-Boolean way."

How about when it’s expressed technically?

The actual properties in a classical world are selected by a 2-valued homomorphism from a set of alternatives defined by a fixed Boolean algebra — the Boolean algebra of subsets of the phase space of the classical world — representing the same collection of possible properties at all times. The actual properties in a quantum world at time t are selected by a 2-valued homomorphism from a set of alternatives defined by a dynamically evolving non-Boolean sublattice of all subspaces of the Hilbert space of the quantum world. So the actual properties in a classical world evolve in a fixed Boolean possibility space, while the actual properties in a quantum world evolve in a dynamically changing non-Boolean possibility space. Classically, only the actual properties are time-indexed; quantum-mechanically, both the actual properties and the possible properties are time-indexed.

Modulo the usual philosophical worries about modality, there is nothing inherently strange about the notions of possibility or actuality in quantum mechanics. Once we have a precise handle on what is possible and what is actual, and how change proceeds, the rest is calculation. So measurement comes out as a process internal to the theory, on the basis of the dynamics alone, without requiring any privileged status for observers.

If you answered yes to either question, and especially if you answered yes to the latter question, you should really read the book from which they came:

Jeffrey Bub, Interpreting the Quantum World (Cambridge Univ. Press, corrected edition 1999), which, you guessed it, is also full-text viewable at Amazon.com.

[NB: Bub boasts the stunning academic lineage of being a student of physicist David Bohm and philosophers Karl Popper and Imre Lakatos.]
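
If you'd like a concrete taste of what "non-Boolean" means here, the following is a small illustration of my own (not from Bub's book): in the lattice of subspaces of a qubit's Hilbert space, meet (intersection) and join (span) fail to distribute over each other, unlike AND and OR in a Boolean algebra.

    import numpy as np

    def proj(v):
        """Projector onto the ray (1-d subspace) spanned by v."""
        v = np.asarray(v, dtype=complex).reshape(-1, 1)
        v = v / np.linalg.norm(v)
        return v @ v.conj().T

    def join(P, Q):
        """Projector onto the span of the two subspaces (lattice 'or')."""
        U, s, _ = np.linalg.svd(np.hstack([P, Q]))
        B = U[:, : int(np.sum(s > 1e-10))]
        return B @ B.conj().T

    def meet(P, Q):
        """Projector onto the intersection of the subspaces (lattice 'and')."""
        I = np.eye(P.shape[0])
        return I - join(I - P, I - Q)

    x, y, z = proj([1, 0]), proj([0, 1]), proj([1, 1])

    lhs = meet(x, join(y, z))           # x ^ (y v z) = x, since y v z is all of C^2
    rhs = join(meet(x, y), meet(x, z))  # (x ^ y) v (x ^ z) = 0 v 0 = 0
    print(np.allclose(lhs, x), np.allclose(rhs, 0))  # True True: distributivity fails

In a Boolean algebra the two sides would always agree; their disagreement here is exactly the non-Boolean "fitting together" the quote is talking about.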

Critiques of “Mainstream” Economics

In case you haven’t noticed, my blog is meant to be an outlet for constructive procrastination… mostly for me, its author… but also for you, gentle readers. On that note, let me begin what I am sure will be a long, long series of ideas for constructive procrastination.

I’ll start with one of the most outstanding outlets I’ve found: critiques of “mainstream economics”.

Before continuing, let me emphasize why I put “mainstream” in quotes. As a friend of mine who’s in Stanford’s PhD Economics program rightly reminded me a few months ago:

In many ways “mainstream economics” is a very rickety and much abused straw man, and no one really does it any more. I think the basic tenets of modern economics are:

a) It is useful to assume that in making decisions, individuals are maximising something, given their limited information and cognition.

b) Equilibrium is a useful way to study behaviour.

Apart from this, everything else is open season. When doing policy, you have to either agree that

1) People are stupid, or

2) Society and its institutions are evil.

When thinking about policy, I tend to be more comfortable with (2): the idea that institutions (including markets) fail more often than not, but individuals do the best they can given the circumstances. Of course, behavioural economists are more willing to believe the people-are-stupid thesis.

But the much-maligned "mainstream" economics has not been studied, at least in active research institutions, for quite some time. Doing it is not likely to get you published in a top journal. However, even then, there are a few people around, particularly in France (and amongst my old buddies in Cambridge, England), who feel that there is a lot lacking from the way economics is approached today. Their main concern is that we are too mathematical in our approach, thus confining the questions we ask. After all, Adam Smith and Ricardo did not use Kuhn-Tucker theorems or dynamic programming to find their insights. For more on this, it's good to check out www.paecon.net.

With those requisite words of caution from a professional out of the way, let’s start the critiques. I’ll start with the best one I’ve found and understand: Section 6.14 of Herb Gintis’s Game Theory Evolving. It’s short, sweet, comprehensive, rigorous, chock-full of “news-you-can-use”, and diplomatic to boot. As he writes:

6.14.1 Costless Contract Enforcement: Achilles’ Heel of Neoclassical Economics

At the heart of neoclassical economics is the notion that the price mechanism equates supply and demand. The Fundamental Theorem of Welfare Economics, in particular, says that in a general equilibrium model, under appropriate conditions every competitive equilibrium is Pareto-optimal, and every feasible distribution of utility among agents can be attained by a suitable initial distribution of property rights, followed by competitive exchange (see Section 3.19). There have been many critiques of this model, and despite its enormous contribution to economic theory, it is no longer on the cutting edge of research these days. (Sidenote: That it is still taught to students as the “ideal case” from which the real world is an imperfect realization strikes me as scandalous.) Nonconvexity, externalities, multiple equilibria, and the absence of equilibration mechanisms entailing local stability, as well as other arcane issues, are often mentioned as the problem with the neoclassical model. But the biggest problem, easily trumping the others, is that the model assumes that agents can write costlessly enforceable contracts for all goods exchanged on markets. This may be reasonable for standardized goods (for instance, raw materials, basic chemicals, and easily graded agricultural goods) but, as we have seen, is misleading when applied to labor, credit, and consumer goods markets.

And then Gintis goes on to emphasize that the key thing about labor, credit, and consumer goods markets that causes neoclassical economics to be misleading is that they are contingent renewal markets (i.e., an essential aspect of them is that they're repeated games in which the principal can tell an agent to take a hike if the agent doesn't satisfactorily keep a promise). In particular, Gintis claims that the salient points about contingent renewal markets are:

In general, intuitively, “where product quality cannot be ensured by explicit contract, goods are in excess supply.” [Gintis, p. 138]

And specifically:

a) “In a Nash equilibrium of a contingent renewal market, there is an excess supply of agents.” [Gintis, p. 137]
b) “Contingent renewal markets do not clear, and in equilibrium they allocate power to agents located on the short side of the market.” [Gintis, p. 139]
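
To see mechanically where claims (a) and (b) come from, here's a toy no-shirking calculation of my own in the spirit of contingent renewal (a sketch with made-up parameters, not Gintis's actual model): the wage needed to make the threat of dismissal bite strictly exceeds the agent's reservation payoff plus effort cost, so at the equilibrium wage agents strictly prefer having the job to not having it.

    def no_shirk_wage(r, e, delta, p):
        """Smallest wage at which an employed agent prefers working (per-period
        effort cost e) to shirking, when shirking is caught with probability p
        per period, getting caught means dismissal to a reservation payoff r
        per period, and future payoffs are discounted by delta."""
        # Derivation: equate the value of always working, (w - e)/(1 - delta),
        # with the shirker's value, (w + delta*p*r/(1-delta)) / (1 - delta*(1-p)),
        # and solve for w.
        return r + e + e * (1 - delta) / (delta * p)

    # With made-up numbers, the no-shirking wage exceeds the market-clearing
    # wage r + e, so everyone wants these jobs: an excess supply of agents.
    print(no_shirk_wage(r=1.0, e=0.2, delta=0.95, p=0.5))  # ~1.221 > r + e = 1.2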

Such facts make a lot of things that are actually observed in real-world labor, credit, and consumer goods markets a heck of a lot more explicable theoretically, so much so that Gintis believes the contingent renewal market model should be the second thing taught in any introductory economics class, right after supply and demand. Alas, it isn't, and thus introductory economics courses have the annoying side effect of producing 19-year-old libertarians who think the question of what constitutes the optimal society is a long-settled one (and, worse, think that the pro-business wing of the modern Republican party is basically libertarian).