Apr 05 2013
 

After another 550 km drive at the end of an already very long day, I finally made it home late last night, concluding a very productive 3-day visit at the Perimeter Institute.

While there, I gave another talk on the Pioneer anomaly. I felt that it went well and as far as I can tell, it was very well received.

All in all, it was time well spent.

 Posted by at 9:49 pm
Mar 19 2013
 

Looking out my window this morning, here is the winter landscape that I saw:

This is not what those blasted groundhogs promised. They are bald-faced liars, the little creeps. The next time you run into Punxsutawney Phil or Wiarton Willie, keep an eye on your wallet; you just don’t know what the little sons of bitches are capable of.

 Posted by at 8:23 am
Mar 17 2013
 

In a post a few days ago, I expressed my skeptical views concerning the interpretation of some of the recent Higgs results from CERN. I used a simple analogy, an example in which measuring the average height of the inhabitants in a set of buildings is used to determine which of them may house the Harlem Globetrotters.

However, I came to realize (thanks in part to some helpful criticism that my post received) that I left out one possibility. What if the buildings are too small? Or the ‘Trotters are just too, hmm, tired after a long party and end up in the wrong building? In that case, a measurement may look like this:

If we have an a priori reason to believe that, for whatever reason, the players are indeed spread out across several buildings, then we cannot expect to see a sharp peak at #4 (or whichever building is assigned to the Globetrotters); instead, we should see a broad excess, which is just what the CMS experiment saw when it measured the decay of the presumed Higgs boson into a τ⁺τ⁻ pair.

So is there an a priori reason for the data to be spread out like this? I believe there is. No instrument detects τ leptons directly, as their lifetime is too short. Instead, τ events are reconstructed from decay products, and all forms of τ decay involve at least one neutrino, which may carry away a significant portion of the lepton’s energy. So the final uncertainty in the total measured energy of the τ⁺τ⁻ pair can be quite large.

In other words, many of the Globetrotters may indeed be sleeping in the wrong building.
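The smearing effect of those escaping neutrinos can be illustrated with a crude toy Monte Carlo. This is not a detector simulation; the visible-energy fractions and the reconstruction formula below are made up purely for illustration:

```python
import random

random.seed(1)

def reconstructed_masses(true_mass=125.0, n_events=50_000):
    """Toy model: each τ loses a random fraction of its energy to
    neutrinos, so the visible di-τ mass comes out low and broadly
    spread.  The uniform visible-fraction range is an arbitrary
    illustration, not a physical distribution."""
    masses = []
    for _ in range(n_events):
        f1 = random.uniform(0.2, 1.0)  # visible energy fraction, τ #1
        f2 = random.uniform(0.2, 1.0)  # visible energy fraction, τ #2
        # crude scaling: visible mass ~ sqrt(f1 * f2) * true mass
        masses.append((f1 * f2) ** 0.5 * true_mass)
    return masses

m = reconstructed_masses()
mean = sum(m) / len(m)
spread = (sum((x - mean) ** 2 for x in m) / len(m)) ** 0.5
print(f"visible mass: {mean:.0f} ± {spread:.0f} GeV (true mass: 125 GeV)")
```

Even in this cartoon version, a perfectly sharp 125 GeV resonance turns into a wide distribution well below the true mass: the Globetrotters end up spread across many buildings.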

Nonetheless, as my copy of the venerable 20-year-old book, The Higgs Hunter’s Guide, suggests, the decay into τ leptons can be a valuable means of confirmation. Which is perhaps why it is troubling that for now, the other major detector at the LHC, ATLAS, has failed to see a similar broad excess of τ⁺τ⁻ events near the presumed Higgs mass.

 Posted by at 10:23 am
Mar 14 2013
 

I have been reading a lot today about the latest news from Europe, the supposed confirmation that the elementary particle observed at CERN may indeed be the Higgs boson.

And while they are probably right, I feel that the strong pronouncements may be a little premature and perhaps unwarranted.

Let me demonstrate my thoughts using a simple example and some pretty pictures.

Suppose you go to a camp site. At that camp site there are five buildings, each of the buildings housing a different team. One may be a literary club, another may be a club of chess enthusiasts… but you have reason to believe that one of the buildings is actually occupied by the Harlem Globetrotters.

Suppose that the only measurement available to you is a measurement of the average height of the people housed in each of the buildings. You of course know what the mean height and its standard deviation are for the entire population. So then, suppose you are presented with a graph that shows the average height, with error bars, of the people housed in each of five buildings:

The red dashed line is the population average; the green and yellow shaded areas correspond to one and two standard deviations; and the black dots are the actual data points, with standard deviations, representing the average height of the residents in each of the five buildings.

Can you confirm from this diagram that one of the buildings may indeed be housing the Harlem Globetrotters? Can you guess which one? Why, it’s #4. Easy, wasn’t it? It is the only building in which the average height of the residents deviates significantly from the population (background) average, whereas the heights of the residents of all the other buildings are consistent with the “null hypothesis”, namely that they are random people from the population background.
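The reasoning behind “#4 sticks out” is just a significance calculation. Here is a minimal sketch with entirely hypothetical numbers (population statistics, building averages, and resident counts are all invented for illustration):

```python
import math

POP_MEAN, POP_SD = 175.0, 7.0  # assumed population height stats, in cm

def significance(sample_mean, n_residents):
    """Deviation of a building's average height from the population
    mean, in units of the standard error of the mean."""
    return (sample_mean - POP_MEAN) / (POP_SD / math.sqrt(n_residents))

# hypothetical averages for five buildings of 20 residents each;
# building #4 is where the very tall team lives
buildings = [176.1, 174.3, 175.8, 185.0, 173.9]
for i, avg in enumerate(buildings, start=1):
    print(f"building #{i}: {significance(avg, 20):+.1f} sigma")
```

With these numbers, four buildings sit within about one standard error of the population mean, while #4 deviates by several: the toy analogue of a clean discovery plot.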

But suppose instead that the graph looks like this:

Can you still tell which building houses the Globetrotters? Guess not. It could be #2… or it could be #4. But if you have other reasons to believe that #4 houses the Globetrotters, you can certainly use this data set as a means of confirmation, even though you are left wondering why #2 also appears perhaps as an outlier. But then, outliers sometimes happen as mere statistical flukes.

But suppose instead that you see a plot like this one:

What can you conclude from this plot? Can you conclude anything? Is this a real measurement result, and perhaps the entire camp site has been taken over by tall basketball players? Or do you have a systematic error in your measurement, using the wrong ruler maybe? You simply cannot tell. More importantly, you absolutely cannot tell whether or not any of the buildings houses the Harlem Globetrotters, much less which one, despite the fact that building #4 is still about four standard deviations away from the population average. Until you resolve the issue of the systematic, this data set cannot be used to conclude anything.

But then, why are we told that a similar-looking plot, this one indicating the rate of Higgs boson decay into a pair of τ particles (the heaviest cousin of the electron), indicates a “local significance of 2.9σ”? With a “best fit μ = 1.1 ± 0.4” for a 125 GeV Higgs boson?

It indicates no such thing. The only thing this plot actually indicates is the presence of an unexplained systematic bias.

Or am I being stubbornly stupid here?

 Posted by at 10:01 pm
Mar 10 2013
 

To the esteemed dinosaurs in charge of whatever our timekeeping bureaucracies happen to be: stop this nonsense already. We no more need daylight saving time in 2013 than we need coal rationing.

It is an outdated idea, the benefits of which may have been dubious even at the time of its inception, and are almost certainly nonexistent today. But the harm is real: you are subjecting the entire population to a completely unnecessary one-hour jetlag each spring.

Being self-employed and working mostly from my home, I am among the least affected, but I still find this clock-forwarding business just boneheadedly stupid and annoying.

Oh, and while you are at it… would you please get rid of leap seconds, too? Another harmful solution to a nonexistent problem. So what if our clocks are out of whack by a second with respect to the Earth’s rotation? Does it bother anyone?

Oh wait. The organization in charge of leap seconds is the ITU. The same ITU that is busy trying to place the Internet under international regulation, at the bidding of such champions of Internet freedom as China or Russia. No wonder they have little time left in their busy schedule to abolish leap seconds.

 Posted by at 9:07 am
Mar 01 2013
 

I was watching the seemingly flawless launch of SpaceX’s resupply flight to the ISS and, like others, I was flabbergasted when the Webcast was suddenly blacked out (“Please Stand By”); then the flight director came on, announced that the spacecraft was experiencing an anomaly and that more information would be provided at a press conference in a few hours, and… that’s it. Webcast ends.

So like other good early 21st century netizens, I turned to Twitter: the speculation is that the spacecraft may have failed to deploy its solar arrays, perhaps because there was no fairing separation. This is Bad News. Some speculated that Dragon has sufficient battery power to make it to the ISS and that a spacewalk might fix things, but I don’t think things are that simple.

I guess there is nothing to do but wait for that press conference.

The live video was breathtaking, by the way. Watching the bell of the second stage engine glow yellowish-red was amazing.


So… my fingers remain firmly crossed.

 Posted by at 10:40 am
Feb 21 2013
 

The news is that Dennis Tito, the first ever space tourist to go to the International Space Station, is planning a privately financed manned flyby mission to Mars in 2018.

I don’t know how feasible it is. I actually have doubts that they will succeed. And the scientific value of such a mission would likely be negligible.

Even so… I dearly hope that they succeed. And if they asked me to go, I’d sign up without hesitation, despite the prospect of spending 500 days with another human being locked up in a tiny capsule, despite the significant probability that we won’t make it back alive.

It is okay to think about the economics, technical feasibility, and scientific value of a space mission, but all too often these days, we forget that other thing: inspiration. Sometimes, that’s worth a great deal. A generation of Soviet scientists and engineers inspired by Sputnik or the flight of Gagarin, and a generation of American scientists and engineers inspired by Apollo and Armstrong’s “one small step”, can bear witness to this.

 Posted by at 7:01 pm
Feb 15 2013
 

Chances are that if you tuned your television to a news channel these past couple of days, it was news from the skies that filled the screen. First, it was about asteroid 2012DA14, which flew by the planet at a relatively safe distance of some 28,000 kilometers. But even before this asteroid reached its point of closest approach, there was the striking and alarming news from the Russian city of Chelyabinsk: widespread damage and about a thousand people injured as a result of a meteor that exploded in the atmosphere above the city.

What I found rather distressing is just how scientifically illiterate the talking heads proved to be on television. First, it was CNN’s turn to be ridiculed after their anchor, Deborah Feyerick, actually asked the astonishing question, “Is this an effect of, perhaps, of global warming, or is this just some meteoric occasion?”

But then came the rest. I think it was on the Canadian network CTV (but I might be misremembering) where an anchor announced that an asteroid “the size of Texas” is about to fly by the Earth. Well… 2012DA14 is not the size of Texas, not unless Texas has shrunk a great deal since the last time I visited the Lone Star State (which was just a few weeks ago); the asteroid was only about 50 meters across.

And then the impact event in Russia. Initial estimates that I heard indicated an object weighing a few tons, traveling perhaps at 30 km/s; that’s still a significant amount of kinetic energy, maybe about a quarter or half of a kiloton if I am not mistaken. But then, a later and apparently more reliable estimate said that the object was perhaps 15 meters in diameter, traveling at 18 km/s. That, depending on the density of the object, is consistent with another estimate that I heard, 300 kilotons of energy released. If this latter estimate is valid, this was the biggest event since the Tunguska impact of 1908.

So where does the illiteracy come in?

One CNN anchor, describing the event, mentioned that thankfully, it occurred over a sparsely populated area, and the outcome would have been much worse had it occurred over a major population center. I wonder if residents of Chelyabinsk, a city of well over a million people, are aware that they qualify as a “sparsely populated area”.

And then there were the completely inconsistent size and mass estimates. A release by The Planetary Society spoke of an object 15 meters in diameter and weighing 8 tons. Say what? That’s just four times the density of air. The object in question actually weighed more like 8,000 metric tons.
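These back-of-the-envelope figures are easy to check with a few lines of arithmetic. The stony density I plug in below is my own assumption (a typical value for chondritic rock); everything else comes from the numbers quoted above:

```python
import math

KILOTON = 4.184e12  # joules per kiloton of TNT equivalent

def energy_kt(mass_kg, speed_m_s):
    """Kinetic energy, expressed in kilotons of TNT equivalent."""
    return 0.5 * mass_kg * speed_m_s ** 2 / KILOTON

# early estimate: a few (say, 5) tons at 30 km/s -> a fraction of a kiloton
print(f"early estimate: {energy_kt(5_000, 30_000):.1f} kt")

# later estimate: 15 m diameter at 18 km/s; the ~3300 kg/m^3 density
# of ordinary chondrite rock is my assumption
volume = 4.0 / 3.0 * math.pi * (15.0 / 2) ** 3   # ~1770 m^3
mass = 3_300 * volume                            # several thousand tons
print(f"later estimate: {mass / 1000:.0f} t, {energy_kt(mass, 18_000):.0f} kt")

# sanity check on the 8-ton figure: 8 tons in that volume implies
# a density only a few times that of air
print(f"implied density: {8_000 / volume:.1f} kg/m^3 (air: ~1.2)")
```

The first line lands in the quarter-to-half-kiloton range; the second lands in the hundreds of kilotons, within a density-dependent factor of the 300-kiloton estimate; and the last line shows just how absurd the 8-ton claim was.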

Another CNN anchor was interrogating a physicist, wondering what causes these meteors to explode. The physicist was unable to explain coherently, and the anchor was unable to comprehend, the concept that it is just the kinetic energy of a very rapidly moving object that gets converted into heat pretty much instantaneously, heating up the air, which then rapidly expands and creates a shock wave. Come on guys, this is really not that hard!

Later in the afternoon, 2012DA14 finally did make its closest approach, as harmlessly as predicted, but there was obvious confusion in the news media about its visibility; yes, it was over the Indian Ocean at the time, but no, even there nobody could see it with the naked eye, much less find it “spectacular”.

I don’t think I am needlessly pedantic, by the way. On the contrary, I find it alarming that in our world which relies on increasingly sophisticated technology, people who are entrusted with the task of keeping us informed are this illiterate on matters of science and technology. Or even geography.

 Posted by at 10:44 pm
Feb 14 2013
 

I always thought of myself as a moderate conservative. I remain instinctively suspicious of liberal activism, and I do support some traditionally conservative ideas such as smaller governments, lower taxes, or individual responsibility.

So why am I not a happy camper nowadays with a moderate conservative government in Ottawa?

Simple: because they are not moderate. To me, moderate conservatism means evidence-based governance. A government that, once its strategic goals are formulated, puts aside ideology and governs on the basis of available facts and the best scientific advice they can obtain.

But this is not what Mr. Harper’s conservative government is doing. Quite the contrary, they engage in one of the worst of sins: they try to distort facts to suit their ideology. Most recently, it is Fisheries and Oceans that is imposing confidentiality rules on participating researchers that “would be more appropriate for classified military research”.

I am appalled.

 Posted by at 10:58 am
Jan 25 2013
 

I came across this image on a Facebook page dedicated to the former glory of the Soviet Union. It is titled “Russia and the USSR: similar, yet noticeably different.”

There is, unfortunately, far too much truth in what the image depicts. It does not make me wish for Soviet times to return, but it does make me wonder why so much good had to be thrown away along with the bad.

 Posted by at 3:31 pm
Jan 20 2013
 

John Marburger had an unenviable role as Director of the United States Office of Science and Technology Policy. Even before he began his tenure, he already faced demotion: President George W. Bush decided not to confer upon him the title “Assistant to the President for Science and Technology”, a title borne by both his predecessors and his successor. Marburger was also widely criticized by his colleagues for his efforts to defend the Bush Administration’s scientific policies. He was not infrequently labeled a “prostitute” or worse.

I met Marburger once in 2006, though to be honest, I don’t recall if I actually conversed with him one-on-one. He gave the keynote address at an international workshop organized by JPL, titled From Quantum to Cosmos: Fundamental Physics Research in Space, which I attended.

If Marburger felt any bitterness towards his colleagues or towards his own situation as a somewhat demoted science advisor, he showed no signs of it during that keynote address. Just as there are no signs of bitterness or resentment in his book, Constructing Reality, which I just finished reading. Nor is there any hint of his own mortality, even though he must have known that his days were numbered by a deadly illness. No, this is a book written by a true scientist: it is about the immortal science that must have been his true passion all along.

It is an ambitious book. In Constructing Reality, Marburger attempts the impossible: explain the Standard Model of particle physics to the interested and motivated lay reader. Thankfully, he does not completely shy away from the math; he realizes that without at least a small amount of mathematics, modern particle physics is just not comprehensible. I admire his (and his publisher’s) courage to face this fact.

Is it a good book? I honestly don’t know. I certainly enjoyed it very much. Marburger demonstrated a thorough, and better yet, intuitive understanding of some of the most difficult aspects of the Standard Model and quantum field theory. But I am the wrong audience: I know the science that he wrote about. (That is not to say that his insight was not helpful in deepening my understanding.) Would this book be useful to the lay reader? Or the aspiring young physicist? I really cannot tell. Learning the principles of quantum field theory is not easy, and in my experience, we each take our own path towards a deeper understanding. Some books help more than others but ultimately, what helps the most is practice: there is no substitute for working out equations on your own. Still, if the positive reviews on Amazon are any indication, Marburger succeeded in writing a book “for [his] friends who are not physicists”.

Marburger died much too soon, at the age of 70, after he lost his battle with cancer. His book was published posthumously (which perhaps explains why the back flap of the book’s dust jacket contains his short bio and room for a photograph above, but no actual photo. Or perhaps I am just imagining things.) But his words survive and inspire others. Well done, Dr. Marburger. And thanks.

 Posted by at 10:37 am
Jan 19 2013
 

Recently I came across a blog post that suggests (insinuates, even) that proponents of modified gravity ignore the one piece of evidence that “incontrovertibly settles” the question in favor of dark matter. Namely this plot:

From http://arxiv.org/abs/1112.1320 (Scott Dodelson)


In this plot, the red data points represent actual observation; the black curve, the standard cosmology prediction; and the various blue curves are predictions of (modified) gravity without dark matter.

Let me attempt to explain briefly what this plot represents. It’s all about how matter “clumps” in an expanding universe. Imagine a universe filled with matter that is perfectly smooth and homogeneous. As this universe expands, matter in it becomes less dense, but it will remain smooth and homogeneous. However, what if the distribution of matter is not exactly homogeneous in the beginning? Clumps that are denser than average have more mass and hence, more gravity, so these clumps are more able to resist the expansion. In contrast, areas that are underdense have less gravity and a less-than-average ability to resist the expansion; in these areas, matter becomes increasingly rare. So over time, overdense areas become denser, underdense areas become less dense; matter “clumps”.

Normally, this clumping would occur on all scales. There will be big clumps and small clumps. If the initial distribution of random clumps was “scale invariant”, then the clumping remains scale invariant forever.

That is, so long as gravity is the only force to be reckoned with. But if matter in the universe is, say, predominantly something like hydrogen gas, well, hydrogen has pressure. As the gas starts to clump, this pressure becomes significant. Clumping really means that matter is infalling; this means conversion of gravitational potential energy into kinetic energy. Pressure plays another role: it sucks away some of that kinetic energy and converts it into density and pressure waves. In other words: sound.

Yes, it is weird to talk about sound in a medium that is rarer than the best vacuum we can produce here on the Earth, and over cosmological distance scales. But it is present. And it alters the way matter clumps. Certain size scales will be favored over others; the clumping will clearly show preferred size scales. When the resulting density of matter is plotted against a measure of size scale, the plot will clearly show a strong oscillatory pattern.

Cosmologists call this “baryonic acoustic oscillations” or BAO for short: baryons because they represent “normal” matter (like hydrogen gas) and, well, I just explained why they are “acoustic oscillations”.

In the “standard model” of cosmology, baryonic “normal” matter amounts to only about 4% of all the matter-energy content of the visible universe. Of the rest, some 24% is “dark matter”, the rest is “dark energy”. Dark energy is responsible for the accelerating expansion that the universe has apparently experienced over the past 4-5 billion years. But it is dark matter that determines how matter in general clumped over the eons.

Unlike baryons, dark matter is assumed to be “collisionless”. This means that dark matter has effectively no pressure. There is nothing that could slow down the clumping by converting kinetic energy into sound waves. If the universe had scale invariant density perturbations in the beginning, it will be largely scale invariant even today. In the standard model of cosmology, most matter is dark matter, so the behavior of dark matter will dominate over that of ordinary matter. This is the prediction of the standard model of cosmology, and this is represented by the black curve in the plot above.

In contrast, in a cosmology without dark matter, the only matter present is baryonic matter, with pressure. Hence, oscillations are unavoidable. The resulting blue curves may differ in detail, but they share two prevailing characteristics: they are strongly oscillatory, and they have the wrong slope.

That, say advocates of the standard model of cosmology, is all the proof we need: it is incontrovertible evidence that dark matter has to exist.

Except that it isn’t. And we showed that it isn’t, years ago, in our paper http://arxiv.org/abs/0710.0364, and also http://arxiv.org/abs/0712.1796 (published in Class. Quantum Grav. 26 (2009) 085002).

First, there is the slope. The theory we were specifically studying, Moffat’s MOG, includes among other things a variable effective gravitational constant. This variability of the gravitational constant profoundly alters the inverse-square law of gravity over very long distance scales, and this changes the slope of the curve quite dramatically:

From http://arxiv.org/abs/0710.0364 (J. W. Moffat and V. T. Toth)


This is essentially the same plot as in Dodelson’s paper, only with different scales for the axes, and with more data sets shown. The main feature is that the modified gravity prediction (the red oscillating line) now has a visually very similar slope to the “standard model” prediction (dashed blue line), in sharp contrast with the “standard gravity, no dark matter” prediction (green dotted line) that is just blatantly wrong.

But what about the oscillations themselves? To understand what is happening there, it is first necessary to think about how the actual data points shown in these plots came into existence. These data points are the result of large-scale galaxy surveys that yielded a three-dimensional data set (sky position providing two coordinates, with the measured redshift serving as a stand-in for the third, namely distance) for millions of distant galaxies. These galaxies were then organized in pairs, and the statistical distribution of galaxy-to-galaxy distances was computed. These numbers were then effectively binned using a statistical technique called a window function. The finite number of galaxies and, therefore, the finite size of the bins necessarily introduce an uncertainty, a “smoothing effect” if you wish, that tends to wipe out oscillations to some extent. But to what extent? Why, that is easy to estimate: all one needs to do is to apply the same window function technique to simulated data created using the gravity theory in question:

From http://arxiv.org/abs/0710.0364 (J. W. Moffat and V. T. Toth)


This is a striking result. The acoustic oscillations are pretty much wiped out completely except at the lowest of frequencies; and at those frequencies, the modified gravity prediction (red line) may actually fit the data (at least the particular data set shown in this plot) better than the smooth “standard model” prediction!

To borrow a word from the blog post that inspired mine, this is incontrovertible. You cannot make the effects of the window function go away. You can choose a smaller bin size but only at the cost of increasing the overall statistical uncertainty. You can collect more data of course, but the logarithmic nature of this plot’s horizontal axis obscures the fact that you need orders of magnitude (literally!) more data to achieve the required resolution where the acoustic oscillations would be either unambiguously seen or could be unambiguously excluded.
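The smoothing effect of a window function is easy to demonstrate on a toy spectrum. Everything below is invented for illustration: the power law, the oscillation scale, and the window width are arbitrary numbers, not MOG’s actual prediction or a real survey window:

```python
import math

def spectrum(k, amp):
    """Toy power spectrum: a smooth power law modulated by an
    'acoustic' oscillation of fractional amplitude `amp`."""
    return k ** -1.5 * (1.0 + amp * math.sin(math.pi * k / 0.05))

def windowed(k0, width, amp, n=400):
    """Average the spectrum under a Gaussian window of the given
    width, mimicking the binning of finite survey data."""
    total, norm = 0.0, 0.0
    for i in range(n):
        k = k0 + width * 6.0 * (i / (n - 1) - 0.5)  # scan +/- 3 widths
        if k <= 0.0:
            continue
        w = math.exp(-0.5 * ((k - k0) / width) ** 2)
        total += w * spectrum(k, amp)
        norm += w
    return total / norm

k0 = 0.225  # a peak of the toy oscillation (the sine equals 1 there)
surviving = (windowed(k0, 0.05, 0.3) / windowed(k0, 0.05, 0.0) - 1.0) / 0.3
print(f"fraction of the oscillation surviving a wide window: {surviving:.1%}")
```

With a window comparable in width to the oscillation period, almost none of the oscillation survives; shrink the window by a factor of ten and nearly all of it does. That is precisely the trade-off described above: finer bins resolve the wiggles, but only if you have vastly more data to fill them.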

Which leads me to resort to Mark Twain’s all too frequently misquoted words: “The report of [modified gravity’s] death was an exaggeration.”

 Posted by at 11:32 am
Jan 12 2013
 

Computer pioneer Alan Turing, dead for more than half a century, is still in the news these days. The debate is over whether or not he should be posthumously pardoned for something that should never have been a crime in the first place: his homosexuality. The British government has already apologized for a prosecution that drove Turing to suicide.

I was reminded of the tragic end of Turing’s life as I am reading about the death of another computer pioneer, Aaron Swartz. His name may not have been a household name, but his contributions were significant: he co-created the RSS specifications and co-founded Reddit, among other things. And, like Turing, he killed himself, possibly as a result of government prosecution. In the case of Swartz, it was not his sexual orientation but his belief that information, in particular scholarly information, should be freely accessible to all that brought him into conflict with the authorities; specifically, his decision to download some four million journal articles from JSTOR.

Ironically, it was only a few days ago that JSTOR opened up their archives to limited public access. And the trend in academic publishing for years has been in the direction of free and open access to all scientific information.

Perhaps one day, the United States government will also find itself in the position of having to apologize for a prosecution that, far from protecting the public’s interests, instead deprived the public of the contributions that Mr. Swartz will now never have a chance to make.

 Posted by at 4:53 pm
Jan 12 2013
 

I only noticed it in the program guide by accident… and I even missed the first three minutes. Nonetheless, I had loads of fun last night watching the pilot of a new planned Canadian science-fiction series, Borealis, on Space.


The premise: a town in the far north, some 30 years in the future, when major powers in the melting Arctic struggle for control over the Earth’s few remaining oil and gas resources.

In other words, a quintessentially Canadian science-fiction story. Yet the atmosphere strongly reminded me of Roadside Picnic, the world-famous novel of the Russian Strugatsky brothers (the basis of Tarkovsky’s film Stalker).

I hope it is well received and the pilot I saw last night will be followed by a full-blown series.

 Posted by at 12:41 pm
Jan 09 2013
 

A few weeks ago, I exchanged a number of e-mails with someone about the Lanczos tensor and the Weyl-Lanczos equation. One of the things I derived is worth recording here for posterity.

The Lanczos tensor is an interesting animal. It can be thought of as the source of the Weyl curvature tensor, the traceless part of the Riemann curvature tensor. The Weyl tensor, together with the Ricci tensor, fully determines the Riemann tensor, i.e., the intrinsic curvature of a spacetime. Crudely put, whereas the Ricci tensor tells you how the volume of, say, a cloud of dust changes in response to gravity, the Weyl tensor tells you how that cloud of dust is distorted in response to the same gravitational field. (For instance, consider a cloud of dust in empty space falling towards the Earth. In empty space, the Ricci tensor is zero, so the volume of the cloud does not change. But its shape becomes distorted and elongated in response to tidal forces. This is described by the Weyl tensor.)

Because the Ricci tensor is absent, the Weyl tensor fully describes gravitational fields in empty space. In a sense, the Weyl tensor is analogous to the electromagnetic field tensor that fully describes electromagnetic fields in empty space. The electromagnetic field tensor is sourced by the four-dimensional electromagnetic vector potential (meaning that the electromagnetic field tensor can be expressed using partial derivatives of the electromagnetic vector potential.) The Weyl tensor has a source in exactly the same sense, in the form of the Lanczos tensor.

The electromagnetic field does not uniquely determine the electromagnetic vector potential. This is basically how integrals vs. derivatives work. For instance, the derivative of the function y = x² is given by y′ = 2x. But the inverse operation is not unambiguous: ∫2x dx = x² + C, where C is an arbitrary integration constant. This is a recognition of the fact that the derivative of any function of the form y = x² + C is y′ = 2x regardless of the value of C; so knowing only the derivative y′ does not fully determine the original function y.

In the case of electromagnetism, this freedom to choose the electromagnetic vector potential is referred to as the gauge freedom. The same gauge freedom exists for the Lanczos tensor.
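The calculus analogy above can even be checked numerically: two antiderivatives differing only by the constant C are indistinguishable through their derivatives, just as gauge-equivalent potentials are indistinguishable through the fields they generate. A quick sketch:

```python
def derivative(f, x, h=1e-6):
    """Symmetric finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: x ** 2          # y = x^2       (C = 0)
g = lambda x: x ** 2 + 42.0   # y = x^2 + C   (C = 42, arbitrary)

for x in (0.5, 1.0, 3.0):
    # both derivatives equal 2x; the constant C has dropped out entirely
    print(f"x = {x}: f' = {derivative(f, x):.4f}, g' = {derivative(g, x):.4f}")
```

No measurement of the derivative alone can recover C; in the same way, no measurement of the Weyl tensor alone can pin down the Lanczos tensor.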

Solutions for the Lanczos tensor for the simplest case of the Schwarzschild metric are provided in Wikipedia. A common characteristic of these solutions is that they yield a quantity that “blows up” at the event horizon. This runs contrary to accepted wisdom, namely that the event horizon is not in any way special; a freely falling space traveler would never know that he is crossing it.

But as it turns out, thanks to the gauge freedom of the Lanczos tensor, it is easy to construct a solution (indeed, an infinite family of solutions) that does not behave like this at the horizon.

Well, it was a fun thing to compute anyway.

 Posted by at 3:08 pm
Jan 09 2013
 

No, not Deep Purple the British hard rock group but deep purple the color. And pink… on Australian weather maps. These are the new colors to represent the temperature range between +50 and +54 degrees Centigrade.

Deadly deep purple (and pink)

There is another word to describe such temperatures: death. This is not funny anymore. If weather like this becomes more common, parts of our planet will simply become uninhabitable by humans without high technology life support (such as reliable, redundant air conditioning). In other words, it’s like visiting an alien planet.

 Posted by at 10:06 am