Jan 22 2010
 

Here’s a nice tennis ball, photographed from both sides:

It’s a big one, mind you, almost a thousand miles across. It’s Saturn’s moon Iapetus, famous because one side of it is significantly brighter than the other. The explanation, however, is more mundane than that offered in the book version of Clarke’s 2001: A Space Odyssey; the most recent hypothesis is that the discoloration is due to the thermal migration of ice.

 Posted by at 7:44 pm
Jan 12 2010
 

I just finished a neat little paper with John Moffat. Jordan-Brans-Dicke theory is a generalization of relativity theory, in which the gravitational constant becomes variable, and the resulting (scalar) field has kinetic energy. First described by Pascual Jordan in his 1952 book Schwerkraft und Weltall, it was perhaps the first serious, “modern” modification of Einstein’s gravity. Yet it is all but ruled out by solar system data, because it predicts a value for an observable parameter that is grossly out of whack with what we actually measure. Then again… in the original form, the scalar field arises as a result of changes in spacetime curvature, only indirectly responding to the presence of matter. If we add matter as a direct source, the picture changes. Even more interesting, if matter and curvature kind of cancel each other out, we get a (still variable, still dynamical) scalar field that’s just like a field in vacuum, and agreement with solar system observations is restored.

Incidentally, Pascual Jordan was one of the greatest 20th century German physicists. Unfortunately he was also an ardent Nazi, member of the NSDAP and an SA volunteer. This may have been the reason why he never received the Nobel prize. Goes to prove that politics and science are really not a very good mix.

 Posted by at 4:04 am
Nov 03 2009
 

It looks like the Mythbusters tend to ignore air resistance.

In a recent episode, they claimed to have demonstrated that a horizontally fired bullet and a bullet that is simply dropped fall to the ground in the same amount of time. They were wrong. What they actually demonstrated is that air resistance causes the fired bullet to hit the ground later than the dropped one.

Their argument would apply perfectly in a vacuum, as on the surface of the Moon, but not here on the Earth, where the bullet’s motion is governed not just by the laws of gravity, but also by the laws of a non-conservative force, namely air resistance. (Why is it non-conservative? Some of the bullet’s kinetic energy is converted into heat, as it travels through the air at high speed. Unless we also include the thermodynamics of the air into our equations of motion, the equations will not conserve energy, as the amount of kinetic energy converted into heat will just appear “lost”.)

The bullet’s velocity, v, can be written as v² = vh² + vv², where vh is the horizontal and vv is the vertical component. The initial horizontal velocity is v0. The initial vertical velocity is 0.

Air resistance is proportional to the square of the bullet’s velocity. To be precise, acceleration due to air resistance will be

aair = κv²,

where κ is an unknown proportionality factor. (To be more precise, κ = ½cAρ/m, where c is the dimensionless drag coefficient, A is the bullet’s cross-sectional area, ρ is the density of the air, and m is the bullet’s mass.) The direction of the acceleration will be opposite the direction of the bullet’s motion.

The total acceleration of the bullet will have two components: a vertical component g due to gravity, and a component aair opposite the direction of the bullet’s motion.

The resulting equations of motion can be written as:

dvh/dt = −κv·vh,
dvv/dt = g − κv·vv.

Right here we can see the culprit: air resistance not only slows the bullet down horizontally, it also reduces its downward acceleration.
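
To get a feel for the size of the effect, here is a quick numerical sketch in Maxima, using the dynamics package’s Runge-Kutta solver. (The values of κ and v0 are the ones I derive further below; the 0.5 s time span and the step size are arbitrary choices of mine.)

load(dynamics)$
g: 10$ kappa: 0.0054$
/* full equations of motion; the third entry integrates vv to get the distance fallen */
eqs: [-kappa*sqrt(vh^2+vv^2)*vh, g - kappa*sqrt(vh^2+vv^2)*vv, vv]$
fired:   rk(eqs, [vh, vv, sv], [272.3, 0, 0], [t, 0, 0.5, 0.001])$
dropped: rk(eqs, [vh, vv, sv], [0, 0, 0],     [t, 0, 0.5, 0.001])$
last(fired); last(dropped);   /* each entry is [t, vh, vv, sv] */

By t = 0.5 s the dropped bullet has fallen about 1.25 m, while the fired one has fallen noticeably less; which is just another way of saying that the fired bullet needs more time to cover the same 1 m drop.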

This is a simple system of two differential equations in the two unknown functions vh(t) and vv(t). Its solution is not that simple, unfortunately. However, it can be greatly simplified if we notice that, since vv << vh, we have v ≅ vh, and therefore we get

dvh/dt = −κvh²,
dvv/dt = g − κvv·vh.

This system is solved by

vh = 1/(κt + C1),
vv = vh[(½κt² + C1t)g + C2].
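
(For the record, a quick symbolic Maxima check, with kappa, C1, C2 and g left symbolic, confirms that these expressions do satisfy the simplified system: both residuals below reduce to zero.)

vh: 1/(kappa*t + C1)$
vv: vh*((kappa*t^2/2 + C1*t)*g + C2)$
ratsimp(diff(vh, t) + kappa*vh^2);        /* dvh/dt + kappa*vh^2 => 0 */
ratsimp(diff(vv, t) - g + kappa*vv*vh);   /* dvv/dt - g + kappa*vv*vh => 0 */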

Given

v0 = 1/(κt0 + C1)

we have

C1 = 1/v0 − κt0,

which leads to

vh = v0/[κ(t − t0)v0 + 1],

or, if we set t0 = 0,

vh = v0/(κv0t + 1).

Similarly, given vv(0) = 0, we get C2 = 0, thus

vv = gt(κv0t + 2)/(2κv0t + 2).

The distance traveled horizontally (sh) and vertically (sv) between t0 = 0 and t1 can be obtained by simple integration of the respective velocities with respect to t between 0 and t1:

sh = log(κv0t1 + 1)/κ,
sv = g[κv0t1(κv0t1 + 2) − 2log(κv0t1 + 1)]/(2κv0)².

The claim by the Mythbusters was that the time it took for the fired bullet to hit the ground was only ~40 ms more than the time it took for a dropped bullet to fall, which is a negligible difference. But it is not! Taking g ≅ 10 m/s², it is easy to see that the time it takes for a bullet to fall from a height of 1 m, using the well-known formula ½gt², is 447 ms; the difference measured by the Mythbusters is nearly 10% of this number!

Not only did the fired bullet take longer to hit the ground, the Mythbusters’ exquisite setup allows us to calculate the bullet’s initial velocity v0 and drag coefficient κ. This is possible because the Mythbusters conveniently provided three pieces of information (I am using approximate numbers here): the length of the path that the bullet traveled (sh ≅ 100 m), the height of the bullet at the time of firing (sv ≅ 1 m), and the time it took for the fired bullet to hit the ground. Actually, what they provided was the difference between the time for a fired vs. a dropped bullet to hit the ground, but we know what it is for the dropped bullet (and because it is never moving very rapidly, we can ignore air resistance in its case), so t1 = 447 + 40 = 487 ms. The solution is given by

κ = 0.0054 m–1,
v0 = 272.3 m/s.
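
These numbers can be reproduced in Maxima, for instance, as a sketch using the mnewton package: solve the two equations sh = 100 m, sv = 1 m for κ (written as k below) and v0. The initial guesses on the last line are rough values I picked by hand.

load(mnewton)$
g: 10$ t1: 0.487$
sh: log(k*v0*t1 + 1)/k$
sv: g*(k*v0*t1*(k*v0*t1 + 2) - 2*log(k*v0*t1 + 1))/(2*k*v0)^2$
mnewton([sh - 100, sv - 1], [k, v0], [0.005, 250]);
/* => [[k = 0.0054..., v0 = 272.3...]] */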

Given a bullet cross-sectional area of A = 2 cm² = 2 × 10⁻⁴ m², an approximate air density of ρ = 1 kg/m³, and a bullet mass of m = 20 g = 0.02 kg, the dimensionless drag coefficient for the bullet can be calculated as c = 2κm/(Aρ) = 1.08, which is not at all unreasonable for a tumbling bullet. Of course the actual values of A and m may differ from the ones I’m using here, resulting in a different value for the dimensionless drag coefficient c.
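
Or, as a two-line Maxima check (same assumed numbers as above):

k: 0.0054$ m: 0.02$ A: 2e-4$ rho: 1$
c: 2*k*m/(A*rho);      /* => 1.08 */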

 Posted by at 12:39 am
Oct 15 2009
 

First, it was the multiverse. Then came Boltzmann brains. Now here’s another intriguing idea: the cancellation of the Superconducting Supercollider project in the 1990s and last year’s failure of one of the Large Hadron Collider’s magnets at CERN are just two manifestations of manifest bad luck brought about by the fact that the Higgs particle simply cannot be discovered; that its discovery at any time in the future propagates backwards in time, causing events that prevent its discovery in the first place.

An intriguing idea, though not precisely original, as something much like it was already published in the form of John Cramer’s excellent science fiction novel, Einstein’s Bridge. And when I say intriguing, I mean intriguing… as the basis of a science fiction story. But as the basis of a scientific paper? Another adjective comes to my mind… appalling.

 Posted by at 12:11 am
Sep 24 2009
 

There are certain areas of life where decades of computer expertise are quite useless, and even a reasonably thorough knowledge of theoretical physics is only of marginal use. Replacing the rotted subfloor around a leaky toilet is one such area.

Yet this is what I am presently engaged in. So far so good… using some rather evil, foul-sounding power tools, I managed to cut out much of a square hole around the drainpipe, I’m only having trouble with some corners where the power tools don’t reach. Unfortunately, I found out that the subfloor in this bathroom is actually an inch thick, as opposed to the standard, 5/8″ board that I already bought… oh well, it wasn’t a big expense anyway, and perhaps I can use that board for some other purpose later on.

For now, it’s back to Home Depot to get a piece of inch-thick wood and also some advice on cutting out those nasty corners. Maybe they can suggest a method that would be slightly more efficient than the hammer-and-chisel approach which I attempted, with  some limited success.

While I’m at it, I shall also inquire as to whether it is possible for them to cut my boards to shape to fit around the drainpipe, so that I wouldn’t have to attempt such precision cutting using my fairly limited skills and perhaps less-than-adequate set of tools. Not to mention that I value my fingers, and prefer to have all ten of them in the right place and in full working order after I’m done with all this…

But for now, it’s rest time. I have this nasty tensor algebra program to tackle, but no matter how difficult it is, I sweat a lot less doing it than when I’m cutting a subfloor with a circular saw.

 Posted by at 3:30 pm
Sep 02 2009
 

If you’re a scientist or engineer, you don’t need to be a pacifist to have good reason never to work for the military. J. Reece Roth, a 72-year-old professor emeritus at the University of Tennessee, didn’t know this when he hired two graduate students (one from Iran, one from China) and when he took his laptop to China. His reward, for a lifetime of working hard and being a loyal citizen of the United States? Four years in prison.

 Posted by at 5:50 pm
Jul 29 2009
 

I am somewhat surprised that this idea has not become more popular yet, even though it is perhaps the clearest “scientific proof” that we are, in fact, all immortal.

The “many worlds” interpretation of quantum mechanics says that the wave function never collapses: instead, every time a measurement is made, corresponding to each possible outcome a new universe comes into existence. You measure the spin of an electron and presto: there are now two universes, in one of which the spin is +1/2, in the other, -1/2. You flip a coin and presto: there are now two universes, the “heads”-universe and the “tails”-universe. (And many other universes in which the coin lands edgewise, explodes in mid-air, gets snatched by a passing eagle, or any other bizarre, improbable, but not impossible outcome that you can imagine.)

But if this is true, well, human death is just another measurement; and whereas in one universe, your heart might stop beating, in another, it beats once more. Or twice more. Or two hundred million times more.

In other words, as the universe keeps branching, you may cease to exist on many of those branches but there will always be branches on which you continue to live.

Think about it. That which you call your present consciousness will exist in an ever growing number of copies; some of those will be extinguished, but a few won’t be, not for a very, very, very long time. There is a continuous line from the here and now to the then and there, no matter how far that “then” is in the future, along which you continue to live. In other words, you can look forward to everlasting life… at least in a few of the many universes that await you.

How do you know if you’re on one of those “lucky” branches? Well, so long as you’re still alive, you are on a lucky branch, since the possibility exists that you will stay alive. Forever.

Of course there is a downside. Among the many parallel universes that represent possible futures, there are those in which you stay alive, but just barely, and in terrible pain and suffering. Or, you stay alive but you lose all your loved ones and even when you decide that it’s time to end your own life, you cannot… there is, after all, a nonvanishing probability that all your attempts at suicide fail.

But that doesn’t change the basic concept: in the multiverse, everyone is immortal. Although I am personally not too fond of the many worlds interpretation of quantum mechanics, I remain a little surprised that this idea has not yet become more popular among the religiously inclined.

 Posted by at 7:08 pm
Jul 08 2009
 

The premier Internet physics and astronomy preprint archive, ArXiv, seems to be having some serious problems tonight. I used the catchup interface to check for new papers, only to find messages like this:

Problem displaying entry for arXiv:0907.1079

Apparently all new papers are unavailable, and many older papers, too… I checked briefly and found papers dating back to last October that appear to have vanished. Including some half a dozen or so papers of my own.

I sure hope they keep backups!

 Posted by at 3:00 am
Jun 27 2009
 

I recently read a review of Weinberg’s wonderful new book, Cosmology, a 2009 sequel of sorts to his 1972 classic, Gravitation and Cosmology. The reviewer mentioned two other books, that of Mukhanov and that of Dodelson, as books worth having. Mukhanov’s Physical Foundations of Cosmology was already on my bookshelf (and, like the reviewer, I also consider it worth having) but not Dodelson’s book… so I decided to buy it.

I was not disappointed: it is an excellent cosmology book. In particular, it offers a very thorough introduction to the quantitative aspects of physical cosmology.

However… although the book was published only six years ago, it feels surprisingly dated. Through no fault of the author, to be clear: it’s just that cosmology has made tremendous progress in a few short years. I can think of two things in particular: results from the Wilkinson Microwave Anisotropy Probe (WMAP), first released in 2003, providing precision maps of the cosmic microwave background, allowing accurate detection of the so-called acoustic peaks; and ever improving large scale galaxy surveys, notably the Sloan Digital Sky Survey (SDSS), providing spectra for many hundreds of thousands of galaxies, yielding 3D density maps of the deep cosmos that can be used to test models of structure formation.

The results speak for themselves. For instance, Dodelson’s book gives 12.6 ± 1.1 billion years as the age of the Universe… in contrast, the latest WMAP result is 13.73 ± 0.12 billion years, a tenfold improvement in the accuracy of the estimate. I guess it’s not an enviable task to write a book for a field that is changing as rapidly as Modern Cosmology… which also happens to be the title of Dodelson’s book.

 Posted by at 3:04 am
Jun 11 2009
 

I was channel-surfing for news this morning, and I caught a segment on CTV’s morning show about “dirty electricity”.

I shall refrain from calling the gentleman who was interviewed a variety of unflattering names, because it would not be polite, and in any case, it’s not the person but the message that I take issue with.

Basically, he put a bunch of electronic devices like cordless phones, baby monitors, Wi-Fi routers or even fluorescent light bulbs on a test bench, plugged them in, and then held a contraption with an antenna and a speaker close to them. The contraption was making loud noises, from which this gentleman concluded that these devices “emit radiation”, and “send dirty electricity back through the wires”.

So then… what? The whole Universe is emitting similar radiation at radio frequencies. Any warm object, including the walls of your house, emits radiation at such frequencies and higher. And why should I care?

Of course, it helps to drop a few scary phrases like “skyrocketing rates of autism”. Oh, he wasn’t saying that they are related. Why should he? Merely mentioning autism while he’s talking about “dirty electricity” is enough to suggest a connection.

Just to be clear about it, almost all electronic devices emit radio frequency radiation that can then be picked up by a suitable receiver and converted into loud and scary noise. When I was 10 or so and got my first pocket calculator, I had endless fun holding it close to an AM receiver and listening to its “song”. Later, when I had my first programmable calculator, I could tell by listening to the sounds on a nearby radio if it was still executing a program, or even if it displayed the expected result or just showed an error condition. Modern calculators use so little power that their transmissions cannot be picked up so easily, but does this mean that the old calculators were a health threat? Of course not.

At such low frequencies, electromagnetic radiation does not interact with our bodies in harmful ways. To cause genetic damage, for instance, much shorter wavelengths would be needed: you have to go at least to the ultraviolet range to produce ionization and, possibly, damage to DNA. At lower frequencies, most emissions are not even absorbed by the body very effectively. The little energy that is being absorbed may turn into tiny currents, but those are far too tiny to have any appreciable biological impact. Note that we are not talking about holding a cell phone with a, say, 0.3 W transmitter just an inch from your brain (though even that, I think, is probably quite harmless, never mind sensationalist claims to the contrary); we are talking about a few milliwatts of stray radio frequency emissions not mere inches, but feet or more from a person.
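
To put a number behind that claim, here is a back-of-the-envelope Maxima calculation using E = hf (the 2.4 GHz and 300 nm figures are just representative values I picked for a typical wireless gadget and for near-ultraviolet light): a Wi-Fi-frequency photon carries several hundred thousand times less energy than even a near-ultraviolet photon.

h: 6.626e-34$     /* Planck constant, J*s */
eV: 1.602e-19$    /* one electron volt, in joules */
h*2.4e9/eV;            /* a 2.4 GHz photon: roughly 1e-5 eV */
h*(3e8/300e-9)/eV;     /* a 300 nm (near UV) photon: roughly 4 eV */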

As to “dirty electricity”, any device that produces a capacitive or inductive load on the house wiring will invariably feed some high frequency noise back through the wiring. Motors are the worst offenders, like vacuum cleaners or washing machines. Is this a problem? I doubt it. House wiring already acts as a powerful transmission antenna, continuously emitting electromagnetic waves at 60 Hz (in North America); so what if this emission is modulated further by some higher frequency noise?

But even if I am wrong about all of this, and low-frequency, low-energy electromagnetic radiation has a biological effect after all… study it by all means, yes, but it is no excuse for CTV to bring a scaremonger with his noisy gadget (clearly designed with the intent to impress, not measure) on live television.

 Posted by at 1:14 pm
Jun 09 2009
 

The reason why I am concerning myself with more Maxima examples for relativity is that I am learning some subtle things about Brans-Dicke theory and the Parameterized Post-Newtonian (PPN) formalism.

Brans-Dicke theory is perhaps the simplest modification of general relativity. Instead of the gravitational constant, G, the theory has a scalar field φ, and the theory’s Lagrangian now reads

L = [φR − ω ∂μφ ∂^μφ / φ] / 16π.

Here, R is the curvature scalar and ω is an unspecified constant of the theory.

The resulting field equations are just like Einstein’s, except for two things. First, the field equations for the metric now have additional terms containing derivatives of φ; second, there is a new field equation for the scalar field φ that basically says that the d’Alembertian of φ is proportional to the trace of the stress-energy tensor.
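
(For reference, with the Lagrangian above and the usual sign conventions, that last equation reads □φ = 8πT/(3 + 2ω), where T is the trace of the stress-energy tensor; as ω grows, the source term dies away, which is the usual way of seeing how general relativity is recovered in the large-ω limit.)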

Clever people tell you that Brans-Dicke theory is practically excluded by solar system data, as it would only work for insanely high values of ω. They demonstrate this by building approximate solutions for the theory using the PPN formalism, and find that one of the PPN parameters, γ, will have the value of γ = (1 + ω) / (2 + ω); on the other hand, observations by the Cassini spacecraft restrict γ to |γ − 1| < 2.3 × 10−5, so |ω| must be at least 40,000.
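
Just to put a number on that bound, here is a quick Maxima check using the Cassini figure quoted above:

/* gamma = (1+w)/(2+w), so 1 - gamma = 1/(2+w); impose |gamma - 1| = 2.3e-5 */
solve(1/(2 + w) = 2.3e-5, w);
float(%);    /* => [w = 43476.3...] */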

Now here’s the puzzling bit: if you solve Brans-Dicke theory in a vacuum, you find that the celebrated Schwarzschild solution of general relativity still applies: keeping φ constant, you just get back this common solution, which is known to fit solar system data well, and which has, most importantly, γ = 1, regardless of the value of ω.

So which is it? Is it γ = 1 or is it γ = (1 + ω) / (2 + ω)? Something is amiss here.

This dilemma can be resolved once you realize that whereas general relativity has a unique spherically symmetric, static vacuum solution, this is not the case for Brans-Dicke theory. This theory has an infinite family of spherically symmetric, static vacuum solutions. Indeed, I think you could actually use the value of γ to parameterize this solution space. However, once you allow some matter into that vacuum, no matter how little, you are locked in to a specific solution, for which γ = (1 + ω) / (2 + ω). In other words, the only vacuum solution that is consistent with the notion of taking the limit of a matter solution by gradually removing matter is NOT the Schwarzschild solution of general relativity, but another, incompatible solution.

This has extremely important implications for our work on MOG. So far, we have obtained a vacuum solution that appears consistent with observations on scales from the solar system to cosmology. However, a recent paper by Deng et al. challenges this work by suggesting that the MOG PPN parameter γ is not 1 and hence, the theory runs into the same trouble as Brans-Dicke theory in the solar system. Is this true? Did we pick a vacuum solution that happens to be inconsistent with matter solutions? This is what I am trying to investigate.

 Posted by at 12:45 pm
Jun 08 2009
 

Some moderately interesting Maxima examples.

First, this is how we can prove that the covariant derivative of the metric vanishes (but only if the metric is symmetric!)

load(itensor);
imetric(g);
/* covariant derivative of the metric, with no symmetries declared yet */
ishow(covdiff(g([],[i,j]),k))$
%,ichr2$     /* substitute the Christoffel symbols */
ishow(contract(canform(contract(canform(rename(expand(%)))))))$
ishow(covdiff(g([i,j],[]),k))$
%,ichr2$
ishow(canform(contract(rename(expand(%)))))$
/* now declare both the covariant and the contravariant metric symmetric and repeat: the result reduces to zero */
decsym(g,2,0,[sym(all)],[]);
decsym(g,0,2,[],[sym(all)]);
ishow(covdiff(g([],[i,j]),k))$
%,ichr2$
ishow(contract(canform(contract(canform(rename(expand(%)))))))$
ishow(covdiff(g([i,j],[]),k))$
%,ichr2$
ishow(canform(contract(rename(expand(%)))))$

Next, the equation of motion for a perfect fluid:

load(itensor);
imetric(g);
decsym(g,2,0,[sym(all)],[]);
decsym(g,0,2,[],[sym(all)]);
defcon(v,v,u);             /* contracting v with itself yields u... */
components(u([],[]),1);    /* ...and u = 1, i.e., the four-velocity is normalized */
/* stress-energy tensor of a perfect fluid */
components(T([],[i,j]),(rho([],[])+p([],[]))*v([],[i])*v([],[j])
                        -p([],[])*g([],[i,j]));
/* its covariant divergence yields the equations of motion */
ishow(covdiff(T([],[i,j]),i))$
ishow(canform(%))$
ishow(canform(rename(contract(expand(%)))))$
%,ichr2$
canform(%)$
ishow(canform(rename(contract(expand(%)))))$

Finally, the equation of motion in the spherically symmetric, static case:

load(ctensor);
load(itensor);
/* the itensor equation to convert: J_i = covariant divergence of T_i^j */
K:J([i],[])=covdiff(T([i],[j]),j);
E:ic_convert(K);           /* translate it into ctensor component code */
ct_coords:[t,r,u,v];       /* here u and v stand for the polar and azimuthal angles */
/* static, spherically symmetric metric */
lg:ident(4);
lg[1,1]:B;
lg[2,2]:-A;
lg[3,3]:-r^2;
lg[4,4]:-r^2*sin(u)^2;
depends([A,B,T,rho,p],[r]);
derivabbrev:true;
cmetric();
christof(mcs);             /* compute the Christoffel symbols (stored in mcs) */
J:[0,0,0,0];
ev(E);                     /* run the converted code, which fills in J */
/* perfect fluid: T[1,1] = rho, diagonal spatial components set to p */
T:ident(4);
T[1,1]:rho;
T[2,2]:T[3,3]:T[4,4]:p;
J,ev;

These examples are probably not profound enough to include with Maxima, but are useful to remember.

 Posted by at 5:07 pm
May 31 2009
 

I’ve been learning a lot about Web development these days: Dojo and Ajax, in particular. It’s incredible what you can do in Javascript nowadays, sophisticated desktop applications running inside a Web browser. I am spending a lot of time building a complex prototype application that has many features associated with desktop programs, including graphics, pop-up dialogs, menus, and more.

I’ve also been learning a lot about the intricacies of Brans-Dicke gravity and about the parameterized post-Newtonian (PPN) formalism. Brans-Dicke theory is perhaps the simplest modified gravity theory that there is, and I have to explain to someone why the gravity theory that I spend time working on doesn’t quite behave like Brans-Dicke theory. In the process, I find out things about Brans-Dicke theory that I never knew.

And, I’ve also been doing a fair bit of SCPI programming this month. SCPI is a standardized way for computers to talk to measurement instrumentation, and an old program I wrote used to use a non-standard way… not anymore.

Meanwhile, in what little spare time I have left, I’ve been learning Brook+, a supercomputer programming language based on C… that is because my new test machine is a supercomputer, sort of, with its graphics card that doubles as a numeric vector processor capable in theory of up to a trillion single precision floating point operations per second… and nearly as many in practice, in the test programs that I threw at it.

I’m also learning a little more about the infamous cosmological constant problem (why is the cosmological constant at least 50 orders of magnitude smaller than naive theoretical estimates, yet not exactly zero?) and about quantum gravity.

As I said in the subject… busy days. Much more fun though than following the news. Still, I did catch in the news that Susan Boyle lost in Britain’s Got Talent… only because an amazing dance group won:

 Posted by at 3:07 am
May 14 2009
 

Given the less than perfect record of the Ariane 5 launch vehicle, there was reason for concern when two new great observatories, Herschel and Planck, were launched on the same rocket this morning. Fortunately, the launch was successful, and both spacecraft are now on their merry way. Herschel is an infrared/submillimeter wavelength telescope, while Planck is “WMAP on steroids”, expected to provide much higher resolution views of the cosmic microwave background than its predecessor.

 Posted by at 4:01 pm
Apr 27 2009
 

I’ve run the first realistic tests of the kind of computation that I am planning to perform on my new machine with the GPU “supercomputer” card. Here is a “before” picture:

Self-gravitating star cluster on the CPU

And now, the exact same program running on the GPU:

Self-gravitating star cluster on the GPU

I’d say that’s quite an improvement. To say the least.

The calculation in this case computed the self-gravitational forces in a cluster of 10,000 stars… it seems that the GPU can perform this computation at least 20 times a second. That’s quite remarkable.

 Posted by at 6:13 pm
Apr 20 2009
 

I am watching Deep Impact tonight, a ten-year-old film about a comet impacting the Earth. Why the Canadian History Channel is showing this film is a good question. Future history? Imagined history?

But putting that question aside, the movie made me go to Wikipedia again, and I ended up (re-)reading several articles there relating to the issue of global warming and controversies surrounding it.

One thing that struck me (and not for the first time) is this: criticisms of global warming theories are often dismissed with the assertion that they go “against the mainstream” or are “not supported by scientific consensus.”

And global warming is by no means the only area of science where such arguments are frequently invoked. Take two topics that I have become involved with. There is scientific consensus that the inadequacy of Einstein’s theory of gravitation to explain the rotation of galaxies and large scale features of the universe is due to “dark matter” and “dark energy”. Even though no one knows what dark matter (or dark energy) is made of, and no one actually detected any dark matter or dark energy ever, the idea is treated as fact. True, dark matter theory can explain a few things and even made a few minor (but nonetheless impressive) predictions, but that doesn’t necessarily make it true, and it certainly doesn’t make the theory the only kid on the block worth considering. Still, try proposing an alternative gravity theory: no matter how firmly rooted in real physics it is, you will be fighting an uphill battle.

Or take the Higgs boson. This hypothetical particle (along with the graviton) is often portrayed as if it had already been detected. It hasn’t. Indeed, the only thing experiments have accomplished to date is that they excluded the possibility that the Higgs boson exists at nearly the two-σ level. There are also significant unresolved issues with the Higgs boson that put the theoretical validity of the idea into question. Yet the “scientific consensus” is that the Higgs boson exists, and if you try to propose a quantum field theory without the Higgs, well, good luck!

Just to be clear about it, I am not saying that the climate skeptics got it right, and for all I know, maybe there is dark matter out there in abundant quantities, along with Higgs bosons behind every corner. But if so, it will be because the theory is supported by facts and by successful predictions, not because this is what the “scientific consensus” says. Otherwise, the theory remains “just a theory”, as the creationist crowd likes to say… neglecting the inconvenient fact that, of course, the theory of evolution is supported by an abundance of facts and successful predictions.

 Posted by at 2:55 am
Apr 04 2009
 

The gravitational theory that I’ve been working on for some time with John Moffat is called STVG, or Scalar-Tensor-Vector Gravity. It grew out of Moffat’s investigation of Nonsymmetric Gravity.

There is also a phenomenological formula called MOND (MOdified Newtonian Dynamics) that effectively flattens out the acceleration curve at large radii from a point source. MOND is nothing more than a formula designed by its creator, Mordechai Milgrom, to solve a specific problem, namely the rotation curves of galaxies. It is not rooted in any theory, and in fact, it is known to contradict some; for instance, it violates the law of conservation of energy and momentum. This is why Jacob Bekenstein endeavored to create a relativistic theory called TeVeS, which fixes MOND’s problems while still giving the approximate MOND acceleration formula in the case of weak fields.

Both STVG and TeVeS are gravity theories, and both happen to incorporate tensor, vector, and scalar fields. Beyond that, however, there’s nothing in common between the two theories.

Unfortunately, many Wikipedians don’t know this, and try from time to time to merge the STVG and TeVeS articles. Hopefully not any longer… I just posted a long, fairly complete description of STVG on Wikipedia.

 Posted by at 11:48 pm
Mar 31 2009
 

Folks working on quantum computers are busy trying to make sure that entangled states remain entangled, because decoherence is death for a quantum computation. But now, Gross et al. showed that too much entanglement may not be a good thing: it can result in quantum computers that offer no improvements in efficiency over conventional computers.

 Posted by at 12:04 am