Our monster Living Review about the Pioneer anomaly is now officially published. There were times when I thought we’d never get to this point.
No, I didn’t smoke anything unhealthy. The “cloud”, in this case, does not refer to a state of mind nor, for that matter, to structures formed by condensed water vapor in the atmosphere. I am talking about the computing “cloud”, the idea that you are using the Internet to access computing resources, the physical location of which is irrelevant.
This past weekend, I decided to set up a virtual server in the “cloud”. I am amazed how cheaply it can be done nowadays. And one day, it may help me migrate away from a home-office-based server to one that I no longer have to maintain myself. That’s the long-term plan anyway. For now, I am taking the first tentative steps as I explore my brand new server and test its robustness and reliability.
The founder of Wikileaks has been charged with rape in Sweden. As of this morning, his whereabouts are unknown.
Are these charges true? Is Assange a rapist? Perhaps. He is certainly a weird fellow, and for all I know, he’s not necessarily weird purely in a good sense.
But… are these charges true? He pissed off a lot of people, and not just people, but some of the most powerful institutions in the world, including the US and other governments, corporations, and even shady entities like the Church of Scientology. Just how far are governments (and non-governments) willing to go to get rid of him? Are they capable of theatrical dirty tricks? At one time I would have said no. But that was at a time when I could not have imagined that a modern-day government would poison a former agent on foreign soil, using an exotic radioactive substance. At that time, I could not have imagined that a modern-day democratic government would engage in a systematic campaign of lies and deception to justify an unjust war of aggression. Compared to such things, a trumped-up charge against a (to them, very) annoying individual is nothing. Perhaps he should be grateful that he’s still alive and he’s not setting off any Geiger-counters nearby.
Update: And now, a few hours after I wrote the paragraphs above, here’s breaking news from CNN: “WikiLeaks founder Julian Assange ‘no longer wanted’ and not a rape suspect, Swedish prosecutor says on website”. Sooo… What was this all about?
Speaking of blogs and people who yell back… sometimes, those people actually yell for money. It seems that a new industry is about to be born: the industry of copyright trolls. Be careful what you quote in your blogs, especially in the US.
I went for a walk yesterday morning and it gave me time to think. About this here blog of mine. Notably, about the fact that this is my first new entry in ten days, probably a record since I began this habit some eight years ago.
Of course eight years ago, I was not using blogging software. I was originally just adding content to a static HTML page. Eventually, I wrote some home-brew server-side code that allowed users to access a specific day. Which is how the software organized entries, by day that is. So I felt compelled to put something in every day, even if it was nothing more than just the comment, “another boring day”. (Not boring to me mind you, but to people reading my entries.) But then, two years ago I decided to join others in the 21st century and set up WordPress. (It was perhaps around this time that my resistance finally broke down and I began to accept the word “blog” as part of my vocabulary.) One side effect of this change was that I no longer added a new entry every day… but then, there were days when I added more than one. Even so, I blogged less. Was it because of the change in software?
Or perhaps I just have less to say? How many original (or, well, not too unoriginal) thoughts can be stored in an average human brain? How soon before we start repeating ourselves, griping about the same issues over and over again? Perhaps I am blogging less because I already said everything I needed to say?
Or maybe it’s something else altogether. Maybe it’s not the blogging software per se, but the fact that it allowed me to configure my Facebook account to pick up my blog entries and post them there. Suddenly, people actually responded to what I had to say. They actually commented. What on Earth?
You see, blogs (and I mean real, personal blogs, not news media outlets that call themselves blogs) are the ultimate write-only media. You write about things that matter to you, not about things that matter to others. You yell at the world, not expecting the world to yell (or, for that matter, whisper) back.
So perhaps I just became shy because suddenly the world talked back. Suddenly, I had to pay attention to what I wrote because there was a reaction. Usually a friendly one, but even so… I had to explain my thoughts. Heaven forbid, I sometimes had to revise them because somebody convinced me that I was mistaken. When you yell at the world, you’re not expecting the world to explain to you why you are wrong.
Maybe I’ll just establish a secret blog site. One that is not linked to Facebook or anything else, the URL of which only I know. (Who needs pesky readers?) Then, I’ll happily yell at the world again, secure in the knowledge that nobody pays any attention whatsoever…
Sixty-five years ago Nagasaki was destroyed by nuclear flame. The beginning of the nuclear era, we sometimes say. But perhaps there is a more hopeful way of looking at it: whereas Hiroshima was the first time a nuclear weapon was exploded in anger, Nagasaki was the last. So perhaps Teller was right after all, and nuclear weapons remain the ultimate peacemaker. Here’s to hoping.
In the meantime, here’s a rather relevant clip from YouTube, showing all nuclear explosions to date on a map:
Well, almost 23,000. Mostly JavaScript and PHP, also HTML and SQL. This is a project I’ve been working on all summer. Lots more to do, but at least one deliverable is complete, and I can finally spend a little bit of time doing something else. Plenty of other things I’ve been putting off while I was on this programming binge.
It’s not the first time I’ve said this, but you just gotta love this Internet thing. The big news this morning of course is the leak of some 90,000 classified US military documents from Afghanistan. Guardians of state and military secrets are horrified: troops’ lives will be at risk, they say. What they should recognize is that the fact that we live in an open society, far from being a weakness, is really our greatest strength. Open discussion of the pros and cons, the successes and failures, the risks and possible outcomes of a war is part of living in a liberal democracy.
As to the release itself, it’s funny how times are changing. When I learned the database language SQL ages ago, it was because I made my living as a computer professional. I did not necessarily expect to use my SQL skills in scientific endeavors, but that, too, came to pass when I began using the wonderfully crafted SQL-based query interface of the Sloan Digital Sky Survey. What I certainly never expected is that one day, a journalistic leak would arrive in a variety of formats, perhaps the most useful of which is an SQL dump. I wonder: do they teach the building of SELECT queries in journalism school these days?
I think it is fair to say that Canada’s Industry Minister, Tony Clement, is not my friend these days. Quite the contrary, I can hardly wait for the day when he becomes Mr. Tony Clement, private citizen, along with the rest of his colleagues in government. That is because Industry Minister Clement is the minister behind our government’s latest attempt to pervert Canadian copyright law in favor of the likes of the Disney Corporation.
However, the news this morning strongly reminded me of US Vice President Joe Biden’s oft-retold life lesson: when it comes to politicians with whom you disagree, it’s their judgment, not their character, that you should question. Last night, Clement risked life and limb as he jumped into a river to help save a drowning woman. He may not be a very good Minister of Industry, but his heart seems to be in the right place.
I have been reading the celebrated biography of Albert Einstein by Walter Isaacson, and in it, the chapter about Einstein’s beliefs and faith. In particular, the question of free will.
In Einstein’s deterministic universe, according to Isaacson, there is no room for free will. In contrast, physicists who accepted quantum mechanics as a fundamental description of nature could point at quantum uncertainty as proof that non-deterministic systems exist and thus free will is possible.
I boldly disagree with both views.
First, I look out my window at a nearby intersection where there is a set of traffic lights. This set is a deterministic machine. To determine its state, the machine responds to inputs such as the reading of an internal clock, the presence of a car in a left-turning lane, or the pressing of a button by a pedestrian who wishes to cross the street. Now suppose I incorporate into the system a truly random element, such as a relay that closes depending on whether an atomic decay process takes place or not. So now the light set is not deterministic anymore: sometimes it provides a green light allowing a vehicle to turn left, sometimes not; sometimes it responds to a pedestrian pressing the crossing button, sometimes not. So… does this mean that my set of traffic lights suddenly acquired free will? Of course not. A pair of dice does not have free will either.
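To make the point concrete, here is a minimal sketch of the thought experiment (my own illustration, not code from any actual traffic controller; all the names are hypothetical). A deterministic controller is just a mapping from inputs to a state; bolting on a random relay merely adds one more input to that mapping.

```javascript
// A deterministic controller: the next state is a pure function of the inputs.
function nextState(clockTick, carInLeftLane, pedestrianButton) {
  if (pedestrianButton) return "WALK";
  if (carInLeftLane && clockTick % 4 === 0) return "LEFT_ARROW";
  return "GREEN";
}

// The "atomic relay" variant: a random bit is wired in, so sometimes the
// left-turn request is ignored. The system is no longer deterministic, but
// it is still just a mapping from inputs (now including a coin flip) to a
// state. No decision-making capability has been added.
function nextStateWithRelay(clockTick, carInLeftLane, pedestrianButton, relayClosed) {
  if (carInLeftLane && !relayClosed) return "GREEN"; // request randomly denied
  return nextState(clockTick, carInLeftLane, pedestrianButton);
}
```

Feed the relay from a genuinely random source and the outputs become unpredictable, but the structure of the machine is unchanged.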
On the other hand, suppose I build a machine with true artificial intelligence. It has not happened yet but I have no doubt that it is going to happen. Such a machine would acquire information about its environment (i.e., “learn”) while it executes its core program (its “instincts”) to perform its intended function. Often, its decisions would be quite unpredictable, but not because of any quantum randomness. They are unpredictable because even if you knew the machine’s initial state in full detail, you’d need another machine even more complex than this one to model it and accurately predict its behavior. Furthermore, the machine’s decisions will be influenced by many things, possibly involving an attempt to comply with accepted norms of behavior (i.e., “ethics”) if it helps the machine accomplish the goals of its core programming. Does this machine have free will? I’d argue that it does, at least insofar as the term has any meaning.
And that, of course, is the problem. We all think we know what “free will” means, but is that true? Can we actually define a “decision making system with free will”? Perhaps not. Think about an operational definition: given an internal state I and external inputs E, a free will machine will make decision D. Of course the moment you have this operational definition, the machine ceases to have what we usually think of as free will, its behavior being entirely deterministic. And no, a random number generator does not help in this case either. It may change the operational definition to something like, given internal state I and external inputs E, the machine will make decision Di with probability Pi, the sum of all Pi-s being 1. But it cannot be this randomization of decisions that bestows a machine with free will; otherwise, our traffic lights here at the corner could have free will, too.
So perhaps the question about free will fails for the simple reason that free will is an ill-defined and possibly self-contradictory concept. Perhaps it’s just another grammatically correct phrase that has no more actual meaning than, say, “true falsehood” or “a number that is odd and even” or “the fourth side of a triangle”.
It’s been 41 years since Armstrong’s first “one small step” on the surface of the Moon.
Year after year, I express my hope that it won’t take another, well, 41 years before the next step is taken.
Some think that this video is in bad taste:
I disagree. If it were done by anyone other than a Holocaust survivor, it would be in bad taste. But a Holocaust survivor has EVERY right to dance in Auschwitz and be happy with his family. This is his best (and only) revenge. (Sadly, lawyers seem to be having their revenge, too, as this video was apparently taken down previously by YouTube for alleged copyright violation. Yet another painful demonstration of just how badly broken our system of copyright really is.)
America’s top fascist (well, if he isn’t, he is certainly a contender for the title) may be going to jail after all. You can only go so far abusing your power in a democracy.
(Note to self: just in case he survives this round, perhaps it’s a smart idea not to drive through Maricopa County, Arizona anytime soon? Who knows what blogs Arpaio and his minions read…)
I hate dogma. I hate it even more when a valid scientific observation becomes dogma.
One case concerns the infamous goto statement in programming languages. It is true that a programming language does not need a goto statement in order to be universal. Unfortunately, this led some, most notably among them the late Edsger Dijkstra, to conclude that goto is actually harmful. While it is true that goto can be misused, and that misusing the constructs of a programming language can lead to bad code, I don’t think goto is unique in this regard (it is certainly no more harmful than pointers, global variables, or the side effects of passing variables by reference, just to name a few examples). Nonetheless, with Dijkstra’s letter on record, the making of a dogma was well under way.
And here I am, some 40 years later, trying to write a simple piece of code the logic of which flows like this:
LET X = A
LABEL:
Do something using X
IF some condition is not satisfied THEN LET X = B and GOTO LABEL
The condition, in particular, is always satisfied when X = B.
Yes, I know how to rewrite the above code using a loop construct, to satisfy structured programming purists. But why should I have to, when the most natural way to express this particular algorithm is through the use of a conditional jump, not a loop? Oh wait… it’s because someone who actually believes in dogma prevailed when JavaScript was designed, and therefore, goto never made it into the language.
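For what it’s worth, here is one way the pattern above can be expressed as a loop in JavaScript (a sketch only; A, B, doSomething and conditionSatisfied are placeholder names for whatever the real algorithm uses):

```javascript
// Placeholder names throughout. The body runs first with x = A; if the
// condition fails, it runs again with x = B. Since the condition is, by
// assumption, always satisfied when x = B, the loop terminates after at
// most two passes.
function process(A, B, doSomething, conditionSatisfied) {
  let result;
  for (let x = A; ; x = B) {
    result = doSomething(x);
    if (conditionSatisfied(result)) break;
  }
  return result;
}
```

This is functionally equivalent to the goto version; whether it expresses the intent as directly is, of course, exactly the point in dispute.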
By the way, and before I forget: Happy 143rd birthday, Canada! It was nice to have the Queen here, celebrating with us. As she said, she’s been around for more than half the lifespan of this young country.
I do wonder though: how many of those who think that getting drunk and throwing firecrackers on this day is the right thing to do actually know what we are celebrating?
In recent years, one of the most welcome features in Web browsers was tabbed browsing. Implementing, in effect, what Microsoft calls MDI (Multiple Document Interface), it helped reduce screen clutter while viewing multiple Web pages.
Meanwhile, Microsoft made a (valid) observation that MDI is application-centric, and for new users, not intuitive; the idea is that each document should live in its own window, regardless of which application is used to render or edit it.
Fortunately, even Microsoft were wise enough to recognize that for many (especially professional) users, who keep multiple documents open, SDI (Single Document Interface) is not always the best choice. On the contrary, MDI allows one to work significantly more efficiently when keeping a large number of documents open. Therefore, Office 2010 continues to support MDI mode (thankfully), although not nearly as elegantly as a Web browser, with visible tabs.
Enter Adobe. They also chose to follow in Microsoft’s footsteps. Unfortunately, in their infinite wisdom they also chose to take a step further: they not only made SDI mode the preferred mode, they removed MDI mode altogether. They offered the lamest excuses: 1. that’s not how it’s done on the Mac, 2. MDI mode was already considered deprecated in version 8, 3. Microsoft told us to do it, 4. it’s more work, and 5. it’s more costly to test.
Thankfully, Acrobat’s product manager came to his senses after receiving overwhelmingly negative feedback. He closed the discussion by referring to the proverbial dead horse.
Except that news of the horse’s death might have been slightly exaggerated. The discussion in question took place in October 2008. I cannot help but notice that my calendar says July 2010, yet Acrobat 9 is still the latest version, still sorely missing an MDI mode.
Here’s a headline from Google News that illustrates just how difficult it is for a non-native speaker of English (or, for that matter, for many a native speaker!) to understand journalists:
ABC Online: Houston expects changes for diggers under Petraeus
OK, so if you watch the news at all, you’d know that Petraeus is the US general who’ll be taking over in Afghanistan. But unless you also know that “ABC” can refer to the Australian Broadcasting Corporation, that Angus Houston is Australia’s Chief of the Defense Force, and that “digger” is a slang term for Australian or New Zealand soldiers, you could be excused if you thought that this was not an article title but a cryptic crossword entry.
Can both climate alarmists and climate deniers be right (or wrong) at the same time? Perhaps so. At least that’s my understanding after reading about a new study that was designed to evaluate the judgment of climate experts.
The way I see it, yes, there is consensus that the planet is warming. Yes, there is consensus that human activity contributes to the warming. Yes, there is consensus that the warming can have disastrous consequences.
However, there is no consensus regarding the magnitude of future warming. There is no consensus regarding the extent to which human activity vs. natural causes are responsible for the warming. And I don’t think a consensus exists that the consequences of the warming are uniformly bad for humanity, or even that the bad consequences outweigh the potentially good ones.
In any case, consensus is irrelevant. Science is not supposed to be a democracy of scientists, but a tyranny of facts. What makes a scientific theory right is not consensus but logical consistency and good agreement with observation.
Scientists are, however, responsible for communicating not only what they know but also what they don’t understand (this, I guess, is what defines the line between a climate change advocate and a climate change alarmist). Conversely, scientists are supposed to be able to express their doubts without questioning or withholding facts (this, perhaps, is what distinguishes a climate change skeptic from a climate change denier).
Unfortunately, when the debate becomes political, such nuances are often lost or ignored. Politics, especially populist politics, abhors uncertainties and prefers to paint everything in black and white. If uncertainties are mentioned at all, they are merely used as “proof” that the other side is wrong, therefore our side must be right, with no room in the middle. You either believe Al Gore’s Inconvenient Truth as gospel, or you accuse Al Gore of being a fraud artist out to get rich on phony carbon credits.