Chandrayaan-3 landed safely on the Moon, cementing India’s position as a space superpower.
What can I say? Congratulations! Not sure who the gentleman is in this image that I saw on Twitter, but he certainly looks happy.
Russia’s first probe to the Moon in nearly half a century, Luna-25, has crashed.
It would be easy to react to the news with glee, with schadenfreude. We are, after all, talking about a crash that denied a propaganda opportunity to Putin’s Evil Empire (maybe not quite as evil as Stalin’s regime but more evil, to be sure, than the USSR ever was in my lifetime).
But space exploration transcends, and always has transcended, national boundaries. Our petty squabbles look pretty… well, petty from the Moon. Chances are, if human beings with an advanced technical civilization are still around a millennium from now (or if our machine descendants are), they’re far more likely to remember Armstrong’s first step on the Moon than Russia’s attack on Ukraine, the Cold War, or even WW2.
So this crash is sad news, Putin or no Putin. I hope India’s probe, Chandrayaan-3, is more successful. Fingers crossed. Things are looking good for now but it has yet to accomplish a tricky landing.
Another paper of ours, in all its 36-page glory, was accepted by Physical Review D and I am delighted.
One of the things we do in this paper is use symmetric trace-free (STF) tensors, which allow us, among other things, to effect a rotation of spherical harmonic coefficients. This approach likely has many uses as a bit of practical math, beyond modeling gravitational fields.
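To illustrate the basic idea with a minimal sketch of my own (not the paper’s machinery; the basis normalization below is an arbitrary choice for this example): at degree ℓ = 2, the five spherical harmonic coefficients can be packed into a symmetric trace-free 3×3 tensor, which rotates as an ordinary Cartesian tensor, and the rotated coefficients can then simply be read back off.

```python
import numpy as np

def rotate_l2_coeffs(c20, c21, s21, c22, s22, R):
    """Rotate degree-2 spherical harmonic coefficients using an STF tensor.

    Basis convention (unnormalized, for this sketch only):
    f(n) = c20*(2z^2 - x^2 - y^2) + c21*xz + s21*yz
         + c22*(x^2 - y^2) + s22*xy,  with n = (x, y, z) a unit vector.
    """
    # Pack the five coefficients into a symmetric trace-free tensor T,
    # chosen so that f(n) = n . T . n on the unit sphere.
    T = np.array([[-c20 + c22,  s22 / 2,    c21 / 2],
                  [ s22 / 2,   -c20 - c22,  s21 / 2],
                  [ c21 / 2,    s21 / 2,    2 * c20]])
    # Rotating the field is just rotating the Cartesian tensor:
    T = R @ T @ R.T
    # Read the rotated coefficients back off the tensor.
    return (T[2, 2] / 2, 2 * T[0, 2], 2 * T[1, 2],
            (T[0, 0] - T[1, 1]) / 2, 2 * T[0, 1])

# Example: rotate by 30 degrees about the z axis.
a = np.radians(30)
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0,          0,         1]])
print(rotate_l2_coeffs(1.0, 0.2, 0.0, 0.5, 0.0, Rz))
```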
I just asked GPT-4 using my freshly developed UI about Starship.
In the background, GPT-4 correctly utilized a facility to perform Google searches and make use of the resulting snippets (I’m sure I’ll find a way to prevent it from referencing those snippets, which are not visible to the user, in the actual answer). Here is the critical part of the conversation that took place in the background:
User: What happened to Starship this morning?
GPT: SEARCH(Starship event on 2023/04/20)
System: Snippet 1: 3 hours ago … The rocket got off the launchpad before exploding above the Gulf of Mexico. This segment airs on April 20, 2023. Audio will be available after …
Snippet 2: UPCOMING. April 19, 2023. STARLINK MISSION · April 20, 2023. Starship Flight Test.
Snippet 3: 6 hours ago … April 20, 2023 … SpaceX launches the inaugural test flight of the integrated Starship and Super Heavy rocket from the company’s Starbase …
Snippet 4: 5 hours ago … April 20, 2023. Elon Musk has called it the “holy grail” for space technology. Starship, a new SpaceX rocket system that launched for the …
Snippet 5: ATCSCC ADVZY 007 DCC 04/20/2023 OPERATIONS PLAN. MESSAGE: EVENT TIME: 20/0200 – AND LATER … THE SPACEX STARSHIP SUPER HEAVY LAUNCH FROM BOCA CHICA, …
GPT: This morning, SpaceX launched the inaugural test flight of the integrated Starship and Super Heavy rocket from the company’s Starbase. Unfortunately, the rocket got off the launchpad but exploded above the Gulf of Mexico during the flight test (Snippet 1, Snippet 3).
In case you are wondering, yes, it knows the current date, too, thanks to the way my UI sets it up.
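For the curious, the background dispatch amounts to a simple loop: if the model responds with a SEARCH(…) directive instead of an answer, the UI runs the search, feeds the snippets back as a system message, and queries the model again. Here is a minimal sketch, with hypothetical query_llm and google_search helpers standing in for the actual API calls:

```python
import re

SEARCH_RE = re.compile(r"^SEARCH\((.+)\)\s*$")

def chat_with_search(history, query_llm, google_search, max_rounds=3):
    """Answer the user's last message, letting the model request searches.

    `query_llm(history) -> str` and `google_search(query) -> list[str]`
    are hypothetical stand-ins for the actual OpenAI and Google API calls.
    """
    for _ in range(max_rounds):
        reply = query_llm(history)
        m = SEARCH_RE.match(reply.strip())
        if m is None:
            return reply                  # an ordinary answer for the user
        # The model asked for a search: run it and feed back the snippets.
        history.append({"role": "assistant", "content": reply})
        snippets = google_search(m.group(1))
        history.append({"role": "system", "content": "\n".join(
            f"Snippet {i}: {s}" for i, s in enumerate(snippets, start=1))})
    return reply                          # give up after max_rounds searches
```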
And yes, Starship blew up. But it blew up only after the job of the first stage was nearly complete, around the time of stage separation, at an altitude of some 40 km. Which means that despite the spectacular RUD (rapid unscheduled disassembly) event, this launch is still a huge success.
Now if we could only convince Elon to spend more time building rockets and less time honing his skills at corporate value destruction at Twitter, pardon me, X Corp…
Students at the California State University, Northridge, are currently working on a short documentary film about our work on the Solar Gravitational Lens.
The project hopes to attract modest crowdfunding. They also produced a teaser trailer.
Of course I hope they succeed; our SGL work could use some good publicity.
More than 20 years ago, I wrote a program that showed what Mars would look like if it had oceans.
I wrote it originally in C++, but then ported it to Java and turned it into a Web application.
That was a long time ago. Unfortunately, browsers no longer support Java applets for security reasons, so my beautiful applet no longer worked.
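The essential idea is simple enough. Stripped of all the rendering, it amounts to flooding an elevation grid below a chosen sea level, something like this minimal sketch (my bare-bones reconstruction of the basic step, not the original program’s actual code):

```python
import numpy as np

def flood(elevation_m, sea_level_m=0.0):
    """Mark every cell of an elevation grid below `sea_level_m` as ocean.

    `elevation_m` is a 2D array of heights (e.g., MOLA data referenced
    to the areoid); returns an ocean mask and the water depth per cell.
    """
    ocean = elevation_m < sea_level_m
    depth = np.where(ocean, sea_level_m - elevation_m, 0.0)
    return ocean, depth
```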
Putting aside Trumpism, woke-ism, the politics of the day, populism, the whole kaboodle, here’s something to contemplate.
Tonight, Russia is continuing its efforts to subjugate the independent nation of Ukraine, not refraining from committing serious, intentional, criminal acts against the country’s civilian population to further its goals.
Also tonight, the space agency of the United States, NASA, is conducting a ground-breaking experiment, the DART mission: the first test of its kind of a method that might one day avert a global disaster, protecting the entire planet from an asteroid impact.
I think it speaks volumes about the different ways in which these countries see their respective roles in the world.
I almost long for Soviet times. The regime was assuredly brutal, but at least it professed to seek noble goals. Not anymore, I guess.
Oops. It’s past midnight already, so technically it was yesterday, but to me it is still today, September 12.
The sixtieth anniversary of John F. Kennedy’s famous “we choose to go to the Moon” speech. How many more years before another human sets foot on the Moon?
Oh, and it was thirty years ago that Ildiko and I got married.
Yup, that’s us; 1979 vs. 2019.
Doesn’t this cloud, photographed in the skies above Ottawa by my beautiful wife moments ago, look just like the USS Enterprise?
Maybe it is, doing its time-traveling thing, with a malfunctioning cloaking device.
This morning, Google greeted me with a link in its news stream to a Hackaday article on the Solar Gravitational Lens. The link caught my attention right away, as I recognized some of my own simulated, SGL-projected images of an exo-Earth and its reconstruction.
Reading the article I realized that it appeared in response to a brand new video by SciShow, a science-oriented YouTube channel.
Yay! I like nicely done videos presenting our work and this one is fairly good. There are a few minor inaccuracies, but nothing big enough to be even worth mentioning. And it’s very well presented.
I suppose I should offer my thanks to SciShow for choosing to feature our research with such a well-produced effort.
A beautiful study was published the other day, and it received a lot of press coverage, so I get a lot of questions.
This study shows how, in principle, we could reconstruct the image of an exoplanet observed through the Solar Gravitational Lens (SGL) from just a single snapshot of the Einstein ring around the Sun.
The problem is, we cannot. As they say, the devil is in the details.
Here is a general statement about any conventional optical system that does not involve more exotic, nonlinear optics: whatever the system does, ultimately it maps light from picture elements, pixels, in the source plane, into pixels in the image plane.
Let me explain what this means in principle, through an extreme example. Suppose someone tells you that there is a distant planet in another galaxy, and you are allowed to ignore any contaminating sources of light. You are allowed to forget about the particle nature of light. You are allowed to forget the physical limitations of your cell phone’s camera, such as its CMOS sensor’s dynamic range or readout noise. You hold up your cell phone and take a snapshot. It doesn’t even matter if the camera is not well focused or if there is motion blur, so long as you have precise knowledge of how it is focused and how it moves. The map is still a linear map.

So if your cell phone camera has 40 megapixels, a simple mathematical operation, inverting the so-called convolution matrix, lets you reconstruct the source in all its exquisite detail. All you need to know is a precise mathematical description, the so-called “point spread function” (PSF) of the camera (including any defocusing and motion blur). Beyond that, it just amounts to inverting a matrix or, equivalently, solving a linear system of equations. In other words, standard fare for anyone studying numerical computational methods, and easily solvable even at extremely high resolutions using appropriate computational resources. (A high-end GPU in your desktop computer is ideal for such calculations.)
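Here is a toy demonstration of this point. With circular convolution, the linear map is diagonal in the Fourier domain, so inverting it is a one-liner; absent noise, the recovery is exact to machine precision:

```python
import numpy as np

rng = np.random.default_rng(42)

# A 64x64 "scene" and a point spread function: a crude 9-pixel motion blur.
scene = rng.random((64, 64))
psf = np.zeros((64, 64))
psf[0, :9] = 1.0 / 9.0

# Imaging as a linear map: circular convolution with the PSF.
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf)))

# With no noise and an exactly known PSF, the map is invertible:
# deconvolution is just division in the Fourier domain.
recovered = np.real(np.fft.ifft2(np.fft.fft2(blurred) / np.fft.fft2(psf)))

print(np.max(np.abs(recovered - scene)))  # tiny (~1e-13): exact up to roundoff
```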
Why can’t we do this in practice? Why do we worry about things like the diffraction limit of our camera or telescope?
The answer, ultimately, is noise. The random, unpredictable, or unmodelable element.
Noise comes from many sources. It can include so-called quantization noise, because our camera sensor digitizes the light intensity using a finite number of bits. It can include systematic noise arising for many reasons, such as unevenly calibrated sensor pixels or even approximations used in the mathematical description of the PSF. And it includes unavoidable, random, “stochastic” noise that arises because light arrives as discrete packets of energy in the form of photons, not as a continuous wave.
When we invert the convolution matrix in the presence of all these noise sources, the noise gets amplified far more than the signal. In the end, the reconstructed, “deconvolved” image becomes useless unless we had an exceptionally high signal-to-noise ratio, or SNR, to begin with.
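Continuing the toy example above: add even a tiny amount of noise before deconvolving, and the inversion amplifies it mercilessly:

```python
# Add faint noise to the blurred image, then deconvolve exactly as before.
noisy = blurred + rng.normal(scale=1e-3, size=blurred.shape)
wrecked = np.real(np.fft.ifft2(np.fft.fft2(noisy) / np.fft.fft2(psf)))

# The error in the recovery is now roughly twenty times the noise we added;
# with a broader PSF or a fainter signal, it swamps the image entirely.
print(np.std(wrecked - scene))
```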
The authors of this beautiful study knew this. They even state it in their paper. They mention values such as 4,000, even 200,000 for the SNR.
And then there is reality. The Einstein ring does not appear in black, empty space. It appears on top of the bright solar corona. And even if we subtract the corona, we cannot eliminate the stochastic shot noise due to photons from the corona by any means other than collecting data for a longer time.
Let me show a plot from a paper that is work-in-progress, with the actual SNR that we can expect on pixels in a cross-sectional view of the Einstein ring that appears around the Sun:
Just look at the vertical axis. See those values there? That’s our realistic SNR when the Einstein ring is imaged through the solar corona, using a 1-meter telescope with a 10-meter focal length and an image sensor with a pixel size of one square micron. These choices are consistent with just a tad under 5,000 pixels falling within the usable area of the Einstein ring, which can be used to reconstruct, in principle, a roughly 64 by 64 pixel image of the source. As this plot shows, a typical value for the SNR would be 0.01 using 1 second of light collecting time (integration time).
What does that mean? Well, for starters it means that even if everything else is absolutely, flawlessly perfect (no motion blur, indeed no motion at all, no sources of contamination other than the solar corona, no quantization noise, no limitations on the sensor), collecting enough light to reach an SNR of 4,000 would require roughly 160 billion seconds of integration time. That is roughly 5,000 years.
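The arithmetic is straightforward, since shot-noise-limited SNR grows with the square root of the integration time:

```python
snr_now, t_now = 0.01, 1.0           # SNR of 0.01 after 1 second
snr_target = 4000.0

# Shot-noise-limited SNR grows as sqrt(t), so t scales as the SNR ratio squared.
t_needed = t_now * (snr_target / snr_now) ** 2
print(f"{t_needed:.3g} s = {t_needed / 3.156e7:.0f} years")  # 1.6e+11 s = 5070 years
```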
And that is why we are not seriously contemplating image reconstruction from a single snapshot of the Einstein ring.
Our work on the Solar Gravitational Lens, complete with some diagrams created by yours truly, has now made it onto TikTok.
(Embedded TikTok video by @toknerdytome: “Solar Gravitational Lens.”)
How about that.
Although we are not religious, we celebrate Christmas.
And I still cannot think of a better way to celebrate Christmas than with the words of the astronauts of Apollo 8, and the sense of awe they felt when they became the first human beings ever in the history of our species to be completely cut off from Mother Earth, when their spaceship disappeared behind the Moon.
Re-emerging, they read passages from the Book of Genesis to their audience, with Frank Borman concluding with the words:
[G]ood night, good luck, a Merry Christmas – and God bless all of you, all of you on the good Earth.
To me, this is the most beautiful Christmas message ever.
Thanks to streaming services, I occasionally stumble upon films and television series from foreign lands that I’d otherwise not even know about. And no, I don’t mean Squid Game, that explosively popular Korean series: so far, I have only watched the opening few minutes of the first episode, and I don’t yet know if it is my cup of tea. Rather, this time around it is a Russian movie that I came across on Amazon Prime: a 2017 film titled Salyut-7.
Salyut-7 was a Soviet space station. In 1985, the station went dead, without power. The Soviets launched a daring rescue mission, Soyuz T-13, which not only managed to dock with the derelict station but also revived and repaired it.
Consistent with Soviet era secrecy, we knew very little about this mission and didn’t appreciate its significance back then.
The movie itself combined the actual story of the Soyuz T-13 mission with other events, such as the fire on board the Mir space station 12 years later and a nonsensical fictitious mission by the space shuttle Challenger to “steal” the station, all for dramatic effect. In that, I think they did a disservice to the cosmonauts who pulled off this repair: what they accomplished may have been less spectacular in terms of visual effects, but it was no less significant.
But otherwise, I found the movie fun to watch, very well done, with top notch special effects and (insofar as my inexpert eye can tell) excellent acting and directing. I enjoyed the movie. And its faults notwithstanding, I think it offers a worthy reminder that the USSR’s space program brought enormous value to all of humanity. It saddens me deeply when I think of how much of it went to waste in the turbulent years following the breakup of the USSR.
I’ll be giving a virtual talk tonight, courtesy of a Texas astronomy club:
I hope it will be well received.
This morning, a drone took flight. It successfully took off from the ground, hovered for a few seconds, and then landed safely.
What, you ask? How is this supposed to be a big deal? There are millions of drones out there, kids playing with them and whatnot.
Oh, but this drone is special, and not only because it carries a small piece of fabric from the Wright brothers’ very first airplane.
It is special because it flew on Mars.
The next in our series of papers describing the extended gravitational lens (extended in the sense that we no longer treat the lensing object as a gravitational monopole) is now out on arXiv.
Here’s one of my favorite images from the paper, which superimposes the boundary of the quadrupole caustic (an astroid curve) onto a 3D plot showing the amplitude of the gravitational lens’s point-spread function.
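Incidentally, the astroid itself is a simple curve to draw. Here is a quick sketch (the scale is arbitrary below; in the paper, it is set by the quadrupole moment of the lens):

```python
import numpy as np
import matplotlib.pyplot as plt

# An astroid of scale a: x = a cos^3(t), y = a sin^3(t),
# equivalently x^(2/3) + y^(2/3) = a^(2/3).
t = np.linspace(0.0, 2.0 * np.pi, 400)
a = 1.0
plt.plot(a * np.cos(t) ** 3, a * np.sin(t) ** 3)
plt.gca().set_aspect("equal")
plt.title("Astroid: the quadrupole caustic boundary")
plt.show()
```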
I had lots of fun working on this paper, though it was, needless to say, a lot of work.
I really cannot tell which impresses me more: The incredibly complex landing or the fact that there is now a de facto infrastructure in orbit around Mars, in the form of earlier spacecraft that provide communications relay capabilities for real-time tracking of the landing.
Or perhaps the fact that Perseverance also carries the Ingenuity helicopter. If successful, it will be the first drone to fly in the atmosphere of another planet.
The children of future settlers on Mars will be learning about these moments in school.
I was recently interviewed by a Hungarian podcaster, mostly about my participation in the early days of game development in Hungary, but also about my more recent work, including my scientific contributions.
I just listened to the interview and thankfully, I didn’t say anything colossally stupid.
Tonight, Slava Turyshev sent me a link to an article that was actually published three months ago on medium.com but had until now escaped our attention.
It is a very nice summary of the work that we have been doing on the Solar Gravitational Lens to date.
It really captures the essence of our work and the challenges that we have been looking at.
And there is so much more to do! Countless more things to tackle: image reconstruction of a moving target, imperfections of the solar gravitational field, precision of navigation… not to mention the simple, basic challenge of attempting a deep space mission to a distance four times greater than anything to date, lasting several decades.
Yes, it can be done. No, it’s not easy. But it’s a worthy challenge.