On the evening of the Fourth of July, I was cringing every few seconds as volleys of illegal fireworks shot into the air a few houses over on my block. I was outside, poking halfway out my backyard garage with the telescope, looking at the moon to pass the time until Saturn rose over the treetops.
Conditions didn’t allow me any good Saturn photos, but the moon turned out to be a rewarding enough target. I took a minute and a half of video and fifty-eight photos. It probably seems silly, but I’ve wanted to stack photos of the moon for a long while. The moon is easy enough to see in plenty of detail, but it’s difficult to show it in a photo as a vivid, three-dimensional object, the way it looks through a telescope. So much gets lost in the translation from eye to sensor, and much of that experience gets swallowed into the seeing disc, maddeningly blurred at the moment of capture.
For comparison, here’s an individual photo of the moon that’s been converted from RAW and cropped but otherwise untouched (ISO 400, f/6, shutter speed 1/800 s). You may have to click through to the larger version to see the difference I mean. The same details from the image above are there, but they’re indistinct. In particular, look at the edge of the basalt plain along the top limb, where the terminator crosses it, or at the craters along the lower part of the terminator. I look at that and think, oh, yep, that’s the moon, no news there.
Last night, I tried stacking the frames of the video to get more detail, but the results were only so-so. Dissatisfied, and expecting more, I kept pushing the image until distortions appeared in some of its higher-contrast parts. I tried all kinds of filters to get what I wanted (deconvolution of several sorts, wavelet sharpening, unsharp masking, custom convolution kernels, various contrast and denoising adjustments), but I only made things worse.
I’m not sure why stacking from a video gave me a poorer result. The same problem probably limits my planetary photos as well, so it’s worth figuring out. It might be some aspect of the sensor; it might be that I’m using more photos in the final stack than I need; or maybe I didn’t align the frames properly.
In any case, I had taken all the RAW photos as a backup, so today I turned back to those and stacked them. All the photos were taken with the same settings: ISO 400, f/6, shutter speed 1/800 s.
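As an aside on why stacking helps at all: averaging many aligned exposures of the same static scene leaves the signal alone while shrinking the random per-frame noise by roughly the square root of the frame count. The sketch below is a toy model of a single pixel, not anything AutoStakkert!3 actually does (real stackers align and often quality-weight frames); the numbers are made up for illustration.

```python
import random
import statistics

# Toy model: one pixel's "true" brightness, observed through per-frame
# sensor noise, then averaged across a stack of frames.
random.seed(42)

TRUE_VALUE = 100.0   # hypothetical true brightness of one pixel
NOISE_SIGMA = 10.0   # hypothetical per-frame noise (standard deviation)
N_FRAMES = 47        # same count as the stack described below

# Simulate the same pixel across N_FRAMES noisy exposures.
frames = [TRUE_VALUE + random.gauss(0, NOISE_SIGMA) for _ in range(N_FRAMES)]

single_frame_error = abs(frames[0] - TRUE_VALUE)
stacked = statistics.fmean(frames)          # stacking = averaging here
stacked_error = abs(stacked - TRUE_VALUE)

print(f"single-frame error: {single_frame_error:.2f}")
print(f"stacked error:      {stacked_error:.2f}")
```

With forty-seven frames, the residual noise should be roughly 10/√47 ≈ 1.5, which is why a stack can survive far more sharpening than any single frame.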
I’ve discussed this process before, but to run through it again:
- I converted all the RAW images to TIFF;
- I used AutoStakkert!3 (a beta version of the program) to load them up as individual frames, then stacked all forty-seven of them; and
- I loaded the resulting TIFF from that stack in AstraImage and, after much experimentation,
- first applied as much wavelet sharpening as I could before distortions became apparent, and
- then applied a very small amount of unsharp mask.
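The unsharp-mask step in the list above amounts to a simple formula: sharpened = original + amount × (original − blurred). Here is a minimal one-dimensional sketch of that idea, using a single row of made-up pixel values and a crude box blur; AstraImage’s actual implementation is surely more sophisticated.

```python
def box_blur(signal, radius=1):
    """Simple moving-average blur with edge clamping."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - radius)
        hi = min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=0.5, radius=1):
    """sharpened = original + amount * (original - blurred)"""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A soft edge: hypothetical values ramping from dark (10) to bright (90).
row = [10, 10, 10, 30, 50, 70, 90, 90, 90]
sharpened = unsharp_mask(row, amount=1.0)
print(sharpened)
```

Notice that the sharpened row overshoots on both sides of the edge (the dark side dips below 10, the bright side rises above 90). Pushed too hard, that overshoot is exactly the kind of halo distortion that shows up around high-contrast features, which is why only a very small amount of unsharp mask went into the final image.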
I’ve experimented a little with the stacking itself, changing parameters here and there to see how the result changes, but mostly I’ve trusted that it’s doing its job and concentrated on seeing how much I can get out of AstraImage, since that’s quicker. I load the stacked TIFF, make a change, and save a version. After each change, I save, and when I’ve gone too far down a path, I back up to a version I like and start down a new one. With all the versions in the same directory, I can open them at once and flip between them quickly, as if using a blink comparator, to see which changes helped and which hurt.
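The versioned-save habit described above is easy to sketch: each tweak gets written out under the next free name in one directory, so every version can later be opened side by side and blinked through. The stem and extension here are made up for illustration; this is a toy of the workflow, not a tool I actually use.

```python
import tempfile
from pathlib import Path

def next_version_path(directory, stem="moon_stack", ext=".tif"):
    """Return the first unused path like moon_stack_v001.tif."""
    n = 1
    while (directory / f"{stem}_v{n:03d}{ext}").exists():
        n += 1
    return directory / f"{stem}_v{n:03d}{ext}"

# Simulate saving three successive versions into one directory.
with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    for _ in range(3):
        next_version_path(d).touch()
    names = sorted(p.name for p in d.iterdir())

print(names)
```

Because the names sort in the order they were made, stepping through them in an image viewer replays the whole editing history, which is what makes the blink comparison work.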
When all that was done, I took the photo over to Apple Photos to tweak the colors, levels, and contrast a bit, and to share it.