"Pay attention to the world." -- Susan Sontag
 

New Year’s Day 2026 (White Mums, Manipulated)

From “A Way to a Happy New Year” by Robert Brewster Beattie in Poems for Special Days and Occasions, compiled by Thomas Curtis Clark:

To leave the old with a burst of song,
To recall the right and forgive the wrong;
To forget the thing that binds you fast
To the vain regrets of the year that’s past;
To have the strength to let go your hold
Of the not worth while of the days grown old,
To dare go forth with a purpose true,
To the unknown task of the year that’s new;
To help your brother along the road
To do his work and lift his load;
To add your gift to the world’s good cheer,
Is to have and to give a Happy New Year.

From “The Snowman’s Resolution” by Aileen Fisher in More Poetry for the Holidays, selected by Nancy Larrick:

The snowman’s hat was crooked
and his nose was out of place
and several of his whiskers
had fallen from his face,

But the snowman didn’t notice
for he was trying to think
of a New Year’s resolution
that wouldn’t melt or shrink.

He thought and planned and pondered
with his little snowball head
till his eyes began to glisten
and his toes began to spread;

At last he said, “I’ve got it!
I’ll make a firm resolve
that no matter WHAT the weather
my smile will not dissolve.”

Now the snowman acted wisely
and his resolution won,
for his splinter smile was WOODEN
and it didn’t mind the sun.


Hello!

While I’ve been taking photographs regularly at Oakland Cemetery for about six years now, it wasn’t until 2022 that I discovered the large volume of late-blooming asters and mums featured throughout the property. Most of them first flower from late October through late November, include a wide variety of species and colors, and persist into early December if we don’t have too much cold weather. They tend to fill the gap between the emergence of fall color among trees and shrubs in autumn and those plants (mostly flowering shrubs and trees like quince and plums) that can produce blooms as early as January or February. Because these asters and mums put on their best shows just before the holidays (and just before I start my annual Christmas project posts), I tend to accumulate several hundred photos that I don’t work on until the holiday project, and the holidays themselves, are in the rearview mirror.

So this is the first post (of a presently unknown number) of some of those asters and mums, and I picked the white ones to share today since I’ve gotten in the habit of associating white flowers with New Year’s Day. Subsequent posts will bring in rainbows of hues, including some rather amazing mums where single stems of individual plants produce three to five flowers, each one a different color.

While members of these plant families are highly resistant to cold weather (which is of course what makes them so suitable for late autumn and early winter growth spurts), we’d had several days below freezing right around the time many of the plants were starting to bloom. You’ll see the effects of those freezing temperatures in three ways: some of the flowers formed non-traditional shapes that reminded me of what coastline trees look like after being blown by wind off the water for decades or centuries; some of the blossoms (especially at the tips of their petals) had their colors shift from white to light pink or light red; and many of the stems and leaves froze to the point where they produced swatches of yellow or turned completely brown. The leaf color change is similar to what happens to many plants as autumn approaches, when a plant’s ability to produce chlorophyll (and stay green) is reduced by the cold and its leaves eventually desiccate, detach, and fall to the ground. The shape-shifting is a physical response: as freezing reduces the mobility of water inside plant tissues, cells contract and the flower’s structure partially collapses. And the color shift is a reminder that few flowers are actually pure white; instead, they suppress the appearance of alternate pigments while at their optimal blooming stages.

All this means that this year I accumulated hundreds of aster and mum photos that are not entirely photogenic. Naturally, I was aware of that when I took the photos (how could I not be?), but took lots of them anyway, in part because I wanted to see what I could create from them once I had time to spend editing in Lightroom. While I’ve long been accustomed to using Lightroom’s healing tools to remove spots, pollen, bits of debris, or unfocused photobombing bugs from my images, the kind of repair and reconstruction needed for heavily damaged plants is beyond the capabilities of those tools.

Such reconstruction is not, however, beyond the capabilities of Lightroom’s Generative AI Remove tool, which was added to the software in mid-2024 and which I’ve been experimenting with ever since. Simply put, the tool lets you select parts of an image that you want to replace, then fetches three potential replacements to pick from so you can properly match colors and textures. It lets me think of an image’s creative reconstruction like this: what might have been in the photo if the damaged flower, broken leaf, or dead stem hadn’t been there?
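
Adobe hasn’t published how Generative Remove works internally, but the select-then-regenerate workflow it exposes looks a lot like a technique called mask-based inpainting, which you can experiment with using open-source models. Here’s a minimal sketch built on the Hugging Face diffusers library; the file names are placeholders I made up, and the empty prompt mirrors the way Lightroom’s tool operates without any text input:

```python
# Mask-based generative inpainting: regenerate only the masked region,
# using the unmasked pixels as context. An illustration of the general
# technique, not Adobe's actual implementation.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# Placeholder files: the photo, and a mask where white marks the
# damaged leaf to replace and black marks pixels to preserve.
photo = Image.open("mums.jpg").convert("RGB").resize((512, 512))
mask = Image.open("damaged_leaf_mask.png").convert("L").resize((512, 512))

# Generate three candidates, echoing the three options Lightroom offers.
result = pipe(
    prompt="",  # no text guidance, like Lightroom's promptless Remove
    image=photo,
    mask_image=mask,
    num_images_per_prompt=3,
)
for i, candidate in enumerate(result.images):
    candidate.save(f"candidate_{i}.png")
```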

Here are some examples to help illustrate that thinking: three photos showing how the image looked before I selected elements to remove and replace (sometimes dozens of individual selections), and after. Select the first image and page through all six if you’d like to see how the changes worked out. Note, especially, how the tool generated botanically accurate new leaves: they match the colors and textures present in the parts of the photo I didn’t change, and they’re rendered not as some generic leaf shape but with the distinct appearance of chrysanthemum leaves.

Of course, the end result departs significantly from what I photographed, which raises all sorts of interesting questions at the intersections of photography, creativity, image manipulation, and even artificial intelligence. A photography theorist bound to originalist or documentary conceptions might think the distance between what I photographed and what I chose as an end result violates some picture-taking laws, but I’m not one of those theorists. I do, however, try to approach these changes intentionally, keeping in mind that we, as humans who observe plants and flowers in real life, tend to focus on the parts of a scene we consider the subject and worth contemplating, while disregarding the parts we consider irrelevant. To the camera, everything it captures is equally significant or insignificant, even though varying focal lengths, apertures, and other settings may help us isolate a subject; so it’s up to us and our discretion, not the technology, to decide what matters.

If you’ve been following me here for a while, you’ll likely recall that I’ve often produced galleries of images where I removed the backgrounds behind the subject I wanted you to see by converting them to black. This was a different kind of image manipulation serving the same goal: presenting a photo based on what mattered to me when I viewed the subject, while discarding distracting or irrelevant elements. Given the Generative Remove tool’s capabilities, I look back on that now as a transitional technique in my own development, one I’d probably still be using had this new tool not been invented. The new tool also permits me to take wider shots than I did in the past, knowing I can remove aberrations while preserving the botanical accuracy and garden context of the original scene.

Because its operation is quite opaque (a black box, in technological terms), using it is heavily experimental, and carries a lot of ambiguity since you can never get the same results twice, even if you make an identical second selection. And while you can’t tell it what to do with words that represent your thoughts or your vision, through that experimentation it eventually sinks in that you can influence how it acts. I’ll explore that more fully in a later post, but here are two things I discovered that have turned out to be consistently true: if I want to repair damage to a particular leaf, I should first remove any small spots or blemishes on adjacent leaves, or the tool will incorporate nearly identical spots into its replacement; and if I want the tool to construct something like a new leaf in an otherwise nearly blank location, it will do that accurately if I include a sliver of a nearby leaf in my selection. These two techniques tell me that the tool is contextual: before offering replacement options, it looks at what else is in the photo in conjunction with what I’ve selected.
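
If the inpainting sketch from earlier is a fair analogy, that second technique amounts to growing the selection mask until it clips the edge of an intact neighboring leaf, giving the model something concrete to extend. Here’s a hedged sketch of that step; the file names and dilation radius are illustrative choices, not anything Lightroom documents:

```python
# Dilate an inpainting mask so the selection overlaps the edge of a
# nearby intact leaf. MaxFilter grows the white (selected) region by
# roughly half the kernel size in every direction.
from PIL import Image, ImageFilter

mask = Image.open("bare_spot_mask.png").convert("L")

# size must be odd; size=9 grows the selection by about 4 pixels.
expanded = mask.filter(ImageFilter.MaxFilter(size=9))
expanded.save("bare_spot_mask_expanded.png")

# Passing `expanded` (instead of the tight mask) as mask_image in the
# earlier pipeline call tends to produce a continuation of the clipped
# leaf rather than arbitrary fill.
```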

Here’s another way to understand that. For this image, I selected everything in the sixth photo above for replacement, in effect asking Generative Remove to recreate the entire photo. What does this result tell us about how it works?

I’m glad you asked! We can see that, even though I told the tool to replace all the image’s content, it still recognizes that the primary subject was a white flower of a particular shape, that the background leaves were unimportant, and that the most prevalent colors in the image were white and shades of green. It probably knows nothing specific about objects we would identify as chrysanthemums; instead, it takes the pattern it found in my photo’s subject and repeats it, varying the pattern to simulate randomness. It applies the same approach to the grass in the background: note how each swatch of grass is very similar, though not precisely identical.

This arrangement of recurrent but slightly varying patterns is one of the reasons we would recognize this implausible field of flowers as likely created by an image generator: the patterns are too uniform, and any given section of the image looks nearly identical to any other. That has two implications: first, that when editing photos with this Generative Remove tool, I have to keep an eye out for unnatural patterns, or patterns that aren’t a logical fit; and second, that when you see an image with patterns like this, your conclusion that it’s AI-generated is most likely correct.
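
That uniformity can even be measured, at least crudely. As an illustration (the patch size and file name below are arbitrary choices of mine, not a forensic standard), you can slice an image into patches and compute how strongly each patch correlates with every other: natural scenes tend to show a wide spread of similarities, while generator output often clusters unusually high.

```python
# Measure patch-to-patch similarity as a crude "too uniform" detector.
# High average correlation across non-overlapping patches hints that
# sections of the image repeat the same pattern with small variations.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("suspect.jpg").convert("L"), dtype=np.float32)

P = 64  # patch edge length in pixels (arbitrary choice)
patches = np.stack([
    img[y:y + P, x:x + P].ravel()
    for y in range(0, img.shape[0] - P + 1, P)
    for x in range(0, img.shape[1] - P + 1, P)
])

# Normalize each patch to zero mean and unit length, then correlate all pairs.
patches -= patches.mean(axis=1, keepdims=True)
patches /= np.linalg.norm(patches, axis=1, keepdims=True) + 1e-8
similarity = patches @ patches.T
np.fill_diagonal(similarity, np.nan)

print(f"mean patch-to-patch correlation: {np.nanmean(similarity):.2f}")
```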

Thanks for reading and taking a look…

And Happy New Year!