Keep Emotion Out Of The Blender
This week, I've released the first episode of the fourth season of the podcast Stories of Emotional Granularity. This episode explores the emotion of feeling grounded.
(You can hear the episode on Apple Podcasts, Spotify, or directly through my web site.)
I wanted to begin this new season of the podcast with the concept of feeling grounded because grounding is a thoroughly analog (and also analogue) experience. People feel grounded when they feel connected to something solid and fundamental. Grounded experiences are not floating "in the cloud", or swept up in the currents of abstract flow.
When people feel grounded, they have a strong sense of context. They know where they stand. They know who they are. They know how they relate to their surroundings.
As I was editing together the voices of Audrey Holocher, Pinelopi Margeti, Todd Saddler, Betti Rooted Lionheart, Ian Williams, Emily Avila, and Rachael Stamps, it struck me how each one of these people described experiences that were profoundly embedded in their individual perspectives.
My approach to producing Stories of Emotional Granularity is to wait to release an episode until I have interviewed many people talking about a single emotion. I center each episode around one emotion, described from many perspectives, instead of simply featuring a long conversation on many subjects with one person.
This is the twist that comes within feeling grounded: Although the emotion of grounding is about finding the center, everybody must find their own center when they are grounding. Being grounded is about forming a personal connection, in a context that's individually authentic. Trying to be grounded in a way that other people are grounded would completely miss the point. Grounding is a process of gathering awareness of oneself in relationship to one's surroundings. There is no common ground in the process of grounding.
Because grounding is experienced in an individual context, it would be inaccurate and inappropriate to describe the emotion of being grounded only by talking to one person about their experience of it. Likewise, to try to identify what feeling grounded means by gathering data about what large numbers of people say about feeling grounded, and then recreating an average of the most common words and phrases people use to describe feeling grounded, would entirely miss the point.
Being grounded is not something we feel when we approach an average, broadly predictable pattern of settings, behaviors, and other stimuli. It's about defying the pull of social and linguistic norms to find something that feels true to us in one particular moment, based on where we're coming from and where we are going to.
The Inapt Blender
I began this article with a photograph of a blender in a garden because that image represents what it's like for a useful piece of technology to be profoundly out of place. A blender does great work for us in the limited confines of a kitchen, but it's ridiculous to take it out into a garden and expect it to be of any use.
The problem isn't just that the blender needs electricity to work. We could go to the trouble of taking an extension cord out into the garden, after all. The core problem is that gardens aren't for blending.
Go back to the top and look closely at the garden in the photograph. You'll see that the garden is not a monoculture. There are many plants growing together in a kind of improvised ecosystem. You can see the leaves of geranium, sedum, wood sorrel, and sweet woodruff in that picture, all rambling together, with the bold lion-toothed leaves of a dandelion sprawling around the blender. The yellow flowers you can see are not dandelions, though. They're buttercups. I didn't plant those buttercups. They came into the garden all on their own, through lovely happenstance.
If you look especially closely, you might see that over the leaves of some sedum, another flower is on display. It's not what you'd think of as a flower. It's the odd, brown, fuzzy sort of string in the lower right. It's a flower from an oak tree that grows above the garden. We don't think of oak trees as going into bloom, but it's these odd, inconspicuous flowers that bring us the oak's iconic acorns later in the year.
This is not an image that would be produced by generative artificial intelligence. For reference, here's what Adobe's generative AI engine produced for me when I prompted it to give me a photorealistic image of "A blender in the middle of a garden with dandelions, geraniums, wood sorrel, sedum, and sweet woodruff."
At first glance, the generative AI image looks on target. It's superficially plausible. Generative AI is great at looking superficially plausible.
Take a second look, however, and the plausibility of the image fades. For one thing, the photograph looks designed. Of course it would, because most of the images of gardens that generative AI platforms have been trained on are photographs that were produced for gardening publications that aim for a very neat and tidy aesthetic. Garden design photographs tend to look too good to be true because they are too good to be true. Many of these photographs are of sites that have been heavily prepared for photographers so that they have excellent visual composition, are free of weeds, and feature only healthy looking, mature plants. So, generative AI bases its own images of gardens upon an implausible ideal that doesn't match what most real gardens look like.
Anyone who has spent time in a temperate garden in Europe or the United States can spot several botanical irregularities in the generative AI image. The dandelion on the left seems to represent the idea of a dandelion fairly well, until you pause to consider the leaves. The leaves are all tangled up with the dandelion's flower stalks, which is not what dandelion leaves have evolved to do. Dandelion leaves spread horizontally, to catch the rays of the sun. They don't stretch up vertically, like dandelion flower stalks do, because the two parts have different functions. Of course, generative AI has no comprehension of the different functions of dandelion leaves and flower stalks. It is only capable of mushing together some kind of average of images that have been labeled "dandelion" by humans working behind the scenes of the "artificial" veneer.
The dandelion on the right is even more off base. It features dandelion flower stalks with mature seeds, which seem close enough to the real thing, until you realize that they appear out of nowhere, with no rooted plant at their base to feed them and anchor them in place. The generative AI has no sense of dandelions as complete organisms with full life cycles. That's because generative AI doesn't think. It just repeats patterns that have been encoded for it over and over and over again by armies of underpaid human workers.
What about the other plants? The "geraniums, wood sorrel, sedum, and sweet woodruff" are completely missing from the picture, probably because they are less commonly known plants than the iconic dandelion, and weren't strongly coded for in the model's training. Most of the human workers whose collective knowledge makes the generative AI model function couldn't identify these plants to verify or correct images of them. Generative AI functions as a garbage-in-garbage-out system, which is constituted out of extremely coarse data that represents broad patterns in garden photography without adequate data about specific garden plants.
At least the generative AI got the blender correct, shining brilliantly in the morning sun. Right?
Nope. Even the relatively simple technology of the blender is wrong. Blenders tend to look similar to that shining metallic thing in the middle of the image, but they sure don't function like it would. Look at the markings where a blender's buttons would be. They are buttonesque squiggles, located roughly where blender buttons belong, but they have no labels, and many of them aren't even the right shape or size to be pushed or turned by human fingers. Once again, we see a rough mimicry of what a commonly known object looks like, devoid of any comprehension of what that object is and what it's for.
Human emotional experience is like the garden. It's unexpectedly diverse, and it doesn't neatly match tidy design abstractions.
In real garden plants, form emerges out of function. Each species is the result of billions of years of adaptation; its intricate, integrated life cycle emerges as what we see because of complex systems woven together in a living fabric that is structured all the way down to the molecular level.
Our emotions are not simple codes that can be grasped and replicated merely through visual scans or pattern recognition in linguistic analysis. They are, like the anatomy of plants, manifestations of billions of years of evolution. They are grounded in our biology, just as the appearance of plants is grounded in theirs.
Every garden is individual. It doesn't represent a Platonic ideal of what a garden is. Each garden has different plants, animals, and other organisms growing together in a complex community that's grounded in the conditions of its location.
Generative artificial intelligence is more like a blender than a garden. It takes abstract representations of reality and chops them up into conveniently digestible bits, producing a substance that is an accurate recreation of the material it was fed on the average, but not in the particular.
Generative artificial intelligence grinds reality up, and yet, the simulations of reality that it creates are ungrounded. The creations of generative AI are to reality what processed food is to a basket of vegetables fresh from the garden.
Humans need whole food. Humans also need whole emotion, grounded emotion.
Keep Emotion Out Of The Blender
No self-respecting garden magazine would feature that generative AI mimicry of a garden. The image would be an insult to readers, and a sign that the publication's editors had given up all semblance of self-respect.
It's a bit easier to fake it with words, as language carries with it an inevitable ambiguity, and a presumption of meaning that's something like the suspension of disbelief in a theater. We habitually assume that when sentences are composed, there is a person who has composed them, a personality with consciousness and a deliberate purpose.
As a result of this suspension of disbelief, firms that purport to "research" emotion using generative AI's large language models have gotten away with a good deal of fakery over the last couple of years. These firms can produce something that looks similar to research into emotional motivation, just so long as you don't stop to take a close look at what's actually being done.
The experience I went through interviewing people, and then editing together their thoughts about the emotion of grounding into a podcast episode, reminded me of the importance of studying emotion through grounded research. It's simply not conceptually coherent to take a huge amount of data that comes from all kinds of places that we don't even know about, stuff it into a gigantic data blender that operates without transparency, and then accept the homogenized slop that emerges from the algorithmic processor.
Grounded research into human emotion is based upon real people talking about actual experiences in their individual lives. Genuine emotion is always contextual. It's always grounded.
One easy way to tell that research into emotional motivation is grounded is that the material it works with comes from actual, identifiably individual human beings. Research participants can remain anonymous, of course, but in grounded research, expressions of emotion will be explained in the context of particular real-life experiences.
Another way to tell that research into emotion is authentic is that a diversity of perspectives is represented. Emotion is subjective experience. It's not standardized. It doesn't follow a simple formula that can be replicated through non-existent "synthetic users".
Above all else, grounded research into emotion is an immersive experience of direct encounter with the feelings of other people. You can't understand a garden just by looking at photographs of a few flower beds. You need to walk through the garden, to see it from different angles, to hear the way the wind moves through the plants, to smell the earth and watch the bees bumble from blossom to blossom.
If research into emotion never brings you face to face with people as they struggle with feelings they can't quite control, it's just dancing around the edges of what matters.
Touch ground. Keep it real. Keep it human.