Oh man, y’all, I have been through so many false starts on this post. Did you know that when you’re only about 10% of the way through a book for the first time, it can be tough to corral your provisional assumptions and early observations into a proper argument?
Daryl and Paul took a much more sensible first-timer’s path, paying attention instead to what latched onto their reading experience like burrs on their socks and collecting the signals that suggested the future importance of marks and names. I’ma do that too, because the bell that keeps ringing in the back of my mind throughout this first week’s reading is empathy.
And listen, I know it’s thoroughly trodden ground to suggest that a novel might be concerned with empathy. That’s one of the original functions of fiction, right? Inviting empathy is a signature strength of the novel as a form in particular, with drama as its nearest competitor. There’s a lot that’s tedious about Percy Shelley, but this part of his Defence of Poetry has stuck with me for decades:
The great secret of morals is love; or a going out of our nature, and an identification of ourselves with the beautiful which exists in thought, action, or person, not our own. A man, to be greatly good, must imagine intensely and comprehensively; he must put himself in the place of another and of many others; the pains and pleasures of his species must become his own. The great instrument of moral good is the imagination.
Of course, he’s talking specifically about poetry (and, gross, specifically about men), but to Percy Shelley literally almost any creative expression of the will counted as one of “the kindred expressions of the poetical faculty”: “architecture, painting, music, the dance, sculpture, philosophy, and, we may add, the forms of civil life.” The point stands well enough, I hope.
(It looks like a scholar by the name of Suzanne Keen has done a lot of work in this field, which I’d love to read.)
But in Bubblegum it’s more explicit than that. Belt withholds from his readers the datum of his diagnosis out of a concern that it will make us unable to empathize with him. He’s afraid that instead of seeing him as a whole person, with varied motivations and experiences (y’all, I wanna quote my Whitman motto so bad right now), we’d only be able to think of him as a psychological disorder taking shape through time. The narrator of our book doesn’t trust us to extend him the empathy he deserves as a fellow human being—and (sad thought) that’s probably a conditioned distrust.
Speaking of conditioning! The cures are another site of empathy as a theme in this book, as I see it. They were originally designed as therapy animals for children with psychotic disorders—am I remembering correctly that it was called the Friends Study?—who have to learn to understand the cures’ needs and care for them. And of course Belt appears to be unique in thinking of Blank as a pet and even a sibling. But “flesh-and-bone robot” is in the blurb we’ve all seen for the novel, right? So we already knew empathy was going to be an issue, specifically the question of who/what deserves it—because that’s why you put a robot in a story. Whatever your personal threshold is, whether it’s sentience or altruistic behavior or being alive or anything else, a story with a robot in it is intended to destabilize your certainty in that threshold.
(Briefly on sentience: The Turing test is our famous benchmark for identifying “intelligent” behavior indistinguishable from that of a human being. But note the formulation there. It’s not for measuring when a machine has become intelligent; it’s for identifying when that machine has become capable of behaviors consistent with intelligence, such that a human observer infers the one from the other. This is a behaviorist test, right? It’s commonly, inaccurately used to “prove” the existence of something interior whose very existence, or at least knowability, behaviorism would reject. Hence the excursus in Bubblegum on training your cure with conditioning methods, and all the documentation on cures that rigidly refuses to accord them any status but that of machines producing outputs from inputs.)
Belt’s threshold, it appears, is much lower than those of the people in the society around him. Much lower, we learn, because it’s not just other people he goes out of his nature into. It’s not just Blank and other cures. It’s…most things. When an inan may strike up a conversation with him at any moment, without warning or even previous identification as an inan rather than an inert object, it seems like there’s very little room for him to draw that circle that contains the empathizable-with and excludes the things that are beneath empathy. His swingset murders are mercy killings, specifically prompted by connecting to the suffering he perceives in the swingsets and their desire to be released from it.
On the other hand, he doesn’t really have any compunction about gaslighting his horrible racist grandmother into thinking she’s developing dementia, so. Complicated subject. And I can tell I’m going to be thinking about it a lot!
You are definitely onto something here, Jeff. And, sure, it’s not new ground, but treading familiar ground in different boots, or on stilts, or hopping on one foot, or whatever treading metaphor adds some whimsy here, is sort of what the great novels do.
Part of what I’m curious to see is how the theme of empathy toward cures in particular will emerge. Levin doesn’t strike me as one to bang the drum for animal rights (etc.) via a transparent, pat little allegory. So what will he wind up doing with this?
I like your comment about Belt’s threshold. It makes me think too about thresholds for recursive thought. In both The Instructions and Bubblegum, we see obsessive thinking about thinking, and working the material of thought even to the point of absurdity. I am this sort of thinker, and I believe that’s why some of the books that portray this kind of thinking work for me — they offer a sort of empathy with respect to my ridiculous internal monologue, I suppose — but I also think that there’s a threshold beyond which that sort of thinking ceases to be useful.
The two things may seem unrelated, except that about half the time, the sort of recursive thinking I’ve got in mind is, at least for me, centered on how my thoughts or actions might influence others and whether thinking about those thoughts or actions carefully in advance ought to change how I actually behave (ad infinitum). In other words, it is often empathy of a sort that drives the absurdly overworked thinking.
It’s possible I’m grasping at straws there, or, heh, overthinking things and seeing something where nothing exists (something I’m finding I have a lot of anxiety about while writing about this book, as a post landing tomorrow will demonstrate). Thanks in any case for sharing your thoughts as you dive in.
I love Belt’s overthinking. I find it very enjoyable to read because he’s not overthinking on a hyper-intellectual level that might try to confuse the reader. He is simply looking at every possible angle of a situation (which, yes, does get confusing after a while, but in an entertaining way).
Since many of the characters are hard to read at this point, it’s difficult to know which insight is even the most likely.
I’m not even sure we know yet what Lotta’s ultimate motive was for their “date,” right? It couldn’t have been just to get him to eat a cure, surely.
It’s hit me over the last two weeks how hard it is to write critically about a part of a story that you haven’t finished yet. If you know the story, you can look for occurrences of foreshadowing or the like. But right now we’re looking at details and imagining what might happen.
This is true for any book, but when you’re doing a careful read, it’s even stranger wondering what is significant and what is just something the author put in there as a red herring (or just for fun).
I love thinking about the empathy in this book. There’s so much of it in so many different directions. The SafeSurf seemed to have some empathy for the slide, for instance.
It will be very interesting to find out if the inans are actually speaking to him or if it’s in his head (well, it’s all in his head, but you know what I mean). Instinctively I assume he is putting words into their ||mouths|| and yet I also don’t see how you could have a flesh-and-bone robot.