Attention machine, pt. 2

 

Drawing of a woman looking at something out of frame.
“Woman expressing attention, desire and hope.”

Every single thing requires attention. Attention suffers not from the problem of being hard to define — instead, so many definitions exist in so many different fields that you could find useful interpretations of it everywhere you look.

Cognitively, attention is the ability to concentrate on one thing while tuning out other details in the environment. It’s about what you see, but also about what you don’t see. If we couldn’t pay attention, we would die under a crushing mass of stimuli.

On the internet, the cognitive definition of attention holds up, but is heightened: rather than “focusing” and “tuning out,” architects of the online world sever you from the information they think you shouldn’t see. While on TikTok, you’re not tuning out all of the videos that don’t cross your For You page. On Instagram, you’re not concentrating on your feed. In both cases, it is a given that your attention is wholly engaged with the platform. Whatever it shows you, you’re all in. Everything else is dead limbs.

That’s what surprises me about social media: how small it feels, despite literally everyone being on it. It’s the world’s hugest, sugariest cake that we’re only allowed to eat one bite at a time.

Maybe this is a good way of thinking about attention in general. We are overwhelmed by the world. By the huge cake. Some people are able to take measured forkfuls and stop when they get a tummyache, sure. Others of us just can’t help ourselves. We shovel it in, pink and oily, hand over hand, until we’re raw from the grit of sugar crystals and heaving. Wouldn’t it be better to sit, lean with our backs against the old-god dessert, and let someone else measure it into our mouths?

Cognitive metaphor theory comes to us from linguistics, and it argues that we translate abstract concepts into concrete metaphors. What is the mind? I don’t know. So I say “I’m a little rusty,” “we’re running out of steam,” “his ego is fragile,” “I’m firing on all cylinders.” I speak in metaphor: The Mind is a Machine.

These metaphors end up becoming so pervasive that we think about, speak about, and operate one thing as though it were the other. If someone carries a great emotional weight for a very long time, and suddenly collapses, unable to function from the stress of it all, I could say they “broke down.” That’s all the explanation you’d need. Even though, what does that even mean?

One of the great, early cognitive metaphors is, “Time is Money.” It so pervades language that it’s become a saying all its own. We waste time, save time, borrow time, invest time, put time aside, budget time, make time, lose time. “Attention is Money” feels tempting as a corollary, except that’s not really right. You can give someone your attention, you can pay attention. But it can’t be saved, or budgeted. It can’t be wasted, since if you’re not paying attention to something it’s not like your attention is bleeding out onto the floor. If you’re not paying attention, then the attention just doesn’t exist. Time is the number line, attention is points you plot along its axis.

Also, the way more salient metaphors we use for attention — other than “pay” — are “keep,” “hold,” and “lose.” Time is value, but attention is valuable.

Something else interesting about attention-as-commodity (we’re stepping away from cognitive metaphors here) is that you can’t hoard attention. The only thing you can do with attention is give it to something else.

These are just my scattered thoughts on attention, after both failing to understand attention as a machine learning concept and being disappointed by the Atlantic article I mentioned yesterday. Something something fMRI, something something brain waves. But I did read it. Every word.

If you read this whole thing, I guess I should thank you for your attention.

Attention machine

Part of what’s difficult about watching movies and TV shows, versus reading books, is that you can’t really “skim” a movie or TV show. Skimming a book is a matter of pace: how much time you give the information to come through.

That’s not totally true: Skimming a book can just as easily be a matter of attention, of percentage. What percent of the information do you actually interpret? In that way, skimming a movie is just as possible. You just put it up on the second monitor and sort of pay attention, sort of don’t. It takes the runtime of the movie. Audiobooks, I guess, are the same way. I hit some really heavy traffic coming back from the holidays. Jack Reacher played out through the whole ride. There is no more skimmable extant fiction than a Jack Reacher novel.

Attention is the machine that turns art into thinking. I’m also hopeful that it can be trained, because it’s a bummer to live in a world full of art you can’t attend to.

One more thing worth training: I used to wish I could keep a routine, but I think a much better skill is restarting a routine that you’ve fallen off of. Something can always knock you off your rhythm, but that’s not a problem if you know you can find it again.

Just wanted to get this out into the world before I miss a post.

EN: Let me catch myself red-handed here. I wrote this minutes before a D&D session, minutes after I cooked dinner. I hadn’t even read the Atlantic article called “The Attention Machine,” or studied up on the concept of attention in machine learning. This isn’t performance art — me half-attending to a blog post about half-attending to things — just me still getting a handle on the daily blog dance. I’ll have something more intelligent to say about attention tomorrow. Scout’s honor.

The right direction

Short post today, partly because I need to make a shorter post today and partly because I need to enforce shorter posts for myself in general, some days. Thought I’d reach into my collection, see if there’s anything interesting.

An excellent Great War-era quote from the British Journal of Ophthalmology:

“Some military authorities hold that a man, unless he is a sniper, need not see what he shoots at as long as sufficient visual acuity enables him to fire in the right direction.”

The invention of spoilers

A spoiler warning, with two black-and-yellow rectangles, hazard signs, and an eye with a strikethrough. The message reads, "SPOILER WARNING!"

Doug Kenney published “Spoilers” in the April 1971 edition of National Lampoon. National Lampoon, if you have a compound sentence for an age, was a humor magazine with, gosh, just the horniest covers ever writ upon a page. Whatever. I’m not here to try and unsnarl ’70s humor.

“Spoilers” gave away the ending to Citizen Kane, The Godfather, Psycho, and as Doug Kenney put it, “every mystery novel and movie you’re ever liable to see.”

This has been described as the first use of the word “spoilers” — but obviously, if it’s being referenced as the title of a National Lampoon article, it probably already existed as a spoken phrase. Unless it really was invented by the article. I’m at least comfortable suggesting that the article, as a cultural artifact, marks a sea change in the popularity of spoilers as a concept.

Something really interesting to me about the timeline of media is that, while art feels very subjective, its progression often looks more like technological advancement — the progression of art is preceded by technological advancement, even — than it looks like, say, human nature. You can compare current events to historical events and see clear parallels where people haven’t changed at all. Compare modern art to historical art, and there are stark, objective differences. I’m not even talking about Ancient Greek novels versus Agatha Christie novels, although I’m sure those differences exist and can be plotted. I mean the play versus movie versus the reality TV show.

Attitudes about art change with art. You can’t really spoil a painting (can you?), so let’s say “narrative” instead. At some point in the middle of the 20th century, people became preoccupied with spoilers in a way that we don’t have record of before National Lampoon, apparently. I’ll take a stab and say mass media, and later the internet, made the availability of spoilers enough of a concern that we started paying attention. (Although, ironically, I can’t actually find Doug Kenney’s article anywhere online.)

Narratives are basically about change. I find it really interesting to read through these six-word stories and see what they have in common. I think some are basically poems, and some are just little jokes. But Margaret Atwood’s, “Longed for him. Got him. Shit.” is unquestionably a narrative. It has a status quo (Longed for him) that’s disrupted by change (Got him) with a result (Shit). The result isn’t even necessary. Lots of writers on this list cheat by leaving the ending to implicature — like Rockne S. O’Bannon’s, “It’s behind you! Hurry before it”

One of the famous early novels, The Tale of Genji, cuts away at the end to (spoiler) imply the death of the protagonist. Its final chapter is completely blank.

Researchers used “sentiment analysis” (eep) to describe six possible story arcs. Each one is identified by the number and direction of changes its subject takes: rise (“rags to riches”), rise then fall (“Icarus”), rise then fall then rise (“Cinderella”). This provides probably the best way of describing why spoilers can be so painful. Les Miserables follows the rise of Jean Valjean, and throughout the whole story you’re wondering, “Is this an Icarus, rise then fall? Is it a Rags to Riches? If Valjean falls, will he rise again?”
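The arc idea above can be sketched in a few lines of code. This is my own toy classifier, not the researchers’ method, and the sentiment numbers are invented for illustration: it just reduces a series of sentiment scores to its sequence of swings.

```python
def arc_shape(sentiment):
    """Reduce a series of sentiment scores to its sequence of swings,
    e.g. ["rise", "fall", "rise"] for a Cinderella-shaped story."""
    moves = []
    for prev, curr in zip(sentiment, sentiment[1:]):
        if curr == prev:
            continue  # flat stretches don't change the arc
        direction = "rise" if curr > prev else "fall"
        if not moves or moves[-1] != direction:
            moves.append(direction)  # only record a change of direction
    return moves

# Hypothetical sentiment series, invented for illustration:
icarus = [0.1, 0.5, 0.9, 0.4, 0.0]   # rise then fall
cinderella = [0.2, 0.6, 0.1, 0.8]    # rise, fall, rise

print(arc_shape(icarus))       # → ['rise', 'fall']
print(arc_shape(cinderella))   # → ['rise', 'fall', 'rise']
```

The nice thing about this framing is that the spoiler question becomes concrete: knowing the ending tells you which of the six shapes you’re in.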

There’s another kind of spoiler, which I think is specific to whodunnits, where the mystery has an answer and that answer is given away. Relatedly, the reveal that Darth Vader is Luke’s father doesn’t materially affect the rising and falling of the hero. It just ups the stakes, and answers a question that the story asks several times.

Interestingly, psychologists found that a person enjoys a story more when they know the ending going in. One of the reasons posited in this particular study is that the reader can appreciate aesthetic elements when they aren’t preoccupied with a story’s ending. I hear that. But maybe the overwhelming popular distaste for spoilers indicates something important about how we enjoy stories. Namely, that we don’t always watch stories to enjoy them. Sometimes we want to suffer a little.

Is VR 4D (or: technorati alphabet soup)

An animation of a 2D platformer, projected as a plane from slices of a 3D Minecraft world. It's very complicated, I'm sorry.
Read on to know what you’re looking at. (Image credit: Mashpoe)

By way of figuring out whether I’ve got the bones of an actual essay, I want to talk through a weird turn of phrase I’ve seen a lot around discussions of VR — particularly lay discussions of VR. It’s the idea that the virtual future we’re about to enter into is “4D.” Journalists often describe (I will use this word exactly once, and here it is) the metaverse as fourth dimensional.

Disclaimer that I am also a layman when it comes to VR. Disclaimer that I actively do not want to criticize lay descriptions of things. But where does this language of dimensionality come from? Is it just a sci-fi synonym of “futuristic,” or do these writers have something else in mind?

To start, let’s talk about dimensionality. “Dimensions” come to us from topology, a field of mathematics that studies geometric objects. Specifically, it studies objects as they transform without breaking, puncturing, gluing, or sewing. Topology treats objects as pieces of dough or clay that you can stretch and mold however you want, but that you can never tear up. A donut is different from a flat sheet, but not different from a drinking straw. Topologically. To be honest, I’ve never found an easy description of topology.

Dimensionality appears in all kinds of fields, though. You’ve probably heard that the fourth dimension is time. What that means is that, if you were mapping the world on a coordinate grid, you could cover every point with a unique three-number code — but you would need a fourth number, a fourth dimension, to also indicate every point of the world at any point in time. I live at x latitude, y longitude, z elevation, and w time. This is a really classic use of dimensions.

As with literally every concept in math, dimensions are abstract. They don’t have to be used for anything. That’s what all the letters are for: they’re understudies for whatever ends up being useful. Fourth-dimensionality describes any conceptual space that can be defined along four axes. Your opinions on burgers, hot dogs, tacos, and french fries can be plotted in four-space (that is, in 4D). The Rotten Tomatoes audience scores for the Twilight tetralogy are (72, 61, 60, 65) if you average Breaking Dawn parts 1 and 2. The scores for Diary of a Wimpy Kid are (49, 62, 63, 30). You can plot the quality of a tetralogy in 4D.
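To make “plotting a tetralogy in 4D” concrete, here’s a small sketch. It treats each tetralogy as a point in four-space (using the Rotten Tomatoes audience scores quoted above) and measures the ordinary straight-line distance between them; the formula is the same Pythagorean one from 2D, just with four terms.

```python
import math

# Each tetralogy is a point in four-space: one axis per film's
# Rotten Tomatoes audience score (the numbers quoted in the post).
twilight = (72, 61, 60, 65)
wimpy_kid = (49, 62, 63, 30)

def distance_4d(p, q):
    """Euclidean distance, which works in any number of dimensions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

print(distance_4d(twilight, wimpy_kid))  # → 42.0
```

The two tetralogies sit exactly 42 units apart in quality-space, which feels cosmically appropriate. Nothing about the math cares that the axes are movie scores rather than latitude, longitude, elevation, and time.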

Why do people mostly focus on dimensions of spacetime? Because it’s more interesting to know how we should describe the universe than it is to know the quality of Diary of a Wimpy Kid movies. (Barely.) Milo Beckman writes in Math Without Numbers that researchers have a legendarily difficult time finding all of the unique shapes in four-space, despite the fact that it’s probably the best way to describe the real cosmos.

It’s not just that we don’t know the shape of the universe — until we finish classifying the four-manifolds, the universe might just be a shape we haven’t thought of yet.

None of this gets us any closer to knowing why VR apparently feels like a fourth dimension, on some instinctive level. Disregard time, and virtual environments are still modeling three-dimensional space by projecting it onto two-dimensional screens. You could assign a binary value in the fourth dimension — “blink once if you’re in the real world, blink twice if you’re in the virtual world.”

Here’s my best guess: Every virtual world simulates a 3D world. It simulates our world. Even the Matrix (which we are a far cry from) simulates the real world. While in one of these virtual worlds, you’re never very far from other virtual worlds. I imagine it like the Construct. You can enter it from anywhere, and load in anywhere. In that way, it overlaps the 3D simulation. Mashpoe’s 4D Minecraft is a more niche, but much better comparison.

Virtual reality isn’t just a world, but its own solar system of worlds that you can hop between at will. I’m generally skeptical about VR — I don’t think it will ever get to the fidelity of actual reality — but one thing it does well is collapse distance. Which is exactly what higher dimensions are about.

Uh, thanks for coming along on this journey with me. I hope you found it a little interesting. I definitely didn’t have an answer to my own question when I hit “new post.” At the very least, I hope you find topology as mind-bendy as I do. Now you can enjoy a scholarly chuckle to some jokes by mathematicians.

Love vicarious

Corita Kent said:

I don’t think of [my work] as art — I just make things I like bigger.

Art is about liking and loving things. Criticism that relies on hate always leaves a bad taste in my mouth.

Studying, adoring, eating something whole deepens your love for it. I think the only way to broaden your love for things in the world — to move more of the universe of things from your “don’t like” and “don’t care” columns into your “love” columns — is to see them through the eyes of someone else. I’ve always been afraid of bugs. The person I love most in the world loves bugs.

This is an orchid mantis. I love the orchid mantis.

An orchid mantis, camouflaged to look like the flower it's perched on.

I watched X-Men

A man's face stretching as he sticks it between metal bars, from X-Men (2000).
Ew. (From X-Men, 2000.)

I ended up watching the original X-Men movie today. Not sure how it crossed my desk, except that I’ve been watching a ton of video essays on film YouTube about the failings of modern superhero blockbusters — you know the ones.

I don’t think a lot about the MCU, except that all triple-A action movies are sort of caught in its orbit. X-Men wasn’t a scrappy production ($75 million USD). It does hail from a totally different period of superhero movie, particularly the superhero teamup movie, and that made it a really interesting watch.

Three things I loved (spoilers, I guess):

  1. A surprising amount of body horror. The premise, that mutants are a new and misunderstood phenomenon, is supported by how gross some of the mutations actually are.
  2. The end of the Wolverine/Mystique fight. Mystique pushes Wolverine away and escapes; Storm beats Toad (and says her iconic line, I don’t care what people say); Storm approaches Wolverine cautiously while he stands in the middle of the room, like he’s trying to sense something. What amazed me about this scene is how its dramatic tension exists purely for the viewer. A lot of movies in this position would have shown us where exactly the bomb is, then played the scene out. We, Dora the Explorer-like, point and yell, “She’s right there!” Those scenes can and do work. I liked being the beneficiary of the suspense, though, rather than the characters on the screen.
  3. Wolverine (I love Hugh Jackman, can you tell) escaping from the metal restraints in the Statue of Liberty scene. Something X-Men movies are very good at is imposing psychological limitations, rather than physical limitations, onto their power system. We know Wolverine can survive stabbing himself through the chest, but he tells us earlier in the movie that he still feels pain like normal. At the climax, he endures that pain to save Rogue.

Anyway, I liked it a whole bunch. Plan on watching X2 sometime, but I may or may not write about it. I’m interested in writing more review-y posts, though I realize they may only be useful to worldbuilders and Dungeon Masters like myself.

Tomorrow I’ll do some honest-to-goodness research. Scout’s honor.

The Fountain of Youth

Two mice, one lying on the ground with head resting on forepaws, the other is standing on hind legs with forepaws crossed, they are looking at each other, with three bells on the ground.
Start a blog? Great idea.

Blogging is maybe on the horizon of some kind of cultural moment. The Verge and Tedium (an excellent publication with excellent editorial taste) detect that, as Twitter’s twilight makes us question what’s possible with short-form content, the long-form of the blogosphere will recapture our attention.

I have mixed feelings on this. For it to come true is an objective positive (although, I swear I had decided to start this very blog prior to the New Year), but it’s hard for me as a young person to imagine something other than full-faucet media. To imagine art, rather than content.

Worth getting ahead of this early and often: I feel extremely unqualified as a trendspotter. There’s not a party alive I wasn’t late to. So I won’t even try to predict whether or not blogs will actually succeed — I hope they do — but like I said, I’m young, and it’s not something I’ve seen before. The shape of it is alien to me.

(This sort of doubles as an intro post to whatever it is I’m doing here. Hi!)

If I were to target the specific quality of blogging that feels beyond my attention, it’s this: the currency of the internet is specialization. You can’t just write, or take photos. You can’t make a YouTube channel for your Minecraft let’s plays and then post a recipe you made, and then make cultural commentary. All of those things feel very antiquarian to me, from an earlier time of social media. TikTokers bemoan that one viral joke dooms your account to forever be a retelling of that joke, changing the words slightly each time, until you are sorted into your usual obscurity.

There are fashion accounts, and there are cooking accounts. There are listicles, there are reviews, there are how-to’s. Here is the Fountain of Youth: to find one thing that you don’t mind doing forever, and do it. Forever.

Gosh, don’t think I’m bleak for saying this. I’m new to it is all. What charms me so much about a blog is that it’s a very personal space, where you don’t have to shape your ego to the peculiar demands of the machine. But because you’re not playing the machine’s game, you also don’t get the machine’s bounty: other people’s attention. Something tells me this is a feature, not a bug. You may not get to be a household name, but you can be a “Kansas City Star.”

I made an account on Substack the other day, anyway. You know, the one thing I really consider myself an expert in is D&D, so maybe I’ll write about that. I love reviews as a format. First thing, I should probably figure out how to make a good intro post to a blog. Scratch this one.

Santa’s panopticon

A sepia-toned drawing of santa at a feast table, surrounded by people in royal or religious headgear. He's holding his fingers over his lips, like he's asking you to keep a secret.

I’ve been really digging into the work of Rayne Fisher-Quann lately, particularly some of her conversations about “coolness” online. I’m going to cook on these ideas a little longer and hopefully come back with something more essay-y to say about how creativity appears online as an aesthetic quality rather than as a series of habits.

Meanwhile, her West Elm Caleb essay has me digging back through my college Google Drive — which, I think I locked myself out of trying to access. So long, Toxic Anime Husband PowerPoint presentation. Why I’m digging through the Drive though, is that Rayne’s coinage of the “feminist panopticon” reminded me of this study I read in my sociolinguistics days: “Encounter with reality: Children’s reactions on discovering the Santa Claus myth” by Anderson & Prentice. A few weeks too late to be holiday-relevant, almost a year late to be meme-relevant. This is where I plant my flag.

(Oh, and another thing: I just realized I can use exclamation points! Because this is my own damn website!! I promise I won’t use this power for ill.)

Anderson & Prentice interviewed children who no longer believed in Santa Claus. They asked the subjects, among other things, to recall the events that led up to them uncovering the conspiracy, and how they felt afterwards. The researchers also interviewed the parents of the children who no longer believed in Santa Claus, and asked what they did to keep up the ruse, and why.

So as to delay the reveal of the study’s results, maybe it’s worth sharing my own nearness to this question. When I was in first grade, I broke the hard truth of Santa to — no lie — my entire class. Authorities found me sitting smugly at a long cafeteria table full of wailing six-year-olds. There were phone calls made. I hope that the universe truly is indifferent, because otherwise my judgment will be broadcast cosmos-wide.

Unless a nasty first-grader spoiled it for them, the children of Anderson & Prentice’s study actually reported strongly positive memories of their discovery of the Santa myth. Children are already very good at holding many conflicting beliefs at once, both committing to a fantasy fully while also recognizing its absurdity. Couple that with a kind of escape room that everyone from caretakers, to malls, to trillion-dollar corporations actively help build. You’d enjoy unraveling that mystery, too.

Parents, on the other hand, expressed strongly negative feelings when their children discovered the Santa myth. It’s a loss of innocence, it’s the end of the very first season of your life.

Two details from this study will (I hope) tie it back to Rayne Fisher-Quann’s panopticon.

  1. Parents were way more invested in enforcing the reality of Santa Claus than children were in accepting that reality.
  2. Parents enforced the Santa myth for the magic of it all, not to make the children behave, because cautionary tales really don’t make children behave.

Cautionary tales don’t make anyone behave. We delude ourselves to say accountability is the benefit of universal self-surveillance. Seeing West Elm Caleb — or any publicly bad person — dragged through the public square (or the public hypercube) will never prevent another similarly bad person from being similarly bad. I seriously doubt it will stop the person being dragged from being bad. If you let someone back off and mend themselves in a private space, they might. If you force someone to double down, they will. This is the universal fate of publicly shamed people, from Twitch chatters to world leaders, as far as I know.

All of this was true prior to the age of data brokerage. Now, desensitization to surveillance is basically a requirement for using the internet.

In the same way that caretakers shape the reality of their children to preserve the magic of Santa — and god, I am only now realizing how gymnastic it is to directly equate Santa and surveillance culture, but I think the metaphor is a sound one — in the same way caretakers propagate a version of reality that aligns with their own self-interest, so do all power systems curate the experiences of those they lord over, better aligning that experience with their own goals. In the case of tech billionaires, those goals are likely not to enforce good behavior. You have to wonder what “holiday magic” they’re trying to capture.

Change the systems around you

My desk, with a brown leather Traveler's notebook, a red tupperware, and flashcard cubes full of kanji. The lighting is terrible.
My notebook, my flashcards, and some peanut brittle I got for the holidays. All have equally impacted my intellectual life.

Architect Cliff Tan on resolutions:

These tasks are so unnatural to you, they are extra difficult to do and you’ll probably give up […] So, the trick is to understand your own tendencies and, instead of changing yourself, you change the systems around you.

My secret sentence for 2023 is: “Your life would not be better if you were different.” I’m wary of how self-improvement gets fetishized, especially at this time of year. (Not that I’m anti-resolution — you’re reading mine right now.) You can do whatever you want, but I think watching other people’s lives swamps us in habits we wish were ours. Even if those habits are totally counter to the natural grain of your life. Maybe you’re not going to the gym, in other words, because you don’t like the gym.

The second part of Tan’s message reflects another deeper fact about our habits. For instance, I bought my Traveler’s Notebook because I thought it would actualize me as a writer if I always had paper and a pen with me. I just never brought it with me, so I got a little sleeve and started using it to store my debit card and ID. Now I take it everywhere. Having a lifelong ambition for writing did nothing to motivate me, but changing my wallet out for a notebook did. It’s without question my favorite thing I own. Habits shape to our environment as much as our ego. Probably more.

I’ll return to Mr. Tan if I ever get around to talking about feng shui. It’s one of my favorite examples of the explanatory power of pseudoscience.