I want to like AI, stop making me hate it

It’s a tremendously rainy day, I’m still hours of TikToks away from completing my next Supernormal post — programming note, until I’m confident I have the time to juggle multiple projects simultaneously, posts will continue to be done when they’re done — and I have the day off work. Still, despite the threat of boredom, I initially refused to mess around with GPT.

My dislike of the technology was planted there by tech futurists and mercenary bloggers who, lacking any fundamental loyalty to creative work, focused the conversation on how we can replace pesky artists.

Pesky artist here: not cool. Also, good luck.

Nothing I’ve seen has inspired an iota of confidence that AI can replace creativity. In fact, the more I’ve learned, the less confident I’ve become. Replacing artists is a whole other matter, an outright contradiction of what art means.

Still, I have a crypto-borne hunch that enough wealthy people believing art is mechanizable, optimizable, could make it so. The Tinkerbell effect: the creative power of collective belief.

Gather up the real-world manifestations of this hustledrone corpo-shitpost mania: the blessedly short hiatus of Clarkesworld, forced by an influx of AI-generated submissions that other, smaller literary publications haven’t been spared; the disgusting “future of animation” era courtesy of Corridor Crew. Taken together, I’m not too sorry about my suspicion of generative AI.

BUT, I also don’t want to dismiss people’s real excitement about the real capabilities of this technology. It’s cool! It should be cool! I don’t want wealthy people to ruin that.

The key is that AI is productivity software, not creative software. Ryan Broderick has an excellent writeup on this subject. He also points out that, despite the focus placed on creative industries by (I’m quoting here) “lightly bearded men who pay for Twitter,” the thing this new technology is best at is coding, not creative writing.

Broderick also writes about using ChatGPT to code without experience, but honestly the first thing that’s gotten me excited about this stuff maybe ever is a video by (checks notes) Wyatt Cheng… oh dear. Ahem, a video by Wyatt Cheng in which he recreates Flappy Bird with entirely AI-written code.

Cheng’s ignominious position as an Activision Blizzard director aside, I think this video demonstrates something truly cool about generative AI: it lays a lot of groundwork, but it takes someone with actual skills and ideas to make anything approaching elegance or usefulness. Cheng regularly identifies problems in the code: things that would improve gamefeel, or just points where his vision didn’t line up with what GPT produced. He could tweak the program in real time because of his technical and creative experience.

No architect ever found creative fulfillment in the pouring of concrete. It’s a prerequisite for their creative work, not the work itself.

There’s a nuance here I worry I’m not capturing, because so far what I’ve written sounds like a billion think pieces already written. What I’m saying is, I think the common refrain that AI can be a device for inspiration is a little chickenshit. At the risk of sounding elitist, I don’t accept the idea that creating a plot outline, generating dummy paragraphs, or picking character names is mere busywork. Every step of the writing process should be personal, from inspiration to blank page to revision.

Writers might use ChatGPT to organize their drafts into folders or, I don’t know, set a schedule with word count goals. Painters could identify which blends of paint will create a particular color — although maybe I’m showing my ignorance, and even the process of mixing paints is a source of inspiration. No one should be asking GPT for what to write about or what to paint.

Point is, art isn’t code. I’d rather encourage creative people to take the leap into self-reliance than assuage the thorny parts of creative work with soulless, VC-funded robotherapy.

Hm. So far this post goes “I don’t like AI, but, I don’t like AI.” Let me share what it is I got up to today, my first time actually noodling with ChatGPT and having a good time.

Tabletop game design is my hobby. It doesn’t live anywhere online right now, but I like messing around with it. I use a website called Homebrewery to make my stuff look like official, publishable design. It occurred to me that I could use GPT to translate my work (in this case a character class) from Google Docs, my native design environment, to Markdown, the markup language Homebrewery uses.
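
For the uninitiated, what GPT hands back is plain Markdown that Homebrewery renders into that official-looking layout. A rough sketch of the kind of thing I mean (the class and feature below are invented for illustration, not my actual design):

```markdown
## Stormcaller

### Tempest Step
Starting at 3rd level, when a creature you can see hits you with an
attack, you can use your reaction to teleport up to 15 feet to an
unoccupied space you can see.
```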

It worked okay for that. Seeing my work externalized was cool. More interesting, though: I started asking the machine for roll tables and additional class features. It produced templates in a format I recognized, in design language I recognized.
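
Roll tables came back as ordinary Markdown tables, something in the spirit of this (contents reconstructed by me for illustration, not quoted from GPT):

```markdown
### Storm Omens
| d6 | Omen                                            |
|:--:|:------------------------------------------------|
| 1  | Distant thunder rolls under a cloudless sky.    |
| 2  | Your hair stands on end for a full minute.      |
| 3  | Birds circle overhead, always counterclockwise. |
| 4  | The smell of rain follows you indoors.          |
| 5  | Static arcs between your fingertips.            |
| 6  | A storm breaks the moment you raise your voice. |
```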

What excited me most was how useless everything was. GPT misunderstood my vision. Its range increments were all over the place: cantrips that incapacitated every creature in a 50-foot radius. The flavor text was occasionally neat, but never inspired.
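
To give you a sense of the balance on offer (again my reconstruction, not GPT’s verbatim output; anyone who has played will see the problem with an at-will cantrip like this):

```markdown
### Thunderclap Lullaby
*Evocation cantrip*

**Casting Time:** 1 action
**Range:** Self (50-foot radius)

Each creature within 50 feet of you is incapacitated until the end of
your next turn.
```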

You know what it felt like? It felt like dragging out a few cells in Excel to autofill the rest of the spreadsheet. Nothing created by the machine felt like mine, nothing felt finished. Just a very organized blank space for me to apply my own ideas.

And, though I am very loath to admit it, a few ideas made me go “oo!” But, I mean, hell, artists can be inspired by a walk in the park. Maybe it’s naive of me to dismiss the notion that a word-association box like GPT could spark something.

ChatGPT thrived as a tidy, intelligent design environment. Like a smartphone-esque upgrade to Docs or Excel. I get to bring the ideas, I get to bring anything that makes the system playable or fun or beautiful, because I have spent a long time developing my own ideas about what’s playable and fun and beautiful.

I’m still waaaaays away from ever using this stuff in my writing. I suspect I never will. But in a high-overhead creative project like game design, GPT isn’t the villain it appears to be.