Forfeiture of Nuance in the Rise of AI

by Jeanann Verlee

Humans are wildly complex. We help haul our neighbor Earl’s groceries up the stoop and hold the subway doors for a hurried stranger. We mispronounce our colleague’s name and carelessly bump the seat in front of us on airplanes. We walk our friends’ dogs, spill wine on Nana’s tablecloth, and forget to flush. We’re frothing with desire and rage and regret and fear. Each of us, a gloriously unique orbit of idiosyncrasies.

I emphasize this with my writing students. As a poet—fiery, soggy, and a little punk rock—I zealously crusade for complicated human subjects. People who are just as dynamic and amazing and wretched and difficult on the page as we are out here in reality. No two-dimensional caricatures. Write a darling villain. A reprehensible protagonist. Give me the mess.


ho·mo·ge·ne·ous
/ˌhōməˈjēnēəs/ | adjective
being all the same or of the same kind; alike.
u·nique
/yooˈnēk/ | adjective
being the only one of its kind; unlike anything else.

Not long ago, in the pre-X Twitterverse, I noticed several fellow poets collectively experimenting with AI author portraits. Soon after, AI cover art for books. A selfie or title plus some thematic keywords and shazam: computer-generated art. For a fleeting period, my feed was saturated with writers using robots to do the creative work of other artists. It was uncomfortable. No—I was uncomfortable.

What about photographers? Illustrators? Painters? Graphic designers? Don’t we want human artists to get their shine? Don’t we want to collaborate across mediums, genres, talents, specialties? Sure, bots are cool and the process is fun, but what utterly unique artwork might be possible if you collaborate with... a person?

I’m all for curiosity. I’ll readily rabbit-hole my way through the history of the Phillips-head screwdriver, dismantle my Brother CS6000i sewing machine to understand how it functions, or devote unreasonable hours to Instagram Reels learning new cooking techniques. That’s part of what I cherish about poets, writers, and creatives in general. We’re curious weirdos with a voracious appetite for information. Yet I was a little surprised by the DIY-with-AI craze because poets tend to be overwhelmingly empathetic, driven by compassion and community, rooted in activism, and (cough) highly protective of intellectual property.

By this I mean: poets love to push creative boundaries and we abhor anything that even remotely edges near plagiarism. So, when poets started replacing human creators with AI, it seemed like an odd fit. I understand AI models are trained on the work of actual artists to power their simulations, but the output remains exactly that: simulation. Replication with a few frills. Not unique. Not human. Not sentient. Why rob other artists of an opportunity to flex their intellectual prowess?

I don’t know if anyone outside of the tiny poetry sphere even noticed this fleeting phase of curiosity. Poets don’t get much notice. In fact, even the word “poets” is routinely autocorrected to “ports.” Just now as I typed it: “Ports don’t get—.” Twice. Oh, autocorrect, forcibly unthinking my thoughts. Duck off.

The results produced during this tiny AI craze were curious, but rather uniform and devoid of personality—an awkward step toward homogeneity. I don’t know if any of the poets ultimately used their AI experiments for books or publicity, but as the experiments faded from my Twitter feed, I figured it was just a nifty little fad. I had no idea ChatGPT was only a year away.


We keep tiny hand-held computers within reach roughly 18 hours a day. Insomniacs like myself? Even longer. We need computers to communicate, earn wages, pay the heating bill, and yes—even to create art. I’m typing this on a MacBook. I thumbed my scattered initial thoughts into Notes on my iPhone. I’m not transcribing my pre-dawn doom thoughts from a dogeared spiral notebook, no. I’m nursing numb fingers and wrist pain from what is likely a longstanding flirtation with carpal tunnel just so I can elucidate my unpopular take on the daunting rise of AI. From laptops to electric cars, video games to rocket ships. Missiles. NYPD’s robo-dogs. The International Space Station. “Alexa, play NPR.” We rely on computers for everything.

Capitalism—if it were a person—would argue that computers are a crucial ingredient in the commerce pie. Programmed to usher us ever more quickly to the final version of whatever product needs selling. Coded algorithms predict what we are thinking (about buying). Or might possibly be thinking (about buying). Or might one day want to think (about buying). Faster. Faster. Sell. Sell. Ports. Duck.

And here we meet AI.

Remember the unsettling dread that Clarke’s (and Kubrick’s) nefarious HAL 9000 could one day be real? Uhm, HAL’s here. Hi, HAL.


I grew up in a small brick house with a rotary phone and a wood-burning potbelly stove. No air conditioner. No dishwasher. Honestly, not even a shower. We used a good ol’ fashioned bathtub for our daily hygiene routines. We had a large walnut console that housed the entirety of any family’s entertainment needs: a hefty-knobbed AM radio, a vinyl record player, and a black-and-white television set whose tubes regularly blew out. (No clicker.)

Not archaic enough? Dad drove a rusted 1970 Ford pickup with manual transmission that ran on (yep) leaded gas. I learned bookkeeping in general ledgers—not a company’s GL, I mean legal-size green paperboard pads with dizzying arrays of columns. Yes, I learned double-entry bookkeeping by hand. In pencil. I taught myself 10-key in my teens while working as a bank teller and used a standalone adding machine until as recently as (gasp) 2012.

I learned to type on my grandfather’s now-vintage manual typewriter, which was missing the comma slug, produced a smeared dot instead of a lowercase “n,” and whose hammers jammed when my keystrokes were too quick. In school, we were required to take typing classes using manual IBM typewriters. Knowing I wanted to be a writer, my parents eventually gifted me a Smith Corona Coronet Super 12 Coronamatic Portable Electric Typewriter—with a carrying case. Electric! I typed all my early compositions, short stories, poems, and even college application essays on that beautifully ugly beast.


AOL reintroduced a generation to letter writing. AIM gave rise to a new age of initialisms. MySpace taught thousands of us basic HTML. Facebook connected us with long-lost high school friends. TikTok let us fall in love with Nathan Apodaca. YouTube. Tumblr. Vimeo. Twitter. Instagram. Threads. Everything in between.

Reminiscing can be comical. While corporations now depend on Slack for internal communications, I was once written up for using AIM with coworkers. Restaurant waitstaff slipped actual carbon paper between pages to duplicate guest checks. Teachers cranked out purple-inked assignments and pop quizzes on mimeograph machines. I yelped each time I picked up my grandparents’ phone because I could hear strangers’ voices on the party line.

Tech has profoundly changed how we communicate, and honestly, I’m a fan. Zoom over conference room. Slack over email. Text over phone call. These are wins.


My first computer was a second-hand Commodore 64 my dad bought from a friend years after its release. It was missing some of its snazzier (and necessary) external components. I had the main console, the TV-connection cord, and the power cord. No floppy drive. No paddles or joystick. No printer. Also, no user manual. So while this relic steered thousands of kids toward becoming tech-savvy BASIC gurus, I’d plug it into the TV, type lines of English, and watch the cursor blink. I had no resources to learn BASIC. No means of making the computer do anything. Eventually Dad got a Kaypro, which I learned to use for his business bookkeeping, but until I was in college, I wrote by hand in notebooks or used a typewriter.

Computer games were also fairly out of reach when I was young. I recall a cousin introducing me to Atari’s Pong, but I didn’t have access myself. At school, wealthy classmates showed off hand-held games like Simon and Merlin: The Electronic Wizard. Dazzling technological feats at the time.

Data storage alone has been in a race with itself. IBM’s flexible 8″ floppy disks evolved into 5¼″ floppies, then more durable 3½″ hard-plastic disks. Soon came flash drives, ever-increasing internal storage, and the magical mystery of The Cloud. I’ve replaced my music collection four times: vinyl, cassette, compact disc, and MP3. Movies on VHS and DVD gather dust on my shelf. These days, I stream what I want to hear or watch. And that means I miss out, because the more eccentric and underground favorites aren’t available.

And then we have apps. My friends Aaron and Jeff, each a decade my junior, operate their home lighting and sound systems from iPhone apps. Jeff had to walk me through downloading the app just so I could keep the lights on while pet-sitting. Their apartment building has an app-based access system, so only residents can “buzz” in visitors—and do so directly from their phones. Aaron and Jeff even have a festive app-based LED display that advises when the next subway train will arrive. Whoa. I still just stand on the platform and wait.

It’s been a challenge trying to keep up. Entire generations of young people intrinsically understand the malleable, ever-evolving world of tech. It’s so integral to their existence, they cannot imagine what my childhood was like. And I don’t need them to. I’m just worried about what we lose in whatever comes next.


I’m not anti-tech. Give me Excel any day over a hard-bound general ledger. I just don’t want Excel to think for me. I pledge allegiance to both Word and Google Docs, but I don’t want them to think for me. I was once a diehard Windows user; now I prefer macOS. I’m malleable, too. I just don’t want either to think for me. As a poet, writer, and editor, I revel in nuance. The details. Exacting mechanics, complicated plot, authentic dialogue, and unique phrasing. I’m on the back side of this very screen, trying to articulate how important these details are while simultaneously acknowledging that I’m always on the fringe, chasing new tech developments and lagging by an easy mile. See? Complex.


nu·anced
/nooˌänst/ | adjective
characterized by subtle and often appealingly complex qualities, aspects, or distinctions
me·di·o·cre
/ˌmēdēˈōkər/ | adjective
of moderate or average quality, value, ability, or talent; not very good

When the poets toyed with AI portraits, I worried about putting photographers out of work. When the poets toyed with AI book covers, I worried about putting artists out of work. And now as corporations everywhere integrate AI content generators and AI editors, I—poet, writer, editor—am desperately struggling to secure reliable work. I embrace creative nuance, and that doesn’t seem to carry much value in today’s market.

In a past role as senior copy editor for an ed-tech company, designers and developers routinely suggested we employ Grammarly to assist with copyediting. However, we wrote voiceover scripts—dialogue—with a specific voice, tone, and vernacular. Such tools would have overcorrected our scripts, turning our characters’ natural speech into formal prose. My manager and I continually had to explain that Grammarly is great for certain uses and certain users but antithetical to our purposes—it would have created additional editing labor to undo formal corrections that rob the language of its personality.

Personality. Nuance.

You can code AI to implement proper adjective order but you can’t teach it the nuance of intentionally breaking that order for a desired impact. You can code AI to correct punctuation, but you can’t teach it the nuance of an intentional sentence fragment. I imagine you can even code AI to adopt a certain vernacular, but you can’t teach it the nuance of deliberately misapplying idioms for effect.

I am a creative. Full stop. I want to debate the finer points of prepositions, quibble over lackluster adverbs, and push for lush and muscular verbs. I want linguistic texture. Human nuance. I want to read other writers’ idiosyncrasies. I want to delight in a poet’s choice of line break or double entendre. I want to geek-the-hell-out over a writer’s deliberate choice to do something wrong.

And it’s not just me. Everyone loses when we forfeit nuance.


Math is perfect. (Theoretical mathematicians will argue this, but my point in this comparison holds.) You can back into basic math and rearrange the numbers, but the result is the same. It’s a perfect language.

7 + 3 = 10
7 + 3 = 5 + 5
1 + 6 + 3 = 8 + 2

Coding is math. Language is not. Swaps in language change meaning.

She was sad. ≠ He was sad.
They were sad. ≠ They will be sad.
He’s sad. ≠ He’s regretful.

Add a bit of detail and you change more than meaning: you change impact.

She felt sad. ≠ Daphne buckled to her knees, sobbing.
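
The same contrast, sketched in a few illustrative lines of Python (mine, and a sketch only): a program treats equal values as interchangeable, but sentences reach it only as strings of characters, so the shift in meaning between them is invisible to it.

# Illustrative sketch: to a program, equal values are interchangeable.
assert 7 + 3 == 10
assert 7 + 3 == 5 + 5
assert 1 + 6 + 3 == 8 + 2

# Sentences are only strings: the program can see that they differ,
# but not what the difference means or how hard it lands.
assert "She was sad." != "He was sad."
assert "He's sad." != "He's regretful."
assert "She felt sad." != "Daphne buckled to her knees, sobbing."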

I can’t convince a cluster of code that nuance is what makes art ART. I’m not suggesting we return to the inglorious days of ’80s tech, but I am gravely concerned about the erosion of artistic nuance. Art matters. Creatives matter. We need to bolster human creativity. Create opportunities. Invite humans into the room and see what blooms. We need linguists to bend rules. Editors to deep-dive into language. Photographers to source light. Painters to fucking run amok.

Faster-faster-sell-sell for corporate profit robs us of the glory and complexity of sentience. It sacrifices intellect for mediocrity. Uniqueness for homogeneity. By one widely cited estimate, some 300 million jobs worldwide could be lost to the rise of AI. (For perspective, the entire population of the USA hovers around 334 million.) Putting writers, editors, and millions more out of work because AI can generate a subpar product faster? What does that do to the fabric of language? To the nature of art? What happens to artists if society settles for mediocrity? What happens to society if society settles? What happens to the texture of a diverse and thriving culture? What new and fascinating turns of language do we miss with non-human AI writing for us? And why—in this beautiful, messy challenge of being alive—would we want to acquiesce to HAL?

The unpopularity of AI among creatives is evinced by the WGA and SAG-AFTRA strikes and by the growing number of small presses and literary journals that refuse to accept AI-generated work. AI detection programs have started to appear—and though it is unclear how accurate these systems will prove to be, or how they will keep pace with AI advancements, these rebuffs are also wins.


The artist’s work is work. Attempting to replicate artistic intellect with code is misguided. Yes, artists (gladly) use tech tools, but many non-artists want computer science to do their thinking for them. More accurately, they want it to do our thinking for them. As a creative, it’s disheartening. But it’s bigger than my sadness or what might be interpreted as crabby “get off my lawn” thinking: it is genuinely affecting my ability to obtain work. To pay my bills. Live my life. And I’m not alone.

Media reports a healthy job market, indicating opportunities are abundant and unemployment is down—yet here in the real world, I see editors and writers hungry for work. No. Frenzied. As in, ruined careers. As in, 20 years’ experience but can’t fill the fridge. Eviction notices. Sleeping in cars or elderly parents’ basements. For each editing or writing role posted on the major job boards, I see 800... 1,200... 3,400 applicants. That is an astonishing ratio. Sure, not all 3,400 of those applicants are the “right fit” for any singular role, but where are the rest of the roles? We are highly talented specialists. Linguists. Creatives with skills far beyond what non-writers might imagine, and corporations are sacrificing us for mediocrity.

During a staff meeting at a previous job, one of the Chief Something-or-Others once posited that the company’s goal was to offer clients a final product that “isn’t crap.” I choked. Over time, I was repeatedly asked to greenlight subpar work, to pass it up the pipeline “as long as it isn’t broken.” That was the holy grail. Not spectacular. Not great. Not good. Not even mediocre. Just “not broken.” I’m still choking.

Managers—particularly in tech—love to say “perfection impedes progress.” They’re not wrong, but it’s moot in terms of language because there are too many variables to reach perfection. We all know this. Nothing can ever be perfect, but damn—can’t we aim higher? Do better? Give it our best? It’s demoralizing to witness leadership across multiple workplaces espousing mediocrity. Imagine such taglines in B2B whitepapers: “Our product isn’t crap!” “Hey, it’s not broken!”

• • •

I am a creative. And yet, I’m not “just” a creative. I’m not just a word nerd. Not just a storyteller. Not just an editor. Like anyone, I have plenty in my toolkit. I’m also an accounting nerd. An organizational nerd. Food nerd. Dark-art nerd. Equity and advocacy nerd. A rescue-pup nerd. I can lace up my own boots and bake French bread rolls and change a flat tire and clip my toenails and repair the toilet and mount shelves and grout tiles and rewire a lamp. I can wash my own armpits and write poems and change my mind. I’m a goddamn miracle. Like you. I’m not AI. I am irreplaceable. Like you.

Creatives bring nuance to the homogeneity. Writers, musicians, painters, photographers, dancers, actors—all are storytellers. Educators. Activists. World-changers. Wrong-righters. You can ban books and feed previously published manuscripts into AI to produce something more flaccid. You can rail against the human pace of creative intellectual labor. You can bet the house on bots, but at what cost? Who are you hurting? Think carefully about that.

What’s the funniest film you ever saw? ✓Gone. What book fundamentally changed you? ✓Gone. What painting snatched the air from your lungs? ✓Gone. What photograph brought you to tears? Musical? TV show? ✓Gone. Who are you hurting? Yourself. You are part of everyone.

Each generation of next-level thinkers will think less. And I don’t want to live in a world like that. I want big thinkers who thrive in the nuances. Artists who explore the incongruities of what it means to be human. Sentient. Unique. Complex. Life is too short and too difficult to mold the code toward mediocrity.


Jeanann Verlee is the author of three books: prey, Said the Manic to the Muse, and the award-winning Racing Hummingbirds. She has received a National Endowment for the Arts Poetry Fellowship, the Third Coast Poetry Prize, and the Sandy Crimmins National Prize. Her poems and essays appear in Academy of American Poets, Adroit, VIDA, and Muzzle, among others. She served as poetry editor for Winter Tangerine Review and Union Station, and has edited several award-winning books. She collects tattoos, kisses Rottweilers, and believes in you. Find her at jeanannverlee.com.