As any author can probably tell you, coming up with titles for your novel is extremely difficult.

In my experience, a title either comes to me with minimal effort or never comes at all.

Pretty Deadly – my book reimagining Cinderella as a psychopath – is the first in a trilogy (and is being re-released as CINDERELLA’S CRIMES in one month!). That meant I needed to come up with not one but two more titles, and they needed to fit the theme of Pretty Deadly and sound like they could be its sequels.

At first, I thought I’d have synonyms for each word – something like “Pretty” and then something like “Deadly”. This led to me making lists of synonyms and splicing them together, and I came up with nothing good. You’ll see that the neural network came up with one title from my list!

Then, I thought I’d go for Deadly [SOMETHING] for book two and then start the title of book three with whatever that second word was. But what comes after Deadly? This felt like an almost impossible word puzzle, and after making a few very long lists, I gave up.

Finally, I considered calling book two Pretty [SOMETHING] and book three Deadly [SOMETHING]. But still, I struggled to come up with appropriate follow-up words for each of them.

And so I did what any reasonable person would do: I turned to artificial intelligence.

A year ago, when I embarked on my journey to release the full trilogy, I turned to GPT-3, a large language model. It was designed and trained by OpenAI, leveraging essentially the entire internet as data to create a model of written human language use. It’s been used for all kinds of valuable and hilarious things now that it’s available for anyone to play around with through the OpenAI website.

Given my own history of working with much less capable models, I first turned to fine-tuning. In this process, a model is first trained on a large corpus (in this case essentially the entire internet; for previous models I’ve played with, often all of Wikipedia or a bunch of books from Google Books), where it learns to predict text that sounds reasonable. Then, once it can generate sentences, the practitioner gives the model a more structured and specific task, with a sequence of examples that demonstrate exactly how the model should behave at inference time. For example, you might say, “Great, you learned to make fake Wikipedia articles. Now, here’s a bunch of humor articles. Can you be funny?” This second dataset is often (much, much, much) smaller than the original one.

In this case, I created a list of titles, descriptions, and keywords, and presented an existing version of GPT-3 with all of these samples in the hopes it would divine the pattern.

This didn’t work.

My training set consisted of things like:

{"prompt": "when a living boy is thrown overboard, simi does the unthinkable–she saves his life, going against an ancient decree, and punishment awaits those who dare to defy it.\n\n###\n\n", "completion": "skin of the sea END"}

{"prompt": "a young prince must rely on a mysterious stranger to save him when he is kidnapped during his coming-of-age tour in this swoony adventure\n\n###\n\n", "completion": "in deeper waters END"}

(The \n\n###\n\n and END are stop markers for the prompt and the completion, respectively – the first tells the model where the prompt ends, and the second teaches it that it’s okay to stop generating new text.)
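If you’re curious what assembling that training file looks like, here’s a minimal Python sketch that writes examples like the two above into a JSONL file in that prompt/completion format. The filename and the two-example list are illustrative; the real set was much longer.

```python
import json

# Illustrative (description, title) pairs; the real training set had many more.
examples = [
    ("when a living boy is thrown overboard, simi does the unthinkable–she "
     "saves his life, going against an ancient decree, and punishment awaits "
     "those who dare to defy it.",
     "skin of the sea"),
    ("a young prince must rely on a mysterious stranger to save him when he "
     "is kidnapped during his coming-of-age tour in this swoony adventure",
     "in deeper waters"),
]

# One JSON object per line: \n\n###\n\n closes each prompt, and END closes
# each completion, so the model learns where answers end.
with open("titles.jsonl", "w") as f:
    for description, title in examples:
        record = {
            "prompt": description + "\n\n###\n\n",
            "completion": title + " END",
        }
        f.write(json.dumps(record) + "\n")
```

The resulting file then gets handed to the fine-tuning endpoint – at the time, something like openai api fine_tunes.create -t titles.jsonl -m davinci from the command line, though the base model name here is my guess.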

When I gave the fine-tuned model prompts about Pretty Deadly and its prospective sequels, here are some completions it gave:

Pretty Deadly:

Prompt:

A servant girl plots to get revenge on her cruel family, wed the prince, and take over the kingdom in this dark retelling of cinderella

Completions:

we fight as best friends

AS LONG AS I LIVE

all hail the queen

Book 2:

Prompt:

after her happily ever after, this cinderella poisons her in-laws and plots to take her husband’s throne

Completions:

It’s too dark

A classic, yet outrageous (and out of print!) nu romance

There aren’t any window

Hilariously, for that second one it forgot that it was supposed to generate a title, and generated a description instead! I’m amused by that spelling of “new”, as well.

Book 3:

Prompt:

Cinderella’s on the throne, but her enemies are coming for her, and so she goes to war against them before they can

Completions:

Like most creatures

Whaaaaat?: Cinderella

Grind sword, punch people, defeat her enemies

To be honest, these last ones are great. Not what I’m going for, but I’d read a book called Whaaaaat?: Cinderella! And I feel like “Grind sword, punch people, defeat her enemies” is not this book at all, but it would be a hilarious jumping-off point for a gritty and comical feminist epic fantasy.

Laughing but not satisfied, I actually read the GPT-3 docs from the beginning this time (as I should have done all along) and came to the startling conclusion that the model definitely did not need to be fine-tuned. Instead, I should have been interfacing with it directly, giving it a prompt and then asking for a list. A sample prompt was:

This is a tweet sentiment classifier

Tweet: “I loved the new Batman movie!”

Sentiment: Positive

Tweet: “I hate it when my phone battery dies.” 

Sentiment: Negative

Tweet: “My day has been 👍”

Sentiment: Positive

Tweet: “This is the link to the article”

Sentiment: Neutral

Tweet: “This new music video blew my mind”

Sentiment:

So I constructed the following:

This generates book titles from descriptions

Description: when a living boy is thrown overboard, simi does the unthinkable–she saves his life, going against an ancient decree, and punishment awaits those who dare to defy it

Key words: fantasy, romance, adventure

Title: skin of the sea

Description: a young prince must rely on a mysterious stranger to save him when he is kidnapped during his coming-of-age tour in this swoony adventure

Key words: queer, fantasy, adventure, romance

Title: in deeper waters

Description: A servant girl plots to get revenge on her cruel family, wed the prince, and take over the kingdom in this dark retelling of cinderella

Key words: dark, fantasy, adventure

Title: Pretty Deadly

Description: after her happily ever after, this cinderella poisons her in-laws and plots to take her husband’s throne

Key words: dark, fantasy, adventure

Title:

I thought for sure it would work now!

It did not.

No matter how I varied the temperature, it just generated real book titles that already existed!

Temperature is a setting you can tune while using the model for inference (as I am here). It isn’t exactly “creativity” or “randomness”, but it does adjust the “openness” of the answers: higher values make the model more willing to pick less likely words. In the docs, they say, “If you’re asking it for a response where there’s only one right answer, then you’d want to set these lower. If you’re looking for a response that’s not obvious, then you might want to set them higher.” It wasn’t clear what the right number would be for this task, but it didn’t seem to matter much.
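For reference, here’s roughly what that inference call looks like: a minimal sketch, assuming the legacy (pre-1.0) openai Python library that was current at the time. The davinci engine and the title_prompt.txt file are stand-ins of mine, with the file holding the few-shot prompt above (ending at the dangling “Title:”).

```python
import openai  # legacy (pre-1.0) openai library

openai.api_key = "sk-..."  # your API key

# The few-shot prompt from above, ending with the dangling "Title:"
# that the model is meant to complete.
prompt = open("title_prompt.txt").read()

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 model; a stand-in for whichever engine I used
    prompt=prompt,
    max_tokens=16,      # titles are short
    temperature=0.9,    # higher values make completions less predictable
    stop=["\n"],        # stop at the end of the title line
)

print(response["choices"][0]["text"].strip())
```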

If I included “Cinderella” in the description, it picked out Cinderella retellings, giving me “Cinder & Ella”, “Throne of Glass”, “Cinderella”.

If I didn’t include “Cinderella”, it grabbed seemingly random titles: “Mae Among the Stars”, “The Sad, Strange Story of a Girl Called Amber”, “Stormdancer”.

I added “new” before “book titles” in the prompt and they got a little more…interesting:

Revenge of the Whore Princess

The Canterbury Tales (Squirrel Nutkin Edition)

I actually think these are fanfiction? I did not really want to Google them, however.

I gave it more examples, and it started to figure things out a little more:

The Summer Prince

Description: Set in a dystopian, oppressive future, this Cinderella-esque story tells of a special girl who falls for the prince, who has plans of his own. She must prevent this and save her country from its destructive fate.

Key words: sci-fi, romance, adventure

The Tale of Snow White and Rose Red

Description: This is the story of two lovely sisters locked in a power struggle with the witch that lives in their wood until seven dwarfs show up.

Dark fantasy

…kind of.

At this point, it began to generate some passable but uninspired suggestions, like “the handsome prince and the witch”. This doesn’t match my story, but it could be a book title, and as far as I can tell it isn’t one already.

I decided to try a new format for the prompts, since the model seemed to be getting confused–it appeared to be generating a new triplet of title-description-keywords, rather than completing the description-keywords-title triplets I had tried to construct.

My new prompts looked like this:

This generates book titles from descriptions

a servant girl plots to get revenge on her cruel family, wed the prince, and take over the kingdom: pretty deadly

after her happily ever after, a new princess poisons her in-laws and plots to take her husband’s throne:
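Pulling a batch of candidates from a compact prompt like this takes one call, since the Completions API lets you ask for several samples at once via n. Here’s a sketch, with the same assumptions about the legacy library and engine as before:

```python
import openai  # legacy (pre-1.0) openai library

openai.api_key = "sk-..."

# The compact "description: title" prompt from above.
prompt = (
    "This generates book titles from descriptions\n\n"
    "a servant girl plots to get revenge on her cruel family, wed the "
    "prince, and take over the kingdom: pretty deadly\n\n"
    "after her happily ever after, a new princess poisons her in-laws "
    "and plots to take her husband's throne:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=12,
    temperature=0.9,
    n=10,           # sample ten candidate titles in one request
    stop=["\n"],
)

for choice in response["choices"]:
    print(choice["text"].strip())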

And finally it was giving me just titles!

blood and water

Is this not a book already?

dead girls can’t wear pretty dresses

I think a lot of people are buried in nice dresses, actually. But this could be a romp of a YA novel, honestly, and I’d read it.

Deadly royalty

So to the point.

The wicked crown

Too much like The Wicked King, which is a real and good book.

Princess of blood

Ooh chills. I love this, but it’s not this book.

Deadly Princess

Meh.

the queen’s sinister motive

Could be a title for the second book, honestly, if not for the naming scheme I’m going for.

impossible princess

Could be a book. Isn’t this one.

One dead bride

She’s one of very few characters who doesn’t die.

Nicely deadly

This ended up on my bad brainstorm list too. GPT-3, I relate.

Vernal treachery

Sounds fun

Deadly ever after

I actually loved this! But it’s already a book.

Deadly darlings

Cute, but not right for this

Deadly for one

Not sure what this means.

It got very hung up on starting with Deadly, which had actually been one of my ideas! It also, to my great amusement, really wanted me to use “vernal” – besides “Pretty” and “Deadly”, this was the most frequent word in its suggestions.

I tried giving it the example of Deadly Ever After as the third, along with Pretty Deadly as the first, and aimed for the name of the second one.

First it tried combining them all:

Pretty deadly ever after

Then it tried:

Pretty Revenge

Could’ve been the first book, honestly, but not the second. There’s nothing so neat as vengeance anymore.

Pretty deadly ever after (in a newlywed way)

Hilarious. Sounds like a meme.

Because it seemed extremely stuck on using “Pretty Deadly”, I removed Pretty Deadly from the samples. This yielded:

Pretty Icky Pickle

When I’m sad, I will think of this and laugh. The perfect pick me up.

Pretty Vernal Ever After

Again with the “vernal”!

Pretty Murder

Technically true. Although even Cinna is probably not at her best while killing people.

Pretty Ixolite

What even is this?

It was really not getting this! I tried changing “poison” to “kill” in the book’s description, on the thought that maybe the frequency with which it was suggesting “icky” had to do with the word “poison”, but that yielded the same result.

So I tried rewriting the prompt entirely: “Cinna is married to the prince, but she won’t be satisfied until his parents are dead and she can rule”.

Pretty Murder

Again???

Pretty Prince to Clean Her Weapon

Hans never does clean Cinna’s weapons, alas.

Pretty Answers to Pretty Problems

This could be a book, but it’s not my book.

Pretty Girl on the Throne

Not wrong, but not helpful.

Pretty Until Blood Starts Splashing On the Walls

True, I suppose.

Pretty Hangs the Prince

Spoiler alert: not how he dies.

Pretty Kill, Pretty Kill

Sounds like something I’d whisper to my cat while she hunts.

Pretty And Deadly

I suppose I could do something like the Fast and Furious series! Pretty Deadly, 2 Pretty 2 Deadly, Pretty Deadly 3: Tokyo Drift.

Eventually, I came to the unfortunate conclusion that GPT-3 was not going to save me and come up with a title for the second book. Perhaps my prompts were too restrictive or not clever enough. Perhaps I should’ve waded through more hilarious suggestions. Check back in a few days for my travails using ChatGPT for the same task.

We can all cheer ourselves up with the thought of Pretty Icky Pickle, though. Forever in my thoughts.