Lee Schneider
4 min read · May 21, 2024


Issue 75 — Your Official Spokesperson is not Real, and that’s OK


Hello, welcome to 500 Words. This newsletter was written by a human.

For the past week, I’ve been deep into the final proofreading of my second novel, Resist. Once the proofreader has checked everything in your book, you have to proofread their work, which means reading your own book like it was written by someone else. It’s an out-of-body experience, or maybe out of mind. I’ve thought about hiring a proofreader to check the proofreader, but then would I need to hire another proofreader to check the first two?

Pre-sales for the book start next month. It will be published on June 30. (Which is also my youngest son’s birthday. He will be 12.)

Resist is book two in a three-part series of novels. The third book, Liberation, is lounging around now in a detailed but messy outline. The characters are doing exercises like players warming up before a tennis match. They wake me up too early in the morning with their chatter, and they sometimes make it hard to fall asleep. They’re impatient for more of my attention. I tell them that I have to feel my way through this massive job, but do it in small bits. The goal is to write 600 words a day. If I do that for 208 days, I’ll have a big mess of words that I can turn into a novel in a year or less.

“Who cares about your timeline?” my characters say. “Just start. If you screw up, you can always rewrite us.”

Easy for them to say, but I’ve learned not to argue with them.

FAKEOUT LOL

Ukraine’s new foreign ministry spokesperson, Victoria Shi, is not a real person. She is an AI modeled on a former contestant on the Ukrainian version of The Bachelor.

When Victoria makes official announcements on behalf of the Ukrainian foreign ministry, she speaks words written and verified by people.

“It’s only the visual part that the AI helps us to generate,” Dmytro Kuleba, the Ukrainian foreign minister, said.

Fiction writing is more challenging than ever, because you can’t make up stuff that sounds fictional anymore; it’s already happening in the world.

A synthetic spokesperson is no longer much of a stretch, and we’ll see more of it as celebrities extend their brands. Can’t you see Clint Eastwood licensing his image and voice to be shaped into the next U.S. Secretary of Defense? Or an AI based on Matthew McConaughey as our charming, smooth-talking Secretary of State?

If you make a living with your voice, you’ve already got a problem with AI simulations. There is a story going around about voice actors who were paid to record sample voice tracks and were told the tracks would be used only for research. Instead, apparently, the company used their voices to make a commercially available talking chatbot.

OpenAI recently released a version of its chatbot that can talk. Personality-based artificial intelligence is an engine of seduction, the magic that coaxes us to believe that these platforms can do more than they really can.

Soon after OpenAI’s new chatbot voice, called Sky, came out, the company pulled it because it sounded incredibly similar to Scarlett Johansson. This is more than a coincidence: Johansson played the seductive AI in the movie “Her,” which has become the case study for how chatbots will eliminate loneliness. For that reason, it probably seemed like a good idea at the time to make OpenAI’s chatbot sound like a beloved and respected movie star.

The story got more convoluted yesterday, when OpenAI published a blog that said they believed that AI voices should not “deliberately mimic a celebrity’s distinctive voice — Sky’s voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using her own natural speaking voice.”

Then Scarlett Johansson called them on that, issuing a statement that she’d received an offer from Sam Altman, CEO of OpenAI, to hire her to “voice the current ChatGPT 4.0 system.” She declined. She wrote that Altman contacted her agent as recently as two days before the company first demoed the ChatGPT voice, asking for her to reconsider. She said no again.

Did OpenAI hire a soundalike actor, or did they just use Johansson’s voice to train the system? Maybe the lawyers will sort that out. Kate Conger, a NYT tech writer, framed the meaning of all this perfectly in a Bluesky post:

one of the ugly characteristics of tech culture is entitlement & it’s been particularly disquieting to see it pop up in the AI world. the entitlement to women’s bodies as expressed through deepfakes, the entitlement to women’s voices through the recent OpenAI stuff, should be noted and checked.

Kate Conger

No matter what warmth and fuzziness your AI pal might bring, it won’t make you less lonely. AIs will never do that, because we need human contact to alleviate loneliness. As one scientist said, “Right now, all the evidence points to having a real friend as the best solution.”

References

Ukraine Unveils AI-generated Foreign Ministry Spokesperson

What Do You Do When A.I. Takes Your Voice?

OpenAI Pulls ChatGPT Voice ‘Sky’ After Users Say It Sounds Like Scarlett Johansson in ‘Her’

Scarlett Johansson told OpenAI not to use her voice — and it did anyway — The Verge

Scarlett Johansson’s Statement About Her Interactions With Sam Altman

Kate Conger on Bluesky

Loneliness Is a Problem That A.I. Won’t Solve

Disclaimer

Apologies for all typos, spelling errors, and grammar disasters.


Lee Schneider

Writer-producer. Founder of Red Cup Agency. Publisher of 500 Words. Co-founder of FutureX Studio. Co-founder of 3 children. Married to a goddess.