
LESSON 14
Authenticity
Developed by Carl T. Bergstrom and Jevin D. West


Humans desire authenticity. That’s why we go to live music. It’s why we attend sporting events in person when the camera angles on our HD TVs are so much better than what we see from our seats. And it’s why "LegalBeagle66" was so angry when a Washington Post reporter informed him that the woman he was buying risqué photos from online was actually an AI-generated illusion.
It’s the same with writing. People crave authenticity. When parts of James Frey’s memoir A Million Little Pieces were exposed as fabrication, readers were furious. His publisher even offered refunds to those who had bought the book. Why? It was a good story. But it wasn’t authentic, and it pretended to be.
LLMs are similarly inauthentic. An LLM cannot communicate on our behalf or express our thoughts for us because it doesn't know what we think. It doesn’t even express its own thoughts, because it’s not the sort of agent that is capable of having thoughts. Thus LLMs may be fine for writing corporate memos about changes in accounting procedure, but they can’t substitute for honest human expression of feelings.
Yet pastors use LLMs to write their sermons and deploy AI outreach tools to reach their congregations.
One megachurch pastor even created an AI version of himself for "personalized prayer and spiritual guidance", available to subscribers for the low price of $49/month.
Source: Anthropic.com
Operational cost reduction, customer base increases, content delivery, reach and engagement—is this the nature of spiritual ministry in 2025?
A Vanderbilt University dean emailed the student body to express his heartfelt concern about a mass shooting at another university—but he had ChatGPT write the email for him.
Countless startups are developing and deploying unregulated chatbot therapists—but often it's the empathy, rather than the dialogue, that patients need. LLMs are not the sorts of entities that are capable of empathy.
In Google’s creepy Dear Sydney ad, a ten-year-old daughter wants to write a fan letter to a sprinter, and her father explains that it has to be "just right". So he has the Gemini LLM write it for her.
An Olympian doesn’t want letter-perfect grammar from the kids who idolize her; she wants a child’s genuine enthusiasm. Expressing feelings you didn’t have—that’s bullshit.
Source: Google
Let's close by considering what it means to offer an apology. An apology is a speech act. With speech acts, the point is not the novelty of the words but rather appropriateness of the circumstances in which they are used and the sincerity with which they are spoken. Here's Ted Chiang:
When someone says “I’m sorry” to you, it doesn’t matter that other people have said sorry in the past; it doesn’t matter that “I’m sorry” is a string of text that is statistically unremarkable. If someone is being sincere, their apology is valuable and meaningful, even though apologies have previously been uttered.
So can an LLM apologize?
For a speech act to be meaningful, the speaker must be sincere and must be the type of agent that is capable of performing such an act. If one third grader on the playground taps another on the shoulder with a stick and says "I dub thee Sir Fartalot", no actual title has been conferred. The act was neither sincere nor performed by one authorized to confer knighthood.
In our view, LLMs are incapable of being sincere in issuing an apology. They don't have feelings of remorse, nor the volition to do better in the future. Because they are not intentional agents, we would also argue that they are not the kinds of actors that can genuinely apologize. (Nevertheless, they do frequently apologize when called on their mistakes—and then repeat them after promising to do better.)
If an LLM can't apologize itself, can it apologize on our behalf? Might I enlist the help of an LLM to draft text for my own apology? Perhaps—but in passing off the machine's words as my own, I undermine the sincerity and authenticity of my apology.

PRINCIPLE
Humans crave authenticity. LLMs can't provide it because they neither know our authentic thoughts nor have thoughts of their own. Don’t use an LLM to write a love note or a condolence letter.
DISCUSSION
We described a number of situations where authenticity is essential and LLMs are not appropriate. Are there writing tasks for which authenticity is not an issue, and it would be fine to use an LLM?

VIDEO