Today we're going to talk about authenticity and how even a glimmer of humanity will make you and your organization stand out from the crowd.
"This is the future"
A couple of weeks ago, I got a text from my dermatologist's office saying that they had to cancel my appointment at the last minute — an appointment scheduled six months ago — because my provider doesn't work on Fridays. (Did they just notice this? Why am I being notified on a Thursday afternoon for a Friday appointment?)
Please call the office to reschedule.
I called the number and got a brief "your call is being answered by an automated assistant" message. I expected the classic phone tree voice prompts. What I got instead was something entirely different.
The call was answered by a male voice. In the background you could hear the telltale voices common in a call center. The "representative" would "type" your information into their computer. It was deep in the uncanny valley. It was enraging.
So, being me, I tried to ask it some bullshit pre-appointment questions: could I eat before the appointment? Did I need to give a blood sample? Should I wear sunscreen? It responded with the usual AI platitudes. "Great question!" it said, or "I know that visiting a doctor can be scary." The kicker: the absolute non-sequitur of, "A dermatologist is a doctor for your skin." None of my questions were answered because there wasn't a human to answer them. I declined to make an appointment.
The follow-up
I received a survey from the dermatologist's office. It had leading questions about how happy I was with my phone experience. As you might imagine, I used the open-ended response section in great detail.
Text response: "We're sorry to hear that we missed the mark."
But then a human called! I thought that this would be a good time to explain that if they wanted to have an automated scheduler, the more user-friendly option would be to let me schedule, reschedule, and cancel online — no patronizing call with a fake call center necessary. I would rather have a simple and serviceable online portal than a faked interaction.
The human rep was dismissive. "This is the future," they said. "You're just going to have to get used to it." And they're probably right. There are literally dozens of fake receptionist systems out there.
If my care provider is trying to fake authenticity in its call center, then I can't trust them to be a partner in my care.
AI as destroyer of best practices
Last week, Torrey Podmajersky and I attended an in-person conference together. I don't want to call out the conference specifically — the volunteer hosts, the speakers, and the participants all meant well — but for me, it was just fuel for my AI rage.
Almost all of the talks had an AI angle. And I get it: conferences need to focus on topics that a) companies will pay for their employees to learn about, or b) make people fear for their jobs.
But every talk seemed to surface an unpleasant truth. AI is being used in lieu of our known best practices in content strategy, content design, and information architecture. Here's how to vibe code instead of understanding the problem. How to use AI to shorten the research cycle by pre-populating with leading questions. How to get AI to do the writing for you.
(As a side note, shout out to Torrey for posting about her proposed term, "AI dementia," where people are showing diminished memory, attention, judgment, language, and problem-solving. "I've been thinking about this a lot when people say extremely contradictory things about their work with AI without showing any signs of cognitive dissonance.")
How it manifests in my clients
I see this in my clients with increasing frequency.
- The client who was ready to go live with AI-generated content, showing it to me at the last minute. Minus three different words, it was a competitor's product page, verbatim. Rather than asking how we got to day-before-launch with no one catching it, they were furious that I questioned the wisdom of the AI.
- Similarly, clients who used to have meticulous review processes for content — from writing and editing to detailed nitpicky review by legal, engineering, regulatory, and product teams — now just give things a cursory glance and ship them. Information is contradictory, unclear, or incomplete.
- AI is encouraging us to create more and more digital assets, and I see clients who are creating and posting wildly. I don't want to harp on the fact that I wrote an entire book on the topic, but this is terrible not just for sustainability, but for usability and accessibility.
This is an opportunity
We are starving for humanity. As contradictory as it seems in the AI era, we're desperately trying to humanize our experiences.
The dermatologist? They wanted that AI scheduler to seem human and approachable. You can play a demo that shows this behavior here.
The major AI players? They want those tools to seem conversational (and sycophantic) in a way that makes us feel personally validated and promotes dependence.
Regular people? They give their AI "partners" names and call them "him" or "her."
If we can bring humanity to our content — real content, created and moderated by real humans — we can stand out from the crowd.