But no one’s allowed to care
A decade ago, big companies spent big money trying to understand what people really thought of them. The buzzword was sentiment analysis—running emails, tweets, and reviews through algorithms to see if the mood was positive, negative, or somewhere in between.
It was clunky, but it gave managers something to wave around at meetings. “Look, our positivity score is rising!”
Fast forward to today, and sentiment analysis is effortless. AI can parse not just what you said, but how you said it, what tone you used, whether you’re escalating, and even whether the employee on the other side sounds scripted or dismissive.
And Amazon sells this technology itself, as Amazon Comprehend on AWS. Which is why my recent experiences with their customer “support” carry a delicious irony.
The angry customer the AI can see
Every email I’ve sent to Amazon KDP and Retail has been dripping with frustration, because I’m trapped in a loop:
- Their system told me to update my bank details to receive royalties.
- I tried to follow instructions, only to be blocked by their own two-step verification error.
- KDP tells me it’s a Retail issue. Retail tells me to start again from scratch. And round we go.
If Amazon’s AI is running over those emails—and I’d bet money it is—then I’m a flashing red warning light on their dashboard.
The scripted staff the AI can see
Now contrast that with the replies I’ve received:
- “Your account is functioning normally.” (It isn’t.)
- “We’re not dismissing your concern.” (Yes, you are.)
- “We tried calling you at the number in your account.” (The same dead number I told them was inactive.)
If sentiment analysis is working both sides of the conversation, Amazon’s management would see:
- Customer tone: angry, urgent, escalating.
- Agent tone: flat, canned, handballing.
That’s a data-driven picture of organisational failure.
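The two-sided read above can be sketched in a few lines. This is a toy lexicon scorer, purely illustrative: the word lists, function names, and thresholds are my own inventions, not Amazon’s pipeline. A real system would call a managed NLP service (such as Amazon Comprehend’s sentiment detection) or a trained model rather than counting keywords.

```python
# Toy two-sided conversation sentiment sketch (hypothetical lexicons,
# not a real product's method).
ANGRY = {"frustrated", "trapped", "blocked", "unacceptable", "again", "still"}
CANNED = {"functioning", "normally", "concern", "apologize", "inconvenience"}

def score(text: str, lexicon: set) -> float:
    """Fraction of words in `text` that hit the lexicon."""
    words = [w.strip(".,!?\"'()").lower() for w in text.split()]
    hits = sum(w in lexicon for w in words)
    return hits / max(len(words), 1)

def conversation_dashboard(thread: list) -> dict:
    """Average anger score over customer turns, canned score over agent turns."""
    cust = [score(t, ANGRY) for who, t in thread if who == "customer"]
    agent = [score(t, CANNED) for who, t in thread if who == "agent"]
    return {
        "customer_anger": sum(cust) / max(len(cust), 1),
        "agent_canned": sum(agent) / max(len(agent), 1),
    }

thread = [
    ("customer", "I am still trapped in this loop and blocked again."),
    ("agent", "Your account is functioning normally."),
    ("customer", "This is unacceptable, I remain frustrated."),
    ("agent", "We are not dismissing your concern."),
]
print(conversation_dashboard(thread))
```

Even this crude version surfaces the pattern in the bullet points: the customer turns score high on escalation words while the agent turns score high on script words, and that gap is exactly what a real dashboard would put in front of management.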
The leadership gap
And here’s the rub: AI might already know exactly how angry I am, but the culture inside Amazon doesn’t allow anyone to care. Staff are trained to follow the script, stay in their silo, and keep the hand-offs moving.
The lesson for leaders everywhere?
- Data without ownership is useless. It doesn’t matter what your AI dashboard says if no one is empowered to act.
- Sentiment without accountability is noise. You can measure dissatisfaction all day long, but until you give someone the authority to resolve it, customers remain stranded.
- Culture eats metrics for breakfast. A slogan like “Earth’s most customer-centric company” means nothing if the culture won’t let staff take ownership.
Why this matters
Amazon will survive this shambles. But in smaller organisations, ignoring sentiment—whether revealed by AI or simply by listening—can destroy you.
The AI might already be telling you your customers are furious. The question is: are you building a culture where anyone is allowed to do something about it?