The default that ate the industry
Every "AI feature" pitch in 2024–25 ended with a chat box. The pattern was: pick a workflow, slap a textarea on it, call it "Talk to your X." Investors clapped. Product teams shipped. Customers used it once and went back to the spreadsheet.
The mistake was confusing the model's interface (chat) with the product's interface. They are not the same thing. Chat happens to be the easiest way to talk to the model. The user does not want a chat. The user wants the answer, the action, the report.
What "build outcomes" actually means
Take whatever you were going to build a chatbot for. Now ask: what's the user doing five seconds before they would have opened the chat? They were looking at something. A dashboard, a document, a customer record, an inbox.
The right place to put AI is in that thing they were already looking at. Not behind a button that opens a chat. The model should already have the context, already have done the work, and surfaced its output where the user's attention already is.
Concretely: instead of "ask a question about this customer," the customer record shows the three things the AI noticed about the customer this week. Instead of "chat with your data," the dashboard already has the anomaly highlighted with the explanation underneath. Instead of "ask the docs," the help page already shows you the answer to the question you were about to ask, because the URL or the search was enough signal.
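The pattern behind all three examples is the same: move the model call from request time to a background job, and attach its output to the record the user was going to open anyway. A minimal sketch, with hypothetical names and a trivial rule standing in for the model call:

```python
from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    name: str
    # Insights are surfaced inline on the record view; no chat box involved.
    insights: list[str] = field(default_factory=list)

def precompute_insights(record: CustomerRecord, events: list[str]) -> CustomerRecord:
    """Runs in a scheduled job before the user ever opens the record.

    In a real system the body would be a model call over the week's
    events; here simple rules stand in for it.
    """
    if "missed_invoice" in events:
        record.insights.append("Invoice 14 days overdue; flagged for follow-up")
    if "support_ticket_spike" in events:
        record.insights.append("Support tickets tripled this week")
    return record
```

The design choice that matters is the call site: the function takes no user question as input, because by the time the user is looking at the record, the work is already done.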
The three reasons chat keeps winning anyway
Chat persists despite the above. Three reasons:
- Engineering laziness. Building a chat box is one component. Building outcome-shaped surfaces is many components, each tuned to its workflow. Chat ships in a sprint. Outcomes ship in quarters.
- Demo theatre. Chat looks impressive in a screen-share. The model says smart things in real time. The audience cheers. Outcome surfaces are quietly correct, which doesn't film well.
- Buyer expectation. Someone in marketing has already told the buyer that "AI" means a chat box. Showing them anything else creates a dissonance you have to spend pitch time resolving.
None of these are good reasons. All of them lose against "shipped a thing customers actually use" within twelve months.
When chat is right
Chat is the right surface when:
- The user genuinely doesn't know what they want yet — exploratory, open-ended discovery.
- The action surface is too varied to pre-build — the model needs to decide what to do, not just compute one of N known answers.
- The interaction itself is the value — therapy, teaching, creative ideation.
Notice none of those is "internal knowledge base," "customer support deflection," or "data analysis assistant." Those three categories are 80% of failed chatbot projects we get pulled in to fix. None of them needed chat.
The reframe we run with clients
First meeting with a new prospect who wants "an AI chatbot for X": we don't ask about chat. We ask, "what does the user do five seconds after they get the answer they wanted?" That's the outcome. Then we work backwards from that outcome to the smallest surface that delivers it.
Half the time the answer is: a button that says "draft this for me," and the email is drafted. Or: the dashboard now has a one-line annotation per row. Or: the report you already get on Mondays now has a section called "anomalies this week." None of those is a chat. All of them are AI features.
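The "draft this for me" button reduces to a single function: context assembled server-side, one model call, one artifact out. A minimal sketch, assuming a hypothetical `generate` callable standing in for whatever model client you use:

```python
from typing import Callable

def draft_followup_email(customer: dict, generate: Callable[[str], str]) -> str:
    """One button, one outcome. The user never sees a prompt or a
    conversation; the draft lands directly in the compose window."""
    prompt = (
        f"Draft a short follow-up email to {customer['name']} "
        f"about: {customer['open_issue']}. Keep it under 100 words."
    )
    return generate(prompt)

# Usage with a stub in place of a real model client:
draft = draft_followup_email(
    {"name": "Acme", "open_issue": "renewal quote"},
    generate=lambda p: f"[draft based on: {p}]",
)
```

Note what is absent: no message history, no follow-up turns, no textarea. If the first draft is wrong, the fix is a better context assembly step, not a longer conversation.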
The one-line summary
The chat box is a debugging interface for the model. It is not the product. Build the product. The model can live behind it.
For the corollary on which kinds of demos always survive contact with production and which don't, see our companion piece on the five demos that always look great and ship terribly.