With AI, Instruct—Don’t Chat

As you use AI for work, do you find yourself acknowledging its replies, musing (“what if…”), or being polite? That is natural. Humans are highly attuned to communication because it is essential for trust, coordination, and nuance. Carrying those habits into AI is understandable, but it does not produce the best results.

AI rarely rewards those behaviours:
• politeness adds no value
• hedging reduces clarity
• narration is padding

Defaulting to social norms carries two real costs.

First, time: prompts become longer, and nuance slows convergence to a usable output.

Second, cognitive effort: you manage tone, add context, and filter language that has no functional impact. Effort designed for human interaction is wasted on a system that does not use it.

If the objective is a clear business outcome, the interaction needs reframing. AI is not a participant in a conversation. Its interface mimics dialogue, but its underlying mechanism optimizes for task completion. It is a system that responds to structured input. Social layering is redundant.

A more effective frame is simple: you are specifying a deliverable, not maintaining an interaction. For most tasks, direct instruction is faster and requires less effort than conversational framing.

This shift is not intuitive. It runs against ingrained habits. A useful test is to ask: are you defining an output, or sustaining a conversation? If it is the latter, precision is likely being lost.

Conversation mode tends to include filler language, open-ended prompts, and incremental drift.
Instruction mode defines a clear objective, sets constraints (format, scope, tone), and targets a specific output.
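The contrast above can be made concrete. The sketch below is a hypothetical illustration, not a prescribed template: the `instruction_prompt` helper and the example prompts are invented for this article, assuming only that a clearer, constraint-driven prompt is the goal.

```python
# Hypothetical helper: composes an instruction-style prompt from an
# explicit objective, a list of constraints, and a target output format.
def instruction_prompt(objective, constraints, output_format):
    lines = [f"Task: {objective}"]
    lines += [f"Constraint: {c}" for c in constraints]
    lines.append(f"Output: {output_format}")
    return "\n".join(lines)

# Conversation mode: polite, hedged, open-ended.
chat_style = (
    "Hi! I was wondering if you could maybe help me summarise "
    "this report? Whatever you think works best is fine."
)

# Instruction mode: objective, constraints, and output are all explicit.
instruct_style = instruction_prompt(
    objective="Summarise the attached Q3 sales report.",
    constraints=["Max 150 words", "Neutral tone", "No recommendations"],
    output_format="Three bullet points",
)

print(instruct_style)
```

The second version is longer to write once, but it removes the ambiguity that forces extra back-and-forth turns later.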

Politeness remains essential in human interaction, and iteration is a valid way to refine thinking. Neither is inherently wrong with AI. The issue is when they replace precision.

In practice, keep conversational habits where they help structure your thinking, and drop them where they add nothing. The goal is not to change how you think, but to get better results from a system that is not human.