Prompts are becoming a design material

There’s been a shift in how products get designed, especially in anything that involves AI.

For a while, prompts were treated as a means to an end. You wrote something, tweaked it a bit, maybe added an example or two, and tried to get a better output. It felt closer to trial-and-error than design. Useful, but not something you’d spend much time thinking about.

As soon as AI becomes part of the actual user experience, the prompt stops being a hidden detail and starts shaping what the product does in a very direct way. It influences tone, structure, behaviour, and even what the system chooses to pay attention to.

At that point, it’s hard to argue that it isn’t design.

What’s interesting is that prompts don’t fit neatly into any existing category. They’re not quite copy, although they clearly define voice and tone. They’re not quite logic, but they do control behaviour. They’re not quite API contracts, but they often define the structure of inputs and outputs.

They sit somewhere in between all three, which is probably why most teams don’t have a clear way of thinking about them yet.
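To make that in-between nature concrete, here's a minimal sketch of a single prompt playing all three roles at once. Everything here is illustrative; `SUPPORT_REPLY_PROMPT` and `build_prompt` are hypothetical names, not from any real product.

```python
# A hypothetical sketch: one prompt acting as copy, logic, and contract at once.

SUPPORT_REPLY_PROMPT = (
    # Copy: it defines voice and tone.
    "You are a support assistant for a project-management app. "
    "Be concise and friendly.\n"
    # Logic: it controls behaviour.
    "Never promise features that do not exist. "
    "If the user reports a bug, ask for steps to reproduce.\n"
    # Contract: it fixes the structure of the output.
    'Respond as JSON: {{"reply": "...", "escalate": true|false}}\n'
    # The input slot, filled in at call time.
    "User message: {message}\n"
)

def build_prompt(message: str) -> str:
    """Fill the user's message into the template."""
    return SUPPORT_REPLY_PROMPT.format(message=message)
```

Edit the tone line and you've changed the copy; edit the bug-handling line and you've changed the behaviour; edit the JSON line and you've changed the contract, all in the same string.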

In practice, this means prompts tend to be scattered. A bit of text in a config file, something embedded in a component, maybe a few variations copied around and slightly tweaked. Changes get made quickly, often without much visibility, even though small edits can have a big impact on the output.

Over time, that becomes hard to manage.

You start to see the same prompt patterns reused across different parts of the product. You notice that changing one phrase improves results in one place but makes them worse somewhere else. You realise that two similar features behave differently for no obvious reason, and it turns out the prompts have drifted apart.

These are all fairly familiar problems. They’re the same kinds of issues that led to design systems and component libraries in the first place.

It’s not a stretch to imagine prompts going in the same direction. Instead of being treated as disposable strings, they become things that are versioned, reviewed, and reused. Teams start to care about consistency, not just in UI, but in how the system behaves and responds.
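What "versioned, reviewed, and reused" might look like in practice is something like a small prompt registry. This is a sketch under my own assumptions, not an established pattern from any particular tool; the names are hypothetical.

```python
from dataclasses import dataclass

# A minimal sketch of treating prompts as versioned components
# rather than loose strings scattered through the codebase.

@dataclass(frozen=True)
class Prompt:
    name: str
    version: int
    template: str

REGISTRY: dict[str, Prompt] = {}

def register(prompt: Prompt) -> None:
    """Add a prompt version; existing versions are immutable."""
    key = f"{prompt.name}@{prompt.version}"
    if key in REGISTRY:
        raise ValueError(f"{key} already registered; bump the version instead")
    REGISTRY[key] = prompt

def get(name: str, version: int) -> Prompt:
    """Fetch an exact prompt version, so callers can't silently drift."""
    return REGISTRY[f"{name}@{version}"]

register(Prompt("summarise-feedback", 1,
                "Summarise the feedback below:\n{text}"))
register(Prompt("summarise-feedback", 2,
                "List the main themes in the feedback below, "
                "with one supporting quote each:\n{text}"))
```

Because each version is frozen and fetched explicitly, two features using "the same" prompt can't quietly diverge, and a change shows up as a new version that can go through review like any other component.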

The shift underneath this is that we’re no longer only designing fixed interfaces. With AI, you’re shaping a system that generates responses on your behalf. You don’t control the exact output, but you do control the boundaries and the intent.

Prompts are one of the main ways you do that.

A simple example is something like summarising user research. A vague prompt will give you a vague result. A more structured prompt—one that asks for themes, highlights contradictions, and includes supporting quotes—produces something much closer to what a designer or researcher would actually want to use.

The difference comes from how the task is framed.

That framing is design work.

None of this means prompts replace other parts of the process. You still need good UX, clear thinking, and well-defined problems. But it does add a new layer that sits somewhere between writing, logic, and system design.

The teams that recognise this early will probably have an advantage. Not because they’re better at prompting in isolation, but because they treat prompts as part of the product, rather than something that sits behind it.

Once you see prompts that way, it changes how you build things.
