Pete Foster
Technology & Strategy Director
4th March, 2026
Read time: 3 minutes
AI agents and generative AI tools can act as coders, researchers and content producers. But they have no social context and no memory of why past successes worked or why past failures didn’t.
Our job as experienced humans is more important than ever. In the context of working with agents and automation tools, it is increasingly the job of the senior architect: providing the vision.
But you only get out what you put in. And in the world of AI-driven development, ‘what you put in’ is no longer just syntax; it’s context, curiosity and critical thought.
When we use tools like Cursor or Claude Code, we aren’t just asking them to ‘make a button’ or ‘create a menu’. Anyone can do that. The value we bring is in explaining why the button exists, who’s clicking it, and how it needs to interact with a legacy database that hasn’t been touched for years.
Talking the agent’s language isn't about memorising magic prompts. It’s about:
Building context: Give the AI tool or agent the business logic and brand constraints it needs to make smart decisions, and teach it about past successes, failures and lessons learned.
Critical thinking: Apply human eyes, knowledge and experience to catch and correct the hallucinations before they reach a staging environment.
Iterative discovery: Treat the tool or agent as a sounding board and play to its strengths. Use it to explore five different ways of solving a problem in the time it used to take to solve one.
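To make the first point concrete, here is a minimal sketch of what “building context” can look like in practice: labelled context sections (the why, the who, the constraints, the lessons learned) are assembled ahead of the task itself before anything is sent to the model. The function name, section labels and example strings are illustrative assumptions, not a real client playbook or a specific tool’s API.

```python
def build_prompt(task: str, context: dict[str, str]) -> str:
    """Prepend labelled context sections to a task so the model sees
    the business logic and constraints before the request itself.

    Illustrative sketch only; section names are assumptions.
    """
    sections = [f"## {label}\n{detail}" for label, detail in context.items()]
    sections.append(f"## Task\n{task}")
    return "\n\n".join(sections)


prompt = build_prompt(
    "Add a 'Download report' button to the dashboard header.",
    {
        "Why this exists": "Finance users export monthly figures for audit.",
        "Who clicks it": "Non-technical finance staff, often on tablets.",
        "Constraints": (
            "Must read from the legacy reporting database; "
            "no schema changes are permitted."
        ),
        "Lessons learned": (
            "A previous export feature timed out on large reports; "
            "stream the file instead of buffering it."
        ),
    },
)
print(prompt)
```

The point of the structure is that the request arrives last, after the model has been given the reasons, the audience and the hard constraints, so its first draft starts from the same facts a senior engineer would.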
With the right approach nailed down, it’s vital to spend time stress-testing how we communicate with the tools. The best results don’t come from the most complex prompts (although some can be long), but the most informed ones.
Constantly testing and refining a prompting playbook leads to accurate answers, cleaner code and more secure architecture. The aim isn’t perfection; it’s about teaching the tools about a client’s brand voice, business objectives or technical constraints, so the first draft creates a solid foundation, not a disappointing skeleton.
This isn’t R&D for the sake of it. From Labs projects to production-ready websites, we’re using these techniques every day. By learning to ‘speak AI’ more effectively, we’re reducing the time spent on repetitive tasks and spending more time understanding the problems we can solve by using the right tools in the right way.
We believe that being good with AI is a new form of being good at problem solving. It requires a restless curiosity – the desire and drive to ask "what if?" and "why not?" – combined with the technical and contextual experience to know when the answer is wrong.
Nearly two decades on from the ‘New Literacy’ of digital communication and social media, the newer literacy places human understanding right at the heart of technology.
The more we share our conversations with AI tools and agents, the faster everyone learns. If you’re interested in learning more, we’d love to show you some of the prompts and processes we’re using, and how they’re helping us build better, from prototypes to production software and everything in between.