The noise around AI has made adoption harder, not easier.
PureGym operate one of the largest gym networks in the UK and Europe. Behind the product is an in-house engineering team that, like most at this stage of the AI curve, had no shortage of tools to evaluate and no shortage of opinions about them. What they needed was a way to separate what was genuinely useful from what was just hype.
Every tool in the market promises to transform how teams work. Most conversations about AI in software development gravitate toward code generation: essentially autocomplete at scale. This is useful but narrow in scope. Teams that go looking for value there often miss the more significant opportunities sitting at either end of the engineering workflow.
PureGym's engineers were at this crossroads. Interest was high. But translating that interest into consistent, team-wide practice, in a way that actually impacted delivery velocity, was the problem that needed solving.
Teams don't struggle to find AI tools. They struggle to adopt them in a way that compounds. The workshop exists to solve that second problem.
The workshop
We ran a full-day session with PureGym's engineering team built entirely around what we'd learned the hard way: which AI workflows had made a real difference on live client work, which approaches hadn't held up under pressure, and how to think about integrating AI into the parts of a development workflow where it will have the greatest impact.
We were not interested in a vendor overview or a curated tour of the AI landscape; neither were PureGym. The value we provided was practitioner-to-practitioner conversation: Dootrix sharing the thinking and the tooling that had come out of running AI-assisted workflows across real, production engagements. The aim was to give the team a shortcut through the experimentation phase that every organisation is currently navigating at significant cost.
The morning established a framework for thinking about where AI genuinely helps in a development workflow, and equally, where it tends to disappoint. The afternoon moved into live demonstration and hands-on exploration, with the team working through tooling against problems they were actively dealing with.
A big moment: spec-driven development
Most AI adoption in engineering focuses on what happens inside the IDE. Spec-driven development shifts the value upstream. The idea is that a well-formed specification, covering API contracts, event payload definitions, GraphQL schemas, and system behaviour descriptions, becomes the primary input to the development process, instead of the team moving straight from conversation to code.
AI generates the spec. The spec drives the build. The result is more consistent output, clearer contracts between services, and a significant reduction in the back-and-forth that accumulates when requirements live in people's heads and Slack threads.
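To make the idea concrete, here is a minimal sketch of a spec acting as the contract between services. The event name, fields, and validator are invented for illustration and are not PureGym's actual schemas or OpenSpec's format; the point is that a machine-readable spec can catch contract drift before it ships, rather than surfacing it in a Slack thread later.

```python
# Hypothetical event payload spec: the contract a producing service and a
# consuming service both build against. Field names are illustrative only.
MEMBER_CHECKED_IN_SPEC = {
    "event": "member.checked_in",
    "required": {"member_id": str, "gym_id": str, "timestamp": str},
    "optional": {"entry_method": str},
}

def validate(payload: dict, spec: dict) -> list[str]:
    """Return a list of contract violations; an empty list means the payload conforms."""
    errors = []
    for field, ftype in spec["required"].items():
        if field not in payload:
            errors.append(f"missing required field: {field}")
        elif not isinstance(payload[field], ftype):
            errors.append(f"wrong type for {field}: expected {ftype.__name__}")
    known = set(spec["required"]) | set(spec["optional"])
    for field in payload:
        if field not in known:
            errors.append(f"unexpected field: {field}")
    return errors

# A conforming payload passes; a drifting one is flagged at the boundary.
ok = validate(
    {"member_id": "m-1", "gym_id": "g-7", "timestamp": "2024-01-01T08:00:00Z"},
    MEMBER_CHECKED_IN_SPEC,
)
bad = validate({"member_id": 123}, MEMBER_CHECKED_IN_SPEC)
```

In a spec-driven workflow, artefacts like this are generated and iterated with AI first, then both the implementation and its tests are derived from them, so the contract stays the single source of truth.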
For PureGym's team, this connected directly to the architecture challenges they were working through: a proposed event-based, mini-services system where getting the specifications right across services was as important as any individual piece of code. Seeing AI applied at that level, and not simply at the level of function completion, reframed what adoption could mean for the team.
We demonstrated this using OpenSpec, which the team got hands-on with during the afternoon. The conversation shifted from "how do we use AI to write code faster" to "how do we use AI to design and specify systems more rigorously." That is a more interesting, and more valuable, question.
What the session enabled
The session gave the team a clearer and more honest picture of where to focus. Not a list of tools to evaluate, but a practical understanding of which parts of their workflow stood to benefit most from AI integration and, importantly, what good practice in those areas actually looked like.
The spec-driven approach in particular opened up a way of working that addressed something the team had been circling: how to maintain consistency and rigour across a distributed system without manual specification work becoming a bottleneck in itself. AI-generated specs, validated against a design and iterated quickly, offered a route through that.
The session also surfaced the questions the team hadn't yet fully answered: cost governance, usage monitoring, and how to measure the impact of AI tooling on delivery. It gave those questions the right shape. Not "should we be using AI?" but "here is what we need to put in place to adopt it well."
What followed
The workshop became the starting point for a broader conversation with PureGym's engineering leadership about their AI adoption roadmap. This covered tooling standardisation, how to bring the rest of the team along, and the longer-term question of what an engineering organisation that has genuinely internalised these workflows looks like.