Jack Anglesea
3 min read • 20 June 2025
As AI tools become more embedded in our workflows, a big question is emerging in design circles:
What does our role look like when machines can generate UI faster than we can sketch wireframes?
The answer, in part, lies in how we build design systems.
Systematic design isn’t just about building components or managing a design system doc. At its core, it’s about codifying design decisions (structure, hierarchy, interaction rules, voice, and tone) in a way that’s repeatable, scalable, and interpretable by both humans and machines.
When AI enters the picture, it doesn’t magically “get” our intent. It needs structure. It needs building blocks. That’s where systematic design will become essential in making generative design work.
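To make that concrete, here’s a rough sketch of what codified design decisions can look like. Everything in it is hypothetical: the token names, values, and the Button contract are stand-ins rather than anything from a real system. The point is that structure like this can be read by a designer, a developer, and a model in exactly the same way.

```ts
// Hypothetical design tokens: names and values are illustrative only.
export const tokens = {
  color: {
    brand: "#3B5BDB",
    surface: "#FFFFFF",
    textPrimary: "#1A1A2E",
  },
  spacing: { sm: 8, md: 16, lg: 24 }, // px
  radius: { control: 6, card: 12 },   // px
  typography: {
    body: { family: "Inter", size: 16, lineHeight: 24 },
    heading: { family: "Inter", size: 28, lineHeight: 36 },
  },
} as const;

// A component contract makes interaction rules explicit: an AI asked to
// "use the primary button" has a finite, checkable set of options to work with.
export type ButtonProps = {
  variant: "primary" | "secondary" | "ghost";
  size: "sm" | "md" | "lg";
  label: string;
  disabled?: boolean;
  onClick: () => void;
};
```

None of this is exotic; it’s the same structure a good design system already has. The difference is that once it exists in this form, a generative tool can be held to it.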
Without a solid design system in place, AI outputs feel generic and inconsistent
I was recently part of a project where we used Subframe, a platform that blends AI with systematic design, as our design tool of choice.
The key takeaway? The speed wasn’t just a result of the generative AI; it stemmed from having a solid, systematic design foundation that the AI could understand and build upon.
I’ve recently started experimenting with Figma’s MCP server alongside Cursor.
I connected it to my Figma design system, complete with production-ready components, and let the AI attempt to construct screens using natural language prompts. Once connected, you can prompt your MCP client to access a specific design node.
With the MCP server enabled, you can:
- point your MCP client at a specific frame or component node and pull its structure straight into your editor
- generate code that references your real components and design tokens rather than hard-coded lookalikes
- keep the output traceable back to the source design, so drift is easy to spot
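If you’re curious about what the server actually exposes, you can also connect to it outside of Cursor. Below is a minimal sketch using the MCP TypeScript SDK; the local endpoint is an assumption based on my own setup (yours may differ), and the script does nothing more than list the tools the server offers.

```ts
// Minimal sketch: connect to a locally running MCP server and list its tools.
// The URL below is an assumption (the local endpoint my Figma desktop app
// advertised); check your own MCP settings for the right one.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  const transport = new SSEClientTransport(new URL("http://127.0.0.1:3845/sse"));
  const client = new Client({ name: "design-system-probe", version: "0.0.1" });

  await client.connect(transport);

  // Each tool describes something the client (Cursor, in my case) can ask for:
  // design context for a node, generated code, variable definitions, and so on.
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? ""}`);
  }

  await client.close();
}

main().catch(console.error);
```

In practice you rarely need to go this low-level; pointing Cursor’s MCP settings at the same endpoint is enough. But seeing the tool list makes it obvious that the AI is working from your system’s structure, not guessing at it.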
It wasn’t flawless; I still had to course-correct, reframe prompts, and finesse details. But the experience made one thing clear:
When your design system is structured well, AI becomes a powerful design collaborator, not just a content generator.
Lately, I’ve also been testing Figma Make — their new AI-powered tool. It’s still early days, but what’s immediately promising is how it bridges the gap between structured design systems and creative screen output.
In Figma Make, you can upload components directly from your design system, alongside a written prompt. This means you can hand-pick the exact buttons, forms, navs, or cards you want the AI to use. Once uploaded, you can prompt the system with natural language (e.g. “design a checkout flow”) and it generates screens and interaction states that are tailored to your system, not someone else’s.
I’ve been experimenting with a couple of different flows, and it’s already saved me hours of scaffolding, providing solid starting points that stay aligned and consistent with my design system.
What’s most exciting is how this tool starts to feel like co-creation rather than generation. You’re not giving up control; you’re giving the AI smarter guardrails and getting more focused ideas in return.
Instead of crafting every screen by hand, we’re building design logic and teaching the AI to execute within that framework.
When AI is layered onto a chaotic or ad-hoc workflow, it amplifies the mess. But with a strong design system in place:
- generated screens reuse your actual components, tokens, and patterns, so outputs stay consistent instead of generic
- the AI works within clear constraints, which means less course-correcting and fewer one-off fixes
- iteration speeds up, because you’re refining within a framework rather than rebuilding screens from scratch
Think of systematic design as prompt engineering for your product’s visual language. The better your system, the more valuable your generated outputs will be.
The future of AI and design isn’t simply about crafting beautiful interfaces; it’s about architecting the systems, rules, and constraints that empower AI to design with us, not for us.
It is becoming increasingly evident that this is the direction of travel. Systematic, design-led AI tools such as Subframe, the launch of Figma’s MCP server, and the rise of ‘Libraries’ features from the likes of Lovable and Magic Patterns are all early signals of a shift. These platforms are not just accelerating workflows; they are enabling scalable systems that align human intent with machine capability.
If you’re a designer looking to stay relevant and resilient in the AI era, start here:
Design less. Define more. Systematise everything.