International Women’s Day: Architects Ask Whether AI Is Building Rooms Women Can Enter
As artificial intelligence becomes more deeply woven into daily life and into architectural practice, the question of who gets to participate — and who risks being excluded — lands squarely on International Women’s Day. The moment shifts from a slogan to a practical test: do the tools and teams shaping the built environment reflect the diversity of the people they serve?
What does International Women’s Day mean for architecture and AI?
For architects and design teams, the debate is not abstract. The Organisation for Economic Co-operation and Development (OECD) names “inclusive growth” as one of its guiding principles for AI, calling for the “responsible stewardship of trustworthy AI in pursuit of beneficial outcomes for people and the planet, such as augmenting human capabilities and enhancing creativity, advancing inclusion of underrepresented populations, reducing economic, social, gender and other inequalities, and protecting natural environments, thus invigorating inclusive growth, well-being, sustainable development and environmental sustainability.”
That institutional framing sits beside a persistent culture problem: technology has been shaped by a culture dominated by “tech bros,” where largely male perspectives have set priorities. Katie Fisher, Director at CARD Projects and RIBA J Rising Star, describes the imbalance this way: “From what I’ve seen and read, the imbalance is most visible in leadership, core engineering and venture funding [for those creating the systems]. The teams building large language models, BIM-integrated AI plugins or generative design platforms are still overwhelmingly male. For example, optimisation tools that prioritise speed and cost over care or community engagement reflect who sets the metrics. Women are more visible in ethics panels and facilitation roles, but less often shaping the underlying code or investment decisions.”
How are architects addressing inclusivity in practice?
Voices from practice say inclusivity must be a built-in test for any new tool. Olivia Stobs-Stobart, Design and AI Lead at Plan A Consultants, explains a practical approach: “Any new process or technology we adopt has to pass a few key tests, and inclusivity is always one of them. We take feedback seriously, running structured pilot tests on potential systems, gathering input and tracking engagement from diverse team members across different roles, experience levels, and working styles. We then work with our People & Culture team to ensure accessibility, ease of use, and whether the tool works for neurodiverse team members is considered before full adoption. It isn’t an afterthought for us; it’s part of the process from the beginning. We want our tools to empower and support our team to create a space where everyone can do their best work, not be left behind by the technology around us.”
The imbalance in the workforce underlines why those tests matter. “How can we responsibly say that AI is a fair representation of our population when only 22% of all AI and data professionals in the UK are women?” Stobs-Stobart asks, pointing to a measurable gap that compounds other forms of exclusion and creates harmful biases and feedback loops.
Renee Dobre, Architect and Design Computation Team Leader at NBBJ, was also invited to join the conversation marking the day, which probes how practice, procurement and investment might shift. It focuses attention on the roles that set metrics, build core systems, and decide what optimisation looks like on a project.
What can change and who is acting?
Practitioners describe a mix of immediate steps and longer cultural shifts. Structured pilot testing, closer collaboration with people-and-culture teams, and explicit checks for accessibility and neurodiversity are presented as practical measures. At a higher level, the OECD framework emphasises stewardship of AI toward inclusion and sustainability, offering institutional language for architects seeking to justify different priorities: care, community engagement and environmental outcomes rather than pure speed and cost optimisation.
Yet participants in the conversation stress that shifting where decisions are made — who writes the code, who controls funding, who sets the metrics — is central to whether tools will serve broader social goals. That is the heart of the critique Katie Fisher raises about leadership and investment patterns.
As the industry reflects on tools and teams, the prompt of International Women’s Day is framed not as a one-off celebration but as a checkpoint. If AI is to augment human capabilities and advance inclusion, then the rooms where AI is designed and deployed must be recognisably open to those who have too often been kept at the margins. The question remains: will the design choices made today shape buildings and cities that include more voices tomorrow?