Join us for a power lunch as Women in Technology Hunter (WITH) officially launches.

Hear from the women who started it, why it matters, and how a simple idea from last year’s SlashNEW turned into a growing, grassroots network.

What began as one informal lunch has become a supportive community where women working in, with, or looking to step into technology show up for each other, share experiences, open doors, and have a few laughs along the way.

We also welcome HunterWiSE to the panel. HunterWiSE is an initiative that strives to increase the number of girls entering the STEM pipeline, while fostering a supportive professional network of women in STEM fields.

I’ve watched incredibly smart engineers and technology leaders lose the room, not because they were wrong, but because they over-explained, went too deep too fast, or tried to prove they were right instead of making it easy for people to say yes.

It’s a pattern I’ve seen play out for years. The gap between having the best answer and actually getting it heard is real, and nobody really teaches you how to close it.

In this session, I’ll talk about what gets in the way and what to do differently so your ideas actually move things forward.

Everyone has an opinion on AI. Far fewer people have actually shipped it inside a real organisation, with real customers, real legacy systems, and real risk on the table.

This panel brings together three leaders doing that work. Andrew Cresp is CIO at NGM Group, where every decision sits on top of customer money, regulation and trust. Josh Doolan leads APAC for Endava and founded Mudbath before its acquisition — he sees AI adoption playing out across dozens of enterprises, not just one. Katherine Squire has 25 years leading product and engineering across Macquarie, ASX, Nasdaq and Culture Amp — with deep experience on both the vendor and client side of the software business.

We’ll talk about what enterprise AI actually looks like once the demos are over. Where to start when you can’t change everything at once. How to bring a whole organisation along when half the room is excited and half is anxious. What’s genuinely shifting for engineers, product people and tech leaders. And how to handle security, privacy and data without becoming the team that just says no.

A case study on the business value of engaging and empowering staff to build their digital skills.

How I saved $391,000 in 15 minutes. This session covers:

- A support model that combines technical support and governance with change management and training activities.
- Learning delivery tailored to busy people: nano-learning video content, micro-learning 15-minute digital skills sessions, through to 30-minute deeper dives and ‘ask me anything’ sessions.
- Techniques for building communities (in Microsoft Teams) that keep staff informed and engaged, creating a strong peer-support network of users.
- Establishing trust and connection, and educating users on where and how to self-help.

Most Security Operations Centers (SOCs) are drowning in noise, yet adding more analysts is rarely the sustainable answer. Drawing on his experience as a Software Engineer turned Security Operations Manager, Daniel Clements shares the blueprint used at nib Group to move beyond traditional, manual monitoring.

This session explores the practical journey of building and implementing AI triage agents and SOAR workflows to automate the “heavy lifting” of investigations and stakeholder communications.

Attendees will learn how to shift their team’s focus from manual ticket-pushing to high-value security engineering. Daniel offers a perspective grounded in pragmatism, demonstrating how to deliver security outcomes that satisfy both technical requirements and executive expectations for efficiency.

Most engineering work starts the same way. Someone needs something built, you get it done, everyone moves on. Until the next request arrives. And the next. Before long you’re writing the same Terraform, configuring the same pipeline, answering the same questions you answered six months ago on a completely different project.

Spoiler: 80% of what you’re rebuilding is identical. 15% is a config value. You’re only ever writing the 5% that’s genuinely new. You just haven’t built the scaffolding to prove it yet. This talk walks through a phased approach to turning first-pass deliveries into something your whole team can actually reuse: kanban templates that give you a running start, modular IaC where one file controls everything, pipelines with deployment gates and automated testing baked in, and decision records that capture the why so the next person doesn’t have to rediscover it.

But we’ll also get into the stuff that actually stops smart people from doing this. The “every project is different” myth. The quiet fear that documenting your work makes you replaceable (it doesn’t, it gets you promoted). And the discipline it takes to carve out 5% of the timeline for harvest work when the next deadline is already breathing down your neck. These aren’t technical problems.

If you’ve ever shipped something and immediately thought “I should really template this,” this is the session that shows you how to actually follow through.

Most organisations say automation will “save time.” But few stop to ask the question every employee is quietly wondering: what should we do with the time?

In this session, Dan Godden introduces Thea, a customer service rep navigating the growing wave of AI tools appearing across her workplace. Along the way he draws an unexpected comparison with toilet training toddlers to explain why technology rollouts so often fail to change behaviour.

Through Thea’s story, Dan explores the human side of AI adoption and the emerging role of Humans-in-the-Loop (or even Humans-at-the-helm), helping teams think differently about judgement, responsibility and the uniquely human value that remains when more work becomes automated.

Does it feel like you’re constantly juggling All The Things but never quite nailing anything? Are you experiencing the special drain of using AI – the Dracula effect (as coined by Steve Yegge)? It’s not just you: almost 40% of us met the criteria for workplace burnout in 2025, and we know burnout among creatives is much higher. AI is an accelerant, and if you or your team tilts towards burnout, you may very well be accelerating along that trajectory. The good news is that there are also opportunities to flip the pattern and cultivate healthy, sustainable performance.

This fun, interactive session will:

We spend our careers hardening backends against external threats, but are we inadvertently building “Internal Exploits” right into our interfaces? When design choices weaponise the developing psychology of minors to drive metrics, we aren’t just frustrating users – we are violating their inherent dignity.

Drawing on the concept of the “technocratic paradigm” from Laudato Si’, this session moves beyond surface-level design ethics into the structural reality of how we build software. We will explore how to stop treating humans purely as extractable data points and start engineering interfaces that protect the most vulnerable.

Join this session to learn:

It’s time to patch the “Innocence Vulnerability”. Come learn how to architect a future where the human in the loop is empowered, not entrapped.

UX design often focuses on usability. But how do we measure the mental cost of an interface?

In this talk, Dr Ben Shelton explores how Cognitive Load Theory can be used as a practical tool to assess interface effectiveness and quantify the hidden impact of digital noise.

If we want calm, human-centred technology, we need to understand what it demands of the mind.