There's a lot of hype around AI, and also a lot of disappointment.
Learn how to use (and not overuse) AI to develop better software, and to quality check it with observability.
Most engineering work starts the same way. Someone needs something built, you get it done, everyone moves on. Until the next request arrives. And the next. Before long you're writing the same Terraform, configuring the same pipeline, answering the same questions you answered six months ago on a completely different project.
Spoiler: 80% of what you're rebuilding is identical. 15% is a config value. You're only ever writing the 5% that's genuinely new. You just haven't built the scaffolding to prove it yet. This talk walks through a phased approach to turning first-pass deliveries into something your whole team can actually reuse: kanban templates that give you a running start, modular IaC where one file controls everything, pipelines with deployment gates and automated testing baked in, and decision records that capture the why so the next person doesn't have to rediscover it.
But we'll also get into the stuff that actually stops smart people from doing this. The "every project is different" myth. The quiet fear that documenting your work makes you replaceable (it doesn't, it gets you promoted). And the discipline it takes to carve out 5% of the timeline for harvest work when the next deadline is already breathing on your neck. These aren't technical problems.
If you've ever shipped something and immediately thought "I should really template this," this is the session that shows you how to actually follow through.
A case study on the business value of engaging and empowering staff to build their digital skills.
How I saved $391,000 in 15 minutes. A support model that combines technical support and governance with change management and training activities. Tailoring learning delivery to busy people – from nano-learning video content and 15-minute micro-learning digital skills sessions, through to 30-minute deeper dives and 'ask me anything' sessions. Techniques to build communities (in Microsoft Teams) that keep staff informed and engaged, creating a strong peer-support network of users. Establishing trust and connection, and educating users on where and how to self-help.
Join us for a powerful lunch as Women in Technology Hunter (WITH) officially launches.
Hear from the women who started it, why it matters, and how a simple idea from last year’s SlashNEW turned into a growing, grassroots network.
What began as one informal lunch has become a supportive community where women working in, with, or looking to step into technology show up for each other, share experiences, open doors, and have a few laughs along the way.
We also welcome HunterWiSE to the panel. HunterWiSE is an initiative that strives to increase the number of girls entering the STEM pipeline, while fostering a supportive professional network of women in STEM fields.
As AI becomes embedded in cyber operations, incident response plans must evolve — but blindly “adding AI” to response workflows can create new risks.
This session explores how to design AI-augmented incident response capability that improves speed and decision-making without sacrificing human judgment, governance, and accountability.
UX design often focuses on usability. But how do we measure the mental cost of an interface?
In this talk, Dr Ben Shelton explores how Cognitive Load Theory can be used as a practical tool to assess interface effectiveness and quantify the hidden impact of digital noise.
If we want calm, human-centred technology, we need to understand what it demands of the mind.
We spend our careers hardening backends against external threats, but are we inadvertently building "Internal Exploits" right into our interfaces? When design choices weaponise the developing psychology of minors to drive metrics, we aren't just frustrating users - we are violating their inherent dignity.
Drawing on the concept of the "technocratic paradigm" from Laudato Si', this session moves beyond surface-level design ethics into the structural reality of how we build software. We will explore how to stop treating humans purely as extractable data points and start engineering interfaces that protect the most vulnerable.
It's time to patch the "Innocence Vulnerability". Come learn how to architect a future where the human in the loop is empowered, not entrapped.
Most Security Operations Centers (SOCs) are drowning in noise, yet adding more analysts is rarely the sustainable answer. Drawing on his experience as a Software Engineer turned Security Operations Manager, Daniel Clements shares the blueprint used at nib Group to move beyond traditional, manual monitoring.
This session explores the practical journey of building and implementing AI triage agents and SOAR workflows to automate the "heavy lifting" of investigations and stakeholder communications.
Attendees will learn how to shift their team’s focus from manual ticket-pushing to high-value security engineering. Daniel provides a diverse perspective focused on pragmatism, demonstrating how to deliver security outcomes that satisfy both technical requirements and executive expectations for efficiency.
Everyone has an opinion on AI. Far fewer people have actually shipped it inside a real organisation, with real customers, real legacy systems, and real risk on the table.
This panel brings together three leaders doing that work. Andrew Cresp is CIO at NGM Group, where every decision sits on top of customer money, regulation and trust. Josh Doolan leads APAC for Endava and founded Mudbath before its acquisition — he sees AI adoption playing out across dozens of enterprises, not just one. Katherine Squire has 25 years leading product and engineering across Macquarie, ASX, Nasdaq and Culture Amp — with deep experience on both the vendor and client side of the software business.
We'll talk about what enterprise AI actually looks like once the demos are over. Where to start when you can't change everything at once. How to bring a whole organisation along when half the room is excited and half is anxious. What's genuinely shifting for engineers, product people and tech leaders. And how to handle security, privacy and data without becoming the team that just says no.
I've watched incredibly smart engineers and technology leaders lose the room, not because they were wrong, but because they over-explained, went too deep too fast, or tried to prove they were right instead of making it easy for people to say yes.
It's a pattern I've seen play out for years. The gap between having the best answer and actually getting it heard is real, and nobody really teaches you how to close it.
In this session, I'll talk about what gets in the way and what to do differently so your ideas actually move things forward.
Does it feel like you’re constantly juggling All The Things but never quite nailing anything? Are you experiencing the special drain of using AI - the Dracula effect (as coined by Steve Yegge)? It’s not just you: almost 40% of us met the criteria for workplace burnout in 2025, and we know burnout among creatives is much higher. AI is an accelerant and if you or your team tilts towards burnout, you may very well be accelerating on that trajectory. The good news is that there are also opportunities for flipping the pattern and cultivating healthy, sustainable performance.
This session will be fun and interactive.
Most organisations say automation will “save time.” But few stop to ask the question every employee is quietly wondering: what should we do with the time?
In this session, Dan Godden introduces Thea, a customer service rep navigating the growing wave of AI tools appearing across her workplace. Along the way he draws an unexpected comparison with toilet training toddlers to explain why technology rollouts so often fail to change behaviour.
Through Thea’s story, Dan explores the human side of AI adoption and the emerging role of Humans-in-the-Loop (or even Humans-at-the-helm), helping teams think differently about judgement, responsibility and the uniquely human value that remains when more work becomes automated.
Hosted by Atlassian.
Join us after-hours to network with the region's best and brightest.
We’re looking to build something permanent here. We want to gauge interest for Newcastle’s very own Atlassian Community chapter!
Even if you aren't attending the conference sessions, we’d love for you to drop by, meet the team, and help us shape the future of this community. Let’s put Newcastle tech on the map together.
This is free to attend, but registration is required.
Day 2 registration: only required for those who don't register on day 1.
As AI accelerates software development, writing code is no longer the hardest part; alignment, context, and decision-making are.
In this keynote, Jovana explores how intelligent systems are reshaping the entire software lifecycle, from planning to production, and why human judgment, reasoning, creativity, and empathy remain essential.
This talk reframes AI as an amplifier, not a replacement, offering practical insights on building more intentional, traceable, and human-centered engineering workflows.
Remote work might now be considered the norm, but many software teams have been doing it since "before it was cool"!
That's not to say everyone does it well, though...
In this session Callum shares his near-decade of experience leading successful software teams distributed across vast time zones and continents (the UK and Australia). He'll reveal his approaches to implementing processes, coding standards, and collaboration tools that keep teams running like a well-oiled machine despite the physical distance.
Let's discuss some tried and tested approaches for ensuring remote teams thrive – with a particular focus on supporting mental health, team wellbeing, and adjusting ways of working to support neurodivergence such as ADHD.
Additionally, we'll explore ways to maintain a strong, supportive team culture despite physical distance – including tips for remotely onboarding new colleagues. Whether you're an engineer or a manager, you'll leave inspired with ideas for improving collaboration and quality across your team.
Shipping a new feature is easy. Knowing if it actually improved anything? That's the hard part.
Many teams ship features based on intuition rather than evidence.
This makes it impossible to understand user behaviour or build confidence in product decisions.
Experimentation is the only reliable way for engineering teams to uncover the true impact of the features they ship.
In this talk, we will walk through the experimentation lifecycle from an engineering perspective - designing an experiment, translating it into code, launching and collecting data, analysing results and learning - showing how teams can embed this process into their software delivery practice. We will also navigate through common challenges with experimentation and how to scale it within organisations.
As AI systems become deeply embedded in software development, the question isn’t whether machines will assist us, it’s how we ensure they do so responsibly.
Trustworthy AI isn’t just about algorithms; it’s about preserving human judgment, creativity, and ethics in every loop.
This talk explores practical strategies for building AI-driven solutions that prioritize transparency, fairness, and accountability while amplifying human potential.
Attendees will leave with actionable insights to craft systems that are not only intelligent but also aligned with human values – because in the age of AI, trust is the ultimate feature.
I flipped my desk last week. Literally. Sent the monitor flying, stomped on my laptop in a blind rage.
Or at least, I did in my head. Then I heard my name, tuned back into the daily call, and carried on fixing a broken build, rotating secrets, and untangling merge conflicts I didn’t cause. Just to clear the way for the work that actually mattered.
And that’s the point: everything is too hard. Not meaningful-hard. Not worth-it-hard. Just absurdly, unnecessarily, needlessly hard.
We’ve normalised friction. We’ve glamorised struggle. We celebrate Git mastery as a badge of honour, telling ourselves the problem is our competence, not the fact that we’ve standardised a tool where one wrong move can undo a whole week’s work. We twist our models to fit relational schemas and write SQL incantations that make us feel superior, even when they’re clearly not the right tools for the job.
We tell ourselves this is maturity, when really it’s resignation. We build ever higher barriers around “the right way”, then congratulate ourselves for surviving them, instead of making it harder to do things wrong.
This talk doesn’t offer a framework, or a fix, or five steps to simplification. It offers recognition. It’s a rallying cry for everyone who’s ever stared at some pointless obstacle and thought: Why am I even doing this?
Because you’re not the problem.
Everything is too hard.
And maybe the first step to fixing that is finally talking about it.
Frustrated by your business partners missing tech opportunities?
Baffled by the decisions that mean high tech risk continues?
This all comes down to quality decision-making – a critical part of being a senior tech leader.
This session will break down the biases to watch out for, and offer usable structures and methods to drive the best possible decisions and help tech leaders stay sane.
Why AI won't work without the platform maturity we should have had years ago.
For years, Platform Engineering was incorrectly treated as a deferrable cost - a structural investment bypassed in favour of immediate feature delivery. We relied on manual coordination and the invisible effort of engineers to navigate unstandardised environments, essentially masking the true cost of organisational friction.
As we move into the era of AI, that deferral has reached its limit. The paved roads, service catalogs, and automated guardrails that were once ignored are no longer just about reducing developer toil; they are the fundamental requirements for making AI work at all.
In this session, we’ll explore how the technical debt of yesterday has become the AI blocker of today. We will dive into why an LLM is only as good as the predictability of your underlying environments, why AI agents can’t navigate an undocumented mess, and how true platform maturity provides the deterministic foundation that non-deterministic GenAI demands.
What you’ll learn:
- The Context Gap: Why AI agents fail in organisations without a mature Digital Platform to navigate.
- Safety & Governance: How Paved Roads prevent AI from turning fast deployments into fast disasters.
- From DevEx to AIX: Shifting your platform strategy to support both human engineers and their machine collaborators.
We should have built these platforms to support our people. Now, we must build them to make AI work.
In 2008, Larry Ellison (Oracle CEO) said, according to the Wall Street Journal[1]: "The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?". He was, of course, talking about Cloud Computing.
Cloud Computing wasn't the first or last (over?)hyped technology, and Larry was poking fun at the "Amazing new XYZ product, now with Cloud!" marketing. We have seen similar things happen with e-commerce in the late 1990s (online shopping using Windows 98 and a 56 kbps modem?), Quantum Computers (the IT industry's version of Schrödinger's cat), distributed ledgers, which were predicted to transform the $2 trillion online economy in the mid-2010s[2], the metaverse (what exactly was that, anyway?), digital twins (a 'must have' for CEOs[3]), and recently (dare I say it?) AI.
This session will explore early warning signs and provide some practical advice on how to foster a constructive discussion within your organisation to avoid the worst of software misapplications.
[1] CNET (Sept 2008): "Oracle's Ellison nails cloud computing"
[2] Accenture (2016): "Editing the uneditable blockchain - Why distributed ledger technology must adapt to an imperfect world"
[3] AFR (Feb 2026): "3 things these bosses plan to do differently this year"
This is a talk about technical communication in large-scale complex systems where operational safety is paramount. In such environments, technical phrases, formal procedures, recency biases and individual habits influence how we communicate. And in high-risk environments, when miscommunications or misunderstandings occur, the results can be catastrophic.
Today we’re going to learn how to fly a plane. Or, specifically, we’re going to learn how large international jets taxi and take off from a runway.
While thousands of planes take off and land safely every day, occasionally there are some close calls when disaster is averted by sheer luck or coincidence. In those circumstances, we get a safety investigation to help us learn from the mistakes and avoid catastrophic outcomes. This talk learns from one such safety report and we’ll discuss the parallels software engineers can learn from the aviation industry when communicating in complex operating environments. When safety is paramount, is just one human in the loop really enough?
Most people in tech are stretched. Career. Health. Family. Life. And at some point it stops feeling like hustle and starts feeling like failure.
Right now there's another layer. The landscape is shifting fast. Technology professionals, software engineers especially, are asking real questions about where their careers are headed. What it actually means to build a career in technology when the ground keeps moving.
This session is for anyone sitting with that uncertainty.
It draws on real experience. Career pivots, building a business under pressure, backing yourself before you felt ready. The pattern that keeps showing up: growth doesn't come from waiting until things are clear. It comes from choosing hard things while they're still unclear.
It also reframes balance. Not as something you maintain, but as something that shifts. Seasons. Some seasons you build hard. Some you recover. Neither is wrong.
No motivational fluff. No predictions about AI. Just a more honest way to think about your career in tech and how to move through uncertainty with clarity instead of guilt.
As AI tools integrate into software development, the real value doesn't come from the tools themselves; it comes from how they are used and implemented within existing workflows. This session draws on Octopus Deploy's AI Pulse Report, which examines AI adoption patterns, current capabilities, and where these tools are being used across development workflows.
The research reveals a critical misalignment: AI's current capabilities don't align with what developers actually need help with, including compliance, security, onboarding, deployment, and release management, with common frustrations reflecting that gap.
Automation through Continuous Delivery practices provides the foundation for AI to deliver compounding value, bridging the gap between individual productivity gains and organizational impact, and creating the environment where AI's actual strengths can be applied to solve problems at scale.
Conference Day 1
Take a break from sessions, grab a coffee from Pirate Coffee, and have a chat with the other attendees, plus see what our sponsors have to offer.
Conference Day 2
We recognise and acknowledge that the land and waters where the /NEW Conference takes place are the traditional country of the Awabakal and Worimi peoples. We celebrate and respect the cultures, histories and rights of the Awabakal and Worimi peoples.
Copyright © 2026 /New. All rights reserved.