CyberQuest trains 16–25s in cyber skills across Ireland. That pipeline can become the security backbone for healthcare AI and medtech teams.

CyberQuest: Building Ireland’s Health AI Security Talent
A hospital can buy the most advanced AI diagnostic software on the market, but if its identity systems are weak or its data pipelines are sloppy, it’s still one phishing email away from chaos. That’s the uncomfortable truth about AI in healthcare: the bigger the data and the faster the automation, the higher the payoff for attackers.
That’s why the newly launched CyberQuest cross-border programme (for ages 16–25) matters well beyond “cyber careers.” It’s a practical pipeline into the skills Ireland’s health and medtech ecosystem is hiring for right now: secure software development, cloud security, data governance, and incident response—the unglamorous foundations that make clinical AI safe to deploy.
CyberQuest—supported by PEACEPLUS and delivered by The Bytes Project with partners including REIM Training Solutions, YouthAction Northern Ireland, Springvale Learning, and Foróige—sets out to take learners from digital basics to industry-recognised certification using a hybrid, trauma-informed, youth-led model. That model isn’t a “nice-to-have.” In healthcare tech, the workforce gap is real, and rigid training pathways leave too many capable people behind.
Why a youth cyber programme is a healthcare AI programme
Answer first: Healthcare AI can’t scale without security talent, and the fastest way to grow that talent is to start earlier than university.
Most conversations about “AI in medtech” fixate on algorithms. In practice, the blockers are usually operational:
- Can we trust the data? (integrity, provenance, access controls)
- Can we keep services running? (resilience, backup strategy, recovery testing)
- Can we meet regulatory expectations? (audit trails, least privilege, risk management)
- Can we prevent model and data leakage? (secure MLOps, secrets management)
Those are cybersecurity problems with software engineering consequences.
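The first blocker, data trust, is partly a solved engineering problem: record a cryptographic hash of every dataset file at the point of approval, then re-check before training or deployment. A minimal sketch (the manifest format and file layout are illustrative assumptions, not anything from CyberQuest's curriculum):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so large datasets need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_manifest(manifest: dict[str, str], root: Path) -> list[str]:
    """Return the names of files whose current hash no longer matches
    the hash recorded when the dataset was approved."""
    return [
        name for name, expected in manifest.items()
        if sha256_of(root / name) != expected
    ]
```

A non-empty return value means the data changed after sign-off, which is exactly the kind of integrity signal an auditor or incident responder needs.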
The healthcare sector also has an awkward constraint: you can’t “move fast and break things” when the “thing” is clinical workflow. Security has to be designed into systems from day one—especially when AI models are trained, deployed, monitored, and updated through complex pipelines.
The new baseline: secure-by-default AI pipelines
If you’re building AI-enabled healthcare systems, your baseline now includes:
- Identity and access management (IAM): tight controls over who can access datasets, labeling tools, model artifacts, and production endpoints.
- Data governance: classification, retention, and clear boundaries between training data, test data, and live clinical data.
- Secure software development lifecycle (SSDLC): threat modelling, code review, dependency management, and vulnerability remediation.
- Cloud security fundamentals: logging, monitoring, network segmentation, and secure storage.
These are learnable skills. Starting at 16–25 makes sense because habits form early: how people code, how they handle credentials, how they think about risk.
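Least privilege, for example, can start as something as plain as a role-to-permission map checked on every request. The roles and permission strings below are hypothetical, but the pattern is what entry-level engineers learn to apply:

```python
from enum import Enum, auto

class Role(Enum):
    CLINICIAN = auto()
    ML_ENGINEER = auto()
    AUDITOR = auto()

# Hypothetical permission map: each role gets only what its job requires.
# Note the ML engineer never touches live clinical data.
PERMISSIONS = {
    Role.CLINICIAN: {"live_clinical_data:read"},
    Role.ML_ENGINEER: {"training_data:read", "model_artifacts:write"},
    Role.AUDITOR: {"audit_logs:read"},
}

def can_access(role: Role, resource: str, action: str) -> bool:
    """Deny by default: access exists only if explicitly granted."""
    return f"{resource}:{action}" in PERMISSIONS.get(role, set())
```

The deny-by-default shape matters more than the specific roles: adding a permission is a deliberate, reviewable change rather than an accident.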
What CyberQuest gets right (and what employers should pay attention to)
Answer first: The programme’s structure—full pathway, hybrid delivery, and youth-led support—maps well to how people actually enter tech roles.
CyberQuest is designed as a learning pathway from foundational digital skills through to certification-level capability. That matters because cybersecurity isn’t a single skill; it’s a stack.
From the source programme description, there are a few design choices I strongly agree with.
Hybrid learning isn’t a compromise—it’s the modern default
Hybrid delivery does two valuable things:
- It mirrors real work environments, where teams collaborate across sites and borders.
- It widens access for learners who can’t commute, can’t relocate, or need flexible scheduling.
That second point is a big deal in December 2025. Many families are balancing cost pressures, and “travel to training” is often the first thing to break. Hybrid programmes keep momentum.
Trauma-informed + youth-led = better retention, not softer standards
Tech training still fails too many people by confusing “rigour” with “sink or swim.” A trauma-informed, youth-led approach doesn’t mean lowering the bar. It means removing unnecessary friction so learners can actually clear it.
In cybersecurity, confidence is a competency. People quit early if they think they’re “not technical enough.” Programmes that deliberately build resilience and leadership—like CyberQuest aims to do—create graduates who stick with hard problems.
“CyberQuest is not just about filling the digital and cyber skills gap. It is about transforming lives, creating inclusive pathways, building resilience and leaving a legacy of knowledge and capacity within the youth sector.”
That’s not just a nice quote. It’s a labour market strategy.
Cross-border collaboration is a preview of how health AI must operate
Answer first: Health data, threats, and AI supply chains don’t respect borders—so training shouldn’t either.
CyberQuest spans Northern Ireland and the border counties of Ireland. That’s timely because healthcare delivery is increasingly interconnected:
- Patients move across regions.
- Specialist services collaborate.
- Vendors host systems in multi-region clouds.
- AI tools rely on supply chains of libraries, APIs, and managed services.
Attackers already operate that way. Defence has to match.
Shared threats, shared playbooks
Whether you’re securing a youth organisation’s training environment or an NHS/HSE-adjacent system, common patterns show up:
- Phishing and credential theft
- Ransomware attempts
- Misconfigured cloud storage
- Third-party dependency vulnerabilities
Cross-border programmes can cultivate a shared language: basic threat modelling, incident triage, log interpretation, and safe escalation practices.
And in healthcare, “safe escalation” is everything. You need engineers who know when a security event becomes a patient safety issue.
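Log interpretation can begin with something this small: counting failed logins per source address to surface a likely credential-stuffing attempt, the kind of triage exercise a shared cross-border playbook might include. The event format and threshold are illustrative assumptions:

```python
from collections import Counter

def flag_bruteforce(events: list[dict], threshold: int = 5) -> set[str]:
    """Flag source IPs with repeated failed logins --
    a common first signal of credential theft in progress."""
    failures = Counter(
        e["src_ip"] for e in events if e.get("action") == "login_failed"
    )
    return {ip for ip, count in failures.items() if count >= threshold}
```

A learner who can write, run, and explain this has also learned the harder lesson: when the flagged account belongs to a clinical system, the finding escalates beyond the security team.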
The real pipeline: from CyberQuest to secure health software teams
Answer first: CyberQuest graduates can map into entry-level roles that directly support AI in healthcare and medical technology.
If you’re a health tech employer reading this, you don’t need every junior hire to be a penetration tester. You need people who can do secure work consistently.
Here are realistic pathways from a programme like CyberQuest into AI-focused technology and software development work—particularly in health and medtech.
Entry roles that matter for AI healthcare adoption
- Junior SOC / security analyst: monitors alerts, triages incidents, supports response.
- GRC assistant (governance, risk, compliance): helps with audits, evidence collection, policy rollouts—hugely relevant for regulated health environments.
- Junior cloud operations / platform support: manages access, logging, backups, and reliability.
- Secure software engineering intern: focuses on code quality, dependency scanning, basic threat modelling.
- Data operations / data steward assistant: supports data classification, access controls, and dataset hygiene.
These roles feed directly into safer AI implementation because they keep the basics strong: identity, availability, integrity, and auditability.
Practical skills to build during training (and how to show them)
Certifications help, but portfolios close the deal. If I were advising a CyberQuest learner aiming at health AI, I’d suggest building evidence like:
- A small app with role-based access control and a clear permissions model
- A demo CI/CD pipeline with dependency scanning and basic security checks
- A simple incident report write-up: timeline, impact, containment, and prevention
- A “secure dataset checklist” template for AI projects (access, retention, audit logs)
Hiring managers love candidates who can explain trade-offs clearly.
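The dataset checklist, for instance, doesn't have to stay a document; a learner could encode it as a small validator that blocks an AI project until every control is confirmed. The checklist items below are illustrative, not a compliance standard:

```python
from dataclasses import dataclass

@dataclass
class DatasetChecklist:
    """Hypothetical pre-flight checks for an AI training dataset.
    Everything defaults to False: controls must be confirmed, not assumed."""
    access_restricted: bool = False
    retention_period_set: bool = False
    audit_logging_enabled: bool = False
    no_live_clinical_data: bool = False

    def failures(self) -> list[str]:
        """List every control that has not been confirmed."""
        return [name for name, ok in vars(self).items() if not ok]

    def ready(self) -> bool:
        return not self.failures()
```

Being able to walk an interviewer through why each field exists is exactly the "explain the trade-offs" skill hiring managers look for.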
What organisations can do now (without waiting for a “talent miracle”)
Answer first: Treat youth programmes as strategic partners, not charity projects—and build on-ramps into real work.
CyberQuest is a strong start, but the ecosystem needs employers, health boards, vendors, and universities to connect the dots.
For healthcare and medtech employers
- Offer short, bounded placements: 2–4 weeks with clear tasks (logging review, IAM cleanup, documentation).
- Provide “safe” project environments: sandboxes that mimic real systems without exposing patient data.
- Mentor in public-facing skills: writing incident summaries, presenting risk, documenting controls.
- Create apprenticeship-style roles: not everyone needs a degree-first route.
For educators and programme partners
- Teach security using healthcare-flavoured scenarios:
  - securing appointment systems
  - protecting imaging workflows
  - handling lost devices in clinical settings
  - preventing data leakage from AI training datasets
When learners see the human impact, they stick with the hard parts.
For policymakers and funders
Keep funding tied to outcomes that matter:
- completion rates
- progression to apprenticeships/FE/HE
- job placement into junior technical roles
- retention at 6 and 12 months
Programmes don’t need hype. They need continuity.
The bigger point: secure AI needs a broader workforce than we admit
CyberQuest’s launch in Belfast is being framed as a cybersecurity opportunity for young people—and it is. But it’s also something else: a clue that Ireland is getting more serious about building the human infrastructure behind AI.
If your organisation is planning AI-enabled clinical tools in 2026, your risk won’t be “not enough models.” Your risk will be weak security fundamentals, unreliable data flows, and teams that can’t operationalise safe software.
CyberQuest-style pathways are how you fix that, one cohort at a time. The open question is whether employers will meet these learners halfway with real roles, real mentoring, and real progression—or whether we’ll keep complaining about a “skills shortage” while leaving talent on the sidelines.