€70M EIT funding targets 200,000 STEM learners by 2028. Here’s how Ireland can turn it into practical AI in healthcare skills, pilots, and startups.

€70M STEM Funding: A Fast Track for AI in Healthcare
Europe just put serious money behind a simple idea: you don’t get strong AI in healthcare without a strong STEM talent pipeline.
The European Institute of Innovation and Technology (EIT) has announced a new call under the EIT Higher Education Initiative, backed by up to €70 million, aimed at strengthening innovation and entrepreneurship skills across Europe’s STEM ecosystem. The headline target is concrete: train 200,000 STEM students, academics, and staff in innovation, entrepreneurship, and intellectual property management by 2028.
If you work anywhere near healthtech—hospital innovation teams, medtech founders, university labs, or software engineering groups building clinical tools—this matters. Not because “more funding is good” (true, but vague), but because this call is designed to change how universities and industry collaborate, and that’s exactly where most AI-in-healthcare projects either thrive or die.
What the EIT call actually changes (and why healthtech should care)
Answer first: This call shifts STEM training from “learn theory, then maybe innovate later” to “build, validate, and commercialise with partners now.” That’s the missing link for many AI-enabled medical technology projects.
The EIT Higher Education Initiative is funding projects (up to €2 million each, 24 months, running 2026–2028) that help universities build innovation capacity—things like venture creation support, entrepreneurship curricula, and IP literacy. The new twist is stronger cooperation between EIT Knowledge and Innovation Communities (KICs) and European Universities alliances, bringing scale and cross-border collaboration.
For AI in healthcare, that combination is gold. Most clinically useful AI systems require:
- Access to real-world clinical environments (not just synthetic datasets)
- Multidisciplinary teams (software engineering + clinical + regulatory + UX)
- A pathway to validation (clinical evaluation, evidence generation, procurement readiness)
- Commercialisation know-how (IP, pricing, reimbursement logic, partnerships)
Universities are strong on the first two. They’re often weaker on the last two. This programme exists to close that gap.
The quiet win: IP and entrepreneurship training for technical teams
A lot of AI engineering talent can build models, pipelines, and apps. Fewer people know how to protect inventions, negotiate data access, set up licensing, or create a company structure that investors and hospital buyers trust.
The call explicitly prioritises innovation, entrepreneurship, and intellectual property management training. In healthtech, that’s not “nice to have.” It’s the difference between:
- A promising research prototype
- And a product that can survive clinical scrutiny, procurement, and scale
Why Ireland’s AI-driven healthcare talent pipeline can benefit disproportionately
Answer first: Ireland is well-positioned because it already has dense clusters of software engineering, medtech manufacturing, and hospital systems—but it needs more cross-functional health AI talent to connect those dots.
Ireland’s tech sector has deep strengths in software development, cloud services, and data platforms. It also has a meaningful medtech footprint and a growing health innovation scene. The gap I keep seeing in real deployments is not “can we build AI?” It’s “can we build AI that fits clinical workflows, meets governance requirements, and proves value quickly?”
This EIT call can push Irish universities and industry partners to build repeatable training and delivery models for:
- Clinical AI product engineering (from dataset to monitored deployment)
- AI quality management (documentation, traceability, auditability)
- Regulatory-aware development (risk classification, evidence plans)
- Responsible AI operations in healthcare settings (bias checks, drift monitoring)
That’s also where the “AI in Technology and Software Development” series intersects perfectly with healthcare: health AI is a software delivery problem as much as a data science problem.
Seasonal reality check (December 2025): budgets reset, pipelines open
Late December is when many organisations:
- Lock next-year budgets
- Reassess university partnerships
- Decide which innovation projects will get executive sponsorship
This timing matters because the application deadline is 4 March 2026 (17:00 CET). If you want to be part of a consortium, January is when partnerships get formed—not the week before submission.
What projects should build: practical, fundable health AI directions
Answer first: The strongest proposals will combine education + real-world pilots + commercial pathways, especially in areas where hospitals have operational pain and measurable outcomes.
If you’re shaping a consortium proposal with a healthcare angle, avoid vague themes like “AI training.” Build around concrete deliverables. Here are examples that map well to the programme’s goals and to current hospital needs.
1) Hospital-ready AI software engineering programmes
Many AI education tracks overemphasise modelling and underemphasise shipping and maintaining clinical software.
A compelling project could train STEM learners (and staff) on:
- MLOps designed for healthcare environments
- Data lineage and audit trails for clinical datasets
- Model monitoring for drift and performance decay
- Secure deployment patterns (on-prem, hybrid, private cloud)
- Integration approaches (EHR interoperability basics, workflow embedding)
The measurable output isn’t just “trained people.” It’s a repeatable delivery playbook and a portfolio of pilots that prove the playbook works.
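To make “model monitoring for drift” concrete for a cohort: drift checks are often taught with a simple statistic such as the Population Stability Index (PSI), which compares the distribution of model scores at validation time against live scores. A minimal sketch, with synthetic data and the common (but not universal) 0.2 alarm threshold as illustrative assumptions:

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """PSI between a baseline and a live sample of model scores.
    A common rule of thumb treats PSI > 0.2 as a drift alarm."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    edges[0], edges[-1] = -np.inf, np.inf   # catch out-of-range live scores
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    e_pct = np.clip(e_pct, 1e-6, None)      # floor to avoid log(0)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.40, 0.10, 5000)  # validation-time model scores (synthetic)
live = rng.normal(0.50, 0.12, 5000)      # shifted production scores (synthetic)
print(round(population_stability_index(baseline, live), 3))
```

In a real deployment the baseline would come from the clinical validation dataset, the threshold would be set per use case, and an alarm would trigger review rather than automatic action.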
2) Evidence-generation “sprints” for AI-enabled medical technology
Health AI fails when teams can’t answer basic buyer questions:
- Does it improve outcomes or throughput?
- What’s the false positive cost in real workflows?
- Who is accountable when it’s wrong?
A fundable approach: create a shared curriculum and mentoring structure that teaches teams to run evidence sprints—short cycles that produce:
- A clinical evaluation plan
- A dataset and governance plan
- A bias and safety assessment
- A lightweight economic model (time saved, capacity gained)
These artefacts also make startups more investable and procurement conversations less painful.
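The “lightweight economic model” in that list is usually no more than transparent arithmetic that a buyer can check. A minimal sketch, where every number is a hypothetical pilot figure to be replaced with measured values:

```python
# Hypothetical pilot inputs, replace with measured values from the evidence sprint.
clinicians = 12                 # clinicians using the tool
minutes_saved_per_day = 25      # measured time saved per clinician per day
working_days_per_year = 220
hourly_cost_eur = 80            # loaded cost per clinician hour (assumption)

hours_saved_per_year = clinicians * minutes_saved_per_day / 60 * working_days_per_year
value_eur = hours_saved_per_year * hourly_cost_eur

print(f"{hours_saved_per_year:.0f} clinician-hours/year, roughly EUR {value_eur:,.0f}")
# -> 1100 clinician-hours/year, roughly EUR 88,000
```

The point is not precision; it is that every input is visible, measurable in a pilot, and arguable in a procurement meeting.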
3) AI entrepreneurship for clinicians and engineers together
The best healthtech teams rarely look like a typical software startup. You need someone who understands clinical constraints and someone who can build.
A strong project structure:
- Mixed cohorts (engineering + clinical trainees + health service staff)
- Startup formation support (incorporation, pricing, IP ownership, licensing)
- Partnerships with hospitals for pilot access
- Mentors who’ve shipped regulated products
This is exactly the “bridge” the EIT Director talks about: universities, business, and research pulling in one direction.
4) Intellectual property literacy for AI + data collaborations
IP in health AI is messy because value can sit in:
- Data access agreements
- Annotation processes
- Model architectures
- Workflow design
- Integration tooling and monitoring
Training 200,000 learners in IP management sounds abstract until you’ve watched a promising collaboration stall over ownership terms. A practical curriculum here reduces friction and speeds up partnerships.
How to build a winning consortium (without overcomplicating it)
Answer first: Keep the consortium tight, outcome-driven, and aligned to a real healthcare delivery environment.
The call encourages consortia of higher education institutions, businesses, research institutes, public bodies, and other non-academic organisations. In practice, the best ones usually include:
- 1–2 universities (curriculum + research + student pipeline)
- 1 hospital group or health system partner (real workflows + pilot site)
- 1–2 industry partners (engineering capacity, productisation, scale)
- Optional: a public body or innovation agency (alignment + dissemination)
Here’s what works in proposals I’ve seen succeed in similar programmes: design the project as a factory for repeatable outcomes, not a one-off pilot.
A simple structure that reviewers can understand
- Skills layer: train cohorts in innovation, entrepreneurship, and IP with health AI modules
- Pilot layer: run 2–4 small, measurable pilots inside partner healthcare settings
- Venture layer: support spin-outs or licensing paths (even if only 1–2 reach market)
- Scale layer: publish a toolkit and replicate across departments/universities
Metrics that make sense for AI in healthcare
Use numbers that match hospital realities:
- Time saved per clinician per week
- Reduction in admin backlog or reporting cycle time
- Improved triage throughput (patients/day)
- Reduction in missed follow-ups or appointment no-shows
- Fewer manual reviews per case in imaging/pathology workflows
Even if the programme’s target is training volume, healthcare partners will only stay engaged if the pilots produce operational value.
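A pilot report built on those metrics can be as simple as a before/after comparison per metric. A sketch with made-up figures (metric names and numbers are illustrative, not from any real pilot):

```python
# Hypothetical before/after pilot figures, illustrative only.
before = {"triage_per_day": 38, "no_show_rate": 0.14, "manual_reviews_per_100": 62}
after  = {"triage_per_day": 46, "no_show_rate": 0.11, "manual_reviews_per_100": 41}

for metric in before:
    b, a = before[metric], after[metric]
    change = (a - b) / b * 100
    print(f"{metric}: {b} -> {a} ({change:+.1f}%)")
# triage_per_day: 38 -> 46 (+21.1%)
# no_show_rate: 0.14 -> 0.11 (-21.4%)
# manual_reviews_per_100: 62 -> 41 (-33.9%)
```

Keeping the comparison this plain makes it easy for clinical and operations staff to challenge the numbers, which is exactly what builds trust in the pilot.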
“People also ask” (and the practical answers)
Who should apply for this funding?
Answer: Consortia led by higher education institutions, ideally with industry and public-sector partners. For health AI, include at least one real clinical environment partner.
Is this only for students?
Answer: No. The programme explicitly includes academics and non-academic staff. In healthcare, that’s huge—innovation offices, data protection teams, IT, clinical engineering, and operations staff are often the bottleneck.
What’s the biggest mistake health AI teams make in these programmes?
Answer: Treating it as an education grant instead of an innovation delivery programme. Training is the input. Pilots, ventures, and repeatable collaboration models are the output.
Where this fits in the “AI in Technology and Software Development” series
A lot of our series focuses on automation, cloud optimisation, cybersecurity, and large-scale data analytics. Health AI forces those themes to grow up.
Healthcare is where software teams get tested on:
- Reliability and monitoring, not just feature velocity
- Governance and auditability, not just model accuracy
- Integration into legacy systems, not just greenfield builds
- Human factors, not just dashboards
If Europe wants more AI-enabled medical technology that actually ships, this EIT call is pointed at the right lever: skills + entrepreneurship + partnerships.
What to do next (if you want to turn this into leads and real projects)
The application deadline is close enough that “we should look into this” won’t cut it. If you’re a university, hospital innovation lead, or healthtech company, the next steps are straightforward:
- Pick one healthcare problem where AI can deliver measurable operational value in 6–12 months
- Identify your consortium core (university + clinical partner + industry build partner)
- Define 2–4 pilots with clean success metrics and clear data governance boundaries
- Build an IP and commercialisation plan that won’t scare partners away
If you want Europe—and Ireland in particular—to have a stronger AI in healthcare workforce, the fastest route is to stop training in isolation and start training inside real delivery environments.
So here’s the question I’d use to pressure-test any proposal: Will your graduates be able to ship, validate, and maintain clinical AI software—or just talk about it?