Open-Source Robots Are Fixing the STEM Access Gap

AI in Robotics & Automation
By 3L3C

Open-source robots are lowering cost barriers and widening who gets to build AI automation. Here’s what leaders can copy from Carlotta Berry’s approach.

Tags: Open-Source Hardware, Robotics Education, AI Automation Talent, Workforce Development, Women in Engineering, Inclusive Innovation

Only about 8% of electronics engineers are women and around 5% are Black—numbers that should make anyone building the AI and robotics workforce uncomfortable. Because when the talent pool is that narrow, the products we ship—and the automation we deploy—end up reflecting the blind spots of the people in the room.

Carlotta Berry, a professor of electrical and computer engineering at Rose-Hulman Institute of Technology, has a blunt origin story for her mission: as a student in the ’80s and ’90s, she studied robotics but wasn’t allowed to touch the robots because they were too expensive. At the same time, she was often one of the only women or Black students in her program. Those two barriers—cost and belonging—show up together more often than people admit.

Berry’s response is practical, not symbolic: build open-source, low-cost, modular robots (including 3D-printed designs like her LilyBot) and put them in learners’ hands—college students, K–12 kids, librarians, and teachers. For teams working in AI in robotics & automation, this is more than a feel-good education story. It’s a workforce development strategy and a product-quality strategy wrapped into one.

The real bottleneck in AI robotics isn’t algorithms

The bottleneck is access to reps—repetitions of building, breaking, debugging, and trying again.

AI-powered robots don’t become useful in manufacturing, logistics, or healthcare because someone watched a lecture on reinforcement learning. They become useful because technicians, engineers, and operators have touched real hardware enough times to develop instincts:

  • Sensors lie, drift, and saturate
  • Motors stall at the worst moment
  • Battery voltage sag changes everything
  • “Works on my desk” fails on the shop floor

Berry’s undergrad experience—robots kept behind glass because they were expensive—still happens today, just in updated forms. Maybe it’s one shared mobile robot cart for 120 students. Maybe it’s a “simulation-first” curriculum that never graduates to real-world failure modes.

Open-source educational robots remove the scarcity mindset. When the platform is affordable and reproducible, the lab stops being a museum. Students stop asking permission to experiment.

Snippet-worthy truth: Hands-on robotics education isn’t a “nice to have.” It’s the fastest path to building people who can ship reliable automation.

Why open-source robots change who gets to participate

Open-source hardware does something subtle and powerful: it shifts learners from consumers to builders.

Berry’s work focuses on open-source, 3D-printed wheeled robots that learners can assemble, program, and modify. That openness matters for inclusion in three concrete ways.

1) Cost drops—and the learning curve gets kinder

When a platform is priced like a fragile asset, instructors guard it. Students hesitate. Mistakes feel expensive.

Low-cost robotics kits (especially modular designs with printable parts) make failure normal. And failure is where robotics skills actually form.

Here’s what I’ve found in training environments: if learners can replace a part quickly and cheaply, they try bolder ideas. That’s where you get the “a-ha” moments about feedback control, sensor fusion, and autonomy.

2) Openness invites remixing, not gatekeeping

With closed educational robots, the real value is locked behind proprietary firmware, obscure connectors, or limited extension points. Open-source designs encourage:

  • Hardware mods (new sensors, new mounts, different wheelbases)
  • Software swaps (from simple rule-based behaviors to ROS nodes)
  • Student ownership (they can read the code, not just run it)

This is exactly how AI robotics advances in industry: iterate, integrate, and validate.

3) Community grows faster than any single campus

Berry didn’t keep her outreach confined to a classroom. She took robots to schools, libraries, museums, and educator workshops—what she calls taking robots “to the streets.” That model scales because it trains multipliers: teachers and librarians who can run programs long after the demo.

In 2025 terms, this matters because the demand signal is loud: organizations want automation talent, but many regions don’t have robotics pipelines. Open-source platforms can seed those pipelines without waiting for a major capital investment.

A better on-ramp to AI in automation: sense, plan, act

Berry teaches audiences the three pillars of robotics—sense, plan, act—using accessible demonstrations with sensors like sonar, plus audio inputs/outputs (microphone and speaker) so kids can see (and hear) what a robot is doing.

That structure is also one of the cleanest ways to introduce AI for robotics without overcomplicating it.

Sense: turn the world into signals

Start with a small sensor set learners can understand:

  • Distance sensing (sonar, IR, ToF)
  • Simple audio cues (claps, tones)
  • Bump sensors (binary but honest)

Then teach one core lesson: sensors are noisy, biased, and incomplete. Even an “AI robot” is only as good as the signals it can trust.
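Here's a minimal sketch of that lesson in Python. The driver call, window size, and range limits are placeholders rather than any specific kit's API: the point is that raw readings get clamped and de-spiked before anything downstream is allowed to trust them.

```python
from collections import deque
from statistics import median
from typing import Callable

SONAR_MIN_CM = 3.0    # below this, cheap sonars tend to saturate or echo off the chassis
SONAR_MAX_CM = 300.0  # beyond this, readings are mostly noise

class FilteredSonar:
    """Wrap a raw distance reader so downstream logic only sees clamped, de-spiked values."""

    def __init__(self, read_raw_cm: Callable[[], float], window: int = 5):
        self.read_raw_cm = read_raw_cm  # platform-specific driver call, injected by the learner
        self.window = deque(maxlen=window)

    def read(self) -> float:
        raw = self.read_raw_cm()
        # Don't trust values outside the sensor's physical range: clamp them.
        clamped = min(max(raw, SONAR_MIN_CM), SONAR_MAX_CM)
        self.window.append(clamped)
        # A median window rejects single-sample spikes better than an average does.
        return median(self.window)

# Example with a fake driver standing in for real hardware:
if __name__ == "__main__":
    fake_readings = iter([98.0, 97.5, 9999.0, 96.8, 2.0, 95.9])  # includes a spike and a saturation
    sonar = FilteredSonar(lambda: next(fake_readings))
    print([round(sonar.read(), 1) for _ in range(6)])
```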

Plan: rules first, then machine learning

Most companies get this wrong: they jump straight to “add AI” before learners understand baseline behaviors.

A strong progression looks like:

  1. Deterministic logic (if obstacle < 20cm, turn)
  2. State machines (search → approach → avoid → recover)
  3. Classical control (PID for speed/heading)
  4. Learning components (line-following with a small model, anomaly detection, simple RL in constrained tasks)

That sequence teaches good engineering judgment: prove the simplest solution first, then add intelligence where it clearly reduces errors or increases flexibility.
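To make steps 1 and 2 concrete, here's a rough sketch of the same obstacle rule, first as a single deterministic check and then as a small state machine. The mode names and the 20 cm threshold mirror the list above; the function names are illustrative, not a particular curriculum's API.

```python
from enum import Enum, auto

OBSTACLE_CM = 20.0  # threshold from the deterministic rule above

# Step 1: deterministic logic -- one rule, no state.
def plan_simple(distance_cm: float) -> str:
    return "turn" if distance_cm < OBSTACLE_CM else "forward"

# Step 2: a state machine makes the behavior explicit and testable.
class Mode(Enum):
    SEARCH = auto()
    AVOID = auto()
    RECOVER = auto()

def plan_state_machine(mode: Mode, distance_cm: float) -> tuple[Mode, str]:
    """Return the next mode and a motion command for this control tick."""
    if mode is Mode.SEARCH:
        if distance_cm < OBSTACLE_CM:
            return Mode.AVOID, "stop"
        return Mode.SEARCH, "forward"
    if mode is Mode.AVOID:
        if distance_cm >= OBSTACLE_CM:
            return Mode.RECOVER, "forward"
        return Mode.AVOID, "turn"
    # RECOVER: drive straight briefly before searching again (timing omitted in this sketch).
    return Mode.SEARCH, "forward"
```

The same loop can later swap in a PID controller or a learned policy where these rules sit, which is exactly why proving the simple version first pays off.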

Act: make motion predictable

The “act” part is where robotics becomes real. Students learn why motor calibration, wheel slip, and battery voltage turn perfect plans into messy trajectories.
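A small sketch of what that discipline looks like in code, assuming hypothetical per-wheel trim values measured during calibration and a stand-in motor call:

```python
NOMINAL_BATTERY_V = 7.4
# Measured during calibration: this robot's left motor runs about 4% slow, so trim it up.
LEFT_TRIM = 1.04
RIGHT_TRIM = 1.00

def set_motor_pwm(left: float, right: float) -> None:
    """Stand-in for the platform's motor driver call; prints instead of driving hardware."""
    print(f"PWM left={left:.2f} right={right:.2f}")

def drive(speed: float, battery_v: float) -> None:
    """Command both wheels at `speed` in [-1, 1], correcting for motor mismatch and voltage sag."""
    # A sagging battery delivers less torque for the same duty cycle, so scale the command up.
    voltage_gain = NOMINAL_BATTERY_V / max(battery_v, 1e-3)
    left = max(-1.0, min(1.0, speed * LEFT_TRIM * voltage_gain))
    right = max(-1.0, min(1.0, speed * RIGHT_TRIM * voltage_gain))
    set_motor_pwm(left, right)

if __name__ == "__main__":
    drive(0.5, battery_v=7.4)  # fresh battery
    drive(0.5, battery_v=6.6)  # sagging battery: commands scale up to hold the same speed
```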

For the AI in Robotics & Automation series, this is the bridge to industrial settings: predictability is what makes automation safe and economical. A mobile robot in a warehouse and a cobot on an assembly line both need the same discipline—repeatable actuation and measurable performance.

Representation isn’t a side quest—it’s product risk management

Berry’s second formative experience was being one of very few women or Black students in engineering. She later heard Black women grad students in 2020 describe the same isolation she felt decades earlier.

The common corporate response is to treat representation as branding. I think that’s a mistake.

In robotics and automation, limited representation becomes product risk:

  • Safety features miss edge cases in real workplaces
  • User interfaces don’t match operator mental models
  • Field deployment fails because teams didn’t anticipate environmental variability
  • Hiring pipelines produce homogeneous teams that debug in similar (and similarly flawed) ways

Diverse teams don’t guarantee perfect products. But homogeneous teams almost guarantee repeatable blind spots.

Berry’s co-founding of Black in Engineering and Black in Robotics (part of the broader Black in X network) highlights another point: community creates retention. People stay in a field when they can see a future self there.

Snippet-worthy truth: Inclusion work isn’t separate from technical work in robotics; it changes who is around to solve the technical problems.

What organizations can copy from Berry’s playbook (starting next quarter)

If your business depends on AI-driven automation—manufacturing, healthcare robotics, logistics, service robotics—this story should push you toward action. The good news: you don’t need to replicate a full university program to get the benefits.

Build an “accessible robotics stack” for training

A practical stack for hands-on robotics education should include:

  • A low-cost mobile robot base (ideally open-source, repairable)
  • A sensor starter kit (distance + IMU + optional camera)
  • A simple curriculum path: sense → plan → act, then add autonomy
  • A shared software baseline (Python + microcontroller code, or ROS for advanced cohorts)

Your goal is repeatable training, not a one-off workshop.
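One way to make that shared baseline tangible is a single sense → plan → act loop that every cohort starts from and extends. This is a sketch with stubbed functions, not a reference implementation; each stub gets replaced with the real sensor, planner, and motor code as learners progress.

```python
import time

# Stubs for the three pillars; each cohort replaces these with real implementations.
def sense() -> dict:
    """Read sensors and return a dictionary of filtered measurements."""
    return {"distance_cm": 100.0, "battery_v": 7.4}

def plan(observation: dict) -> str:
    """Pick a motion command; starts as rules, can later become a learned policy."""
    return "turn" if observation["distance_cm"] < 20.0 else "forward"

def act(command: str) -> None:
    """Send the command to the motors (stubbed here as a print)."""
    print(f"executing: {command}")

def run(loop_hz: float = 10.0, duration_s: float = 5.0) -> None:
    """Fixed-rate control loop shared across cohorts, whatever sits inside the three stubs."""
    period = 1.0 / loop_hz
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        act(plan(sense()))
        time.sleep(period)

if __name__ == "__main__":
    run()
```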

Sponsor “train-the-teacher” programs, not just student demos

Berry’s outreach includes professional development for educators. That’s the multiplier.

If you’re funding STEM access, prioritize programs that:

  • Train teachers to run labs independently
  • Provide parts lists and repair guides
  • Offer classroom-ready modules (45–90 minutes) plus extensions

A single teacher trained well can reach hundreds of learners over a few years.

Use open-source robotics to widen your hiring funnel

Open-source platforms make skills more legible. Candidates can show:

  • Build logs and design modifications
  • Code repositories and tests
  • Videos demonstrating autonomy behaviors

That evidence often predicts job performance better than pedigree.

Design your entry-level roles around hardware reality

If you hire “AI robotics engineers” but your onboarding is all simulations, you’ll lose people quickly.

A better approach:

  1. Week 1–2: hardware bring-up, calibration, failure diagnosis
  2. Week 3–4: sensor processing + baseline navigation
  3. Week 5–6: AI add-ons where they measurably help (perception, anomaly detection)

This mirrors the sense-plan-act learning arc and reduces early churn.

Where this fits in the AI in Robotics & Automation story

This post is part of a bigger theme: AI makes robots smarter, but access makes the field stronger. The industry can’t complain about a robotics talent shortage while keeping robotics education behind expensive lab doors.

Berry’s work shows a cleaner model: open-source robots, hands-on learning, community-driven visibility, and an insistence that learners—especially those historically marginalized—get to be builders.

If you’re leading automation initiatives, the next step is straightforward: audit your pipeline. Where do learners touch real robots? How often? Who gets included by default, and who needs an extra invitation?

The question I keep coming back to is this: What would your automation roadmap look like if you treated inclusive, open robotics training as core infrastructure—not charity?
