When we imagine drones exploring Mars, we picture sleek machines zipping over rusty plains, weaving through dust storms and rocky craters — all without a human pilot or a GPS signal to guide them. But building the brains for that kind of flight is no small feat.
A new paper, developed through a collaboration led by Parallax Advanced Research, the University of Dayton’s Vision Lab, and the University of Cincinnati’s Digital Futures Lab, shows what’s possible when AI, neuroscience, and space exploration collide.
A Neuromorphic Breakthrough for Off-Earth Autonomy
At the heart of the project is VelocitySNN-Fuzzy, a biologically inspired AI system that helps drones estimate their own velocity in three dimensions — no GPS required. The system combines event-driven sensors with spiking neural networks (SNNs) that work more like a human brain than traditional AI models.
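To give a sense of how a spiking network differs from a conventional neural network, here is a minimal, illustrative sketch of a leaky integrate-and-fire neuron, the basic unit most SNNs are built from. This is not the team’s actual model; the function name, constants, and input values are assumptions chosen for illustration only.

```python
import numpy as np

def lif_step(v, input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """One step of a leaky integrate-and-fire neuron (illustrative values).

    The membrane potential leaks toward zero, integrates the input current,
    and emits a binary spike when it crosses threshold -- information is
    carried by the timing of spikes rather than continuous activations.
    """
    v = v + (dt / tau) * (-v + input_current)   # leaky integration
    spike = v >= v_thresh                       # fire if threshold crossed
    v = np.where(spike, v_reset, v)             # reset after a spike
    return v, float(spike)

# Drive one neuron with a noisy input and record its spike train.
rng = np.random.default_rng(0)
v, spikes = 0.0, []
for _ in range(200):
    v, s = lif_step(v, input_current=rng.uniform(0.5, 2.5))
    spikes.append(s)
print("spike count over 200 steps:", int(sum(spikes)))
```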
“It addresses one of the biggest challenges in off-Earth autonomy: how to perceive motion and navigate in environments where conventional tools like GPS or standard cameras fail, such as on Mars,” says Dr. Steve Harbour, director of AI Hardware Research at Parallax Advanced Research and Ohio Aerospace Institute, and the project’s lead researcher.
Caption: Dr. Steve Harbour, director of AI Hardware Research at Parallax Advanced Research and Ohio Aerospace Institute, and the project’s lead researcher.
Caption: David Harbour, Electrical Engineering undergraduate researcher at the University of Dayton
This work demonstrates that neuromorphic AI — energy-efficient, fast, and inherently robust — can tackle the harsh, sparse, and unpredictable conditions on the Red Planet.
The project began as an idea kicked around between students and faculty at the University of Dayton and the University of Cincinnati. Dr. Harbour’s son, David Harbour, an Electrical Engineering undergraduate researcher at the University of Dayton, was instrumental in planting the early seeds.
“David sparked early concepts involving spiking neural networks and event cameras,” says Dr. Harbour. “From there, it grew into a full collaboration involving neuromorphic computing, fuzzy logic, and Martian analog data.”
The University of Dayton Vision Lab, under Dr. Vijayan Asari, provided a testbed for the idea. David Harbour co-developed the event-density encoding pipeline and built the early SNN models. For Dr. Harbour, it became a rare opportunity to mentor his son on real-world space technology.
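The paper’s exact encoding is not reproduced here, but the general idea of event-density encoding can be sketched as follows: bin the event camera’s (timestamp, x, y, polarity) stream into a coarse spatial grid for each short time window, so the density of events in each cell becomes the input drive for the spiking network. All array shapes, sensor dimensions, and window sizes below are assumptions for illustration.

```python
import numpy as np

def event_density_frames(events, sensor_hw=(260, 346), grid_hw=(16, 16),
                         window_us=10_000):
    """Bin an event stream into per-window spatial density grids (illustrative).

    `events` is an (N, 4) array of (t_us, x, y, polarity); the output is a
    stack of coarse grids whose cells count how many events landed in that
    region during each time window -- a compact summary a spiking network
    can consume as input current.
    """
    t = events[:, 0]
    frames = []
    for start in np.arange(t.min(), t.max(), window_us):
        mask = (t >= start) & (t < start + window_us)
        xs = (events[mask, 1] * grid_hw[1] // sensor_hw[1]).astype(int)
        ys = (events[mask, 2] * grid_hw[0] // sensor_hw[0]).astype(int)
        grid = np.zeros(grid_hw)
        np.add.at(grid, (ys, xs), 1.0)            # accumulate event counts
        frames.append(grid / max(mask.sum(), 1))  # normalize per window
    return np.stack(frames)

# Synthetic example: 5,000 random events over 50 ms.
rng = np.random.default_rng(1)
events = np.column_stack([
    rng.uniform(0, 50_000, 5_000),   # timestamps in microseconds
    rng.integers(0, 346, 5_000),     # x pixel
    rng.integers(0, 260, 5_000),     # y pixel
    rng.integers(0, 2, 5_000),       # polarity
])
print(event_density_frames(events).shape)  # -> (windows, 16, 16)
```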
Simulating Mars — with Spikes
To mimic Martian conditions, the team ran simulations using first-person-view drone datasets designed to replicate low-light, high-dynamic-range scenarios like those found on Mars.
The environment on Mars poses two big problems: visual sparseness and unpredictable lighting. By using event cameras — sensors that detect changes in brightness rather than static images — and SNNs trained to extract motion cues from those spikes of data, the team found a reliable way to estimate velocity in three dimensions.
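As a hedged illustration of the final step — turning spikes back into numbers — one common readout (not necessarily the one used in the paper) counts spikes from antagonistic pairs of output neurons and maps the rate difference in each pair to a signed velocity component along one axis. The population layout and scaling below are assumptions.

```python
import numpy as np

def decode_velocity(output_spikes, v_max=5.0):
    """Map output spike trains to a 3-D velocity estimate (illustrative).

    `output_spikes` has shape (timesteps, 6): two antagonistic neuron groups
    per axis (+x/-x, +y/-y, +z/-z).  The normalized difference in firing
    rates gives a signed, bounded velocity along each axis, in m/s.
    """
    rates = output_spikes.mean(axis=0)               # firing rate per neuron
    pos, neg = rates[0::2], rates[1::2]              # plus/minus groups
    return v_max * (pos - neg) / (pos + neg + 1e-6)  # signed estimate per axis

# Toy example: the "+x" and "-z" neurons fire more often than their partners.
rng = np.random.default_rng(2)
p = np.array([0.8, 0.2, 0.5, 0.5, 0.1, 0.7])         # spike probabilities
spikes = (rng.random((300, 6)) < p).astype(float)
print(np.round(decode_velocity(spikes), 2))          # roughly [+3, 0, -3.7] m/s
```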
An unexpected discovery? The system’s robustness. Even though it was trained on Earth-like data, it generalized well to Martian analog environments. Adding a fuzzy logic layer further stabilized the system, making the “black box” more transparent for mission planners who need to trust AI decisions.
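The fuzzy layer’s role can be sketched in a much simplified form, with made-up membership functions and rules: it weights how much to trust the raw SNN estimate, so that when the event stream is sparse the output leans more heavily on its recent history. This is an assumption-laden sketch of the general technique, not the paper’s rule base.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function, a common fuzzy-logic building block."""
    return np.clip(np.minimum((x - a) / (b - a + 1e-9),
                              (d - x) / (d - c + 1e-9)), 0.0, 1.0)

def fuzzy_smooth(raw_velocity, prev_velocity, event_density):
    """Blend the new SNN estimate with the previous one (illustrative rules).

    Rule idea: IF event density is HIGH THEN trust the new estimate;
    IF event density is LOW THEN lean on the previous estimate.
    The defuzzified trust weight interpolates between the two.
    """
    high = trapezoid(event_density, 0.2, 0.5, 1.0, 1.5)  # "scene is rich"
    low = 1.0 - high                                      # "scene is sparse"
    trust = (0.9 * high + 0.2 * low) / (high + low)       # weighted centroid
    return trust * raw_velocity + (1.0 - trust) * prev_velocity

# Sparse scene: the smoothed output stays close to the previous estimate.
print(fuzzy_smooth(np.array([2.0, 0.0, -1.0]),
                   np.array([1.5, 0.1, -0.8]), event_density=0.1))
```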
Bridging Neuromorphic AI and Autonomous Control
The collaboration with Dr. Kelly Cohen and the University of Cincinnati Digital Futures Lab tightened the link between perception and control. UC’s expertise in fuzzy logic, control theory, and multi-agent autonomy helped integrate the velocity estimation into larger flight navigation and decision-making pipelines.
“They helped us test how our SNN outputs could plug into real control systems, making the AI not just smart but understandable,” said Dr. Harbour.
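A hedged sketch of what “plugging into a real control system” can look like: a simple proportional velocity loop that converts the commanded-minus-estimated velocity error into attitude and thrust commands. The gains, axis conventions, and interface here are placeholders, not UC’s actual controller.

```python
import numpy as np

def velocity_controller(v_cmd, v_est, kp=np.array([0.4, 0.4, 0.8])):
    """Proportional velocity loop (illustrative gains and axes).

    The horizontal velocity errors become pitch/roll setpoints and the
    vertical error becomes a thrust adjustment -- the kind of interface a
    perception module's velocity estimate would feed in practice.
    """
    error = np.asarray(v_cmd) - np.asarray(v_est)
    pitch_cmd = kp[0] * error[0]     # forward error -> pitch forward
    roll_cmd = kp[1] * error[1]      # lateral error -> roll
    thrust_delta = kp[2] * error[2]  # vertical error -> thrust change
    return pitch_cmd, roll_cmd, thrust_delta

# Commanded 1 m/s forward hover; the perception pipeline reports slight drift.
print(velocity_controller(v_cmd=[1.0, 0.0, 0.0], v_est=[0.7, 0.1, -0.05]))
```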
Together, the labs ran joint experiments that combined real-world and synthetic terrain data to validate the system across different Martian-like scenarios.
Powered by OSGC — and Student Ingenuity
The project wouldn’t have gotten off the ground without funding from the Ohio Space Grant Consortium (OSGC), which provided resources for early-stage experiments, hardware, and, crucially, student researchers. The grant enabled rapid development from concept to prototype and fostered cross-institutional teamwork.
OSGC is a statewide network of colleges and universities working to expand opportunities for Ohioans to understand and participate in NASA’s aeronautics and space projects by supporting and enhancing Science, Technology, Engineering, and Mathematics (STEM) through scholarships, fellowships, higher education, research infrastructure, pre-college (K-12), and informal education public outreach efforts. The Space Grant national network includes more than 1,000 affiliates from universities, colleges, industry, museums, science centers, and state and local agencies. OSGC is managed by the Ohio Aerospace Institute, which is wholly affiliated with Parallax Advanced Research.
“It was competitive but straightforward,” said Tim Hale, OSGC program manager. “We highlighted how students were driving the technical work — that made a huge impact.”
From Mars to Earth — and Back Again
The implications go far beyond Mars. This kind of low-power, explainable neuromorphic AI could support GPS-denied drones on Earth, enable safer search-and-rescue missions, and add a layer of cybersecurity for flight systems that need independent velocity checks.
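For the “independent velocity check” use case, one plausible pattern — a hypothetical sketch, not something described in the paper — is a watchdog that compares the neuromorphic estimate against the flight computer’s own estimate and raises a flag when they diverge for too long.

```python
import numpy as np

def velocity_watchdog(snn_v, nav_v, threshold=1.0, max_strikes=5):
    """Flag sustained disagreement between two velocity sources (illustrative).

    `snn_v` and `nav_v` are (timesteps, 3) velocity histories.  If the
    Euclidean difference exceeds `threshold` m/s for `max_strikes`
    consecutive steps, the check fails -- a cue to distrust one source.
    """
    diff = np.linalg.norm(np.asarray(snn_v) - np.asarray(nav_v), axis=1)
    strikes = 0
    for d in diff:
        strikes = strikes + 1 if d > threshold else 0
        if strikes >= max_strikes:
            return False   # sustained divergence detected
    return True            # estimates agree within tolerance

nav = np.zeros((20, 3))
spoofed = nav.copy()
spoofed[10:, 0] = 3.0     # injected bias after step 10
print(velocity_watchdog(nav + 0.05, nav), velocity_watchdog(spoofed, nav))
```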
A second phase, called SpikeNav+, is already underway. The team is fusing VelocitySNN with new terrain-mapping capabilities and prepping real-time demonstrations on next-generation neuromorphic chips.
For Dr. Harbour, the project is proof that frontier research doesn’t just live in high-level theory — it can be built, tested, and deployed with student researchers at the core.
“Honestly, seeing undergraduate researchers — especially my son — play such a key role was the most rewarding part,” he says. “We weren’t just publishing; we were inventing something that could fly on another planet.”
His message to other innovators combining AI, neuroscience, and space?
“Do it. The fields are converging faster than you think. Neuromorphic computing gives us a biological template for building intelligent systems, and space gives us the perfect testbed — harsh, beautiful, and unforgiving. There’s no better place to innovate responsibly and meaningfully.”
This material is based upon work supported by NASA under award No. OSG – NSSC25M7134. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Aeronautics and Space Administration.
###
About Parallax Advanced Research & the Ohio Aerospace Institute
Parallax Advanced Research is an advanced research institute that tackles global challenges through strategic partnerships with government, industry, and academia. It accelerates innovation, addresses critical global issues, and develops groundbreaking ideas with its partners. In 2023, Parallax and the Ohio Aerospace Institute, an aerospace research institute located in Cleveland, OH, formed a collaborative affiliation to drive innovation and technological advancements across Ohio and the nation. The Ohio Aerospace Institute plays a pivotal role in advancing aerospace through collaboration, education, and workforce development. More information can be found at parallaxresearch.org and oai.org.