"CONNECTING RESEARCH TO REALITY"

Tiny Brain-Like Technology Gives Robots Real-Time Vision Processing

Australian researchers have developed a brain-inspired neuromorphic device that enables real-time vision processing in robots without external computing. This breakthrough mimics how human eyes and brains handle visual data instantly and efficiently, promising major advancements in autonomous technology.

π’π‡πˆππ€π’πˆπ’ 𝐑𝐀𝐓𝐇

5/15/2025 · 2 min read

Real-Time Vision Processing Powers Next-Gen Robots

Researchers at RMIT University have developed an innovative neuromorphic device that processes visual information much as the human brain does. The device can detect movement, store visual memories, and do all of this in real time without relying on external computers, marking a significant step forward for autonomous technology.

"This proof-of-concept device mimics the human eye's ability to capture light and the brain's ability to process that visual information, enabling it to sense a change in the environment instantly and make memories without using huge amounts of data and energy," said Professor Sumeet Walia, who led the research team at RMIT University.

Mimicking Brain Cell Behavior

At the core of this innovation is molybdenum disulfide (MoSβ‚‚), a metal compound with atomic-scale defects that converts light into electrical signals much like neurons in the human brain.

The researchers found that ultra-thin layers of MoSβ‚‚, produced through chemical vapor deposition, can accurately replicate the "leaky integrate-and-fire" (LIF) behavior of biological neuronsβ€”the fundamental mechanism behind how our brains process information.

"We demonstrated that atomically thin molybdenum disulfide can accurately replicate the leaky integrate-and-fire neuron behavior, a fundamental building block of spiking neural networks," said Thiha Aung, a PhD scholar at RMIT and first author of the study.

By adjusting the gate voltage, the system can quickly reset itself, enabling faster response times similar to actual brain functions.
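
For readers curious what "leaky integrate-and-fire" means in practice, the sketch below is a minimal software model of an LIF neuron: input accumulates in a membrane potential that slowly leaks away, and once a threshold is crossed the neuron "fires" a spike and resets. This illustrates only the abstract dynamics described above; the parameter values and the NumPy implementation are assumptions for illustration, not the MoSβ‚‚ device physics.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch (illustrative parameters only).
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Integrate an input drive over time and emit spikes.

    input_current: 1-D array of input drive at each time step.
    Returns the membrane-potential trace and a boolean spike train.
    """
    v = v_rest
    voltages, spikes = [], []
    for i_t in input_current:
        # Leaky integration: potential decays toward rest and rises with input.
        dv = (-(v - v_rest) + i_t) * (dt / tau)
        v += dv
        fired = v >= v_threshold
        if fired:
            v = v_reset  # fire, then reset so the neuron is ready for the next event
        voltages.append(v)
        spikes.append(fired)
    return np.array(voltages), np.array(spikes)

# Example: a constant drive above threshold produces a regular spike train.
current = np.full(200, 1.5)
v_trace, spike_train = simulate_lif(current)
print(f"Spikes emitted: {spike_train.sum()}")
```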

Impressive Results

The team built a spiking neural network (SNN) using the MoSβ‚‚ material's light-response characteristics. Testing showed promising results:

  • 75% accuracy on static image tasks after just 15 training cycles

  • 80% accuracy on dynamic tasks after 60 cycles

  • Efficient edge detection for hand movement tracking without frame-by-frame processing

This approach significantly reduces data processing requirements and power consumption compared to conventional computer vision systems.
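
To make the "no frame-by-frame processing" idea concrete, the sketch below shows a simplified, event-driven change detector in software: only pixels whose brightness changes beyond a threshold produce events, so a static scene generates almost no data. This is an illustrative analogue under assumed thresholds and inputs, not the RMIT spiking network or hardware.

```python
# Simplified software analogue of event-driven change detection (illustrative only).
import numpy as np

def change_events(prev_frame, frame, threshold=0.1):
    """Return (row, col, polarity) events where brightness changed noticeably."""
    diff = frame.astype(float) - prev_frame.astype(float)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# Example: a mostly static scene with a single changed pixel.
prev = np.zeros((8, 8))
curr = prev.copy()
curr[3, 4] = 1.0  # the only change in the scene
events = change_events(prev, curr)
print(events)  # [(3, 4, 1)] -- one event instead of a full 8x8 frame
```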

Applications for Autonomous Technology

The innovation could dramatically improve how autonomous vehicles and advanced robots respond to visual inputs, particularly in high-risk or rapidly changing environments.

By detecting scene changes instantly with minimal data processing, the technology enables faster, more efficient reactions. This could enhance human-robot interaction in manufacturing, personal assistance, and other fields requiring visual processing.

Next Steps in Development

Researchers are now scaling up their single-pixel prototype into a larger MoSβ‚‚-based pixel array with support from new research funding. Their plans include:

- Optimizing the device for more complex vision tasks

- Improving energy efficiency

- Integrating the technology with conventional digital systems

- Exploring other materials to expand capabilities into the infrared range for applications like emission tracking and environmental sensing

The research details were published in the journal Advanced Materials Technologies.

With this breakthrough, we may soon see robots and autonomous vehicles that can "see" and respond to their environments with human-like speed and efficiency, opening new possibilities for applications ranging from industrial automation to healthcare.
