ASU’s Four-legged AI Robodog Champions a Caring Robotics Future

Arizona State University’s LENS Lab has unleashed an innovation with bite: the Unitree Go2 robotic dog. This nimble mechanical quadruped isn’t here to fetch – it’s engineered to save lives and make the world more accessible.

Outfitted with cutting-edge LiDAR, AI‑powered cameras, and a responsive voice interface, it’s learning to tread where humans hesitate – into search‑and‑rescue zones – and to serve as a guide for the visually impaired.

Ransalu Senanayake, an ASU professor of computer science and engineering, at work in his laboratory. The roboticist teaches students to create AI solutions that can help robots assist real people and solve critical problems. Photo by Erika Gronek/ASU.

Robotics with Real‑World Purpose

Founded by Assistant Professor Ransalu Senanayake, the Laboratory for Learning Evaluation and Naturalization of Systems (LENS Lab) is on a mission. It’s not about abstract algorithms – it’s about forging tools that solve tangible, pressing problems.

The team isn’t just writing code for robots – it’s crafting AI companions that can navigate dangerous environments and widen accessibility.

The lab pushes the envelope by equipping robots to perceive, reason, and adapt – transforming them from static machines into dynamic teammates.

Search‑and‑Rescue in a Canine Form

Take second-year master’s student Eren Sadıkoğlu, for example. His focus? Teaching the Unitree Go2 to navigate post‑disaster terrains using vision- and language‑based navigation powered by reinforcement learning.

This isn’t just about point‑A‑to‑B mobility – it’s about strategic, safe movement: jumping over rubble, ducking under debris, and executing agile maneuvers.

Armed with RGB‑depth cameras and foot‑mounted touch sensors, the robodog can respond to unstable surfaces and unforeseen obstacles – venturing where human rescuers can’t, and keeping teams safer.
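
The underlying idea – an agent that learns safe movement through trial, error, and reward – can be sketched compactly. Below is a minimal Python sketch of reinforcement-learning navigation in a toy grid world standing in for post-disaster terrain; the rubble layout, reward values, and tabular Q-learning update are hypothetical illustrations, while the lab’s actual system learns from RGB-depth and touch-sensor input with far richer models.

```python
# Minimal sketch: Q-learning navigation in a toy "rubble field".
# The grid size, rubble cells, and rewards are hypothetical stand-ins
# for the lab's far richer sensor-driven reinforcement learning setup.
import numpy as np

SIZE = 6                                      # 6x6 grid world
GOAL = (5, 5)                                 # hypothetical survivor location
RUBBLE = {(1, 2), (2, 2), (3, 4), (4, 1)}     # impassable debris cells
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(state, action):
    """Apply an action; walls and rubble leave the robot in place."""
    nxt = (state[0] + action[0], state[1] + action[1])
    if not (0 <= nxt[0] < SIZE and 0 <= nxt[1] < SIZE) or nxt in RUBBLE:
        return state, -1.0, False             # blocked: penalty, not done
    if nxt == GOAL:
        return nxt, 10.0, True                # reached the survivor
    return nxt, -0.1, False                   # step cost favors short paths

rng = np.random.default_rng(0)
q = np.zeros((SIZE, SIZE, len(ACTIONS)))      # tabular action values

for episode in range(500):
    state = (0, 0)
    for t in range(200):                      # cap episode length
        # epsilon-greedy: mostly exploit, occasionally explore
        a = int(rng.integers(4)) if rng.random() < 0.1 else int(np.argmax(q[state]))
        nxt, reward, done = step(state, ACTIONS[a])
        # one-step Q-learning update toward the observed return
        q[state][a] += 0.5 * (reward + 0.9 * np.max(q[nxt]) - q[state][a])
        state = nxt
        if done:
            break

print(np.argmax(q, axis=2))                   # greedy action per cell
```

After training, the printed table of greedy actions traces collision-free routes around the rubble cells – the same principle, scaled up with deep networks and real sensors, underpins the robodog’s agility.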

Quinn the dalmatian pays a visit to the Laboratory for Learning Evaluation and Naturalization of Systems, or LENS Lab, where she eyes the Unitree Go2 robotic dog uneasily. But Quinn can relax. Researchers in ASU’s School of Computing and Augmented Intelligence are busy creating new AI systems that will allow robotic dogs to tackle the most dangerous search-and-rescue missions. Photo by Erika Gronek/ASU.

Guiding the Visually Impaired

Undergraduate student Riana Chatterjee is steering another exciting thread of research: using the Unitree Go2 as a guide for visually impaired users. Through YOLO, transformer-based depth estimation, and vision-language models (VLMs), the robot identifies objects, gauges distance, and articulates surroundings in real time.
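
As a rough illustration of how such a pipeline fits together, the sketch below chains a pretrained YOLO detector with a transformer depth estimator and composes a spoken-style description. The specific checkpoints (yolov8n.pt, Intel/dpt-large), the closeness threshold, and the describe helper are assumptions for illustration, not the lab’s actual stack.

```python
# Minimal sketch of the detect -> estimate depth -> describe pipeline.
# Checkpoints, thresholds, and phrasing are illustrative assumptions.
from ultralytics import YOLO             # pip install ultralytics
from transformers import pipeline        # pip install transformers
from PIL import Image
import numpy as np

detector = YOLO("yolov8n.pt")            # small pretrained object detector
depth_estimator = pipeline("depth-estimation", model="Intel/dpt-large")

def describe(image_path: str) -> str:
    image = Image.open(image_path)
    depth = np.array(depth_estimator(image)["depth"])   # relative depth map
    phrases = []
    for box in detector(image)[0].boxes:
        x1, y1, x2, y2 = (int(v) for v in box.xyxy[0])
        label = detector.names[int(box.cls[0])]
        # DPT outputs relative (inverse) depth: larger values read as nearer
        closeness = depth[y1:y2, x1:x2].mean() / depth.max()
        distance = "close by" if closeness > 0.6 else "farther away"
        side = "left" if (x1 + x2) / 2 < image.width / 2 else "right"
        phrases.append(f"a {label} {distance} on your {side}")
    return ("I see " + ", ".join(phrases) + ".") if phrases else "The path looks clear."

print(describe("street.jpg"))  # e.g. "I see a person close by on your left."
```

The design choice worth noting is the division of labor: detection answers “what and where in the frame”, depth estimation answers “how near”, and the language layer turns both into guidance a user can act on.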

“My project is about combining deep learning technologies to enable the robot to understand its surroundings and communicate that to a visually impaired person,” she says.

This could herald a future where robotic guide dogs step in where live animals are impractical – able to identify obstacles, map walkable paths, and deliver real‑time guidance.

Academic Rigor Meets Applied Innovation

Senanayake, LENS Lab’s founder, blends theoretical insight with real-world impact. On the academic front, the lab works to make AI adaptable, explainable, and ethically robust – pushing past conventional accuracy metrics to ask: When do models fail? How transparent are they? Can they adapt?
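
One concrete example of a question beyond accuracy is calibration: does a model’s stated confidence match how often it is actually right? The sketch below computes expected calibration error, a standard measure; the function and the sample predictions are hypothetical stand-ins, not LENS Lab code.

```python
# Minimal sketch of one "beyond accuracy" check: expected calibration
# error (ECE). The predictions below are hypothetical stand-ins.
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Population-weighted |accuracy - confidence| over confidence bins."""
    confidences, correct = np.asarray(confidences), np.asarray(correct)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap     # weight by the bin's share of samples
    return ece

# A model that claims 99% confidence but is right two times in three is
# overconfident; ECE exposes that gap even when accuracy looks acceptable.
print(expected_calibration_error([0.99, 0.99, 0.99, 0.55], [1, 1, 0, 1]))
```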

Meanwhile, the applied side, as demonstrated by the Go2 robodog, brings AI from the lab to the field – tackling hazards, enhancing human safety, and offering practical accessibility.

ASU’s leadership is backing it all. Ross Maciejewski, director of the School of Computing and Augmented Intelligence, underscores a curriculum that balances theory with real‑world projects: future problem‑solvers are being trained right now to deploy transformative solutions.

The feed shown from the Unitree Go2 robotic dog as it perceives and defines obstacles. ASU computer science undergraduate student Riana Chatterjee is developing an AI system that will enable the robotic dog to assist the visually impaired. Photo courtesy of the LENS Lab/ASU.

Insights from Recent Research

Recent studies reinforce – and expand – these efforts:

  • A quiet, stable locomotion controller developed for the Unitree Go1 robot roughly halved noise compared with standard motion controllers, improving user comfort – especially crucial for users who rely on auditory cues.
  • Research into aesthetic and functional design of robotic guide dogs shows that appearance matters: even for visually impaired users, societal perceptions, texture, and familiarity can influence acceptance – an important factor when developing technologies meant to integrate into daily life.
  • User studies have highlighted key developer considerations: customizable control and communication styles, robust battery life, weather resilience, and intuitive interaction are non‑negotiable requirements for real-world adoption of guide‑dog robots.

Across Terrain, Across Needs

LENS Lab’s research could redefine what a service companion looks like – not with fur and paws, but circuits and sensors. These robotic dogs are already crossing boundaries: from rubble‑strewn disaster zones to bustling urban crosswalks, offering safety, guidance, and hope.

It’s a tangible chance to make AI not just smarter, but kinder.

Eren Sadıkoğlu, who is currently pursuing his master’s degree in robotics and autonomous systems, poses with a humanoid robot in the LENS Lab. Sadıkoğlu is working on AI solutions that will allow the Unitree Go2 robotic dog to undertake search-and-rescue missions in dangerous environments, such as in the aftermath of earthquakes. Photo courtesy of Eren Sadıkoğlu/ASU.

About The Author

Thanaboon Boonrueng is a next-generation digital journalist specializing in Science and Technology. With an unparalleled ability to sift through vast data streams and a passion for exploring the frontiers of robotics and emerging technologies, Thanaboon delivers insightful, precise, and engaging stories that break down complex concepts for a wide-ranging audience.
