A new approach to robotics


A person works at a computer desk while a large robot made to look like a giant teddy bear stands in the background.

Today’s robots help us perform daily tasks at home and at work, assist medical practitioners in overcoming medical challenges and physical disabilities, and now transport us through our daily lives on roadways and in the air.

National Robotics Week (April 2–10) is an annual event designed to showcase what’s new in robotics and inspire students to investigate STEM-related fields.

Robotics and augmented intelligence are components of all seven of the Ira A. Fulton Schools of Engineering at Arizona State University — the largest engineering program in the country. The following are just a few examples of robotics innovation taking place at ASU.

Helper robots

Video by Ken Fagan/ASU News

Students can always find a robot hug in Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory, where the focus is on human-robot interaction, robot autonomy and machine learning. The robot is learning how to interact with humans on a one-on-one, physical basis, and perhaps teaching humans that robots can play a calming role, too.

Suckermouth catfish were the inspiration for the swimming robots in Assistant Professor Daniel Aukes’ IDEA Lab. The fish-inspired robot for extreme environments, or FIRE robot, can navigate through tight spaces in canals and waterways. Like the scavenging catfish that inspired it, FIRE has a future as a cleanup robot.

Narrow, cluttered spaces are the domain of specialized drones in Associate Professor Wenlong Zhang’s Robotics and Intelligent Systems (RISE) Lab. Able to mimic bird-like flight, these drones can squeeze their bodies to fly through narrow spaces and then expand to full flight capacity.

Auto manufacturers are moving quickly forward with autonomous technologies, and the team in Professor Aviral Shrivastava’s Make Programming Simple Lab is hastening the process. The researchers are creating a broad-based platform through which vehicles from different manufacturers can communicate with one another, with humans and with the environment, paving the way for safe, autonomous vehicle decision-making.

Assistive robots

Video by Ken Fagan/ASU News

Helping children with cerebral palsy improve their gait and helping stroke patients regain full shoulder mobility are two of the recent projects in Associate Professor Hyunglae Lee’s Neuromuscular Control and Human Robotics Lab. For the cerebral palsy platform, Lee’s team uses human-robot interaction to improve a participant’s gait: data from the unaffected leg is used to adjust the platform and improve the neuromuscular response of the impaired leg.

Shadowy robotic figures create situational awareness in Assistant Professor Yu (Tony) Zhang’s Cooperative Robotic Systems Lab. The system allows a project manager to monitor what robots are doing in the background via virtual, non-interruptive shadows. Someday, medical practitioners may have goggles to see what a robotic assistant is doing in the background.

New robotics technology for interventional radiology may soon improve procedures to insert needles, catheters and neurostimulators by using a robot arm to pull a magnet-guided device through the vascular system or tissue. The technique, developed in Associate Professor Hamid Marvi’s Bio-inspired Robotics, Technology and Healthcare (BIRTH) Lab, not only limits radiation exposure but also lets the magnetically pulled device steer around bone obstructions and avoid perforating an artery.

Originally designed to help U.S. Air Force aerial porters load pallets and lift cargo onto jets, Professor Thomas Sugar’s Aerial Port Exoskeleton, or APEx, will offer significant assistance to workers in a variety of industries, from warehousing to manufacturing to loading luggage on commercial flights to assisting postal workers. Created in the Human Machine Integration Lab with hip stabilizers and elements that provide robotic assistance for lifting, pushing and pulling, the device will also reduce workforce injuries. 

Top photo: Computer science master’s degree student Michael Drolet prepares the "hugging robot" in ASU Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory on Oct. 23, 2020. Ben Amor’s lab work focuses on machine learning, human-robot interaction, grasping, manipulation and robot autonomy, with the hugging robot "learning" to interact with humans. Photo by Deanna Dent/ASU News
