A biohybrid hand powered by lab-grown muscle tissue marks a significant leap forward in robotics and prosthetics.
The hand, developed by researchers at the University of Tokyo and Waseda University, features multiple muscle tissue actuators: bundles of thin strands of cultured muscle tissue that allow it to contract its fingers to grip objects and form gestures. Such movements were previously impossible for robots built from living tissue.
The muscle tissue is grown on a 3D-printed plastic base. Electrical currents stimulate the muscles to contract, mimicking natural movement. For now, the hand must remain suspended in liquid to reduce friction, which would otherwise hinder movement.
Future research will need to overcome this for real-world applications in prosthetics.
Merritt Moore is a quantum physicist with a Ph.D. in atomic and laser physics from Oxford University. She teaches creative robotics at New York University-Abu Dhabi. Moore is also a ballerina who has performed with world-class dance companies, including Zurich Ballet, Norwegian National Ballet and Boston Ballet.
At the intersection of the Venn diagram of Moore’s seemingly disparate professional pursuits is her passion for dancing with cobots: collaborative industrial robots designed to work alongside humans in the same space.
“Sometimes creativity is just merging ideas in different ways.”
– Dr. Merritt Moore
She talked with KUST Review about merging art and science, turning her Ph.D. project into interpretive dance for a contest and a new ambition that surfaced after she appeared on a grueling BBC reality series.
| Q: You’re a ballet dancer and a physicist. That’s an unusual mix. Can you talk about how that came about?
I started dancing at 12 or 13, but was told I would never make it professionally. So I went to Harvard to study physics. But when I was there I still had this love for dance and auditioned like crazy and took a year off to dance.
When I was working on my Ph.D. at Oxford I danced with the English National Ballet. Then the pandemic hit and I had a residency at Harvard University’s ArtLab.
IMAGE: Courtesy Merritt Moore
My interest was piqued by AI in terms of how it could enhance our creativity: Sometimes creativity is just merging ideas in different ways. I couldn’t dance with humans, but robots couldn’t get COVID. A robot company generously lent me a robot.
I created more and more video content and was invited to perform live. It opened the doors to more questions and possibilities.
| Q: You’ve talked before about how physics helped you be a better dancer. Can you explain more?
Because I was in physics classes all day, I couldn’t be in the dance studio much, so I really used the power of visualization at night and would visualize doing the ballet moves. But at the same time I was understanding inertia and torque and friction and how your arms can slow you down or project motion.
IMAGE: Freepik
Dance your Ph.D.
Since 2008, Dance Your Ph.D. has encouraged scientists to explain their Ph.D. dissertations through interpretive dance.
Winners get modest cash prizes and, naturally, bragging rights.
The 2020 overall winners were a trio of students from the University of Helsinki who used dance, rap and a wardrobe of white, short-sleeved button-down shirts to explain their research into the computational study of molecular clusters. The 2022 and 2023 winners used dance to explain the electroporation of yeast cells and nanoMOFs.
The contest is sponsored by the American Association for the Advancement of Science, Science magazine and artificial intelligence company Primer.
(I was) visualizing the angle I’d need on take-off to get the highest leap. It’s using physics to maximize the least effort in a way. I could almost release and let physics do as much as possible. It also helped me get out of my head.
| Q: Has dance helped you be a better physicist?
Dance helped because I think there’s a huge importance in mind-body connection. Dance opened up so much passion.
For the Dance Your Ph.D. contest I created a dance called “EnTANGOed” (about the spontaneous parametric down-conversion equation). Everything became a metaphor. It made me think conceptually about the equation. (As scientists) we’re taught to memorize and regurgitate information. But it’s often missing something.
Einstein imagined himself as a photon or a light beam. So many breakthroughs happened outside the lab. It was a realization that there’s this unsaid pressure that a good physicist’s head is in the textbooks. But dance helps understand physically what’s going on.
| Q: You engage in youth outreach to encourage kids in STEM and founded the group Science-Art-Sisters to encourage girls to think about science in a creative way. How do students respond?
I’m always surprised by how many are so hungry for it. During the pandemic I created Zoom calls with SciArtists from around the world. I was expecting 40. There were about 300. It was a breath of fresh air. If I could squeeze in extra hours in my day I’d do it again.
IMAGE: Courtesy of Merritt Moore
| Q: You participated in the U.K. reality series “Astronauts: Do You Have What It Takes?” and U.S. competition show “America’s Got Talent.” Which was more nerve-wracking, helicopter training or facing Simon Cowell?
The astronaut one was definitely more nerve-wracking in the sense that they take away your phone, they take away your computer. I had no idea what was coming up next. The unknown made it more nerve-wracking than anything else. (It was) all day every day.
The stuff they don’t show: Anytime we were waiting, we were having to do IQ tests, EQ tests. We were constantly miked up and filmed. It was really intense. It was the best experience of my life but also, yeah, really intense.
| Q: You have also talked about your hopes to become an astronaut and dance on the moon. How do you envision that would look?
I think that the weightlessness is the ethereal aspect of it. On those levels it would be so incredible. I would also love to explore what’s the new language up there. What’s the new language of dance? How do we create or optimize it?
IMAGE: Freepik
Exoskeleton crew
While some robots are dancing with humans, others might help more humans dance again.
“These are really great problems and interesting challenges to be solved,” says Lakmal Seneviratne, founding director of the Khalifa University Center for Autonomous Robotic Systems (KUCARS).
Taking up those challenges: Irfan Hussain, a KU robotics professor researching variable stiffness actuators (VSAs), which mimic human muscles that become stiff or soft depending on the task. For tasks that require accuracy, like throwing a ball or writing, the muscles become stiff, while for tasks that require safety, like physically interacting with humans, they become soft, he says.
Hussain is working on a VSA device that uses bioinspired systems to create joints that can become stiff or soft as needed. It’s a robotic exoskeleton that people who have had a stroke could wear on their legs. The device, funded by Emirati investment fund Mubadala, could aid rehabilitation by mimicking the function of a knee joint, Hussain says. The same principle would go into building soft robotic hands that might help stroke patients safely grasp objects, Hussain adds.
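The stiff-for-accuracy, soft-for-safety logic can be sketched in a few lines of code. This is a hypothetical illustration of the idea only; the task names, stiffness values and the `joint_stiffness` helper are invented for this sketch and are not from Hussain’s VSA work.

```python
# Illustrative sketch of variable-stiffness actuation: pick a joint
# stiffness target depending on whether the task prioritizes accuracy
# (stiff, like writing or throwing) or safety (soft, like human contact).
# All names and numbers here are hypothetical, not from Hussain's design.

STIFF_NM_PER_RAD = 400.0  # high stiffness for precise motion (illustrative)
SOFT_NM_PER_RAD = 40.0    # low stiffness for safe interaction (illustrative)

ACCURACY_TASKS = {"writing", "throwing"}

def joint_stiffness(task: str) -> float:
    """Return a target joint stiffness (N*m/rad) for the named task."""
    if task in ACCURACY_TASKS:
        return STIFF_NM_PER_RAD
    # Default to the soft, safe setting for contact or unknown tasks.
    return SOFT_NM_PER_RAD

print(joint_stiffness("writing"))    # stiff setting for accuracy
print(joint_stiffness("handshake"))  # soft setting for safety
```

A real controller would modulate stiffness continuously rather than switching between two values, but the two-mode version captures the trade-off Hussain describes.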
| Q: Did you want to become an astronaut before the BBC series, or did it jump-start a new ambition?
It definitely launched a new ambition. It’s not exactly a career that career fairs talk about.
| Q: You frequently dance with an industrial robot arm that you program. Are you interested in choreographing dances with other kinds of robots or is there something about the robot arm specifically that speaks to you artistically?
I’d love to explore so many different (kinds). The more robots the better. The more expertise the better. I’d like to dance with the massive ones. That would be super interesting. It’s just complicated to get access.
| Q: How do you envision AI and robotics will contribute to the arts in the future?
I think (robots are) an incredible tool that we can use for human expression. People get worried: Are you going to replace human dancers? No, that will never happen.
Painters got worried when we invented cameras 200 years ago. Painting is still valued, but photography is now an art. You can see a photographer’s work and you can see a human dignity to it. I think the same will happen with robot dancers.
IMAGE: Freepik
ONE GIANT LEAP FOR ART
Physicist and ballerina Merritt Moore isn’t the only one with a desire to combine art and science on the moon.
Semi-retired physicist and writer Samuel Peralta has been buying payload space on rockets to send coin-size Nanofiche archives, loaded with music, books, visual art and more from more than 30,000 artists, to the moon’s surface as lunar time capsules.
Canadian Heather Horton is one of the contributors to the project, called the Lunar Codex. “Every time I look at the moon, for the rest of my life, it will be different,” she tells the Guardian.
“I think what we have done here is the most global, the most diverse, the most expansive project,” Peralta says. “I sometimes think of the Lunar Codex as performance art. This is the greatest performance art of my life!”
With AI, this is where it will get a little blurry, and it depends on how we legally start thinking about it. AI brings together a lot of people’s different work. It still needs human expertise to curate it well.
| Q: When you’re choreographing a dance with the robot arm, do you start with the human’s movements or the robot’s? What are the limiting factors?
I love that I can change the “formula” each time. Sometimes I start with human movement, sometimes I start with the robot’s movement.
Limiting factors are that the robot does not have arms or legs, so it’s always a puzzle to figure out what type of movements will “read.” The speed is sometimes an issue because if it is too fast, there is a risk it will fall over. There are limits to how much it can rotate (but I’m much less flexible than the robot).
| Q: What do you hope your audiences take away from your performances?
Audience members have mentioned that they never imagined a dance with a robot could be so moving. I always hope audiences leave deeply moved, with their spirits lifted.
I want to show the blend of technology and human emotion, pushing the boundaries of what’s perceived as traditional art. My hope is that audiences leave not only moved by the beauty of the unexpected partnership but also inspired by the possibilities that arise when we merge diverse disciplines.
The first step in building any robot is to decide what you want it to do. While most of the robot’s abilities will be unlocked with clever machine learning and artificial intelligence algorithms, you need to set your robot up for success with the right mechanical features.
EYES
For a human-like eyeball, nice and round, turn to embedding light-sensitive receptors directly onto the surface of a 3D sphere, as a team from the Hong Kong University of Science and Technology, UC Berkeley and the Lawrence Berkeley National Laboratory did.
You could also add a narrow bandgap semiconductor as a photosensing material — then your robot could see in the dark with infrared light sensing. In lieu of realism, you could turn to any number of sensors to have your robot “see”:
Distance sensors and gauges – maybe an ultrasonic range finder or laser measurement sensor.
Positioning sensors – room navigation or indoor localization might come in handy. A GPS system or other live-tracking devices will help your robot find its way around.
Thermal imaging sensors or pressure sensors are also an option.
Facial recognition – that’s some machine learning pre-programming.
LEGS
Want to jump? Forget biomimicry. Researchers at UC Santa Barbara use an actuator system based on elasticity: a spring with rubber bands and carbon fiber slats that shoots the bot into the air.
Or keep the biomimicry but add hydraulic systems and electric motors a la Boston Dynamics’ Atlas.
You could leave humanity behind and go the marsupial route. German engineering firm Festo took it one step further and developed the BionicKangaroo.
A “tendon” in its robotic leg drives it forward and captures energy on landing; the impact sets its spring-loaded legs into position for the next leap.
GRAPHICS: Abjad Design
Stanford University engineers developed a “stereotyped nature-inspired aerial grasper,” or SNAG: bird-shaped feet that can perch on any branch.
WINGS
Go classic with drone design and choose rotary wings that spin to create lift and thrust like a helicopter. These are best for hovering, vertical takeoff and changing direction quickly.
Maybe you’d rather the classic plane look and have room for a runway or launcher. Fixed wings generate lift by moving through the air and offer higher speed, longer endurance and greater stability, though your robot will be at the mercy of the weather conditions.
You could even turn to the flapping wings of insects and birds. The Harvard team that developed a tiny solar-powered robot styled after a honeybee uses a complex transmission system of gears and motors. A team at the University of Bristol developed a tiny flying robot that flaps its wings more efficiently than an insect, using an electrostatic “zipping” mechanism (their words).
HANDS
What kind of hand does your robot need? Do you want the classic gripper, optimized for delicacy or accuracy? Or is a suction cup plenty?
How many joints does your robot arm need? You’re not limited by human anatomy here.
Many robot hands have sensors packed into their fingertips only, but an MIT team built a robotic finger with continuous sensing along its entire length, allowing it to accurately identify an object after grasping it just once.
Researchers at Columbia Engineering developed a highly dexterous robot hand that can operate in the dark. It uses tactile sensors rather than vision to manipulate objects.
Humanoid robots are used in industries from medicine, law enforcement and hospitality to maintenance and disaster relief. Stanford University has developed a deep-sea humanoid robot that is now diving in the robotics pool at Khalifa University, with the end goal of exploring marine robotics for sustainable ocean ecosystems.
The OceanOneK robot — designed and built by Oussama Khatib and his Stanford team — has been five years in the making and made its Abu Dhabi debut tasked with retrieving plastic waste from the Khalifa University marine robotics pool.
But the team has bigger plans for OceanOneK.
After completing pool tests at Stanford that integrated navigation, bimanual manipulation (coordinated movements that require each hand to act differently), vision and body control, it was time to take OceanOneK out to sea.
The robot performed several dives around the Mediterranean, reaching close to 1,000 meters — a record depth — exploring sunken vessels and retrieving artifacts.
As team members operated the robot through its haptic interface (communication system), they were able to feel what the robot was touching.
“It was pretty amazing feeling something that no other human could touch. While it was a (haptically mediated experience), it was still an amazing connection,” says Adrian Piedra, a Ph.D. student in Khatib’s Stanford lab.
CAPTION: Stanford team shares in-field experience with OceanOneK IMAGE: Khalifa University
One of the vessels was the Francesco Crispi, an Italian steamship torpedoed by the British while en route from Italy to France in 1943. Delicate white coral has formed on the wreck, Khatib says, which the dive’s marine biologists were excited to touch and then collect as samples. Iron-eating bacteria were also observed.
The robot was able to perform tasks for archaeology and for marine biology.
– Oussama Khatib
This is why a humanoid robot was essential for this project, adds Wesley Guo, another of the project’s Stanford Ph.D. students. “The way we control the robot is direct, as this helps the operator relate intuitively. The easiest way to do this is to have the body at a scale and shape similar to the human form. We also wanted it to appear non-threatening, as it will work in collaboration with human divers at different sites.”
A typical recreational diver can safely descend to about 30 meters; anything deeper requires specialized training and equipment. At 30 meters the pressure is approximately four times that at the surface. What happens to the human body below these depths depends on the person’s overall health and fitness. At 1,000 meters, the robot experiences 100 times the atmospheric pressure, team leader Khatib explains.
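The figures Khatib cites follow from a standard rule of thumb: seawater adds roughly one atmosphere of pressure for every 10 meters of depth, on top of the one atmosphere already present at the surface. A quick sketch of that arithmetic (in Python; the 10-meters-per-atmosphere approximation is the only assumption):

```python
# Approximate absolute pressure at depth in seawater, using the rule of
# thumb that each 10 meters of depth adds about 1 atmosphere on top of
# the 1 atmosphere already present at the surface.

def pressure_atm(depth_m: float) -> float:
    """Approximate absolute pressure (in atmospheres) at a given depth."""
    return 1.0 + depth_m / 10.0

print(pressure_atm(30))    # ~4 atm at the recreational-diving limit
print(pressure_atm(1000))  # ~101 atm at OceanOneK's record depth
```

At 1,000 meters the water alone contributes about 100 atmospheres, matching the figure in the article.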
So, such robots are the key to deep-water exploration. And with more autonomy come more skill sets.
Khatib says autonomy of a robot in the water is challenging, hence the haptic interface back to a human. But the goal is to diminish the need for human intervention as much as possible.
Deep-water diving robots known as remotely operated vehicles, or ROVs, can collect vast amounts of image data. “Operations under water require arms, hands and coordination between them, and that is what we’ve brought here with the OceanOne concept,” Khatib says.
“The interface we use goes beyond the visual – it delivers tactile-touch sensing using a haptic device. A haptic device allows humans to touch and feel what the robot is interacting with and permits one to guide the robot while it is executing delicate tasks. It acts as an avatar,” Khatib tells KUST Review.
“It interprets and affects movement and grasp request, maintains attitude and position for the human reference, and passes sensory information back to the human,” he says.
Human movement is just one of the considerations when building a robot like OceanOneK. The working environment must also be factored in. In this case that includes water and how it behaves.
Currents, for example, disrupt the intended movement, and this is where Khalifa University comes in.
The robotics pool at Khalifa University can simulate such environments, but under controllable conditions.
“Here, we can control the amount and direction of currents, we can control the waves, we can control those interactions in an ocean-like environment,” says Khatib. “This is perfect for training and learning.”
CAPTION: KU Robotics Pool IMAGE: Khalifa University
The Khalifa University robotics team will also work toward adding to the tasks the robot’s hands can carry out on their own.
“Full autonomy (without human intervention) will be the ultimate target; this, however, is challenging, and in the near-term humans will work with the robot to carry out tasks such as underwater valve-turning and plug-insertion.”
Our objective is to increase the robot’s degree of autonomy while reducing the extent of human intervention.
– Lakmal Seneviratne, director of the Center for Autonomous Robot Systems and professor of mechanical engineering at Khalifa University
Stanford’s Khatib says these sensory-mechanical systems are also used out of the water in industries such as medicine, where a physician may interact through a haptic interface when unable to be present in the ICU. Similarly, the systems could be used for robots working on electrical grids or offshore platforms.
“In many of these applications we aim to distance humans from danger while connecting their skills to the task that must be carried out in that environment,” Khatib says.
CAPTION: Stanford and Khalifa University robotics collaboration IMAGE: Khalifa University
“There is a lot of work needed before taking these robots into the field, and Khalifa University offers a unique environment for this preparatory marine robotic study,” Khatib says. “We are also collaborating in other ways,” he adds, including curriculum development and teaching, as well as research focus groups and workshops.
“We look forward to more interaction with the researchers, faculty and students here.”
Among future joint projects: Khalifa University’s KUCARS and the Stanford Robotics Lab have recently established a collaboration to research and develop marine robotics systems for sustainable marine-ecosystem applications, including ocean monitoring and ocean cleaning.
Scientists have developed a new generation of robot fish that can do more than just swim: it can also eat microplastics, providing a promising solution to the global problem of ocean plastic pollution.
The University of Surrey in the United Kingdom hosts a contest each year focused on developing robots that mimic things in nature. The 2022 winner, chemistry undergrad Eleanor Mackintosh, designed a robot that looks and acts like a fish and is skilled at filtering microplastics from water it sucks in through its gills. The robot is aptly named Gillbert.
Gillbert is 50 centimeters long and approximately the size of a full-grown pink salmon. It is shaped like a fish, and its movements mimic those of a fish. It moves through the water via remote control while its gills move in and out, drawing in water. Gillbert filters the microplastics — some as small as 2 millimeters — and stores them in an internal container.
Though Gillbert is operated by remote control, Robert Siddall, robotics lecturer at the University of Surrey and founder of the competition, hopes this robot fish inspires others to work toward gaining control of the plastic problem plaguing the world’s oceans.
But with an estimated 5.25 trillion pieces of plastic in the oceans, why focus on microplastics?
Ludovic Dumée, assistant professor of chemical engineering at Khalifa University, says although microplastics are small and difficult to see, they have an enormous impact.
“Microplastics, whose maximum dimension falls below 5 millimeters, are ultimately released into waterways and represent a major threat to global ecosystems, the entire food chain as well as many human industrial activities that rely on river or sea-water intake,” he says in a 2023 article in KUST Review.
Additionally, Dumée says human beings consume between 50,000 and 100,000 microplastic particles annually, exposing them to contaminants and increased cancer risk.
CAPTION: Plastic straws become microplastics IMAGE: Unsplash
Gillbert the fish is one possible solution to the microplastics problem, but more attention is required to solve this global issue.
The 2023 Natural Robotics Contest requires this year’s entries to be inspired by the December 2022 UN Biodiversity Conference held in Montreal, Canada, which addressed the adoption of a global biodiversity framework to deal with the main causes of nature loss. The 2023 contest is open for entries until July 1, and the winner is promised a working prototype based on their design.
A 3D-print download of Gillbert is openly available so others might improve upon the initial design.