Cornell University

12/02/2024 | Press release

Smallest walking robot makes microscale measurements

Cornell researchers in physics and engineering have created the smallest walking robot yet. Its mission: to be tiny enough to interact with waves of visible light and still move independently, so that it can maneuver to specific locations - in a tissue sample, for instance - to take images and measure forces at the scale of some of the body's smallest structures.

"A walking robot that's small enough to interact with and shape light effectively takes a microscope's lens and puts it directly into the microworld," said Paul McEuen, the John A. Newman Professor of Physical Science Emeritus in the College of Arts and Sciences (A&S), who led the team. "It can perform up-close imaging in ways that a regular microscope never could."

The team's paper, "Magnetically Programmed Diffractive Robotics," published Nov. 28 in Science, with McEuen as corresponding author. Conrad Smart, a researcher at Cornell's Laboratory of Atomic and Solid State Physics (LASSP), and Tanner Pearson, Ph.D. '22, are the study's co-first authors.

Cornell scientists already hold the record for the world's smallest walking robot, which measures 40 to 70 microns.

The new diffractive robots are "going to blow that record out of the water," said Itai Cohen, professor of physics (A&S) and a co-author of the study. "These robots are 5 microns to 2 microns. They're tiny. And we can get them to do whatever we want by controlling the magnetic fields driving their motions."

Diffractive robotics connects, for the first time, untethered robots with imaging techniques that depend on visible light diffraction - the bending of a light wave when it passes through an opening or around something. The imaging technique requires an opening of a size comparable to the light's wavelength. For the optics to work, the robots must be on that scale, and for the robots to reach targets to image, they have to be able to move on their own. The Cornell team has achieved both objectives.

Credit: Jason Koski/Cornell University

Itai Cohen, professor of physics, in his lab in the Physical Sciences Building.

Controlled by magnets making a pinching motion, the robots can inchworm forward on a solid surface. They can also "swim" through fluids using the same motion.

The combination of maneuverability, flexibility and sub-diffractive optical technology creates a significant advance in the field of robotics, the researchers said.

"I'm really excited by this convergence of microrobotics and microoptics," said co-author Francesco Monticone, associate professor of electrical and computer engineering in Cornell Engineering, who designed the optical diffractive elements and helped the team identify applications. "The miniaturization of robotics has finally reached a point where these actuating mechanical systems can interact with and actively shape light at the scale of just a few wavelengths - a million times smaller than a meter."

To magnetically drive robots at this scale, the team patterned the bots with hundreds of nanometer-scale magnets that have an equal volume of material but two different shapes - long and thin, or short and stubby. The idea, Cohen said, originated with Fudan University physicist Jizhai Cui.

"The long, thin ones need a larger magnetic field to flip them from pointing one way to pointing the other, while the short, stubby ones need a smaller field," Cohen said. "That means you can apply a big magnetic field to get them all aligned, but if you apply a smaller magnetic field, you only flip the short, stubby ones."

Cornell scientists combined this principle with very thin films invented at the Cornell NanoScale Science and Technology Facility to create the robots.

One of the main optical engineering challenges was figuring out the most suitable approach for three tasks - tuning light, focusing, and super-resolution imaging - for this specific platform, because "different approaches have different performance trade-offs depending on how the microrobot can move and change shape," Monticone said.

There's a benefit to being able to mechanically move the diffracting elements to enhance imaging, Cohen said. The robot itself can be used as a diffraction grating, or a diffractive lens can be added. In this way, the robots can act as a local extension of the microscope lens looking down from above.

The robots measure forces by pushing against structures using the same magnet-driven pinching motion that enables them to walk.

"These robots are very compliant springs. So as something pushes against them, the robot can squeeze," Cohen said. "That changes the diffraction pattern, and we can measure that quite nicely."

The robots' force-measurement and optical capabilities can be applied in basic research, such as exploring the structure of DNA, or deployed in a clinical setting, the researchers said.

"Looking to the future, I can imagine swarms of diffractive microbots performing super-resolution microscopy and other sensing tasks while walking across the surface of a sample," Monticone said. "I think we are really just scratching the surface of what is possible with this new paradigm marrying robotic and optical engineering at the microscale."

Contributing authors to the study are Zexi Liang, postdoctoral associate at LASSP; Melody X. Lim, experimental fellow at the Kavli Institute at Cornell for Nanoscale Science (KIC); and Mohamed I. Abdelrahman, doctoral student in electrical and computer engineering at Cornell Engineering.

The research was made possible by the Cornell Center for Materials Research, the National Science Foundation and the Cornell NanoScale Science and Technology Facility.

Kate Blackwood is a writer for the College of Arts and Sciences.