The Griffin Museum of Science and Industry (MSI) hosted its annual Robot Block Party on April 4 and 5 in celebration of National Robotics Week. Graduate students from the University of Chicago, Northwestern University, Illinois Institute of Technology, and the Toyota Technological Institute at Chicago showcased prototypes capable of sensing and adapting to their surroundings during the special exhibition.
According to its website, National Robotics Week, which began in 2010, highlights the “growing importance of robotics in a wide variety of application areas and emphasizes its ability to inspire technology education.” MSI joined the initiative in 2011 and has since spotlighted inventions and technologies designed by Chicago-area university students, as well as launched a seasonal Robot Revolution exhibit. Robot Revolution previously featured “pre- and post-visit lessons and field trip worksheets,” along with more expansive attractions like the Drone Zone and a Google Self-Driving Car simulation, before closing in 2018.
Voula Saridakis, the head curator of MSI, spoke with the Maroon and explained that researchers have multiple reasons to present during this two-day event. In previous years, roboticists were able to troubleshoot and refine their prototypes in a real-world setting and “set up displays and prototypes of robots that were eventually going to be added to the Robot Revolutions exhibit,” Saridakis said. Additionally, presenting to a family-friendly audience allowed researchers to generate “a buzz and excitement over [their] robots and technology,” potentially encouraging more people to pursue careers in robotics.
Many of the exhibition’s robots were built to interact dynamically with their environments, responding to verbal cues from subjects or visual cues from a set of sensors or cameras. The Baxter robot, programmed in the Robot Intelligence through Perception Laboratory at the Toyota Technological Institute at Chicago, can mimic the movement of human limbs based on information it gathers with sensors above its animated screen.
When a subject stands in an enclosed space, Baxter’s deep learning model estimates the subject’s pose, including joint angles, and relays that information back to the robot. Luzhe Sun, who presented Baxter at the Block Party, emphasized how his team is “coding with the current artificial intelligence technology” and applying AI to commercially produced hardware.
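A rough sense of that pipeline can be sketched in a few lines: a perception model outputs keypoint positions, and the angle at each joint is computed from them and forwarded to the robot. The keypoint names and the mapping below are illustrative assumptions, not the RIP Lab’s actual code.

```python
# Hypothetical sketch of pose-to-robot mimicry: given three tracked
# keypoints from a perception model, compute the joint angle between them.
# All names and example values are illustrative assumptions.
import numpy as np

def joint_angle(a, b, c):
    """Angle at keypoint b (radians) formed by segments b->a and b->c."""
    u = np.asarray(a) - np.asarray(b)
    v = np.asarray(c) - np.asarray(b)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Example: shoulder, elbow, and wrist positions estimated by the camera.
shoulder, elbow, wrist = [0.0, 0.3, 0.0], [0.0, 0.0, 0.0], [0.25, 0.05, 0.0]
elbow_angle = joint_angle(shoulder, elbow, wrist)
print(f"elbow angle: {np.degrees(elbow_angle):.1f} degrees")
```

In a full system, an angle like this would become the target for the corresponding motor on the robot’s arm, refreshed every camera frame.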
Recent breakthroughs in artificial intelligence have also opened the door to integrating robots as educational tools in classroom settings. Lauren Wright, a Ph.D. student in the Human–Robot Interaction Lab at UChicago, is applying large language models (LLMs) to facilitate personalized and autonomous human–robot interactions.
Wright’s robot, Misty, has been teaching fourth-graders social and emotional learning skills. Wright believes that robots programmed with generative AI, which can customize dialogue and responses, can teach this material in a “lower stress, more friendly environment.”
Wright’s lab interviewed teachers and determined that students routinely disengage from lessons on social and emotional skills, “which may be coming from a place of discomfort with doing this type of lesson in front of everybody,” Wright said.
“When we do lessons with social emotional learning, it’s the whole class. Sometimes they’re talking about topics that are more fragile like problem solving with friends and empathy. Those can be touchy topics for some people,” Wright said.
The Misty robot offers an alternative: a one-on-one lesson delivered in a lower stress environment. Misty’s LLM-based software relies on patterns gathered from massive amounts of text and on surface-level associations to form a response. It can produce contextually appropriate and seemingly authentic replies, though it often misses nuance and has difficulty detecting inflection, tone, and sarcasm. Wright emphasized its utility in classroom settings, where communication tends to be structured and predictable rather than conversational or spontaneous.
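As a loose illustration of how such a system can be kept structured, the sketch below wraps a generic chat-style model call in a fixed lesson prompt. The `generate_reply` hook, the prompt text, and the message format are assumptions standing in for whatever model API Misty might use; this is not the HRI Lab’s software.

```python
# Minimal sketch of one LLM-driven lesson turn. 'generate_reply' is a
# placeholder for any chat-completion API; the system prompt pins the
# model to a short, structured, low-stakes lesson rather than
# open-ended conversation.

LESSON_PROMPT = (
    "You are Misty, a friendly classroom robot. You are guiding one "
    "fourth-grader through a short empathy lesson. Keep replies to two "
    "sentences, stay on the lesson, and never criticize the student."
)

def lesson_turn(history, student_utterance, generate_reply):
    history.append({"role": "user", "content": student_utterance})
    messages = [{"role": "system", "content": LESSON_PROMPT}] + history
    reply = generate_reply(messages)  # call out to the model
    history.append({"role": "assistant", "content": reply})
    return reply

# Usage with a stubbed model, just to show the flow:
def canned(messages):
    return "That sounds hard. What could you say to your friend?"

history = []
print(lesson_turn(history, "My friend ignored me at recess.", canned))
```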
However, the MSI event also exposed technical challenges for Wright. She had to restart the program multiple times during one interaction because external noise was being misinterpreted as user input. Similarly, Northwestern University Ph.D. student Allison Pinosky Gaines, who co-designed and presented NoodleBot, used the conditions at the event as a chance to troubleshoot and improve the robot’s algorithm.
NoodleBot demonstrates a concept Pinosky’s lab calls “Maximum Diffusion Reinforcement Learning,” an algorithm that “encourage[s] robots to explore their environments as randomly as possible in order to gain a diverse set of experiences.”
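The flavor of that idea can be shown with a toy, count-based stand-in: an agent that favors states it has rarely visited will naturally spread through its environment rather than loop in place. This is a simplification for illustration, not the lab’s actual maximum diffusion objective.

```python
# Toy exploration sketch in the spirit of the quote above: an agent on a
# grid scores each neighboring cell by a novelty bonus that shrinks with
# visit count, so it seeks out experiences it has not had yet.
import random
from collections import Counter

MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]
visits = Counter()
pos = (0, 0)

for step in range(1000):
    visits[pos] += 1
    # Novelty bonus for each neighbor: 1 / (1 + times already visited).
    scored = [((pos[0] + dx, pos[1] + dy),
               1 / (1 + visits[(pos[0] + dx, pos[1] + dy)]))
              for dx, dy in MOVES]
    best = max(score for _, score in scored)
    pos = random.choice([p for p, score in scored if score == best])

print(f"distinct states visited: {len(visits)}")
```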
Pinosky lauded NoodleBot’s performance in “moving around its environment and figuring out a policy to move in a certain direction” in an unfamiliar setting like the museum.
MSI’s main hall presented several challenges for a robot that depends on environmental information. Loud background noise interfered with the robot’s GPS system, and slippery floors limited its mobility. Pinosky plans to use what she learned at the Block Party to improve the algorithm. “What’s cool is that we get to show people in the museum what we’re doing, talk a little about research, but also testing our algorithm in a challenging environment,” Pinosky said.
While some exhibits highlighted challenges faced by robots in dynamic environments, others focused on the practical feasibility of replicating their work. Zhengyang Kris Weng, another Ph.D. student at Northwestern, spent eight weeks designing a biomimetic dexterous hand that can mimic human hand movement.
Weng adopted an iterative approach to replicating the biological components of a human hand, using a cable-tensioning system to maintain tension along the tendons. “It makes the fingers themselves very light and inertia very low,” Weng said. This reduces lag as the fingers move between positions and increases the accuracy of the hand’s movement.
Additionally, when a user puts on a “Meta Quest 3S headset that calculates and broadcasts tracked hand joint angles,” the biomimetic hand can replicate the human hand’s movements. Weng was especially proud that his project is affordable, versatile, and open-source: “If people who see it and are interested, they can actually go and build their own at home.”
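In rough outline, that pipeline might look like the sketch below: tracked joint angles arrive from the headset, and each is converted to a tendon cable length for the corresponding finger. The linear angle-to-cable model, the constants, and the joint names are illustrative assumptions, not taken from Weng’s open-source design.

```python
# Hypothetical headset-to-hand step: convert each tracked finger joint
# angle into a target tendon cable length. The linear model and
# constants below are assumptions for illustration only.
import math

PULLEY_RADIUS_M = 0.006   # assumed effective pulley radius per joint
SLACK_LENGTH_M = 0.120    # assumed cable length with the hand fully open

def cable_length(joint_angle_rad: float) -> float:
    """Cable payout for one joint: the cable shortens as the joint flexes."""
    return SLACK_LENGTH_M - PULLEY_RADIUS_M * joint_angle_rad

# Example frame of tracked angles (radians) broadcast by the headset.
tracked = {"index_pip": math.radians(45), "index_dip": math.radians(30)}
targets = {joint: cable_length(angle) for joint, angle in tracked.items()}
print(targets)
```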
The presenters expressed strong optimism about the future of robotics. Saridakis said she was keen on MSI’s mission to “inspire the inventive genius in everyone” and hopes events like these “get [people] excited about robotics and other STEM-related fields.”
For students like Pinosky, public events are a chance to think big and get excited about real problems, an experience that recalls her own childhood curiosity.
“When you’re a kid and see people working on real problems that you engage with and touch and see, it just seems more exciting than something that you might see on TV,” Pinosky said. “I was really inspired by that as a kid, so I’d want other people to have the same experience.”