November 13, 2009

“Cyborg astronauts” could scan for life on Mars

U of C researcher Patrick McGuire (B.A. ’89) envisions a future where “cyborg astrobiologists,” astronauts equipped with high-tech headgear and software, scour the surface of Mars, electronically sifting through mountains of rocks and soil before being directed to only the most worthwhile samples.

McGuire is looking to reinvent the way researchers study other worlds by creating robots that can not only see, but independently analyze images and find patterns that might ultimately give clues to the existence of extraterrestrial life.

He hopes to apply the technology to astronauts’ spacesuits, resulting in cyborg spacefarers that can process visual stimuli the naked eye can’t.

These cyborg astronauts have the potential to make important discoveries in the field: In testing in the western U.S., McGuire’s system has been used to analyze small iron deposits in sedimentary rock, called redbeds, which McGuire says are sometimes caused by lichens, symbiotic partnerships of fungi and algae.

Lichens that grow on the redbeds are “not obvious to the human eye,” McGuire said, but cyborgs might be a way to find these tiny signs of primitive life.

Using a type of neural network developed by physicist John Hopfield, called a Hopfield Neural Network (HNN), McGuire’s system mimics biological sight by comparing new data with previously collected data. This way, McGuire’s robots can single out objects that are different from their surroundings.

A patch of land is recorded as an image and analyzed in layers of red, green, and blue using software McGuire developed himself. Numerical values are assigned to each color channel and then transformed into values for hue, saturation, and intensity.
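As a rough illustration of that color transform, here is a standard RGB-to-HSI conversion in Python. The article doesn’t give McGuire’s exact formulas, so this is an assumed stand-in using common textbook equations:

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert normalized RGB values (0..1) to hue (degrees),
    saturation, and intensity.

    Illustrative only: a standard HSI transform, not necessarily the
    one in McGuire's software.
    """
    intensity = (r + g + b) / 3.0
    if intensity == 0:
        return 0.0, 0.0, 0.0  # black: hue and saturation are undefined
    saturation = 1.0 - min(r, g, b) / intensity
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        hue = 0.0  # gray: hue is undefined
    else:
        # Clamp guards against tiny floating-point overshoot in acos.
        theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        hue = theta if b <= g else 360.0 - theta
    return hue, saturation, intensity
```

For example, pure red `(1, 0, 0)` maps to hue 0°, full saturation, and intensity 1/3.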

The HNN maps these color values onto distinct points in space and creates diagrams from them. The computer then breaks the images into regions of similar coloring, called segmentation maps.
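One simple way to build such a segmentation map is to group adjacent pixels whose values are close together, as in this flood-fill sketch. The article doesn’t describe the HNN’s actual grouping rule, so this is only an illustrative stand-in:

```python
from collections import deque

def segment(values, tol=0.1):
    """Group adjacent pixels with similar values into labeled segments.

    values: 2D list of floats (e.g. one color value per pixel).
    Returns a 2D list of integer segment labels. The tolerance is an
    assumed parameter, not taken from McGuire's system.
    """
    rows, cols = len(values), len(values[0])
    labels = [[-1] * cols for _ in range(rows)]
    next_label = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0][c0] != -1:
                continue  # pixel already assigned to a segment
            labels[r0][c0] = next_label
            queue = deque([(r0, c0)])
            while queue:
                r, c = queue.popleft()
                # Spread to 4-connected neighbors with similar values.
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and labels[nr][nc] == -1
                            and abs(values[nr][nc] - values[r][c]) <= tol):
                        labels[nr][nc] = next_label
                        queue.append((nr, nc))
            next_label += 1
    return labels
```

A 2×3 image whose left two columns are dark and right column is bright would come out as two segments, one per region.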

If any points on the map differ from their surroundings, they show up on “uncommon maps,” which highlight areas of unusual coloring. From these uncommon maps, McGuire creates interest maps, which are the final product of the analysis.
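The uncommon-map step can be sketched as flagging pixels whose segment covers only a small fraction of the image. The threshold here is a guess; the article doesn’t specify McGuire’s criterion for “uncommon”:

```python
from collections import Counter

def uncommon_map(segment_labels, rarity_fraction=0.05):
    """Flag pixels belonging to rare segments.

    segment_labels: 2D list of ints, one segment label per pixel (as
    from a segmentation map). Returns a 2D list of 0/1, with 1 marking
    pixels in segments that cover at most rarity_fraction of the image.
    Illustrative stand-in, not McGuire's actual rule.
    """
    flat = [lab for row in segment_labels for lab in row]
    counts = Counter(flat)
    total = len(flat)
    rare = {lab for lab, n in counts.items() if n / total <= rarity_fraction}
    return [[1 if lab in rare else 0 for lab in row]
            for row in segment_labels]
```

On a map that is almost all one segment, only the odd pixels out are flagged, which is the behavior the article describes: unique areas stand out from their surroundings.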

In the past, astronomers have primarily launched unmanned robots, like the Mars rovers Spirit and Opportunity, to take photographs and soil samples, which are beamed back for Earth-bound explorers to study. McGuire’s system allows a robot, or robot-assisted human explorer, to focus on areas more likely to merit further investigation. Though most maps created by the HNN come from still pictures, McGuire hopes to apply his neural system to video, using feedback loops to look for significant changes in color over time as terrain is traversed. That would allow the robots to “look at a series of images in real time,” McGuire said.

Because of the feedback, the novelty detection software has a memory of stored information based on what it has already seen. The AI system “takes the set of ones and zeros [assigned to the colors] and looks for other sets of ones and zeros that [are] similar to that,” McGuire said. “It’s a way to look for patterns that are familiar.”
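That pattern-matching idea can be sketched with a minimal Hopfield network used as an associative memory: stored patterns become stable states, and a new input counts as “familiar” if it settles onto one of them. This is a generic textbook Hopfield sketch (with ±1 units rather than literal ones and zeros), not McGuire’s actual software:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for a Hopfield network storing +/-1 patterns."""
    n = len(patterns[0])
    w = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p, dtype=float)
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w / len(patterns)

def recall(w, probe, steps=10):
    """Synchronously update the state until it settles; returns +/-1 vector."""
    s = np.asarray(probe, dtype=float)
    for _ in range(steps):
        nxt = np.where(w @ s >= 0, 1.0, -1.0)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

def is_familiar(w, probe, stored):
    """Novelty test: does the probe settle onto a stored pattern?"""
    s = recall(w, probe)
    return any(np.array_equal(s, np.asarray(p, dtype=float)) for p in stored)
```

With one stored pattern, a probe that differs in a single bit is pulled back to the stored pattern and judged familiar; a pattern that settles somewhere else would be flagged as novel.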

Even before cyborg astrobiologists become fully autonomous, astronauts could use this type of technology for planetary exploration. As of now, the system is a “wearable computer,” McGuire said, with the potential to become a semi-autonomous system functioning without human control.

The computer system includes a video camera, visor, keyboard, mouse, and long-lasting battery pack that all strap on to the user.

To develop the system, McGuire drew from his experience with neural networks that began with his undergraduate education at U of C. “I brought ideas, know-how, and technology” to create the cyborg system, he said. Past projects required him to be able to find points of interest in images, which helped him come up with the system of uncommon mapping.

Besides being used in space, McGuire said this technology could be used to study harsh and extreme environments on Earth, such as the ocean floor and Antarctica.