CHICAGO – Crewmembers on the International Space Station will help a group of researchers at NASA’s Ames Research Center this summer test robotic software that could take over some of the time-consuming maintenance chores that keep them from conducting experiments aboard the orbiting laboratory.

"One of the big things NASA cares a lot about is how can we make astronaut time in space, whether it’s on the space station or missions beyond the space station, more efficient and more productive," said Terrence Fong, one of the Ames researchers. "One of the big challenges we face right now is the vast amount of time the crew spends doing routine, repetitive, tedious maintenance work."

For the third year in a row, the Ames group will use the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (Spheres) testbed on board the station to evaluate systems intended for free-flying astronaut helpers in space. Spheres is a set of basketball-sized spacecraft powered by compressed gas that maneuver inside the station’s pressurized modules, guided by software uploaded from the ground for testing.

In the telerobotic free-flyer project, researchers upgraded the original Spheres processors with off-the-shelf smartphones, which far outstrip the satellites’ original processing capability and can handle the demands of real-time telerobotics, said Fong, who presented an update on his team’s work at the ISS Research and Development Conference here.

In the first series of tests, run Dec. 12, 2012, with the "Smart Spheres" upgrade, the group demonstrated simple ground control using a laptop at Mission Control Center-Houston. One of the Spheres conducted a simulated inspection flight inside Japan’s Kibo module – the largest on the station. The following year the group loaded controls onto one of the station’s laptops so the crew could command a Smart Sphere to circle another Sphere holding a stationary position, exercising the kind of guidance and pointing that would be needed to inspect the station interior or sample sound levels and air quality.
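The circle-and-point maneuver is essentially a geometry exercise: fly waypoints around a fixed target while continuously re-aiming the camera at it. The short Python sketch below illustrates that idea under assumed values; the target position, standoff radius, and waypoint count are invented, and none of this represents the actual Smart Spheres flight software.

```python
import math

# Minimal 2D sketch of a circle-and-point exercise: a free-flyer moves
# around a stationary target while re-aiming its heading at that target.
# All names and numbers here are illustrative assumptions, not the
# actual Smart Spheres software.

TARGET = (0.0, 0.0)   # assumed position of the stationary Sphere (meters)
RADIUS = 1.0          # assumed standoff distance for the inspection orbit
STEPS = 8             # number of waypoints around the circle

def waypoint(i):
    """Position and pointing command for the i-th waypoint."""
    angle = 2.0 * math.pi * i / STEPS
    x = TARGET[0] + RADIUS * math.cos(angle)
    y = TARGET[1] + RADIUS * math.sin(angle)
    # Heading that keeps the camera aimed at the target.
    heading = math.atan2(TARGET[1] - y, TARGET[0] - x)
    return x, y, heading

for i in range(STEPS):
    x, y, heading = waypoint(i)
    print(f"waypoint {i}: pos=({x:+.2f}, {y:+.2f}) m, "
          f"point-at heading={math.degrees(heading):+.1f} deg")
```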

They also evaluated the use of relative Wi-Fi signal strength as a way to guide a Smart Sphere from one station module to another, a process that didn’t work very well because of the limited number of Wi-Fi sources inside the station – two in U.S.-built nodes and one in a Russian module.

"You can realistically rely on being able to detect your location relative to two or at most three wireless access points," Fong said of the signal strength "fingerprinting" technique used with the smartphone’s Wi-Fi receiver. "In a typical office building today the density is much higher. You might see several or even dozens of access points as you move around."

This summer the group plans to upgrade the Spheres again with an even smarter phone – a Google "Project Tango" prototype that includes a structured-light sensor derived from Microsoft’s Xbox Kinect technology. The Ames researchers want to use light patterns beamed by the smartphone sensor to generate 3D vision that can guide the floating sphere through the station interior.

"We’ve done a bunch of testing in the lab at Ames and … a bunch of parabolic flights to try to characterize the performance of that smartphone," Fong said. " … This gives us the ability to sense in real time, at pretty high frame rates – more than 30 hertz – the depth of the various things in the world, and based on that we can construct 3D models as well do 3D navigation."