
Google’s Project Tango headed to International Space Station


Source: http://www.pcworld.com/article/2110660/googles-project-tango-headed-to-international-space-station.html

Google’s Project Tango, the prototype smartphone packed with sensors so it can learn and sense the world around it, is heading to the International Space Station.

Two of the Tango phones are due to fly to the ISS on the upcoming Orbital 2 mission, which is scheduled to launch in May to take supplies to the station. The phones will be used as part of a NASA project that is developing robots that could one day fly around the inside or outside of the space station, or even be used in NASA’s planned mission to land on an asteroid.

Work on the robots is already going on at NASA’s Ames Research Center in Silicon Valley, and this week the space agency let a small group of reporters visit its lab and see some of the research.

[Image: Three Spheres satellites float inside the International Space Station.]

The phones, which are being supplied to a limited number of developers at present, were unveiled by Google a month ago. They include several cameras and infrared range-finding so the phone can build up a three-dimensional model of its surroundings—a significant difference from current handsets that can see only a two-dimensional world through a single camera.
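
To make that difference concrete, here is a minimal sketch of why depth sensing matters: given a per-pixel depth map and the camera intrinsics, every pixel back-projects to a 3-D point, which a single ordinary camera image cannot give you. This is not Tango’s actual pipeline; the intrinsics and the random depth values below are made up for illustration.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (focal lengths and principal point)
fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0

# Stand-in for a 640x480 depth image from an infrared range-finder, in metres
depth = np.random.uniform(0.5, 4.0, (480, 640))

v, u = np.indices(depth.shape)   # pixel row (v) and column (u) grids
z = depth
x = (u - cx) * z / fx            # back-project each pixel into 3-D space
y = (v - cy) * z / fy
points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
print(points.shape)              # (307200, 3): one 3-D point per pixel
```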

Google has already shown the phones being used to build up a detailed map of the interior of a home or office, but NASA has much bigger plans. At Ames, which is just minutes from Google’s Mountain View headquarters, researchers have attached a Tango handset to a robot development platform called a “Sphere.”

Technically an 18-sided polyhedron, each Sphere is about the size of a soccer ball and self-powered. They can free-fly around the inside of the ISS thanks to carbon dioxide-powered thrusters, said Chris Provencher, Smart Spheres project manager at NASA.

The Spheres have already been used in developing autonomous equipment. The space agency conducted a Spheres test with a Nexus S smartphone as part of Shuttle mission STS-135 in 2011, but the Tango phones promise more capabilities.

“We are researching how effective Project Tango’s vision-based navigation capabilities are for performing localization and navigation of a mobile free flyer on ISS,” said Andres Martinez, Spheres Manager at NASA.

“Specifically, we are researching how well the 3-D modeling and visual odometry can be used to let the [Spheres] free flyer learn its environment and maneuver through it based on what it sees,” said Martinez. “This is in contrast to the current Spheres localization system, which relies on fixed sensors in the environment to help the Spheres track its position.”
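
To show the contrast Martinez draws, here is a minimal sketch, in Python with OpenCV, of a single visual-odometry step: estimating the camera’s motion from two consecutive frames of what it sees, rather than from fixed sensors in the environment. This illustrates the general technique, not NASA’s or Google’s actual code; the function name and arguments are hypothetical.

```python
import cv2
import numpy as np

def relative_pose(prev_frame, curr_frame, K):
    """Estimate rotation R and translation direction t between two
    grayscale frames, given the 3x3 camera intrinsic matrix K."""
    # Detect and describe features in both frames
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(curr_frame, None)

    # Match features between frames
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # The essential matrix encodes the camera motion; RANSAC rejects outliers
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t  # t is a unit direction: absolute scale needs another sensor
```

Chaining these relative poses frame to frame is what lets a free flyer track its own position from vision alone.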

On Monday, NASA Administrator Charles Bolden saw a demonstration of the Tango-equipped Spheres during a visit to Ames. One of the phones was connected to a Spheres satellite, which slowly glided across a huge granite table in a laboratory.

There are already three Spheres units on the ISS.

Hearing that researchers are working toward a robot that would autonomously fly around the inside, and possibly outside, of the ISS carrying out checks, Bolden asked if the same technology could be put to use on NASA’s planned asteroid mission. The space agency wants to approach and capture a piece of an asteroid, and Bolden wondered if the work could form the basis of a robot that could approach, analyze and help identify a target for the mission.

That could be so, said Provencher.

Researchers hit upon the idea of using smartphones in their development work when they realized the features they wanted—Wi-Fi, a camera, more processing power—were all present in an off-the-shelf device.

The phones in use by NASA have had several minor modifications. The lithium-ion battery pack has been removed and replaced with six AA batteries, and the cellular radio chip has been taken out to put the phone into “the ultimate airplane mode,” said Provencher. A cover has also been placed over the screen to contain pieces of glass should it shatter.

(Additional reporting by Melissa Aparicio in San Francisco.)

A close(ish) encounter with Voyager 2

Source: http://robohub.org/a-closeish-encounter-with-voyager-2/

It is summer 1985. I’m visiting Caltech with colleague and PhD supervisor Rod Goodman. Rod has just been appointed in the Electrical Engineering Department at Caltech, and I’m still on a high from finishing my PhD in Information Theory. Exciting times.

Rod and I are invited to visit the Jet Propulsion Laboratory (JPL). It’s my second visit to JPL, but this one turned into probably the most inspirational afternoon of my life. Let me explain.

After the tour, the good folks who were showing us round asked if I would like to meet some of the post-docs in the lab. As one of them put it: the fancy control room with the big wall screens is really for the senators and congressmen; this is where the real work gets done. So, while Rod went off to discuss stuff with his new faculty colleagues, I spent a couple of hours in a back-room lab with a Caltech post-doc working on, as he put it, a summer project. I’m ashamed to say I don’t recall his name, so I’ll call him Josh. Very nice guy, a real Southern Californian dude.

Now, at this point, I should explain that there was a real buzz at JPL. Voyager 2, which had already more than met its mission objectives, was now on course for Uranus and due to arrive in January 1986. It was clear that a significant amount of work was going into planning for that event: the first ever opportunity to take a close look at the seventh planet.

So, Josh is sitting at a bench, and in front of him is a well-used Apple II computer. Behind the Apple II is a small display screen so old that the phosphor is burned. This used to happen with CRT computer screens – it’s the reason screen savers were invented. Beside the computer are notebooks and manuals including, prominently, a piece of graph paper with a half-completed plot. Josh then starts to explain: one of the cameras on Voyager 2 has (they think) a tiny piece of grit* in the camera turntable – the mechanism that allows the camera to be panned. This space grit means that the turntable is not moving as freely as it should. It’s obviously extremely important that the cameras can be pointed accurately when Voyager reaches Uranus, so Josh’s project is to figure out how much torque is (now) needed to move the camera turntable to any desired position. In other words: to re-calibrate the camera’s controller.

At this point I stop Josh. Let me get this straight: there’s a spacecraft further from Earth, and flying faster, than any man-made object ever, and your summer project is to do experiments with one of its cameras, using your Apple II computer. Josh: yeah, that’s right.

Josh then explains the process. He constructs a data packet on his Apple II, containing the control commands to address the camera’s turntable motor and to instruct the motor to drive the turntable. As soon as he’s happy that the data packet is correct, he sends it – via the RS-232 connection at the back of his Apple II – to a JPL computer (which, I guess, would be a mainframe). That computer then puts Josh’s data packet together with others from other engineers and scientists also working on Voyager 2, after – I assume – carefully validating the correctness of these commands. The composite data packet is then sent to the Deep Space Network (DSN) to be transmitted, via one of the DSN’s big radio telescopes, to Voyager 2.
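
As a rough illustration of that packet-building step, here is a sketch in Python. To be clear, the real Voyager command format is not described here; the address, opcode and checksum below are invented purely to show the shape of the workflow.

```python
import struct

CAMERA_TURNTABLE = 0x2A   # hypothetical subsystem address
CMD_MOVE = 0x01           # hypothetical "drive turntable" opcode

def build_packet(degrees: float) -> bytes:
    """Pack a turntable-move command: address, opcode, argument, checksum."""
    body = struct.pack(">BBf", CAMERA_TURNTABLE, CMD_MOVE, degrees)
    checksum = sum(body) & 0xFF   # simple 8-bit checksum, for the sketch only
    return body + bytes([checksum])

# On Josh's bench the packet then went out over RS-232 to the JPL computer
# for validation and merging; with a modern serial library that step might
# look something like:
#   import serial
#   with serial.Serial("/dev/ttyS0", 9600) as port:
#       port.write(build_packet(1.5))
```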

Then, some time later, the same data packet is received by Voyager 2, decoded and de-constructed, and said camera turntable moves a little bit. The spacecraft then sends back to Earth, again via a composite data packet, some feedback from the camera: the number of degrees the turntable moved. So a day or two later, via a mind-bogglingly complex process involving several radio telescopes and some very heavy-duty error-correcting codes, the camera-turntable feedback arrives back at Josh’s desktop Apple II with the burned-phosphor screen. This is where the graph paper comes in. Josh picks up his pencil and plots another point on his camera-turntable calibration graph. He then repeats the process until the graph is complete. It clearly worked, because six months later Voyager 2 produced remarkable images of Uranus and its moons.
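
The graph-paper step is, in effect, curve fitting. Here is a small sketch of what the finished calibration enables, with made-up numbers standing in for Josh’s measurements: fit degrees-moved against torque command, then invert the fit to choose a command for a desired pan angle.

```python
import numpy as np

# Hypothetical feedback points: torque command level -> degrees actually moved
torque_cmd = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
degrees_moved = np.array([0.2, 1.1, 2.4, 3.9, 5.6])

# Fit degrees as a function of torque (the graph-paper plot, in effect)
coeffs = np.polyfit(torque_cmd, degrees_moved, deg=2)

def torque_for(target_degrees: float) -> float:
    """Invert the fitted curve numerically to pick a torque command."""
    candidates = np.linspace(torque_cmd.min(), torque_cmd.max(), 1000)
    predicted = np.polyval(coeffs, candidates)
    return float(candidates[np.argmin(np.abs(predicted - target_degrees))])

print(torque_for(3.0))  # torque command estimated to pan ~3 degrees
```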

This was, without doubt, the most fantastic lab experiment I’d ever seen. From his humble Apple II in Pasadena, Josh was doing tests on a camera rig on a spacecraft about 1.7 billion miles away. For a Thunderbirds kid, I really was living in the future. Being a space nerd, I already had some idea of the engineering involved in NASA’s deep space missions, but that afternoon in 1985 really brought home to me the extraordinary systems engineering that made these missions possible. Given the very long project lifetimes – Voyager 2 was designed in the early 1970s, launched in 1977, and is still returning valuable science today – its engineers had to design for the long haul: missions that would extend over several generations. Systems design like this requires genius, farsightedness and technical risk-taking. Engineering that still inspires me today.

*It later transpired that the problem was depleted lubricant, not space grit.