Curiosity Swapping Out Its Main Computer



Curiosity is attempting one of the most complex and dangerous maneuvers possible: switching out its primary on-board computer for the identical, redundant fail-safe computer. The hope is that the swap will restore Curiosity to full operational capability. The failure is due to corrupted flash memory.

Each of Curiosity's RCEs (Rover Compute Elements) is a single-board RAD750 computer, a radiation-hardened computer made by BAE Systems with a PowerPC 750 (G3) CPU clocked at around 200MHz, 256MB of RAM, 2GB of flash, and 256KB of EEPROM. It runs VxWorks as its OS and is managed from a Linux workstation.

http://www.extremete...l-functionality

Link to comment
Share on other sites

I believe they did something similar on the earlier rovers; I always love seeing these suckers pull through! (Wasn't it Opportunity that had bad flash and couldn't upload pictures at first?)

Link to comment
Share on other sites

I believe they did something similar on the earlier rovers; I always love seeing these suckers pull through! (Wasn't it Opportunity that had bad flash and couldn't upload pictures at first?)

If I remember right, yes. I think it's so interesting that such a complex, advanced piece of machinery has about the same specs as a G3 iBook.

Link to comment
Share on other sites

If I remember right, yes. I think it's so interesting that such a complex, advanced piece of machinery has about the same specs as a G3 iBook.

Seriously. We have smartphones with higher specs than Curiosity. It's actually a little sad.

Link to comment
Share on other sites

Seriously. We have smartphones with higher specs than Curiosity. It's actually a little sad.

Radiation is difficult to deal with since you have so many different types (Total Ionizing Dose, Single Event Upset, Proton, Neutron, etc.) that can kill electronics.
Link to comment
Share on other sites

Seriously. We have smartphones with higher specs than Curiosity. It's actually a little sad.

It isn't sad when you consider the importance of such a computer. The rover doesn't need anything high-end. It needs something that's radiation-hardened (i.e., resistant to ionizing radiation) and something that's reliable. You wouldn't put a Ferrari engine in a car you use to drive to work every day, right?

Link to comment
Share on other sites

Radiation is difficult to deal with since you have so many different types (Total Ionizing Dose, Single Event Upset, Proton, Neutron, etc.) that can kill electronics.

Oh, I know. Doesn't make it any less sad. We're making water-resistant dual-core phones (Xperia Z) while Curiosity is up there with nearly decade-old hardware; that's the point. I'm sure there are things that can be done to improve the hardware we send into space. Why would a dual- or quad-core processor not be able to make it through space, when a processor essentially from five or more years ago can? It's a bit ridiculous when you think about it!

It isn't sad when you consider the importance of such a computer. The rover doesn't need anything high-end. It needs something that's radiation-hardened (i.e., resistant to ionizing radiation) and something that's reliable. You wouldn't put a Ferrari engine in a car you use to drive to work every day, right?

No, of course you wouldn't. However, I'd argue this is hardly even remotely close to the same thing. We're talking about exploring space. Don't you believe that hardware capable of doing 10-20 times more would be infinitely more efficient? Using your analogy: if we were to go into space today, would you rather have an engine built many years ago, or something built with today's standards and technology?

Link to comment
Share on other sites

You wouldn't put a Ferrari engine in a car you use to drive to work every day, right?

...I would.

(sorry, somebody had to say it)

Oh, I know. Doesn't make it any less sad. We're making water-resistant dual-core phones (Xperia Z) while Curiosity is up there with nearly decade-old hardware; that's the point. I'm sure there are things that can be done to improve the hardware we send into space. Why would a dual- or quad-core processor not be able to make it through space, when a processor essentially from five or more years ago can? It's a bit ridiculous when you think about it!

No, of course you wouldn't. However, I'd argue this is hardly even remotely close to the same thing. We're talking about exploring space. Don't you believe that hardware capable of doing 10-20 times more would be infinitely more efficient? Using your analogy: if we were to go into space today, would you rather have an engine built many years ago, or something built with today's standards and technology?

I think the Curiosity team explained why such a slow processor was chosen in their AMA. In short, the specs get finalized years before the actual mission, and when Curiosity's planning started, that's what they had available.

Link to comment
Share on other sites

Oh, I know. Doesn't make it any less sad. We're making water-resistant dual-core phones (Xperia Z) while Curiosity is up there with nearly decade-old hardware; that's the point. I'm sure there are things that can be done to improve the hardware we send into space. Why would a dual- or quad-core processor not be able to make it through space, when a processor essentially from five or more years ago can? It's a bit ridiculous when you think about it!

No, of course you wouldn't. However, I'd argue this is hardly even remotely close to the same thing. We're talking about exploring space. Don't you believe that hardware capable of doing 10-20 times more would be infinitely more efficient? Using your analogy: if we were to go into space today, would you rather have an engine built many years ago, or something built with today's standards and technology?

I think there must be a technical reason to choose that old system instead of our 8 cores.
Link to comment
Share on other sites

Oh, I know. Doesn't make it any less sad. We're making water-resistant dual-core phones (Xperia Z) while Curiosity is up there with nearly decade-old hardware; that's the point. I'm sure there are things that can be done to improve the hardware we send into space. Why would a dual- or quad-core processor not be able to make it through space, when a processor essentially from five or more years ago can? It's a bit ridiculous when you think about it!

Power consumption, for one. Plus, as was already said, things like this are designed for a specific processing requirement, and anything more than that is wasted resources. Your phone has a million more functions than Curiosity, so of course it's going to have a million times more processing power. In the electronics world, a few MHz and a few KB go a long, long way.

Link to comment
Share on other sites

Seriously. We have smartphones with higher specs than Curiosity. It's actually a little sad.

But the reason for it is radiation hardening, way, way more testing in space-like conditions, and so on.

Link to comment
Share on other sites

No, of course you wouldn't. However, I'd argue this is hardly even remotely close to the same thing. We're talking about exploring space. Don't you believe that hardware capable of doing 10-20 times more would be infinitely more efficient? Using your analogy: if we were to go into space today, would you rather have an engine built many years ago, or something built with today's standards and technology?

I understand what you're saying, and I agree it makes sense to use powerful hardware to aid in the exploration of space and planets. However, it makes even more sense to use something that's reliable and stable; a computer hardware failure would kill a mission. Also, I'm sure they'd use something more powerful if they needed it. Anything beyond that would be a waste of power.

Link to comment
Share on other sites

In response to all of the above responses directed towards me:

Don't get me wrong, I don't think an octo-core is necessary. I also understand the issues with power consumption and wasted resources. I don't expect them to use brand-spanking-new, top-of-the-line hardware. I just find it odd that it's as far behind as it is. Personally, I think they could have done better. Maybe next time? ;)

Link to comment
Share on other sites

You wouldn't put a Ferrari engine in a car you use to drive to work every day, right?

I would; I'd have the only Ferrari-powered Vauxhall Omega in existence (I think).

Link to comment
Share on other sites

The RAD computers are designed from the start to be radiation resistant, largely by using larger circuit elements on the die at lower clock speeds. This allows the chip to keep working even if one of the CPU circuit elements takes a direct hit from a cosmic ray (usually an energetic proton), because the ions that event creates are few relative to a much larger current flow. Higher clock speeds and smaller elements would make these radiation-induced ions statistically more significant. They also use error correction to a much greater degree than other computers.
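The error-correction idea can be made concrete with a toy example. This is just a sketch of the principle (a Hamming(7,4) code locating and fixing a single flipped bit); real spacecraft memory uses much heavier ECC, and the function names here are made up for illustration:

```python
# Toy single-event-upset demo: Hamming(7,4) corrects any single flipped
# bit in a 4-bit data word. Illustrative only, not any rover's actual ECC.

def hamming74_encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4        # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Locate and fix a single bit error; returns the corrected codeword."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = [1, 0, 1, 1]
code = hamming74_encode(word)
hit = list(code)
hit[4] ^= 1                           # cosmic ray flips one bit
assert hamming74_correct(hit) == code
```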

SpaceX uses newer computer parts, but compensates for each board having a bit more radiation sensitivity by using them in polled arrays: if a board takes a hit and sends a result differing from the others, it gets voted off the island and the others continue. Later they can reset it to see if it's let back in the game.
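That voting scheme can be sketched in a few lines. This is a minimal model of the idea described above, not SpaceX's actual software; all class and function names are illustrative:

```python
# Minimal majority-vote redundancy sketch: disagreeing boards are benched
# ("voted off the island") and can be reinstated after a reset check.
from collections import Counter

class VotingArray:
    def __init__(self, boards):
        self.boards = list(boards)   # callables: each computes the result
        self.benched = set()         # indices of boards voted off

    def step(self, inputs):
        votes = {i: b(inputs) for i, b in enumerate(self.boards)
                 if i not in self.benched}
        majority, _ = Counter(votes.values()).most_common(1)[0]
        for i, v in votes.items():
            if v != majority:        # a disagreeing board is benched
                self.benched.add(i)
        return majority

    def try_reinstate(self, i, inputs, expected):
        # After a reset, let the board back in only if it answers correctly.
        if self.boards[i](inputs) == expected:
            self.benched.discard(i)

good = lambda x: x * 2
flaky = lambda x: x * 2 + 1          # simulates a radiation upset
array = VotingArray([good, good, flaky])
assert array.step(21) == 42          # the majority masks the bad board
assert array.benched == {2}
```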

Tidbit: a couple of days ago, a UK micro-satellite that is to test using a Nexus One smartphone as its main computer was orbited by an Indian launcher. Should get results soon.

Link to comment
Share on other sites

Also see "They Write the Right Stuff." Consider how obnoxiously buggy and undigested nearly every piece of today's code is. You can't do any updates over a 14-minute-latency, ~2 (!) kbit/s channel to something that had to be gotten there in 8 months and cost several billion. There's zero room for error, and zero minus one room for any field testing: it has to work, and it's accepted that even then it will fail for unknown reasons (and it does). The solution is to slow development down to a mind-boggling degree, iron out every possible bug, optimize thoroughly, and check for what would otherwise be considered absolute impossibilities. And that's why "we haven't got anywhere." It's for this reason I'd argue that we haven't got anywhere near better with our earthly, ridiculously fast hardware and big blobs of software and useless data, either: it's many orders of magnitude more inefficient, buggy, and insecure, being crapped out like shyte.

The middle road is to throw hardware at the problem: multiple arrays of lower-cost parts. I think it's the wrong way, since it expects and allows for known problems.

Link to comment
Share on other sites

How long does it take for commands sent from Earth to reach Curiosity?

The one-way communication delay with Earth varies from 4 to 22 minutes depending on the planets' relative positions, with 12.5 minutes being the average. It took 13 minutes 46 seconds at the time of landing.
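Those figures are just light travel time, so they're easy to sanity-check. A back-of-the-envelope sketch, using approximate (not mission-data) Earth-Mars distances:

```python
# One-way light time is simply distance divided by the speed of light.
C_KM_S = 299_792.458                 # speed of light, km/s

def one_way_delay_min(distance_km):
    return distance_km / C_KM_S / 60

closest  = one_way_delay_min(75e6)   # ~75 million km at a close approach
farthest = one_way_delay_min(400e6)  # ~400 million km near conjunction
assert 4 < closest < 5               # roughly the "4 minutes" figure
assert 21 < farthest < 23            # roughly the "22 minutes" figure
```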

Link to comment
Share on other sites

The more interesting question is how they send commands to something that is constantly moving away or coming closer at high speed. The baud rate must be ever-changing.
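What mainly changes with relative motion is the carrier frequency (Doppler shift), which ground stations continuously track. A rough sketch of the magnitude, with illustrative numbers (an X-band carrier around 8.4 GHz is assumed here):

```python
# Classical Doppler approximation, fine for km/s-scale spacecraft speeds.
C_M_S = 299_792_458.0                     # speed of light, m/s

def doppler_shift_hz(f_carrier_hz, radial_velocity_m_s):
    return f_carrier_hz * radial_velocity_m_s / C_M_S

shift = doppler_shift_hz(8.4e9, 10_000)   # ~10 km/s relative velocity
assert 279_000 < shift < 281_000          # carrier moves by ~280 kHz
```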

Link to comment
Share on other sites

Oh, I know. Doesn't make it any less sad. We're making water-resistant dual-core phones (Xperia Z) while Curiosity is up there with nearly decade-old hardware; that's the point. I'm sure there are things that can be done to improve the hardware we send into space. Why would a dual- or quad-core processor not be able to make it through space, when a processor essentially from five or more years ago can? It's a bit ridiculous when you think about it!

No, of course you wouldn't. However, I'd argue this is hardly even remotely close to the same thing. We're talking about exploring space. Don't you believe that hardware capable of doing 10-20 times more would be infinitely more efficient? Using your analogy: if we were to go into space today, would you rather have an engine built many years ago, or something built with today's standards and technology?

Yours is a perfect example of consumer mentality. Hardware is not really obsolete until it stops working, yet the vendors have made us believe we always need the latest and greatest (and most expensive). With the right software and optimizations, apparently modest hardware can be enough for any task. I mean, I'm sure NASA techs know a little more about this than us, right? And I'm willing to bet that Curiosity is a little bit more mission-critical than our Facebook-checking pocket devices. Just a little.

Link to comment
Share on other sites

Why do people think old hardware is bad? Old hardware is much better understood than new hardware: its characteristics are known, as are its rare bugs and other problems, how long parts last, and so on. Plus you can optimise it to use much less power by shrinking the die.

The reason ABSOLUTELY NO single-device (this excludes clusters) mission-critical system uses a brand-new CPU or type of RAM is that it hasn't been tested; for all you know, after spending £500m on getting a spacecraft built and launched into space, it could fail right away.

I wish people would stop moaning that things don't have the latest and greatest hardware in them. They don't need the latest and greatest hardware; what they need is reliability.

Link to comment
Share on other sites

This topic is now closed to further replies.