Why NASA won't use Windows. [Article]

Talk about anything you want, but keep it within the rules, please.
vosszaa
Old Wrinkly Member
Posts: 247
Joined: March 7th, 2007, 7:04 am

Why NASA won't use Windows. [Article]

Post by vosszaa »

I found this quite interesting.

The first American in space, Alan Shepard, said: "It's a very sobering feeling to be up in space and realise that one's safety factor was determined by the lowest bidder on a government contract." It must be even more sobering for some of today's astronauts that some of the computers up in space are older than they are. You might think the likes of NASA's Phoenix Mars Lander - which touched down on the red planet in May - would feature staggeringly powerful computing hardware and some of the most highly advanced software. But conditions beyond the Earth's atmosphere mean the computers on board space missions, whether manned or unmanned, are designed to survive rather than to impress. The tiniest design fault or software glitch could spell disaster - truly life or death.

Temperature variations, radiation, shock and vibration all put a greater strain on components than they will ever be subject to on Earth. The vacuum means that air-cooling isn't an option, and normal storage methods simply don't work. Combine that with a requirement for equipment to last ten times longer than your home PC ever will (the still-operational Voyager mission, for example, launched back in 1977), and you start to realise the enormity of the challenges faced by the scrupulous engineers who design the computer hardware and software inside shuttles and other space-bound vehicles.

The world of science fiction bandies around concepts such as robots and warp drives with an ease only afforded by the unconstrained imagination. By comparison, the reality is positively archaic. The twin Voyager spacecraft are the two furthest man-made objects from Earth - Voyager 1 is in the region of ten billion miles from the sun, but to get there has been a journey of 31 years. Think back to what passed for computing technology in 1977 and you'll start to get a picture of how dated Voyager's equipment actually is. Processing power on board each craft is provided by three RCA 1802 chips running at a pedestrian 6.4MHz, while RAM is limited to a paltry 12KB. But when Voyager 2 sent back groundbreaking data in December last year suggesting that the Solar System is asymmetrical, the age of the components didn't matter a jot. Voyager's project scientist Ed Stone described the Voyager missions as "the most romantic and beautiful project ever attempted by NASA" and, thanks to the reliability of its on-board systems, that project survives to this day - almost 20 years after it achieved its primary mission objective of flybys of Jupiter and Saturn. And, partly thanks to the low power drain of those limited processors, the twin Voyager craft are expected to continue to feed us priceless data about our solar system until at least 2020.

With spacecraft, the "bottom line" is reliability: reducing faults to an absolute minimum through exhaustive and meticulous testing, built-in redundancy and sticking to tried and tested methods. And, while the computing equipment on the likes of the Voyager craft and the famous Apollo 11 mission that took Neil Armstrong to the moon is pitifully basic compared with even something as humble as an Eee PC, that doesn't mean it isn't essential to the success of those missions.

For Apollo 11, engineers on the ground calculated the real-time trajectory to the moon and back based on information fed to Earth by the spacecraft's 30kg on-board computer. They also devised three separate solutions for the all-important lunar descent and compared reams of data transmitted from space with predicted values to detect potential problems. As Apollo 13 commander Jim Lovell is quoted as saying: "Space flights are not miracles, but are directly related to technological engineering on the ground."

Fast-forward to the present day, and the same painstaking approach to developing and testing computer components destined for space remains intact. The Phoenix's descent to the polar regions of Mars earlier this year may have marked another notable landmark for space exploration, but the computing technology inside it was far from cutting-edge. In fact, the 33MHz processor on the radiation-hardened RAD6000 single-board computer inside the lander traces its lineage back to IBM's POWER architecture, forerunner of the PowerPC chips once found inside Apple Macs.

Yet, despite its modest credentials, the RAD6000 can be found in dozens of NASA programmes. Vic Scuderi, business area manager for BAE Systems' Space Electronics division - the company behind the RAD6000 - spoke to PC Pro about the challenges involved in building hardware and software for use in space. "We've been told it's become the workhorse of the industry," Scuderi said of the RAD6000. "It's been proven in space. One of our measurements isn't so much that we've got the latest and greatest, but that we're able to show by heritage this computer is able to do what it's supposed to do."

The stresses that computing hardware has to withstand in such extreme conditions include temperatures that can vary between plus and minus 120C, massive shock and vibration during launch and landing, and - probably the most threatening - radiation. Ionisation of the semiconductor materials leads to a slow degradation of transistor performance, resulting in increased current leakage and a shifting of switching thresholds. Sufficient accumulation will eventually cause failure of the chips comprising the computer system. Scuderi claims the RAD6000 was developed to withstand both single-event upsets and total-dose radiation, at serious levels. "A human being can't take more than 400 or 500 rads [the unit used to measure absorbed radiation]," he said. "When satellites are orbiting in extreme radiation belts, hardware has to be designed to cope with 500,000 to 1,000,000 rads."
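To see how a single-event upset can be masked in practice, here is a minimal sketch of one classic software defence: storing a value in triplicate and majority-voting on every read. The TmrWord class is a hypothetical illustration, not how the RAD6000 works - real flight systems lean on hardware ECC and radiation-hardened cells alongside schemes like this.

[code]
# Minimal sketch of one software defence against single-event upsets:
# keep three copies of each value and majority-vote on every read.
# Hypothetical illustration only - real flight hardware uses ECC and
# radiation-hardened memory cells alongside schemes like this.

class TmrWord:
    """A value stored in triplicate; reads out-vote any single upset."""

    def __init__(self, value: int):
        self.copies = [value, value, value]

    def read(self) -> int:
        # Bitwise majority vote: a bit is 1 if at least two copies agree.
        a, b, c = self.copies
        voted = (a & b) | (a & c) | (b & c)
        # "Scrub" all copies so upsets do not accumulate between reads.
        self.copies = [voted, voted, voted]
        return voted

word = TmrWord(0b1010)
word.copies[1] ^= 0b0100        # simulate a radiation-induced bit flip
assert word.read() == 0b1010    # the majority vote masks the upset
[/code]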

This is achieved by combining radiation hard-by-design and hard-by-process techniques. The former ensures that custom-designed components can divert single-event upset radiation (similar to a lightning rod) to avoid damaging semiconductor materials, while the latter builds radiation tolerance into the silicon process used to create the chips from the ground up, so they can withstand radiation over time. This dual approach reduces the reliance on redundancy, where more than one of a particular system is present to reduce the effect of damage or failure caused by radiation. This methodical, ruthlessly disciplined development process ensures failures are kept to a minimum. "When I purchase an integrated circuit," said Scuderi, "I need to know when it was made, what lot it was part of, what testing was done, its certificates of conformance, and so on. What could have been a $3 commercial product becomes a $10,000 part, so when we reach our final testing, if a part failure occurs, we can trace it all the way back to the origins and find the source of the failure."

Raphael Some, New Millennium Program Technologist at NASA's Jet Propulsion Laboratory, told us: "The philosophy is to test to levels beyond those expected. Usually a safety factor of two to three is applied to environmental stresses at the component level, and somewhat less at the subsystem and system levels."
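Taken literally, that philosophy is simple arithmetic: multiply each expected environmental stress by the margin before setting a test level. A sketch under that reading, with entirely made-up stress figures:

[code]
# Sketch of deriving qualification test levels from expected stresses
# using the safety factors Some describes. Every figure below is an
# illustrative assumption, not a NASA specification.

EXPECTED_STRESS = {            # expected in-flight environment (assumed)
    "vibration_grms": 7.0,     # random vibration, g RMS
    "shock_g": 1500.0,         # pyrotechnic shock, peak g
    "temp_swing_c": 120.0,     # temperature excursion, deg C
}

def qualification_level(expected: float, safety_factor: float) -> float:
    """Test 'to levels beyond those expected' by a fixed margin."""
    return expected * safety_factor

for name, level in EXPECTED_STRESS.items():
    # Factor of two to three at component level, per the quoted philosophy.
    print(f"{name}: test component to {qualification_level(level, 2.5):g}")
[/code]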

Unsurpassed reliability might be the primary requirement for space hardware, but if it can't match its dependability with the most parsimonious use of energy, it's about as likely to fly as an astronaut with the measles. Power is scarce, but every bit of power consumed produces the equivalent amount of heat, and dissipating that heat from a spacecraft in a vacuum isn't easy. There are usually large black radiators for this purpose that perform essentially the opposite function to the solar panels they visually resemble.
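The scale of that problem follows from the Stefan-Boltzmann law (radiated power = emissivity x constant x area x T^4): the panel area needed to dump a given power rises steeply as the allowed panel temperature falls. A rough sizing sketch, with an assumed emissivity and panel temperature, ignoring any heat absorbed from the environment, and using the 20W computer budget quoted below:

[code]
# Rough sizing of the black radiators described above, using the
# Stefan-Boltzmann law: P = emissivity * sigma * area * T**4.
# Emissivity and panel temperature are assumptions for illustration.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Panel area needed to reject power_w into deep space at temp_k."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# Rejecting a 20W computer budget from a panel held at 290K (~17C):
print(f"{radiator_area(20.0, 290.0):.3f} m^2")   # about 0.055 m^2
[/code]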

The Phoenix Mars Lander will face the sun for the duration of its three-month mission, so it can rely purely on solar power to conduct its work. But solar panels produce modest power, and the RAD6000 computer is given an allocation of only 20W. "Typical power budgets for deep-space missions are of the order of a few tens of watts for the entire computer system," said Some. "This impacts not only the hardware that may be flown, but also the software that can be accommodated - both of which must be extremely efficient to ensure correct and timely operation within the power constraints."

Solar power is the most viable source for missions such as the Phoenix Mars Lander, but it becomes less practical the further a craft travels from the sun. In these cases - such as for the twin Voyager craft - radioisotope thermoelectric generators (RTGs) are used instead. The heat generated by the radioactive decay of plutonium oxide is converted into electricity by a thermoelectric converter. Each of the Voyager craft has three such generators on board that provide power to all its systems, including the on-board computers. Over time, there is a gradual decrease in the amount of energy they create, which means non-essential systems are slowly being switched off, but the RTGs are expected to provide enough power for the craft to function until at least 2020.
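That decline is predictable because it tracks the half-life of the fuel. A back-of-the-envelope sketch: plutonium-238 has a half-life of about 87.7 years, while the launch power used here is an assumed round number, not an official figure (real output falls somewhat faster, since the thermocouples degrade too).

[code]
# Sketch of the gradual RTG power decline the article describes.
# Pu-238 half-life is about 87.7 years; the 470W launch figure is an
# assumed approximation. Real output falls somewhat faster because
# the thermoelectric converters also degrade over time.

HALF_LIFE_YEARS = 87.7
LAUNCH_POWER_W = 470.0         # assumed total for one Voyager craft

def rtg_power(years_since_launch: float) -> float:
    """Electrical power from isotope decay alone (ignores converter wear)."""
    return LAUNCH_POWER_W * 0.5 ** (years_since_launch / HALF_LIFE_YEARS)

for year in (0, 31, 43):       # launch (1977), this article (2008), 2020
    print(f"t+{year:2d} years: {rtg_power(year):.0f} W")
[/code]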

The sheer variation in temperature is another problem that computer systems are exposed to. "Space can be extremely hot - surfaces facing the sun, or entering the atmospheres of Venus or Jupiter, for example - or extremely cold - surfaces facing deep space and shielded from the sun or when operating on the surface of Mars," Some told us. "There are two basic approaches used for dealing with these extreme environments: placing the computer equipment in a thermally-controlled environment, or designing the computer to operate in extreme temperature environments in the first place.

"The former, involving the use of 'cold plates' and 'warm boxes', is the most commonly used strategy. This allows the computer to experience relatively benign temperature ranges at the cost of electric heaters, thermal radiators, heat pipes and other thermal-management hardware. The latter - based on the use of custom materials and circuits that can operate through extremely hot or cold environments - is non-standard, but is the only approach available in certain cases, and is becoming more prevalent as available power and mass margins continue to decrease, and the environments being visited become more challenging."
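The "warm box" approach boils down to a thermostat that spends heater power to hold the electronics inside a benign band. A minimal sketch - the set-points and heater interface are hypothetical, chosen only to show the bang-bang logic:

[code]
# Minimal sketch of 'warm box' thermal control: a bang-bang thermostat
# that burns heater power to keep electronics in a benign range.
# Set-points and interface are hypothetical, for illustration only.

HEATER_ON_BELOW_C = -10.0   # assumed lower edge of the benign range
HEATER_OFF_ABOVE_C = 10.0   # assumed upper edge

def heater_command(temp_c: float, heater_on: bool) -> bool:
    """Return the new heater state for a given box temperature."""
    if temp_c < HEATER_ON_BELOW_C:
        return True           # too cold: spend power to stay warm
    if temp_c > HEATER_OFF_ABOVE_C:
        return False          # warm enough: save the power budget
    return heater_on          # inside the deadband: hold current state

state = False
for reading_c in (-25.0, -5.0, 12.0, 3.0):
    state = heater_command(reading_c, state)
    print(f"{reading_c:6.1f} C -> heater {'ON' if state else 'off'}")
[/code]

The deadband between the two set-points is what stops the heater chattering on and off around a single threshold.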

Of course, the strongest, most durable hardware in the world means nothing if the software that controls it isn't designed to the same standard. You won't find Windows loaded on board any space shuttle - the conditions, and the need for far higher levels of stability than any commercial package can offer, mean the software found on spacecraft is as bespoke as the hardware. NASA's Some outlines the challenges: "Space missions require custom, one-off software that is correct and catastrophic-bug-free. It must operate in extreme environments, and in circumstances which are often unknown. In addition, the design of software systems needs to be tolerant to hardware faults as well as software faults.

"It must be possible for these software systems to be easily and reliably modified (patched, augmented or modified to accommodate new and unanticipated requirements, for example) after launch, from the ground, over low-bandwidth communication links and with minimal ability to remotely observe or control system operation during software mod installation, checkout and operation." With these challenges in mind, it's no surprise that software is designed in-house, and with the same obsessively detailed care and attention paid to hardware development. Each line of code is tested and re-tested, requiring multiple sign-offs before it's accepted as bug-free. The cost of failure doesn't bear thinking about: the Ariane 5 rocket's voyage lasted only 37 seconds before the software system crashed due to an error-handling element in the code not being enabled. The result? The $370 million-project ended in a ball of fire.

As the nature of space missions becomes more specialised, the demands placed on the software on board increase. The Phoenix Mars Lander is a case in point. "The software running the RAD6000 was responsible for navigating the spacecraft from Earth to Mars," Scuderi said. "The same computer was used for the landing process. Once on the ground, that same software evolves into a science lab, where the samples of dirt gathered by the robotic scoop are placed in containers and analysed. All elements of this computer can be reprogrammed depending on need."
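One way to picture that in-flight "evolution" is a dispatch table of mode handlers that ground commands can repoint. The sketch below is purely hypothetical - the mode names and uplink function are invented for illustration, not taken from the Phoenix software.

[code]
# Hypothetical sketch of one computer 'evolving' across mission phases:
# a dispatch table of mode handlers that ground commands can repoint.
# Mode names and the uplink mechanism are invented for illustration.

def cruise_step() -> str:
    return "navigating the Earth-to-Mars trajectory"

def landing_step() -> str:
    return "running entry, descent and landing"

def science_step() -> str:
    return "analysing a sample from the robotic scoop"

MODES = {"cruise": cruise_step, "edl": landing_step, "science": science_step}
active_mode = "cruise"

def uplink_set_mode(mode: str) -> None:
    """Simulated ground command that switches the software's role."""
    global active_mode
    if mode not in MODES:
        raise ValueError(f"unknown mode: {mode}")   # reject bad uplinks
    active_mode = mode

for phase in ("cruise", "edl", "science"):
    uplink_set_mode(phase)
    print(MODES[active_mode]())
[/code]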

Such adaptability is likely to become increasingly important. In 1975, CC Kraft Jr observed: "Virtually every online, direct access, commercial computer system in the world reflects to some degree the space guidance and checkout requirements of some years ago." Now that situation is being reversed. The commercial sector's hunger for more energy-efficient processors and components means the space programme can now benefit from some of the power we take for granted in our PCs.

Some added: "The future of space computing is, in my view, likely to see high-performance multi-core machines based on commercial off-the-shelf components being used to implement parallel-processing clusters used as on-board computer servers for processing science data and to support autonomous operations.

"Missions being planned [even now] need significantly higher processing throughput than can be provided by radiation-hardened computers currently available or planned in the near future. The use of commercial state-of-the-art processing chips in fault-tolerant hardware/software architectures is a strategy that has the potential to meet these needs, has been in development for some time and should be ready for flight in the near future."

Scuderi gives an example of how companies such as BAE Systems are meeting this need with a new type of memory. "There are no hard drives that will survive 15 years in space," he said. "We are working with technology partner Ovonyx on a phase-change material called chalcogenide - the same material used on rewritable CDs and DVDs. A small laser is used to heat up this material, which changes from an amorphous state to a crystalline state, so it's a basic binary sequence. We have taken this stuff and created C-RAM, which is non-volatile and non-destructive. So, in cases where programs are uploaded and changed as a space mission progresses, previous elements can be retained and stored, to be restored later. C-RAM is being looked into for future missions to Jupiter."

In the meantime, the likes of the Phoenix Mars Lander will continue to provide groundbreaking data and images from space. Designers of the Apollo spacecraft felt the on-board computer was so important it was virtually a "fourth crew member"; today, the computer is, arguably, the most important crew member of all.

From: http://www.pcpro.co.uk
Published by PCAuthority Australia

The tallest tower.. begins from the ground
Today, you are novice..
Tomorrow, you might be The Master..
And when you are..
Vosszaa will hunt you down..