Tag Archive for NASA

Out of This World Ethernet

A while ago I wrote about Ethernet marching on. The IEEE had ratified the IEEE 802.3bp Ethernet standard, which addresses how Ethernet operates in harsh environments. Now Ethernet has been installed in the harshest environment where we live: the International Space Station. During an April 2019 Extravehicular Activity (EVA), U.S. astronaut Anne McClain and Canadian astronaut David Saint-Jacques upgraded the International Space Station’s communication systems by installing Ethernet cables.

Cabling Install and Maintenance reports that during a six-plus-hour spacewalk the astronauts installed Ethernet cables on the exterior of the space station to upgrade the wireless communication system and to improve its hard-wired communication system.

CBS News says the spacewalkers connected Ethernet cabling at the forward end of the Destiny module, the station’s primary research laboratory for U.S. payloads, which will extend wireless connectivity for science instruments mounted outside the space station.

NASA tweeted a video clip of the cable installation during which the narrator explained, “... They’ll be de-mating and mating some cables to provide additional Ethernet to the International Space Station.”

rb-

Pulling more cable to expand wireless coverage – nice to know some things are truly universal. Whether you call it cable pulling, or mating cables, the truck-roll cost to the ISS must be pretty steep. At least NASA installers don’t need ladders.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

The Computer That Took Man To The Moon

50 years ago, Man first stepped on the Moon. When NASA’s Apollo 11 touched down in the Sea of Tranquility on July 20, 1969, it was a triumph of the human spirit. The Moon landing was also a technological triumph, and that technological triumph was led by the Apollo Guidance Computer (AGC).

Apollo moon mission guidance computer

The AGC helped the Apollo astronauts safely travel from Earth to the Moon and back. David Szondy at New Atlas explains that Apollo needed computers to navigate to the Moon. On Earth, navigation is about finding one’s way from one fixed point on the globe to another. For a trip to the Moon, navigation is more complex. He likened the planning to standing with a rifle on a turntable that’s spinning at the center of a much larger turntable. Then there is a third turntable sitting on the rim. And, all the tables are spinning at different and varying speeds. Now you have to hit the target by aiming at where it will be three days from now.

In order to hit the target of the Moon, the AGC provided spacecraft guidance, navigation, and control. The AGC was used in all of NASA’s Apollo Moon missions. The AGC was designed by Dr. Charles Stark Draper at the MIT Instrumentation Lab with the support of the AC Spark Plug Division of General Motors (GM) and the Kollsman Instrument Corporation. The AGC was built by Raytheon. It used approximately 4,000 integrated circuits from Fairchild Semiconductor.

The Apollo Guidance Computer was not much to look at. Mr. Szondy writes that it looked like a brass suitcase. It was made of 30,000 components hand-built on two gold metal trays. One tray was for memory; the second was for logic circuits. The AGC measured 24 in × 12.5 in × 6.5 in and weighed in at 70 lb. Inside, it isn’t even very impressive by modern computer standards. It had about as much oomph as a Commodore 64, with a total of about 74 KB of ROM, 4 KB of RAM, and a roughly 12-microsecond instruction time. Gizmodo estimated it would cost $3,000 to build an AGC using 1960s-like components. Each AGC cost NASA around $200,000 (equivalent to $1.5 million today).

Three computers for each trip to the Moon

The AGC was carried aboard both the Command Service Module (CSM) and the Lunar Excursion Module (LEM). The computer flew on 15 manned missions, including nine Moon flights, six lunar landings, three Skylab missions, and the Apollo-Soyuz Test Mission in 1975.

Three computers were required for each mission: one on the CSM and two on the LEM. The CSM’s computer handled translunar and transearth navigation, while the LEM’s provided autonomous landing, ascent, and rendezvous guidance. The second LEM computer, the Abort Guidance System (AGS), was a backup designed to get the LEM back to the CSM in the event of a failure of the LEM’s primary AGC.

Margaret Hamilton

The scientist in charge of the software development program for the Apollo Guidance Computer was Margaret Hamilton, Director at the MIT Instrumentation Laboratory. AGC programs had to be written in low-level assembly language because high-level programming languages suitable for system programming, such as C, had not yet been invented. The AGC programs were hard-wired into coils, so the code could not be erased, altered, or corrupted.

DrDobbs explained the AGC used a unique form of Read-Only Memory (ROM) known as “rope core memory” to store its operating program. This technology used tiny rings of iron that had wires running through them. When a wire ran through the center of the ring, it represented the binary number 1. When it ran outside, it was 0. The result was an indestructible memory that could not be erased, altered, or corrupted.

rope core memory

NASA Apollo Rope core memory with a Quarter for scale

To program these rope memories, MIT used what they dubbed the LOL method, for “little old ladies.” This was because the programming was done by ex-textile workers who skillfully sent wire-carrying needles through the iron rings. They were aided by an automated system that showed them which hole in the workpiece to insert the needle into, but it was still a highly-skilled job that required concentration and patience.
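
The encoding described above lends itself to a tiny simulation. This is a toy model, not the AGC’s real addressing logic, and the function name is illustrative: a word’s bit reads as 1 exactly when its sense wire threads the corresponding core.

```python
def read_rope_word(threaded_bits, num_bits=16):
    """Read one word from a toy rope-core memory.

    threaded_bits: the set of bit positions whose sense wire passes
    THROUGH a core (reads as 1); wires routed around a core read as 0.
    The AGC used 16-bit words (15 data bits plus a parity bit).
    """
    word = 0
    for bit in range(num_bits):
        if bit in threaded_bits:   # wire threads the core -> binary 1
            word |= 1 << bit
    return word

# A word whose wire threads cores 0, 2, and 3 reads as binary 1101 = 13
print(read_rope_word({0, 2, 3}))  # 13
```

Because the wiring itself is the program, “reprogramming” meant physically re-threading the ropes, which is why the LOL weavers’ work had to be flawless.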

Multitasking operating system

Apollo 11 LEM Eagle

The Apollo Guidance Computer ran a multitasking operating system called EXEC, capable of juggling up to eight jobs at once in priority order. The two major lunar flight programs were called COLOSSUS and LUMINARY. The former was chosen because it began with “C” like the CSM, and the latter because it began with “L” like the LEM. Although these programs had many similarities, COLOSSUS and LUMINARY were the only ones capable of navigating a flight to the Moon.
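
The real EXEC was written in AGC assembly, but its core idea, a small table of jobs always serviced in priority order with a hard cap of eight, can be sketched in a few lines of Python. The class and job names here are illustrative, not NASA’s:

```python
class Exec:
    """Toy priority scheduler loosely modeled on the AGC's EXEC.

    Holds at most 8 jobs and always runs the highest-priority job
    to completion first (cooperative, not preemptive).
    """
    MAX_JOBS = 8

    def __init__(self):
        self.jobs = []                       # list of (priority, name, fn)

    def submit(self, priority, name, fn):
        if len(self.jobs) >= self.MAX_JOBS:      # cf. Apollo 11's famous
            raise RuntimeError("EXEC overflow")  # 1202 "executive overflow"
        self.jobs.append((priority, name, fn))

    def run(self):
        completed = []
        while self.jobs:
            self.jobs.sort(key=lambda job: -job[0])  # highest priority first
            _, name, fn = self.jobs.pop(0)
            fn()
            completed.append(name)
        return completed

ex = Exec()
ex.submit(20, "landing_guidance", lambda: None)
ex.submit(30, "attitude_control", lambda: None)
ex.submit(10, "telemetry", lambda: None)
print(ex.run())  # ['attitude_control', 'landing_guidance', 'telemetry']
```

The overflow check is the interesting design choice: rather than silently dropping work, the AGC raised a program alarm and restarted, shedding low-priority jobs, which is what saved the Apollo 11 landing when its computer was overloaded.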

NASA also had to develop the discipline of software engineering: processes for software validation and verification were created, making extensive use of hardware and software simulators. By 1968, over 1,400 man-years of software engineering effort had been expended, with a peak staffing level of 350 engineers.

The AGC’s user interface, the DSKY (DiSplay and KeYboard), was mounted in both the Command Module and the Lunar Module. Astronauts entered commands and data into the AGC with large buttons they could operate with their spacesuit gloves on. The keyboard also gave them feedback, beyond the million other lights and indicators in the cockpits.

Mainframe computer

Mr. Szondy put the scale of the AGC development in some context. The AGC was being developed at a time when computer technology and the entire electronics industry were undergoing a revolution. When the Apollo program began, computers were still gigantic machines that took up whole rooms. (rb– check out EMERAC in the 1957 movie Desk Set.) There was only a handful of big iron in the entire world, and it required a priesthood of attendants to care for and feed the monoliths. The engineers at NASA spent 2,000 man-years of engineering effort down-sizing mainframe technology to fit inside the Apollo spacecraft.

And it wasn’t just computing technologies that were advancing. In 1958 the integrated circuit (IC) was introduced. The IC threw the whole question of who was designing and who was supplying computers into flux.

An early user of integrated circuits

The AGC was one of the first computers to use integrated circuits. Integrated circuits of the time were rudimentary and very expensive: Texas Instruments (TXN) was selling ICs to the military for about $1,000 each. In 1963, the Apollo program consumed 60 percent of the integrated circuit production in the United States. By 1964, over 100,000 ICs had been used in the Apollo program, and when Philco-Ford was chosen to supply the ICs, the price had dropped to $25 each.

Mr. Szondy writes that the Apollo Guidance Computer is one of the unsung successes of the Space Race because it was so phenomenally successful, with very few in-flight problems. The AGC led the way with an impressive list of firsts. The AGC was the:

  • Most advanced fly-by-wire and inertial guidance system of its day,
  • First digital flight computer,
  • First real-time embedded computing system to collect data automatically and provide mission-critical calculations,
  • First computer to use silicon chips, and
  • First onboard computer where the lives of the crew depended on it functioning as advertised.

The AGC was the most advanced miniature computer to date.

rb-

In 1969, Scooby-Doo, Frosty the Snowman, and The Brady Bunch debuted on TV. But what most people of a certain age remember is when 650 million people worldwide watched Neil Armstrong’s “one small step for man, one giant leap for mankind” become a defining moment in hearts and minds across the globe.



Black Hole Data

The first image of a black hole was published on April 10, 2019. The black hole, M87*, at the center of the Messier 87 galaxy, is located 53 million light-years from Earth. NASA says a black hole is an extremely dense object from which no light can escape. Anything that comes within a black hole’s “event horizon” will be consumed, because of the black hole’s unimaginably strong gravity.

the first image of a black hole

By its very nature, a black hole cannot be seen; the bright ring in the picture marks the event horizon, the point beyond which an object approaching the black hole is unable to escape its gravitational pull. Objects that pass into the event horizon undergo spaghettification, a process first described by Stephen Hawking in which gravitational forces stretch the object out like a piece of pasta. The M87* image shows a silhouette of the black hole against the glow of the event horizon, captured by researchers at the Event Horizon Telescope (EHT).

APEX Atacama Pathfinder Experiment antenna

The EHT is the brainchild of Shep Doeleman, the director of the EHT and an astronomer at the Harvard-Smithsonian Center for Astrophysics. It is a virtual global array of eight ground-based radio telescopes. The EHT captured around 3.5 PB of data for the black hole image in April 2017; it then took two years to correlate the data to form the image. The EHT team not only had to figure out intergalactic science but also massive information technology problems. The researchers had to solve IT problems pretty typical for enterprise IT professionals, only bigger.

According to an article at SearchDataBackup each EHT telescope can record data at a rate of 64 Gbps, and each observation period can last more than 10 hours. The author calculated that each site generated around half a petabyte of data per run. The distributed locations included volcanoes in Hawaii and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica. The sites were kept in sync using precise atomic clocks and GPS systems to carefully time the observations.

The data from each telescope was recorded at 16 Gbps per module and distributed among a total of 32 hard disk drives, grouped into 4 modules of 8 disks each. By running the 4 modules in tandem, each site could record at a total rate of 64 Gbps.
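
The arithmetic behind those figures is easy to sanity-check, taking the article’s own 64 Gbps aggregate rate and a 10-hour run:

```python
# Sanity-check the recording figures quoted above.
RATE_GBPS = 64          # aggregate recording rate per site
RUN_HOURS = 10          # observation runs can last more than 10 hours
MODULES = 4             # four 8-disk modules recording in tandem

bytes_per_run = RATE_GBPS * 1e9 / 8 * RUN_HOURS * 3600  # bits -> bytes
petabytes = bytes_per_run / 1e15
per_module_gbps = RATE_GBPS / MODULES

print(f"{petabytes:.2f} PB per site per 10-hour run")  # 0.29 PB
print(f"{per_module_gbps:.0f} Gbps per module")        # 16 Gbps
```

So a flat 10 hours yields about 0.29 PB per site; runs stretching past 10 hours get each site toward the “around half a petabyte” the author calculated.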

Sites making up the virtual Event Horizon Telescope.

 

One problem EHT ran into was the failure rate of traditional hard drives in the extreme telescope locations. ComputerWorld reports that 28 of 32 conventional hard drives failed at the Sierra Negra telescope, on the top of an extinct volcano in Mexico.

WD 10TB helium disk drive

SearchDataBackup says the solution was helium hard drives. Hermetically sealed helium drives are self-contained environments, so they could survive the extreme conditions in which the EHT’s telescopes operated. The EHT first deployed helium hard drives in 2015. EHT data scientist Lindy Blackburn told SearchDataBackup that the EHT now uses about 1,000 helium drives of up to 10 TB of capacity from Western Digital, Seagate, and Toshiba. He told SearchDataBackup,

The move to helium-sealed drives was a major advancement for the EHT … Not only do they perform well at altitude and run cooler, but there have been very few failures over the years. For example, no drives failed during the EHT’s 2017 observing campaign.

The amount of data collected by the EHT was too much to send over the Internet, so the researchers went old-school and used FedEx, sneakernet-style, to ship the data for processing. Geoffrey Bower, an astronomer in Hawaii, told ScienceNews that mailing the disks is always a little nerve-wracking. So far, there have been no major shipping mishaps. But the cost and logistics involved with tracking and maintaining a multi-petabyte disk inventory are also challenging. Therefore, the EHT is always on the lookout for another method to move petabyte-scale data.
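
A back-of-the-envelope comparison shows why sneakernet wins here. The link speed and transit time below are assumptions for illustration; the article gives no figure for the sites’ actual connectivity:

```python
# Why ship disks instead of uploading? (Assumed values: a dedicated
# 1 Gbps uplink and a week of courier transit.)
DATA_PB = 3.5          # total data behind the M87* image
LINK_GBPS = 1.0        # assumed uplink speed
SHIP_DAYS = 7          # assumed shipping time

upload_days = DATA_PB * 1e15 * 8 / (LINK_GBPS * 1e9) / 86400
print(f"Upload: ~{upload_days:.0f} days vs shipping: ~{SHIP_DAYS} days")
```

At an assumed 1 Gbps, uploading 3.5 PB takes the better part of a year, which is why a box of drives in a FedEx plane remains the higher-bandwidth channel.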

Cloud computing


SearchDataBackup points out that the cloud would normally be a good option for long-term storage and for unifying data sourced from multiple, globally distributed endpoints. However, Mr. Blackburn told them the cloud was not a cold storage option for the project. He said the high recording speed and the sheer volume of data captured made it impractical to upload to a cloud. He explained, “At the moment, parallel recording to massive banks of hard drives, then physically shipping those drives somewhere is still the most practical solution.”

The data collected on the helium hard disk drive packs was processed by a grid computer made of about 800 CPUs, all connected through a 40 Gbps network, at the MIT Haystack Observatory in Massachusetts and the Max Planck Institute for Radio Astronomy in Germany.

Katie Bouman, the MIT student who developed the algorithm that pieced together the data from the EHT, with the disk drives

Geoff Crew, co-leader of the EHT correlation working group at Haystack Observatory, told SearchDataBackup that it is also impractical to use the cloud for computing. Mr. Crew said:

Cloud computing does not make sense today, as the volume of data would be prohibitively expensive to load into the cloud and, once there, might not be physically placed to be efficiently computed.

The EHT scientists built algorithms that converted sparse data into images. They developed a way to cut the number of possible images by sorting out which results were physically plausible and which were wildly unlikely, making it easier to create the images.
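
The flavor of that approach can be shown with a toy reconstruction. This is not the EHT’s actual CHIRP algorithm, just a minimal sketch of the underlying idea: with fewer measurements than pixels, infinitely many images fit the data, so a prior (here, smoothness) selects a plausible one:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_meas = 16, 8        # under-determined: 8 equations, 16 unknowns
true_image = np.sin(np.linspace(0, np.pi, n_pixels))  # a smooth "source"

A = rng.normal(size=(n_meas, n_pixels))   # random sparse-sampling operator
y = A @ true_image                        # the sparse measurements

# Plausibility prior: penalize jumps between neighboring pixels
D = np.eye(n_pixels) - np.eye(n_pixels, k=1)
lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)

rel_err = np.linalg.norm(x_hat - true_image) / np.linalg.norm(true_image)
print(f"relative error: {rel_err:.2f}")
```

Without the `lam * D.T @ D` term, the system has no unique solution at all; the regularizer is what turns “many candidate images” into one physically plausible answer.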

The Haystack VLBI Correlator grid computer at the MIT Haystack Observatory

Converting sparse data into images matters beyond astronomy. Mr. Blackburn told FiveThirtyEight that the problem comes up in other areas as well: it occurs in medical imaging, when doctors use MRIs to convert radio waves into pictures of your body. It’s also a key part of self-driving cars, which rely on computer vision to “see” everything from potholes to people.

Just like any enterprise, the EHT had to find a workable method of data protection. That includes deciding what won’t be protected. The EHT has not found a cost-effective way to replicate or protect the raw radio signal data from the telescope sites. However, once the data has been processed and reduced, it is backed up on-site on several different RAID systems and on Google Cloud Storage. Mr. Crew told SearchDataBackup:

The reduced data is archived and replicated to a number of internal EHT sites for the use of the team, and eventually, it will all be publicly archived. The raw data isn’t saved; we presently do not have any efficient and cost-effective means to back it up.

Mr. Blackburn said the raw data isn’t worth backing up. Because of the complexity of protecting such a large amount of data, it would be simpler to run another observation and gather a new set of data. He said: “Backing up original raw data to preserve every bit is not so important.”

Mr. Blackburn said he can’t seriously consider implementing a backup process unless it is “sufficiently straightforward and economical.”

Instead, he said he’s looking at where technology might be in the next five or 10 years to find the best method to handle petabyte-scale raw data from the telescopes. Mr. Blackburn told SearchDataBackup;

Right now, it is not clear if that will be continuing to record to hard drives and using special-purpose correlation clusters, recording to hard drives and getting the data as quickly as possible to the cloud, or if SSD or even tape technology will progress to a point where they are competitive in both cost and speed to hard disks.

rb-

The image of the black hole validated Einstein’s general theory of relativity and proved that enterprise-class IT can solve intergalactic problems.

The EHT team had to figure out how to save, move, and back up massive quantities of data and, of course, do more with less. The EHT’s Geoff Crew summed up the problem most IT pros have: “Most of our challenges are related to insufficient money, rather than technical hurdles.”

Related articles
  • Trolls hijacked a scientist’s image to attack Katie Bouman. They picked the wrong astrophysicist. (MSN)

 


A Printer for Rocket Scientists

We all dream about the elusive paperless office. It seems even rocket scientists can’t figure it out. Mashable reports that the rocket scientists aboard the International Space Station (ISS), the research laboratory that orbits 254 miles above Earth at more than 17,500 miles per hour, print a lot. The astronauts print roughly 1,000 pages a month on two printers: one installed on the U.S. side of the ISS, the other in the Russian segment. They print critical mission information, emergency evacuation procedures, and sometimes photos from home, all on 20-year-old printers.

NASA IT techs just ordered new printers for the International Space Station to replace the Epson 800 Inkjet printers, which have been on board the ISS since the crew moved in in November 2000. NASA told the author, “When the printer was new, it was like 2000-era tech and we had 2000-era laptop computers. Everything worked pretty good … the printer’s been problematic for the last five or six years.”

Stephen Hunter, Manager of ISS Computer Resources, called the Epson 800 Inkjet printer “a museum piece.” NASA had dozens of these printers and, as one failed, they’d send up another.

Epson 800 Inkjet printer

But now it’s time for something new. In 2018, NASA will send two brand-new, specialized printers up to the station. Mr. Hunter, who has been updating the ISS’s office technology for the last two years, told Mashable that the ISS printers have needed to be replaced for a long time. However, he can’t just drive over to Best Buy, buy a new printer, and launch it into space.

He started working with HP (HPQ) on an ISS IT overhaul, replacing over 100 existing ISS workstations with HP Gen 2 Z-Book laptops for the crew, so it was only natural they would turn to HP again for the printer project. Enrique Lores, President of HP’s Imaging, Printing, and Solutions business welcomed the opportunity, “We couldn’t pass up the opportunity to do this … It was an incredible technical challenge.”

HP couldn’t just suggest that NASA launch any ordinary laser printer into space. Its friable toner dust and significant power consumption would make it a poor fit for life in micro-gravity. Ronald Stephens, Research and Development Manager for HP’s Specialty Printing Systems Division, explained, “NASA had a very unique set of requirements that we had to meet.”

NASA wanted a printer that could:

• Print and handle paper management in zero gravity – On Earth, printers rely on gravity for paper management. Whatever HP provided would have to hold the paper so it didn’t jam in the printer or float away when the printer was done with it, according to Mashable.

• Handle ink waste during printing – NASA’s Hunter explained that typical inkjet printers deposit some extra ink during the printing process. With gravity in place, the ink typically stays in the printer or even on the printed sheet. In zero gravity, it floats out. The NASA IT expert said astronauts could ingest the ink, or it could contaminate the crew’s numerous onboard experiments.

• Be flame retardant – HP replaced the printer’s shell with fire-retardant plastic.

• Be power-efficient – The ISS generates all its own electricity through solar panels, which means power consumption must be tightly managed. The article says any new device brought on board must be power-efficient. One bit of good news: HP didn’t have to change the power configuration on the printer, as the ISS can supply a standard 110 V AC outlet.

Instead of building a specialized printer from scratch, HP recommended the HP Envy 5600, a standard all-in-one device you can buy at retail for $129.99. But the printers heading up to the ISS underwent significant modification.

“We removed the capability to do scanning, fax, and copy out of it to reduce weight and remove glass portions,” said NASA’s Hunter.

Removing what could weigh the printer down or break and become a space disaster was only the start. The most challenging part was related to zero gravity. Ultimately, HP went through every printer system and component to analyze how it would be affected by zero gravity.

HP turned to 3D printing and developed an experimental 3D-printing material: nylon filled with glass beads. Its unique properties allowed HP to replace the multiple parts that make up the printer’s output tray with a single part that is lighter, more flexible, and more reliable.

HP ISS PrinterAfter all the modifications, the HP space printer still looks like a printer. It’s 20 inches wide, 16 inches deep, and five inches high. There’s no lid or glass, but, aside from the 3D printed materials, the ISS’s next printer looks pretty unremarkable. The HP ENVY Zero-Gravity Printer still uses standard inkjet ink.

To work out the kinks of the new ISS printer, HP worked with a small team from NASA that included astronaut Don Pettit and three other astronauts. Astronauts’ concerns about printing in space are much the same as they are on the ground. “You want it to be uneventful… you want to hit print and have a hard copy,” said Pettit.

The Vomit Comet flies a parabolic flight

Up to this point, all of NASA and HP’s work was theoretical. They did all they could to make the space printer space-ready. However, the only way to know if the printer is suitable for use on the space station before actually sending it there is to test it in zero gravity, and the only way to do that is on NASA’s Vomit Comet.

The Vomit Comet is a plane that flies a parabolic flight path. As it loops up and down, passengers experience about 20 seconds of near-weightlessness at the peak of each curve. During those windows, the team tested that printing worked and that the paper flowed through the printer and ejected the right way. NASA’s Hunter said, “It went flawlessly. Everything works to our expectation.”

NASA plans to send the first two printers up to the station on a SpaceX Dragon capsule as part of SpaceX resupply mission CRS-14, scheduled for launch in February 2018.

NASA and HP have retrofitted roughly 50 HP Envy printers and expect each one to last roughly two years. “We want to use this through the remainder of the ISS program. Officially through 2024, with plans through 2028,” said NASA’s Hunter.

“This will be the last printer they get in the space station,” predicted HP’s Stephens.



How Much Code Does It Take?

David McCandless from Information is Beautiful tries to answer the question of how many millions of lines of code it takes to run various systems. For reference, the Visual Capitalist calculates that a million lines of code (MLOC), if printed, would be about 18,000 pages of text. That’s 14x the length of Leo Tolstoy’s War and Peace. The total lines of code needed to run a system varies widely, as Mr. McCandless shows in the infographic.

  • It took less than a million lines of code to run the NASA Space Shuttle.
  • The Mars Rover Curiosity takes less than 5 million lines of code to run.
  • The latest version of the Firefox web browser includes just under 10 million lines of code.
  • General Motors’ (GM) Chevy Volt requires just over 10 million lines of code.
  • Microsoft (MSFT) Office 2008 for the Apple (AAPL) Mac consists of over 35 million lines of code.
  • And it took 50 million lines of code to bring us Microsoft Vista.
  • Finally, all Google (GOOG) services combine for a whopping 2 billion lines – that means it would take 36 million pages to “print out” all of the code behind all Google services. That would be a stack of paper 2.2 miles high!
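
Those last two figures check out. The lines-per-page value below is implied by the article’s own 1 MLOC ≈ 18,000 pages; the paper thickness is an assumption about typical office paper:

```python
LINES = 2_000_000_000                  # all Google services combined
LINES_PER_PAGE = 1_000_000 / 18_000    # ~55.6, implied by the MLOC figure
PAGE_THICKNESS_IN = 0.004              # assumed thickness of one sheet
INCHES_PER_MILE = 63_360

pages = LINES / LINES_PER_PAGE
miles = pages * PAGE_THICKNESS_IN / INCHES_PER_MILE

print(f"{pages / 1e6:.0f} million pages")   # 36 million pages
print(f"stack ~{miles:.1f} miles high")     # ~2.3 miles
```

A 0.004-inch sheet gives about 2.3 miles, comfortably close to the article’s 2.2-mile stack.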

Information is Beautiful Infographic