OrbitsEdge

In the News

Cutting-Edge Computing Goes Spaceborne

Spaceborne Computer missions demonstrate faster, easier protection against space radiation


Originally published 02/11/2025


When Hewlett Packard Enterprise Co. (HPE) sent an unmodified high-performance computer to the International Space Station in 2017, no computer expert thought it would last a week. Over a year-and-a-half later, Spaceborne Computer-1 returned home, having operated successfully for its entire mission.


“No one in the aerospace industry thought this was going to work,” said Mark Fernandez, now principal investigator for Spaceborne Computer-2 at Spring, Texas-based HPE. “The longest publicly stated life expectancy for Spaceborne-1 was four days, because we did nothing to the hardware at all.”


Unaltered, off-the-shelf computers don’t last long in space due to radiation that the atmosphere protects us from on Earth. When high-energy particles or photons strike microchips, they can alter the voltage in nearby transistors, corrupting data, changing the computer’s behavior, and eventually destroying its electronics.

The solution has been radiation hardening: a commercial computer’s electronics are mounted on insulating boards instead of conventional semiconductors and shielded in a protective layer, in a lengthy process that Fernandez said “takes 10 years and millions of dollars.”

As a result, radiation-hardened computers “are usually several generations behind the current state of the art,” said Rupak Biswas, director of exploration technology at NASA’s Ames Research Center in California’s Silicon Valley, who proposed the first Spaceborne Computer mission. Meanwhile, he said, as NASA sends astronauts farther from Earth, increasing the lag time for transmissions, the agency will want more computing power aboard its spacecraft. “So the idea was, what if we take one of our latest-generation processors, put it in space, see what radiation does to it, and use software to correct those errors, as opposed to depending on the hardware?”


NASA reached out to Silicon Graphics International, a company that had built supercomputers for Ames, where Fernandez was one of the chief technology officers. HPE acquired SGI not long after. By then Fernandez was already developing software that would monitor all of a computer’s components and slow it down incrementally when any behavior fell outside normal operating parameters, avoiding damage.
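The monitor-and-throttle approach Fernandez describes can be sketched as a simple control loop. This is a minimal, hypothetical illustration, not HPE's actual software: the telemetry fields, thresholds and throttle steps are all invented for the example.

```python
import random

# Hypothetical operating envelope; real limits would come from the vendor.
MAX_TEMP_C = 85.0
MAX_ECC_ERRORS_PER_MIN = 10

THROTTLE_STEPS = [1.0, 0.75, 0.5, 0.25]  # fraction of full clock speed


def read_telemetry():
    """Stand-in for querying hardware sensors (simulated here)."""
    return {
        "temp_c": random.uniform(60.0, 95.0),
        "ecc_errors_per_min": random.randint(0, 20),
    }


def next_throttle(level, telemetry):
    """Step the clock down one notch when any reading is out of range,
    and step it back up when everything is nominal again."""
    out_of_range = (
        telemetry["temp_c"] > MAX_TEMP_C
        or telemetry["ecc_errors_per_min"] > MAX_ECC_ERRORS_PER_MIN
    )
    if out_of_range:
        return min(level + 1, len(THROTTLE_STEPS) - 1)
    return max(level - 1, 0)


level = 0
for _ in range(10):  # one iteration per monitoring interval
    t = read_telemetry()
    level = next_throttle(level, t)
    print(f"temp={t['temp_c']:.1f}C ecc={t['ecc_errors_per_min']} "
          f"-> clock at {THROTTLE_STEPS[level]:.0%}")
```

The key idea is that degradation is incremental and reversible: the machine slows itself down through a radiation event rather than risking permanent damage, then recovers on its own.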


When the approach worked, it wasn’t just HPE and NASA who were pleased. “Hearing Mark say this thing worked great, when there had been no small consensus that it was nothing more than a fire hazard, was very encouraging to us, that this would be something we could do,” said Rick Ward, founder and chief technology officer of OrbitsEdge Inc. The young startup was making its own plans to put cutting-edge computers in space without traditional radiation hardening.


Following HPE’s success, OrbitsEdge is making HPE’s Edgeline system the heart of one of its SatFrame satellites. Developed while Spaceborne Computer-1 was on the space station, Edgeline is a rugged system intended to provide processing power in the field and is now on the station as Spaceborne Computer-2. OrbitsEdge has signed an agreement to use the Edgeline 8000 in its 1,000-watt satellite, which will likely be developed after one or two smaller versions, and the company has already purchased several units for development.


HPE said a number of other commercial space companies are also considering its software-hardened systems for use in space. Meanwhile, the company is working with NASA to provide the first commercial high-powered computing in space by opening up Spaceborne Computer-2 to do work for paying customers, a service it hopes to offer shortly.

 


https://spinoff.nasa.gov/Cutting-Edge_Computing_Goes_Spaceborne 

In the News

Data centres in space will boost satellite computing power and storage

 

 

Start-ups and multinationals are developing space-based data centres to process information in orbit.

By Ryan Morrison

July 3, 2022 (updated July 26, 2022, 2:01pm)


Wildfires, troop movements and other fast-changing events on Earth are monitored by a small army of satellites watching the planet day by day, moment by moment. Analysing the output of this floating forest of cameras and sensors takes time, however, in large part thanks to the difficulty in transmitting the resulting terabytes of data back to Earth – a delay that, conceivably, could cost lives.

To solve this problem, several larger satellites are launched with onboard computing power and data storage, allowing for the output of the cameras to be analysed in space and the results sent to Earth. However, this adds to the cost of launch and isn’t viable for smaller satellites.

But what if satellites never needed to send their data to ground stations in the first place? That’s the question being pondered by a host of start-ups and multinationals, as they plan to launch data centres themselves into orbit.


By processing and storing near-Earth observations in space-based clouds, governments and private companies on the ground can not only receive clean observation data more quickly, but count on a data storage solution immune from the risk of flood, fire and earthquake that its equivalent would experience planet-side. 


Indeed, there are already some powerful computers in space equipped to do this. The International Space Station, for example, uses its HPE supercomputer to run machine learning models that swiftly process images of astronaut gloves or crunch experimental data, rather than sending gigabytes back to Earth – a process that can take weeks or months due to limited bandwidth.

While there are some clear advantages in terms of speed, there are also risks involved with putting data and processing power in orbit. A review by Capitol Technology University in 2018 outlined several exotic dangers to satellite missions, including geomagnetic storms crippling equipment, space dust turning to hot plasma as it hits the spacecraft, and even collisions with other objects in a similar orbit.

Despite these risks, demand from satellite operators for more efficient data processing solutions has seen several companies forge ahead with plans for orbital data centres. This is, in part, because so many recent launches have been for ‘smallsats’, machines weighing under 1,200kg that have no room aboard for data processing and storage.


As such, a new sector is slowly evolving to service these devices and their larger cousins. This emerging industry anticipates a fleet of orbital data centres zipping silently around the Earth within the next two decades.


The exact form they will take will vary, says Rick Ward, an expert in data processing in space and CTO of OrbitsEdge, a company looking to launch a fleet of space computers. “Some will be large devices sitting in geostationary orbit able to hold petabytes of data, whereas others will be in low-Earth orbit (LEO) with more powerful computers processing data from nearby satellites,” he predicts.


How would a space data centre work?

Man could never have broken orbit without data storage – although he didn’t need much, compared to our present computing standards. The guidance computer on board Apollo 11, for example, needed only about 4KB of RAM and roughly 72KB of read-only rope memory – it had no disk at all – to land Neil Armstrong and Buzz Aldrin on the lunar surface. An Apple Watch Series 7, by comparison, has 1GB of RAM and 32GB of storage.


That’s not to say that space-bound computers haven’t caught up with their terrestrial cousins: the ISS’s supercomputer, for example, can operate in harsh environments and perform edge computing at teraflop speeds. 


Companies such as OrbitsEdge, however, predict that the future of space computing is less likely to focus on raw computing power than on distributed storage. The reasons for this, explains Ward, are becoming increasingly obvious to those managing data centres on Earth. 


“Ask Amazon to show you their power bill and you will see the cost of storing data on Earth,” he says. “It isn’t just storage either, but data handling and processing as well. The biggest expense, beyond the cost of land in city-centre locations, is electricity. For orbital data centres that can come straight from the Sun through direct solar power.” 


Ongoing costs, explains Ward, gradually eat into a company’s profit margins. The main cost concern for space-based data centres, meanwhile, comes in the upfront investment - in other words, paying for the rocket that launches the satellite in the first place. After that, companies should expect smooth sailing from their orbital assets, at least in cost terms. 


One proposal from the Florida-based firm would see a small number of data centres launched into geostationary orbit. High above the Earth, these larger satellites would receive and store petabytes of data from larger constellations at lower orbits, many of which would act as processing hubs capable of relaying data at low latencies back to ground stations. It will be a distributed data centre, with processing and storage across multiple devices, although Ward prefers to call it a “single megastructure.” 

Data flows of this complexity and scale are already in place between satellites and ground stations, explains Andrii Muzychenko, EOS SAT Project Manager at EOS Data Analytics. “Middle satellites with higher transmission rates can send data to several ground stations and reach 10-50 TB per day with 2-3x compression,” he says. Heavier satellites, meanwhile, “can take images and directly transfer hundreds of terabytes with 2-3x compression through telecommunications satellites”.


It’s therefore easy to imagine a similar framework being applied between data centres in geostationary orbit, LEO observation satellites and ground stations. “I see it as an iterative process where you first build one, then build 100 or 1,000 and so on, until you have an ever-growing amount of capacity to service a growing sector,” says Ward.


One key function Ward anticipates outsourcing to this array is change analysis. Having AI systems process subtle hourly changes in terrestrial observation data on satellites would lead to vast efficiencies in how we use such information to monitor the destructive effects of climate change, among other events. 


Japanese telecom giant NTT is also working on designing orbital data centres, the first of which is due to launch by early 2025. Its plans are more scaled-back than OrbitsEdge’s, in that single satellites will be tasked with not only storing but also processing data - significantly speeding up the time in which they could communicate with ground stations in an emergency. NTT has also said that its data centres will be powered by photonic chips, which allow for lower power consumption and greater resistance to solar radiation.



Who will use a space data centre?

The logic behind orbital data centres, explains Ward, is irresistible. Over the course of a generation, “we will see data centres moving normal operations to space,” he says, motivated in large part by a calculus that maintaining these hubs in a vacuum is much more affordable than paying for power and rental costs here on Earth. Some of the first tempted beyond the stratosphere, he adds, are likely to be those companies managing such facilities in notoriously expensive locations like New York and London.

Still, concerns remain about the practicality of such an operation. Upfront costs, for example, remain a significant issue. Right now, the most affordable options entail paying $2,000 per kilogram launched, a pricey proposition for orbital facilities likely to weigh several tonnes. Those costs are expected to fall significantly, however, once SpaceX, Elon Musk's space enterprise, expands its launch capacity by debuting its Starship launch vehicle in 2023. 


Physical risks to orbital data centres must also be considered. While space is devoid of earthquakes and atmospheric phenomena, satellites are always in danger of being struck by micrometeorites, engulfed by geomagnetic storms, or destroyed in collisions with other orbital assets. Nation-states are also waking up to this reality.


The UK government, for example, recently announced new regulations designed to mitigate against space debris, including new requirements on de-orbiting satellites as they reach the end of their lives and ensuring they carry enough on-board fuel to conduct emergency manoeuvres to avoid collisions. 

More of these initiatives should be expected from nation-states as LEO gets more crowded. Over the next few years, SpaceX alone plans to launch 13,000 internet satellites, while Amazon hopes to send more than 3,000 into orbit as part of its Kuiper internet service.


Governments from the EU to China, meanwhile, are also considering mega-constellations of satellites. As such, there is a heightened danger of frequency clashing and signal degradation as all of these satellites fight to be heard by ground stations - potentially ruling out orbital data centres before they’ve even been launched. 


Ward himself concedes that these risks will need to be tackled before any data centre megastructures get launched into orbit. “We have to send test devices to space first,” he says. NTT, meanwhile, doesn’t expect to have an operational data centre in space before 2026.


As such, while space represents a tempting prospect for data centre operators afflicted with rising rental and energy costs, it may be several years yet before their dreams of orbital arrays get off the ground.


https://techmonitor.ai/technology/data-centre/data-centres-space-satellite-computing

In the News

Living on the edge: Satellites adopt powerful computers

 by Debra Werner — January 24, 2022


Australian startup Spiral Blue is testing a prototype of its Space Edge Zero computer on Earth-observation satellites built by SatRevolution of Poland. Credit: SatRevolution

The latest Apple Watch has 16 times the memory of the central processor on NASA’s Mars 2020 rover. For the new iPhone, 64 times the car-size rover’s memory comes standard.

For decades, people dismissed comparisons of terrestrial and space-based processors by pointing out the harsh radiation and temperature extremes facing space-based electronics. Only components custom built for spaceflight and proven to function well after many years in orbit were considered resilient enough for multibillion-dollar space agency missions.

While that may still be the best bet for high-profile deep space missions, spacecraft operating closer to Earth are adopting state-of-the-art onboard processors. Upcoming missions will require even greater computing capability.

Satellite sensors produce “an enormous amount of data in the form of scientific research, Earth observation, national security,” Naeem Altaf, IBM distinguished engineer and IBM Space Tech chief technology officer, said by email. “To extract the quick value of data, we will need to bring compute closer to data.”

Consider Earth observation. Traditionally, electro-optical imagery and synthetic-aperture radar data have been sent to the ground for processing. That’s still largely the case, but new Earth-observation sensors continue expanding the volume of data acquired in orbit, sometimes quite dramatically. At the same time, customers are eager for speedy access to insights drawn from various datasets.

Weather observation is a good example. Numerical weather models merge vast quantities of data drawn from space-based, airborne, maritime and terrestrial sensors. While no one proposes running the forecasting algorithms on satellites, AAC Clyde Space, the Swedish company supplying the core avionics for the European Space Agency’s Arctic Weather Satellite, sees improvements in onboard processing as a way to speed up weather data delivery.

“We see an opportunity in the future to do a lot of processing on board: preparing data, compressing data and starting to fuse data,” said Luis Gomes, AAC Clyde Space CEO. “Our objective is real-time weather observations from space. For that, we need to package the data efficiently and effectively to reduce the amount of time that we are downlinking.”

Hyperspectral sensors also produce huge datasets that make onboard processing “quite critical,” Gomes said.

Some of the new satellite computers will be devoted to crunching sensor data. Others will help spacecraft choreograph complex operations.

Future satellites are likely to operate in swarms, communicating through intersatellite links and working together to capture unique datasets and extend communications networks. Eventually, constellations will employ artificial intelligence to solve problems by, for example, healing or repositioning satellites based on onboard analysis of their health and performance, which will require extensive edge processing, said Chuck Beames, chairman of the SmallSat Alliance, an industry association.

COMMERCIAL SOLUTIONS

Edge processing, bringing computation closer to data sources, is increasingly popular on Earth. Oil and gas companies, for example, analyze data near sensors that monitor heavy equipment at remote sites to quickly identify equipment problems and to trim communications and data storage expenses.

Companies ranging from IBM and Hewlett Packard Enterprise to startups around the world are positioning themselves to meet what they see as inevitable demand for enhanced space-based edge processing, beginning onboard satellites and extending to data centers in Earth and lunar orbit.

An artist’s rendering of Japan’s Hayabusa-2 asteroid mission passing near Earth. Israeli startup Ramon.Space supplied computing technology for the Japanese Space Agency mission. Credit: JAXA

Exodus Orbitals, a Canadian startup that rents satellite services to software application developers, established the Edge Computing in Space Alliance in November. The organization quickly attracted nearly two dozen members.

One of the members, Ramon.Space, advertises “space-resilient supercomputing systems.” While they bear little resemblance to terrestrial supercomputers, they are far different from low capacity spaceflight computers and “a lot closer to the kind of computing capability that we have on Earth,” said Lisa Kuo, vice president of strategic sales for Ramon.Space, an Israeli firm established in 2004 that is expanding internationally. “We go over space computing systems with a very fine-tooth comb and adopt the optimal radiation-hardening technique for each component.”

In contrast to the bespoke approach, startup Exo-Space of Pasadena, California, offers FeatherEdge, a platform that applies artificial intelligence and machine learning to Earth observation data to quickly extract valuable information.

Long term, Exo-Space plans to “adapt the technology to the more general-purpose use cases like constellation management or predictive maintenance,” said CEO Jeremy Allam.

Sydney-based Spiral Blue also applies artificial intelligence to Earth imagery with its Space Edge computer.

“Satellites can capture far more data than they can actually bring down,” said Taofiq Huq, Spiral Blue founder and CEO. With improved onboard processing, satellites can highlight and downlink the most important information, like ship locations for maritime vessel tracking, he added.

BOX IT UP

Other firms specialize in packaging terrestrial computers for spaceflight. OrbitsEdge, for example, works with customers including HPE to provide radiation shielding and thermal management systems that allow computers designed for terrestrial applications to function in orbit.

“By relying on the high-power computation pipeline, we have assurances that whatever we’re flying is the most modern stuff,” said Rick Ward, chief technology officer and founder of the Titusville, Florida-based OrbitsEdge. “When we segue to quantum computing, and we’ve already had conversations with some of the quantum computing companies, we can do that as well.”

Cosmic Shielding Corp. takes a similar approach but instead of focusing on safeguarding processors, the Atlanta startup developed a 3D-printed polymer to protect people and electronics in orbit.

“You can build a satellite bus out of this material, and it will provide significant improvements,” said Yanni Barghouty, Cosmic Shielding founder and CEO. “Right now, we’re seeing around a 60 to 70-percent radiation dose-rate reduction versus traditional materials.”

EXTENDING THE EDGE

In addition to enhancing onboard processing, companies are installing edge processors in ground stations and making plans to launch constellations devoted to data processing.

“Edge computing can be performed at different segments, depending upon the use case and the criticality of data,” said IBM’s Altaf. “We can have dedicated compute satellites, which are tasked to take on the heavy payloads in orbit and perform computation services for other satellites.”

If history is any guide, demand for data processing in orbit will continue to climb. Successive generations of terrestrial applications invariably require additional memory and processing speed.

In space, like on the ground, “you want it faster, you want better networking, and you want more power,” said Mark Fernandez, HPE principal investigator for the Spaceborne Computer-2 on the International Space Station.



This article originally appeared in the January 2022 issue of SpaceNews magazine.


https://spacenews.com/living-on-the-edge-satellites-adopt-powerful-computers/

Hewlett Packard Enterprise’s space station computer is in demand

 by Debra Werner — January 24, 2022


Q&A with Mark Fernandez, principal investigator for HPE's Spaceborne Computer-2

Since traveling in February 2021 to the International Space Station, Spaceborne Computer-2 has completed 20 experiments focused on health care, communications, Earth observation and life sciences. Still, the queue for access to the off-the-shelf commercial computer linked to Microsoft’s Azure cloud keeps growing.

Mark Fernandez, principal investigator for Spaceborne Computer-2, sees a promising future for space-based computing. He expects increasingly capable computers to be installed on satellites and housed in orbiting data centers in the coming years. Edge processors will crunch data on the moon, and NASA’s lunar Gateway will host advanced computing resources, Fernandez told SpaceNews.

Fernandez, who holds a doctorate in scientific computing from the University of Southern Mississippi, served as software payload developer for HPE’s original Spaceborne Computer, a supercomputer that reached ISS in August 2017 and returned to Earth a year and a half later in a SpaceX Dragon cargo capsule.

What do people mean when they talk about supercomputers in space?

Small clusters at the edge are positioned as supercomputers because they are more than just a tiny edge device. We called Spaceborne-1 a supercomputer because we did one teraflop of computation in space. That’s orders of magnitude more than anyone had ever done before.

What are you learning from Spaceborne Computer-2?

What is surprising to me is the diversity of the experiments. We have 39 experiments in the queue, and the number of experiments is growing.

We’re analyzing astronaut DNA. That one, in particular, is pleasing to me because the scientists had been waiting weeks or months to get this big DNA sequence down to Earth to analyze it. You can compare this big dataset to the big human genome, but you’re only interested in the mutations.

Well, it took us about 13 minutes of processing and then about two seconds to download it. Suddenly, the scientists said instead of monitoring the health of one astronaut every month, they could monitor the whole crew daily and get a better idea of when space travel is adversely affecting them.

We’re looking at how satellites communicate with each other. Different types of encryption, different types of protocols, different types of compression.

What gives you the most security and uses the least amount of energy?

A lot of experiments have to do with weather and disaster preparation. High-resolution imagery of storms and tornadoes are large data files. Basically, the first responders just want to know where the forest fire is. What is the track of the tornado? You can tell them that in just a few words.

Instead of pictures?

A picture takes forever to get down. We can process that. I want to know where it’s flooded and not flooded. I want to know if the interstate is passable or not.

Are you sending only the most valuable information to the ground?

That’s the first layer of the onion that we’re exploring. It’s an intelligent edge. We don’t want to push all the computation to the edge. We don’t want to push all the computation to the cloud. If I have a multi-step workflow, I can do two or three steps at the edge. But I’m far better off bursting those smaller, mid-workflow results to the cloud.

For example?

It goes back to astronaut DNA. Mutations are updated all the time in databases at the National Institutes of Health and the National Cancer Institute. We have the cloud search those databases.
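The edge/cloud split Fernandez describes can be sketched as a toy pipeline. Everything below is hypothetical (the step names, the tiny "reads" and the in-memory database are invented stand-ins), but it shows the shape: the heavy steps run in orbit, and only a small intermediate result crosses the downlink.

```python
# Hypothetical three-step workflow, echoing the DNA example: heavy steps
# run in orbit, and only a small intermediate result is "burst" to the
# cloud for the final database lookup.

def edge_step_align(raw_reads):
    """Heavy step 1 (runs in orbit): order the raw reads -- simulated."""
    return sorted(raw_reads)

def edge_step_diff(aligned, reference):
    """Heavy step 2 (runs in orbit): keep only the mutations."""
    return [r for r in aligned if r not in reference]

def cloud_step_lookup(mutations, database):
    """Light step 3 (runs on the ground): annotate against a database."""
    return {m: database.get(m, "unknown") for m in mutations}

raw = ["g3", "a1", "c2", "t9"]   # invented raw reads
reference = {"a1", "c2"}         # invented reference genome
database = {"g3": "benign"}      # invented annotation database

mutations = edge_step_diff(edge_step_align(raw), reference)  # small payload
report = cloud_step_lookup(mutations, database)
print(report)  # {'g3': 'benign', 't9': 'unknown'}
```

The payload that crosses the link is the short mutation list, not the full dataset, while the frequently updated databases stay in the cloud where they are easy to refresh.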

What’s the best approach for various types of data?

We’ve got some serious propeller-head scientists running things only on the cloud or only in space on Spaceborne Computer. They differentiate it. They run it only on the CPU, only on the GPU. They are coming up with guidelines.

People also talk about edge processing for satellite operations.

The analog is autonomous driving. Just as all the cars will be talking to each other, all these satellites will be talking to each other. One of them is going to raise their hand and say, “I’ve got good connectivity down to Earth. I’ll deliver that message.” Then, they all agree.


HPE established an alliance in 2019 with OrbitsEdge, a Florida startup with a satellite bus for sensitive electronics. Are you working to install HPE computers on OrbitsEdge satellites?

Yes, indeed. OrbitsEdge is putting up a satellite with multiple distinct computers from HPE. To you, it looks like your computer on your satellite. But they’re actually hosting multiple computers from multiple people completely firewalled off from each other because they are on physically separate devices. They can run whatever protocols they want and whatever communications they like.

How do you envision computing resources in cislunar space?

When we get to the moon, the data center and the high-performance computing will be orbiting the moon, and the outposts will be the edge.

What are the challenges ahead for space-based computing?

They’re all related to space exploration. Power, cooling and networking are not stable. Networking is the most unstable. There are multiple times a day [on ISS] when we don’t have connectivity. If this was your cellphone, you would go get a new provider. But the space station doesn’t have an option.

Where do you imagine this going in Earth orbit, on the moon and Mars?

If OrbitsEdge gets its proof-of-concept going and can have a multi-tenant satellite, the next logical step is a multi-tenant data center built out of larger satellites. OrbitsEdge focuses on power, cooling and networking. They’re leaving that compute to us.

On the moon, you would have low energy communication up to the Gateway. The Gateway will have the power, cooling and storage. A similar architecture is being considered for the Mars outpost.

Do space applications continually demand more computing resources like terrestrial applications?

Yes, you want it faster, you want better networking, and you want more power. No one has complained that they have plenty of Spaceborne Computer right now. They ask, “When can I get back on it?”

This article originally appeared in the January 2022 issue of SpaceNews magazine.


https://spacenews.com/hewlett-packard-enterprises-space-station-computer-is-in-demand/

Space Is the Final Frontier for Data Centers


There are good reasons to send 19-inch racks into orbit, and beyond.


Maria Korolov | Jan 18, 2022


Last year marked the first time humanity deployed a conventional data center in space. The HPE Spaceborne Computer-2 – a set of HPE Edgeline Converged EL4000 Edge and HPE ProLiant machines, each with an Nvidia T4 GPU to support AI workloads – was sent to the International Space Station in February of 2021.

This is the first off-the-shelf server deployed in space to run actual production workloads.

"It is not hardened," said Mark Fernandez, principal investigator for Spaceborne Computer-2 at Hewlett Packard Enterprise. "The goal is to avoid the time and cost to harden a computer, so you can go with the latest technology."

Time for a hardware refresh

Elsewhere in space – on Mars landers, in satellites, in space station control systems – most of the computers are decades old.

"The hardened processors available today are circa 1995, 1996," Fernandez told Data Center Knowledge. Not only are they slow but it's hard to find developers who can write software for these machines, he said.

Plus, all of today's applications are designed to run on modern computers.

The ISS itself runs on Intel 80386SX CPUs that date back to the late 1980s. There are also more than a hundred laptops on the ISS, as well as tablets and other devices. They are used as remote terminals to the command and control multiplexer demultiplexer computers, as well as for email, Internet, and recreation.

Key systems run on hardened hardware that is protected against radiation. That means that they use either redundant circuits or insulating substrates instead of the usual semiconductor wafers on chips.

Developing such a computer takes years, as does testing. Missions are also planned years in advance. By the time such a computer gets to space, it's woefully out of date.

"We want to take data center quality pizza boxes up to space," Fernandez said. To solve the problem of protecting the computers against radiation, HPE decided to try using software.

The first attempt to put a server in space, the Spaceborne Computer-1, was launched in 2017 and spent nearly two years up on the space station, though the mission was only scheduled to run for one year.

That mission had three goals, according to Fernandez.

“First, can you take a computer right off the factory floor, package it up to fit on a rocket and get it to a space station,” he said. “Second, can you train astronauts to install it and get it working. And third, once it’s working, will it give you the right answers, and for how long?”

Spaceborne Computer-1 sat in a locker on the ISS. The lockers are designed to mount inside a space station. Inside that locker, HPE put a standard 19-inch rack, Fernandez said. "So we didn't have to modify the servers at all."

That first mission had two servers, running a suite of internationally recognized benchmarks, 24 hours a day, seven days a week, 365 days of the year.

"We had to prove it worked," Fernandez said. "We wanted to stress the CPU, stress the memory, stress the disks. With the benchmarks, you know what the results are supposed to be, so when the job finishes, you can see if you got the right answer. We did 50,000 benchmarks and not once did we get an error."
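The verification scheme Fernandez describes (run a workload whose answer is known in advance, then check the result) can be sketched in a few lines. The workload below is an invented deterministic stand-in, not one of the benchmarks HPE actually ran.

```python
def run_benchmark(n):
    """Invented deterministic workload: any reproducible kernel with a
    precomputed reference answer works the same way."""
    acc = 0
    for i in range(n):
        acc = (acc + i * i) % 1_000_003
    return acc

# Reference answer recorded on an identical ground-based twin before launch.
GROUND_TRUTH = run_benchmark(10_000)

# In orbit: rerun continuously and count silent errors.
errors = sum(1 for _ in range(100) if run_benchmark(10_000) != GROUND_TRUTH)
print(f"{errors} mismatches in 100 runs")
```

Because the expected answer is fixed, any mismatch is direct evidence of radiation-induced corruption somewhere in the CPU, memory or storage path.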

The same benchmarks were run on Earth on an identical system.

The servers were still working when NASA shut them down and brought them back down to Earth.

"They wanted to take a look at them," he said.

Space doesn’t like SSDs

The space-based system had 20 solid-state drives, nine of which failed over the course of the mission. On the Earth-based twin, only one drive failed.

"We also had five times the number of correctable errors in space than on Earth," said Fernandez. "But they were correctable, and corrected themselves. So we're good there. The main concern was the solid state disks. That's where we paid the most attention in Spaceborne 2."
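Errors that "corrected themselves" can be illustrated with the simplest software-level correction scheme, triple modular redundancy: keep three copies of a value and take a bitwise majority vote, so a single flipped bit in any one copy is outvoted. (A sketch of the general technique; the article doesn't specify which correction methods HPE's software uses.)

```python
def majority_vote(a, b, c):
    """Bitwise majority of three stored copies: a bit is set in the
    result only if it is set in at least two copies, so any
    single-copy bit flip is corrected."""
    return (a & b) | (a & c) | (b & c)

stored = 0b1011_0010
# Simulate a radiation-induced single-bit upset in one copy.
copy1, copy2, copy3 = stored, stored ^ 0b0000_1000, stored
recovered = majority_vote(copy1, copy2, copy3)
print(recovered == stored)   # → True
```

The cost of this approach is tripled storage and an extra vote on every read, which is why hardware ECC uses more compact Hamming-style codes; but it shows how software alone can mask transient upsets on unhardened parts.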

This time, NASA also wanted a system that would last for at least three years – the length of time it would take to go to Mars and back. So HPE doubled the hardware; now there are four servers total, two in each locker.

Spaceborne Computer-2 includes the off-the-shelf HPE Edgeline Converged EL4000 Edge System, a rugged server designed to perform in harsher edge environments with higher shock, vibration, and temperature levels. It's paired with the industry standard HPE ProLiant DL360.

"The Edgeline 4000 includes a GPU so we can do AI, machine learning, and image processing," Fernandez said.

As of mid-December, none of the drives in the new system had failed, he said. However, the servers aren't running the intense benchmarks they ran before, so they aren't being stressed as hard. This time around, Spaceborne Computer-2 is running actual production workloads.

Existing applications

It may be small, but the first edge computing data center is now operational in outer space.

One of the jobs it's doing is DNA analysis. Previously, astronauts would have their DNA tested once a month, and the data sent down to Earth for processing.

Now the processing takes place on the ISS, and only the results are sent down to Earth, reducing the amount of data that needs to be transmitted by a factor of 20,000, Fernandez said.

"Now the scientists here on Earth are able to think about things that weren't even conceivable before," he said. "When I can process the data in 13 minutes and download the results in two seconds, I can monitor astronaut health daily instead of monthly."
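The bandwidth arithmetic behind that claim is easy to sketch. The raw-data size and downlink rate below are assumptions for illustration; the article gives only the 20,000x factor and the 13-minute/2-second figures:

```python
# Back-of-the-envelope for the 20,000x reduction described above.
# Raw-data size and link rate are illustrative assumptions, not
# figures from the article.
raw_bytes    = 200e9            # hypothetical raw output: 200 GB
reduction    = 20_000           # factor quoted in the article
link_Bps     = 100e6 / 8        # hypothetical 100 Mbit/s downlink, bytes/s

result_bytes = raw_bytes / reduction      # 10 MB of processed results
t_raw        = raw_bytes / link_Bps       # seconds to downlink everything
t_result     = result_bytes / link_Bps    # seconds to downlink results only

print(f"raw: {t_raw/3600:.1f} h, results: {t_result:.1f} s")
# → raw: 4.4 h, results: 0.8 s
```

Under these assumed numbers, the raw dataset ties up the link for hours while the processed results fit in under a second, which is the same order of magnitude as the two-second download Fernandez cites.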

Researchers can also analyze the DNA of rodents and plants on the ISS, he said.

"Another big area is communications research. That includes work related to 5G and beyond communications testing and simulations, satellite-to-satellite communications, different communication protocols, different security algorithms, different encryption algorithms, and also new protocols for satellites to send data down to Earth.”

Image processing is another top use case for space-based data centers, Fernandez said. Cameras in orbit collect massive numbers of images from Earth, but there's a limit to how much can be downloaded.

Many of the images are of clouds, or empty seas. What people are actually interested in is what's changing in the images, Fernandez said. "Where is it flooded in Houston after the hurricane? Is this road still passable after the flood? You want to get information to the first responders as soon as possible."
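At its core, the change-detection idea reduces to comparing successive frames and flagging only the tiles that differ, so only those tiles need to be downlinked. A deliberately minimal pure-Python sketch (real onboard pipelines also handle cloud masking, georeferencing, and compression):

```python
def changed_tiles(before, after, tile=4, threshold=10.0):
    """Flag tiles whose mean absolute pixel change exceeds a threshold,
    so only the changed tiles need to cross the downlink."""
    h, w = len(before), len(before[0])
    flagged = []
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            diff = sum(abs(after[y+i][x+j] - before[y+i][x+j])
                       for i in range(tile) for j in range(tile))
            if diff / (tile * tile) > threshold:
                flagged.append((y, x))
    return flagged

# Two 8x8 grayscale "frames": identical except one brightened 4x4 block.
before = [[50] * 8 for _ in range(8)]
after  = [row[:] for row in before]
for i in range(4):
    for j in range(4):
        after[i][j] = 120

print(changed_tiles(before, after))   # → [(0, 0)]
```

Only the tile containing the change is flagged; the three unchanged tiles never leave the satellite.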

The obvious use cases involve simple counting. How many cars are in the store parking lot? How much construction equipment is still on the site? How many container ships are in port?

This particular use case is still in the proof-of-concept stage, Fernandez said, where the recipients use the Spaceborne computer to do the processing in parallel with their existing systems. "I want to know, did I get the correct answer, and how much sooner did I get it to you than you got it the normal way?"

That's when the light bulbs go off, he said, and customers realize that they need this kind of processing on their next satellite.

Of course, putting a data center on a satellite poses a different set of challenges than putting one on the ISS. The ISS's internal environment is human-friendly: the temperature is regulated, there's air, and human hands are around to fix anything that needs fixing.

The need for air to cool the servers has already been addressed. Both Spaceborne 1 and Spaceborne 2 were water cooled, Fernandez said.

"We're allowed to tap into the water cooling loop on the space station," he said.

Path to commercialization

To make all of it happen, HPE is working with a partner, OrbitsEdge.

"Our plan is to build a box that does radiation shielding and thermal management so whatever we put in that box can fly and work," said Rick Ward, founder and CTO at OrbitsEdge.

The OrbitsEdge satellite system is shaped roughly like an umbrella with solar panels on top to collect energy and provide shade for the computer below it. Then, at the very bottom, there are radiators that send excess heat directly into space.

In other words, power and cooling are free.

Ward didn't go into detail about how exactly the cooling system works. "It's not water, but something else that works as an integrated cooling system and radiation shield," he told Data Center Knowledge. "But I can't say anything beyond that."

The first demonstration satellite is expected to launch before the end of 2022.

Eventually, Ward said, he expects to see servers not just in satellites and the ISS and the new commercial space stations, but on the Lunar Gateway space station and on the moon itself, and also on Mars, and in orbit around Mars.

The earliest use case will be imaging satellites, he said. "That's our low-hanging fruit."

The goal here is to lower the barriers to entry for space operations and make space computing similar to any other kind of edge computing, he said.

Adding computing power to satellites that are already scheduled to go up is a simple use case and will make those satellites more valuable.

As launch costs come down, special-purpose constellations of satellites that are just designed for data processing can be sent up to handle space-based workloads.

"I will say five years from now, there will exist an operational capability to process space data in space," Ward said.

At some point it will become possible to considerably expand the computing power available in space, either because the launch costs will drop dramatically, or because of space-based manufacturing of computer equipment.

At that point, space-based data centers can start handling workloads for Earth-based customers, he said.

Terrestrial data centers have high power costs and occupy valuable real estate, which is in limited supply. In space, there are no clouds to come between the sun and the solar panels, and cooling is free.

Today's data centers have relatively low initial costs, the cost of the building and equipment, but the ongoing operating costs never stop, Ward said. "In space, you invert that. You have a high upfront cost, but your ongoing costs are significantly lower."

Space-based data centers can offer other advantages. Quantum computing, for example, requires extremely low temperatures. In space, you can get down to extremely low temperatures simply by keeping the computer in the shade. And there's no vibration in space.

And space-based manufacturing allows new kinds of lithography, perfect crystals, and other advantages over terrestrial facilities.

Putting data centers in space makes good business sense, HPE's Fernandez said: "We're doing a lot of proofs of concept and experiments. The volume of those indicates to me that this is a market."

Another company that plans to start early trials of a space-based data center platform is NTT, which is working in partnership with Japan’s SKY Perfect JSAT Holdings.

The company plans a network of satellites with computing and storage capabilities, linked by optical connections to form a single data center. The first satellites in this network are expected to launch by 2025.

"We've almost finished designing the basic architecture of this system," NTT spokesperson Daisuke Kawano told Data Center Knowledge. "We've already received positive feedback from potential customers around the world."

The on-board computing will speed up data downloads from satellites, he said. It can also reduce the amount of information that needs to be transmitted by compressing the data, or analyzing it in space.
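The compression half of that claim is standard: redundant telemetry or imagery compresses well, so compressing in orbit shrinks what must cross the downlink. A quick stdlib illustration (the data and the resulting ratio here are synthetic, not NTT figures):

```python
import zlib

# Synthetic, highly redundant telemetry stream.
telemetry = b"TEMP=21.5;VOLT=3.30;MODE=NOMINAL;" * 5_000
packed = zlib.compress(telemetry, level=9)

ratio = len(telemetry) / len(packed)
print(f"{len(telemetry)} bytes -> {len(packed)} bytes ({ratio:.0f}x)")
```

Real sensor data is less repetitive than this toy stream, so real ratios are far smaller; the larger wins come from analyzing the data in orbit and sending only conclusions, as in the DNA-processing example above.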

"We expect more competitors to emerge next year with the same concept," Kawano said.

Last year the European Space Agency launched PhiSat-1, the first satellite with AI processing on board. PhiSat-1 uses Intel's Movidius Myriad 2 chip, an off-the-shelf part not specifically designed for spaceflight.

There's still a lot of work that needs to be done before we see a full-scale space data center industry, said North Dakota State University computer science professor Jeremy Straub, an expert in space-based computing.

Launch costs need to fall further, he told Data Center Knowledge.

"We also need more infrastructure in space," he added.

He agreed that it makes sense to add more computing power to satellites, though he wouldn't call that an example of data centers in space.

"I would also expect a lot of computing on space stations," he said. "We'll see a server room, to support activities on the station. It won't really be a data center in the sense of serving others, and not the same size as you typically think of in a data center."


https://www.datacenterknowledge.com/hardware/space-final-frontier-data-centers 

Tech Billionaires Are In Space, And Data Centers Are Following Close Behind



July 26, 2021 Dan Rabb, Bisnow Data Centers Reporter 


Tech billionaires are launching themselves into orbit, and now data centers are heading to space as well.

Space-based data centers are close to becoming a reality, as an ever-growing flood of satellite data necessitates processing and storage in orbit around the Earth. At the same time, a planned lunar data center is at the heart of an international effort to build a permanent settlement on the moon.


While no one will confuse the lunar surface with Virginia's Loudoun County anytime soon, industry insiders focused on the intersection of rack space and outer space say off-Earth data centers will be a springboard for innovation and commercial applications in space.

On the horizon? Lunar laboratories, space-based drones, autonomous spacecraft and other technologies — and that's just the beginning. 

“This is going to be a bedrock piece of infrastructure for a fully developed space information ecosystem,” said Richard Ward, the founder of space data startup OrbitsEdge, which plans to launch a constellation of edge data centers into orbit starting next year.

“We're looking at it as a utility, and I think it’s eventually going to be considered as vital as electricity in terms of the possibilities it opens up.” 

While data centers in space may pave the way for innovation from the realm of science fiction, a market for off-Earth computing and storage is already here.

Commercial spaceflight has driven down the cost of launching satellites and other equipment, and a torrent of data now constantly floods back to Earth from imaging and remote sensing satellites used to study everything from traffic patterns to ocean temperature.

This explosion of data has helped drive global demand for terrestrial data centers and cloud capacity — both Microsoft and Google created cloud platforms specifically for this market, building satellite links into data centers and developing cloud-based command and control services. 

So why move that processing power to space? Experts say that the operators of these satellites want to avoid having to use another expensive resource: the transmission link to the ground. 

A significant percentage of the information collected by remote sensing equipment is of little or no value, said North Dakota State University computer science professor Jeremy Straub, an expert in space-based computing.

A satellite used to study agricultural yields may still generate images when the satellite isn’t over farmland or when the ground is obscured by clouds. That’s a lot of useless information that requires money, time and energy to transmit.

“Processing the data that’s collected in space using a space-based system allows you to make use of the transmission link more effectively,” Straub said. “You can process the data in space to separate just the most important data, which allows you to reduce cost and to send it back to Earth more quickly.”
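Straub's farmland example maps to a very simple onboard filter: score each frame and discard those that are mostly cloud before they ever reach the transmitter. A toy sketch (treating bright pixels as cloud is a crude stand-in for a real cloud mask, and the thresholds are invented):

```python
def worth_downlinking(frame, cloud_threshold=0.4, brightness_cutoff=200):
    """Keep a frame only if it isn't mostly cloud. Counts very bright
    pixels as cloud -- a crude stand-in for a real cloud-masking model."""
    pixels = [p for row in frame for p in row]
    cloud_fraction = sum(p >= brightness_cutoff for p in pixels) / len(pixels)
    return cloud_fraction < cloud_threshold

clear  = [[80, 90], [70, 85]]        # mostly land: keep
cloudy = [[230, 240], [220, 90]]     # mostly cloud: discard
print(worth_downlinking(clear), worth_downlinking(cloudy))   # → True False
```

Every frame discarded in orbit is transmission time, energy, and money saved, which is exactly the economics Straub describes.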

Often called orbital edge computing, this kind of simple automated space-based data processing is already in use commercially by a number of operators. Satellites launched by San Francisco-based Loft Orbital, which colocates payloads from multiple clients, include a central computing hub capable of running multiple processing tasks simultaneously for different payloads. 


But the first true space data centers — satellites bearing enterprise servers that remotely provide advanced processing power and connectivity for other space-based infrastructure — are still likely a year or more away from launch.

Japanese telecom giant NTT and satellite operator SKY Perfect JSAT are collaborating on a network of data center satellites. NTT said it expects the first of these satellites to launch by 2025.

In the U.S., Florida-based OrbitsEdge expects to activate the first of 30 planned data center satellites by the end of 2022. Each satellite is effectively an edge data center similar to what might be found on Earth, with a modular rack designed to accommodate off-the-shelf servers. Initially, these units will house Hewlett Packard Enterprise servers, currently in their second round of testing on the International Space Station as part of a broader effort to adapt computing hardware to the harsh environment of space.

The increased exposure to radiation outside the Earth’s atmosphere presented one of the most significant engineering problems, OrbitsEdge CEO Richard Ward said. Even after developing a proprietary casing that blocks much of the particle bombardment, he expects the servers to have a shelf life of just five years. 

Developing cooling systems that operate without monitoring also presented challenges, according to Ward — although he and other experts point to some advantages of space for data centers. While solar energy may be unpredictable on Earth, the energy available for solar panels in space can be predicted precisely years in advance.  

Ward said that, while he expects these data centers to be used mainly to process imaging data at first, they will eventually enable everything from advanced space-based research labs to autonomous spacecraft. He said space-based data centers are the missing piece of infrastructure needed to create an information ecosystem that will open the floodgates of innovation and commercialization in space.

“Optimizing and improving Earth observation data is just the low-hanging fruit,” Ward said. 

The European Space Agency shares the view that data centers in space are needed to make space more accessible to both governments and private industry. It's looking at building one on the moon.

In May, the ESA contracted Italian aerospace firm Thales Alenia to study the feasibility of establishing a lunar data center in the next decade. The future facility is being studied as a central element of the ESA’s joint effort with NASA to establish research stations and self-sustaining settlements on the moon. 

ESA’s Moonlight Initiative, along with NASA’s Artemis mission, aims to build lunar infrastructure as both a test run for Mars and as a tool to stimulate space commercialization. The building blocks of a future telecommunications network around the moon — such as a GPS-like lunar navigation system and space-based 4G nodes — are in the works.

These projects envision permanent science colonies on the lunar surface, with laboratories for experiments and base camps for lunar exploration using drones and autonomous vehicles. Remote computing power and data storage are key elements of these plans, but terrestrial data centers are of limited utility for many applications: it takes information about 1.5 seconds to travel from the moon to Earth.
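The quoted latency follows from the speed of light. A quick check (the Earth-Moon distance varies over the lunar orbit, and the ~1.5-second figure presumably includes relay and processing overhead on top of raw light time):

```python
# One-way light time from the Moon to Earth at the mean distance.
C_KM_S  = 299_792.458      # speed of light, km/s
MEAN_KM = 384_400          # mean Earth-Moon distance, km

one_way = MEAN_KM / C_KM_S
round_trip = 2 * one_way
print(f"one-way: {one_way:.2f} s, round trip: {round_trip:.2f} s")
# → one-way: 1.28 s, round trip: 2.56 s
```

A command-and-response loop therefore takes over two and a half seconds at minimum, which is why the next paragraph calls this a non-starter for real-time tasks like flying a drone.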

That kind of latency is a non-starter when the computing is being used for tasks like helping an astronaut fly a drone over the moon’s surface. 

Reduced latency for space-based computing is a key element of making space travel more accessible, according to ESA officials. If any of the computing functions on a spacecraft can be handled remotely, that means less hardware that has to be specifically designed for that mission, less development time and, perhaps most importantly, less money. 

“We see this as significantly reducing the cost and complexity of subsequent individual expeditions,” said Graham Turnock, chief executive of the UK Space Agency, speaking to media following the project’s announcement.

“It will be a base for future exploration and economic activity of the sort that we can only begin imagining today, and starting to put that infrastructure in place is essential.”


Do space-based data centers have any utility for current data storage or processing needs on Earth? The next time Jeff Bezos goes to space, will it be to scout locations for the next AWS facility?

The final frontier makes for bad real estate when it comes to today’s Earthly colocation or cloud computing needs, North Dakota State’s Straub said. Just the cost of transporting the necessary materials into space would far exceed the cost of building a new data center anywhere in the world. 

“The economics don’t make sense, and they won’t at any point in the foreseeable future,” Straub said. “Are we going to be launching server racks into space because it’s cheaper than building a data center in Houston? No.”


Contact Dan Rabb at dan.rabb@bisnow.com 


https://www.bisnow.com/national/news/data-center/tech-billionaires-are-in-space-and-data-centers-are-following-close-behind-109658

Data Centers Above the Clouds: Colocation Goes to Space

By Doug Mohney | June 30, 2020


As the cost of building and launching satellites continues to drop, melding IT concepts with satellite operations to bring data center services into Earth orbit and beyond is emerging as the next big thing. Colocation of server hardware, virtually running applications in the cloud, and edge computing are all familiar concepts to the data center world, but the space industry wants to apply those ideas to satellite-based business models.


Until recently, satellite hardware and software were tightly coupled and purpose-built for a single function. The introduction of commercial off-the-shelf processors, open-standards software, and standardized hardware is enabling companies to repurpose orbiting satellites for different tasks simply by uploading new software, and allowing a single satellite to be shared by hosting hardware for two or more users.


This “Space as a Service” concept can be used for operating multi-tenant hardware in a micro-colocation model or offering virtual server capacity for “above the clouds” computing. Several space startups are integrating micro-data centers into their designs, offering computing power to process satellite imaging data or monitor distributed sensors for Internet of Things (IoT) applications.


OrbitsEdge Plans Racks in Space


Florida-based OrbitsEdge is embracing a data center in orbit model, taking off-the-shelf rackmount servers and bolting them into a satellite bus (the structural frame housing payloads).

“We’re both edge computing and data center,” said Rick Ward, Chief Technical Officer of OrbitsEdge. “We want to put big-performance computing infrastructure into space to process data, cleanse it, aggregate data from multiple sources and analyze it. We are that missing piece of the infrastructure to commercial space.”


OrbitsEdge is able to communicate with other satellites to collect and process their data, as well as perform overhead edge computing where a traditional data center is unavailable or not close enough. The company sees opportunities in offloading and storing data from Earth Observation satellites, processing it into immediately usable imagery, and sending the results directly to end-users in the field. It has had discussions with the U.S. Department of Defense, NASA, and commercial cloud providers on how such non-traditional resources could be useful for various use cases on Earth, in space, and on the surface of other celestial bodies.


“It’s another location for processing data above the clouds,” said Sylvia France, President of OrbitsEdge. “There’s a lot of interest in fintech, being able to make buy/sell decisions based on counting cars in parking lots. We’re also talking to entertainment companies as well, from space tourists to augmented reality firms.”


The OrbitsEdge SatFrame is the company’s proprietary satellite bus, built around a standardized 19-inch server rack with available volume for 5U of hardware. The company’s first two SatFrame pathfinder satellites will support 18-inch-deep hardware, with production designs capable of growing to support full-sized 36-inch-deep hardware.


Onboard SatFrame-1 and SatFrame-2 will be HPE EL8000 servers. France said the exact hardware setups are still being worked out, with different configurations to be implemented onboard each satellite to test and verify various CPUs and other hardware.


While HPE has flown a server onboard the International Space Station, the human-supporting environment is relatively benign compared to what OrbitsEdge needs to do. Supporting off-the-shelf servers in space requires SatFrame to have a large solar panel array to generate power, batteries to keep the system running when it is in the shadow of the planet, thermal controls to dump heat from operating hardware, and protection from cosmic radiation and solar flare events.


If successful, OrbitsEdge may go beyond Earth orbit to the Moon, Mars, and deep-space missions. As distances increase, so do communications delays, and bandwidth becomes more constrained. Probes and humans will need on-site computing for autonomous vehicle operations, vision processing, and analysis of raw data.

“Our initial plan is to start at Low Earth Orbit then go to Geosynchronous Earth Orbit and cis-lunar locations,” said Ward. “Possibly planetary surface missions where we’re either static as a part of a base or habitat, but we also have the capability to attach onto a vehicle.”

Loft Offers ‘Space Infrastructure As A Service’


The attractiveness of sharing a satellite for lower operational costs and faster time to deliver production services is keeping San Francisco start-up Loft Orbital very busy, especially when combined with substantial simplifications for customers in setup and operations. Among Loft’s announced clients are DARPA’s Blackjack program, geo-data specialist Fugro, European satellite operator Eutelsat, the UAE government, and startups Orbital Sidekick and SpaceChain.

“Conceptually, the idea of AWS operating compute infrastructure for others is what we’re doing for space,” said Loft Orbital co-founder and COO Alex Greenberg. “We’ll have our first satellite launch this year and have four missions underway. We’re adding more customers very quickly.”

While Loft Orbital normally offers the option of hosting a customer’s payload onboard its satellites and controlling it via its Cockpit web portal, in some cases Loft will also develop or buy the payload itself, allowing the customer to focus on their applications.


“In the data center analogy, we’re the virtualization between the data center and the hardware, we’re providing Space Infrastructure as a Service,” Greenberg said.


Onboard its first satellite, Yet Another Mission 2 (YAM-2), Loft is providing this turnkey process for Eutelsat’s IoT service. Eutelsat is more accustomed to operating large, expensive communications satellites than to building and operating small ones. It makes more financial and business sense for Loft to provide the infrastructure for Eutelsat’s satellite IoT service than for the company to enter that field from scratch. Loft’s first two satellite missions will include proof-of-concept tests for Eutelsat’s future IoT constellation.


“We’re taking away effort from the customer, saving the customer time, resources, and money,” Greenberg explained. “But there’s a lot more than that as well. We’re optimizing for simplicity and speed, with our payload hub acting as an abstraction layer between the payload and the satellite bus. Traditionally, tons of subsystems have to be customized. Building satellites and payloads in low volumes means there’s no economies of scale.”


Loft successfully bet on having a steady stream of customers, buying multiple copies of a satellite bus (essentially a barebones satellite without sensors) ahead of time to get quantity discounts, then pulling out the bus and plugging in payloads when enough customers are lined up to fill it.

“The net result is we make the customer’s life a lot easier,” said Greenberg. “We leave the bus as is, there’s no non-recurring engineering or customization required. We get them to orbit a lot faster since they don’t have to do the engineering and we literally bought the bus well in advance, putting not only payload and bus manufacturing, but also launch procurement and mission operations timelines in parallel.”


Another capability Loft offers is a software-defined payload leveraging the software-defined radios onboard its satellites. Customers are already using the service, selecting specific antennas depending on the radio frequencies required. Loft can timeshare usage between multiple customers for applications such as IoT and RF spectrum surveys.


Future plans include onboard processing, with Loft ingesting data from payloads such as IoT and imagery and then allowing customers to use the satellite compute environment to analyze their data onboard the satellite rather than shipping it to the ground.


Improved Economics for Space-Powered IoT

Price-conscious satellite Internet of Things (IoT) start-ups such as Lacuna Space and OQ Technology are embracing hosting hardware and running virtualized tasks on third-party satellites when they can find usable opportunities, but it’s hard to find a perfect fit for every requirement.


“The main advantage of hosting is financial,” said Rob Spurrett, CEO of Lacuna Space. “It is simply more cost effective to share space with other payloads because, in principle, the platforms become progressively cheaper as they get larger … Sometimes there are last minute deals on hosted platforms where a payload supplier is running late, or cancelled, and those can be great bargains, but hard to come by.”


Lacuna Space uses a tweaked version of the LoRaWAN protocol to pick up data from IoT devices around the world. Its first five platforms in space are a mix of dedicated satellites and hosted communication packages sharing space onboard other satellites. Moving forward, Lacuna Space will build and launch 24 dedicated satellites, because sharing requires compromise.


“You tend to lose a degree of control (by sharing),” Spurrett stated. “The platform and mission performance is not necessarily driven by just your needs, but by a compromise where the combination of needs of all the payloads need to be considered … As our constellation becomes more complex, then using hosted platforms becomes more complex and the logistical difficulties overrun the cost savings.”


OQ Technology conducted the first tests of its 5G-based NB-IoT service using a satellite originally launched by Denmark-based GomSpace. NB-IoT is short for Narrowband Internet of Things, a low-power wide-area network technology for connecting distributed devices. The satellite was reconfigured to communicate with NB-IoT devices on the ground by uploading new software written by OQ. As the company moves forward, OQ Technology plans to use a combination of existing satellites, hosted payloads, and its own satellites to deliver global NB-IoT coverage.


Like Lacuna Space, OQ is using what’s available, but there aren’t any perfect fits for sharing satellites. “We don’t choose one, we have to use what is out there and reliable, investors like when you can scale up and invest less in hardware,” said founder and CEO Omar Qaise. “Not every satellite has the right frequency and power we need, so hopefully there will be in future enough ‘constellation as a service’ platforms with flexibility. Today we have not identified any for (OQ Technology’s) commercial case, but there are many companies promising that.”


https://datacenterfrontier.com/data-centers-above-the-clouds-colocation-goes-to-space/

Process Data, Before it Gets to Earth

 

By John Tucker | January 29, 2020



As we move toward the commercialization of space, data is going to become a much sought-after commodity for businesses and public organizations involved in the development of the space industry.


Sending data back to Earth for processing is going to take a vast amount of bandwidth and will cause delays in communication. Without the correct data infrastructure in place, progress will be much slower. By creating robust data centers in space, OrbitsEdge is getting ahead of the problem with a solution that could prove an invaluable resource to other space startups.

Bottlenecks in data processing in space are already a problem, and as space opens up to the possibilities brought about by lower-cost commercial space activity, this problem will only grow. OrbitsEdge is already there with a solution, and by joining forces with Hewlett Packard Enterprise, it is proving that it has the data processing power it needs.


What is OrbitsEdge?

Formed in 2018, OrbitsEdge helps other businesses and public organizations collect and process huge amounts of data in space. Whether that data is accumulated through the Internet of Things or from testing, the time it takes for this data to return to Earth for processing can cause a lot of issues.


This Florida-based startup has paired up with Hewlett Packard Enterprise (HPE) to create the perfect synergy. Utilizing the micro-datacenter technology developed by HPE, OrbitsEdge will develop it further for use in space, hosted on the OrbitsEdge SatFrame.


Who’s Behind OrbitsEdge?

OrbitsEdge founder and Chief Technical Officer Richard Ward has a background at Deep Space Industries. He has been instrumental in developing the proprietary SatFrame, which houses the equipment to be used in Low Earth Orbit.


CEO Barb Stinnett has over three decades' worth of Silicon Valley experience and has been involved in many major tech businesses, including Hewlett Packard, Oracle, and Cisco. She has also served as CEO and has held a number of board of directors positions in private equity and venture capital firm portfolios.


OrbitsEdge And Hewlett Packard Enterprises

In November 2019, OrbitsEdge announced that it would be utilizing the HPE Edgeline Converged Edge Systems to enable space companies to manage their data more efficiently in space.

Created by OrbitsEdge, the SatFrame is designed to host the technology and compensate for stressors such as radiation in space.


With both companies working together to produce technology that can withstand the harsh environments of space, many other companies and space organizations will gain a valuable technological resource that will help propel their interests forward.


With edge computing bringing the technology to NewSpace companies operating in the emerging space industries, OrbitsEdge has positioned itself well: it will be able to assist many companies in reaching their goals of developing commercial businesses in space.


https://medium.com/newspace-hub/process-data-before-it-gets-to-earth-2d2b79c0ca37

Space-data-as-a-service gets going


By Patrick Nelson | December 12, 2019


Development of IoT services in space will require ruggedized edge computing. Vendor OrbitsEdge has announced a development deal with HPE.

Upcoming  space commercialization will require hardened edge-computing  environments in a small footprint with robust links back to Earth, says  vendor OrbitsEdge, which recently announced that it had started collaborating with Hewlett Packard Enterprise on computing-in-orbit solutions.


OrbitsEdge says it’s the first to provide a commercial data-center environment for installation in orbit, and will be using HPE’s Edgeline Converged Edge System in a hardened, satellite micro-data-center platform it’s selling called SatFrame.


The  idea is “to run analytics such as artificial intelligence (AI) on the  vast amounts of data that will be created as space is commercialized,”  says Barbara Stinnett, CEO of OrbitsEdge, in a press release.


Why data in space?

IoT data collection and analysis, along with experimental testing, are two examples of space industrialization that the company gives as use cases for its micro-data-center product. However, commercial use of space also includes imagery, communications, weather forecasting and navigation. Space tourism and commercial recovery of space resources, such as raw materials mined from asteroids, are likely future uses of space, too.


Also, manufacturing – exploiting vacuum and zero-gravity environments – is among the economic activities that could take advantage of number crunching in orbit.


Additionally, Cloud Constellation Corp., a company I wrote about in 2017, unrelated to OrbitsEdge or HPE, reckons highly sensitive data should be stored isolated in space. That would be the “ultimate air-gap security,” as it describes its SpaceBelt product.


Why edge in space?

OrbitsEdge  believes that data must be processed where it is collected, in space,  in order to reduce transmission bottlenecks as streams are piped back to  Earth stations. “Due to the new wave of low-cost commercial space  activity, the bottleneck will get worse,” the company explains on its  website.

What  it means is that getting satellites into space is now cheap and is  getting cheaper (due primarily to reusable rocket technology), but that  there’s a problem getting the information back to traditional cloud  environments on the surface of the Earth; there’s not enough backhaul  data capacity, and that increases processing costs. Therefore, the cloud  needs to move to the data-collection point: It’s “IoT above the cloud,”  OrbitsEdge cleverly taglines.
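The backhaul arithmetic behind this argument can be sketched in a few lines. All numbers below (link rate, daily ground contact time, data volumes, the 2% reduction) are illustrative assumptions for the sketch, not OrbitsEdge figures:

```python
# Back-of-envelope downlink arithmetic. All numbers (link rate, daily ground
# contact time, data volumes) are illustrative assumptions, not OrbitsEdge's.

def days_to_downlink(data_gb: float, link_mbps: float,
                     contact_min_per_day: float) -> float:
    """Days needed to move `data_gb` through a pass-limited ground link."""
    gb_per_day = link_mbps / 8 / 1000 * 60 * contact_min_per_day
    return data_gb / gb_per_day

raw = days_to_downlink(data_gb=500, link_mbps=100, contact_min_per_day=40)
# Suppose on-orbit analytics trims the stream to 2% of the raw volume:
processed = days_to_downlink(data_gb=10, link_mbps=100, contact_min_per_day=40)
print(f"raw: {raw:.1f} days, processed: {processed:.2f} days")
```

With these assumed figures, the raw capture takes over two weeks to drain through the daily contact windows, while the processed product fits in a fraction of a day: the essence of moving the cloud to the data-collection point.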


How it works

Satellite-mounted  solar arrays collect power from the sun. They fill batteries to be used  when the satellite is in the shadow of Earth.


Cooling and radiation shielding protect a standard 5U, 19-inch server rack. There’s a separate rack for the avionics. Integrated, traditional space-to-space and space-to-ground radio communications handle the comms. Future-proofing is also considered: laser data pipes could be supported too, the company says.


On Earth option

Interestingly,  the company is also pitching its no-maintenance, low Earth orbit  (LEO)-geared product as being suitable for terrestrial extreme  environments, too. OrbitsEdge claims that SatFrame is robust enough for  extreme chemical and temperature environments on Earth. Upselling, it  also says that one could combine two micro-data centers: a LEO SatFrame  running HPE’s Edgeline, communicating with another one in an extreme  on-Earth location—one at the Poles, maybe.


“To  keep up with the rate of change and the number of satellites being  launched into low Earth orbit, new services have to be made available,”  OrbitsEdge says. “Shipping data back to terrestrial clouds is  impractical, however today it is the only choice,” it says.


https://www.networkworld.com/article/3489484/space-data-as-a-service-gets-going.html

OrbitsEdge partners with HPE on orbital data center computing and analytics

By Darrell Etherington | December 3, 2019


What kinds of businesses might be able to operate in space? Well, data centers are one potential target you might not have thought of. Space provides an interesting environment for data center operations, including advanced analytics operations and even artificial intelligence, due in part to the excellent cooling conditions and reasonable access to renewable power supply (solar). But there are challenges, which is why a new partnership between Florida-based space startup OrbitsEdge and Hewlett Packard Enterprise (HPE) makes a lot of sense.


The  partnership will make OrbitsEdge a hardware supplier for HPE’s Edgeline  Converged Edge Systems, and basically it means that the space startup  will be handling everything required to “harden” the standard HPE  micro-data center equipment for use in outer space. Hardening is a  standard process for getting stuff ready to use in space, and  essentially prepares equipment to withstand the increased radiation,  extreme temperatures and other stressors that space adds to the mix.


OrbitsEdge,  founded earlier this year, has developed a proprietary piece of  hardware called the “SatFrame” which is designed to counter the stress  of a space-based operating environment, making it relatively easy to  take off-the-shelf Earth equipment like the HPE Edgeline system and get  it working in space without requiring a huge amount of additional,  custom work.


In terms of what this will potentially provide, the partnership will mean it’s more feasible than ever to set up a small-scale data center in orbit to handle at least some of the processing of space-based data right near where it’s collected, rather than having to shuttle it back down to Earth. That process can be expensive, and even finding companies and infrastructure to handle it can be difficult. As with in-space manufacturing, doing things locally could save a lot of overhead and unlock tons of potential down the line.


https://techcrunch.com/2019/12/03/orbitsedge-partners-with-hpe-on-orbital-datacenter-computing-and-analytics/

Space IT Bridge: OrbitsEdge announces OEM agreement with HP Enterprise for in-orbit compute power

 By Doug Mohney | December 3, 2019


Startup OrbitsEdge, Inc. announced it has signed an original equipment manufacturer (OEM) agreement with Hewlett Packard Enterprise (HPE) to host HPE Edgeline Converged Edge Systems onboard its SatFrame space-hardened satellite to enable commercial space companies to deploy computing in orbit and accelerate exploration. Given HPE’s previous work onboard the International Space Station (ISS), this isn’t a big surprise.


“Hewlett  Packard Enterprise is the ideal partner for OrbitsEdge since its  technologies have proven to withstand extreme environments on Earth and  in space, with its deployment of the Spaceborne Computer in the  International Space Station (ISS). This partnership follows HPE’s  innovative strategy of enabling new solutions to be developed and  deployed years in advance,” said Barbara Stinnett, chief executive  officer of OrbitsEdge, Inc. “OrbitsEdge will leverage HPE’s edge  technology to run sophisticated analytics such as artificial  intelligence (AI) on the vast amounts of data that will be created as  space is commercialized,” she added.


OrbitsEdge’s proprietary SatFrame bus is designed to support and protect commercial off-the-shelf (COTS) data center rack-mountable computing gear from the challenges of in-orbit operations, with SatFrame providing protection against radiation as well as temperature control, power, and communications. An HPE Edgeline Converged Edge System will be the first hosted payload onboard SatFrame, providing what OrbitsEdge calls a “micro-datacenter in orbit” for processing space-based data and helping minimize the time and cost of backhaul to Earth.


“We  are committed to pushing technology limits to power the next era of  innovation, whether it’s here on Earth or in space,” said Phillip  Cutrone, vice president and general manager, Worldwide OEM at HPE. “The  HPE Edgeline Converged Edge Systems provide datacenter-grade  performance, data acquisition, industrial networks, and control in harsh  edge environments to enable real-time insight and action. By combining  our technologies with the OrbitsEdge SatFrame hardening design, the  commercial space industry gains advanced systems to create new  space-based applications and solutions.”


The  SatFrame 445 bus provides a standard 19 inch server rack for up to 5U  (Rack U, not Cubesat U of space, satellite bros) hardware and can  support up to full-size 36 inch deep hardware. OrbitsEdge plans to  launch a “sub 300” kilogram satellite in its first flight demonstration  with 18 inch (half-deep) hardware onboard, with payloads operating on a  “day/night” cycle on the satellite to conserve power and manage heat,  powering up when the satellite is in the sun and shutting down on the  night side of the Earth.


One  potential application for OrbitsEdge-style in-orbit computing power would be to process imagery directly from other low Earth orbit (LEO)  satellites. Today, visual and radar imagery are typically transmitted in  raw form down to a ground station and into the data center and then  processed and sent to the end-user. On-orbit processing would  substantially reduce satellite downlink bandwidth needs and could  provide a processed image directly to an end-user more quickly by  removing the ground data center as an intermediary. Faster imaging  processing would be a bonanza for civilian and national defense users –  the latter group an area HPE is quite familiar with.
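The imagery workflow described above amounts to a triage step: score what was captured on orbit and downlink only what matters. The sketch below is hypothetical; the threshold "detector" stands in for whatever real analytics would run on the hosted hardware:

```python
# Hypothetical on-orbit triage: score each raw image tile, keep only tiles
# that pass a threshold, and downlink just those. The lambda "detector" is a
# stand-in for real analytics (e.g. an AI model on the hosted system).

def triage_tiles(tiles, score, threshold):
    """Return (tiles worth downlinking, fraction of downlink volume saved)."""
    kept = [t for t in tiles if score(t) >= threshold]
    return kept, 1 - len(kept) / len(tiles)

# Toy capture: each "tile" is reduced to a single brightness value here.
tiles = [0.1, 0.9, 0.2, 0.8, 0.05, 0.95, 0.3, 0.7]
kept, saved = triage_tiles(tiles, score=lambda t: t, threshold=0.5)
print(f"downlinking {len(kept)}/{len(tiles)} tiles, volume cut {saved:.0%}")
```

In this toy run, half the tiles are discarded on orbit, which is exactly the downlink-bandwidth reduction the article describes, just at miniature scale.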


Other applications for in-orbit computing include financial transactions and any that need low latency outside of a traditional data center. How OrbitsEdge fits into the overall scheme of edge computing and 5G will be interesting to watch, since edge and 5G both emphasize low latency as an advantage over backhauling computation to a traditional data center.


https://www.spaceitbridge.com/orbitsedge-announces-oem-agreement-with-hp-enterprise-for-in-orbit-compute-power.htm

Space Startup OrbitsEdge And HP Partner To Build Data Centers In Outer Space

By Ron Mendoza | December 3, 2019


OrbitsEdge,  provider of Low Earth Orbit (LEO) Edge micro-data centers, partners up  with Hewlett Packard Enterprise (HPE) to help make data more accessible  for companies in space.

Florida-based startup OrbitsEdge announced on Tuesday via press release that it has signed an original equipment manufacturer (OEM) contract with HPE. Under the new agreement, OrbitsEdge will be a supplier for HPE's Edgeline Converged Edge Systems. The team-up will produce data centers deployed in outer space, designed to make computing and data processing available where data is collected rather than sending it back to Earth.


OrbitsEdge applies a hardening solution to HPE's equipment to enable it to endure extreme conditions like radiation and the other environmental stressors it will be subjected to in space. Founded this year, OrbitsEdge calls its proprietary technology for protecting hardware "SatFrame." The "ruggedized satellite bus" is designed to withstand the harsh environment of space.

A Former HPE Executive as CEO
 

The company also appointed a former Hewlett Packard executive, Barbara Stinnett, as CEO back in September. Stinnett's resumé spans over 30 years of experience with Silicon Valley companies, namely HPE, Cisco and Oracle.


"Hewlett  Packard Enterprise is the ideal partner for OrbitsEdge since its  technologies have proven to withstand extreme environments on Earth and  in space, with its deployment of the Spaceborne Computer in the  International Space Station (ISS). This partnership follows HPE's  innovative strategy of enabling new solutions to be developed and  deployed years in advance," said Stinnett.


"OrbitsEdge  will leverage HPE's edge technology to run sophisticated analytics such  as artificial intelligence (AI) on the vast amounts of data that will  be created as space is commercialized," she added.


"We  are committed to pushing technology limits to power the next era of  innovation, whether it's here on Earth or in space," said Phillip  Cutrone, vice president and general manager, Worldwide OEM at HPE. 


"The  HPE Edgeline Converged Edge Systems provide datacenter-grade  performance, data acquisition, industrial networks, and control in harsh  edge environments to enable real-time insight and action. By combining  our technologies with the OrbitsEdge SatFrame hardening design, the  commercial space industry gains advanced systems to create new  space-based applications and solutions."


https://www.ibtimes.com/space-startup-orbitsedge-hp-partner-build-data-centers-outer-space-2878743

Space IT Bridge / OrbitsEdge – Edge Computing in the sky

By Doug Mohney | October 4, 2019


OrbitsEdge is likely to give headaches to traditional satellite providers and offer intriguing possibilities to the growing edge computing movement. The company is offering a proprietary satellite bus designed to protect off-the-shelf rack-mountable computing gear from the harsh environment of space, enabling users to tap into IT resources with low latency. It’s also likely to temporarily confuse Cubesat people with its use of “U” for rack space volume.


“What  we’re looking at in the past and today, all the computers that go up on  satellites are vintage tech,” said Rick Ward, Chief Technology Officer  at OrbitsEdge. “There’s a tremendous amount of work on radiation  hardening to make sure they work for a very long time. There’s no modern  computer out in space. We’re looking to change that.”


The  SatFrame 445 satellite will fly in Low Earth Orbit (LEO), providing  power, thermal control/cooling, improved radiation protection and a host  of communications capabilities to a standard 19 inch server rack with  available space for 5U of hardware up to full-size 36 inch deep  hardware. In addition, software “hardening” of devices will be necessary  to compensate for radiation faults and potential damage.


Radiation is the biggest threat to computing in space, as solar flares and cosmic radiation randomly zip through RAM and CPUs; in the best case, the result is a simple “bit flip” in memory or a process that requires a reboot. Physical damage to chips also occurs over time, making memory and CPUs unusable. Worse yet, devices become more vulnerable to radiation as fabrication processes get smaller. Packing more transistors onto a piece of silicon means the latest generation of chips are the most likely to be brutalized and rendered ineffective by the higher radiation levels found outside of Earth’s atmosphere.
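One classic software tactic for masking such single-event bit flips, in the spirit of the software "hardening" mentioned earlier, is triple modular redundancy (TMR): keep three copies of each value and majority-vote on read. The sketch below is a minimal illustration of the idea, not a description of HPE's or OrbitsEdge's actual mechanism:

```python
# Triple modular redundancy (TMR) in software: store three copies of a value
# and take a bitwise majority vote on read, so a single-event upset that
# flips bits in one copy is voted out by the other two.

def tmr_write(value: int) -> list[int]:
    return [value, value, value]              # three redundant copies

def tmr_read(copies: list[int]) -> int:
    a, b, c = copies
    return (a & b) | (a & c) | (b & c)        # bitwise majority vote

word = tmr_write(0b1011_0101)
word[1] ^= 0b0001_0000                        # simulate a radiation bit flip
assert tmr_read(word) == 0b1011_0101          # the corrupted copy is outvoted
```

The trade-off is the classic one the article implies: TMR costs 3x the memory and extra CPU cycles, but it lets commodity chips keep working through faults instead of requiring decade-long hardware hardening.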


Only  recently has newer off-the-shelf IT and computing hardware gone up into  orbit, but experience is limited. Smaller cubesats have used cell  phones due to low cost and compactness along with a low-cost/lower  lifetime philosophy of 1 to 3 years in orbit, while HP Enterprise (HPE)  recently launched a “supercomputer” to the International Space Station.  The Spaceborne computer was built around an HPE Apollo 40-class system  and used a modified Linux OS, with the computer returned to Earth after  over a year of operation for teardown and fine analysis.


OrbitsEdge  plans to launch a “sub 300 kilogram range” satellite as a testbed for  its technologies and COTS hardware, with half-deep rack (18 inch)  hardware onboard.   Payloads will operate on a “day/night” cycle on the  demo satellite to conserve power and manage heat, powering up when solar  energy is available to run devices and shutting down when on the night  side of Earth.


“Our demo mission is the smallest,” said Ward. “We’re only taking what’s essential to the mission. One of the things about high-capacity computing is it’s very power intensive. We’re running a 1 kilowatt heater, so you have to get rid of the heat. If you want to run at night, you more than double mass to take that step up,” between the larger solar panels and batteries needed to provide power when the sun isn’t available.
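Ward's point about night operation can be roughed out with a simple eclipse power budget: running the payload through eclipse requires both batteries sized for the eclipse and a larger solar array to recharge them in sunlight. Only the 1 kW draw comes from the article; every other figure below is a generic assumption for illustration, not an OrbitsEdge number:

```python
# Rough eclipse power budget. Only the 1 kW payload draw comes from the
# article; eclipse/sunlight durations, battery specific energy, and depth of
# discharge are generic assumptions for illustration.

PAYLOAD_W = 1000            # continuous payload draw ("1 kilowatt heater")
ECLIPSE_MIN = 35            # assumed LEO eclipse duration per orbit
SUN_MIN = 60                # assumed sunlit duration per orbit
BATT_WH_PER_KG = 150        # assumed Li-ion specific energy
DEPTH_OF_DISCHARGE = 0.3    # shallow cycling for battery longevity (assumed)

eclipse_wh = PAYLOAD_W * ECLIPSE_MIN / 60
battery_kg = eclipse_wh / (BATT_WH_PER_KG * DEPTH_OF_DISCHARGE)

# The array must carry the load AND refill the battery during each sunlit arc:
array_w = PAYLOAD_W + eclipse_wh / (SUN_MIN / 60)
print(f"battery: {battery_kg:.0f} kg, array: {array_w:.0f} W "
      f"(vs {PAYLOAD_W} W array for day-only operation)")
```

Under these assumptions the satellite needs roughly a dozen kilograms of battery plus an array more than half again as large, before counting extra radiator mass to reject the same 1 kW at night, which is why a day-only duty cycle makes sense for a small demo.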


Multiple  commercial architectures will be onboard, but Ward declined to provide  specifics on what gear or potential CPU types may be on board.   He did,  however, concur with Space IT Bridge that potential load outs could include low-end CPUs, representation for  GPUs such as NVIDIA, and the latest silicon. The upside to the latest  chip fabrication technologies is placing multiple cores on a single chip  and the ability to monitor CPUs, shutting down one when it is damaged.


Why  put computing into space in the first place? One real world application  is being able to process imagery faster from other LEO satellites.  Radar and visual imagery are transmitted in raw form to a ground station  into the data center, consuming time and bandwidth. On-orbit processing  would reduce bandwidth needs and could provide a processed image  directly to an end user faster. For civilian and national defense users,  faster imagery processing would be a bonanza.


Other  potential applications could include any requiring extremely low  latency, such as financial transactions, and any that could benefit from  edge computing. A commercial version of the OrbitsEdge satellite will  have multiple radios “some talking up, some talking down, some talking  sideways,” said Ward, illustrating the need to send processed data back  to the ground, upward for relay through a GEO communications satellite,  and to communicate with other satellites for picking up and passing  along raw and processed information.


OrbitsEdge is still exploring different business models. Initial satellites may be populated with servers, with users leveraging VMware to run virtual instances of the apps they need, while agencies and enterprises request more customized hardware loads tailored to specific needs. For security and speed purposes, organizations may order (buy) dedicated satellites, but potential customers need to become comfortable with, and understand the advantages of, on-orbit edge computing.


CEO Barbara Stinnett says OrbitsEdge has seed funding good through 2020 and staff on hand, and is preparing to secure a Series A round in the first quarter of 2020, talking to a mixture of venture capital funds and strategic partners. There are also a series of OEM announcements in the works, with more information expected to be released in the upcoming months.


“We have three markets interested, all around sustainability,” Stinnett said. Oil, gas, and water infrastructure is one sector, government the second, and life sciences/health care the third. Being able to provide easily accessible computing resources is of interest in multiple markets.

Stinnett  would not discuss how many satellites OrbitsEdge expects to put into  orbit, saying the company had looked at it and would be disclosing their  plans at a future time.


Space IT Bridge finds the concept of OrbitsEdge intriguing in a couple of aspects. It brings back the age-old “Big Dumb Pipe” versus “Smart Network” discussion that started in the 1990s-2000s VON Magazine era. Big dumb pipe advocates believed that if you have enough broadband and low latency, everything can be solved by hauling functions and processes back to the data center, an argument that proved significantly true with the deployment of SDN and NFV in telecom networks.


However, 5G and its introduction of edge computing have brought back discussion of a smart network. The 5G community believes edge computing is an asset in time-sensitive applications affected by latency or simply waiting around for a response, but the telecom community continues to define the set of use cases where edge computing is a “win” in 5G (due in part to the fact that 5G network deployments are continuously evolving works in progress, with available RF bandwidth dictating architecture).


LEO  broadband services being deployed by OneWeb, SpaceX, Telesat, and  LeoSat are the “Big Dumb Pipe” of the 2020s. Will low latency and  sufficient broadband in an underserved/unserved area be good enough for  many/most users and applications or will OrbitsEdge fill in the role of  “Smart network” by bringing edge computing to the equation? There’s no  clear answer at the moment.


https://www.spaceitbridge.com/orbitsedge-edge-computing-in-the-sky.htm


OrbitsEdge, Inc.

Cocoa Beach, Florida, United States

(334) 791-7472

Copyright © 2025 OrbitsEdge - All Rights Reserved.
