SERVIR/ResilienceLinks Webinar on “Floods, Drought, and Water Security: How is Water Data Critical to Climate Resilience?”
-
By NASA
4 min read
NASA Finds ‘Sideways’ Black Hole Using Legacy Data, New Techniques
Image showing the structure of galaxy NGC 5084, with data from the Chandra X-ray Observatory overlaid on a visible-light image of the galaxy. Chandra’s data, shown in purple, revealed four plumes of hot gas emanating from a supermassive black hole rotating “tipped over” at the galaxy’s core. Credits: X-ray: NASA/CXC, A. S. Borlaff, P. Marcum et al.; Optical full image: M. Pugh, B. Diaz; Image Processing: NASA/USRA/L. Proudfit

NASA researchers have discovered a perplexing case of a black hole that appears to be “tipped over,” rotating in an unexpected direction relative to the galaxy surrounding it. That galaxy, called NGC 5084, has been known for years, but the sideways secret of its central black hole lay hidden in old data archives. The discovery was made possible by new image analysis techniques developed at NASA’s Ames Research Center in California’s Silicon Valley to take a fresh look at archival data from the agency’s Chandra X-ray Observatory.
Using the new methods, astronomers at Ames unexpectedly found four long plumes of plasma – hot, charged gas – emanating from NGC 5084. One pair of plumes extends above and below the plane of the galaxy. A surprising second pair, forming an “X” shape with the first, lies in the galaxy plane itself. Hot gas plumes are not often spotted in galaxies, and typically only one or two are present.
The method revealing such unexpected characteristics for galaxy NGC 5084 was developed by Ames research scientist Alejandro Serrano Borlaff and colleagues to detect low-brightness X-ray emissions in data from the world’s most powerful X-ray telescope. What they saw in the Chandra data seemed so strange that they immediately looked to confirm it, digging into the data archives of other telescopes and requesting new observations from two powerful ground-based observatories.
Hubble Space Telescope image of galaxy NGC 5084’s core. A dark, vertical line near the center shows the curve of a dusty disk orbiting the core, whose presence suggests a supermassive black hole within. The disk and black hole share the same orientation, fully tipped over from the horizontal orientation of the galaxy. Credit: NASA/STScI, M. A. Malkan, B. Boizelle, A.S. Borlaff. HST WFPC2, WFC3/IR/UVIS.

The surprising second set of plumes was a strong clue this galaxy housed a supermassive black hole, but there could have been other explanations. Archived data from NASA’s Hubble Space Telescope and the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile then revealed another quirk of NGC 5084: a small, dusty, inner disk turning about the center of the galaxy. This, too, suggested the presence of a black hole there, and, surprisingly, it rotates at a 90-degree angle to the rotation of the galaxy overall; the disk and black hole are, in a sense, lying on their sides.
The follow-up analyses of NGC 5084 allowed the researchers to examine the same galaxy using a broad swath of the electromagnetic spectrum – from visible light, seen by Hubble, to longer wavelengths observed by ALMA and the Expanded Very Large Array of the National Radio Astronomy Observatory near Socorro, New Mexico.
“It was like seeing a crime scene with multiple types of light,” said Borlaff, who is also the first author on the paper reporting the discovery. “Putting all the pictures together revealed that NGC 5084 has changed a lot in its recent past.”
“Detecting two pairs of X-ray plumes in one galaxy is exceptional,” added Pamela Marcum, an astrophysicist at Ames and co-author on the discovery. “The combination of their unusual, cross-shaped structure and the ‘tipped-over,’ dusty disk gives us unique insights into this galaxy’s history.”
Typically, astronomers expect the X-ray energy emitted from large galaxies to be distributed evenly in a generally sphere-like shape. When it’s not, such as when concentrated into a set of X-ray plumes, they know a major event has, at some point, disturbed the galaxy.
Possible dramatic moments in its history that could explain NGC 5084’s toppled black hole and double set of plumes include a collision with another galaxy and the formation of a chimney of superheated gas breaking out of the top and bottom of the galactic plane.
More studies will be needed to determine what event or events led to the current strange structure of this galaxy. But it is already clear that the never-before-seen architecture of NGC 5084 was only discovered thanks to archival data – some almost three decades old – combined with novel analysis techniques.
The paper presenting this research was published Dec. 18 in The Astrophysical Journal. The image analysis method developed by the team – called Selective Amplification of Ultra Noisy Astronomical Signal, or SAUNAS – was described in The Astrophysical Journal in May 2024.
For news media:
Members of the news media interested in covering this topic should reach out to the NASA Ames newsroom.
Last Updated Dec 18, 2024
-
By NASA
5 min read
Data from the SWOT satellite was used to calculate average water levels for lakes and reservoirs in the Ohio River Basin from July 2023 to November 2024. Yellow indicates values greater than 1,600 feet (500 meters) above sea level; dark purple represents water levels less than 330 feet (100 meters).

Data from the U.S.-European Surface Water and Ocean Topography mission gives researchers a detailed look at lakes and reservoirs in a U.S. watershed.
The Ohio River Basin stretches from Pennsylvania to Illinois and contains a system of reservoirs, lakes, and rivers that drains an area almost as large as France. Researchers with the SWOT (Surface Water and Ocean Topography) mission, a collaboration between NASA and the French space agency CNES (Centre National d’Études Spatiales), now have a new tool for measuring water levels not only in this area, which is home to more than 25 million people, but in other watersheds around the world as well.
Since early 2023, SWOT has been measuring the height of nearly all water on Earth’s surface — including oceans, lakes, reservoirs, and rivers — covering nearly the entire globe at least once every 21 days. The SWOT satellite also measures the horizontal extent of water in freshwater bodies. Earlier this year, the mission started making validated data publicly available.
“Having these two perspectives — water extent and levels — at the same time, along with detailed, frequent coverage over large areas, is unprecedented,” said Jida Wang, a hydrologist at the University of Illinois Urbana-Champaign and a member of the SWOT science team. “This is a groundbreaking, exciting aspect of SWOT.”
Researchers can use the mission’s data on water level and extent to calculate how the amount of water stored in a lake or reservoir changes over time. This, in turn, can give hydrologists a more precise picture of river discharge — how much water moves through a particular stretch of river.
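This level-plus-extent calculation can be sketched in a few lines. The trapezoidal (average-area) approximation below is a standard hydrology technique for estimating storage change between two observations, not the SWOT project's official algorithm, and all numbers are invented for illustration:

```python
# Hypothetical sketch: estimating reservoir storage change from paired
# observations of water surface elevation and extent, as a satellite like
# SWOT provides. Assumes lake area varies roughly linearly with level
# between the two passes (trapezoidal approximation).

def storage_change(h1_m, a1_km2, h2_m, a2_km2):
    """Volume change in km^3 between two observations:
    mean surface area times the change in water level."""
    dh_km = (h2_m - h1_m) / 1000.0        # level change, converted to km
    mean_area_km2 = 0.5 * (a1_km2 + a2_km2)
    return mean_area_km2 * dh_km          # km^3

# Example: level rises 0.8 m while extent grows from 120 to 126 km^2.
dv = storage_change(150.0, 120.0, 150.8, 126.0)
print(f"storage change: {dv:.4f} km^3")   # roughly 0.098 km^3 gained
```

Summing such changes across every lake and reservoir in a basin is one way the combined level-and-extent data can feed estimates of water availability and river discharge.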
The visualization above uses SWOT data from July 2023 to November 2024 to show the average water level above sea level in lakes and reservoirs in the Ohio River Basin, which drains into the Mississippi River. Yellow indicates values greater than 1,600 feet (500 meters), and dark purple represents water levels less than 330 feet (100 meters). Comparing how such levels change can help hydrologists measure water availability over time in a local area or across a watershed.
Complementing a Patchwork of Data
Historically, estimating freshwater availability for communities within a river basin has been challenging. Researchers gather information from gauges installed at certain lakes and reservoirs, from airborne surveys, and from other satellites that look at either water level or extent. But for ground-based and airborne instruments, the coverage can be limited in space and time. Hydrologists can piece together some of what they need from different satellites, but the data may or may not have been taken at the same time, or the researchers might still need to augment the information with measurements from ground-based sensors.
Even then, calculating freshwater availability can be complicated. Much of the work relies on computer models. “Traditional water models often don’t work very well in highly regulated basins like the Ohio because they have trouble representing the unpredictable behavior of dam operations,” said George Allen, a freshwater researcher at Virginia Tech in Blacksburg and a member of the SWOT science team.
Many river basins in the United States include dams and reservoirs managed by a patchwork of entities. While the people who manage a reservoir may know how their section of water behaves, planning for water availability down the entire length of a river can be a challenge. Since SWOT looks at both rivers and lakes, its data can help provide a more unified view.
“The data lets water managers really know what other people in these freshwater systems are doing,” said SWOT science team member Colin Gleason, a hydrologist at the University of Massachusetts Amherst.
While SWOT researchers are excited about the possibilities that the data is opening up, there is still much to be done. The satellite’s high-resolution view of water levels and extent means there is a vast ocean of data that researchers must wade through, and it will take some time to process and analyze the measurements.
More About SWOT
The SWOT satellite was jointly developed by NASA and CNES, with contributions from the Canadian Space Agency (CSA) and the UK Space Agency. NASA’s Jet Propulsion Laboratory, managed for the agency by Caltech in Pasadena, California, leads the U.S. component of the project. For the flight system payload, NASA provided the Ka-band radar interferometer (KaRIn) instrument, a GPS science receiver, a laser retroreflector, a two-beam microwave radiometer, and NASA instrument operations. The Doppler Orbitography and Radioposition Integrated by Satellite system, the dual frequency Poseidon altimeter (developed by Thales Alenia Space), the KaRIn radio-frequency subsystem (together with Thales Alenia Space and with support from the UK Space Agency), the satellite platform, and ground operations were provided by CNES. The KaRIn high-power transmitter assembly was provided by CSA.
To learn more about SWOT, visit:
https://swot.jpl.nasa.gov
News Media Contacts
Jane J. Lee / Andrew Wang
Jet Propulsion Laboratory, Pasadena, Calif.
818-354-0307 / 626-379-6874
jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov
2024-176
Last Updated Dec 17, 2024
-
By NASA
This article is from the 2024 Technical Update.
Multiple human spaceflight programs are underway at NASA including Orion, Space Launch System, Gateway, Human Landing System, and EVA and Lunar Surface Mobility programs. Achieving success in these programs requires NASA to collaborate with a variety of commercial partners, including both new spaceflight companies and robotic spaceflight companies pursuing crewed spaceflight for the first time. It is not always clear to these organizations how to show their systems are safe for human spaceflight. This is particularly true for avionics systems, which are responsible for performing some of a crewed spacecraft’s most critical functions. NASA recently published guidance describing how to show the design of an avionic system meets safety requirements for crewed missions.
Background
The avionics in a crewed spacecraft perform many safety-critical functions, including controlling the position and attitude of the spacecraft, activating onboard abort systems, and firing pyrotechnics. The incorrect operation of any of these functions can be catastrophic, causing loss of the crew. NASA’s human rating requirements describe the need for “additional rigor and scrutiny” when designing safety-critical systems, beyond that applied to uncrewed spacecraft [2]. Unfortunately, it is not always clear how to interpret this guidance and show that an avionics architecture is sufficiently safe. To address this problem, NASA recently published NASA/TM−20240009366 [1]. It outlines best practices for designing safety-critical avionics and describes the key artifacts, or evidence, NASA needs to assess the safety of an avionics architecture.
Failure Hypothesis
One of the most important steps in designing an avionics architecture for a crewed spacecraft is the specification of the failure hypothesis (FH). In short, the FH summarizes any assumptions the designers make about the type, number, and persistence of component failures (e.g., of onboard computers or network switches). It divides the space of all possible failures into two parts: failures the system is designed to tolerate and failures it is not.
One key part of the FH is a description of failure modes the system can tolerate – i.e., the behavior exhibited by a failed component. Failure modes are categorized using a failure model. A typical failure model for avionics splits failures into two broad categories:
Value failures, where data produced by a component is missing (i.e., an omissive failure) or incorrect (i.e., a transmissive failure).
Timing failures, where data is produced by a component at the wrong time.
Timing failures can be further divided into many sub-categories, including:
Inadvertent activation, where data is produced by a component without the necessary preconditions.
Out-of-order failures, where data is produced by a component in an incorrect sequence.
Marginal timing failures, where data is produced by a component slightly too early or late.
In addition to occurring when data is produced by a component, these failure modes can also occur when data enters a component (e.g., a faulty component can corrupt a message it receives). Moreover, all failure modes can manifest in one of two ways:
Symmetrically, where all observers see the same faulty behavior.
Asymmetrically, where some observers see different faulty behavior.
Importantly, NASA’s human-rating process requires that each of these failure modes be mitigated if it can result in catastrophic effects [2]. Any exceptions must be explicitly documented and strongly justified. In addition to specifying the failure modes a system can tolerate, the FH must specify any limiting assumptions about the relative arrival times of permanent failures and radiation-induced upsets/errors, or about the ability of ground operators to intervene to safe the system or take recovery actions. For more information on specifying an FH and other artifacts needed to evaluate the safety of an avionics architecture for human spaceflight, see the full report [1].
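The taxonomy above lends itself to a simple data model. The sketch below is purely illustrative: the class names, enum values, and mitigation strings are hypothetical and are not taken from NASA/TM−20240009366. It encodes the value/timing failure modes and symmetric/asymmetric manifestations, plus a check for the human-rating rule that catastrophic failure modes must carry a mitigation:

```python
# Illustrative sketch (hypothetical names): a minimal data model for a
# failure hypothesis, covering the failure-mode taxonomy described above
# and flagging catastrophic failure modes that lack a documented mitigation.
from __future__ import annotations
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    VALUE_OMISSIVE = auto()       # data missing
    VALUE_TRANSMISSIVE = auto()   # data incorrect
    TIMING_INADVERTENT = auto()   # produced without necessary preconditions
    TIMING_OUT_OF_ORDER = auto()  # produced in an incorrect sequence
    TIMING_MARGINAL = auto()      # produced slightly too early or late

class Manifestation(Enum):
    SYMMETRIC = auto()    # all observers see the same faulty behavior
    ASYMMETRIC = auto()   # some observers see different faulty behavior

@dataclass
class FailureAssumption:
    component: str
    mode: Mode
    manifestation: Manifestation
    catastrophic: bool
    mitigation: str | None = None   # e.g., "triplex voting", "watchdog"

def unmitigated_catastrophic(fh: list[FailureAssumption]) -> list[FailureAssumption]:
    """Entries violating the rule: catastrophic effects must be mitigated
    (or explicitly waived and justified)."""
    return [f for f in fh if f.catastrophic and not f.mitigation]

fh = [
    FailureAssumption("flight computer", Mode.VALUE_TRANSMISSIVE,
                      Manifestation.ASYMMETRIC, True, "triplex voting"),
    FailureAssumption("network switch", Mode.TIMING_INADVERTENT,
                      Manifestation.SYMMETRIC, True),   # no mitigation yet
]
for gap in unmitigated_catastrophic(fh):
    print(f"unmitigated catastrophic failure mode: {gap.component} / {gap.mode.name}")
```

A table like this, one row per assumed component failure, is one concrete way to make the boundary of the failure hypothesis explicit and reviewable.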
-
By NASA
At Goddard Space Flight Center, the GSFC Data Science Group has completed testing of its SatVision Top-of-Atmosphere (TOA) Foundation Model, a geospatial foundation model for coarse-resolution, all-sky remote sensing imagery. The team, composed of Mark Carroll, Caleb Spradlin, Jordan Caraballo-Vega, Jian Li, Jie Gong, and Paul Montesano, has now released the model for wide application in science investigations.
Foundation models can transform the landscape of remote sensing (RS) data analysis by enabling the pre-training of large computer-vision models on vast amounts of remote sensing data. These models can be fine-tuned with small amounts of labeled training data and applied to various mapping and monitoring tasks. Because most existing foundation models are trained solely on cloud-free satellite imagery, they are limited to land-surface applications or require atmospheric corrections. SatVision-TOA is trained on all-sky conditions, which enables applications involving atmospheric variables (e.g., cloud or aerosol).
SatVision-TOA is a 3-billion-parameter model trained on 100 million images from the Moderate Resolution Imaging Spectroradiometer (MODIS). This is, to our knowledge, the largest foundation model trained solely on satellite remote sensing imagery. By including “all-sky” conditions during pre-training, the team incorporated a range of cloud conditions often excluded in traditional modeling. This enables 3D cloud reconstruction and cloud modeling in support of Earth and climate science, offering a significant enhancement for large-scale Earth observation workflows.
With an adaptable and scalable model design, SatVision-TOA can unify diverse Earth observation datasets and reduce dependency on task-specific models. SatVision-TOA leverages one of the largest public datasets to capture global contexts and robust features. The model could have broad applications for investigating spectrometer data, including MODIS, VIIRS, and GOES-ABI. The team believes this will enable transformative advancements in atmospheric science, cloud structure analysis, and Earth system modeling.
The model architecture and model weights are available on GitHub and Hugging Face, respectively. For more information, including a detailed user guide, see the associated white paper: SatVision-TOA: A Geospatial Foundation Model for Coarse-Resolution All-Sky Remote Sensing Imagery.
Examples of image reconstruction by SatVision-TOA. Left: MOD021KM v6.1 cropped image chip using MODIS bands [1, 3, 2]. Middle: The same images with randomly applied 8×8 mask patches, masking 60% of the original image. Right: The reconstructed images produced by the model, along with their respective Structural Similarity Index Measure (SSIM) scores. These examples illustrate the model’s ability to preserve structural detail and reconstruct heterogeneous features, such as cloud textures and land-cover transitions, with high fidelity. Credit: NASA
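The masking step described in the caption, hiding 60% of an image chip in 8×8 patches so the model learns to reconstruct the missing regions, can be sketched as below. This is an illustrative NumPy reimplementation of generic masked-image-modeling input preparation, not the SatVision-TOA training code, and the function name is hypothetical:

```python
# Hypothetical sketch: randomly mask a fraction of an image's 8x8 patches,
# the kind of input corruption used in masked-image-modeling pre-training.
import numpy as np

def mask_patches(img, patch=8, ratio=0.6, seed=0):
    """Zero out `ratio` of the non-overlapping patch x patch tiles.
    Returns the masked image and the boolean per-patch mask grid."""
    h, w = img.shape[:2]
    gh, gw = h // patch, w // patch          # patch-grid dimensions
    rng = np.random.default_rng(seed)
    n_masked = int(round(gh * gw * ratio))
    chosen = rng.choice(gh * gw, size=n_masked, replace=False)
    mask = np.zeros(gh * gw, dtype=bool)
    mask[chosen] = True
    mask = mask.reshape(gh, gw)
    out = img.copy()
    for i in range(gh):
        for j in range(gw):
            if mask[i, j]:
                out[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch] = 0
    return out, mask

chip = np.ones((64, 64, 3), dtype=np.float32)   # stand-in for a MODIS chip
masked, mask = mask_patches(chip)
print(mask.mean())   # fraction of patches masked, close to 0.6
```

During pre-training, the model only sees the masked input and is scored on how well its reconstruction matches the hidden patches, which is what the SSIM values in the figure quantify.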
-
By European Space Agency
Launched in May 2024, ESA’s EarthCARE satellite is nearing the end of its commissioning phase with the release of its first data on clouds and aerosols expected early next year. In the meantime, an international team of scientists has found an innovative way of applying artificial intelligence to other satellite data to yield 3D profiles of clouds.
This is particularly welcome news for those eagerly awaiting data from EarthCARE in their quest to advance climate science.