By NASA
At NASA, high-end computing is essential for many agency missions. This technology helps us advance our understanding of the universe – from our planet to the farthest reaches of the cosmos. Supercomputers enable projects across diverse research, such as making discoveries about the Sun’s activity that affects technologies in space and life on Earth, building artificial intelligence-based models for innovative weather and climate science, and helping redesign the launch pad that will send astronauts to space with Artemis II.
These projects are just a sample of the many on display in NASA’s exhibit during the International Conference for High Performance Computing, Networking, Storage and Analysis, or SC24. NASA’s Dr. Nicola “Nicky” Fox, associate administrator for the agency’s Science Mission Directorate, will deliver the keynote address, “NASA’s Vision for High Impact Science and Exploration,” on Tuesday, Nov. 19, where she’ll share more about the ways NASA uses supercomputing to explore the universe for the benefit of all. Here’s a little more about the work NASA will share at the conference:
1. Simulations Help in Redesign of the Artemis Launch Environment
This simulation of the Artemis I launch shows how the Space Launch System rocket’s exhaust plumes interact with the air, water, and the launch pad. Colors on surfaces indicate pressure levels: red for high pressure and blue for low pressure. The teal contours illustrate where water is present. NASA/Chris DeGrendele, Timothy Sandstrom
Researchers at NASA’s Ames Research Center in California’s Silicon Valley are helping ensure astronauts launch safely on the Artemis II test flight, the first crewed mission of the Space Launch System (SLS) rocket and Orion spacecraft, scheduled for 2025. Using the Launch, Ascent, and Vehicle Aerodynamics software, they simulated the complex interactions between the rocket plume and the water-based sound suppression system used during the Artemis I launch, a launch that damaged the mobile launcher platform supporting the rocket before liftoff.
Comparing simulations with and without the water systems activated revealed that the sound suppression system effectively reduces pressure waves, but exhaust gases can redirect water and cause significant pressure increases.
The simulations, run on the Aitken supercomputer at the NASA Advanced Supercomputing facility at Ames, generated about 400 terabytes of data. This data was provided to aerospace engineers at NASA’s Kennedy Space Center in Florida, who are redesigning the flame deflector and mobile launcher for the Artemis II launch.
2. Airplane Design Optimization for Fuel Efficiency
In this comparison of aircraft designs, the left wing models the aircraft’s initial geometry, while the right wing models an optimized shape. The surface is colored by the air pressure on the aircraft, with orange surfaces representing shock waves in the airflow. The optimized design modeled on the right wing reduces drag by 4% compared to the original, leading to improved fuel efficiency. NASA/Brandon Lowe
To help make commercial flight more efficient and sustainable, researchers and engineers at NASA Ames are working to refine aircraft designs to reduce air resistance, or drag, by fine-tuning the shape of wings, fuselages, and other structural components. These changes would lower the energy required for flight, reducing the fuel needed and the emissions produced, enhancing overall aircraft performance, and potentially lowering noise levels around airports.
Using NASA’s Launch, Ascent, and Vehicle Aerodynamics computational modeling software, developed at Ames, researchers are leveraging the power of agency supercomputers to run hundreds of simulations to explore a variety of design possibilities – on existing aircraft and future vehicle concepts. Their work has shown the potential to reduce drag on an existing commercial aircraft design by 4%, translating to significant fuel savings in real-world applications.
3. Applying AI to Weather and Climate
This visualization compares the track of Hurricane Ida, a Category 4 storm, from MERRA-2 reanalysis data (left) with a prediction made without specific training from NASA and IBM’s Prithvi WxC foundation model (right). Both models were initialized at 00 UTC on 2021-08-27. The University of Alabama in Huntsville/Ankur Kumar; NASA/Sujit Roy
Traditional weather and climate models produce global and regional results by solving mathematical equations for millions of small areas (grid boxes) across Earth’s atmosphere and oceans. NASA and partners are now exploring newer approaches that use artificial intelligence (AI) techniques to train a foundation model.
Foundation models are developed using large, unlabeled datasets, so researchers can fine-tune them for different applications, such as generating weather forecasts or projecting climate change, with minimal additional training.
NASA developed the open-source, publicly available Prithvi Weather-Climate foundation model (Prithvi WxC) in collaboration with IBM Research. Prithvi WxC was pretrained using 160 variables from NASA’s Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) dataset on the newest NVIDIA A100 GPUs at the NASA Advanced Supercomputing facility.
Armed with 2.3 billion parameters, Prithvi WxC can model a variety of weather and climate phenomena – such as hurricane tracks – at fine resolutions. Applications include targeted weather prediction and climate projection, as well as representing physical processes like gravity waves.
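To make the fine-tuning idea concrete, here is a minimal, hypothetical PyTorch sketch: a frozen, pretrained backbone (the stand-in class PretrainedWxBackbone below is illustrative, not the actual Prithvi WxC API) paired with a small task-specific head that is the only part trained on labeled examples. This is the sense in which downstream applications need only minimal additional training.

```python
import torch
import torch.nn as nn

class PretrainedWxBackbone(nn.Module):
    """Stand-in for a pretrained weather/climate foundation model (hypothetical)."""
    def __init__(self, n_vars=160, embed_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_vars, embed_dim), nn.GELU(),
                                     nn.Linear(embed_dim, embed_dim))
    def forward(self, x):          # x: (batch, n_vars) atmospheric state per grid box
        return self.encoder(x)     # latent embedding learned during pretraining

class FineTunedForecaster(nn.Module):
    """Frozen backbone plus a small trainable head for one downstream task."""
    def __init__(self, backbone, n_targets=1):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():   # freeze pretrained weights
            p.requires_grad = False
        self.head = nn.Linear(256, n_targets)  # only this layer is trained
    def forward(self, x):
        return self.head(self.backbone(x))

backbone = PretrainedWxBackbone()              # in practice, loaded from a checkpoint
model = FineTunedForecaster(backbone)
optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One toy training step on synthetic data (real use: labeled forecast targets).
x = torch.randn(32, 160)
y = torch.randn(32, 1)
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```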
4. Simulations and AI Reveal the Fascinating World of Neutron Stars
3D simulation of pulsar magnetospheres, run on NASA’s Aitken supercomputer using data from the agency’s Fermi space telescope. The red arrow shows the direction of the star’s magnetic field. Blue lines trace high-energy particles, which produce the gamma rays shown in yellow. Green lines represent light particles hitting the observer’s plane, illustrating how Fermi detects pulsar gamma rays. NASA/Constantinos Kalapotharakos
To explore the extreme conditions inside neutron stars, researchers at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, are using a blend of simulation, observation, and AI to unravel the mysteries of these extraordinary cosmic objects. Neutron stars are the dead cores of stars that have exploded and are among the densest objects in the universe.
Cutting-edge simulations, run on supercomputers at the NASA Advanced Supercomputing facility, help explain phenomena observed by NASA’s Fermi Gamma-ray Space Telescope and Neutron star Interior Composition Explorer (NICER) observatory. These phenomena include the rapidly spinning, highly magnetized neutron stars known as pulsars, whose detailed physical mechanisms have remained mysterious since their discovery. By applying AI tools such as deep neural networks, the scientists can infer the stars’ mass, radius, magnetic field structure, and other properties from data obtained by the NICER and Fermi observatories.
The simulations’ unprecedented results will guide similar studies of black holes and other space environments, as well as play a pivotal role in shaping future scientific space missions and mission concepts.
5. Modeling the Sun in Action – From Tiny to Large Scales
Image from a 3D simulation showing the evolution of flows in the upper layers of the Sun, with the most vigorous motions shown in red. These turbulent flows can generate magnetic fields and excite sound waves, shock waves, and eruptions. NASA/Irina Kitiashvili and Timothy A. Sandstrom
The Sun’s activity, which produces events such as solar flares and coronal mass ejections, influences the space environment and causes space weather disturbances that can interfere with satellite electronics, radio communications, GPS signals, and power grids on Earth. Scientists at NASA Ames produced highly realistic 3D models that, for the first time, allow them to examine the physics of solar plasma in action, from very small to very large scales. These models help interpret observations from NASA spacecraft like the Solar Dynamics Observatory (SDO).
Using NASA’s StellarBox code on supercomputers at NASA’s Advanced Supercomputing facility, the scientists improved our understanding of the origins of solar jets and tornadoes – bursts of extremely hot, charged plasma in the solar atmosphere. These models allow the science community to address long-standing questions of solar magnetic activity and how it affects space weather.
6. Scientific Visualization Makes NASA Data Understandable
This global map is a frame from an animation showing how wind patterns and atmospheric circulation moved carbon dioxide through Earth’s atmosphere from January to March 2020. The DYAMOND model’s high resolution shows unique sources of carbon dioxide emissions and how they spread across continents and oceans. NASA/Scientific Visualization Studio
NASA simulations and observations can yield petabytes of data that are difficult to comprehend in their original form. The Scientific Visualization Studio (SVS), based at NASA Goddard, turns data into insight by collaborating closely with scientists to create cinematic, high-fidelity visualizations.
Key infrastructure for these SVS creations includes the NASA Center for Climate Simulation’s Discover supercomputer at Goddard, which hosts a variety of simulations and provides data analysis and image-rendering capabilities. Recent data-driven visualizations show a coronal mass ejection from the Sun hitting Earth’s magnetosphere using the Multiscale Atmosphere-Geospace Environment (MAGE) model; global carbon dioxide emissions circling the planet in the DYnamics of the Atmospheric general circulation Modeled On Non-hydrostatic Domains (DYAMOND) model; and representations of La Niña and El Niño weather patterns using the El Niño-Southern Oscillation (ENSO) model.
For more information about NASA’s virtual exhibit at the International Conference for High Performance Computing, Networking, Storage and Analysis, being held in Atlanta, Nov. 17-22, 2024, visit:
https://www.nas.nasa.gov/SC24
For more information about supercomputers run by NASA High-End Computing, visit:
https://hec.nasa.gov
For news media:
Members of the news media interested in covering this topic should reach out to the NASA Ames newsroom.
Authors: Jill Dunbar, Michelle Moyer, and Katie Pitta, NASA’s Ames Research Center; and Jarrett Cohen, NASA’s Goddard Space Flight Center
-
By NASA
The NISAR mission will help researchers get a better understanding of how Earth’s surface changes over time, including in the lead-up to volcanic eruptions like the one pictured, at Mount Redoubt in southern Alaska in April 2009. R.G. McGimsey/AVO/USGS
Data from NISAR will improve our understanding of such phenomena as earthquakes, volcanoes, and landslides, as well as damage to infrastructure.
We don’t always notice it, but much of Earth’s surface is in constant motion. Scientists have used satellites and ground-based instruments to track land movement associated with volcanoes, earthquakes, landslides, and other phenomena. But a new satellite from NASA and the Indian Space Research Organisation (ISRO) aims to improve what we know and, potentially, help us prepare for and recover from natural and human-caused disasters.
The NISAR (NASA-ISRO Synthetic Aperture Radar) mission will measure the motion of nearly all of the planet’s land and ice-covered surfaces twice every 12 days. The pace of NISAR’s data collection will give researchers a fuller picture of how Earth’s surface changes over time. “This kind of regular observation allows us to look at how Earth’s surface moves across nearly the entire planet,” said Cathleen Jones, NISAR applications lead at NASA’s Jet Propulsion Laboratory in Southern California.
Together with complementary measurements from other satellites and instruments, NISAR’s data will provide a more complete picture of how Earth’s surface moves horizontally and vertically. The information will be crucial to better understanding everything from the mechanics of Earth’s crust to which parts of the world are prone to earthquakes and volcanic eruptions. It could even help resolve whether sections of a levee are damaged or if a hillside is starting to move in a landslide.
The NISAR mission will measure the motion of Earth’s surface — data that can be used to monitor critical infrastructure such as airport runways, dams, and levees. NASA/JPL-Caltech
What Lies Beneath
Targeting an early 2025 launch from India, the mission will be able to detect surface motions down to fractions of an inch. In addition to monitoring changes to Earth’s surface, the satellite will be able to track the motion of ice sheets, glaciers, and sea ice, and map changes to vegetation.
The source of that remarkable detail is a pair of radar instruments that operate at long wavelengths: an L-band system built by JPL and an S-band system built by ISRO. The NISAR satellite is the first to carry both. Each instrument can collect measurements day and night and see through clouds that can obstruct the view of optical instruments. The L-band instrument will also be able to penetrate dense vegetation to measure ground motion. This capability will be especially useful in areas surrounding volcanoes or faults that are obscured by vegetation.
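The article does not spell out the processing, but radar missions of this kind commonly derive surface displacement through repeat-pass interferometry, where the phase difference between two passes over the same spot maps to line-of-sight motion. The sketch below illustrates that conversion under an assumed L-band wavelength of roughly 24 centimeters; the real NISAR processing chain involves many more corrections.

```python
import math

L_BAND_WAVELENGTH_M = 0.24   # assumed L-band wavelength (~24 cm); illustrative value

def los_displacement_m(phase_change_rad: float,
                       wavelength_m: float = L_BAND_WAVELENGTH_M) -> float:
    """Line-of-sight displacement implied by an interferometric phase change.

    In repeat-pass interferometry the two-way path change is
    (phase / 2*pi) * wavelength, so displacement = phase * wavelength / (4*pi).
    Sign conventions and atmospheric corrections are omitted in this sketch.
    """
    return phase_change_rad * wavelength_m / (4.0 * math.pi)

# A full 2*pi phase change corresponds to half a wavelength (~12 cm) of motion,
# so a phase change of a few degrees maps to millimeter-level displacement --
# fractions of an inch, as described above.
d = los_displacement_m(math.radians(10))   # a 10-degree phase change
print(f"{d * 1000:.2f} mm  ({d / 0.0254:.3f} in)")
```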
“The NISAR satellite won’t tell us when earthquakes will happen. Instead, it will help us better understand which areas of the world are most susceptible to significant earthquakes,” said Mark Simons, the U.S. solid Earth science lead for the mission at Caltech in Pasadena, California.
Data from the satellite will give researchers insight into which parts of a fault slowly move without producing earthquakes and which sections are locked together and might suddenly slip. In relatively well-monitored areas like California, researchers can use NISAR to focus on specific regions that could produce an earthquake. But in parts of the world that aren’t as well monitored, NISAR measurements could reveal new earthquake-prone areas. And when earthquakes do occur, data from the satellite will help researchers understand what happened on the faults that ruptured.
“From the ISRO perspective, we are particularly interested in the Himalayan plate boundary,” said Sreejith K M, the ISRO solid Earth science lead for NISAR at the Space Applications Center in Ahmedabad, India. “The area has produced great magnitude earthquakes in the past, and NISAR will give us unprecedented information on the seismic hazards of the Himalaya.”
Surface motion is also important for volcano researchers, who need data collected regularly over time to detect land movements that may be precursors to an eruption. As magma shifts below Earth’s surface, the land can bulge or sink. The NISAR satellite will help provide a fuller picture of why a volcano deforms and whether that movement signals an eruption.
Finding Normal
When it comes to infrastructure such as levees, aqueducts, and dams, NISAR’s ability to provide continuous measurements over years will help to establish the usual state of the structures and surrounding land. Then, if something changes, resource managers may be able to pinpoint specific areas to examine. “Instead of going out and surveying an entire aqueduct every five years, you can target your surveys to problem areas,” said Jones.
The data could be equally valuable for showing that a dam hasn’t changed after a disaster like an earthquake. For instance, if a large earthquake struck San Francisco, liquefaction — where loosely packed or waterlogged sediment loses its stability after severe ground shaking — could pose a problem for dams and levees along the Sacramento-San Joaquin River Delta.
“There’s over a thousand miles of levees,” said Jones. “You’d need an army to go out and look at them all.” The NISAR mission would help authorities survey them from space and identify damaged areas. “Then you can save your time and only go out to inspect areas that have changed. That could save a lot of money on repairs after a disaster.”
More About NISAR
The NISAR mission is an equal collaboration between NASA and ISRO and marks the first time the two agencies have cooperated on hardware development for an Earth-observing mission. Managed for the agency by Caltech, JPL leads the U.S. component of the project and is providing the mission’s L-band SAR. NASA is also providing the radar reflector antenna, the deployable boom, a high-rate communication subsystem for science data, GPS receivers, a solid-state recorder, and a payload data subsystem. The U R Rao Satellite Centre in Bengaluru, India, which leads the ISRO component of the mission, is providing the spacecraft bus, the launch vehicle, and associated launch services and satellite mission operations. The ISRO Space Applications Centre in Ahmedabad is providing the S-band SAR electronics.
To learn more about NISAR, visit:
https://nisar.jpl.nasa.gov
News Media Contacts
Jane J. Lee / Andrew Wang
Jet Propulsion Laboratory, Pasadena, Calif.
818-354-0307 / 626-379-6874
jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov
2024-155
-
By NASA
NASA’s SPHEREx observatory undergoes integration and testing at BAE Systems in Boulder, Colorado, in April 2024. The space telescope will use a technique called spectroscopy across the entire sky, capturing the universe in more than 100 colors. BAE Systems
The space telescope will detect over 100 colors from hundreds of millions of stars and galaxies. Here’s what astronomers will do with all that color.
NASA’s SPHEREx mission won’t be the first space telescope to observe hundreds of millions of stars and galaxies when it launches no later than April 2025, but it will be the first to observe them in 102 colors. Although these colors aren’t visible to the human eye because they’re in the infrared range, scientists will use them to learn about topics that range from the physics that governed the universe less than a second after its birth to the origins of water on planets like Earth.
“We are the first mission to look at the whole sky in so many colors,” said SPHEREx Principal Investigator Jamie Bock, who is based jointly at NASA’s Jet Propulsion Laboratory and Caltech, both in Southern California. “Whenever astronomers look at the sky in a new way, we can expect discoveries.”
Short for Spectro-Photometer for the History of the Universe, Epoch of Reionization and Ices Explorer, SPHEREx will collect infrared light, which has wavelengths slightly longer than what the human eye can detect. The telescope will use a technique called spectroscopy to take the light from hundreds of millions of stars and galaxies and separate it into individual colors, the way a prism transforms sunlight into a rainbow. This color breakdown can reveal various properties of an object, including its composition and its distance from Earth.
NASA’s SPHEREx mission will use spectroscopy — the splitting of light into its component wavelengths — to study the universe. Watch this video to learn more about spectroscopy. NASA’s Goddard Space Flight Center
Here are the three key science investigations SPHEREx will conduct with its colorful all-sky map.
Cosmic Origins
What human eyes perceive as colors are distinct wavelengths of light. The only difference between colors is the distance between the crests of the light wave. If a star or galaxy is moving, its light waves get stretched or compressed, changing the colors they appear to emit. (It’s the same with sound waves, which is why the pitch of an ambulance siren seems to rise as it approaches and fall after it passes.) Astronomers can measure the degree to which light is stretched or compressed and use that to infer the distance to the object.
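As a simplified, concrete illustration of turning a color shift into a distance, the sketch below converts an observed versus rest-frame wavelength into a redshift and then, for nearby galaxies only, into an approximate distance via Hubble’s law. The Hubble constant value and the low-redshift approximation are assumptions for illustration, not the mission’s actual analysis.

```python
# Redshift sketch: stretched wavelengths -> redshift -> approximate distance.
C_KM_S = 299_792.458        # speed of light, km/s
H0 = 70.0                   # assumed Hubble constant, km/s/Mpc (illustrative value)

def redshift(observed_nm: float, rest_nm: float) -> float:
    """Fractional stretching of a spectral feature's wavelength."""
    return (observed_nm - rest_nm) / rest_nm

def approx_distance_mpc(z: float) -> float:
    """Low-redshift approximation: recession velocity ~ c*z, distance ~ v / H0."""
    return (C_KM_S * z) / H0

# Example: the hydrogen H-alpha line (rest wavelength ~656.3 nm) observed at 700 nm.
z = redshift(700.0, 656.3)
print(f"z ~ {z:.3f}, distance ~ {approx_distance_mpc(z):.0f} Mpc")
```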
SPHEREx will apply this principle to map the position of hundreds of millions of galaxies in 3D. By doing so, scientists can study the physics of inflation, the event that caused the universe to expand by a trillion-trillion fold in less than a second after the big bang. This rapid expansion amplified small differences in the distribution of matter. Because these differences remain imprinted on the distribution of galaxies today, measuring how galaxies are distributed can tell scientists more about how inflation worked.
Galactic Origins
SPHEREx will also measure the collective glow created by all galaxies near and far — in other words, the total amount of light emitted by galaxies over cosmic history. Scientists have tried to estimate this total light output by observing individual galaxies and extrapolating to the trillions of galaxies in the universe. But these counts may leave out some faint or hidden light sources, such as galaxies too small or too distant for telescopes to easily detect.
With spectroscopy, SPHEREx can also show astronomers how the total light output has changed over time. For example, it may reveal that the universe’s earliest generations of galaxies produced more light than previously thought, either because they were more plentiful or bigger and brighter than current estimates suggest. Because light takes time to travel through space, we see distant objects as they were in the past. And, as light travels, the universe’s expansion stretches it, changing its wavelength and its color. Scientists can therefore use SPHEREx data to determine how far light has traveled and where in the universe’s history it was released.
Water’s Origins
SPHEREx will measure the abundance of frozen water, carbon dioxide, and other essential ingredients for life as we know it along more than 9 million unique directions across the Milky Way galaxy. This information will help scientists better understand how available these key molecules are to forming planets. Research indicates that most of the water in our galaxy is in the form of ice rather than gas, frozen to the surface of small dust grains. In dense clouds where stars form, these icy dust grains can become part of newly forming planets, with the potential to create oceans like the ones on Earth.
The mission’s colorful view will enable scientists to identify these materials, because chemical elements and molecules leave a unique signature in the colors they absorb and emit.
Big Picture
Many space telescopes, including NASA’s Hubble and James Webb, can provide high-resolution, in-depth spectroscopy of individual objects or small sections of space. Other space telescopes, like NASA’s retired Wide-field Infrared Survey Explorer (WISE), were designed to take images of the whole sky. SPHEREx combines these abilities to apply spectroscopy to the entire sky.
By combining observations from telescopes that target specific parts of the sky with SPHEREx’s big-picture view, scientists will get a more complete — and more colorful — perspective of the universe.
More About SPHEREx
SPHEREx is managed by JPL for NASA’s Astrophysics Division within the Science Mission Directorate in Washington. BAE Systems (formerly Ball Aerospace) built the telescope and the spacecraft bus. The science analysis of the SPHEREx data will be conducted by a team of scientists located at 10 institutions across the U.S. and in South Korea. Data will be processed and archived at IPAC at Caltech, which manages JPL for NASA. The mission principal investigator is based at Caltech with a joint JPL appointment. The SPHEREx dataset will be publicly available.
For more information about the SPHEREx mission visit:
https://www.jpl.nasa.gov/missions/spherex/
News Media Contact
Calla Cofield
Jet Propulsion Laboratory, Pasadena, Calif.
626-808-2469
calla.e.cofield@jpl.nasa.gov
2024-152
-
By NASA
The study of X-ray emission from astronomical objects reveals secrets about the Universe at the largest and smallest spatial scales. Celestial X-rays are produced by black holes consuming nearby stars, emitted by the million-degree gas that traces the structure between galaxies, and can be used to predict whether stars may be able to host planets hospitable to life. X-ray observations have shown that most of the visible matter in the universe exists as hot gas between galaxies and have conclusively demonstrated that the presence of “dark matter” is needed to explain galaxy cluster dynamics, that dark matter dominates the mass of galaxy clusters, and that it governs the expansion of the cosmos.
X-ray observations also enable us to probe mysteries of the Universe on the smallest scales. X-ray observations of compact objects such as white dwarfs, neutron stars, and black holes allow us to use the Universe as a physics laboratory to study conditions that are orders of magnitude more extreme in terms of density, pressure, temperature, and magnetic field strength than anything that can be produced on Earth. In this astrophysical laboratory, researchers expect to reveal new physics at the subatomic scale by conducting investigations such as probing the neutron star equation of state and testing quantum electrodynamics with observations of neutron star atmospheres. At NASA’s Marshall Space Flight Center, a team of scientists and engineers is building, testing, and flying innovative optics that bring the Universe’s X-ray mysteries into sharper focus.
A composite X-ray/Optical/Infrared image of the Crab Pulsar. The X-ray image from the Chandra X-ray Observatory (blue and white) reveals exquisite details in the central ring structures and gas flowing out of the polar jets. Optical light from the Hubble Space Telescope (purple) shows foreground and background stars as pinpoints of light. Infrared light from the Spitzer Space Telescope (pink) traces cooler gas in the nebula. Finally, magnetic field direction derived from X-ray polarization observed by the Imaging X-ray Polarimetry Explorer is shown as orange lines. Magnetic field lines: NASA/Bucciantini et al.; X-ray: NASA/CXC/SAO; Optical: NASA/STScI; Infrared: NASA/JPL-Caltech
Unlike optical telescopes, which create images by reflecting or refracting light at near-90-degree angles (normal incidence), focusing X-ray optics must be designed to reflect light at very small angles (grazing incidence). At normal incidence, X-rays are either absorbed by the surface of a mirror or penetrate it entirely. However, at grazing angles of incidence, X-rays reflect very efficiently due to an effect called total external reflection. In grazing incidence, X-rays reflect off the surface of a mirror like rocks skipping on the surface of a pond.
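Total external reflection only occurs below a small critical grazing angle, commonly approximated as the square root of twice the refractive-index decrement of the mirror surface. The sketch below evaluates that textbook relation; the decrement value shown is purely illustrative, since it depends strongly on the coating material and the photon energy.

```python
import math

def critical_grazing_angle_deg(delta: float) -> float:
    """Critical grazing angle for total external reflection, theta_c ~ sqrt(2*delta).

    delta is the refractive-index decrement (n = 1 - delta + i*beta) of the
    reflecting surface at the X-ray energy of interest; it is material- and
    energy-dependent and must come from tabulated optical constants.
    """
    return math.degrees(math.sqrt(2.0 * delta))

# Illustrative only: a decrement of 1e-5 gives a critical angle of ~0.26 degrees,
# which is why X-ray mirrors look like gently tapered cones rather than dishes.
print(f"{critical_grazing_angle_deg(1e-5):.2f} deg")
```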
A classic design for astronomical grazing incidence optics is the Wolter-I prescription, which consists of two reflecting surfaces, a parabola and hyperbola (see figure below). This optical prescription is revolved around the optical axis to produce a full-shell mirror (i.e., the mirror spans the full circumference) that resembles a gently tapered cone. To increase the light collecting area, multiple mirror shells with incrementally larger diameters and a common focus are fabricated and nested concentrically to comprise a mirror module assembly (MMA).
Focusing optics are critical to studying the X-ray universe because, in contrast to other optical systems like collimators or coded masks, they produce high signal-to-noise images with low background noise. Two key metrics characterize the performance of X-ray optics: angular resolution, the ability of an optical system to discriminate between closely spaced objects, and effective area, the light-collecting area of the telescope, typically quoted in units of cm². Angular resolution is typically measured as the half-power diameter (HPD) of a focused spot in units of arcseconds. The HPD encircles half of the incident photons in a focused spot and measures the sharpness of the final image; a smaller number is better.
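Given a list of photon impact positions for a focused spot, the HPD can be estimated directly: find the radius around the spot centroid that encloses half the photons, double it, and convert to an angle using the focal length. A minimal sketch under assumed, illustrative inputs:

```python
import numpy as np

def half_power_diameter_arcsec(x_mm, y_mm, focal_length_mm):
    """Estimate the HPD of a focused spot from photon positions on the focal plane.

    x_mm, y_mm: arrays of photon impact positions (millimeters).
    Returns the diameter enclosing half the photons, in arcseconds.
    """
    x = np.asarray(x_mm); y = np.asarray(y_mm)
    cx, cy = x.mean(), y.mean()                    # spot centroid
    r = np.hypot(x - cx, y - cy)                   # radial distance of each photon
    half_power_radius_mm = np.median(r)            # radius enclosing 50% of photons
    diameter_rad = 2.0 * half_power_radius_mm / focal_length_mm
    return np.degrees(diameter_rad) * 3600.0

# Illustrative use with a synthetic Gaussian spot and an assumed 2 m focal length.
rng = np.random.default_rng(0)
x, y = rng.normal(0.0, 0.02, 10_000), rng.normal(0.0, 0.02, 10_000)  # mm
print(f"HPD ~ {half_power_diameter_arcsec(x, y, focal_length_mm=2000.0):.1f} arcsec")
```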
Schematic of a full-shell Wolter-I X-ray optic mirror module assembly with five concentrically nested mirror shells. Parallel rays of light enter from the left, reflect twice off the reflective inside surface of the shell (first off the parabolic segment and then off the hyperbolic segment), and converge at the focal plane. NASA MSFC
NASA Marshall Space Flight Center (MSFC) has been building and flying lightweight, full-shell, focusing X-ray optics for over three decades, always meeting or exceeding angular resolution and effective area requirements. MSFC utilizes an electroformed nickel replication (ENR) technique to make these thin, full-shell X-ray optics from nickel alloy.
X-ray optics development at MSFC began in the early 1990s with the fabrication of optics to support NASA’s Advanced X-ray Astrophysics Facility (AXAF-S) and then continued via the Constellation-X technology development programs. In 2001, MSFC launched a balloon payload that included two modules each with three mirrors, which produced the first focused hard X-ray (>10 keV) images of an astrophysical source by imaging Cygnus X-1, GRS 1915, and the Crab Nebula. This initial effort resulted in several follow-up missions over the next 12 years, and became known as the High Energy Replicated Optics (HERO) balloon program.
In 2012, the first of four sounding rocket flights of the Focusing Optics X-ray Solar Imager (FOXSI) flew with MSFC optics onboard, producing the first focused images of the Sun at energies greater than 5 keV. In 2019, the Astronomical Roentgen Telescope X-ray Concentrator (ART-XC) instrument on the Spectr-Roentgen-Gamma mission launched with seven MSFC-fabricated X-ray MMAs, each containing 28 mirror shells. ART-XC is currently mapping the sky in the 4-30 keV hard X-ray energy range, studying exotic objects like neutron stars in our own galaxy as well as active galactic nuclei spread across the visible universe. In 2021, the Imaging X-ray Polarimetry Explorer (IXPE) flew and is now performing extraordinary science with an MSFC-led team, using three 24-shell MMAs that were fabricated and calibrated in-house.
Most recently, in 2024, the fourth FOXSI sounding rocket campaign launched with a high-resolution MSFC MMA. The optics achieved 9.5 arcsecond HPD angular resolution during pre-flight test with an expected 7 arcsecond HPD in gravity-free flight, making this the highest angular resolution flight observation made with a nickel-replicated X-ray optic. Currently MSFC is fabricating an MMA for the Rocket Experiment Demonstration of a Soft X-ray (REDSoX) polarimeter, a sounding rocket mission that will fly a novel soft X-ray polarimeter instrument to observe active galactic nuclei. The REDSoX MMA optic will be 444 mm in diameter, which will make it the largest MMA ever produced by MSFC and the second largest replicated nickel X-ray optic in the world.
Scientists Wayne Baumgartner (left, crouched) and Nick Thomas (left, standing) calibrate an IXPE MMA in the MSFC 100 m Beamline. Scientist Stephen Bongiorno (right) applies epoxy to an IXPE shell during MMA assembly. NASA MSFC
The ultimate performance of an X-ray optic is determined by errors in the shape, position, and roughness of the optical surface. To push the performance of X-ray optics toward even higher angular resolution and achieve more ambitious science goals, MSFC is currently engaged in a fundamental research and development effort to improve all aspects of full-shell optics fabrication.
Given that these optics are made with the Electroformed Nickel Replication technique, the fabrication process begins with creation of a replication master, called the mandrel, which is a negative of the desired optical surface. First, the mandrel is figured and polished to specification, then a thin layer of nickel alloy is electroformed onto the mandrel surface. Next, the nickel alloy layer is removed to produce a replicated optical shell, and finally the thin shell is attached to a stiff holding structure for use.
Each step in this process imparts some degree of error into the final replicated shell. Research and development efforts at MSFC are currently concentrating on reducing distortion induced during the electroforming metal deposition and release steps. Electroforming-induced distortion is caused by material stress built into the electroformed material as it deposits onto the mandrel. Decreasing release-induced distortion is a matter of reducing adhesion strength between the shell and mandrel, increasing strength of the shell material to prevent yielding, and reducing point defects in the release layer.
Additionally, verifying the performance of these advanced optics requires world-class test facilities. The basic premise of testing an optic designed for X-ray astrophysics is to place a small, bright X-ray source far away from the optic. If the angular size of the source, as viewed from the optic, is smaller than the angular resolution of the optic, the source is effectively simulating X-ray starlight. Due to the absorption of X-rays by air, the entire test facility light path must be placed inside a vacuum chamber.
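That "effectively starlight" condition reduces to small-angle arithmetic: the source's angular size as seen from the optic is its diameter divided by the beamline length. A quick sketch with a hypothetical 0.1 mm source at the far end of a 100-meter beamline shows why such long facilities are needed:

```python
import math

def angular_size_arcsec(source_diameter_m: float, distance_m: float) -> float:
    """Small-angle size of a test X-ray source as seen from the optic under test."""
    return math.degrees(source_diameter_m / distance_m) * 3600.0

# Hypothetical 0.1 mm source spot at the far end of a 100 m beamline:
theta = angular_size_arcsec(1e-4, 100.0)
print(f"~{theta:.2f} arcsec")   # ~0.21 arcsec, well below a few-arcsecond HPD optic
```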
At MSFC, a group of scientists and engineers operate the Marshall 100-meter X-ray beamline, a world-class end-to-end test facility for flight and laboratory X-ray optics, instruments, and telescopes. As per the name, it consists of a 100-meter-long vacuum tube with an 8-meter-long, 3-meter-diameter instrument chamber and a variety of X-ray sources ranging from 0.25 – 114 keV. Across the street sits the X-Ray and Cryogenic Facility (XRCF), a 527-meter-long beamline with an 18-meter-long, 6-meter-diameter instrument chamber. These facilities are available for the scientific community to use and highlight the comprehensive optics development and test capability that Marshall is known for.
Within the X-ray astrophysics community there exists a variety of angular resolution and effective area needs for focusing optics. Given its storied history in X-ray optics, MSFC is uniquely poised to fulfill requirements for large or small, medium- or high-angular-resolution X-ray optics. To help guide technology development, the astrophysics community convenes once per decade to produce a decadal survey. The need for high-angular-resolution and high-throughput X-ray optics is strongly endorsed by the National Academies of Sciences, Engineering, and Medicine report, Pathways to Discovery in Astronomy and Astrophysics for the 2020s. In pursuit of this goal, MSFC is continuing to advance the state of the art in full-shell optics. This work will enable the extraordinary mysteries of the X-ray universe to be revealed.
Project Leads
Dr. Jessica Gaskin and Dr. Stephen Bongiorno, NASA Marshall Space Flight Center (MSFC)
Sponsoring Organizations
The NASA Astrophysics Division supports this work primarily through the Internal Scientist Funding Model Direct Work Package and competed solicitations. This work is also supported by the Heliophysics Division through competed solicitations, as well as by directed work from other government entities.
-
By NASA
NASA’s Hubble Finds More Black Holes than Expected in the Early Universe
The Hubble Ultra Deep Field of nearly 10,000 galaxies is the deepest visible-light image of the cosmos. The image required 800 exposures taken over 400 Hubble orbits around Earth. The total amount of exposure time was 11.3 days, taken between Sept. 24, 2003 and Jan. 16, 2004. NASA, ESA, S. Beckwith (STScI) and the HUDF Team
With the help of NASA’s Hubble Space Telescope, an international team of researchers led by scientists in the Department of Astronomy at Stockholm University has found more black holes in the early universe than has previously been reported. The new result can help scientists understand how supermassive black holes were created.
Currently, scientists do not have a complete picture of how the first black holes formed not long after the big bang. It is known that supermassive black holes, which can weigh more than a billion suns, existed at the centers of galaxies less than a billion years after the big bang.
“Many of these objects seem to be more massive than we originally thought they could be at such early times — either they formed very massive or they grew extremely quickly,” said Alice Young, a PhD student from Stockholm University and co-author of the study published in The Astrophysical Journal Letters.
This is a new image of the Hubble Ultra Deep Field. The first deep imaging of the field was done with Hubble in 2004. The same survey field was observed again by Hubble several years later, and was then reimaged in 2023. By comparing Hubble Wide Field Camera 3 near-infrared exposures taken in 2009, 2012, and 2023, astronomers found evidence for flickering supermassive black holes in the hearts of early galaxies. One example is seen as a bright object in the inset. Some supermassive black holes do not swallow surrounding material constantly, but in fits and bursts, making their brightness flicker. This can be detected by comparing Hubble Ultra Deep Field frames taken at different epochs. The survey found more black holes than predicted. NASA, ESA, Matthew Hayes (Stockholm University); Acknowledgment: Steven V.W. Beckwith (UC Berkeley), Garth Illingworth (UC Santa Cruz), Richard Ellis (UCL); Image Processing: Joseph DePasquale (STScI)
Black holes play an important role in the lifecycle of all galaxies, but there are major uncertainties in our understanding of how galaxies evolve. In order to gain a complete picture of the link between galaxy and black hole evolution, the researchers used Hubble to survey how many black holes exist among a population of faint galaxies when the universe was just a few percent of its current age.
Hubble re-imaged the survey region several years after the initial observations, allowing the team to measure variations in the brightness of galaxies. These variations are a telltale sign of black holes. The team identified more black holes than had previously been found by other methods.
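Conceptually, this kind of variability search compares photometry of the same sources across epochs and flags those whose brightness changes exceed what the measurement noise allows. The sketch below is a heavily simplified illustration; the magnitudes, uncertainties, and threshold are made-up inputs, not the study’s actual selection criteria.

```python
import numpy as np

def flag_variable_sources(mags_by_epoch, mag_errors, n_sigma=3.0):
    """Flag sources whose epoch-to-epoch brightness spread exceeds the noise.

    mags_by_epoch: (n_epochs, n_sources) array of magnitudes.
    mag_errors:    (n_epochs, n_sources) array of 1-sigma magnitude uncertainties.
    Returns a boolean array marking candidate variables (e.g., flickering AGN).
    """
    mags = np.asarray(mags_by_epoch)
    errs = np.asarray(mag_errors)
    spread = mags.max(axis=0) - mags.min(axis=0)          # peak-to-peak change
    typical_err = np.sqrt((errs ** 2).mean(axis=0))       # combined per-source noise
    return spread > n_sigma * typical_err

# Toy example: 3 epochs, 4 sources; source index 2 brightens between epochs.
mags = np.array([[24.1, 25.0, 23.8, 26.2],
                 [24.1, 25.0, 23.3, 26.2],
                 [24.2, 25.1, 23.9, 26.1]])
errs = np.full_like(mags, 0.05)
print(flag_variable_sources(mags, errs))   # -> [False False  True False]
```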
The new observational results suggest that some black holes likely formed by the collapse of massive, pristine stars during the first billion years of cosmic time. These types of stars can only exist at very early times in the universe, because later-generation stars are polluted by the remnants of stars that have already lived and died. Other alternatives for black hole formation include collapsing gas clouds, mergers of stars in massive clusters, and “primordial” black holes that formed (by physically speculative mechanisms) in the first few seconds after the big bang. With this new information about black hole formation, more accurate models of galaxy formation can be constructed.
“The formation mechanism of early black holes is an important part of the puzzle of galaxy evolution,” said Matthew Hayes from the Department of Astronomy at Stockholm University and lead author of the study. “Together with models for how black holes grow, galaxy evolution calculations can now be placed on a more physically motivated footing, with an accurate scheme for how black holes came into existence from collapsing massive stars.”
Astronomers are also making observations with NASA’s James Webb Space Telescope to search for galactic black holes that formed soon after the big bang, to understand how massive they were and where they were located.
The Hubble Space Telescope has been operating for over three decades and continues to make ground-breaking discoveries that shape our fundamental understanding of the universe. Hubble is a project of international cooperation between NASA and ESA (European Space Agency). NASA’s Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope and mission operations. Lockheed Martin Space, based in Denver, Colorado, also supports mission operations at Goddard. The Space Telescope Science Institute in Baltimore, Maryland, which is operated by the Association of Universities for Research in Astronomy, conducts Hubble science operations for NASA.
Media Contact:
Claire Andreoli
NASA’s Goddard Space Flight Center, Greenbelt, MD
claire.andreoli@nasa.gov
Ray Villard
Space Telescope Science Institute, Baltimore, MD
Science Contact:
Matthew Hayes
Stockholm University, Stockholm, Sweden