Video: Webb's infrared universe (00:01:00)

The James Webb Space Telescope (Webb) will observe the Universe in the near-infrared and mid-infrared – at wavelengths longer than visible light.

By viewing the Universe at infrared wavelengths with unprecedented sensitivity, Webb will open up a new window to the cosmos. At these wavelengths it can see the first stars and galaxies forming after the Big Bang. Its infrared vision also allows Webb to study stars and planetary systems forming inside thick clouds of gas and dust that are opaque to visible light.

The primary goals of Webb are to study galaxy, star and planet formation in the Universe. To see the very first stars and galaxies that formed in the early Universe, we have to look deep into space and thus back in time: because light takes time to travel from there to here, the farther out we look, the further back in time we see.

The Universe is expanding, and therefore the farther we look, the faster objects are moving away from us, redshifting the light. Redshift means that light that is emitted as ultraviolet or visible light is shifted more and more to redder wavelengths, into the near- and mid-infrared part of the electromagnetic spectrum for very high redshifts. Therefore, to study the earliest star and galaxy formation in the Universe, we have to observe infrared light and use a telescope and instruments optimised for this light like Webb.
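The stretch described above follows the standard redshift relation λ_obs = (1 + z) × λ_rest. A minimal sketch, using textbook values rather than figures from this article:

```python
# Cosmological redshift: light emitted at rest wavelength lam_rest
# arrives stretched to (1 + z) times that wavelength.
def observed_wavelength_nm(lam_rest_nm: float, z: float) -> float:
    return (1.0 + z) * lam_rest_nm

# Hydrogen's ultraviolet Lyman-alpha line (121.6 nm) emitted by a galaxy
# at redshift z = 10 arrives at about 1338 nm, squarely in Webb's
# near-infrared range.
lam_obs = observed_wavelength_nm(121.6, 10)
```

So even intrinsically ultraviolet light from the first galaxies reaches us as infrared, which is why Webb's instruments are optimised for these wavelengths.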

Star formation in the local Universe takes place in the centres of dense, dusty clouds, obscured from our eyes at visible wavelengths. Near-infrared light, with its longer wavelength, is less hindered by the small dust particles, allowing it to seep through the dust clouds. By observing this emitted near-infrared light we can penetrate the dust and see the processes leading to star and planet formation.
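To put rough numbers on this, astronomers quantify dust obscuration in magnitudes of extinction, where A magnitudes transmit a fraction 10^(-0.4·A) of the light. The factor 0.11 below, relating near-infrared (K-band) to visual extinction, is a commonly used approximation and not a figure from this article:

```python
def transmitted_fraction(extinction_mag: float) -> float:
    # Each magnitude of extinction dims light by a factor of 10**0.4.
    return 10.0 ** (-0.4 * extinction_mag)

A_V = 30.0        # assumed visual extinction of a dense cloud core, magnitudes
A_K = 0.11 * A_V  # approximate near-infrared (K-band) extinction

visible_light = transmitted_fraction(A_V)   # ~1e-12: effectively opaque
infrared_light = transmitted_fraction(A_K)  # ~0.05: partly transparent
```

A cloud that blocks visible light by a factor of a trillion still lets a few percent of the near-infrared light through.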

Objects of about Earth's temperature emit most of their light at mid-infrared wavelengths. These temperatures are also found in dusty regions forming stars and planets, so with mid-infrared radiation we can see directly the glow of this slightly warm dust and study its distribution and properties.
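This follows from Wien's displacement law, λ_peak = b/T with b ≈ 2.898 × 10⁻³ m·K. A quick check at an Earth-like temperature:

```python
WIEN_B = 2.898e-3  # Wien displacement constant, metre-kelvins

def peak_wavelength_um(temperature_k: float) -> float:
    # Wavelength at which a blackbody at the given temperature emits most
    # strongly, converted from metres to micrometres.
    return WIEN_B / temperature_k * 1e6

# A body at roughly Earth's temperature (~288 K) peaks near 10 micrometres,
# in the mid-infrared.
peak = peak_wavelength_um(288.0)
```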

Webb is an international partnership between NASA, the European Space Agency (ESA) and the Canadian Space Agency (CSA).



  • Similar Topics

    • By NASA
      X-rays are radiated by matter hotter than one million Kelvin, and high-resolution X-ray spectroscopy can tell us about the composition of the matter and how fast and in what direction it is moving. Quantum calorimeters are opening this new window on the Universe. First promised four decades ago, the quantum-calorimeter era of X-ray astronomy has finally dawned.
      Photo of the XRISM/Resolve quantum-calorimeter array in its storage container prior to integration into the instrument. The 6×6 array, 5 mm on a side, consists of independent detectors – each one a thermally isolated silicon thermistor with a HgTe absorber. The spectrometer consisting of this detector and other essential technologies separates astrophysical X-ray spectra into about 2400 resolution elements, which can be thought of as X-ray colors. (NASA GSFC)

      A quantum calorimeter is a device that makes precise measurements of energy quanta by measuring the temperature change that occurs when a quantum of energy is deposited in an absorber with low heat capacity. The absorber is attached to a thermometer that is somewhat decoupled from a heat sink so that the sensor can heat up and then cool back down again. To reduce thermodynamic noise and the heat capacity of the sensor, operation at temperatures less than 0.1 K is required.
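The response described above can be sketched in two lines: the temperature pulse height is ΔT = E/C, and the pulse decays with time constant τ = C/G, where C is the absorber heat capacity and G is the thermal conductance of the weak link to the heat sink. The parameter values below are purely illustrative orders of magnitude, not the actual Resolve pixel figures:

```python
# Illustrative quantum-calorimeter pulse response (assumed values, not Resolve's):
E_PHOTON = 6000 * 1.602e-19   # energy of a 6 keV X-ray photon, joules
HEAT_CAPACITY = 1e-12         # absorber heat capacity C at ~0.05 K, J/K
CONDUCTANCE = 2e-10           # thermal conductance G of link to heat sink, W/K

delta_T = E_PHOTON / HEAT_CAPACITY  # pulse height: ~1 millikelvin
tau = HEAT_CAPACITY / CONDUCTANCE   # pulse decay time: ~5 milliseconds
```

The smaller the heat capacity, the larger and cleaner the pulse, which is why sub-0.1 K operation matters.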
      The idea for thermal measurement of small amounts of energy occurred in several places in the world independently when scientists observed pulses in the readout of low-temperature thermometers and infrared detectors. They attributed these spurious signals to passing cosmic-ray particles, and considered optimizing detectors for sensitive measurement of the energy of particles and photons.
      The idea to develop such sensors for X-ray astronomy was conceived at Goddard Space Flight Center in 1982 when X-ray astronomers were considering instruments to propose for NASA’s planned Advanced X-ray Astrophysics Facility (AXAF). In a fateful conversation, infrared astronomer Harvey Moseley suggested thermal detection could offer substantial improvement over existing solid-state detectors. Using Goddard internal research and development funding, development advanced sufficiently to justify, just two years later, proposing a quantum-calorimeter X-ray Spectrometer (XRS) for inclusion on AXAF. Despite its technical immaturity at the time, the revolutionary potential of the XRS was acknowledged, and the proposal was accepted.
      The AXAF design evolved over the subsequent years, however, and the XRS was eliminated from its complement of instruments. After discussions between NASA and the Japanese Institute of Space and Astronautical Science (ISAS), a new XRS was included in the instrument suite of the Japanese Astro-E X-ray observatory. Astro-E launched in 2000 but did not reach orbit due to an anomaly in the first stage of the rocket. Astro-E2, a rebuild of Astro-E, was successfully placed in orbit in 2005 and renamed Suzaku, but the XRS instrument ceased operation before observations started due to loss of the liquid helium, an essential part of the detector cooling system, caused by a faulty storage system.
      A redesigned mission, Astro-H, that included a quantum-calorimeter instrument with a redundant cooling system was successfully launched in 2016 and renamed Hitomi. Hitomi’s Soft X-ray Spectrometer (SXS) obtained high resolution spectra of the Perseus cluster of galaxies and a few other sources before a problem with the attitude control system caused the mission to be lost roughly one month after launch. Even so, Hitomi was the first orbiting observatory to obtain a scientific result using X-ray quantum calorimeters. The spectacular Perseus spectrum generated by the SXS motivated yet another attempt to implement a spaceborne quantum-calorimeter spectrometer.
      The X-ray Imaging and Spectroscopy Mission (XRISM) was launched in September 2023, with the spectrometer aboard renamed Resolve to represent not only its function but also the resolve of the U.S./Japan collaboration to study the Universe through the window of this new capability. XRISM has been operating well in orbit for over a year.  
      Development of the Sensor Technology
      Development of the sensor technology employed in Resolve began four decades ago. Note that an X-ray quantum-calorimeter spectrometer requires more than the sensor technology; other technologies, such as the coolers that provide the sub-0.1 K heat sink, are equally essential.
      The sensors used from XRS through Resolve were all based on silicon-thermistor thermometers and mercury telluride (HgTe) X-ray absorbers. They used arrays consisting of 32 to 36 pixels, each of which was an independent quantum calorimeter.  Between Astro-E and Astro-E2, a new method of making the thermistor was developed that significantly reduced its low-frequency noise. Other fabrication advances made it possible to make reproducible connections between absorbers and thermistors and to fit each thermistor and its thermal isolation under its X-ray absorber, making square arrays feasible.
      Through a Small Business Innovation Research (SBIR) contract executed after the Astro-E2 mission, EPIR Technologies Inc. reduced the specific heat of the HgTe absorbers. Additional improvements made to the cooler of the detector heat sink allowed operation at a lower temperature, which further reduced the specific heat. Together, these changes enabled the pixel width to be increased from 0.64 mm to 0.83 mm while still achieving a lower heat capacity, and thus improving the energy resolution. From Astro-E through Astro-H, the energy resolution for X-rays of energy around 6000 eV improved from 11 eV, to 5.5 eV, to 4 eV. No changes to the array design were made between Astro-H and XRISM.
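In spectroscopic terms, those resolution figures correspond to a resolving power R = E/ΔE:

```python
def resolving_power(energy_ev: float, fwhm_ev: float) -> float:
    # Ratio of line energy to the instrument's energy resolution (FWHM).
    return energy_ev / fwhm_ev

# At 6000 eV, going from 11 eV (Astro-E) to 4 eV (Astro-H/XRISM) resolution
# raises the resolving power from roughly 545 to 1500.
r_astro_e = resolving_power(6000.0, 11.0)
r_xrism = resolving_power(6000.0, 4.0)
```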
      Resolve detector scientist Caroline Kilbourne installing the flight Resolve quantum-calorimeter array into the assembly that provides its electrical, thermal, and mechanical interfaces. (NASA GSFC)

      Over the same period, other approaches to quantum-calorimeter arrays optimized for the needs of future missions were developed. The use of superconducting transition-edge sensors (TES) instead of silicon (Si) thermistors led to improved energy resolution, more pixels per array, and multiplexing (a technique that allows multiple signals to be carried on a single wire). Quantum-calorimeter arrays with thousands of pixels are now standard, such as in the NASA contribution to the future European New Advanced Telescope for High-ENergy Astrophysics (newAthena) mission. And quantum calorimeters using paramagnetic thermometers — which unlike TES and Si thermistors require no dissipation of heat in the thermometer for it to be read out — combined with high-density wiring are a promising route for realizing even larger arrays. (See Astrophysics Technology Highlight on these latest developments.)
      The Resolve instrument aboard XRISM (X-ray Imaging and Spectroscopy Mission) captured data from the center of galaxy NGC 4151, where a supermassive black hole is slowly consuming material from the surrounding accretion disk. The resulting spectrum reveals the presence of iron in the peak around 6.5 keV and the dips around 7 keV, light thousands of times more energetic than what our eyes can see. Background: an image of NGC 4151 constructed from a combination of X-ray, optical, and radio light. (Spectrum: JAXA/NASA/XRISM Resolve. Background: X-rays, NASA/CXC/CfA/J. Wang et al.; optical, Isaac Newton Group of Telescopes, La Palma/Jacobus Kapteyn Telescope; radio, NSF/NRAO/VLA)

      Results from Resolve
      So, what is Resolve revealing about the Universe? Through spectroscopy alone, Resolve allows us to construct images of complex environments where collections of gas and dust with various attributes exist, emitting and absorbing X-rays at energies characteristic of their various compositions, velocities, and temperatures. For example, in the middle of the galaxy known as NGC 4151 (see figure above), matter spiraling into the central massive black hole forms a circular structure that is flat near the black hole, more donut-shaped further out, and, according to the Resolve data, a bit lumpy. Matter near the black hole is heated up to X-ray-emitting temperatures and irradiates the matter in the circular structure. The Resolve spectrum has a bright narrow emission line (peak) from neutral iron atoms that must be coming from colder matter in the circular structure, because hotter material would be ionized, and would have a different emission signature. Nonetheless, the shape of the iron line needs three components to describe it, each coming from a different lump in the circular structure. The presence of absorption lines (dips) in the spectrum provides further detail about the structure of the infalling matter.
      A second example is the detection of X-ray emission by Resolve from the debris of stars that have exploded, such as N132D (see figure below), that will improve our understanding of the explosion mechanism and how the elements produced in stars get distributed, and allow us to infer the type of star each was before ending in a supernova. Elements are identified by their characteristic emission lines, and shifts of those lines via the Doppler effect tell us how fast the material is moving.
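The velocity measurement works through the non-relativistic Doppler relation v ≈ c·ΔE/E. A sketch with invented line energies (6700 eV is close to a helium-like iron line, but the 10 eV shift here is chosen purely for illustration):

```python
C_KM_S = 299_792.458  # speed of light, km/s

def line_of_sight_velocity_km_s(e_observed_ev: float, e_rest_ev: float) -> float:
    # Non-relativistic Doppler shift: a positive result means the material
    # is receding (the line is shifted to lower energy).
    return C_KM_S * (e_rest_ev - e_observed_ev) / e_rest_ev

# A line emitted at 6700 eV but observed at 6690 eV implies material
# receding at roughly 450 km/s.
v = line_of_sight_velocity_km_s(6690.0, 6700.0)
```

An instrument with few-eV resolution, like Resolve, can therefore measure bulk motions of a few hundred km/s directly from line positions.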
      XRISM’s Resolve instrument captured data from supernova remnant N132D in the Large Magellanic Cloud to create the most detailed X-ray spectrum of the object ever made. The spectrum reveals peaks associated with silicon, sulfur, argon, calcium, and iron. Inset at right is an image of N132D captured by XRISM’s Xtend instrument. (JAXA/NASA/XRISM Resolve and Xtend)

      These results are just the beginning. The rich Resolve data sets are identifying complex velocity structures, rare elements, and multiple temperature components in a diverse ensemble of cosmic objects. Welcome to the quantum calorimeter era! Stay tuned for more revelations!
      Project Leads: Dr. Caroline Kilbourne, NASA Goddard Space Flight Center (GSFC), for silicon-thermistor quantum calorimeter development from Astro-E2 through XRISM and early TES development. Foundational and other essential leadership provided by Dr. Harvey Moseley, Dr. John Mather, Dr. Richard Kelley, Dr. Andrew Szymkowiak, Mr. Brent Mott, Dr. F. Scott Porter, Ms. Christine Jhabvala, Dr. James Chervenak (GSFC at the time of the work) and Dr. Dan McCammon (U. Wisconsin).
      Sponsoring Organizations and Programs:  The NASA Headquarters Astrophysics Division sponsored the projects, missions, and other efforts that culminated in the development of the Resolve instrument.
      Explore More
      7 min read: NASA’s Webb Finds Planet-Forming Disks Lived Longer in Early Universe
      5 min read: NASA DAVINCI Mission’s Many ‘Firsts’ to Unlock Venus’ Hidden Secrets
      2 min read: Hubble Images a Grand Spiral
    • By European Space Agency
      Our understanding of planet formation in the Universe’s early days is challenged by new data from the NASA/ESA/CSA James Webb Space Telescope. Webb solved a puzzle by proving a controversial finding made with the NASA/ESA Hubble Space Telescope more than 20 years ago.
    • By NASA
      7 Min Read: NASA’s Webb Finds Planet-Forming Disks Lived Longer in Early Universe
      This is a James Webb Space Telescope image of NGC 346, a massive star cluster in the Small Magellanic Cloud, a dwarf galaxy that is one of the Milky Way’s nearest neighbors. Credits: NASA, ESA, CSA, STScI, Olivia C. Jones (UK ATC), Guido De Marchi (ESTEC), Margaret Meixner (USRA)

      NASA’s James Webb Space Telescope just solved a conundrum by proving a controversial finding made with the agency’s Hubble Space Telescope more than 20 years ago.
      In 2003, Hubble provided evidence of a massive planet around a very old star, almost as old as the universe. Such stars possess only small amounts of heavier elements that are the building blocks of planets. This implied that some planet formation happened when our universe was very young, and those planets had time to form and grow big inside their primordial disks, even bigger than Jupiter. But how? This was puzzling.
      To answer this question, researchers used Webb to study stars in a nearby galaxy that, much like the early universe, lacks large amounts of heavy elements. They found that not only do some stars there have planet-forming disks, but that those disks are longer-lived than those seen around young stars in our Milky Way galaxy.
      “With Webb, we have a really strong confirmation of what we saw with Hubble, and we must rethink how we model planet formation and early evolution in the young universe,” said study leader Guido De Marchi of the European Space Research and Technology Centre in Noordwijk, Netherlands.
      Image A: Protoplanetary Disks in NGC 346 (NIRCam Image)
      This is a James Webb Space Telescope image of NGC 346, a massive star cluster in the Small Magellanic Cloud, a dwarf galaxy that is one of the Milky Way’s nearest neighbors. With its relative lack of elements heavier than hydrogen and helium, the NGC 346 cluster serves as a nearby proxy for studying stellar environments with similar conditions in the early, distant universe. Ten small yellow circles overlaid on the image indicate the positions of the ten stars surveyed in this study. NASA, ESA, CSA, STScI, Olivia C. Jones (UK ATC), Guido De Marchi (ESTEC), Margaret Meixner (USRA)

      A Different Environment in Early Times
      In the early universe, stars formed from mostly hydrogen and helium, and very few heavier elements such as carbon and iron, which came later through supernova explosions.
      “Current models predict that with so few heavier elements, the disks around stars have a short lifetime, so short in fact that planets cannot grow big,” said the Webb study’s co-investigator Elena Sabbi, chief scientist for Gemini Observatory at the National Science Foundation’s NOIRLab in Tucson. “But Hubble did see those planets, so what if the models were not correct and disks could live longer?”
      To test this idea, scientists trained Webb on the Small Magellanic Cloud, a dwarf galaxy that is one of the Milky Way’s nearest neighbors. In particular, they examined the massive, star-forming cluster NGC 346, which also has a relative lack of heavier elements. The cluster served as a nearby proxy for studying stellar environments with similar conditions in the early, distant universe.
      Hubble observations of NGC 346 from the mid 2000s revealed many stars about 20 to 30 million years old that seemed to still have planet-forming disks around them. This went against the conventional belief that such disks would dissipate after 2 or 3 million years.
      “The Hubble findings were controversial, going against not only empirical evidence in our galaxy but also against the current models,” said De Marchi. “This was intriguing, but without a way to obtain spectra of those stars, we could not really establish whether we were witnessing genuine accretion and the presence of disks, or just some artificial effects.”
      Now, thanks to Webb’s sensitivity and resolution, scientists have the first-ever spectra of forming, Sun-like stars and their immediate environments in a nearby galaxy.
      “We see that these stars are indeed surrounded by disks and are still in the process of gobbling material, even at the relatively old age of 20 or 30 million years,” said De Marchi. “This also implies that planets have more time to form and grow around these stars than in nearby star-forming regions in our own galaxy.”
      Image B: Protoplanetary Disks in NGC 346 Spectra (NIRSpec)
      This graph shows, on the bottom left in yellow, a spectrum of one of the 10 target stars in this study (as well as accompanying light from the immediate background environment). Spectral fingerprints of hot atomic helium, cold molecular hydrogen, and hot atomic hydrogen are highlighted. On the top left in magenta is a spectrum slightly offset from the star that includes only light from the background environment. This second spectrum lacks a spectral line of cold molecular hydrogen.
      On the right is the comparison of the top and bottom lines. This comparison shows a large peak in the cold molecular hydrogen coming from the star but not its nebular environment. Also, atomic hydrogen shows a larger peak from the star. This indicates the presence of a protoplanetary disk immediately surrounding the star. The data was taken with the microshutter array on the James Webb Space Telescope’s NIRSpec (Near-Infrared Spectrometer) instrument. Illustration: NASA, ESA, CSA, Joseph Olmsted (STScI)

      A New Way of Thinking
      This finding refutes previous theoretical predictions that when there are very few heavier elements in the gas around the disk, the star would very quickly blow away the disk. So the disk’s life would be very short, even less than a million years. But if a disk doesn’t stay around the star long enough for the dust grains to stick together and pebbles to form and become the core of a planet, how can planets form?
      The researchers explained that there could be two distinct mechanisms, or even a combination, for planet-forming disks to persist in environments scarce in heavier elements.
      First, a star blows away its disk through radiation pressure. For this pressure to be effective, elements heavier than hydrogen and helium would have to reside in the gas. But the massive star cluster NGC 346 only has about ten percent of the heavier elements that are present in the chemical composition of our Sun. Perhaps it simply takes longer for a star in this cluster to disperse its disk.
      The second possibility is that, for a Sun-like star to form when there are few heavier elements, it would have to start from a larger cloud of gas. A bigger gas cloud will produce a bigger disk. So there is more mass in the disk and therefore it would take longer to blow the disk away, even if the radiation pressure were working in the same way.
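A toy scaling captures both mechanisms at once. This is my illustration, not a calculation from the paper: if the dispersal rate is proportional to the metal content of the gas, while the disk's survival time scales as its mass divided by that rate, then:

```python
def relative_disk_lifetime(disk_mass_ratio: float, metallicity_ratio: float) -> float:
    # Toy model (not from the study): lifetime ~ disk mass / dispersal rate,
    # with the dispersal rate assumed proportional to metallicity.
    # Both ratios are relative to a solar-metallicity Milky Way disk.
    return disk_mass_ratio / metallicity_ratio

# Same disk mass but 10% of solar metallicity (as in NGC 346):
# the disk survives about ten times longer.
lifetime = relative_disk_lifetime(1.0, 0.1)
```

A larger starting disk mass would stretch the lifetime further still, so the two proposed mechanisms compound rather than compete.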
      “With more matter around the stars, the accretion lasts for a longer time,” said Sabbi. “The disks take ten times longer to disappear. This has implications for how you form a planet, and the type of system architecture that you can have in these different environments. This is so exciting.”
      The science team’s paper appears in the Dec. 16 issue of The Astrophysical Journal.
      Image C: NGC 346: Hubble and Webb Observations
      Image Before/After

      The James Webb Space Telescope is the world’s premier space science observatory. Webb is solving mysteries in our solar system, looking beyond to distant worlds around other stars, and probing the mysterious structures and origins of our universe and our place in it. Webb is an international program led by NASA with its partners, ESA (European Space Agency) and CSA (Canadian Space Agency).
      The Hubble Space Telescope has been operating for over three decades and continues to make ground-breaking discoveries that shape our fundamental understanding of the universe. Hubble is a project of international cooperation between NASA and ESA (European Space Agency). NASA’s Goddard Space Flight Center in Greenbelt manages the telescope and mission operations. Lockheed Martin Space, based in Denver, also supports mission operations at Goddard. The Space Telescope Science Institute in Baltimore, which is operated by the Association of Universities for Research in Astronomy, conducts Hubble science operations for NASA.
      Downloads
      Right click any image to save it or open a larger version in a new tab/window via the browser’s popup menu.
      View/Download all image products at all resolutions for this article from the Space Telescope Science Institute.
      View/Download the science paper from The Astrophysical Journal.
      Media Contacts
      Laura Betz – laura.e.betz@nasa.gov
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Ann Jenkins – jenkins@stsci.edu, Christine Pulliam – cpulliam@stsci.edu
      Space Telescope Science Institute, Baltimore, Md.
      Related Information
      Past releases on NGC 346: Webb NIRCam image and MIRI image
      Article: Highlighting other Webb Star Formation Discoveries
      Simulation Video: Planetary Systems and Origins of Life
      Animation Video: Exploring star and planet formation (English), and in Spanish
      More Images of NGC 346 on AstroPix
      More Webb News
      More Webb Images
      Webb Science Themes
      Webb Mission Page
      Related For Kids
      What is a planet?
      What is the Webb Telescope?
      SpacePlace for Kids
      En Español
      ¿Qué es un planeta?
      Ciencia de la NASA
      NASA en español 
      Space Place para niños
      Details
      Last Updated: Dec 15, 2024
      Editor: Marty McCoy
      Contact: Laura Betz, laura.e.betz@nasa.gov
      Related Terms: Astrophysics; Galaxies; Galaxies, Stars, & Black Holes; Goddard Space Flight Center; James Webb Space Telescope (JWST); Science & Research; Stars; The Universe
    • By European Space Agency
      Image: Webb traces spiral arms in infrared
    • By NASA
      At NASA, high-end computing is essential for many agency missions. This technology helps us advance our understanding of the universe – from our planet to the farthest reaches of the cosmos. Supercomputers enable projects across diverse research, such as making discoveries about the Sun’s activity that affects technologies in space and life on Earth, building artificial intelligence-based models for innovative weather and climate science, and helping redesign the launch pad that will send astronauts to space with Artemis II. 
      These projects are just a sample of the many on display in NASA’s exhibit during the International Conference for High Performance Computing, Networking, Storage and Analysis, or SC24. NASA’s Dr. Nicola “Nicky” Fox, associate administrator for the agency’s Science Mission Directorate, will deliver the keynote address, “NASA’s Vision for High Impact Science and Exploration,” on Tuesday, Nov. 19, where she’ll share more about the ways NASA uses supercomputing to explore the universe for the benefit of all. Here’s a little more about the work NASA will share at the conference: 
      1. Simulations Help in Redesign of the Artemis Launch Environment
      This simulation of the Artemis I launch shows how the Space Launch System rocket's exhaust plumes interact with the air, water, and the launchpad. Colors on surfaces indicate pressure levels: red for high pressure and blue for low pressure. The teal contours illustrate where water is present. NASA/Chris DeGrendele, Timothy Sandstrom

      Researchers at NASA Ames are helping ensure astronauts launch safely on the Artemis II test flight, the first crewed mission of the Space Launch System (SLS) rocket and Orion spacecraft, scheduled for 2025. Using the Launch, Ascent, and Vehicle Aerodynamics software, they simulated the complex interactions between the rocket plume and the water-based sound suppression system used during the Artemis I launch, which resulted in damage to the mobile launcher platform that supported the rocket before liftoff.
      Comparing simulations with and without the water systems activated revealed that the sound suppression system effectively reduces pressure waves, but exhaust gases can redirect water and cause significant pressure increases. 
      The simulations, run on the Aitken supercomputer at the NASA Advanced Supercomputing facility at Ames, generated about 400 terabytes of data. This data was provided to aerospace engineers at NASA’s Kennedy Space Center in Florida, who are redesigning the flame deflector and mobile launcher for the Artemis II launch.
      2. Airplane Design Optimization for Fuel Efficiency
      In this comparison of aircraft designs, the left wing models the aircraft’s initial geometry, while the right wing models an optimized shape. The surface is colored by the air pressure on the aircraft, with orange surfaces representing shock waves in the airflow. The optimized design modeled on the right wing reduces drag by 4% compared to the original, leading to improved fuel efficiency. NASA/Brandon Lowe

      To help make commercial flight more efficient and sustainable, researchers and engineers at NASA’s Ames Research Center in California’s Silicon Valley are working to refine aircraft designs to reduce air resistance, or drag, by fine-tuning the shape of wings, fuselages, and other aircraft structural components. These changes would lower the energy required for flight and reduce the amount of fuel needed, produce fewer emissions, enhance overall performance of aircraft, and could help reduce noise levels around airports.
      Using NASA’s Launch, Ascent, and Vehicle Aerodynamics computational modeling software, developed at Ames, researchers are leveraging the power of agency supercomputers to run hundreds of simulations to explore a variety of design possibilities – on existing aircraft and future vehicle concepts. Their work has shown the potential to reduce drag on an existing commercial aircraft design by 4%, translating to significant fuel savings in real-world applications.
      3. Applying AI to Weather and Climate
      This visualization compares the track of the Category 4 hurricane, Ida, from MERRA-2 reanalysis data (left) with a prediction made without specific training, from NASA and IBM’s Prithvi WxC foundation model (right). Both models were initialized at 00 UTC on 2021-08-27. The University of Alabama in Huntsville/Ankur Kumar; NASA/Sujit Roy

      Traditional weather and climate models produce global and regional results by solving mathematical equations for millions of small areas (grid boxes) across Earth’s atmosphere and oceans. NASA and partners are now exploring newer approaches using artificial intelligence (AI) techniques to train a foundation model.
      Foundation models are developed using large, unlabeled datasets so researchers can fine-tune results for different applications, such as creating forecasts or predicting weather patterns or climate changes, independently with minimal additional training. 
      NASA developed the open source, publicly available Prithvi Weather-Climate foundation model (Prithvi WxC) in collaboration with IBM Research. Prithvi WxC was pretrained using 160 variables from NASA’s Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) dataset on the newest NVIDIA A100 GPUs at the NASA Advanced Supercomputing facility.
      Armed with 2.3 billion parameters, Prithvi WxC can model a variety of weather and climate phenomena – such as hurricane tracks – at fine resolutions. Applications include targeted weather prediction and climate projection, as well as representing physical processes like gravity waves. 
      4. Simulations and AI Reveal the Fascinating World of Neutron Stars
      3D simulation of pulsar magnetospheres, run on NASA’s Aitken supercomputer using data from the agency‘s Fermi space telescope. The red arrow shows the direction of the star’s magnetic field. Blue lines trace high-energy particles, producing gamma rays, in yellow. Green lines represent light particles hitting the observer’s plane, illustrating how Fermi detects pulsar gamma rays. NASA/Constantinos Kalapotharakos

      To explore the extreme conditions inside neutron stars, researchers at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, are using a blend of simulation, observation, and AI to unravel the mysteries of these extraordinary cosmic objects. Neutron stars are the dead cores of stars that have exploded and represent some of the densest objects in the universe.
      Cutting-edge simulations, run on supercomputers at the NASA Advanced Supercomputing facility, help explain phenomena observed by NASA’s Fermi Gamma-ray Space Telescope and Neutron star Interior Composition Explorer (NICER) observatory. These phenomena include the rapidly spinning, highly magnetized neutron stars known as pulsars, whose detailed physical mechanisms have remained mysterious since their discovery. By applying AI tools such as deep neural networks, the scientists can infer the stars’ mass, radius, magnetic field structure, and other properties from data obtained by the NICER and Fermi observatories. 
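The inference step described above can be illustrated with a deliberately simplified stand-in. The real work uses deep neural networks trained on detailed pulsar simulations; here a toy "forward model" maps (mass, radius) to one synthetic observable, and a nearest-neighbour search over a grid of "simulated" stars recovers the parameters from a measurement. The forward model is invented for illustration and is not real pulsar physics.

```python
# Toy version of simulation-based inference: compare an observed quantity
# against a grid of simulated outputs and pick the best-matching parameters.

def forward_model(mass, radius):
    """Toy observable: a compactness-like ratio (illustrative only)."""
    return mass / radius

def infer(observed, grid):
    """Return the (mass, radius) pair whose simulated observable is closest."""
    return min(grid, key=lambda mr: abs(forward_model(*mr) - observed))

# Grid of "simulated" neutron stars: masses in solar masses, radii in km.
grid = [(m / 10, r) for m in range(10, 25) for r in range(10, 15)]

# Recover the parameters of a star we "observed".
best = infer(forward_model(1.4, 12), grid)
```

A neural network replaces the brute-force grid search with a learned mapping from observables back to parameters, which scales to the many-dimensional data NICER and Fermi actually provide.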
      The simulations’ unprecedented results will guide similar studies of black holes and other space environments, as well as play a pivotal role in shaping future scientific space missions and mission concepts.
      5. Modeling the Sun in Action – From Tiny to Large Scales 
Image from a 3D simulation showing the evolution of flows in the upper layers of the Sun, with the most vigorous motions shown in red. These turbulent flows can generate magnetic fields and excite sound waves, shock waves, and eruptions. NASA/Irina Kitiashvili and Timothy A. Sandstrom

The Sun’s activity, which produces events such as solar flares and coronal mass ejections, influences the space environment and causes space weather disturbances that can interfere with satellite electronics, radio communications, GPS signals, and power grids on Earth. Scientists at NASA Ames produced highly realistic 3D models that – for the first time – allow them to examine the physics of solar plasma in action, from very small to very large scales. These models help interpret observations from NASA spacecraft like the Solar Dynamics Observatory (SDO).
      Using NASA’s StellarBox code on supercomputers at NASA’s Advanced Supercomputing facility, the scientists improved our understanding of the origins of solar jets and tornadoes – bursts of extremely hot, charged plasma in the solar atmosphere. These models allow the science community to address long-standing questions of solar magnetic activity and how it affects space weather.
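As a flavour of the wave dynamics such models resolve, here is a minimal sketch (not the StellarBox code) of a 1-D acoustic wave advanced with a leapfrog finite-difference scheme on a periodic domain; the parameters are illustrative.

```python
# 1-D wave equation on a periodic grid via leapfrog time stepping,
# a basic building block of grid-based plasma and acoustics codes.

import math

N, c, dt, dx = 64, 1.0, 0.5, 1.0   # grid size, wave speed, time/space steps
r2 = (c * dt / dx) ** 2            # squared Courant number (stable for <= 1)

prev = [math.sin(2 * math.pi * i / N) for i in range(N)]
curr = prev[:]                      # start at rest: a standing wave forms
for _ in range(100):
    nxt = [
        2 * curr[i] - prev[i]
        + r2 * (curr[(i + 1) % N] - 2 * curr[i] + curr[(i - 1) % N])
        for i in range(N)
    ]
    prev, curr = curr, nxt
```

Keeping the Courant number below one is what keeps the scheme stable; production solar codes face the same constraint, which is one reason resolving small, fast plasma motions demands supercomputer-scale time stepping.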
      6. Scientific Visualization Makes NASA Data Understandable
This global map is a frame from an animation showing how wind patterns and atmospheric circulation moved carbon dioxide through Earth’s atmosphere from January to March 2020. The DYAMOND model’s high resolution shows unique sources of carbon dioxide emissions and how they spread across continents and oceans. NASA/Scientific Visualization Studio

NASA simulations and observations can yield petabytes of data that are difficult to comprehend in their original form. The Scientific Visualization Studio (SVS), based at NASA Goddard, turns data into insight by collaborating closely with scientists to create cinematic, high-fidelity visualizations.
      Key infrastructure for these SVS creations includes the NASA Center for Climate Simulation’s Discover supercomputer at Goddard, which hosts a variety of simulations and provides data analysis and image-rendering capabilities. Recent data-driven visualizations show a coronal mass ejection from the Sun hitting Earth’s magnetosphere using the Multiscale Atmosphere-Geospace Environment (MAGE) model; global carbon dioxide emissions circling the planet in the DYnamics of the Atmospheric general circulation Modeled On Non-hydrostatic Domains (DYAMOND) model; and representations of La Niña and El Niño weather patterns using the El Niño-Southern Oscillation (ENSO) model. 
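The core operation behind any data-driven visualization is mapping scalar simulation values to colours. The SVS pipeline is far more sophisticated (cinematic rendering on the Discover supercomputer), but the basic idea can be sketched as a simple blue-to-red colour ramp; the ramp itself is an assumption for the example.

```python
# Map scalar data values to RGB pixels: normalise each value to [0, 1],
# then interpolate along a blue (low) to red (high) colour ramp.

def to_rgb(values):
    """Map each value to an (r, g, b) byte triple on a blue-to-red ramp."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0          # avoid dividing by zero on flat data
    out = []
    for v in values:
        t = (v - lo) / span          # normalised position in [0, 1]
        out.append((int(255 * t), 0, int(255 * (1 - t))))
    return out

pixels = to_rgb([0.0, 0.5, 1.0])
```

Rendering petabytes this way is an embarrassingly parallel job, which is why the image-rendering capability sits alongside the simulations on Discover rather than on scientists' desktops.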
      For more information about NASA’s virtual exhibit at the International Conference for High Performance Computing, Networking, Storage and Analysis, being held in Atlanta, Nov. 17-22, 2024, visit: 
      https://www.nas.nasa.gov/SC24
      For more information about supercomputers run by NASA High-End Computing, visit: 
      https://hec.nasa.gov
      For news media:
      Members of the news media interested in covering this topic should reach out to the NASA Ames newsroom.
      Authors: Jill Dunbar, Michelle Moyer, and Katie Pitta, NASA’s Ames Research Center; and Jarrett Cohen, NASA’s Goddard Space Flight Center