Expanded AI Model with Global Data Enhances Earth Science Applications 

On June 22, 2013, the Operational Land Imager (OLI) on Landsat 8 captured this false-color image of the East Peak fire burning in southern Colorado near Trinidad. Burned areas appear dark red, while actively burning areas look orange. Dark green areas are forests; light green areas are grasslands. Data from Landsat 8 were used to train the Prithvi artificial intelligence model, which can help detect burn scars.
Credit: NASA Earth Observatory

NASA, IBM, and Forschungszentrum Jülich have released an expanded version of the open-source Prithvi Geospatial artificial intelligence (AI) foundation model to support a broader range of geographical applications. Now, with the inclusion of global data, the foundation model can support tracking changes in land use, monitoring disasters, and predicting crop yields worldwide. 

The Prithvi Geospatial foundation model, first released in August 2023 by NASA and IBM, is pre-trained on NASA’s Harmonized Landsat and Sentinel-2 (HLS) dataset and learns by reconstructing deliberately masked portions of the imagery. The model is available on Hugging Face, a data science platform where machine learning developers openly build, train, deploy, and share models. Because NASA releases data, products, and research in the open, businesses and other commercial entities can turn these models into marketable products and services that generate economic value.
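For developers who want to experiment, a minimal sketch of fetching the published weights from Hugging Face might look like the following. The repository and checkpoint file names here are illustrative assumptions; the current identifiers are listed on the model card on Hugging Face.

```python
# Minimal sketch (assumed repo and file names): fetch Prithvi Geospatial
# weights from Hugging Face and load the checkpoint for inspection.
from huggingface_hub import hf_hub_download
import torch

weights_path = hf_hub_download(
    repo_id="ibm-nasa-geospatial/Prithvi-100M",  # assumption: check the model card
    filename="Prithvi_100M.pt",                  # assumption: check the model card
)

# Load on CPU; downstream pipelines would wrap these weights in the Prithvi
# encoder architecture before fine-tuning.
state_dict = torch.load(weights_path, map_location="cpu")
print(f"Loaded a checkpoint with {len(state_dict)} top-level entries")
```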

“We’re excited about the downstream applications that are made possible with the addition of global HLS data to the Prithvi Geospatial foundation model. We’ve embedded NASA’s scientific expertise directly into these foundation models, enabling them to quickly translate petabytes of data into actionable insights,” said Kevin Murphy, NASA chief science data officer. “It’s like having a powerful assistant that leverages NASA’s knowledge to help make faster, more informed decisions, leading to economic and societal benefits.”

AI foundation models are pre-trained on large datasets with self-supervised learning techniques, providing flexible base models that can be fine-tuned for domain-specific downstream tasks.
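As a rough illustration of that recipe (not the actual Prithvi architecture or training code), the sketch below hides random patches of a multispectral tile and trains a small encoder-decoder to reconstruct them; after pre-training, the encoder would serve as the reusable base for downstream tasks.

```python
# Toy masked-reconstruction pretraining, for illustration only: mask random
# patches of a 6-band tile and reconstruct them from the visible context.
import torch
import torch.nn as nn

class TinyMaskedAutoencoder(nn.Module):
    def __init__(self, bands: int = 6, patch: int = 16, dim: int = 128):
        super().__init__()
        self.patch = patch
        self.embed = nn.Conv2d(bands, dim, kernel_size=patch, stride=patch)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.decoder = nn.Linear(dim, bands * patch * patch)  # predict raw patch pixels

    def forward(self, x, mask_ratio: float = 0.75):
        tokens = self.embed(x).flatten(2).transpose(1, 2)      # (B, N, dim)
        mask = torch.rand(tokens.shape[:2], device=x.device) < mask_ratio
        tokens = tokens.masked_fill(mask.unsqueeze(-1), 0.0)   # hide masked patches
        recon = self.decoder(self.encoder(tokens))             # (B, N, bands*patch*patch)
        return recon, mask

# Training signal: reconstruction error on the masked patches only.
model = TinyMaskedAutoencoder()
tile = torch.randn(2, 6, 224, 224)                             # fake 6-band HLS-like tiles
recon, mask = model(tile)
target = nn.functional.unfold(tile, kernel_size=16, stride=16).transpose(1, 2)
loss = ((recon - target) ** 2)[mask].mean()
loss.backward()
```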

Crop classification prediction generated by NASA and IBM’s open-source Prithvi Geospatial artificial intelligence model. The figure compares a true-color composite of a cropland area, a ground-truth mask, and the model’s predicted mask, which closely matches the ground truth, across twelve land-cover classes: natural vegetation, forest, corn, soybeans, wetlands, developed or barren land, open water, winter wheat, alfalfa, fallow/idle cropland, cotton, and sorghum.

Focusing on diverse land use and ecosystems, researchers selected HLS satellite images that represented varied landscapes while screening out lower-quality scenes affected by clouds or data gaps. Urban areas were emphasized to improve coverage, and strict quality controls were applied. The resulting dataset is significantly larger than previous versions, offering improved global representation and a robust, well-balanced foundation for reliable model training and environmental analysis.
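One plausible way to implement that kind of cloud screening is sketched below, using the quality (Fmask) layer that ships with HLS granules; the specific bit positions, threshold, and file paths are assumptions for illustration, and the HLS user guide is the authoritative reference.

```python
# Hedged sketch: screen HLS tiles by cloud and shadow fraction before adding
# them to a training set. Bit positions below are assumptions; consult the
# HLS v2.0 user guide for the real Fmask layout.
import numpy as np
import rasterio

CLOUD_BIT = 1          # assumed bit flagging cloud
CLOUD_SHADOW_BIT = 3   # assumed bit flagging cloud shadow
MAX_CLOUD_FRACTION = 0.05

def tile_is_clear(fmask_path: str) -> bool:
    """Return True if the tile's cloud + shadow fraction is under the threshold."""
    with rasterio.open(fmask_path) as src:
        fmask = src.read(1)
    flagged = ((fmask >> CLOUD_BIT) & 1) | ((fmask >> CLOUD_SHADOW_BIT) & 1)
    return flagged.mean() <= MAX_CLOUD_FRACTION

# Usage (hypothetical paths):
#   clear_tiles = [p for p in candidate_fmask_paths if tile_is_clear(p)]
```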

The Prithvi Geospatial foundation model has already proven valuable in several applications, including post-disaster flood mapping and detecting burn scars caused by fires.

One application, Multi-Temporal Cloud Gap Imputation, uses the foundation model to reconstruct gaps in satellite imagery caused by cloud cover, enabling a clearer view of Earth’s surface over time. This capability supports a variety of uses, including environmental monitoring and agricultural planning.
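The stand-in below only shows the shape of the problem — a (time, bands, height, width) stack with cloud gaps — by filling gaps with simple per-pixel temporal interpolation; the foundation-model approach instead learns to reconstruct the masked time steps jointly from spatial and spectral context.

```python
# Baseline stand-in for cloud-gap imputation: per-pixel linear interpolation
# across clear timesteps. A learned model replaces this with a joint
# spatial-temporal reconstruction.
import numpy as np

def interpolate_gaps(series: np.ndarray, cloud_mask: np.ndarray) -> np.ndarray:
    """Fill cloudy timesteps per pixel by interpolating over clear ones.

    series:     float array of shape (T, bands, H, W)
    cloud_mask: bool array of shape (T, H, W); True where a pixel is cloudy
    """
    t, bands, h, w = series.shape
    filled = series.copy()
    times = np.arange(t)
    for b in range(bands):
        for i in range(h):
            for j in range(w):
                clear = ~cloud_mask[:, i, j]
                if clear.any() and not clear.all():
                    filled[~clear, b, i, j] = np.interp(
                        times[~clear], times[clear], series[clear, b, i, j]
                    )
    return filled

# Tiny demo: 4 timesteps, 1 band, 2x2 pixels, one cloudy observation.
series = np.arange(16, dtype=float).reshape(4, 1, 2, 2)
mask = np.zeros((4, 2, 2), dtype=bool)
mask[2, 0, 0] = True                                   # pretend this pixel was cloudy
print(interpolate_gaps(series, mask)[2, 0, 0, 0])      # prints the interpolated value, 8.0
```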

Another application, Multi-Temporal Crop Segmentation, uses satellite imagery to classify and map different crop types and land cover across the United States. By analyzing time-sequenced imagery layered with U.S. Department of Agriculture crop data, Prithvi Geospatial can accurately identify crop patterns, which in turn could improve large-scale agricultural monitoring and resource management.
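A hedged sketch of that kind of fine-tuning setup follows: a small trainable segmentation head on top of a frozen, pre-trained encoder, scored per pixel against crop labels such as those in the USDA Cropland Data Layer. The encoder, class count, and hyperparameters here are placeholders, not the actual Prithvi configuration.

```python
# Illustrative fine-tuning: freeze a pre-trained encoder (placeholder below)
# and train only a per-pixel classification head against crop labels.
import torch
import torch.nn as nn

NUM_CROP_CLASSES = 12   # e.g. the twelve classes listed in the figure caption above

encoder = nn.Sequential(                         # stand-in for the frozen backbone
    nn.Conv2d(6, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
)
for p in encoder.parameters():
    p.requires_grad = False                      # keep pre-trained weights fixed

head = nn.Conv2d(128, NUM_CROP_CLASSES, kernel_size=1)   # trainable task head
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One training step on a fake batch: 6-band tiles and per-pixel class ids.
tiles = torch.randn(2, 6, 224, 224)
labels = torch.randint(0, NUM_CROP_CLASSES, (2, 224, 224))
logits = head(encoder(tiles))                    # (batch, classes, H, W)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```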

The flood mapping dataset distinguishes flood water from permanent water across diverse biomes and ecosystems, and it supports flood management by training models to detect surface water.

Wildfire scar mapping combines satellite imagery with wildfire records to capture detailed views of burn scars shortly after fires occur. This approach provides valuable data for training models to map fire-affected areas, aiding wildfire management and recovery efforts.

Burn scar mapping generated by NASA and IBM’s open-source Prithvi Geospatial artificial intelligence model. The figure compares a true-color composite satellite image, a ground-truth mask outlining the extent of a burn scar, and the model’s predicted mask, which almost exactly matches the ground truth.
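The close agreement between predicted and ground-truth masks in figures like this one is commonly quantified with intersection-over-union (IoU); a minimal sketch for binary masks such as burn scars or flood water follows.

```python
# Minimal IoU metric for binary masks (burn scar, flood water, and similar).
import numpy as np

def binary_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection-over-union between two boolean masks of the same shape."""
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return float(intersection / union) if union else 1.0

# Tiny example: a 4x4 predicted burn-scar mask vs. its ground truth.
truth = np.zeros((4, 4), dtype=bool); truth[1:3, 1:3] = True
pred = np.zeros((4, 4), dtype=bool);  pred[1:3, 1:4] = True
print(f"IoU = {binary_iou(pred, truth):.2f}")    # 4 overlapping pixels of 6 total -> 0.67
```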

The model has also been tested on additional downstream applications, including gross primary productivity estimation, above-ground biomass estimation, landslide detection, and burn intensity estimation.

“The updates to this Prithvi Geospatial model have been driven by valuable feedback from users of the initial version,” said Rahul Ramachandran, AI foundation model for science lead and senior data science strategist at NASA’s Marshall Space Flight Center in Huntsville, Alabama. “This enhanced model has also undergone rigorous testing across a broader range of downstream use cases, ensuring improved versatility and performance, resulting in a version of the model that will empower diverse environmental monitoring applications, delivering significant societal benefits.”

The Prithvi Geospatial foundation model was developed as part of an initiative of NASA’s Office of the Chief Science Data Officer to unlock the value of NASA’s vast collection of science data using AI. NASA’s Interagency Implementation and Advanced Concepts Team (IMPACT), based at Marshall, IBM Research, and the Jülich Supercomputing Centre at Forschungszentrum Jülich designed the foundation model, training it on the Jülich Wizard for European Leadership Science (JUWELS) supercomputer operated by the Jülich Supercomputing Centre. The collaboration was facilitated by the IEEE Geoscience and Remote Sensing Society.

For more information about NASA’s strategy of developing foundation models for science, visit https://science.nasa.gov/artificial-intelligence-science.

Last Updated: Dec 04, 2024
