
Article On Using the Lightning Imaging Sensor to Search for Gamma-ray Flashes from Thunderstorms Accepted for Publication



Timothy Lang (ST11) is a coauthor on an article titled “Employing Optical Lightning Data to identify lightning flashes associated to Terrestrial Gamma-ray Flashes,” which was recently accepted for publication in the Bulletin of Atmospheric Science and Technology. Rich Blakeslee, formerly of the NASA MSFC Emeritus program, is also a coauthor on the study. The study, which was led by Christoph Köhn of the Technical University of Denmark (DTU), used data from the International Space Station Lightning Imaging Sensor (ISS LIS) and the Atmosphere-Space Interactions Monitor (ASIM), also aboard the ISS, to improve our understanding of which types of lightning flashes are associated with terrestrial gamma-ray flashes (TGFs), brief bursts of high-energy radiation emitted from thunderstorms. The team developed an algorithm that accurately reduces the total population of LIS-observed lightning to a much smaller population of candidate TGF-related flashes by looking for unique characteristics within the flashes. ASIM, which can observe TGFs directly, was used to validate the algorithm.

This study is important because instruments like ASIM observe only 300-400 TGFs per year, while LIS observed on average roughly one million lightning flashes per year. This difference of more than three orders of magnitude in frequency of occurrence means that data-reduction algorithms are necessary to facilitate studies of the relationships between TGFs and lightning. In addition, a recent NASA field campaign demonstrated that TGF occurrence may be significantly higher than what can be measured from space, particularly within tropical thunderstorms. Thus, an algorithm that identifies possible characteristics of TGF-related lightning may later help us understand the differences between lightning associated with strong TGFs (observable from space) and lightning associated with weaker TGFs (not currently observable from space).
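The paper details the actual selection criteria; purely as an illustration of the data-reduction idea, a feature-based filter over LIS flash records might look like the sketch below, where the field names and threshold values are hypothetical placeholders rather than the published cuts.

```python
from dataclasses import dataclass

@dataclass
class LISFlash:
    """Minimal stand-in for a LIS flash record (fields are illustrative)."""
    duration_ms: float    # optical flash duration
    footprint_km2: float  # spatial extent of the optical emission
    radiance: float       # summed optical radiance (arbitrary units)

def is_tgf_candidate(flash: LISFlash) -> bool:
    """Flag a flash as a possible TGF-related candidate.

    The thresholds are placeholders, not the published criteria; they only
    show how feature cuts shrink ~10^6 flashes per year to a small candidate
    population that can be compared against ASIM-detected TGFs.
    """
    return (
        flash.duration_ms < 50.0         # hypothetical: brief flashes
        and flash.footprint_km2 < 100.0  # hypothetical: compact flashes
        and flash.radiance > 1.0e4       # hypothetical: bright flashes
    )

# Example with two made-up flash records:
lis_flashes = [
    LISFlash(duration_ms=30.0, footprint_km2=80.0, radiance=2.0e4),
    LISFlash(duration_ms=400.0, footprint_km2=600.0, radiance=5.0e3),
]
candidates = [f for f in lis_flashes if is_tgf_candidate(f)]
print(f"{len(candidates)} candidate(s) out of {len(lis_flashes)} flashes")
```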

Read the paper at: https://link.springer.com/article/10.1007/s42865-024-00065-y.

ISS-LIS Integration Graphic
Graphic showing the integration of the Lightning Imaging Sensor with the International Space Station.


  • Similar Topics

    • By NASA
      NASA’s AVIRIS-3 airborne imaging spectrometer was used to map a wildfire near Castleberry, Alabama, on March 19. Within minutes, the image was transmitted to firefighters on the ground, who used it to contain the blaze. NASA/JPL-Caltech, NASA Earth Observatory
      The map visualizes three wavelengths of infrared light, which are invisible to the human eye. Orange and red areas show cooler-burning areas, while yellow indicates the most intense flames. Burned areas show up as dark red or brown. NASA/JPL-Caltech, NASA Earth Observatory
      Data from the AVIRIS-3 sensor was recently used to create detailed fire maps in minutes, enabling firefighters in Alabama to limit the spread of wildfires and save buildings.
      A NASA sensor recently brought a new approach to battling wildfire, providing real-time data that helped firefighters in the field contain a blaze in Alabama. Called AVIRIS-3, which is short for Airborne Visible Infrared Imaging Spectrometer 3, the instrument detected a 120-acre fire on March 19 that had not yet been reported to officials.
      As AVIRIS-3 flew aboard a King Air B200 research plane over the fire about 3 miles (5 kilometers) east of Castleberry, Alabama, a scientist on the plane analyzed the data in real time and identified where the blaze was burning most intensely. The information was then sent via satellite internet to fire officials and researchers on the ground, who distributed images showing the fire’s perimeter to firefighters’ phones in the field.
      All told, the process from detection during the flyover to alert on handheld devices took a few minutes. In addition to pinpointing the location and extent of the fire, the data showed firefighters its perimeter, helping them gauge whether it was likely to spread and decide where to add personnel and equipment.
      As firefighters worked to prevent a wildfire near Perdido, Alabama, from reaching nearby buildings, an infrared fire map from NASA’s AVIRIS-3 sensor showed them that the fire’s hot spot was inside its perimeter. With that intelligence, they shifted some resources to fires in nearby Mount Vernon. NASA/JPL-Caltech, NASA Earth Observatory
      “This is very agile science,” said Robert Green, the AVIRIS program’s principal investigator and a senior research scientist at NASA’s Jet Propulsion Laboratory in Southern California, noting AVIRIS-3 mapped the burn scar left near JPL by the Eaton Fire in January.
      Observing the ground from about 9,000 feet (3,000 meters) in altitude, AVIRIS-3 flew on several test flights over Alabama, Mississippi, Florida, and Texas for NASA’s 2025 FireSense Airborne Campaign. Researchers flew in the second half of March to prepare for prescribed burn experiments that took place in the Geneva State Forest in Alabama on March 28 and at Fort Stewart-Hunter Army Airfield in Georgia from April 14 to 20. During the March flights, the AVIRIS-3 team mapped at least 13 wildfires and prescribed burns, as well as dozens of small hot spots (places where heat is especially intense) — all in real time.
      At one of the Mount Vernon, Alabama, fires, firefighters used AVIRIS-3 maps to determine where to establish fire breaks beyond the northwestern end of the fire. They ultimately cut the blaze off within about 100 feet (30 meters) of four buildings. NASA/JPL-Caltech, NASA Earth Observatory
      Data from imaging spectrometers like AVIRIS-3 typically takes days or weeks to be processed into highly detailed, multilayer image products used for research. By simplifying the calibration algorithms, researchers were able to process data on a computer aboard the plane in a fraction of the time it otherwise would have taken. Airborne satellite internet connectivity enabled the images to be distributed almost immediately, while the plane was still in flight, rather than after it landed.
      The AVIRIS team generated its first real-time products during a February campaign covering parts of Panama and Costa Rica, and they have continued to improve the process, automating the mapping steps aboard the plane.
      ‘Fan Favorite’
      The AVIRIS-3 sensor belongs to a line of imaging spectrometers built at JPL since 1986. The instruments have been used to study a wide range of phenomena — including fire — by measuring sunlight reflecting from the planet’s surface.
      During the March flights, researchers created three types of maps. One, called the Fire Quicklook, combines brightness measurements at three wavelengths of infrared light, which is invisible to the human eye, to identify the relative intensity of burning. Orange and red areas on the Fire Quicklook map show cooler-burning areas, while yellow indicates the most intense flames. Previously burned areas show up as dark red or brown.
      Another map type, the Fire 2400 nm Quicklook, looks solely at infrared light at a wavelength of 2,400 nanometers. The images are particularly useful for seeing hot spots and the perimeters of fires, which show brightly against a red background.
      A third type of map, called just Quicklook, shows burned areas and smoke.
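      The article does not spell out the exact band math behind these products, but conceptually both fire maps are composites of individual channels from the calibrated spectral cube. The NumPy sketch below illustrates that idea; the wavelength picks and normalization are assumptions for illustration, not the actual AVIRIS-3 product definitions.

      ```python
      import numpy as np

      def nearest_band(wavelengths_nm: np.ndarray, target_nm: float) -> int:
          """Index of the spectrometer channel closest to a target wavelength."""
          return int(np.argmin(np.abs(wavelengths_nm - target_nm)))

      def fire_quicklook(cube: np.ndarray, wavelengths_nm: np.ndarray) -> np.ndarray:
          """Three-band infrared composite (rows x cols x 3), scaled 0-1.

          The wavelength choices below are assumptions for illustration,
          not the published AVIRIS-3 Fire Quicklook definition.
          """
          bands = [nearest_band(wavelengths_nm, w) for w in (1600.0, 2000.0, 2400.0)]
          rgb = cube[:, :, bands].astype(float)
          return rgb / max(rgb.max(), 1e-9)  # simple global normalization

      def fire_2400nm_quicklook(cube: np.ndarray, wavelengths_nm: np.ndarray) -> np.ndarray:
          """Single-band image near 2,400 nm, where active burning stands out."""
          return cube[:, :, nearest_band(wavelengths_nm, 2400.0)].astype(float)

      # Example with a synthetic 4 x 4 pixel cube spanning 380-2500 nm:
      wavelengths = np.linspace(380.0, 2500.0, 285)
      cube = np.random.rand(4, 4, wavelengths.size)
      print(fire_quicklook(cube, wavelengths).shape)         # (4, 4, 3)
      print(fire_2400nm_quicklook(cube, wavelengths).shape)  # (4, 4)
      ```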
      The Fire 2400 nm Quicklook was the “fan favorite” among the fire crews, said Ethan Barrett, fire analyst for the Forest Protection Division of the Alabama Forestry Commission. Seeing the outline of a wildfire from above helped Alabama Forestry Commission firefighters determine where to send bulldozers to stop the spread. 
      Additionally, FireSense personnel analyzed the AVIRIS-3 imagery to create digitized perimeters of the fires. This provided firefighters with fast, comprehensive intelligence about the situation on the ground.
      That’s what happened with the Castleberry Fire. Having a clear picture of where it was burning most intensely enabled firefighters to focus on where they could make a difference — on the northeastern edge. 
      Then, two days after identifying Castleberry Fire hot spots, the sensor spotted a fire about 2.5 miles (4 kilometers) southwest of Perdido, Alabama. As forestry officials worked to prevent flames from reaching six nearby buildings, they noticed that the fire’s main hot spot was inside the perimeter and contained. With that intelligence, they decided to shift some resources to fires 25 miles (40 kilometers) away near Mount Vernon, Alabama.
      To combat one of the Mount Vernon fires, crews used AVIRIS-3 maps to determine where to establish fire breaks beyond the northwestern end of the fire. They ultimately cut the blaze off within about 100 feet (30 meters) of four buildings. 
      “Fire moves a lot faster than a bulldozer, so we have to try to get around it before it overtakes us. These maps show us the hot spots,” Barrett said. “When I get out of the truck, I can say, ‘OK, here’s the perimeter.’ That puts me light-years ahead.”
      AVIRIS and the FireSense Airborne Campaign are part of NASA’s work to leverage its expertise to combat wildfires using solutions including airborne technologies. The agency also recently demonstrated a prototype from its Advanced Capabilities for Emergency Response Operations project that will provide reliable airspace management for drones and other aircraft operating in the air above wildfires.
      News Media Contacts
      Andrew Wang / Jane J. Lee
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-379-6874 / 818-354-0307
      andrew.wang@jpl.nasa.gov / jane.j.lee@jpl.nasa.gov
      2025-058
    • By NASA
      Entrepreneurs Challenge Winner PRISM is Using AI to Enable Insights from Geospatial Data
      PRISM’s platform uses AI segmentation to identify and highlight residential structures in a neighborhood.
      NASA sponsored Entrepreneurs Challenge events in 2020, 2021, and 2023 to invite small business start-ups to showcase innovative ideas and technologies with the potential to advance the agency’s science goals. To potentially leverage external funding sources for the development of innovative technologies of interest to NASA, the Science Mission Directorate (SMD) involved the venture capital community in Entrepreneurs Challenge events. Challenge winners were awarded prize money, and in 2023 the total Entrepreneurs Challenge prize value was $1M. Numerous challenge winners have subsequently refined their products and/or received funding from NASA and external sources (e.g., other government agencies or the venture capital community) to further develop their technologies.
      One 2023 Entrepreneurs Challenge winner, PRISM Intelligence (formerly known as Pegasus Intelligence and Space), is using artificial intelligence (AI) and other advances in computer vision to create a new platform that could provide geospatial insights to a broad community.
      Every day, vast amounts of remote sensing data are collected through satellites, drones, and aerial imagery, but for most businesses and individuals, accessing and extracting meaningful insights from this data is nearly impossible.  
      The company’s product—Personal Real-time Insight from Spatial Maps, a.k.a. PRISM—is transforming geospatial data into an easy-to-navigate, queryable world. By leveraging 3D computer vision, geospatial analytics, and AI-driven insights, PRISM creates photorealistic, up-to-date digital environments that anyone can interact with. Users can simply log in and ask natural-language questions to instantly retrieve insights—no advanced Geographic Information System (GIS) expertise is required.
      For example, a pool cleaner looking for business could use PRISM to search for all residential pools in a five-mile radius. A gardener could identify overgrown trees in a community. City officials could search for potholes in their jurisdiction to prioritize repairs, enhance public safety, and mitigate liability risks. This broad level of accessibility brings geospatial intelligence out of the hands of a few and into everyday decision making.
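      PRISM’s implementation is proprietary, so the sketch below is only a toy illustration of the kind of query the pool-cleaner example implies: once segmentation has produced labeled objects with coordinates, a “pools within five miles” question reduces to a label filter plus a great-circle radius search. The object catalog, coordinates, and function names here are hypothetical.

      ```python
      import math

      # Toy catalog of segmented objects (labels and coordinates are made up);
      # a real system would populate this from the AI-segmentation step.
      objects = [
          {"label": "pool",     "lat": 34.0522, "lon": -118.2437},
          {"label": "pool",     "lat": 34.1000, "lon": -118.3000},
          {"label": "building", "lat": 34.0600, "lon": -118.2500},
      ]

      def haversine_miles(lat1, lon1, lat2, lon2):
          """Great-circle distance between two points, in miles."""
          r = 3958.8  # mean Earth radius in miles
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dphi = math.radians(lat2 - lat1)
          dlmb = math.radians(lon2 - lon1)
          a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      def find_nearby(objects, label, lat, lon, radius_miles):
          """Return all objects with a given label within a radius of a point."""
          return [o for o in objects
                  if o["label"] == label
                  and haversine_miles(lat, lon, o["lat"], o["lon"]) <= radius_miles]

      print(find_nearby(objects, "pool", 34.05, -118.24, radius_miles=5.0))
      ```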
      The core of PRISM’s platform uses radiance fields to convert raw 2D imagery into high-fidelity, dynamic 3D visualizations. These models are then enhanced with AI-powered segmentation, which autonomously identifies and labels objects in the environment—such as roads, vehicles, buildings, and natural features—allowing for seamless search and analysis. The integration of machine learning enables PRISM to refine its reconstructions continuously, improving precision with each dataset. This advanced processing ensures that the platform remains scalable, efficient, and adaptable to various data sources, making it possible to produce large-scale, real-time digital twins of the physical world.
      The PRISM platform’s interface showcasing a 3D digital twin of California State Polytechnic University, Pomona, with AI-powered search and insights.
      “It’s great being able to push the state of the art in this relatively new domain of radiance fields, evolving it from research to applications that can impact common tasks. From large sets of images, PRISM creates detailed 3D captures that embed more information than the source pictures.” — Maximum Wilder-Smith, Chief Technology Officer, PRISM Intelligence
      Currently the PRISM platform uses proprietary data gathered from aerial imagery over selected areas. PRISM then generates high-resolution digital twins of cities in select regions. The team is aiming to eventually expand the platform to use NASA Earth science data and commercial data, which will enable high-resolution data capture over larger areas, significantly increasing efficiency, coverage, and update frequency. PRISM aims to use the detailed multiband imagery that NASA provides and the high-frequency data that commercial companies provide to make geospatial intelligence more accessible by providing fast, reliable, and up-to-date insights that can be used across multiple industries.
      What sets PRISM apart is its focus on usability. While traditional GIS platforms require specialized training to use, PRISM eliminates these barriers by allowing users to interact with geospatial data through a frictionless, conversational interface.
      The impact of this technology could extend across multiple industries. Professionals in the insurance and appraisal industries have informed the company how the ability to generate precise, 3D assessments of properties could streamline risk evaluations, reduce costs, and improve accuracy—replacing outdated or manual site visits. Similarly, local governments have indicated they could potentially use PRISM to better manage infrastructure, track zoning compliance, and allocate resources based on real-time, high-resolution urban insights. Additionally, scientists could use the consistent updates and layers of three-dimensional data that PRISM can provide to better understand changes to ecosystems and vegetation.
      As PRISM moves forward, the team’s focus remains on scaling its capabilities and expanding its applications. Currently, the team is working to enhance the technical performance of the platform while also adding data sources to enable coverage of more regions. Future iterations will further improve automation of data processing, increasing the speed and efficiency of real-time 3D reconstructions. The team’s goal is to expand access to geospatial insights, ensuring that anyone—from city planners to business owners—can make informed decisions using the best possible data.
      PRISM Intelligence founders Zachary Gaines, Hugo Delgado, and Maximum Wilder-Smith in their California State Polytechnic University, Pomona lab, where the company was first formed.








    • By NASA
      Researchers from NASA’s Jet Propulsion Laboratory in Southern California, private companies, and academic institutions are developing the first space-based quantum sensor for measuring gravity. Supported by NASA’s Earth Science Technology Office (ESTO), this mission will mark a first for quantum sensing and will pave the way for groundbreaking observations of everything from petroleum reserves to global supplies of fresh water.
      A map of Earth’s gravity. Red indicates areas of the world that exert greater gravitational pull, while blue indicates areas that exert less. A science-grade quantum gravity gradiometer could one day make maps like this with unprecedented accuracy. Image Credit: NASA
      Earth’s gravitational field is dynamic, changing each day as geologic processes redistribute mass across our planet’s surface. The greater the mass, the greater the gravity.
      You wouldn’t notice these subtle changes in gravity as you go about your day, but with sensitive tools called gravity gradiometers, scientists can map the nuances of Earth’s gravitational field and correlate them to subterranean features like aquifers and mineral deposits. These gravity maps are essential for navigation, resource management, and national security.
      “We could determine the mass of the Himalayas using atoms,” said Jason Hyon, chief technologist for Earth Science at JPL and director of JPL’s Quantum Space Innovation Center. Hyon and colleagues laid out the concepts behind their Quantum Gravity Gradiometer Pathfinder (QGGPf) instrument in a recent paper in EPJ Quantum Technology.
      Gravity gradiometers track how fast an object in one location falls compared to an object falling just a short distance away. The difference in acceleration between these two free-falling objects, also known as test masses, corresponds to differences in gravitational strength. Test masses fall faster where gravity is stronger.
      QGGPf will use two clouds of ultra-cold rubidium atoms as test masses. Cooled to a temperature near absolute zero, the particles in these clouds behave like waves. The quantum gravity gradiometer will measure the difference in acceleration between these matter waves to locate gravitational anomalies.
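      In other words, the measured quantity is essentially a differential acceleration divided by the separation between the two atom clouds. The back-of-the-envelope sketch below illustrates that arithmetic with made-up numbers (not QGGPf specifications); geophysicists often quote such gradients in Eötvös units, where 1 E = 1e-9 s^-2.

      ```python
      # Hypothetical example numbers, not QGGPf specifications:
      a_upper = 9.8123456000   # free-fall acceleration of the upper atom cloud, m/s^2
      a_lower = 9.8123456015   # free-fall acceleration of the lower atom cloud, m/s^2
      baseline_m = 0.5         # assumed vertical separation between the clouds, m

      # Gravity gradient: difference in acceleration over the separation distance.
      gradient_si = (a_lower - a_upper) / baseline_m   # units of s^-2
      gradient_eotvos = gradient_si / 1e-9             # 1 Eotvos (E) = 1e-9 s^-2

      print(f"gravity gradient ~ {gradient_eotvos:.1f} E")  # ~ 3.0 E for these numbers
      ```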
      Using clouds of ultra-cold atoms as test masses is ideal for ensuring that space-based gravity measurements remain accurate over long periods of time, explained Sheng-wey Chiow, an experimental physicist at JPL. “With atoms, I can guarantee that every measurement will be the same. We are less sensitive to environmental effects.”
      Using atoms as test masses also makes it possible to measure gravity with a compact instrument aboard a single spacecraft. QGGPf will be around 0.3 cubic yards (0.25 cubic meters) in volume and weigh only about 275 pounds (125 kilograms), smaller and lighter than traditional space-based gravity instruments.
      Quantum sensors also have the potential for increased sensitivity. By some estimates, a science-grade quantum gravity gradiometer instrument could be as much as ten times more sensitive at measuring gravity than classical sensors.
      The main purpose of this technology validation mission, scheduled to launch near the end of the decade, will be to test a collection of novel technologies for manipulating interactions between light and matter at the atomic scale.
      “No one has tried to fly one of these instruments yet,” said Ben Stray, a postdoctoral researcher at JPL. “We need to fly it so that we can figure out how well it will operate, and that will allow us to not only advance the quantum gravity gradiometer, but also quantum technology in general.”
      This technology development project involves significant collaborations between NASA and small businesses. The team at JPL is working with AOSense and Infleqtion to advance the sensor head technology, while NASA’s Goddard Space Flight Center in Greenbelt, Maryland is working with Vector Atomic to advance the laser optical system.
      Ultimately, the innovations achieved during this pathfinder mission could enhance our ability to study Earth, and our ability to understand distant planets and the role gravity plays in shaping the cosmos. “The QGGPf instrument will lead to planetary science applications and fundamental physics applications,” said Hyon.
      To learn more about ESTO visit: https://esto.nasa.gov
    • By NASA
      Kudos Test Article
      The Global Learning and Observations to Benefit the Environment (GLOBE) Program is calling on volunteers of all ages to help students and citizen scientists document seasonal change through leaf color and land cover. The data collection event will support students across North America, Latin America, Central America, and Europe, who are working together to document the seasonal changes taking place from September through December – see Figure. The observations will also provide vital data for GLOBE students creating student research projects for the GLOBE 2025 International Virtual Science Symposium (IVSS). The project is part of GLOBE’s Intensive Observation Period (IOP), which collects data during a focused period to assess how climate change is unfolding in different regions of the world.
    • By NASA
      The 2025 Spinoff publication features more than 40 commercial infusions of NASA technologies. Credit: NASA
      The work NASA conducts in space leads to ongoing innovations benefiting people on Earth. Some of these latest technologies, which have been successfully transferred from NASA to the commercial sector, are featured in the latest edition of NASA’s Spinoff 2025 publication, now available online.
      The publication features more than 40 commercial infusions of NASA technologies, including research originated at NASA’s Glenn Research Center in Cleveland. 
      Parallel Flight Technologies’ Firefly aircraft is designed to run for 100 minutes while fully loaded, allowing the aircraft to perform agricultural surveys as well as assist in the aftermath of natural disasters. Credit: Parallel Flight Technologies Inc.
      Bringing Hybrid Power to the Rescue
      A NASA-funded hybrid power system makes drones more capable in disasters. 
      With Small Business Innovation Research funding from NASA Glenn, Parallel Flight Technologies of La Selva Beach, California, was able to test its hybrid propulsion technology, enabling longer-running, remotely piloted aircraft for use in agricultural and rescue applications. See the full Spinoff article for more information.

      EnerVenue Inc. brought down the cost of nickel-hydrogen technology and encased it in safe, robust vessels, like the battery pictured here. These batteries store renewable energy in a wide range of terrestrial situations. Credit: EnerVenue Inc.
      Hubble Battery Tech Holds Power on Earth
      Nickel-hydrogen technology is safe, durable, and long-lasting – and now it’s affordable, too.
      Nickel-hydrogen batteries store renewable energy for power plants, businesses, and homes, thanks to innovations from Fremont, California-based EnerVenue, informed by papers published by NASA Glenn about the technology’s performance on the Hubble Space Telescope, International Space Station, and more. See the full Spinoff article for more information. 
      Spinoff 2025 also features 20 technologies available for licensing with the potential for commercialization. Check out the Spinoffs of Tomorrow section to learn more.