Article On Using the Lightning Imaging Sensor to Search for Gamma-ray Flashes from Thunderstorms Accepted for Publication


Timothy Lang (ST11) is a coauthor on an article titled, “Employing Optical Lightning Data to identify lightning flashes associated to Terrestrial Gamma-ray Flashes,” which was recently accepted for publication in the Bulletin of Atmospheric Science and Technology. Rich Blakeslee, formerly of the NASA MSFC Emeritus program, is also a coauthor on the study. The study – which was led by Christoph Köhn of the Technical University of Denmark (DTU) – used data from the International Space Station Lightning Imaging Sensor (ISS LIS) and the Atmosphere-Space Interactions Monitor (ASIM; also on the ISS) to improve our understanding of what types of lightning flashes are associated with terrestrial gamma-ray flashes (TGFs), which emit high-energy radiation from thunderstorms. The team developed an algorithm that accurately reduced the total population of LIS-observed lightning to a much smaller population of candidate TGF-related flashes by looking for unique characteristics within the flashes. ASIM, which can observe TGFs, was used to validate the algorithm.

This study is important because instruments like ASIM observe only 300-400 TGFs per year, while LIS observed on average ~1 million lightning flashes per year. This difference of more than three orders of magnitude in frequency of occurrence means that data-reduction algorithms are necessary to facilitate studying the relationships between TGFs and lightning. In addition, a recent NASA field campaign demonstrated that TGF occurrence may be significantly higher than what can be measured from space, particularly within tropical thunderstorms. Thus, an algorithm that identifies possible characteristics of TGF-related lightning may later help us understand the differences between lightning associated with strong TGFs (observable from space) and lightning associated with weaker TGFs (not currently observable from space).

Read the paper at: https://link.springer.com/article/10.1007/s42865-024-00065-y.
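The summary above only sketches the idea of the data-reduction step, so here is a minimal, hypothetical illustration of how a large optical-lightning population could be filtered down to a small candidate set. The flash attributes, thresholds, and helper names (Flash, is_tgf_candidate, reduce_population) are assumptions for illustration only; the actual selection criteria used by Köhn et al. are described in the paper linked above.

# Hypothetical sketch of a data-reduction filter over optical lightning data.
# The attributes and thresholds below are placeholders, NOT the criteria used
# by Köhn et al.; they only illustrate reducing a large flash population to a
# much smaller set of candidate TGF-related flashes.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Flash:
    flash_id: int
    duration_ms: float     # optical flash duration
    footprint_km2: float   # illuminated area mapped to the ground
    peak_radiance: float   # peak optical radiance, detector units

def is_tgf_candidate(f: Flash,
                     max_duration_ms: float = 50.0,
                     max_footprint_km2: float = 100.0,
                     min_peak_radiance: float = 1e4) -> bool:
    """Keep flashes whose characteristics fall inside a (hypothetical) candidate box."""
    return (f.duration_ms <= max_duration_ms
            and f.footprint_km2 <= max_footprint_km2
            and f.peak_radiance >= min_peak_radiance)

def reduce_population(flashes: Iterable[Flash]) -> List[Flash]:
    """Reduce the full LIS flash list to the candidate subset."""
    return [f for f in flashes if is_tgf_candidate(f)]

if __name__ == "__main__":
    sample = [
        Flash(1, duration_ms=30.0, footprint_km2=40.0, peak_radiance=2e4),
        Flash(2, duration_ms=400.0, footprint_km2=900.0, peak_radiance=5e3),
    ]
    candidates = reduce_population(sample)
    print(f"{len(candidates)} of {len(sample)} flashes kept as TGF candidates")

In practice, the retained candidate set would then be compared against flashes that ASIM independently observed as TGF-producing, which is how the study validated its algorithm.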

ISS-LIS Integration Graphic
Graphic showing the integration of the Lightning Imaging Sensor with the International Space Station.



  • Similar Topics

    • By NASA
      The 2025 Spinoff publication features more than 40 commercial infusions of NASA technologies. Credit: NASA
      The work NASA conducts in space leads to ongoing innovations benefiting people on Earth. Some of these latest technologies, which have been successfully transferred from NASA to the commercial sector, are featured in the latest edition of NASA’s Spinoff 2025 publication, now available online.
      The publication features more than 40 commercial infusions of NASA technologies, including research originated at NASA’s Glenn Research Center in Cleveland. 
      Parallel Flight Technologies’ Firefly aircraft is designed to run for 100 minutes while fully loaded, allowing the aircraft to perform agricultural surveys as well as assist in the aftermath of natural disasters. Credit: Parallel Flight Technologies Inc.
      Bringing Hybrid Power to the Rescue
      A NASA-funded hybrid power system makes drones more capable in disasters. 
      With Small Business Innovation Research funding from NASA Glenn, Parallel Flight Technologies of La Selva Beach, California, was able to test its hybrid propulsion technology, enabling longer-running, remotely piloted aircraft for use in agricultural and rescue applications. See the full Spinoff article for more information.

      EnerVenue Inc. brought down the cost of nickel-hydrogen technology and encased it in safe, robust vessels, like the battery pictured here. These batteries store renewable energy in a wide range of terrestrial situations. Credit: EnerVenue Inc.
      Hubble Battery Tech Holds Power on Earth
      Nickel-hydrogen technology is safe, durable, and long-lasting – and now it’s affordable, too.
      Nickel-hydrogen batteries store renewable energy for power plants, businesses, and homes, thanks to innovations from Fremont, California-based EnerVenue, informed by papers published by NASA Glenn about the technology’s performance on the Hubble Space Telescope, International Space Station, and more. See the full Spinoff article for more information. 
      Spinoff 2025 also features 20 technologies available for licensing with the potential for commercialization. Check out the Spinoffs of Tomorrow section to learn more.
    • By NASA
      Tess Caswell, a stand-in crew member for the Artemis III Virtual Reality Mini-Simulation, executes a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. The simulation was a test of using VR as a training method for flight controllers and science teams’ collaboration on science-focused traverses on the lunar surface. Credit: NASA/Robert Markowitz
      When astronauts walk on the Moon, they’ll serve as the eyes, hands, and boots-on-the-ground interpreters supporting the broader teams of scientists on Earth. NASA is leveraging virtual reality to provide high-fidelity, cost-effective support to prepare crew members, flight control teams, and science teams for a return to the Moon through its Artemis campaign.
      The Artemis III Geology Team, led by principal investigator Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, participated in an Artemis III Surface Extra-Vehicular VR Mini-Simulation, or “sim,” at NASA’s Johnson Space Center in Houston in the fall of 2024. The sim brought together science teams and flight directors and controllers from Mission Control to carry out science-focused moonwalks and test the way the teams communicate with each other and the astronauts.
      “There are two worlds colliding,” said Dr. Matthew Miller, co-lead for the simulation and exploration engineer, Amentum/JETSII contract with NASA. “There is the operational world and the scientific world, and they are becoming one.”
      NASA mission training can include field tests covering areas from navigation and communication to astronaut physical and psychological workloads. Many of these tests take place in remote locations and can require up to a year to plan and large teams to execute. VR may provide an additional option for training that can be planned and executed more quickly to keep up with the demands of preparing to land on the Moon in an environment where time, budgets, and travel resources are limited.
      “VR helps us break down some of those limitations and allows us to do more immersive, high-fidelity training without having to go into the field. It provides us with a lot of different, and significantly more, training opportunities,” said Bri Sparks, NASA co-lead for the simulation and Extra Vehicular Activity Extended Reality team at Johnson.
      Field testing won’t be going away. Nothing can fully replace the experience crew members gain by being in an environment that puts literal rocks in their hands and includes the physical challenges that come with moonwalks, but VR has competitive advantages.
      The virtual environment used in the Artemis III VR Mini-Sim was built using actual lunar surface data from one of the Artemis III candidate regions. This allowed the science team to focus on Artemis III science objectives and traverse planning directly applicable to the Moon. Eddie Paddock, engineering VR technical discipline lead at NASA Johnson, and his team used data from NASA’s Lunar Reconnaissance Orbiter and planet position and velocity over time to develop a virtual software representation of a site within the Nobile Rim 1 region near the south pole of the Moon. Two stand-in crew members performed moonwalk traverses in virtual reality in the Prototype Immersive Technology lab at Johnson, and streamed suit-mounted virtual video camera views, hand-held virtual camera imagery, and audio to another location where flight controllers and science support teams simulated ground communications.
      A screen capture of a virtual reality view during the Artemis III VR Mini-Simulation. The lunar surface virtual environment was built using actual lunar surface data from one of the Artemis III candidate regions. Credit: Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston.
      The crew stand-ins were immersed in the lunar environment and could then share the experience with the science and flight control teams. That quick and direct feedback could prove critical to the science and flight control teams as they work to build cohesive teams despite very different approaches to their work.
      The flight operations team and the science team are learning how to work together and speak a shared language. Both teams are pivotal parts of the overall mission operations. The flight control team focuses on maintaining crew and vehicle safety and minimizing risk as much as possible. The science team, as Miller explains, is “relentlessly thirsty” for as much science as possible. Training sessions like this simulation allow the teams to hone their relationships and processes.
      Members of the Artemis III Geology Team and science support team work in a mock Science Evaluation Room during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Video feeds from the stand-in crew members’ VR headsets allow the science team to follow, assess, and direct moonwalks and science activities. Credit: NASA/Robert Markowitz
      Denevi described the flight control team as a “well-oiled machine” and praised their dedication to getting it right for the science team. Many members of the flight control team have participated in field and classroom training to learn more about geology and better understand the science objectives for Artemis.
      “They have invested a lot of their own effort into understanding the science background and science objectives, and the science team really appreciates that and wants to make sure they are also learning to operate in the best way we can to support the flight control team, because there’s a lot for us to learn as well,” Denevi said. “It’s a joy to get to share the science with them and have them be excited to help us implement it all.”
      Artemis III Geology Team lead Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, left, Artemis III Geology Team member Dr. Jose Hurtado of the University of Texas at El Paso, and simulation co-lead Bri Sparks work together during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      This simulation, Sparks said, was just the beginning for how virtual reality could supplement training opportunities for Artemis science. In the future, using mixed reality could help take the experience to the next level, allowing crew members to be fully immersed in the virtual environment while interacting with real objects they can hold in their hands. Now that the Nobile Rim 1 landing site is built in VR, it can continue to be improved and used for crew training, something that Sparks said can’t be done with field training on Earth.
      While “virtual” was part of the title for this exercise, its applications are very real.
      “We are uncovering a lot of things that people probably had in the back of their head as something we’d need to deal with in the future,” Miller said. “But guess what? The future is now. This is now.”
      Test subject crew members for the Artemis III Virtual Reality Mini-Simulation, including Grier Wilt, left, and Tess Caswell, center, execute a moonwalk in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Grier Wilt, left, and Tess Caswell, crew stand-ins for the Artemis III Virtual Reality Mini-Simulation, execute a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Engineering VR technical discipline lead Eddie Paddock works with team members to facilitate the virtual reality components of the Artemis III Virtual Reality Mini-Simulation in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: Robert Markowitz
      Flight director Paul Konyha follows moonwalk activities during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz




      Rachel Barry
      NASA’s Johnson Space Center
    • By NASA
      Ultra-low-noise Infrared Detectors for Exoplanet Imaging
      A linear-mode avalanche photodiode array in the test dewar. The detector is the dark square in the center. Credit: Michael Bottom, University of Hawai’i
      One of the ultimate goals in astrophysics is the discovery of Earth-like planets that are capable of hosting life. While thousands of planets have been discovered around other stars, the vast majority of these detections have been made via indirect methods, that is, by detecting the effect of the planet on the star’s light, rather than detecting the planet’s light directly. For example, when a planet passes in front of its host star, the brightness of the star decreases slightly.
      However, indirect methods do not allow for characterization of the planet itself, including its temperature, pressure, gravity, and atmospheric composition. Planetary atmospheres may include “biosignature” gases like oxygen, water vapor, carbon dioxide, etc., which are known to be key ingredients needed to support life as we know it. As such, direct imaging of a planet and characterization of its atmosphere are key to understanding its potential habitability.
      But the technical challenges involved in imaging Earth-like extrasolar planets are extreme. First, such planets are detected only by observing light they reflect from their parent star, and so they typically appear fainter than the stars they orbit by factors of about 10 billion. Furthermore, at the cosmic distances involved, the planets appear right next to the stars. A popular expression is that exoplanet imaging is like trying to detect a firefly three feet from a searchlight from a distance of 300 miles.
      Tremendous effort has gone into developing starlight suppression technologies to block the bright glare of the star, but detecting the light of the planet is challenging in its own right, as planets are incredibly faint. One way to quantify the faintness of planetary light is to understand the photon flux rate. A photon is an indivisible particle of light, that is, the minimum detectable amount of light. On a sunny day, approximately 10 thousand trillion photons enter your eye every second. The rate of photons entering your eye from an Earth-like exoplanet around a nearby star would be around 10 to 100 per year. Telescopes with large mirrors can help collect as much of this light as possible, but ultra-sensitive detectors are also needed, particularly for infrared light, where the biosignature gases have their strongest effects. Unfortunately, state-of-the-art infrared detectors are far too noisy to detect the low level of light emitted from exoplanets.
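      As a rough arithmetic check on the rates quoted above, the short sketch below compares the sunny-day photon rate with the optimistic end of the quoted exoplanet range; the seconds-per-year constant is the only value not taken from the paragraph.

      # Back-of-envelope comparison of the photon rates quoted above (naked eye).
      SECONDS_PER_YEAR = 3.15e7                 # standard approximation, not from the article

      sunlight_photons_per_s = 1e16             # "10 thousand trillion" photons per second
      exoplanet_photons_per_yr = 100.0          # optimistic end of the 10-100 per year range

      exoplanet_photons_per_s = exoplanet_photons_per_yr / SECONDS_PER_YEAR
      ratio = sunlight_photons_per_s / exoplanet_photons_per_s

      print(f"Exoplanet photon rate: {exoplanet_photons_per_s:.2e} photons/s")
      print(f"Sunlight is roughly {ratio:.1e} times brighter at the eye")  # ~3e21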
      With support from NASA’s Astrophysics Division and industrial partners, researchers at the University of Hawai’i are developing a promising detector technology to meet these stringent sensitivity requirements. These detectors, known as avalanche photodiode arrays, are constructed out of the same semiconductor material as conventional infrared sensors. However, these new sensors employ an extra “avalanche” layer that takes the signal from a single photon and multiplies it, much like an avalanche can start with a single snowball and quickly grow it to the size of a boulder. This signal amplification occurs before any noise from the detector is introduced, so the effective noise is proportionally reduced. However, at high avalanche levels, photodiodes start to behave badly, with noise exponentially increasing, which negates any benefits of the signal amplification. Late University of Hawai’i faculty member Donald Hall, who was a key figure in driving technology for infrared astronomy, realized the potential use of avalanche photodiodes for ultra-low-noise infrared astronomy with some modifications to the material properties.
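      The noise benefit described above can be captured in one line: noise added after the avalanche stage is divided by the gain when referred back to the input signal. The numbers in this small sketch are placeholders, not measured values for the Hawai’i sensors, and it deliberately ignores the excess noise the paragraph notes at very high gains.

      # Illustrative input-referred read noise for an avalanche photodiode pixel.
      # Placeholder numbers only; ignores the excess noise that appears when the
      # avalanche gain is pushed too high.
      def effective_read_noise(read_noise_e: float, avalanche_gain: float) -> float:
          """Downstream read noise (electrons) referred back to the input signal."""
          return read_noise_e / avalanche_gain

      for gain in (1, 5, 20, 100):
          print(f"gain={gain:>3}: effective read noise = {effective_read_noise(12.0, gain):.2f} e-")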
      University of Hawai’i team members with the cryogenic dewar used to test the sensors. From left to right: Angelu Ramos, Michael Bottom, Shane Jacobson, Charles-Antoine Claveau. Credit: Michael Bottom, University of Hawai’i
      The most recent sensors benefit from a new design including a graded semiconductor bandgap that allows for excellent noise performance at moderate amplification, a mesa pixel geometry to reduce electronic crosstalk, and a read-out integrated circuit to allow for short readout times. “It was actually challenging figuring out just how sensitive these detectors are,” said Michael Bottom, associate professor at the University of Hawai’i and lead of the development effort. “Our ‘light-tight’ test chamber, which was designed to evaluate the infrared sensors on the James Webb Space Telescope, was supposed to be completely dark. But when we put these avalanche photodiodes in the chamber, we started seeing light leaks at the level of a photon an hour, which you would never be able to detect using the previous generation of sensors.”
      The new designs have a format of one megapixel, more than ten times larger than the previous iteration of sensors, and circuitry that allows for tracking and subtracting any electronic drifts. Additionally, the pixel size and control electronics are such that these new sensors could be drop-in replacements for the most common infrared sensors used on the ground, which would give new capabilities to existing instruments.
      Image of the Palomar-2 globular cluster, located in the constellation Auriga, taken with the linear-mode avalanche photodiode arrays during the first on-sky testing of the sensors on the University of Hawai’i’s 2.2-meter telescope. Credit: Michael Bottom, University of Hawai’i
      Last year, the team took the first on-sky images from the detectors, using the University of Hawai’i’s 2.2-meter telescope. “It was impressive to see the avalanche process on sky. When we turned up the gain, we could see more stars appear,” said Guillaume Huber, a graduate student working on the project. “The on-sky demonstration was important to prove the detectors could perform well in an operational environment,” added Michael Bottom.
      According to the research team, while the current sensors are a major step forward, the megapixel format is still too small for many science applications, particularly those involving spectroscopy. Further tasks include improving detector uniformity and decreasing persistence. The next generation of sensors will be four times larger, meeting the size requirements for the Habitable Worlds Observatory, NASA’s next envisioned flagship mission, with the goals of imaging and characterizing Earth-like exoplanets.
      Project Lead: Dr. Michael Bottom, University of Hawai’i
      Sponsoring Organization:  NASA Strategic Astrophysics Technology (SAT) Program
    • By Space Force
      The Department of the Air Force released the memorandum “DEI and Gender Ideology Publication Review.”
    • By NASA
      More Than 400 Lives Saved with NASA’s Search and Rescue Tech in 2024
      NASA Artemis II crew members are assisted by U.S. Navy personnel as they exit a mockup of the Orion spacecraft in the Pacific Ocean during Underway Recovery Test 11 (URT-11) on Feb. 25, 2024. Credit: NASA/Kenny Allen
      NASA’s Search and Rescue technologies enabled hundreds of lives saved in 2024. Credit: NASA/Dave Ryan
      Did you know that the same search and rescue technologies developed by NASA for astronaut missions to space help locate and rescue people across the United States and around the world?
      NASA’s collaboration with the international satellite-aided search and rescue effort known as Cospas-Sarsat has enabled the development of multiple emergency location beacons for explorers on land, sea, and air. 
      Of the 407 lives saved in 2024 through search and rescue efforts in the United States, NOAA (National Oceanic and Atmospheric Administration) reports that 52 rescues were the result of activated personal locator beacons, 314 from emergency position-indicating radio beacons, and 41 from emergency locator transmitters. Since 1982, more than 50,000 lives have been saved across the world. 
      Using GPS satellites, these beacons transmit their location to the Cospas-Sarsat network once activated. The beacons then provide the activation coordinates to the network, allowing first responders to rescue lost or distressed explorers.  
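      As a quick arithmetic check, the beacon counts NOAA reports for 2024 do add up to the 407 total quoted above:

      # Sanity check: the three 2024 beacon categories sum to the stated total.
      rescues_2024 = {
          "personal locator beacons": 52,
          "emergency position-indicating radio beacons": 314,
          "emergency locator transmitters": 41,
      }
      total = sum(rescues_2024.values())
      print(f"Total U.S. rescues in 2024: {total}")  # 407, matching the NOAA figure
      assert total == 407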
      NASA Artemis II crew members are assisted by U.S. Navy personnel as they exit a mockup of the Orion spacecraft in the Pacific Ocean during Underway Recovery Test 11 (URT-11) on Feb. 25, 2024, while crewmates look on. URT-11 is the eleventh in a series of Artemis recovery tests, and the first time NASA and its partners put their Artemis II recovery procedures to the test with the astronauts. Credit: NASA/Kenny Allen
      The Search and Rescue Office, part of NASA’s SCaN (Space Communications and Navigation) Program, has assisted in search and rescue services since its formation in 1979. Now, the office is building on its long legacy of Earth-based beacon development to support crewed missions to space.
      The beacons also are used for emergency location, if needed, as part of NASA’s crew launches to and from the International Space Station, and will support NASA’s Artemis campaign crew recovery preparations during future missions returning from deep space. Systems being tested, like the ANGEL (Advanced Next-Generation Emergency Locator) beacon, are benefitting life on Earth and missions to the Moon and Mars. Most recently, NASA partnered with the Department of Defense to practice Artemis II recovery procedures – including ANGEL beacon activation – during URT-11 (Underway Recovery Test 11).  
      Miniaturized Advanced Next-Generation Emergency Locator (ANGEL) beacons will be attached to the astronauts’ life preserver units. When astronauts Reid Wiseman, Victor Glover, Christina Koch, and CSA (Canadian Space Agency) astronaut Jeremy Hansen splash back down to Earth — or in the unlikely event of a launch abort scenario — these beacons will allow them to be found if they need to egress from the Orion capsule. Credit: NASA
      The SCaN program at NASA Headquarters in Washington provides strategic oversight to the Search and Rescue office. NOAA manages the U.S. network region for Cospas-Sarsat, which relies on flight and ground technologies originally developed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. U.S. region rescue efforts are led by the U.S. Coast Guard, U.S. Air Force, and many other local rescue authorities.

      About the Author
      Kendall Murphy
      Technical Writer
      Kendall Murphy is a technical writer for the Space Communications and Navigation program office. She specializes in internal and external engagement, educating readers about space communications and navigation technology.