Tech Today: Remote Sensing Technology Fights Forest Fires



3 min read


NASA’s Ikhana Unmanned Aircraft System (UAS).
NASA used its remotely piloted Ikhana aircraft to test technology it helped develop or recommended to the U.S. Forest Service, including a system to send sensor data to decision makers on the ground in near real time.
Credit: NASA

It’s not easy to predict the path of a forest fire; much depends on constantly changing factors like wind. But accuracy is crucial, because the lives, homes, and businesses of the tens of thousands of people living and working in fire-prone areas depend on the reliability of these predictions. Sensors mounted on airplanes or drones that provide a picture of the fire from above are an important tool, and that’s where NASA comes in.

In partnership with the U.S. Forest Service, local and state firefighting agencies, and the Bureau of Land Management, NASA plays a pivotal role in battling infernos. The agency’s extensive experience and technical expertise in remote sensing technology have significantly improved the speed and accuracy of information relayed to firefighting decision-makers.

According to Don Sullivan, then an information technology designer at NASA’s Ames Research Center in Silicon Valley, California, the center’s Airborne Science Program was integral to that effort.

In the 1990s, NASA began a project to adapt uncrewed aircraft for environmental research. The researchers at Ames wanted to ensure the technology would be useful to the broadest possible spectrum of potential end users. One concept tested during the project was sending data in real time to the ground via communications links installed on the aircraft.

That link sent data faster and to multiple recipients at once—not just the team on the fire front line, but also the commanders organizing the teams and decision makers looking at the big picture across the entire region throughout the fire season, explained Sullivan.
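The one-to-many delivery Sullivan describes is essentially a publish/subscribe pattern: one transmission fans out to every recipient at once. The sketch below illustrates the idea with in-process callbacks standing in for the aircraft's radio links; the names (`ThermalReading`, `Downlink`) and data are illustrative, not part of NASA's actual system.

```python
# Minimal publish/subscribe sketch of a one-to-many sensor downlink.
# Callbacks stand in for the separate radio/satellite links to each recipient.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ThermalReading:
    lat: float
    lon: float
    temp_c: float

class Downlink:
    def __init__(self) -> None:
        self._subscribers: List[Callable[[ThermalReading], None]] = []

    def subscribe(self, handler: Callable[[ThermalReading], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, reading: ThermalReading) -> None:
        # One transmission reaches every recipient at once.
        for handler in self._subscribers:
            handler(reading)

received = []
link = Downlink()
link.subscribe(lambda r: received.append(("fire-line crew", r)))
link.subscribe(lambda r: received.append(("incident command", r)))
link.subscribe(lambda r: received.append(("regional coordinator", r)))
link.publish(ThermalReading(39.76, -121.62, 412.0))
print(len(received))  # 3: each recipient gets the same reading
```

The point of the pattern is that adding a new consumer (say, a seasonal planning office) requires only another subscription, not another physical drop of the data.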

For the Forest Service, this was a much-needed upgrade over the original delivery method on its crewed jets: rolling thermal sensor printouts (and later, thumb drives) into a plastic tube attached to a parachute and dropping it out of the airplane. NASA’s remotely piloted aircraft, called Ikhana, tested the technology, and the agency still uses it to collect data on wildfires.

Since the introduction of this technology, wildfires have grown bigger, burned hotter, and set new records year after year. But in California in 2008, this technology helped fight what was then the worst fire season on record. A NASA test flight using a data downlink system gave incident managers updated information that was crucial in determining where to send firefighting resources and whether a full evacuation of the town of Paradise was needed.

Without that timely information, said Sullivan, “there likely would have been injuries and certainly property damage that was worse than it turned out to be.”

Details

Last Updated
Jul 31, 2024



  • Similar Topics

    • By Amazing Space
      The Sun Today - 1st April - Close Up View.
    • By NASA
      2 min read
      A NASA F/A-18 research aircraft flies above California near NASA’s Armstrong Flight Research Center in Edwards, California, testing a commercial precision landing technology for future space missions. The Psionic Space Navigation Doppler Lidar (PSNDL) system is installed in a pod located under the right wing of the aircraft. Credit: NASA
      Nestled in a pod under an F/A-18 Hornet aircraft wing, flying above California, and traveling up to the speed of sound, NASA put a commercial sensor technology to the test. The flight tests demonstrated the sensor’s accuracy and navigation precision in challenging conditions, helping prepare the technology to land robots and astronauts on the Moon and Mars.
      The Psionic Space Navigation Doppler Lidar (PSNDL) system is rooted in NASA technology that Psionic, Inc. of Hampton, Virginia, licensed and further developed. They miniaturized the NASA technology, added further functionality, and incorporated component redundancies that make it more rugged for spaceflight. The PSNDL navigation system also includes cameras and an inertial measurement unit to make it a complete navigation system capable of accurately determining a vehicle’s position and velocity for precision landing and other spaceflight applications. 
      NASA engineers and technicians install the Psionic Space Navigation Doppler Lidar (PSNDL) system into a testing pod on a NASA F/A-18 research aircraft ahead of February 2025 flight tests at NASA’s Armstrong Flight Research Center in Edwards, California. Credit: NASA
      The aircraft departed from NASA’s Armstrong Flight Research Center in Edwards, California, and conducted a variety of flight paths over several days in February 2025. It flew a large figure-8 loop and conducted several highly dynamic maneuvers over Death Valley, California, to collect navigation data at various altitudes, velocities, and orientations relevant for lunar and Mars entry and descent. Refurbished for these tests, the NASA F/A-18 pod can support critical data collection for other technologies and users at a low cost.
      Doppler lidar sensors provide a highly accurate measurement of speed by measuring the frequency shift between the laser light emitted from the sensor and the light reflected from the ground. Lidar is extremely useful in sunlight-challenged areas that may have long shadows and stark contrasts, such as the lunar South Pole. Pairing PSNDL with cameras adds the ability to visually compare pictures with surface reconnaissance maps of rocky terrain and navigate to a landing at interesting locations on Mars. All the data is fed into a computer that makes quick, real-time decisions to enable precise touchdowns at safe locations.
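The physics behind the measurement is compact: light reflected from a surface moving at line-of-sight speed v returns with a Doppler shift f_d = 2v/λ, so v = f_d · λ / 2. The snippet below works through that relation; the 1.55 µm wavelength is an assumption (a common fiber-laser lidar band), not a published PSNDL parameter.

```python
# Doppler lidar velocity estimate: the reflected beam returns with a
# frequency shift f_d = 2 * v / wavelength, so v = f_d * wavelength / 2.
WAVELENGTH_M = 1.55e-6  # assumed: a common fiber-laser lidar wavelength

def velocity_from_doppler_shift(shift_hz: float,
                                wavelength_m: float = WAVELENGTH_M) -> float:
    """Line-of-sight speed (m/s) from the measured Doppler shift (Hz)."""
    return shift_hz * wavelength_m / 2.0

# Example: a ~129 MHz shift at 1.55 um corresponds to ~100 m/s
# of line-of-sight speed.
print(velocity_from_doppler_shift(129.03e6))
```

Because the shift is proportional to speed, even modest descent rates produce shifts of many megahertz, which is why the technique yields such precise velocity measurements.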
      Psionic Space Navigation Doppler Lidar (PSNDL) system installed in a testing pod on a NASA F/A-18 research aircraft ahead of February 2025 flight tests at NASA’s Armstrong Flight Research Center in Edwards, California. Credit: NASA
      Since licensing NDL in 2016, Psionic has received funding and development support from NASA’s Space Technology Mission Directorate through its Small Business Innovative Research program and Tipping Point initiative. The company has also tested PSNDL prototypes on suborbital vehicles via the Flight Opportunities program. In 2024, onboard a commercial lunar lander, NASA successfully demonstrated the predecessor NDL system developed by the agency’s Langley Research Center in Hampton, Virginia.
    • By NASA
      4 min read
      Sols 4484-4485: Remote Sensing on a Monday
      NASA’s Mars rover Curiosity acquired this image using its Left Navigation Camera on March 17, 2025 — sol 4483, or Martian day 4,483 of the Mars Science Laboratory mission — at 09:38:17 UTC. Credit: NASA/JPL-Caltech
      Written by Conor Hayes, Graduate Student at York University
      Earth planning date: Monday, March 17, 2025
      Last week I was in Houston, Texas, at the Lunar and Planetary Science Conference. The mid-March weather in Houston is often more like mid-summer weather here in Toronto, so it has been a bit of a shock coming home to temperatures that are hovering around freezing rather than being in the upper 20s (degrees Celsius, or the low to mid 80s for those of you still using Fahrenheit). Still, Toronto is positively balmy compared to Gale Crater, where temperatures usually range between minus 80°C and minus 20°C (or minus 110°F to minus 5°F) during this part of the year. These cold temperatures and their associated higher demands on the rover’s available power for heating are continuing to motivate many of the decisions that we make during planning.
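The temperature ranges above convert with the usual F = C × 9/5 + 32 formula; a quick check shows the blog's Fahrenheit figures are rounded. (The function name here is just for illustration.)

```python
# Verify the Celsius-to-Fahrenheit figures quoted for Gale Crater.
def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9.0 / 5.0 + 32.0

print(c_to_f(-80))  # -112.0 F (quoted as about minus 110 F)
print(c_to_f(-20))  # -4.0 F (quoted as about minus 5 F)
```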
      We received the double good news this morning that the weekend’s drive completed successfully, including the mid-drive imaging of the other side of “Humber Park” that Michelle mentioned in Friday’s blog, and that our estimates of the weekend plan’s power consumption ended up being a little conservative. So we started planning exactly where we wanted to be, and with more power to play around with than we had expected. Yay!
      The weekend’s drive left us parked in front of some rocks with excellent layering and interesting ripples that we really wanted to get a closer look at with MAHLI. (See the cover image for a look at these rocks as seen by Navcam.) Sadly, we also ended up parked in such a way that presented a slip hazard if the arm was unstowed. As much as we would have loved to get close-up images of these rocks, we love keeping Curiosity’s arm safe even more, so we had to settle for a remote sensing-only plan instead.
      Both the geology and mineralogy (GEO) and the environmental science (ENV) teams took full advantage of the extra power gifted to us today to create a plan packed full of remote sensing observations. Because we’re driving on the first sol of this two-sol plan, any “targeted” observations, i.e. those where we know exactly where we want to point the rover’s cameras, must take place before the drive. The first sol is thus packed full of Mastcam and ChemCam observations, starting with a 14×3 Mastcam mosaic of the area in front of us that’s outside of today’s workspace. Individual targets then get some Mastcam love with mosaics of various ripple and layering features at “Verdugo Peak,” “Silver Moccasin Trail,” and “Jones Peak.” Mastcam and ChemCam also team up on a LIBS target, “Trancas Canyon,” and some more long-distance mosaics of Gould Mesa, a feature about 100 meters away from us (about 328 feet) that we’ll be driving to the south of as we continue to head toward the “boxwork” structures.
      After a drive, there often aren’t many activities scheduled other than the imaging of our new location that we’ll need for the next planning day. However, in this plan ENV decided to take advantage of the fact that Navcam observations can take place at the same time that the rover is talking to one of the spacecraft that orbit Mars. This is a useful trick when power is tight as it allows us to do more science without adding additional awake time (since the rover needs to be awake anyway to communicate with the orbiters). Today, it’s being used to get some extra cloud observations right before sunset, a time that we don’t often get to observe. These observations include a zenith movie that looks straight up over the rover and a “phase function sky survey,” which takes a series of nine movies that form a dome around the rover to examine the properties of the clouds’ ice crystals. 
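The power-saving trick described above, piggybacking observations on windows when the rover must be awake anyway, can be illustrated with a little arithmetic; the durations below are made up for the example, not actual Curiosity figures.

```python
# Sketch of the awake-time savings from overlapping a science observation
# with an orbiter communication pass. Durations are illustrative only.
comm_window_min = 20   # rover must be awake to talk to the orbiter
cloud_movie_min = 15   # Navcam observation we want to squeeze in

# Scheduled separately, the observation adds its full duration of awake time.
separate_awake = comm_window_min + cloud_movie_min

# Overlapped with the comm pass, it adds no extra awake time at all
# (as long as it fits inside the window).
overlapped_awake = max(comm_window_min, cloud_movie_min)

print(separate_awake, overlapped_awake)  # 35 20
```

Since heater demand already dominates the power budget in this season, shaving awake minutes this way translates directly into more science per sol.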
      The second sol of this plan is much more relaxed, as post-drive sols often are because we don’t know exactly where we’ll be after a drive. Today, we’ve just got our usual ChemCam AEGIS activity, followed by a pair of Navcam cloud and cloud shadow movies to measure the altitude of clouds over Gale. As always, we’ve also got our usual set of REMS, RAD, and DAN activities throughout this plan.
    • By Space Force
      DAF guidance on Return to In-Person Work for the purpose of creating a more capable and lethal force.
    • By NASA
      As part of NASA’s Advanced Capabilities for Emergency Response Operations flight tests in November 2024, Overwatch Aero flies a vertical takeoff and landing aircraft in Watsonville, California. Credit: NASA
      NASA will conduct a live flight test of aircraft performing simulated wildland fire response operations using a newly developed airspace management system at 9 a.m. PDT on Tuesday, March 25, in Salinas, California.
      NASA’s new portable airspace management system, part of the agency’s Advanced Capabilities for Emergency Response Operations (ACERO) project, aims to significantly expand the window of time crews have to respond to wildland fires. The system provides the air traffic awareness needed to safely send aircraft – including drones and remotely piloted helicopters – into wildland fire operations, even during low-visibility conditions. Current aerial firefighting operations are limited to times when pilots have clear visibility, which lowers the risk of flying into the surrounding terrain or colliding with other aircraft. This restriction grounds most aircraft at night and during periods of heavy smoke.
      During this inaugural flight test, researchers will use the airspace management system to coordinate the flight operations of two small drones, an electric vertical takeoff and landing aircraft, and a remotely piloted aircraft that will have a backup pilot aboard. The drones and aircraft will execute examples of critical tasks for wildland fire management, including weather data sharing, simulated aerial ignition flights, and communications relay.
      Media interested in viewing the ACERO flight testing must RSVP by 4 p.m. Friday, March 21, to the NASA Ames Office of Communications by email at: arc-dl-newsroom@mail.nasa.gov or by phone at 650-604-4789. NASA will release additional details, including address and arrival logistics, to media credentialed for the event. A copy of NASA’s media accreditation policy is online.
      NASA’s ACERO researchers will use data from the flight test to refine the airspace management system. The project aims to eventually provide this technology to wildland fire crews for use in the field, helping to save lives and property. This project is managed at NASA’s Ames Research Center in California’s Silicon Valley.
      For more information on ACERO, visit:
      https://go.nasa.gov/4bYEzsD
      -end-
      Rob Margetta
      Headquarters, Washington
      202-358-1600
      robert.j.margetta@nasa.gov
      Hillary Smith
      Ames Research Center, Silicon Valley
      650-604-4789
      hillary.smith@nasa.gov