
Tech Today: Remote Sensing Technology Fights Forest Fires


NASA


3 min read

Preparations for Next Moonwalk Simulations Underway (and Underwater)

NASA’s Ikhana Unmanned Aircraft System (UAS).
NASA used its remotely piloted Ikhana aircraft to test technology it helped develop or recommended to the U.S. Forest Service, including a system to send sensor data to decision makers on the ground in near real time.
Credit: NASA

It’s not easy to predict the path of forest fires—a lot depends on constantly changing factors like wind. But it is crucial to be as accurate as possible because the lives, homes, and businesses of the tens of thousands of people living and working in fire-prone areas depend on the reliability of these predictions. Sensors mounted on airplanes or drones that provide a picture of the fire from above are an important tool, and that’s where NASA comes in. 

In partnership with the U.S. Forest Service, local and state firefighting agencies, and the Bureau of Land Management, NASA plays a pivotal role in battling infernos. The agency’s extensive experience and technical expertise in remote sensing technology have significantly improved the speed and accuracy of information relayed to firefighting decision-makers.

According to Don Sullivan, who at the time specialized in information technology design for the Airborne Science Program at NASA’s Ames Research Center in Silicon Valley, California, that program was integral to the effort.

In the 1990s, NASA began a project to adapt uncrewed aircraft for environmental research. The researchers at Ames wanted to ensure the technology would be useful to the broadest possible spectrum of potential end users. One concept tested during the project was sending data to the ground in real time via communications links installed on the aircraft.

That link sent data faster and to multiple recipients at once—not just the team on the fire front line, but also the commanders organizing the teams and decision makers looking at the big picture across the entire region throughout the fire season, explained Sullivan.

For the Forest Service, this was a much-needed upgrade over the original system on its crewed jets: rolling up a printout of thermal sensor data (and, later, packing thumb drives), placing it into a plastic tube attached to a parachute, and dropping it out of the airplane. NASA’s remotely piloted aircraft, called Ikhana, tested the technology, and the agency still uses it to collect data on wildfires.

Since the introduction of this technology, wildfires have grown bigger, burned hotter, and set new records every year. But in California in 2008, this technology helped fight what was then the worst fire season on record. A NASA test flight using a data downlink system gave incident managers updated information that proved crucial in determining where to send firefighting resources and whether a full evacuation of the town of Paradise was needed.

Without that timely information, said Sullivan, “there likely would have been injuries and certainly property damage that was worse than it turned out to be.”


Details

Last Updated
Jul 31, 2024


  • Similar Topics

    • By USH
      In the remote wilderness of the Shoria Mountains in southern Siberia, a long-hidden secret has remained untouched for millennia. Far from the reach of modern civilization, a discovery was made that would challenge our understanding of ancient human history. 

      In 2013, a team of 19 researchers, led by Georgy Sidorov, embarked on an expedition to explore this mysterious region. Their destination was Gora Shoria, a mountain towering 3,600 feet above sea level in a remote part of Russia. Intrigued by reports of strange megalithic structures, the team ventured into this secluded terrain. 
      What they found was extraordinary: an immense super-megalith dating back roughly 100,000 years that defied conventional history. These massive stone blocks, later known as the Gornaya Shoria Megaliths, appeared to be made of granite, featuring flat surfaces and precise right angles. The most astounding detail was the weight of the stones, exceeding 3,000 tons—making them the largest megaliths ever discovered. 
      The arrangement of these granite blocks suggested a deliberate design, far beyond what could be explained by natural formations. The blocks were carefully stacked, reaching a height of approximately 140 feet. This raised profound questions: how were such massive stones carved, transported, and assembled in this remote and rugged landscape? 
      Some researchers have speculated about the existence of a pre-flood civilization, a sophisticated society wiped out by a cataclysmic event. 
A deep, narrow vertical shaft was also uncovered. The shaft, lined with parallel stone slabs, appeared to be human-made. 
The walls of the shaft were straight and polished, descending 40 meters (around 130 feet) before opening into a vast underground hall, 36 meters (around 118 feet) high. These walls were constructed from large megalithic blocks, perfectly fitted with minimal gaps. Some of the stones resembled columns, reinforcing the idea of deliberate design. The full explored length of the shaft spanned over 100 meters (approximately 330 feet). 
      The precision and scale of this structure left no doubt that it was an artificial creation of immense proportions. The polished walls and massive blocks bore a striking resemblance to the shafts within the Great Pyramid of Khufu in Egypt, suggesting a level of architectural sophistication that defies conventional explanations.  
      Speculation abounds regarding the shaft’s original purpose. Some believe it served an advanced technological function or was part of a larger, undiscovered structure. The exploration team took over an hour to reach the bottom of the shaft, which required significant climbing expertise and endurance. It is believed that additional chambers and channels, still unexplored, may lie even deeper underground. 
      How could these gigantic 200-ton stone blocks have been assembled with such accuracy, deep underground? What kind of technology was used to construct the shaft and underground chamber?  
      Some researchers have speculated that it may have been part of an ancient factory, a seismological research device, or even an energy generator. Others believe it was the underground portion of a long-lost pyramid that once stood on the surface of the mountain. 
Despite differing theories, one question remains: what ancient forces or lost civilizations left their mark on this remote corner of the world?
    • By NASA
      4 min read
An astronaut aboard the International Space Station photographed wildfire smoke from Nova Scotia billowing over the Atlantic Ocean in May 2023. Warm weather and lack of rain fueled blazes across Canada last year, burning 5% of the country’s forests. Credit: NASA
Extreme wildfires like these will continue to have a large impact on global climate.
      Stoked by Canada’s warmest and driest conditions in decades, extreme forest fires in 2023 released about 640 million metric tons of carbon, NASA scientists have found. That’s comparable in magnitude to the annual fossil fuel emissions of a large industrialized nation. NASA funded the study as part of its ongoing mission to understand our changing planet.
      The research team used satellite observations and advanced computing to quantify the carbon emissions of the fires, which burned an area roughly the size of North Dakota from May to September 2023. The new study, published on Aug. 28 in the journal Nature, was led by scientists at NASA’s Jet Propulsion Laboratory in Southern California.
Carbon monoxide from Canada wildfires curls thousands of miles across North America in this animation showing data from summer 2023. Lower concentrations are shown in purple; higher concentrations are in yellow. Red triangles indicate fire hotspots. Credit: NASA’s Goddard Space Flight Center
They found that the Canadian fires released more carbon in five months than Russia or Japan emitted from fossil fuels in all of 2022 (about 480 million and 291 million metric tons, respectively). While the carbon dioxide (CO2) emitted from both wildfires and fossil fuel combustion causes extra warming immediately, there’s an important distinction, the scientists noted. As the forest regrows, the carbon emitted by the fires will be reabsorbed by Earth’s ecosystems. The CO2 emitted from the burning of fossil fuels is not readily offset by any natural processes.
An ESA (European Space Agency) instrument designed to measure air pollution observed the fire plumes over Canada. The TROPOspheric Monitoring Instrument, or TROPOMI, flies aboard the Sentinel-5P satellite, which has been orbiting Earth since 2017. TROPOMI has four spectrometers that measure and map trace gases and fine particles (aerosols) in the atmosphere.
      The scientists started with the end result of the fires: the amount of carbon monoxide (CO) in the atmosphere during the fire season. Then they “back-calculated” how large the emissions must have been to produce that amount of CO. They were able to estimate how much CO2 was released based on ratios between the two gases in the fire plumes.  
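As a rough illustration of that back-calculation, the final step amounts to scaling the estimated CO emissions by the CO2-to-CO ratio observed in the plumes. The sketch below uses hypothetical placeholder numbers, not values from the study:

```python
# Illustrative sketch of the plume-ratio scaling step described above.
# Both inputs are hypothetical placeholders, not values from the study.

def co2_from_co(co_emitted_mt: float, co2_to_co_mass_ratio: float) -> float:
    """Estimate CO2 emissions (megatons) from estimated CO emissions
    (megatons) using the CO2:CO mass ratio measured in the fire plumes."""
    return co_emitted_mt * co2_to_co_mass_ratio

# e.g. 60 Mt of CO and an assumed plume CO2:CO mass ratio of 10
print(co2_from_co(60.0, 10.0))  # 600.0 (Mt of CO2)
```

The hard part of the real analysis is the first step, inferring total CO emissions from satellite CO observations; the ratio scaling shown here is the comparatively simple final conversion.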
      “What we found was that the fire emissions were bigger than anything in the record for Canada,” said Brendan Byrne, a JPL scientist and lead author of the new study. “We wanted to understand why.”
      Warmest Conditions Since at Least 1980
      Wildfire is essential to the health of forests, clearing undergrowth and brush and making way for new plant life. In recent decades, however, the number, severity, and overall size of wildfires have increased, according to the U.S. Department of Agriculture. Contributing factors include extended drought, past fire management strategies, invasive species, and the spread of residential communities into formerly less developed areas.
      To explain why Canada’s fire season was so intense in 2023, the authors of the new study cited tinderbox conditions across its forests. Climate data revealed the warmest and driest fire season since at least 1980. Temperatures in the northwest part of the country — where 61% of fire emissions occurred — were more than 4.5 degrees Fahrenheit (2.6 degrees Celsius) above average from May through September. Precipitation was also more than 3 inches (8 centimeters) below average for much of the year.
      Driven in large part by these conditions, many of the fires grew to enormous sizes. The fires were also unusually widespread, charring some 18 million hectares of forest from British Columbia in the west to Quebec and the Atlantic provinces in the east. The area of land that burned was more than eight times the 40-year average and accounted for 5% of Canadian forests.
      “Some climate models project that the temperatures we experienced last year will become the norm by the 2050s,” Byrne said. “The warming, coupled with lack of moisture, is likely to trigger fire activity in the future.”
      If events like the 2023 Canadian forest fires become more typical, they could impact global climate. That’s because Canada’s vast forests compose one of the planet’s important carbon sinks, meaning that they absorb more CO2 from the atmosphere than they release. The scientists said that it remains to be seen whether Canadian forests will continue to absorb carbon at a rapid rate or whether increasing fire activity could offset some of the uptake, diminishing the forests’ capacity to forestall climate warming.
      News Media Contacts
      Jane J. Lee / Andrew Wang
      Jet Propulsion Laboratory, Pasadena, Calif.
      818-354-0307 / 626-379-6874
      jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov
      Written by Sally Younger
      2024-113
Details
Last Updated Aug 28, 2024
    • By NASA
      2 min read
Engineer Adam Gannon works on the development of Cognitive Engine-1 in the Cognitive Communications Lab at NASA’s Glenn Research Center. Credit: NASA
Automated technology developed in Cleveland has launched to space aboard the Technology Education Satellite 11 mission. The flight test aims to confirm the precision and accuracy of this new technology developed at NASA’s Glenn Research Center. 
      The Cognitive Communications Project was founded by NASA in 2016 to develop autonomous space communications systems for the agency. Autonomous systems use technology that can react to its environment to implement updates during a mission, without needing any human interaction.  
      The project first collaborated with the Technology Education Satellite (TES) program at NASA’s Ames Research Center in California’s Silicon Valley back in 2022 to launch the TES-13 CubeSat, which sent the first neuromorphic processor to space. A neuromorphic processor is a piece of technology built to act in ways that replicate how the human brain functions. Through TES-13, the cognitive team was able to test their advanced technology in space successfully for the first time.  
Researchers at NASA’s Ames Research Center in California’s Silicon Valley assemble the Technology Education Satellite-11 CubeSat inside a laboratory. Credit: NASA
After the success of TES-13, the team compiled each of their unique capabilities into one end-to-end system, called Cognitive Engine 1, or CE-1. CE-1 is a space and ground software system that automates routine aspects of spacecraft communications, such as service scheduling and planning reliable priority-based data transfers.  
      Cognitive technology launched to space for the second time on July 3 on TES-11 aboard Firefly Aerospace’s Noise of Summer mission. TES-11 was one of eight small satellites launched during the mission. It was created as a part of the Technology Education Satellite program at NASA Ames, which organizes collaborative projects and missions that pair college and university students with NASA researchers to evaluate how new technologies work on small satellites, known as CubeSats.  
Various CubeSats deployed in space from the International Space Station. Credit: NASA
TES-11 is testing the components of CE-1 that allow satellites to independently schedule time with ground stations and download data without human interaction. Results from the TES-11 mission will be used by the Cognitive Communications team to finalize their CE-1 design, ensuring the technology is ready to be adopted by future NASA missions.  
      The Cognitive Communications Project is funded by the Space Communications and Navigation program at NASA Headquarters in Washington and managed out of NASA’s Glenn Research Center in Cleveland.  
    • By NASA
      5 Min Read NASA Optical Navigation Tech Could Streamline Planetary Exploration
Optical navigation technology could help astronauts and robots find their way using data from cameras and other sensors. Credit: NASA
As astronauts and rovers explore uncharted worlds, finding new ways of navigating these bodies is essential in the absence of traditional navigation systems like GPS. Optical navigation relying on data from cameras and other sensors can help spacecraft — and in some cases, astronauts themselves — find their way in areas that would be difficult to navigate with the naked eye. Three NASA researchers are pushing optical navigation tech further, making cutting-edge advancements in 3D environment modeling, navigation using photography, and deep learning image analysis.
In a dim, barren landscape like the surface of the Moon, it can be easy to get lost. With few discernible landmarks to navigate with the naked eye, astronauts and rovers must rely on other means to plot a course.
      As NASA pursues its Moon to Mars missions, encompassing exploration of the lunar surface and the first steps on the Red Planet, finding novel and efficient ways of navigating these new terrains will be essential. That’s where optical navigation comes in — a technology that helps map out new areas using sensor data.
      NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is a leading developer of optical navigation technology. For example, GIANT (the Goddard Image Analysis and Navigation Tool) helped guide the OSIRIS-REx mission to a safe sample collection at asteroid Bennu by generating 3D maps of the surface and calculating precise distances to targets.
      Now, three research teams at Goddard are pushing optical navigation technology even further.
      Virtual World Development
Chris Gnam, an intern at NASA Goddard, leads development of a modeling engine called Vira that already renders large, 3D environments about 100 times faster than GIANT. These digital environments can be used to evaluate potential landing areas, simulate solar radiation, and more.
      While consumer-grade graphics engines, like those used for video game development, quickly render large environments, most cannot provide the detail necessary for scientific analysis. For scientists planning a planetary landing, every detail is critical.
Vira can quickly and efficiently render an environment in great detail. Credit: NASA
“Vira combines the speed and efficiency of consumer graphics modelers with the scientific accuracy of GIANT,” Gnam said. “This tool will allow scientists to quickly model complex environments like planetary surfaces.”
      The Vira modeling engine is being used to assist with the development of LuNaMaps (Lunar Navigation Maps). This project seeks to improve the quality of maps of the lunar South Pole region which are a key exploration target of NASA’s Artemis missions.
      Vira also uses ray tracing to model how light will behave in a simulated environment. While ray tracing is often used in video game development, Vira utilizes it to model solar radiation pressure, which refers to changes in momentum to a spacecraft caused by sunlight.
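The physics behind solar radiation pressure is standard: photon momentum flux equals irradiance divided by the speed of light. As a back-of-envelope sketch (not Vira's implementation), the force on a flat, sun-facing surface can be estimated as:

```python
# Back-of-envelope solar radiation pressure estimate (standard physics,
# not Vira's ray-traced model, which accounts for geometry and shadowing).

C = 299_792_458.0          # speed of light, m/s
SOLAR_IRRADIANCE = 1361.0  # W/m^2 at 1 AU (approximate solar constant)

def srp_force(area_m2: float, reflectivity: float) -> float:
    """Force in newtons on a flat surface facing the Sun.
    reflectivity ranges from 0 (fully absorbing) to 1 (perfect mirror,
    which doubles the momentum transfer)."""
    return (SOLAR_IRRADIANCE / C) * area_m2 * (1.0 + reflectivity)

# A 10 m^2 panel with reflectivity 0.3 feels on the order of
# tens of micronewtons, tiny but cumulative over a mission.
print(f"{srp_force(10.0, 0.3):.2e} N")
```

Tiny as this force is, it perturbs trajectories measurably over weeks and months, which is why mission planners model it; a ray tracer like Vira refines the flat-plate estimate by accounting for spacecraft geometry and self-shadowing.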
Vira can accurately render indirect lighting, which is when an area is lit even though it does not directly face a light source. Credit: NASA
Find Your Way with a Photo
      Another team at Goddard is developing a tool to enable navigation based on images of the horizon. Andrew Liounis, an optical navigation product design lead, leads the team, working alongside NASA Interns Andrew Tennenbaum and Will Driessen, as well as Alvin Yew, the gas processing lead for NASA’s DAVINCI mission.
      An astronaut or rover using this algorithm could take one picture of the horizon, which the program would compare to a map of the explored area. The algorithm would then output the estimated location of where the photo was taken.
      Using one photo, the algorithm can output with accuracy around hundreds of feet. Current work is attempting to prove that using two or more pictures, the algorithm can pinpoint the location with accuracy around tens of feet.
“We take the data points from the image and compare them to the data points on a map of the area,” Liounis explained. “It’s almost like how GPS uses triangulation, but instead of having multiple observers to triangulate one object, you have multiple observations from a single observer, so we’re figuring out where the lines of sight intersect.”
      This type of technology could be useful for lunar exploration, where it is difficult to rely on GPS signals for location determination.
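A minimal two-dimensional sketch of the geometry Liounis describes, assuming the landmarks in the photo have already been matched to the map (an illustration of the idea, not NASA's algorithm): each matched landmark defines a line of sight, and the least-squares intersection of those lines estimates the observer's position.

```python
import numpy as np

# Each matched landmark gives a line of sight passing through the
# observer. We find the point minimizing the sum of squared distances
# to all the lines: sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i.

def intersect_lines(points, directions):
    """points[i]: a known landmark position (2D).
    directions[i]: direction of the line of sight through that landmark."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)  # projects onto the line's normal
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Toy example: observer at (3, 4); both lines of sight pass through it.
landmarks = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]
dirs = [np.array([3.0, 4.0]), np.array([-7.0, 4.0])]
print(intersect_lines(landmarks, dirs))  # ≈ [3. 4.]
```

With only one photo the lines of sight constrain the position loosely, which matches the hundreds-of-feet accuracy quoted above; additional photos add lines and tighten the least-squares solution.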
      A Visual Perception Algorithm to Detect Craters
To automate optical navigation and visual perception processes, Goddard intern Timothy Chase is developing a programming tool called the GAVIN (Goddard AI Verification and Integration) Tool Suite.
      This tool helps build deep learning models, a type of machine learning algorithm that is trained to process inputs like a human brain. In addition to developing the tool itself, Chase and his team are building a deep learning algorithm using GAVIN that will identify craters in poorly lit areas, such as the Moon.
      “As we’re developing GAVIN, we want to test it out,” Chase explained. “This model that will identify craters in low-light bodies will not only help us learn how to improve GAVIN, but it will also prove useful for missions like Artemis, which will see astronauts exploring the Moon’s south pole region — a dark area with large craters — for the first time.”
      As NASA continues to explore previously uncharted areas of our solar system, technologies like these could help make planetary exploration at least a little bit simpler. Whether by developing detailed 3D maps of new worlds, navigating with photos, or building deep learning algorithms, the work of these teams could bring the ease of Earth navigation to new worlds.
      By Matthew Kaufman
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated Aug 07, 2024
Editor: Rob Garner
Contact: Rob Garner, rob.garner@nasa.gov
Location: Goddard Space Flight Center
    • By NASA
      2 min read
Akeem Shannon showcasing Flipstik attached to a smartphone. The product’s design was improved by looking at NASA research to inform its gecko-inspired method of adhering to surfaces. Credit: Flipstik Inc.
When it comes to innovative technologies, inventors often find inspiration in the most unexpected places. A former salesman, Akeem Shannon, was inspired by his uncle, who worked as an engineer at NASA’s Marshall Space Flight Center in Huntsville, Alabama, to research the agency’s published technologies. He came across a sticky NASA invention that would help him launch his breakout product.  

In the early 2010s, a team of roboticists at NASA’s Jet Propulsion Laboratory in Southern California was exploring methods to enhance robots’ gripping capabilities. The team came across the van der Waals force, a weak intermolecular attraction that forms when points on two surfaces make contact at the molecular level. This is the same force that geckos use to climb along walls.  

Much like a gecko’s foot, this apparatus developed at the Jet Propulsion Laboratory uses tiny fibers to grip objects and hold them tight. This work later inspired and informed the development of Flipstik. Credit: NASA
The microscopic hairs on gecko toe pads are called setae, which give the technology the nickname “synthetic setae.” While Shannon couldn’t use this NASA technology to hang a TV on a wall, he saw a way to mount a much smaller screen: a cellphone. 

      A synthetic setae attachment on a cellphone case could stick to most surfaces, such as mirrors or the back of airplane seats. With a product design in hand, Shannon founded St. Louis-based Flipstik Inc. in 2018. Shannon wanted to make a reliable product that could be used multiple times in various situations. He said the published NASA research, which describes methods of molding and casting the tiny hairs to be more durable, was indispensable to making his product portable and reusable. 

      Flipstik has made an impact on the mobile device industry. In addition to people using it to mount their phones to watch videos, it has become popular among content creators to capture camera angles. Flipstik also allows deaf users to keep their hands free, enabling them to make video calls in sign language. From geckos to NASA research, Shannon’s innovation is a reminder that inspiration can come from anywhere. 

      Details
Last Updated Aug 06, 2024