
Understanding Risk, Artificial Intelligence, and Improving Software Quality


The software discipline has broad involvement across each of the NASA Mission Directorates. Some recent discipline focus and development areas are highlighted below, along with a look at the Software Technical Discipline Team’s (TDT) approach to evolving discipline best practices toward the future.

Understanding Automation Risk

Software creates automation, and reliance on that automation is increasing the amount of software in NASA programs. This year, the software team examined historical software incidents in aerospace to characterize how, why, and where software or automation is most likely to fail. The goal is to better engineer software to minimize the risk of errors, improve software processes, and better architect software for resilience to errors (or improved fault tolerance should errors occur).

Charts: categorization of historical aerospace software incidents by failure behavior and cause.

Some key findings, shown in the charts above, indicate that software more often does the wrong thing than simply crashing. Rebooting was found to be ineffective when software behaves erroneously. Unexpected behavior was mostly attributed to the code or logic itself, and about half of those instances were the result of missing software—software not present because of unanticipated situations or missing requirements. This suggests that even fully tested software is exposed to this significant class of error. Data misconfiguration was a sizeable factor that continues to grow with the advent of more modern data-driven systems. A final, subjective category assessed was “unknown unknowns”—things that could not reasonably have been anticipated. These accounted for 19% of the software incidents studied.

The software team is using and sharing these findings to improve best practices. More emphasis is being placed on the importance of complete requirements, off-nominal test campaigns, and “test as you fly” using real hardware in the loop. When designing systems for fault tolerance, more consideration should be given to detecting and correcting for erroneous behavior versus just checking for a crash. Less confidence should be placed on rebooting as an effective recovery strategy. Backup strategies for automations should be employed for critical applications—considering the historic prevalence of absent software and unknown unknowns. More information can be found in NASA/TP-20230012154, Software Error Incident Categorizations in Aerospace.
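To make this guidance concrete, here is a minimal sketch, in Python, of a monitor that judges whether an automation's output is plausible and switches to a simpler backup estimator, rather than only watching for a crash or relying on a reboot. The altitude estimator, plausibility limits, and fallback policy are illustrative assumptions, not part of the NASA study.

```python
# Illustrative sketch only: a health monitor that checks output validity,
# not just process liveness, and falls back to a simpler backup estimator.
# The quantity monitored, the limits, and the backup policy are hypothetical.

from dataclasses import dataclass


@dataclass
class Estimate:
    altitude_m: float
    timestamp_s: float


def is_plausible(current: Estimate, previous: Estimate) -> bool:
    """Reject values that are out of physical range or change too fast."""
    if not (0.0 <= current.altitude_m <= 100_000.0):  # range check (assumed limits)
        return False
    dt = current.timestamp_s - previous.timestamp_s
    if dt <= 0.0:  # stale or reordered data
        return False
    rate = abs(current.altitude_m - previous.altitude_m) / dt
    return rate < 500.0  # rate-of-change check (assumed limit, m/s)


def select_output(primary: Estimate, backup: Estimate, previous: Estimate) -> Estimate:
    """Prefer the primary automation, but switch to the backup when the primary
    output is implausible, rather than waiting for a crash or rebooting."""
    return primary if is_plausible(primary, previous) else backup
```

In this pattern the monitor votes on output validity every cycle, so erroneous behavior is caught even when the primary software never crashes.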

Employing AI and Machine Learning Techniques

The rise of artificial intelligence (AI) and machine learning (ML) techniques has allowed NASA to examine data in ways that were not previously possible. While NASA has employed autonomy since its inception, AI/ML techniques let teams expand the use of autonomy beyond previous bounds. The Agency has been working on AI ethics frameworks and examining standards, procedures, and practices, taking security implications into account. While AI/ML generally relies on nondeterministic statistical algorithms that currently limit its use in safety-critical flight applications, NASA uses it in more than 400 AI/ML projects aiding research and science. The Agency also uses AI/ML Communities of Practice to share knowledge across the centers. The TDT surveyed AI/ML work across the Agency and summarized it for trends and lessons.

Common uses of AI/ML include image recognition and identification. NASA Earth science missions use AI/ML to identify marine debris, measure cloud thickness, and identify wildfire smoke (examples are shown in the satellite images below), reducing the workload on personnel. AI/ML is also being applied to predict atmospheric physics. One example is hurricane track and intensity prediction. Another is predicting planetary boundary layer thickness and comparing it against measurements; those predictions are being fused with live data to improve performance over previous boundary layer models.

Examples of how NASA uses AI/ML. Satellite images of clouds with estimation of cloud thickness (left) and wildfire detection (right).
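As a rough illustration of the image-recognition use case above, the sketch below shows a tiny convolutional classifier flagging wildfire smoke in satellite image tiles. The architecture, tile size, and class labels are assumptions for illustration and do not describe any specific NASA model.

```python
# Illustrative sketch only: a small convolutional classifier that flags
# wildfire smoke in satellite image tiles. Architecture, tile size, and
# labels are hypothetical and not tied to any NASA system.

import torch
from torch import nn


class SmokeClassifier(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # classes: [no_smoke, smoke]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))


model = SmokeClassifier().eval()
tiles = torch.rand(8, 3, 64, 64)            # batch of 64x64 RGB image tiles
with torch.no_grad():
    probs = model(tiles).softmax(dim=1)     # per-tile class probabilities
flagged = (probs[:, 1] > 0.5).nonzero().flatten()  # tiles likely containing smoke
```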
NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov)

The Code Analysis Pipeline: Static Analysis Tool for IV&V and Software Quality Improvement

The Code Analysis Pipeline (CAP) is an open-source tool architecture that supports software development and assurance activities, improving overall software quality. The Independent Verification and Validation (IV&V) Program is using CAP to support software assurance on the Human Landing System, Gateway, Exploration Ground Systems, Orion, and Roman. CAP supports the configuration and automated execution of multiple static code analysis tools to identify potential code defects, generate code metrics that indicate potential areas of quality concern (e.g., cyclomatic complexity), and execute any other tool that analyzes or processes source code. The TDT is focused on integrating Modified Condition/Decision Coverage analysis support for coverage testing. Results from tools are consolidated into a central database and presented in context through a user interface that supports review, query, reporting, and analysis of results as the code matures.

The tool architecture is based on an industry standard DevOps approach for continuous building of source code and running of tools. CAP integrates with GitHub for source code control, uses Jenkins to support automation of analysis builds, and leverages Docker to create standard and custom build environments that support unique mission needs and use cases.
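As a simplified illustration of the consolidation step, the sketch below runs a configured set of analyzers over a source tree and stores their findings in a single SQLite database. The tool command, output parsing, and schema are hypothetical simplifications and do not reflect CAP's actual implementation.

```python
# Illustrative sketch only: run configured static-analysis tools over a source
# tree and consolidate their findings into one database for later query and
# reporting. The tool command, parsing, and schema are simplified assumptions.

import sqlite3
import subprocess

# Hypothetical tool configuration: name -> command line. Tools are assumed to
# print one finding per line on stdout.
TOOLS = {
    "example-linter": ["example-linter", "--quiet", "src/"],
}


def run_tool(name: str, command: list[str]) -> list[tuple[str, str]]:
    """Run one analyzer and return (tool, finding) pairs."""
    result = subprocess.run(command, capture_output=True, text=True)
    return [(name, line) for line in result.stdout.splitlines() if line.strip()]


def consolidate(db_path: str = "cap_results.db") -> None:
    """Execute every configured tool and store findings in a central database."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS findings (tool TEXT, finding TEXT)")
    for name, command in TOOLS.items():
        conn.executemany(
            "INSERT INTO findings (tool, finding) VALUES (?, ?)",
            run_tool(name, command),
        )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    consolidate()
```

In a setup like the one described above, a CI job would typically invoke a step like this inside a standard or mission-specific build container, with the user interface querying the consolidated database as the code matures.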

Improving Software Process & Sharing Best Practices

The TDT has captured best practice knowledge from across the centers in NPR 7150.2, NASA Software Engineering Requirements, and NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov). Two APPEL training classes have been developed and shared with several organizations to provide a foundation in the NPR and in software engineering management. The TDT established several subteams to help programs and projects as they tackle software architecture, project management, requirements, cybersecurity, testing and verification, and programmable logic controllers. Many of these subteams have developed guidance and best practices, which are documented in NASA-HDBK-2203 and on the NASA Engineering Network.

NPR 7150.2 and the handbook outline best practices over the full lifecycle for all NASA software, including requirements development, architecture, design, implementation, and verification. Also covered, and equally important, are the supporting activities and functions that improve quality, including software assurance, safety, configuration management, reuse, and software acquisition. Rationale and guidance for the requirements are addressed in the handbook, which is accessible both internally and externally and is regularly updated as new information, tools, and techniques are found and used.

The Software TDT deputies train software engineers, systems engineers, chief engineers, and project managers on the NPR requirements and their role in ensuring these requirements are implemented across NASA centers. Additionally, the TDT deputies train software technical leads on many of the advanced management aspects of a software engineering effort, including planning, cost estimating, negotiating, and handling change management.

