Forest degradation primary driver of carbon loss in the Brazilian Amazon
Similar Topics
By NASA
NASA’s BlueFlux Campaign Supports Blue Carbon Management in South Florida
Photo 1. A mangrove stand lines the bank of Shark River, an Everglades distributary that carries water into the Gulf of Mexico’s Ponce de Leon Bay. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Introduction
Along the southernmost rim of the Florida Peninsula, the arching prop roots or “knees” of red mangroves (Rhizophora mangle) line the coast – see Photo 1. Where they dip below the water’s surface, fish lay their eggs, enjoying the protection from predators that the trees provide. Among their branches, wading birds, such as the great blue heron and the roseate spoonbill, establish rookeries to rear their young. The tangled matrix of roots collects organic matter and ocean-bound sediments, adding little-by-little to the coastline and shielding inland biology from the erosive force of the sea. In these ways, mangroves are equal parts products and engineers of their environment, but their ecological value extends far beyond this local sphere of influence.
Mangroves are an important carbon dioxide (CO2) sink – responsible for removing CO2 from the atmosphere with impressive efficiency. Current estimates suggest mangroves sequester CO2 10 times faster and store up to 5 times more carbon than rainforests and old-growth forests. But as part of the ever-changing line between land and sea, they’re exceptionally vulnerable to climate disturbances such as sea level rise, hurricanes, and changes in ocean salinity. As these threats intensify, Florida’s sub-tropical wetlands – and their role as a critical sink of CO2 – face an uncertain future.
NASA’s BlueFlux Campaign, a three-year (2021–2024), $1.5-million project operating under the agency’s Carbon Monitoring System, used field, aircraft, and satellite data to study the impact of both natural and anthropogenic pressures on South Florida’s coastal ecology. BlueFlux consists of a series of ground-based and airborne fieldwork campaigns, providing a framework for the development of a satellite-based data product that will estimate daily rates of surface-atmosphere gas transfer or gaseous flux across coastal ecosystems in Florida and the Caribbean. “The goal is to enhance our understanding of how blue-carbon ecosystems fit into the global carbon market,” said Ben Poulter [NASA’s Goddard Space Flight Center (GSFC)—Project Lead]. “BlueFlux will ultimately answer scientific questions and provide policy-related solutions on the role that coastal wetlands play in reducing atmospheric greenhouse gas (GHG) concentrations.”
This article provides an overview of BlueFlux fieldwork operations – see Figure 1 – and outlines how the project might help refine global GHG budgets and support the restoration of Florida’s wetland ecology.
Figure 1. A map of South Florida overlaying a true-color image captured by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on board NASA’s Terra satellite. Red triangles mark locations of primary ground-based fieldwork operations described in this article. Figure credit: NASA’s Goddard Space Flight Center (GSFC)
BlueFlux Ground-based Fieldwork
Across the street from the Flamingo Visitor Center, at the base of Everglades National Park, there was once a thriving mangrove population. Now, the skeletal remains of the trees form one of the Everglades’ largest ghost forests – see Photo 2. When Hurricane Irma made landfall in September 2017, violent winds battered the shore and a storm surge swept across the coast, decimating large swaths of the mangrove forest. Most of Florida’s mangroves recovered swiftly. But seven years later, this site and others like it have seen little to no growth.
“At this point, I doubt they’ll ever recover,” said David Lagomasino [East Carolina University].
Photo 2. A mangrove ghost forest is all that remains of a once-thriving mangrove stand, preserving an image of Hurricane Irma’s lasting impact on South Florida’s wetland ecology. Most of the ghost forests in the region are a product of natural depressions in the landscape that collect saltwater following severe storms. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Lagomasino was in the Everglades this summer conducting research as part of the fifth leg of BlueFlux fieldwork – see Photo 3. His team focused on measuring how changes in wetland ecology affect the sequestration and emission rates of both CO2 and methane (CH4). In areas where vegetative health is severely degraded, as in ghost forests, a general decline in CO2 uptake is accompanied by an increase in CH4 production, the net effect of which could dramatically amplify the atmosphere’s ability to trap heat. Ghost forests are an extreme example; the influence of subtler gradients in wetland variables – such as changes in water level, tree height, canopy coverage, ocean salinity, or mangrove species distribution – is harder to tease out of the limited data available.
Photo 3. Assistant professor David Lagomasino and Ph.D. candidate Daystar Babanawo [both from East Carolina University] explore the lower Everglades by boat. Due to the relative inaccessibility of the region, measurements of flux in wetland ecosystems are limited. The plant life here consists almost entirely of Florida’s three mangrove species (red, black, and white), which are among the only vegetation that can withstand the brackish waters characteristic of coastal wetlands. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
In the Everglades, flux measurements are confined to a handful of eddy covariance towers – or flux towers – constructed as part of the National Science Foundation’s (NSF) Long-Term Ecological Research (LTER) Network.
The first flux tower in this network, erected in June 2003, stands near the edge of Shark River at a research site called SRS-6, short for Shark River Slough site 6. A short walk from the riverbank, across a snaking path of rain-weathered, wooden planks, sits a small platform where the flux tower is anchored to the forest floor – see Photo 4. About 20 m (65 feet) above the platform, the tower breaches the canopy, where a suite of instruments continuously measures wind velocity, temperature, humidity, and the vertical movement of trace atmospheric gases, such as water vapor (H2Ov), CO2, and CH4. It’s these measurements collectively that are used to calculate flux.
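To make the idea concrete: eddy covariance estimates flux as the covariance between fluctuations in vertical wind and fluctuations in gas concentration, sampled many times per second. Below is a minimal, illustrative Python sketch of that calculation with synthetic data; real processing pipelines add coordinate rotation, detrending, density corrections, and quality control.

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Turbulent vertical flux from high-frequency samples.

    w : vertical wind velocity (m/s)
    c : gas concentration (e.g., mmol CO2 per m^3)
    Returns flux in (units of c) * m/s.
    """
    w_prime = w - w.mean()          # wind fluctuations about the mean
    c_prime = c - c.mean()          # concentration fluctuations
    return float((w_prime * c_prime).mean())  # covariance = turbulent flux

# Synthetic example: 30 minutes of 10 Hz data
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18_000)                     # m/s
c = 16.0 + 0.05 * w + rng.normal(0.0, 0.02, 18_000)  # concentration rises with updrafts
print(f"flux ≈ {eddy_covariance_flux(w, c):.4f} mmol m^-2 s^-1")
```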
Photo 4. At SRS-6, an eddy covariance tower measures CO2 and CH4 flux among a dense grove of red, black, and white mangroves. The term eddy covariance refers to the statistical technique used to calculate gaseous flux based on the meteorological and scalar atmospheric data collected by the flux towers. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
“Hundreds of research papers have come from this site,” said Lagomasino. The abundance of research generated from the data captured at SRS-6 speaks in part to the value of the measurements that the tower makes. It also points to the gaps that exist just beyond each tower’s reach. A significant goal of the BlueFlux campaign is to explain flux on a scale that isn’t covered by existing data – to fill in the gaps between the towers.
One way to do that is by gathering data by hand.
On Lagomasino’s boat is a broad, black case carrying a tool called a Russian peat auger. The instrument is designed to extract core samples from soft soils – see Photo 5.
Everglades peat, which is made almost entirely of the partially decomposed roots, stems, and leaves of the surrounding mangroves, offers a perfect study subject. Each thin, half-cylinder sample gets sealed and shipped back to the lab, where it will be sliced into flat discs. The discs will be analyzed for their age and carbon content by Lagomasino’s team and partners at Yale University. These cores are like biomass time capsules. In Florida’s mangrove forests, a 1-m (3-ft) core might represent more than 300 years of carbon accumulation. On average, a 1 to 3 mm (0.04 to 0.12 in) layer of matter is added to the forest floor each year, building up over time like sand filling an hourglass.
Photo 5. David Lagomasino holds a Russian peat auger containing a sample of Everglades peat. The primary source of the soil’s elevated carbon content – evident from its coarse, fibrous texture – is the partially decayed plant tissue of the surrounding mangroves. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Although coastal wetlands account for less than 2% of the planet’s land-surface area, they house a disproportionate amount of blue carbon – carbon stored in marine and coastal environments. In the Everglades, the source of this immense accumulation of organic material is the quick-growing vegetation – see Photo 6.
When a CO2 molecule finds its way through one of the many small, porous openings on a mangrove leaf – called stomata – its next step is one of creation, where it plays a part in the miraculous transformation of inorganic matter into living tissue. Inside the leaf’s chloroplasts, energy from sunlight kickstarts a long chain of chemical reactions that will ultimately divide CO2 into its constituent parts. Oxygen atoms are returned to the atmosphere as the byproduct of photosynthesis, but the carbon stays behind to help build the sugar molecules that will fuel new plant growth. In short, the same carbon that once flowed through the atmosphere defines the molecular structure of all wetland vegetation. When a plant dies or a gust of wind pulls a leaf to the forest floor, this carbon-based matter finds its way into the soil, where it can stay locked in place for thousands of years thanks to a critical wetland ingredient: water.
The inundated, anoxic – deficient in or devoid of oxygen – peat soils characteristic of wetlands host microbial populations that are uniquely adapted to their environment. In these low- to no-oxygen conditions, the prevailing microbiota consumes organic material slowly, leading to an accumulation of carbon in the soil. As wetland conditions change, the soil’s microbial balance shifts. For example, a decline in water level, which can increase the oxygen content of the soil, produces conditions favorable to aerobic bacteria. These oxygen-breathing lifeforms consume organic matter far more rapidly than their anaerobic counterparts – and release more CO2 into the atmosphere as a result.
Water level isn’t the only environmental condition that influences rates of carbon sequestration. The soil cores collected during the campaign will be analyzed alongside records of interrelated variables such as water salinity, sea surface height, and temperature to understand not just the timescales associated with blue carbon development in mangrove forests but how and why rates of soil deposition change in response to specific environmental pressures. In many parts of the Everglades, accumulated peat can reach depths of up to 3 m (9.8 feet) – holding thousands of years’ worth of insights that would otherwise be lost to time.
Photo 6. Mangroves are viviparous plants. Their propagules – or seedlings – germinate while still attached to their parent tree. Propagules that fall to the forest floor are primed to begin life as soon as they hit the ground. But even those that fall into bodies of water and are carried out to sea can float for months before finding a suitable place to lay their roots. The high growth rate of mangroves contributes to the efficiency with which mangrove forests remove CO2 from the atmosphere. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Lola Fatoyinbo [NASA’s Goddard Space Flight Center (GSFC), Biospheric Sciences Lab] and Peter Raymond [Yale University’s School of the Environment] led additional fieldwork teams tasked with collecting forest inventory data in locations where vegetation was dead, regenerating, or recently disturbed by severe weather events. A terrestrial laser system was used to obtain three-dimensional (3D) images of mangrove forest structure, which provided maps of stem density, vertical distributions of biomass, and stand volume surface area. Spectroradiometers were also used to acquire visible, near infrared, and shortwave infrared spectra, delivering detailed information about species composition, vegetative health, water levels, and soil properties.
To tie these variables to flux, the researchers made measurements using chambers – see Figure 2 – designed to adhere neatly to points where significant rates of gas exchange occur (i.e., mangrove lenticels – cell-sized breathing pores found on tree bark and root systems – and the forest floor). As an example, black mangroves (Avicennia germinans) possess unique aerial roots called pneumatophores that, similar to the prop roots of red mangroves, provide them with access to atmospheric oxygen. Pneumatophores sprout vertically from the forest floor and line up like matchsticks around the base of each tree. The team used cylindrical chambers to measure the transfer of gas between a single pneumatophore and the atmosphere – see Figure 2a.
These observations are archived in NASA’s Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC) and publicly available to researchers who wish to monitor and identify trends in the data. After nearly three years of field work, these data have already given scientists a more detailed picture of how Florida’s wetlands are responding to environmental pressures.
Research based on data from early BlueFlux fieldwork deployments confirms that anaerobic, methanogenic microbes living in flooded, wetland soils naturally release a significant amount of CH4 as a byproduct of the process by which they create their own energy.
“We’re especially interested in this methane part,” said Fatoyinbo. “It’s the least understood, and there’s a lot more of it than we previously thought.” Fatoyinbo also noted a “significant difference in CO2 and CH4 fluxes between healthy mangroves and degraded ones.” In areas where mangrove health is in decline, due to reduced freshwater levels or as the result of damage sustained during severe weather events, “you can end up with more ‘bad’ gases in the atmosphere,” she said. Since CH4 is roughly 80 times more potent than CO2 over a 20-year period, these emissions can undermine some of the net benefits that blue carbon ecosystems provide as a sink of atmospheric carbon.
Figure 2. To directly measure the emission and sequestration rates of CO2 and CH4 in mangrove forests, chambers were designed to adhere to specific targets where gas exchange occurs (i.e., mangrove lenticels, root systems, and the forest floor). Credit: GSFC
Airborne Research Teams Measure GHG Flux from Above
Florida’s mangrove forests blanket roughly 1,554 km2 (600 mi2) of coastal terrain. Even with over 20 years of tower data and the extensive measurements from ground-based fieldwork operations, making comprehensive inferences about the entire ecosystem is tenuous work. To provide flux data at scale – and help quantify the atmospheric influence that Florida’s coastal wetlands carry as a whole – NASA’s BlueFlux campaign relies on a relatively new, airborne technique for measuring flux – see Photo 7.
Photo 7. At the Miami Executive Airfield, members of NASA’s BlueFlux airborne science team stand in front of the Beechcraft 200 King Air before the final flight of the fieldwork campaign. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Between 2022 and 2024, over five deployments, the team conducted more than 34 carefully planned flights – see Figure 3 – collecting flux data over Florida’s wetlands by plane. Each flight carried a payload known colloquially as “CARAFE,” short for the CARbon Airborne Flux Experiment, which is the airborne campaign’s primary means of data collection. “This is one of the first times an instrument like this has flown over a mangrove forest anywhere in the world,” said Fatoyinbo. “So, it’s really just kind of groundbreaking.”
Figure 3. An example of flight paths from eight BlueFlux airborne deployments flown in April 2023. The flight paths are highlighted in blue. The legs of each flight where flux measurements were taken are highlighted in green. Accurate flux calculations rely on stable measurements of the aircraft’s speed and orientation, which is why the flux legs of each flight are flown in straight lines. Credit: GSFC
In the air, GHG concentrations are measured using a well-established technique called cavity ringdown spectroscopy, which involves firing a laser into a small cavity where it will ping back and forth between two highly reflective mirrors. Most gas-phase molecules absorb light at specific wavelengths, depending on their atomic makeup. Since the target molecules in this case are CO2 and CH4, the laser is configured to emit light at a wavelength that only these molecules will absorb. As the laser bounces between the mirrors, a fraction of the light is absorbed by any molecules present in the chamber. The rate of the light’s decay is used to estimate CO2 and CH4 concentrations, generating a time series with continuous readings of gas concentrations, measured in parts per million – see Photo 8. This information is combined with measurements of vertical wind velocity to calculate a corresponding time series of fluxes along the flight track. While these measurements are important on their own, a priority for the airborne team is understanding GHG fluxes in relation to what’s happening on the ground.
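The ring-down math itself is compact: with an absorbing gas present, the cavity’s decay time τ shortens relative to the empty-cavity decay time τ₀, and the absorber’s number density follows from the difference of the two decay rates. A minimal sketch of that standard relation is below; the decay times and cross-section are hypothetical stand-ins, not CARAFE calibration values.

```python
C_LIGHT_CM_S = 2.998e10  # speed of light (cm/s)

def number_density(tau, tau_0, sigma):
    """Absorber number density (molecules/cm^3) via cavity ring-down.

    tau   : ring-down time with absorbing gas present (s)
    tau_0 : ring-down time of the empty cavity (s)
    sigma : absorption cross-section at the laser wavelength (cm^2)
    """
    return (1.0 / tau - 1.0 / tau_0) / (C_LIGHT_CM_S * sigma)

tau_0 = 40e-6   # hypothetical empty-cavity decay: 40 microseconds
tau   = 38e-6   # faster decay once CO2/CH4 absorbs some of the light
sigma = 3e-21   # hypothetical cross-section (cm^2)

n = number_density(tau, tau_0, sigma)
print(f"{n:.2e} molecules/cm^3")  # divide by total air density to get a mixing ratio (ppm)
```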
Photo 8. The CARAFE payload is responsible for taking readings of atmospheric CO2, CH4, and H2Ov levels using a wind probe and two optical spectroscopy instruments manufactured by Picarro: the G2401m Gas Concentration Analyzer and the G2311f Gas Concentration Analyzer. The readings pictured above were made by the G2311f, which measures gas concentrations at a faster rate than the G2401m. The G2401m makes slower but more stable measurements, which are necessary for verifying the accuracy of measurements made by the G2311f. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Unlike flux towers, which only collect data within a “footprint” of roughly 100 m (328 ft), airborne readings have a footprint that can stretch up to 1 km (0.6 mi) in upwind directions. The plane’s speed, position, and orientation are used to help link flux data to fixed points along the flight’s path – so the team can make comparisons between aerial measurements and those made by the ground-based towers – see Photo 9.
“One challenge with that is the flux towers are much lower to the ground, and their footprint is much smaller,” said Glenn Wolfe [GSFC—BlueFlux Flight Lead]. “So, we have to be really careful with our airborne observations, to make sure they closely resemble our ground-based measurements.”
Part of decoding the airborne data involves overlaying each footprint with detailed maps of different surface properties, such as vegetation cover, soil water depth, or leaf-area index, so the team can constrain the measurements and assign fluxes to specific sources – whether it’s mangroves, sawgrass, or even water.
Photo 9. The BlueFlux airborne science team collects flux measurements from 90 m (300 ft) above Florida’s mangrove forests. Photo credit: Nathan Marder/NASA’s Goddard Space Flight Center (GSFC)
Data Upscaling – Making Daily Flux Predictions from Space
The coupling of BlueFlux’s ground-based and airborne data provides the framework for the production of a broader, regional image of GHG flux.
“The eddy flux towers give us information about the temporal variability,” said Cheryl Doughty [GSFC]. “And the airborne campaign gives us this great intermediate dataset that allows us to go from individual trees to a much larger area.”
Doughty is now using BlueFlux data to train the model behind a remote-sensing data product, the prototype of which is called Daily Flux Predictions for South Florida. The model relies on machine learning – specifically, an ensemble technique called random forest regression – and will make flux predictions based on surface reflectance data captured by the Moderate Resolution Imaging Spectroradiometer (MODIS), an instrument that flies on NASA’s polar-orbiting Aqua and Terra satellites – see Figure 4.
“We’re really at the mercy of the data that’s out there,” said Doughty. “One of the things we’re trying to produce as part of this project is a daily archive of fluxes, so MODIS is an amazing resource, because it has over 20 years of data at a daily temporal resolution.”
This archival flux data will help researchers explain how fluxes change in relation to processes that are directly described by MODIS surface reflectance data, including sea-level rise, land use, water management, and disturbances from hurricanes and fires.
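Conceptually, the upscaling model is a regression from a pixel’s reflectance bands to a flux value. The sketch below shows the general shape of such a random forest model using scikit-learn; the synthetic bands, units, and hyperparameters are placeholders, not the actual BlueFlux configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for seven MODIS NBAR land bands (rows = pixel-days)
n_samples = 2_000
X = rng.uniform(0.0, 0.5, size=(n_samples, 7))  # red, green, blue, NIR1, NIR2, SWIR1, SWIR2
y = 0.8 * X[:, 3] - 0.5 * X[:, 5] + rng.normal(0.0, 0.02, n_samples)  # fake flux signal

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Train on pixel-days that have matching field or airborne flux measurements
model = RandomForestRegressor(n_estimators=300, min_samples_leaf=5, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```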
Figure 4. Sample of methane flux upscaling, in which MODIS surface reflectance retrievals are used to predict CH4 flux for South Florida at a regional scale [bottom row, left]. The model inputs rely on a composite of MODIS Nadir Bidirectional Reflectance Distribution Function (BRDF)-Adjusted Reflectance (NBAR) measurements from all available MODIS land bands [top row, left to right]: red (620–670 nm), green (545–565 nm), blue (459–479 nm); [middle row, left to right] near infrared 1, or NIR1 (841–876 nm), NIR2 (1230–1250 nm), shortwave IR 1, or SWIR1 (1628–1652 nm), and SWIR2 (2105–2155 nm). The Everglades National Park boundary is indicated on each image with a white line. Output of the model is shown [bottom row, left] as well as a comparison between modeled fluxes of MODIS NBAR with Terra and Aqua [bottom row, right]. Credit: GSFC
To help validate the model, researchers must reformat flux measurements from the airborne campaign to match the daily temporal resolution and 500-m (0.3-mi) spatial resolution of MODIS reflectance retrievals.
“It’s best practice to meet the data at the coarsest resolution,” said Doughty. “So, we have to take an average of the hourly estimates to match MODIS’ daily scale.”
The matching process is slightly more complicated for spatial datasets. BlueFlux’s airborne flux measurements produce roughly 20 data points within each 500-m (0.3-mi) grid cell – the footprint of a single MODIS pixel.
“We’re essentially taking an average of all those CARAFE points to get an estimate that corresponds to one pixel,” said Doughty.
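A hedged sketch of that two-step matching, using pandas with made-up column names: airborne samples are binned by MODIS pixel and day, then averaged to one value per pixel-day.

```python
import pandas as pd

# Hypothetical CARAFE flux samples (column names are illustrative)
df = pd.DataFrame({
    "time": pd.to_datetime(["2023-04-12 14:05", "2023-04-12 14:06",
                            "2023-04-12 15:10", "2023-04-13 14:07"]),
    "pixel_id": [101, 101, 101, 101],   # MODIS pixel each sample falls in
    "ch4_flux": [4.1, 3.9, 4.4, 3.6],   # e.g., nmol CH4 m^-2 s^-1
})

# One averaged value per (pixel, day), matching MODIS's daily resolution
daily = (df.assign(date=df["time"].dt.date)
           .groupby(["pixel_id", "date"], as_index=False)["ch4_flux"]
           .mean())
print(daily)
```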
This symmetry is critical, allowing the team to test, train, and tune the model using measurements that capture what’s really happening on the ground – ensuring the accuracy of flux measurements generated from satellite data alone.
Researchers don’t expect the model to serve as a perfect reconstruction of reality. The heterogeneous nature of Florida’s wetland terrain – which consists of a patchwork of sawgrass marshland, mangrove forests, hardwood hammocks, and freshwater swamps – contributes to a high degree of variability in CO2 removal rates within and across its distinct regions. The daily flux product accounts for some of this complexity by making hundreds of calculations at a time, each with slightly different parameters based on in-situ measurements.
“The goal isn’t to just give people one flux measurement but an estimate of the uncertainty that is so inherent to these wetlands,” explained Doughty.
The prototype of the product will be operational by early 2025 and accessible to the public through NASA’s ORNL DAAC. Doughty hopes it will help stakeholders and decision makers evaluate policies related to water management, land use, and conservation that might impact critical stocks of blue carbon.
From Drainage to Restoration in the Florida Everglades
In the late 19th century, land developers were drawn to South Florida, where they hoped the fertile soil and tropical climate could support year-round cultivation of commodities such as exotic fruits, vegetables, and sugar cane. There was just one thing standing in the way – the water. If they could find a way to tame Florida’s wilderness, to drain the wetland of its excess water, Florida would offer Americans a new agricultural frontier.
Progress was made incrementally, but the Everglades drainage project idled for more than 50 years as its organizers wrestled with the literal and political morass surrounding South Florida’s wetland topography. It was Mother Nature’s hand that ultimately accelerated the drainage project. In 1926 and 1928, two large hurricanes tore through the barrier along Lake Okeechobee’s southern shore that had been built to prevent water from spilling onto the newly settled, small-scale farmland just south of the lake. The second of the two storms – 1928’s Okeechobee Hurricane – made landfall in early September and resulted in nearly 3,000 recorded fatalities. In some areas, the torrent of flood water was deep enough that even those who sought refuge on the roofs of their homes were swept away by the current. The federal government was forced to step in.
By 1938, the U.S. Army Corps of Engineers had completed construction of the Hoover Dike, adding to a collection of four canals responsible for siphoning water away from Lake Okeechobee’s floodplain and into the Atlantic Ocean. Seasonal flooding was brought under control, but the complete reclamation of South Florida’s wetlands proved more challenging than anticipated. As water levels fell and freshly cleared lands dried out, the high organic content of the soil fueled tremendous peat and muck fires that could burn for days, spreading through underground seams where water once flowed. In some areas, fires consumed the entire topsoil layer – exposing the limestone substrata to the atmosphere for the first time in thousands of years. The engineers in charge of Florida’s early wetland reclamation projects underestimated the value of the state’s hydrological system and overestimated its capacity to withstand human interference.
“Those initial four canals were enough to drain the Everglades three times over,” said Fred Sklar [South Florida Water Management District—Everglades System Sciences Director]. “And they still exist, but now there are more than seven million people who rely on them for drinking water and flood control.”
Today, much of the Water Management District’s work involves unwinding the damage wrought by earlier drainage efforts.
“One thing we’re trying to do is make sure these peat fires never happen again,” said Sklar.
But fully restoring natural water flow to the Everglades – which is critical to the region’s ecological health – isn’t an option. Even if drainage could be reversed, doing so would subject Florida’s residents to the same flood risks that made drainage a priority. Some residents, including members of the Miccosukee and Seminole tribes, live directly alongside or within Everglades wilderness areas, where the risk of flooding is even greater than it is in the state’s highly populated coastal communities. These areas are also out of reach of the Water Management District’s existing infrastructure. It’s not as simple as turning the tap on and off.
Photo 10. The Tamiami Trail Canal runs across the Florida Peninsula from west to east, toward a saltwater treatment facility near the Miami River. Construction was completed in 1928, shortly after the first four drainage canals opened. It quickly became apparent that the canal and its adjacent roadway dramatically impede water flow to the Everglades wilderness areas to their south, cutting off the region’s vegetation and wildlife from a critical source of freshwater. New modifications to the canal, which aim to introduce a hydrological regime that more closely resembles the pre-drainage system, are currently underway. Photo credit: U.S. National Park Service
Florida’s Water Management District works with federal agencies, including the U.S. Army Corps of Engineers, to monitor and govern the flow of Florida’s freshwater. The District has overseen the construction and management of dozens of canals, dikes, levees, dredges, and pumps over the last half-century that offer a higher degree of control over Florida’s complex hydrological network – see Photo 10.
“The goal is to restore as much acreage as we can, but we also need to restore it functionally, without degrading the whole system or putting residents at risk,” summarized Sklar. “To do this effectively, we need a detailed understanding of how the hydrology functions and how it influences all of these other systems, such as carbon sequestration.”
Since the 1920s, more than half of Florida’s original wetland coverage has been lost. The present system also carries 65% less peat coverage and 77% less stored carbon than it did prior to drainage. As atmospheric CO2 concentrations climb at unprecedented rates, an accompanying rise in sea levels, severe weather, and ocean salinity all present serious threats to Florida’s wetland ecology – see Figure 5.
“We’re worried about losing that stored carbon,” said Poulter. “But blue carbon also offers tremendous opportunities for climate mitigation if conservation and restoration are properly supported by science.”
Figure 5. A map of the BlueFlux study region, showing mangrove extent (green) and the paths of tropical storms and hurricanes from 2011 to 2021 (red). These storms drive losses in mangrove forest coverage – the result of erosion and wind damage. The inset regions at the top of the image highlight proposed targets for the airborne component of NASA’s BlueFlux Campaign. Figure credit: GSFC
Conclusion – The Future of Flux
Every few years, the Intergovernmental Panel on Climate Change (IPCC) releases emissions data and budget reports that have important policy implications related to the Paris Agreement’s goal of limiting global warming to between 1.5°C (2.7°F) and 2°C (3.6°F) compared to pre-industrial levels. Refining the accuracy of global carbon budgets is paramount to reaching that goal, and wetland ecosystems – which have been historically under-represented in climate research – are an important part of the equation.
Early estimates based on BlueFlux fieldwork deployments and upscaled using MODIS surface reflectance data suggest that wetland CH4 emissions in South Florida offset CO2 removal in the region by about 5% based on a 100-year CH4 warming potential, resulting in a net removal of 31.8 Tg (31.8 million metric tons) of CO2 per year. This is a small fraction of total CO2 emissions in the U.S. and an even smaller fraction of global emissions – in 2023, an estimated 34,800 Tg (34.8 billion metric tons) of CO2 were released into the atmosphere. But relative to their size, the CO2 removal services provided by tropical wetlands are hardly dismissible.
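For intuition, the bookkeeping behind that 5% figure looks like the back-of-envelope below. Only the net removal and offset fraction come from the article; the GWP value, implied gross removal, and implied CH4 mass are illustrative, not reported results.

```python
GWP100_CH4 = 28.0           # approximate 100-year warming potential of CH4 (assumed)
net_removal_tg = 31.8       # net CO2 removal reported for South Florida wetlands
offset_fraction = 0.05      # CH4 offset stated above

gross_removal_tg = net_removal_tg / (1.0 - offset_fraction)  # ~33.5 Tg CO2 (implied)
ch4_offset_tg_co2e = gross_removal_tg * offset_fraction      # ~1.7 Tg CO2-equivalent
implied_ch4_tg = ch4_offset_tg_co2e / GWP100_CH4             # ~0.06 Tg CH4 (implied)

print(round(gross_removal_tg, 1), round(ch4_offset_tg_co2e, 2), round(implied_ch4_tg, 2))
```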
“We’re finding that massive amounts of CO2 are removed and substantial amounts of CH4 are produced, but overall, these ecosystems provide a net climate benefit by removing more greenhouse gases than they produce,” Poulter said.
Access to a daily satellite data product also provides researchers with the means to make more regular adjustments to budgets based on how Florida’s mutable landscape is responding to climate disturbances and restoration efforts in real time.
With the right resources in hand, the scientists who dedicate their careers to understanding and restoring South Florida’s ecology share a hopeful outlook.
“Nature and people can absolutely coexist,” said Meenakshi Chabba [The Everglades Foundation—Ecologist and Resilience Scientist]. “But what we need is good science and good management to reach that goal.”
The Everglades Foundation provides scientific evaluation and guidance to the elected officials and governmental institutions responsible for the implementation of the Comprehensive Everglades Restoration Plan (CERP), a federal program approved by Congress in 2000 that outlines a 30-year plan to restore Florida’s wetland ecology. The Foundation sees NASA’s BlueFlux campaign as an important accompaniment to that goal.
“The [Daily Flux Predictions for South Florida] data product is incredibly valuable, because it provides us with an indicator of the health of the whole system,” said Steve Davis [The Everglades Foundation—Chief Science Officer]. “We know how valuable the wetlands are, but we need this reliable science from NASA and the BlueFlux Campaign to help translate those benefits into something we can use to reach people as well as policymakers.”
Researchers hope the product can inform decisions about the management of Florida’s wetlands, the preservation of which is not only a necessity but – to many – a responsibility.
“These impacts are of our own doing,” added Chabba. “So, now it’s incumbent upon us to make these changes and correct the mistakes of the past.”
Next, the BlueFlux team is shifting its focus to what it calls BlueFlux 2. This stage of the project centers on further analysis of the data collected during fieldwork campaigns and includes the deployment of the beta version of Daily Flux Predictions for South Florida, which will help generate more accurate evaluations of flux for the many wetland ecosystems that exist beyond Florida’s borders.
“We’re trying to contribute to a better understanding of global carbon markets and inspire further and more ambitious investments in these critical stocks of blue carbon,” said Poulter. “First, we want to scale this work to the Caribbean, where we have these great maps of mangrove distribution but limited data on flux.”
An additional BlueFlux fieldwork deployment is slated for 2026, with plans to make flux measurements above sites targeted by the state for upcoming restoration initiatives, such as the Everglades Agricultural Area Environmental Protection District. In the Agricultural Area, construction is underway on a series of reservoirs that will store excess water during wet seasons and provide a reserve source of water for wildlife and residents during dry seasons. As the landscape evolves, BlueFlux will help local officials evaluate how Florida’s wetlands are responding to efforts designed to protect the state’s most precious natural resource – and all those who depend on it.
Nathan Marder
NASA’s Goddard Space Flight Center/Global Science and Technology Inc.
nathan.marder@nasa.gov
By European Space Agency
New research, partially funded by ESA, reveals that the cool ‘ocean skin’ allows oceans to absorb more atmospheric carbon dioxide than previously thought. These findings could enhance global carbon assessments, shaping more effective emission-reduction policies.
By NASA
A NASA-developed material made of carbon nanotubes will enable our search for exoplanets—some of which might be capable of supporting life. Originally developed in 2007 by a team of researchers led by Innovators of the Year John Hagopian and Stephanie Getty at NASA’s Goddard Space Flight Center, this carbon nanotube technology is being refined for potential use on NASA’s upcoming Habitable Worlds Observatory (HWO)—the first telescope designed specifically to search for signs of life on planets orbiting other stars.
As shown in the figure below, carbon nanotubes look like graphene (a single layer of carbon atoms arranged in a hexagonal lattice) that is rolled into a tube. The super-dark material consists of multiwalled carbon nanotubes (i.e., nested nanotubes) that grow vertically into a “forest.” The carbon nanotubes are 99% empty space, so the light entering the material doesn’t get reflected. Instead, the light enters the nanotube forest and jiggles electrons in the hexagonal lattice of carbon atoms, converting the light to heat. The ability of the carbon nanotubes to eliminate almost all light is enabling for NASA’s scientific instruments because stray light limits how sensitive the observations can be. When applied to instrument structures, this material can eliminate much of the stray light and enable new and better observations.
Left: Artist’s conception of graphene, single and multiwalled carbon nanotube structures. Right: Scanning electron microscope image of vertically aligned multiwalled carbon nanotube forest with a section removed in the center. Credit: Delft University/Dr. Sten Vollebregt and NASA GSFC
Viewing exoplanets is incredibly difficult; the exoplanets revolve around stars that are 10 billion times brighter than they are. It’s like looking at the Sun and trying to see a dim star next to it in the daytime. Specialized instruments called coronagraphs must be used to block the light from the star to enable these exoplanets to be viewed. The carbon nanotube material is employed in the coronagraph to block as much stray light as possible from entering the instrument’s detector.
The image below depicts a notional telescope and coronagraph imaging an exoplanet. The telescope collects the light from the distant star and exoplanet. The light is then directed to a coronagraph that collimates the beam, making the light rays parallel, and then the beam is reflected off the apodizer mirror, which is used to precisely control the diffraction of light. Carbon nanotubes on the apodizer mirror absorb the stray light that is diffracted off edges of the telescope structures, so it does not contaminate the observations. The light is then focused on the focal plane mask, which blocks the light from the star but allows light from the exoplanet to pass. The light gets collimated again and is then reflected off a deformable mirror to correct distortion in the image. Finally, the light passes through the Lyot Stop, which is also coated with carbon nanotubes to remove the remaining stray light. The beam is then focused onto the detector array, which forms the image.
Even with all these measures some stray light still reaches the detector, but the coronagraph creates a dark zone where only the light coming from the exoplanet can be seen. The final image on the right in the figure below shows the remaining light from the star in yellow and the light from the exoplanet in red in the dark zone.
Schematic of a notional telescope and coronagraph imaging an exoplanet. Credit: Advanced Nanophotonics/John Hagopian, LLC
HWO will use a similar scheme to search for habitable exoplanets. Scientists will analyze the spectrum of light captured by HWO to determine the gases in the atmosphere of the exoplanet. The presence of water vapor, oxygen, and perhaps other gases can indicate if an exoplanet could potentially support life.
But how do you make a carbon-nanotube-coated apodizer mirror that could be used on the HWO? Hagopian’s company Advanced Nanophotonics, LLC received Small Business Innovation Research (SBIR) funding to address this challenge.
Carbon nanotubes are grown by depositing catalyst seeds onto a substrate and then placing the substrate into a tube-shaped furnace and heating it to 1,382 degrees Fahrenheit (750 degrees Celsius), which is red hot! Gases containing carbon are then flowed into the heated tube, and at these temperatures the gases are absorbed by the metal catalyst and transform into a solution, similar to how carbon dioxide in soda water fizzes. The carbon nanotubes literally grow out of the substrate into vertically aligned tubes to form a “forest” wherever the catalyst is located.
Since the growth of carbon nanotubes on the apodizer mirror must occur only in designated areas where stray light is predicted, the catalyst must be applied only to those areas. The four main challenges that had to be overcome to develop this process were: 1) how to pattern the catalyst precisely, 2) how to get a mirror to survive high temperatures without distorting, 3) how to get a coating to survive high temperatures and still be shiny, and 4) how to get the carbon nanotubes to grow on top of a shiny coating. The Advanced Nanophotonics team refined a multi-step process (see figure below) to address these challenges.
Making an apodizer mirror for use in a coronagraph. Credit: Advanced Nanophotonics/John Hagopian, LLC
First, a silicon mirror substrate is fabricated to serve as the base for the mirror. This material has properties that allow it to survive very high temperatures and remain flat. These 2-inch mirrors are so flat that if one were scaled to the diameter of Earth, the highest mountain would only be 2.5 inches tall!
Next, the mirror is coated with multiple layers of dielectric and metal, which are deposited by knocking atoms off a target and onto the mirror in a process called sputtering. This coating must be reflective to direct the desired photons, but still be able to survive in the hot environment with corrosive gases that is required to grow carbon nanotubes.
Then a light-sensitive material called resist is applied to the mirror, and a pattern is created in the resist with a laser. The pattern is chemically developed to remove the resist only in the areas illuminated by the laser, exposing the mirror’s reflecting surface only where nanotube growth is desired.
The catalyst is then deposited over the entire mirror surface using sputtering to provide the seeds for carbon nanotube growth. A process called liftoff is used to remove the catalyst and the resist from the areas where nanotube growth is not needed. The mirror is then put in a tube furnace and heated to 1,380 degrees Fahrenheit while argon, hydrogen, and ethylene gases are flowed through the tube, which allows the chemical vapor deposition of carbon nanotubes where the catalyst has been patterned. The apodizer mirror is cooled, removed from the tube furnace, and characterized to make sure it is still flat, reflective where desired, and very black everywhere else.
The Habitable Worlds Observatory will need a coronagraph with an optimized apodizer mirror to effectively view exoplanets and gather their light for evaluation. To make sure NASA has the best chance to succeed in this search for life, the mirror design and nanotube technology are being refined in test beds across the country.
Under the SBIR program, Advanced Nanophotonics, LLC has delivered apodizers and other coronagraph components to researchers including Remi Soummer at the Space Telescope Science Institute, Eduardo Bendek and Rus Belikov at NASA Ames, Tyler Groff at NASA Goddard, and Arielle Bertrou-Cantou and Dmitri Mawet at the California Institute of Technology. These researchers are testing these components and the results of these studies will inform new designs to eventually enable the goal of a telescope with a contrast ratio of 10 billion to 1.
Reflective apodizers delivered to scientists across the country. Credit: Advanced Nanophotonics/John Hagopian, LLC
In addition, although the desired contrast ratio cannot be achieved using telescopes on Earth, testing apodizer mirror designs on ground-based telescopes not only facilitates technology development, but helps determine the objects HWO might observe. Using funding from the SBIR program, Advanced Nanophotonics also developed transmissive apodizers for the University of Notre Dame to employ on another instrument—the Gemini Planet Imager (GPI) Upgrade. In this case the carbon nanotubes were patterned and grown on glass that transmits the light from the telescope into the coronagraph. The Gemini telescope is an 8.1-meter telescope located in Chile, high atop a mountain in thin air to allow for better viewing. Dr. Jeffrey Chilcote is leading the effort to upgrade the GPI and install the carbon nanotube patterned apodizers and Lyot Stops in the coronagraph to allow viewing of exoplanets starting next year. Discoveries enabled by GPI may also drive future apodizer designs.
More recently, the company was awarded a Phase II SBIR contract to develop next-generation apodizers and other carbon nanotube-based components for the test beds of existing collaborators and new partners at the University of Arizona and the University of California Santa Clara.
Tyler Groff (left) and John Hagopian (right) display a carbon nanotube patterned apodizer mirror used in the NASA Goddard Space Flight Center coronagraph test bed. Credit: Advanced Nanophotonics/John Hagopian, LLC
As a result of this SBIR-funded technology effort, Advanced Nanophotonics has collaborated with NASA scientists to develop a variety of other applications for this nanotube technology.
A special carbon nanotube coating developed by Advanced Nanophotonics was used on the recently launched NASA Ocean Color Instrument onboard the Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission that is observing both the atmosphere and phytoplankton in the ocean, which are key to the health of our planet. A carbon nanotube coating that is only a quarter of the thickness of a human hair was applied around the entrance slit of the instrument. This coating absorbs 99.5% of light in the visible to infrared and prevents stray light from reflecting into the instrument to enable more accurate measurements. Hagopian’s team is also collaborating with the Laser Interferometer Space Antenna (LISA) team to apply the technology to mitigate stray light in the European Space Agency’s space-based gravitational wave mission.
They are also working to develop carbon nanotubes for use as electron beam emitters for a project sponsored by the NASA Planetary Instrument Concepts for the Advancement of Solar System Observations (PICASSO) Program. Led by Lucy Lim at NASA Goddard, this project aims to develop an instrument to probe asteroid and comet constituents in space.
In addition, Advanced Nanophotonics worked with researcher Larry Hess at NASA Goddard’s Detector Systems Branch and Jing Li at the NASA Ames Research Center to develop a breathalyzer to screen for Covid-19 using carbon nanotube technology. The electron mobility in a carbon nanotube network enables high sensitivity to gases in exhaled breath that are associated with disease.
This carbon nanotube-based technology is paying dividends both in space, as we continue our search for life, and here on Earth.
For additional details, see the entry for this project on NASA TechPort.
PROJECT LEAD
John Hagopian (Advanced Nanophotonics, LLC)
SPONSORING ORGANIZATION
SMD-funded SBIR project
By NASA
An astronaut aboard the International Space Station photographed wildfire smoke from Nova Scotia billowing over the Atlantic Ocean in May 2023. Warm weather and lack of rain fueled blazes across Canada last year, burning 5% of the country’s forests. Credit: NASA
Extreme wildfires like these will continue to have a large impact on global climate.
Stoked by Canada’s warmest and driest conditions in decades, extreme forest fires in 2023 released about 640 million metric tons of carbon, NASA scientists have found. That’s comparable in magnitude to the annual fossil fuel emissions of a large industrialized nation. NASA funded the study as part of its ongoing mission to understand our changing planet.
The research team used satellite observations and advanced computing to quantify the carbon emissions of the fires, which burned an area roughly the size of North Dakota from May to September 2023. The new study, published on Aug. 28 in the journal Nature, was led by scientists at NASA’s Jet Propulsion Laboratory in Southern California.
Carbon monoxide from Canada wildfires curls thousands of miles across North America in this animation showing data from summer 2023. Lower concentrations are shown in purple; higher concentrations are in yellow. Red triangles indicate fire hotspots. Credit: NASA’s Goddard Space Flight Center
They found that the Canadian fires released more carbon in five months than Russia or Japan emitted from fossil fuels in all of 2022 (about 480 million and 291 million metric tons, respectively). While the carbon dioxide (CO2) emitted from both wildfires and fossil fuel combustion causes extra warming immediately, there’s an important distinction, the scientists noted. As the forest regrows, the amount of carbon emitted from fires will be reabsorbed by Earth’s ecosystems. The CO2 emitted from the burning of fossil fuels is not readily offset by any natural processes.
An ESA (European Space Agency) instrument designed to measure air pollution observed the fire plumes over Canada. The TROPOspheric Monitoring Instrument, or TROPOMI, flies aboard the Sentinel 5P satellite, which has been orbiting Earth since 2017. TROPOMI has four spectrometers that measure and map trace gases and fine particles (aerosols) in the atmosphere.
The scientists started with the end result of the fires: the amount of carbon monoxide (CO) in the atmosphere during the fire season. Then they “back-calculated” how large the emissions must have been to produce that amount of CO. They were able to estimate how much CO2 was released based on ratios between the two gases in the fire plumes.
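Conceptually, the back-calculation scales the observed CO burden by a fire emission ratio and converts CO2 mass to carbon mass. A toy version follows; the CO amount and CO2:CO ratio are invented for illustration, chosen only so the output lands near the study's reported total rather than taken from the paper.

```python
co_tg = 150.0             # hypothetical fire CO inferred from TROPOMI (Tg)
co2_per_co_mass = 15.7    # hypothetical plume mass ratio (Tg CO2 per Tg CO)

co2_tg = co_tg * co2_per_co_mass        # ~2,355 Tg CO2
carbon_tg = co2_tg * (12.0 / 44.0)      # CO2 mass -> carbon mass, ~642 Tg C
print(f"CO2 ≈ {co2_tg:.0f} Tg, carbon ≈ {carbon_tg:.0f} Tg C")
```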
“What we found was that the fire emissions were bigger than anything in the record for Canada,” said Brendan Byrne, a JPL scientist and lead author of the new study. “We wanted to understand why.”
Warmest Conditions Since at Least 1980
Wildfire is essential to the health of forests, clearing undergrowth and brush and making way for new plant life. In recent decades, however, the number, severity, and overall size of wildfires have increased, according to the U.S. Department of Agriculture. Contributing factors include extended drought, past fire management strategies, invasive species, and the spread of residential communities into formerly less developed areas.
To explain why Canada’s fire season was so intense in 2023, the authors of the new study cited tinderbox conditions across its forests. Climate data revealed the warmest and driest fire season since at least 1980. Temperatures in the northwest part of the country — where 61% of fire emissions occurred — were more than 4.7 degrees Fahrenheit (2.6 degrees Celsius) above average from May through September. Precipitation was also more than 3 inches (8 centimeters) below average for much of the year.
Driven in large part by these conditions, many of the fires grew to enormous sizes. The fires were also unusually widespread, charring some 18 million hectares of forest from British Columbia in the west to Quebec and the Atlantic provinces in the east. The area of land that burned was more than eight times the 40-year average and accounted for 5% of Canadian forests.
“Some climate models project that the temperatures we experienced last year will become the norm by the 2050s,” Byrne said. “The warming, coupled with lack of moisture, is likely to trigger fire activity in the future.”
If events like the 2023 Canadian forest fires become more typical, they could impact global climate. That’s because Canada’s vast forests compose one of the planet’s important carbon sinks, meaning that they absorb more CO2 from the atmosphere than they release. The scientists said that it remains to be seen whether Canadian forests will continue to absorb carbon at a rapid rate or whether increasing fire activity could offset some of the uptake, diminishing the forests’ capacity to forestall climate warming.
News Media Contacts
Jane J. Lee / Andrew Wang
Jet Propulsion Laboratory, Pasadena, Calif.
818-354-0307 / 626-379-6874
jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov
Written by Sally Younger
2024-113
By NASA
This photo shows the Wide Field Instrument for NASA’s Nancy Grace Roman Space Telescope arriving at the big clean room at NASA’s Goddard Space Flight Center. About the size of a commercial refrigerator, this instrument will help astronomers explore the universe’s evolution and the characteristics of worlds outside our solar system. Unlocking these cosmic mysteries and more will offer a better understanding of the nature of the universe and our place within it. Credit: NASA/Chris Gunn
The primary instrument for NASA’s Nancy Grace Roman Space Telescope is a sophisticated camera that will survey the cosmos from the outskirts of our solar system all the way out to the edge of the observable universe. Called the Wide Field Instrument, it was recently delivered to the agency’s Goddard Space Flight Center in Greenbelt, Maryland.
The camera’s large field of view, sharp resolution, and sensitivity from visible to near-infrared wavelengths will give Roman a deep, panoramic view of the universe. Scanning much larger portions of the sky than astronomers can with NASA’s Hubble or James Webb space telescopes will open new avenues of cosmic exploration. Roman is designed to study dark energy (a mysterious cosmic pressure thought to accelerate the universe’s expansion), dark matter (invisible matter seen only via its gravitational influence), and exoplanets (worlds beyond our solar system).
“This instrument will turn signals from space into a new understanding of how our universe works,” said Julie McEnery, the Roman senior project scientist at Goddard. “To achieve its main goals, the mission will precisely measure hundreds of millions of galaxies. That’s quite a dataset for all kinds of researchers to pull from, so there will be a flood of results on a vast array of science.”
Technicians inspect NASA’s Nancy Grace Roman Space Telescope’s Wide Field Instrument upon delivery to the big clean room at NASA’s Goddard Space Flight Center. Credit: NASA/Chris Gunn
About 1,000 people contributed to the Wide Field Instrument’s development, from the initial design phase to assembling it from around a million individual components. The WFI’s design was a collaborative effort between Goddard and BAE Systems in Boulder, Colorado. Teledyne Imaging Sensors, Hawaii Aerospace Corporation, Applied Aerospace Structures Corporation, Northrop Grumman, Honeybee Robotics, CDA Intercorp, Alluxa, and JenOptik provided critical components. Those parts and many more, made by other vendors, were delivered to Goddard and BAE Systems, where they were assembled and tested prior to the instrument’s delivery to Goddard this month.
“I am so happy to be delivering this amazing instrument,” said Mary Walker, Roman’s Wide Field Instrument manager at Goddard. “All the years of hard work and the team’s dedication have brought us to this exciting moment.”
NASA’s Nancy Grace Roman Space Telescope is a next-generation observatory that will survey the infrared universe from beyond the orbit of the Moon. The spacecraft’s giant camera, the Wide Field Instrument, will be fundamental to this exploration. Data it gathers will enable scientists to discover new and uniquely detailed information about planetary systems around other stars. The instrument will also map how matter is structured and distributed throughout the cosmos, which could ultimately allow scientists to discover the fate of the universe. Watch this video to see a simplified version of how the Wide Field Instrument works.
Credit: NASA’s Goddard Space Flight Center
Seeing the Bigger Picture
After Roman launches by May 2027, each of the Wide Field Instrument’s 300-million-pixel images will capture a patch of the sky bigger than the apparent size of a full moon. The instrument’s large field of view will enable sweeping celestial surveys, revealing billions of cosmic objects across vast stretches of time and space. Astronomers will conduct research that could take hundreds of years using other telescopes.
And by observing from space, Roman’s camera will be very sensitive to infrared light – light with longer wavelengths than our eyes can see – from far across the cosmos. This ancient cosmic light will help scientists address some of the biggest cosmic mysteries, one of which is how the universe evolved to its present state.
From the telescope, light’s path through the instrument begins by passing through one of several optical elements in a large wheel. These elements include filters, which allow specific wavelengths of light to pass through, and a grism and prism, which split light into all of its individual colors. These detailed patterns, called spectra, reveal information about the object that emitted the light.
Then, the light travels on toward the camera’s set of 18 detectors, which each contain 16 million pixels. The large number of detectors and pixels gives Roman its large field of view. The instrument is designed for accurate, stable images and exquisite precision in measuring the exact amount of light in every pixel of every image, giving Roman unprecedented power to study dark energy. The detectors will be held at about minus 300 degrees Fahrenheit (minus 184 degrees Celsius) to increase sensitivity to the infrared universe.
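A quick back-of-envelope check ties those detector specs to the "300-million-pixel" figure quoted earlier; the per-detector pixel count is as stated in the article, while the sensor-format note in the comment is an assumption.

```python
detectors = 18
pixels_per_detector = 16_000_000   # per the article; roughly 4k x 4k-class sensors (assumed format)

total_pixels = detectors * pixels_per_detector
print(f"{total_pixels:,} pixels")  # 288,000,000 -- rounds to the ~300-megapixel figure
```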
“When the light reaches the detectors, that marks the end of what may have been a 10-billion-year journey through space,” said Art Whipple, an aerospace engineer at Goddard who has contributed to the Wide Field Instrument’s design and construction for more than a decade.
Once Roman begins observing, its rapid data delivery will require new analysis techniques.
“If we had every astronomer on Earth working on Roman data, there still wouldn’t be nearly enough people to go through it all,” McEnery said. “We’re looking at modern techniques like machine learning and artificial intelligence to help sift through Roman’s observations and find where the most exciting things are.”
Now that the Wide Field Instrument is at Goddard, it will be tested to ensure everything is operating as expected. It will be integrated onto the instrument carrier and mated to the telescope this fall, bringing scientists one step closer to making groundbreaking discoveries for decades to come.
One panel on the Wide Field Instrument for NASA’s Nancy Grace Roman Space Telescope contains hundreds of names of team members who helped design and build the instrument. Credit: BAE Systems
To virtually tour an interactive version of the telescope, visit:
https://roman.gsfc.nasa.gov/interactive
The Nancy Grace Roman Space Telescope is managed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, with participation by NASA’s Jet Propulsion Laboratory and Caltech/IPAC in Southern California, the Space Telescope Science Institute in Baltimore, and a science team comprising scientists from various research institutions. The primary industrial partners are BAE Systems, Inc. in Boulder, Colorado; L3Harris Technologies in Rochester, New York; and Teledyne Scientific & Imaging in Thousand Oaks, California.
By Ashley Balzer
NASA’s Goddard Space Flight Center, Greenbelt, Md.
Media contact:
Claire Andreoli
claire.andreoli@nasa.gov
NASA’s Goddard Space Flight Center, Greenbelt, Md.
301-286-1940