ESA’s digital Historical Archives open online
-
Similar Topics
-
By NASA
Ken Freeman (center) receives the ATCA Award for the ATM-X Digital Information Platform (DIP) from Rachel Jackson, Chair of the ATCA Board of Directors (left), and Carey Fagan, President and CEO of ATCA (right). Credit: NASA
Air Traffic Control Association (ATCA) Award to the NASA ATM-X Digital Information Platform (DIP) Team
In November 2024, the Digital Information Platform (DIP) team received the prestigious Industry Award from the Air Traffic Control Association (ATCA) at the annual ATCA Connect Conference in Washington, DC. The award recognized the team’s efforts in supporting NASA’s Sustainable Flight National Partnership (SFNP), which aims for net-zero carbon emissions from aviation by 2050. The DIP sub-project focuses on increasing access to digital aviation information to enable efficient and sustainable airspace operations. Since 2022, the DIP team has been conducting live operational demonstrations with commercial airlines in the North Texas metroplex environment of the Collaborative Digital Departure Reroute (CDDR) tool, which applies machine learning to predict runway availability, departure times, and arrival times. DIP has signed Space Act Agreements with five major U.S. airlines to carry out operational evaluations of CDDR in complex metroplex environments and is now deploying the CDDR capability to Houston. The CDDR machine learning algorithm uses real-time weather and operational data to provide rerouting options to operators, reducing delays, fuel burn, and carbon emissions. DIP is part of the Air Traffic Management – eXploration (ATM-X) project, which is focused on transforming the air traffic management system to accommodate new air vehicles. More information on the ATCA award is available at: https://www.atca.org/detail-pages/news/2024/11/15/atca-presents-annual-awards-at-atca-connect-recognizing-exceptional-efforts-made-to-the-worldwide-air-traffic-control-and-airspace-system.
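CDDR’s actual models and training data are not described in this announcement, so the sketch below is only a hypothetical illustration of the general pattern it refers to: learning to predict departure timing from live weather and surface-operations data. Every feature name, value, and the synthetic delay target are invented for illustration and are not drawn from the DIP system.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Hypothetical features standing in for real-time weather and surface data.
rng = np.random.default_rng(42)
n = 2000
features = np.column_stack([
    rng.uniform(0, 40, n),     # wind speed in knots (assumed feature)
    rng.uniform(1, 10, n),     # visibility in statute miles (assumed feature)
    rng.integers(0, 30, n),    # aircraft ahead in the departure queue (assumed)
    rng.integers(0, 2, n),     # runway-configuration flag (assumed)
])
# Synthetic target: departure delay in minutes grows with wind and queue length.
delay = 0.4 * features[:, 0] + 1.5 * features[:, 2] + rng.normal(0, 3, n)

X_train, X_test, y_train, y_test = train_test_split(features, delay, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```

In an operational setting, predictions like these would be refreshed continuously as new weather and surface data arrive, and the resulting estimates would feed the rerouting options offered to operators.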
View the full article
-
By NASA
4 min read
NASA Open Science Reveals Sounds of Space
A composite image of the Crab Nebula features X-rays from Chandra (blue and white), optical data from Hubble (purple), and infrared data from Spitzer (pink). This image is one of several that can be experienced as a sonification through Chandra’s Universe of Sound project. X-ray: NASA/CXC/SAO; Optical: NASA/STScI; Infrared: NASA-JPL-Caltech
NASA has a long history of translating astronomy data into beautiful images that are beloved by the public. Through its Chandra X-ray Observatory and Universe of Learning programs, NASA brings that principle into the world of audio in a project known as “A Universe of Sound.” The team has converted openly available data from Chandra, supplemented by open data from other observatories, into dozens of “sonifications,” with more on the way.
Following the open science principle of accessibility, “A Universe of Sound” helps members of the public who are blind or low vision experience NASA data in a new sensory way. Sighted users also enjoy listening to the sonifications.
“Open science is this way to not just have data archives that are accessible and incredibly rich, but also to enhance the data outputs themselves,” said Dr. Kimberly Arcand, the visualization scientist and emerging technology lead at Chandra and member of NASA’s Universe of Learning who heads up the sonification team. “I want everybody to have the same type of access to this data that I do as a scientist. Sonification is just one of those steps.”
Data sonification of the Milky Way galactic center, made using data from NASA’s Chandra X-ray Observatory, Hubble Space Telescope, and Spitzer Space Telescope.
While the Chandra telescope provides data in X-ray wavelengths for most of the sonifications, the team also took open data from other observatories to create a fuller picture of the universe. Types of data used to create some of the sonifications include visual and ultraviolet light from the Hubble Space Telescope, infrared and visual light from the James Webb Space Telescope, and infrared light from the now-retired Spitzer Space Telescope.
The sonification team, which includes astrophysicist Matt Russo, musician Andrew Santaguida (both of the SYSTEM Sounds project), consultant Christine Malec, and Dr. Arcand, assigned each wavelength of observation to a different musical instrument or synthesized sound to create a symphony of data. Making the separate layers publicly available was important to the team to help listeners understand the data better.
“It’s not just about accessibility. It’s also about reproducibility,” Arcand said. “We’re being very specific with providing all of the layers of sound, and then describing what those layers are doing to make it more transparent and obvious which steps were taken and what process of translation has occurred.”
For example, in a sonification of the supernova remnant Cassiopeia A, modified piano sounds represent X-ray data from Chandra, strings and brass represent infrared data from Webb and Spitzer, and small cymbals represent stars located via visual light data from Hubble.
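The article describes this mapping only at a high level. As a rough illustration of how one wavelength layer can become audio, the minimal sketch below scans an image from left to right and turns bright pixels into tones, with pitch set by image row and loudness by brightness; it is not the team’s actual pipeline, and the input arrays are random stand-ins for calibrated telescope data.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
SECONDS_PER_COLUMN = 0.05

def sonify_layer(image, f_low=200.0, f_high=2000.0, harmonic=1):
    """Turn one wavelength layer (2D array with values in 0..1) into audio."""
    n_rows, n_cols = image.shape
    freqs = np.geomspace(f_high, f_low, n_rows)   # top of image = highest pitch
    samples_per_col = int(SAMPLE_RATE * SECONDS_PER_COLUMN)
    t = np.arange(samples_per_col) / SAMPLE_RATE
    track = np.zeros(n_cols * samples_per_col)
    for col in range(n_cols):                     # scan left to right
        chunk = np.zeros(samples_per_col)
        for row in range(n_rows):
            amp = image[row, col]
            if amp > 0.1:                         # skip near-black pixels
                chunk += amp * np.sin(2 * np.pi * harmonic * freqs[row] * t)
        track[col * samples_per_col:(col + 1) * samples_per_col] = chunk
    return track

# Random stand-ins for two wavelength layers; the 'harmonic' argument gives
# each layer a slightly different timbre, echoing the instrument-per-layer idea.
rng = np.random.default_rng(0)
xray = rng.random((32, 64)) ** 4
infrared = rng.random((32, 64)) ** 6

mix = sonify_layer(xray, harmonic=1) + sonify_layer(infrared, harmonic=2)
mix /= np.abs(mix).max()                          # normalize before writing
wavfile.write("sonification_sketch.wav", SAMPLE_RATE, (mix * 32767).astype(np.int16))
```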
Data sonification of the Cassiopeia A supernova remnant, made using data from NASA’s Chandra X-ray Observatory, James Webb Space Telescope, and Hubble Space Telescope.
The team brought together people of various backgrounds to make the project a success: scientists to obtain and interpret the data, audio engineers to mix the sonifications, and members of the blind and low vision community to shape the product into something that brings a greater understanding of the data.
“Another benefit to open science is it tends to open those pathways of collaboration,” Arcand said. “We invite lots of different community members into the process to make sure we’re creating something that adds value, that adds to the greater good, and that makes the investment in the data worthwhile.”
A documentary about the sonifications called “Listen to the Universe” is hosted on NASA+. Visitors can listen to all the team’s sonifications, including the separate layers from each wavelength of observation, on the Universe of Sound website.
By Lauren Leese
Web Content Strategist for the Office of the Chief Science Data Officer
View the full article
-
By NASA
Northrop Grumman & NASA Digital Engineering SAA kick-off meeting at the Thompson Space Innovation Center.
NASA’s Digital Engineering program is paving the way for exciting new possibilities. Its latest Space Act Agreement with Northrop Grumman promises to accelerate progress in space exploration through innovative collaboration.
Terry Hill, the Digital Engineering Program Manager in NASA Headquarters’ Office of the Chief Engineer, recently signed a Space Act Agreement with Northrop Grumman Space Sector to explore digital engineering approaches to sharing information between industry partners and NASA. This collaboration aims to support NASA’s mission by advancing engineering practices to reduce the time from concept to flight. Leveraging digital engineering tools could lead to improved design, testing, and simulation processes. It could also help improve how the government and industry write contracts, making it easier and more efficient for them to share information. This would help both sides work together better, handle more complicated missions, and speed up the development of new space technologies.
This collaboration between NASA and Northrop Grumman brings exciting possibilities for the future of space exploration. By embracing digital engineering, both organizations are working toward more efficient, cost-effective missions and solutions to greater challenges. Beyond accelerating mission timelines, the insights and technologies developed through this collaboration could pave the way for groundbreaking advancements in space capabilities.
View the full article
-
By NASA
4 min read
NASA AI, Open Science Advance Natural Disaster Research and Recovery
Hurricane Ida is pictured as a category 2 storm from the International Space Station as it orbited 264 miles above the Gulf of Mexico. In the foreground is the Canadarm2 robotic arm with Dextre, the fine-tuned robotic hand, attached. Credit: NASA
By Lauren Perkins
When you think of NASA, disasters such as hurricanes may not be the first thing to come to mind, but several NASA programs are building tools and advancing science to help communities make more informed decisions for disaster planning.
Empowered by NASA’s commitment to open science, the NASA Disasters Program supports disaster risk reduction, response, and recovery. A core element of the Disasters Program is providing trusted, timely, and actionable data to aid organizations actively responding to disasters.
Hurricane Ida made landfall in Louisiana on Aug. 29, 2021, as a category 4 hurricane, becoming one of the deadliest and most destructive hurricanes on record in the continental United States. The effects of the storm were widespread, causing devastating damage and affecting the lives of millions of people.
During Hurricane Ida, while first responders and other organizations addressed the storm’s impacts from the ground, the NASA Disasters Program provided a multitude of remotely sensed products. Some of the products and models included information on changes in soil moisture, changes in vegetation, precipitation accumulations, flood detection, and nighttime lights to help identify areas of power outages.
The NASA team shared the data with its partners on the NASA Disasters Mapping Portal and began participating in cross-agency coordination calls to determine how to further aid response efforts. To further connect and collaborate using open science efforts, NASA Disasters overlaid publicly uploaded photos on their Damage Proxy Maps to provide situational awareness of on-the-ground conditions before, during, and after the storm.
Immediate post-storm response is critical to saving lives, just as informed, long-term response decisions are critical to providing equitable recovery solutions for all. One example of how this data can be used is blue tarp detection in the aftermath of Hurricane Ida.
Using artificial intelligence (AI) with NASA satellite images, the Interagency Implementation and Advanced Concepts Team (IMPACT), based at NASA’s Marshall Space Flight Center in Huntsville, Alabama, conducted a study to detect the number of blue tarps on rooftops in the aftermath of hurricanes, such as Ida, as a way of characterizing the severity of damage in local communities.
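The study itself applied trained AI models to NASA satellite imagery, and those models are not reproduced here. As a much simpler stand-in for the underlying idea, the sketch below flags candidate blue-tarp pixels in an ordinary post-storm RGB aerial image by color thresholding and reports how much of the scene they cover; the file name and thresholds are assumptions for illustration only.

```python
import numpy as np
from PIL import Image

def blue_tarp_fraction(path, blue_min=120, margin=30):
    """Fraction of pixels whose blue channel clearly dominates red and green."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (b > blue_min) & (b > r + margin) & (b > g + margin)
    return mask.mean()

# Hypothetical file name; any post-storm aerial RGB image tile would do.
fraction = blue_tarp_fraction("post_storm_tile.png")
print(f"Candidate blue-tarp coverage: {fraction:.1%} of pixels")
```

A color threshold like this is crude next to a learned model, which is why studies such as IMPACT’s train AI on large labeled image sets instead, but it conveys how rooftop tarps become a measurable signal of damage severity.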
An aerial photograph shows damaged roofs from Hurricane Maria in 2017 in Barrio Obrero, Puerto Rico. In the wake of the hurricane, the Federal Emergency Management Agency (FEMA) and United States Army Corps of Engineers distributed 126,000 blue tarps and nearly 60,000 temporary blue roofs to people awaiting repairs on damaged homes. Credit: NASA
While disasters cannot be avoided altogether, timely and accessible information helps communities worldwide reduce risk, improve response, hasten recovery, and build disaster resilience.
Through an initiative led by NASA’s Office of the Chief Science Data Officer, NASA and IBM are developing five open-source artificial intelligence foundation models trained on NASA’s expansive satellite repositories. This effort will help make NASA’s vast, ever-growing amounts of data more accessible and usable. Leveraging NASA’s AI expertise allows users to make faster, more informed decisions. User applications of the Prithvi Earth Foundation Models could range from identifying flood risks and predicting crop yields to forecasting long-range atmospheric weather patterns.
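The Prithvi models themselves are published separately and are not reproduced here. The sketch below only illustrates the general fine-tuning pattern that makes a foundation model useful for a task like flood mapping: keep a pretrained encoder frozen and train a small task-specific head on labeled tiles. The tiny encoder, band count, and random data are stand-ins, not the actual Prithvi architecture or weights.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):                # stand-in for a pretrained backbone
    def __init__(self, in_bands=6, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_bands, dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class FloodHead(nn.Module):                  # small task head trained from scratch
    def __init__(self, dim=64):
        super().__init__()
        self.classifier = nn.Conv2d(dim, 1, 1)   # one channel: flood probability

    def forward(self, feats):
        return self.classifier(feats)

encoder, head = TinyEncoder(), FloodHead()
for p in encoder.parameters():               # "foundation model" weights stay frozen
    p.requires_grad = False

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Fake batch: 4 tiles, 6 spectral bands, 64x64 pixels, with binary flood masks.
tiles = torch.randn(4, 6, 64, 64)
masks = torch.randint(0, 2, (4, 1, 64, 64)).float()

logits = head(encoder(tiles))
loss = loss_fn(logits, masks)
loss.backward()
optimizer.step()
print(f"one fine-tuning step, loss = {loss.item():.3f}")
```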
“NASA is dedicated to ensuring that our scientific data are accessible and beneficial to all. Our AI foundation models are scientifically validated and adaptable to new data, designed to maximize efficiency and lower technical barriers. This ensures that even in the face of challenging disasters, response teams can be swift and effective,” said Kevin Murphy, NASA’s chief science data officer. “Through these efforts, we’re not only advancing scientific frontiers, but also delivering tangible societal benefits, providing data that can safeguard lives and improve resilience against future threats.”
Hear directly from some of the data scientists building these AI models, the NASA disaster response team, as well as hurricane hunters that fly directly into these devastating storms on NASA’s Curious Universe podcast.
Learn more about NASA’s AI for Science models at https://science.nasa.gov/artificial-intelligence-science/.
View the full article
-
By NASA
2 min read
ESI24 Haghighi Quadchart
Azadeh Haghighi
University of Illinois, Chicago
In-space manufacturing and assembly are vital to NASA’s long-term exploration goals, especially for Moon and Mars missions. Deploying welding technology in space enables the assembly and repair of structures, reducing logistical burdens and supply needs from Earth. The unique challenges and extreme conditions of space (high thermal variations, microgravity, and vacuum) require advanced welding techniques and computational tools to ensure reliability, repeatability, safety, and structural integrity in one-shot weld scenarios. For the first time, this project investigates these challenges by focusing on three key factors:
(1) Very low temperatures in space degrade the weldability of high-thermal-conductivity materials, such as aluminum alloys, making it harder to achieve strong, defect-free welds.
(2) The extreme vacuum of space lowers the boiling points of alloying elements, altering the keyhole geometry during welding. This selective vaporization changes the weld’s final chemical composition, affecting its microstructure and properties.
(3) Microgravity nearly eliminates buoyancy-driven flow of liquid metal inside the molten pool, preventing gas bubbles from escaping, which leads to porosity and defects in the welds.
By examining these critical factors using multi-scale, multi-physics models integrated with physics-informed machine learning and forward/inverse uncertainty quantification techniques, this project provides the first-ever real-time digital twin platform for evaluating welding processes under extreme space and lunar conditions. The models are validated through Earth-based experiments, parabolic flight tests, and publicly available data from databases and agencies worldwide. Moreover, the established models will extend to support in-situ resource utilization on the Moon, including construction and repair using locally sourced materials such as regolith. The resulting fundamental scientific knowledge will minimize trial and error, enable high-quality one-shot welds in space, and reduce the need for rework, significantly reducing the cost and time needed for space missions.
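The project’s multi-scale, multi-physics models are far beyond a short example, but the physics-informed machine learning ingredient can be sketched in miniature: a small network is trained to fit a handful of temperature measurements while also being penalized for violating a governing equation, here a 1D heat equation with an assumed diffusivity. Everything below is a toy illustration, not the project’s models or data.

```python
import torch
import torch.nn as nn

alpha = 0.1  # assumed thermal diffusivity for the toy problem

# Small network approximating a temperature field u(x, t).
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x, t):
    """Residual of the heat equation u_t - alpha * u_xx at points (x, t)."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

# Toy data: a few "measured" temperatures plus random collocation points.
x_data, t_data = torch.rand(16, 1), torch.rand(16, 1)
u_data = torch.sin(torch.pi * x_data) * torch.exp(-alpha * torch.pi**2 * t_data)
x_col, t_col = torch.rand(128, 1), torch.rand(128, 1)

for step in range(200):
    optimizer.zero_grad()
    data_loss = ((net(torch.cat([x_data, t_data], dim=1)) - u_data) ** 2).mean()
    physics_loss = (pde_residual(x_col, t_col) ** 2).mean()
    loss = data_loss + physics_loss      # physics term regularizes sparse data
    loss.backward()
    optimizer.step()

print(f"final combined loss: {loss.item():.4f}")
```

The same pattern, with the governing equations swapped for weld-pool heat transfer and fluid flow, is what lets a physics-informed model stay consistent with known physics when measurements are sparse.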
View the full article