Weld-ASSIST: Weldability Assessment for In-Space Conditions using a Digital Twin

ESI24 Haghighi Quadchart

Azadeh Haghighi
University of Illinois, Chicago

In-space manufacturing and assembly are vital to NASA’s long-term exploration goals, especially for Moon and Mars missions. Deploying welding technology in space enables the assembly and repair of structures, reducing logistical burdens and resupply needs from Earth. The unique challenges and extreme conditions of space (high thermal variations, microgravity, and vacuum) require advanced welding techniques and computational tools to ensure reliability, repeatability, safety, and structural integrity in one-shot weld scenarios. For the first time, this project investigates these challenges by focusing on three key factors:

(1) Very low temperatures in space degrade the weldability of high-thermal-conductivity materials, such as aluminum alloys, making it harder to achieve strong, defect-free welds.

(2) The extreme vacuum of space lowers the boiling points of alloying elements, altering the keyhole geometry during welding. This selective vaporization changes the weld’s final chemical composition, affecting its microstructure and properties.

(3) Microgravity nearly eliminates buoyancy-driven flow of liquid metal inside the molten pool, preventing gas bubbles from escaping, which leads to porosity and defects in the welds.

By examining these critical factors using multi-scale, multi-physics models integrated with physics-informed machine learning and forward/inverse uncertainty quantification techniques, this project provides the first real-time digital twin platform for evaluating welding processes under extreme space and lunar conditions. The models are validated through Earth-based experiments, parabolic flight tests, and publicly available data from databases and agencies worldwide. Moreover, the established models will extend to support in-situ resource utilization on the Moon, including construction and repair using locally sourced materials such as regolith.
The established fundamental scientific knowledge will minimize trial-and-error, enable high-quality one-shot welds in space, and reduce the need for reworks, significantly reducing the costs and time needed for space missions.
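The vacuum effect described in factor (2) follows from standard vapor-pressure thermodynamics; a back-of-the-envelope illustration (not taken from the project's models) uses the Clausius–Clapeyron relation:

```latex
% Clausius–Clapeyron vapor-pressure relation for an alloying element,
% with molar enthalpy of vaporization \Delta H_{\mathrm{vap}} and gas constant R:
p(T) = p_0 \exp\!\left[-\frac{\Delta H_{\mathrm{vap}}}{R}\left(\frac{1}{T}-\frac{1}{T_0}\right)\right]

% Effective boiling point T_b at ambient pressure p_a (solve p(T_b) = p_a):
T_b = \left[\frac{1}{T_0} - \frac{R}{\Delta H_{\mathrm{vap}}}\,\ln\frac{p_a}{p_0}\right]^{-1}
```

As the ambient pressure drops toward vacuum, the logarithm becomes large and negative and the effective boiling point falls, so volatile alloying elements (e.g., Mg and Zn in aluminum alloys) vaporize preferentially from the melt pool, consistent with the composition change described above.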



  • Similar Topics

    • By NASA
      4 min read
      Entrepreneurs Challenge Winner PRISM is Using AI to Enable Insights from Geospatial Data
      PRISM’s platform uses AI segmentation to identify and highlight residential structures in a neighborhood.
      NASA sponsored Entrepreneurs Challenge events in 2020, 2021, and 2023 to invite small business start-ups to showcase innovative ideas and technologies with the potential to advance the agency’s science goals. To potentially leverage external funding sources for the development of innovative technologies of interest to NASA, the Science Mission Directorate (SMD) involved the venture capital community in Entrepreneurs Challenge events. Challenge winners were awarded prize money, and in 2023 the total Entrepreneurs Challenge prize value was $1M. Numerous challenge winners have subsequently refined their products and/or received funding from NASA and external sources (e.g., other government agencies or the venture capital community) to further develop their technologies.
      One 2023 Entrepreneurs Challenge winner, PRISM Intelligence (formerly known as Pegasus Intelligence and Space), is using artificial intelligence (AI) and other advances in computer vision to create a new platform that could provide geospatial insights to a broad community.
      Every day, vast amounts of remote sensing data are collected through satellites, drones, and aerial imagery, but for most businesses and individuals, accessing and extracting meaningful insights from this data is nearly impossible.  
      The company’s product—Personal Real-time Insight from Spatial Maps, a.k.a. PRISM—is transforming geospatial data into an easy-to-navigate, queryable world. By leveraging 3D computer vision, geospatial analytics, and AI-driven insights, PRISM creates photorealistic, up-to-date digital environments that anyone can interact with. Users can simply log in and ask natural-language questions to instantly retrieve insights—no advanced Geographic Information System (GIS) expertise is required.
      For example, a pool cleaner looking for business could use PRISM to search for all residential pools in a five-mile radius. A gardener could identify overgrown trees in a community. City officials could search for potholes in their jurisdiction to prioritize repairs, enhance public safety, and mitigate liability risks. This broad level of accessibility brings geospatial intelligence out of the hands of a few and into everyday decision making.
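      A natural-language radius query like the pool-cleaner example ultimately reduces to filtering labeled objects by distance from a point. A minimal sketch of that final step, with invented labels and coordinates (PRISM's actual data model and APIs are not described in this article):

```python
import math

# Toy stand-in for PRISM-style segmented objects: (label, latitude, longitude).
# All labels and coordinates here are made up for illustration.
OBJECTS = [
    ("pool", 34.058, -117.820),
    ("pool", 34.300, -117.500),
    ("pothole", 34.057, -117.821),
]

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_within(label, lat, lon, radius_miles, objects=OBJECTS):
    """Return all objects of one label type within a radius of a point."""
    return [o for o in objects
            if o[0] == label
            and haversine_miles(lat, lon, o[1], o[2]) <= radius_miles]
```

      With the toy data above, `find_within("pool", 34.058, -117.820, 5)` returns only the nearby pool; the second pool is roughly 25 miles away and is excluded. The segmentation step that produces the labels is the hard part; the query itself is simple geometry.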
      The core of PRISM’s platform uses radiance fields to convert raw 2D imagery into high-fidelity, dynamic 3D visualizations. These models are then enhanced with AI-powered segmentation, which autonomously identifies and labels objects in the environment—such as roads, vehicles, buildings, and natural features—allowing for seamless search and analysis. The integration of machine learning enables PRISM to refine its reconstructions continuously, improving precision with each dataset. This advanced processing ensures that the platform remains scalable, efficient, and adaptable to various data sources, making it possible to produce large-scale, real-time digital twins of the physical world.
      The PRISM platform’s interface showcasing a 3D digital twin of California State Polytechnic University, Pomona, with AI-powered search and insights.
      “It’s great being able to push the state of the art in this relatively new domain of radiance fields, evolving it from research to applications that can impact common tasks. From large sets of images, PRISM creates detailed 3D captures that embed more information than the source pictures.” — Maximum Wilder-Smith, Chief Technology Officer, PRISM Intelligence
      Currently the PRISM platform uses proprietary data gathered from aerial imagery over selected areas. PRISM then generates high-resolution digital twins of cities in select regions. The team is aiming to eventually expand the platform to use NASA Earth science data and commercial data, which will enable high-resolution data capture over larger areas, significantly increasing efficiency, coverage, and update frequency. PRISM aims to use the detailed multiband imagery that NASA provides and the high-frequency data that commercial companies provide to make geospatial intelligence more accessible by providing fast, reliable, and up-to-date insights that can be used across multiple industries.
      What sets PRISM apart is its focus on usability. While traditional GIS platforms require specialized training to use, PRISM eliminates these barriers by allowing users to interact with geospatial data through a frictionless, conversational interface.
      The impact of this technology could extend across multiple industries. Professionals in the insurance and appraisal industries have informed the company how the ability to generate precise, 3D assessments of properties could streamline risk evaluations, reduce costs, and improve accuracy—replacing outdated or manual site visits. Similarly, local governments have indicated they could potentially use PRISM to better manage infrastructure, track zoning compliance, and allocate resources based on real-time, high-resolution urban insights. Additionally, scientists could use the consistent updates and layers of three-dimensional data that PRISM can provide to better understand changes to ecosystems and vegetation.
      As PRISM moves forward, the team’s focus remains on scaling its capabilities and expanding its applications. Currently, the team is working to enhance the technical performance of the platform while also adding data sources to enable coverage of more regions. Future iterations will further improve automation of data processing, increasing the speed and efficiency of real-time 3D reconstructions. The team’s goal is to expand access to geospatial insights, ensuring that anyone—from city planners to business owners—can make informed decisions using the best possible data.
      PRISM Intelligence founders Zachary Gaines, Hugo Delgado, and Maximum Wilder-Smith in their California State Polytechnic University, Pomona lab, where the company was first formed.








      Details: Last Updated Apr 21, 2025
      Related Terms: Earth Science Division, Earth Science, Science-enabling Technology, Technology Highlights
    • By NASA
      Tess Caswell, a stand-in crew member for the Artemis III Virtual Reality Mini-Simulation, executes a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. The simulation was a test of using VR as a training method for flight controllers and science teams’ collaboration on science-focused traverses on the lunar surface. Credit: NASA/Robert Markowitz
      When astronauts walk on the Moon, they’ll serve as the eyes, hands, and boots-on-the-ground interpreters supporting the broader teams of scientists on Earth. NASA is leveraging virtual reality to provide high-fidelity, cost-effective support to prepare crew members, flight control teams, and science teams for a return to the Moon through its Artemis campaign.
      The Artemis III Geology Team, led by principal investigator Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, participated in an Artemis III Surface Extra-Vehicular VR Mini-Simulation, or “sim” at NASA’s Johnson Space Center in Houston in the fall of 2024. The sim brought together science teams and flight directors and controllers from Mission Control to carry out science-focused moonwalks and test the way the teams communicate with each other and the astronauts.
      “There are two worlds colliding,” said Dr. Matthew Miller, co-lead for the simulation and exploration engineer, Amentum/JETSII contract with NASA. “There is the operational world and the scientific world, and they are becoming one.”
      NASA mission training can include field tests covering areas from navigation and communication to astronaut physical and psychological workloads. Many of these tests take place in remote locations and can require up to a year to plan and large teams to execute. VR may provide an additional option for training that can be planned and executed more quickly to keep up with the demands of preparing to land on the Moon in an environment where time, budgets, and travel resources are limited.
      “VR helps us break down some of those limitations and allows us to do more immersive, high-fidelity training without having to go into the field. It provides us with a lot of different, and significantly more, training opportunities,” said Bri Sparks, NASA co-lead for the simulation and Extra Vehicular Activity Extended Reality team at Johnson.
      Field testing won’t be going away. Nothing can fully replace the experience crew members gain by being in an environment that puts literal rocks in their hands and includes the physical challenges that come with moonwalks, but VR has competitive advantages.
      The virtual environment used in the Artemis III VR Mini-Sim was built using actual lunar surface data from one of the Artemis III candidate regions. This allowed the science team to focus on Artemis III science objectives and traverse planning directly applicable to the Moon. Eddie Paddock, engineering VR technical discipline lead at NASA Johnson, and his team used data from NASA’s Lunar Reconnaissance Orbiter and planet position and velocity over time to develop a virtual software representation of a site within the Nobile Rim 1 region near the south pole of the Moon. Two stand-in crew members performed moonwalk traverses in virtual reality in the Prototype Immersive Technology lab at Johnson, and streamed suit-mounted virtual video camera views, hand-held virtual camera imagery, and audio to another location where flight controllers and science support teams simulated ground communications.
      A screen capture of a virtual reality view during the Artemis III VR Mini-Simulation. The lunar surface virtual environment was built using actual lunar surface data from one of the Artemis III candidate regions. Credit: Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston.
      The crew stand-ins were immersed in the lunar environment and could then share the experience with the science and flight control teams. That quick and direct feedback could prove critical to the science and flight control teams as they work to build cohesive teams despite very different approaches to their work.
      The flight operations team and the science team are learning how to work together and speak a shared language. Both teams are pivotal parts of the overall mission operations. The flight control team focuses on maintaining crew and vehicle safety and minimizing risk as much as possible. The science team, as Miller explains, is “relentlessly thirsty” for as much science as possible. Training sessions like this simulation allow the teams to hone their relationships and processes.
      Members of the Artemis III Geology Team and science support team work in a mock Science Evaluation Room during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Video feeds from the stand-in crew members’ VR headsets allow the science team to follow, assess, and direct moonwalks and science activities. Credit: NASA/Robert Markowitz
      Denevi described the flight control team as a “well-oiled machine” and praised their dedication to getting it right for the science team. Many members of the flight control team have participated in field and classroom training to learn more about geology and better understand the science objectives for Artemis.
      “They have invested a lot of their own effort into understanding the science background and science objectives, and the science team really appreciates that and wants to make sure they are also learning to operate in the best way we can to support the flight control team, because there’s a lot for us to learn as well,” Denevi said. “It’s a joy to get to share the science with them and have them be excited to help us implement it all.”
      Artemis III Geology Team lead Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, left, Artemis III Geology Team member, Dr. Jose Hurtado, University of Texas at El Paso, and simulation co-lead, Bri Sparks, work together during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      This simulation, Sparks said, was just the beginning for how virtual reality could supplement training opportunities for Artemis science. In the future, using mixed reality could help take the experience to the next level, allowing crew members to be fully immersed in the virtual environment while interacting with real objects they can hold in their hands. Now that the Nobile Rim 1 landing site is built in VR, it can continue to be improved and used for crew training, something that Sparks said can’t be done with field training on Earth.
      While “virtual” was part of the title for this exercise, its applications are very real.
      “We are uncovering a lot of things that people probably had in the back of their head as something we’d need to deal with in the future,” Miller said. “But guess what? The future is now. This is now.”
      Test subject crew members for the Artemis III Virtual Reality Mini-Simulation, including Grier Wilt, left, and Tess Caswell, center, execute a moonwalk in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Grier Wilt, left, and Tess Caswell, crew stand-ins for the Artemis III Virtual Reality Mini-Simulation, execute a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Engineering VR technical discipline lead Eddie Paddock works with team members to facilitate the virtual reality components of the Artemis III Virtual Reality Mini-Simulation in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: Robert Markowitz
      Flight director Paul Konyha follows moonwalk activities during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz




      Rachel Barry
      NASA’s Johnson Space Center
    • By NASA
      How to Attend
      The workshop will be hosted by NASA Jet Propulsion Laboratory.
      Virtual and in-person attendance are available. Registration is required for both. (Link coming soon!)
      Virtual attendees will receive connection information one week before the workshop.
      Background, Goals and Objectives
      The NASA Engineering and Safety Center (NESC) is conducting an assessment of the state of cold-capable electronics for future lunar surface missions. The intent is to enable the continuous use of electronics with minimal or no thermal management on missions of up to 20 years in all regions of the lunar surface, e.g., permanently shadowed regions and equatorial. The scope of the assessment includes: capture of the state of cold electronics at NASA, academia, and industry; applications and challenges for lunar environments; gap analyses of desired capabilities vs. state of the art/practice; guidance for cold electronics selection, evaluation, and qualification; and recommendations for technology advances and follow-on actions to close the gaps. The preliminary report of the assessment will be available the first week of April 2025 on this website, i.e., 3 weeks prior to the workshop. Attendees are urged to read the report beforehand, as the workshop will provide only a limited, high-level summary of the report’s key findings. The goal of the workshop is to capture your feedback on the findings of the report, especially in the areas below:
      • Technologies, new or important studies, or data that we missed.
      • Gaps, i.e., requirements vs. available capabilities, that we missed.
      • Additional recommendations, suggestions, or requests.
      Preliminary Agenda
      Day 1, April 30, 2025
      8:00 – 9:00    Sign-in
      9:00 – 10:00   Introduction – Y. Chen
      10:00 – 11:00  Environment and Architectural Considerations – R. Some
      11:00 – 12:00  Custom Electronics – M. Mojarradi
      12:00 – 13:00  Lunch
      13:00 – 14:00  COTS Components – J. Yang-Scharlotta
      14:00 – 15:00  Power Architecture – R. Oeftering
      15:00 – 15:30  Energy Storage – E. Brandon
      15:30 – 17:00  Materials and Packaging and Passives – L. Del Castillo
      17:00 – 17:30  Qualification – Y. Chen
      18:30          Dinner

      Day 2, May 1, 2025
      8:00 – 9:00    Sign-in
      9:00 – 12:00   Review and discussion of key findings
      12:00 – 13:00  Lunch
      13:00 – 15:00  Follow-on work concepts & discussions (15 min each from industry primes and subsystem developers). Please be prepared to discuss: What would you like to see developed, and how would it impact your future missions/platforms?
      15:00 – 17:30  Follow-on work concepts & discussions (15 min each from technology & component developers, academia, government agencies, etc.). What would you like to be funded to do, and what are the benefits to NASA/missions?
      17:00 – 17:30  Wrap up – Y. Chen

      Points of Contact
      If you have any questions regarding the workshop, please contact Roxanne Cena at Roxanne.R.Cena@jpl.nasa.gov and Amy K. Wilson at Amy.K.Wilson@jpl.nasa.gov
      Details: Last Updated Feb 20, 2025
      Related Terms: NASA Engineering and Safety Center
    • By NASA
      NASA Science Live: Asteroid Bennu Originated from World with Ingredients and Conditions for Life
    • By NASA
      Equipped with state-of-the-art technology to test and evaluate communication, navigation, and surveillance systems, NASA’s Pilatus PC-12 performs touch-and-go maneuvers over a runway at NASA’s Armstrong Flight Research Center in Edwards, California, on Sept. 23, 2024. Researchers will use the data to understand Automatic Dependent Surveillance-Broadcast (ADS-B) signal loss scenarios for air taxi flights in urban areas. To prepare for ADS-B test flights, pilots and crew from NASA Armstrong and NASA’s Glenn Research Center in Cleveland ran a series of familiarization flights. These flights included several approaches and landings, with an emphasis on avionics and medium-altitude air work with steep turns, slow flight, and stall demonstrations. Credit: NASA/Steve Freeman
      As air taxis, drones, and other innovative aircraft enter U.S. airspace, systems that communicate an aircraft’s location will be critical to ensure air traffic safety.
      The Federal Aviation Administration (FAA) requires aircraft to communicate their locations to other aircraft and air traffic control in real time using an Automatic Dependent Surveillance-Broadcast (ADS-B) system. NASA is currently evaluating an ADS-B system’s ability to prevent collisions in a simulated urban environment. Using NASA’s Pilatus PC-12 aircraft, researchers are investigating how these systems could handle the demands of air taxis flying at low altitudes through cities.  
      When operating in urban areas, one particular challenge for ADS-B systems is consistent signal coverage. Like losing cell-phone signal, air taxis flying through densely populated areas may have trouble maintaining ADS-B signals due to distance or interference. If that happens, those vehicles become less visible to air traffic control and other aircraft in the area, increasing the likelihood of collisions.
      NASA pilot Kurt Blankenship maps out flight plans during a pre-flight brief. Pilots, crew, and researchers from NASA’s Armstrong Flight Research Center in Edwards, California, and NASA’s Glenn Research Center in Cleveland are briefed on the flight plan to gather Automatic Dependent Surveillance-Broadcast signal data between the aircraft and ping-Stations on the ground at NASA Armstrong. These flights are the first cross-center research activity with the Pilatus PC-12 at NASA Armstrong. Credit: NASA/Steve Freeman
      To simulate the conditions of an urban flight area and better understand signal loss patterns, NASA researchers established a test zone at NASA’s Armstrong Flight Research Center in Edwards, California, on Sept. 23 and 24, 2024.
      Flying the agency’s Pilatus PC-12 in a grid pattern over four ADS-B stations, researchers collected data on signal coverage from multiple ground locations and equipment configurations. Researchers were able to pinpoint where signal dropouts occurred relative to the plane’s altitude and distance from the strategically placed ground stations. This data will inform future placement of additional ground stations to boost signal coverage.
      “Like all antennas, those used for ADS-B signal reception do not have a constant pattern,” said Brad Snelling, vehicle test team chief engineer for NASA’s Air Mobility Pathfinders project. “There are certain areas where the terrain will block ADS-B signals and depending on the type of antenna and location characteristics, there are also flight elevation angles where reception can cause signal dropouts,” Snelling said. “This would mean we need to place additional ground stations at multiple locations to boost the signal for future test flights. We can use the test results to help us configure the equipment to reduce signal loss when we conduct future air taxi flight tests.”
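      The dropout geometry Snelling describes can be approximated with simple trigonometry: each flight sample’s slant range and elevation angle to a ground station largely determine whether reception is marginal. A hedged sketch of that screening step (the thresholds and flat-Earth geometry are illustrative assumptions, not NASA’s analysis):

```python
import math

def slant_geometry(ac_alt_m, gs_dist_m):
    """Slant range (m) and elevation angle (deg) from a ground station
    to an aircraft, using a flat-Earth approximation that is adequate
    for short test-range distances."""
    slant = math.hypot(gs_dist_m, ac_alt_m)
    elev_deg = math.degrees(math.atan2(ac_alt_m, gs_dist_m))
    return slant, elev_deg

def likely_dropout(ac_alt_m, gs_dist_m, max_range_m=50_000, min_elev_deg=2.0):
    """Flag a sample as a likely ADS-B dropout if it is beyond the
    station's usable range or below a minimum elevation angle, where
    terrain masking and antenna-pattern nulls degrade reception.
    Both thresholds are illustrative placeholders, not measured values."""
    slant, elev = slant_geometry(ac_alt_m, gs_dist_m)
    return slant > max_range_m or elev < min_elev_deg

# Grid-pattern samples: (altitude above station in m, horizontal distance in m)
samples = [(1500, 5_000), (300, 20_000), (1500, 60_000)]
flags = [likely_dropout(alt, d) for alt, d in samples]
```

      In this sketch the low-and-far sample is flagged by the elevation-angle test and the distant sample by the range test, mirroring how dropouts cluster at low elevation angles and long distances; a real analysis would fit such thresholds to the recorded signal data rather than assume them.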
      Working in the Mobile Operations Facility at NASA’s Armstrong Flight Research Center in Edwards, California, NASA Advanced Air Mobility researcher Dennis Iannicca adjusts a control board to capture Automatic Dependent Surveillance-Broadcast (ADS-B) data during test flights. The data will be used to understand ADS-B signal loss scenarios for air taxi flights in urban areas. Credit: NASA/Steve Freeman
      The September flights at NASA Armstrong built upon earlier tests of ADS-B in different environments. In June, researchers at NASA’s Glenn Research Center in Cleveland flew the Pilatus PC-12 and found a consistent ADS-B signal between the aircraft and communications antennas mounted on the roof of the center’s Aerospace Communications Facility. Data from these flights helped researchers plan out the recent tests at NASA Armstrong. In December 2020, test flights performed under NASA’s Advanced Air Mobility National Campaign used an OH-58C Kiowa helicopter and ground-based ADS-B stations at NASA Armstrong to collect baseline signal information.
      NASA’s research in ADS-B signals and other communication, navigation, and surveillance systems will help revolutionize U.S. air transportation. Air Mobility Pathfinders researchers will evaluate the data from the three separate flight tests to understand the different signal transmission conditions and equipment needed for air taxis and drones to operate safely in the National Airspace System. NASA will use the results of this research to design infrastructure to support future air taxi communication, navigation, and surveillance research and to develop new ADS-B-like concepts for uncrewed aircraft systems.
      Details: Last Updated Jan 23, 2025
      Editor: Dede Dinius. Contact: Laura Mitchell, laura.a.mitchell@nasa.gov. Location: Armstrong Flight Research Center
      Related Terms: Armstrong Flight Research Center, Advanced Air Mobility, Aeronautics, Air Mobility Pathfinders project, Airspace Operations and Safety Program, Ames Research Center, Glenn Research Center, Langley Research Center