
Integrating Data-Driven and Physics-Based Models for Plume-Surface Interaction Predictions


NASA

1 min read


Laura Villafane
University of Illinois at Urbana-Champaign

Rocket engine exhaust during lunar landings can blow away a large amount of lunar regolith, causing damage to nearby hardware and the landing spacecraft itself. The complex physics governing this behavior is not well understood, making it hard to predict and mitigate its effects. Professor Villafane’s team will use a multi-stage approach to address this issue, in which advanced image and data processing tools, statistical models, and modern machine learning algorithms are combined. The team will extract the most relevant quantities of interest for cratering, erosion, and ejecta from the large volume of parametric experimental data and use them to derive simple closed-form models of rocket plume-surface interaction phenomena.
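As an illustration of the kind of closed-form model the team describes, the sketch below fits a simple power-law relationship between an engine parameter and an erosion quantity of interest using least-squares regression. The variable names, the power-law form, and the synthetic data are illustrative assumptions only, not the team’s actual models or measurements.

```python
import numpy as np

# Hypothetical parametric experiment: thrust settings (N) vs. measured
# eroded regolith mass (kg). Values are synthetic placeholders, not PSI data.
thrust = np.array([200.0, 400.0, 800.0, 1600.0, 3200.0])
eroded_mass = np.array([0.12, 0.31, 0.78, 1.95, 4.90])

# Assume a closed-form power law: m = a * T**b.
# Taking logs turns this into a linear least-squares fit: log m = log a + b log T.
b, log_a = np.polyfit(np.log(thrust), np.log(eroded_mass), deg=1)
a = np.exp(log_a)

print(f"fitted model: eroded_mass ~ {a:.3g} * thrust**{b:.3f}")

# Use the fitted closed-form model to extrapolate to a new engine setting.
print(f"predicted eroded mass at 5000 N: {a * 5000.0**b:.2f} kg")
```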

Back to ESI 2023



  • Similar Topics

    • By NASA
      SatSummit brings together leaders in the satellite industry and global development experts for two days of presentations and discussions on using satellite data to address critical development challenges. Rahul Ramachandran (ST11/IMPACT) participated in a panel focused on large Earth foundation models, offering an overview of AI foundation models and their potential for societal good. He detailed NASA’s approach to building these models and the agency’s overall strategy, underscoring their importance in advancing Earth science and global development initiatives.

    • By NASA
      4 min read
      NASA-IBM Collaboration Develops INDUS Large Language Models for Advanced Science Research
      Named for the southern sky constellation, INDUS (stylized in all caps) is a comprehensive suite of large language models supporting five science domains. Credit: NASA
      By Derek Koehl
      Collaborations with private, non-federal partners through Space Act Agreements are a key component in the work done by NASA’s Interagency Implementation and Advanced Concepts Team (IMPACT). A collaboration with International Business Machines (IBM) has produced INDUS, a comprehensive suite of large language models (LLMs) tailored for the domains of Earth science, biological and physical sciences, heliophysics, planetary sciences, and astrophysics and trained using curated scientific corpora drawn from diverse data sources.
      INDUS contains two types of models: encoders and sentence transformers. Encoders convert natural language text into numerical representations that the LLM can process. The INDUS encoders were trained on a corpus of 60 billion tokens encompassing astrophysics, planetary science, Earth science, heliophysics, biological, and physical sciences data. Its custom tokenizer, developed by the IMPACT-IBM collaborative team, improves on generic tokenizers by recognizing scientific terms like “biomarkers” and “phosphorylated.” Over half of the 50,000-word vocabulary contained in INDUS is unique to the specific scientific domains used for its training. The INDUS encoder models were used to fine-tune the sentence transformer models on approximately 268 million text pairs, including titles/abstracts and questions/answers.
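      For readers who want to experiment with the released models, the sketch below loads an encoder and its tokenizer with the Hugging Face transformers library and encodes a sentence containing domain terms. The model identifier is a placeholder for illustration; check the NASA-IMPACT organization on Hugging Face for the exact repository names.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder model ID; the released INDUS encoders live under the
# NASA-IMPACT organization on Hugging Face (exact repo names may differ).
MODEL_ID = "nasa-impact/indus-encoder"  # illustrative, not a confirmed repo name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

text = "Phosphorylated biomarkers were detected in the regolith simulant."

# A domain-specific tokenizer should keep scientific terms such as
# "phosphorylated" in fewer pieces than a generic tokenizer would.
print(tokenizer.tokenize(text))

# Encode the sentence and take the first token's hidden state as a
# simple fixed-length representation of the whole sentence.
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
embedding = outputs.last_hidden_state[:, 0, :]
print(embedding.shape)  # (1, hidden_size)
```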
      By providing INDUS with domain-specific vocabulary, the IMPACT-IBM team achieved superior performance over open, non-domain-specific LLMs on a benchmark for biomedical tasks, a scientific question-answering benchmark, and Earth science entity recognition tests. Because it was designed for diverse linguistic tasks and retrieval-augmented generation, INDUS is able to process researcher questions, retrieve relevant documents, and generate answers to those questions. For latency-sensitive applications, the team developed smaller, faster versions of both the encoder and sentence transformer models.
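      The retrieval step described above can be sketched with the sentence-transformers library: embed a researcher’s question and a set of candidate passages, then rank the passages by cosine similarity. The model name below is a generic placeholder used only to show the mechanics; it is not the INDUS retriever itself.

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder model; substitute the released INDUS sentence transformer
# from the NASA-IMPACT Hugging Face organization for real use.
model = SentenceTransformer("all-MiniLM-L6-v2")

passages = [
    "SCALPSS collects imagery of plume-surface interaction during lunar descent.",
    "INDUS encoders were trained on a 60-billion-token scientific corpus.",
    "The Open Science Data Repository hosts biological and physical sciences data.",
]
question = "What corpus were the INDUS encoder models trained on?"

# Embed the question and candidate passages into the same vector space.
passage_emb = model.encode(passages, convert_to_tensor=True)
question_emb = model.encode(question, convert_to_tensor=True)

# Rank passages by cosine similarity; the top hit would feed a RAG pipeline.
scores = util.cos_sim(question_emb, passage_emb)[0]
best = int(scores.argmax())
print(passages[best], float(scores[best]))
```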
      Validation tests demonstrate that INDUS excels in retrieving relevant passages from the science corpora in response to a NASA-curated test set of about 400 questions. IBM researcher Bishwaranjan Bhattacharjee commented on the overall approach: “We achieved superior performance by not only having a custom vocabulary but also a large specialized corpus for training the encoder model and a good training strategy. For the smaller, faster versions, we used neural architecture search to obtain a model architecture and knowledge distillation to train it with supervision of the larger model.”
      NASA Chief Scientist Kate Calvin gives remarks in a NASA employee town hall on how the agency is using and developing Artificial Intelligence (AI) tools to advance missions and research, Wednesday, May 22, 2024, at the NASA Headquarters Mary W. Jackson Building in Washington. The INDUS suite of models will help facilitate the agency’s AI goals. Credit: NASA/Bill Ingalls
      INDUS was also evaluated using data from NASA’s Biological and Physical Sciences (BPS) Division. Dr. Sylvain Costes, the NASA BPS project manager for Open Science, discussed the benefits of incorporating INDUS: “Integrating INDUS with the Open Science Data Repository (OSDR) Application Programming Interface (API) enabled us to develop and trial a chatbot that offers more intuitive search capabilities for navigating individual datasets. We are currently exploring ways to improve OSDR’s internal curation data system by leveraging INDUS to enhance our curation team’s productivity and reduce the manual effort required daily.”
      At the NASA Goddard Earth Sciences Data and Information Services Center (GES-DISC), the INDUS model was fine-tuned using labeled data from domain experts to categorize publications specifically citing GES-DISC data into applied research areas. According to NASA principal data scientist Dr. Armin Mehrabian, this fine-tuning “significantly improves the identification and retrieval of publications that reference GES-DISC datasets, which aims to improve the user journey in finding their required datasets.” Furthermore, the INDUS encoder models are integrated into the GES-DISC knowledge graph, supporting a variety of other projects, including the dataset recommendation system and GES-DISC GraphRAG.
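      A classification fine-tune like the GES-DISC publication tagging can be sketched by placing a sequence-classification head on top of an encoder. The labels, example texts, and model identifier below are illustrative placeholders, not the GES-DISC setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder encoder; an INDUS encoder checkpoint could be dropped in here.
MODEL_ID = "bert-base-uncased"  # illustrative stand-in
LABELS = ["hydrology", "air quality", "climate"]  # hypothetical applied research areas

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=len(LABELS))

# Tiny hypothetical training batch: publication abstracts and their labels.
texts = [
    "Precipitation retrievals were used to study regional flood events.",
    "Aerosol optical depth trends were analyzed over urban regions.",
]
labels = torch.tensor([0, 1])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One illustrative optimization step; a real fine-tune would loop over
# many batches and epochs with held-out evaluation.
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))
```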
      Kaylin Bugbee, team lead of NASA’s Science Discovery Engine (SDE), spoke to the benefit INDUS offers to existing applications: “Large language models are rapidly changing the search experience. The Science Discovery Engine, a unified, insightful search interface for all of NASA’s open science data and information, has prototyped integrating INDUS into its search engine. Initial results have shown that INDUS improved the accuracy and relevancy of the returned results.”
      INDUS enhances scientific research by providing researchers with improved access to vast amounts of specialized knowledge. INDUS can understand complex scientific concepts and reveal new research directions based on existing data. It also enables researchers to extract relevant information from a wide array of sources, improving efficiency. Aligned with NASA and IBM’s commitment to open and transparent artificial intelligence, the INDUS models are openly available on Hugging Face. For the benefit of the scientific community, the team has released the developed models and will release the benchmark datasets that span named entity recognition for climate change, extractive QA for Earth science, and information retrieval for multiple domains. The INDUS encoder models are adaptable for science domain applications, and the INDUS retriever models support information retrieval in RAG applications.
      A paper on INDUS, “INDUS: Effective and Efficient Language Models for Scientific Applications,” is available at https://arxiv.org/pdf/2405.10725.
      Learn more about the Science Discovery Engine here.
      Share








      Details
      Last Updated: Jun 24, 2024
      Related Terms: Open Science
    • By NASA
      4 min read
      Say cheese, Moon. We’re coming in for a close-up.
      As Intuitive Machines’ Nova-C lander descends toward the Moon, four tiny NASA cameras will be trained on the lunar surface, collecting imagery of how the surface changes from interactions with the spacecraft’s engine plume.
      The Stereo Cameras for Lunar Plume-Surface Studies will help us to land larger payloads as we explore space. Olivia Tyrrell from the SCALPSS photogrammetry team explains how a small array of cameras will capture invaluable imagery during lunar descent and landing, and how that imagery can inform our future missions to the Moon and beyond.
      Developed at NASA’s Langley Research Center in Hampton, Virginia, Stereo Cameras for Lunar Plume-Surface Studies (SCALPSS) is an array of cameras placed around the base of a lunar lander to collect imagery during and after descent. Using a technique called stereo photogrammetry, researchers at Langley will use the overlapping images from the version of SCALPSS on Nova-C — SCALPSS 1.0 — to produce a 3D view of the surface.
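      As a rough illustration of the stereo photogrammetry idea, the sketch below computes a disparity map from a rectified image pair with OpenCV and converts it to depth using the cameras’ focal length and baseline. The file names and camera parameters are placeholders; the actual SCALPSS processing pipeline is considerably more involved.

```python
import cv2
import numpy as np

# Hypothetical rectified stereo pair from two cameras with overlapping views.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: disparity is the horizontal pixel shift of features
# between the two views; nearer surface points shift more.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

# Illustrative camera parameters (pixels and meters), not SCALPSS values.
focal_length_px = 1200.0
baseline_m = 0.5

# Triangulation: depth = f * B / disparity, valid where disparity > 0.
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_length_px * baseline_m / disparity[valid]

# A grid of depths like this is the starting point for a digital elevation map.
print(depth_m[valid].min(), depth_m[valid].max())
```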
      These images of the Moon’s surface won’t just be a “gee-whiz” novelty. As trips to the Moon increase and the number of payloads touching down in proximity to one another grows, scientists and engineers need to be able to accurately predict the effects of landings.
      How much will the surface change? As a lander comes down, what happens to the lunar soil, or regolith, it ejects? With limited data collected during descent and landing to date, SCALPSS will be the first dedicated instrument to measure plume-surface interaction on the Moon in real time and help to answer these questions.
      “If we’re placing things – landers, habitats, etc. – near each other, we could be sand blasting what’s next to us, so that’s going to drive requirements on protecting those other assets on the surface, which could add mass, and that mass ripples through the architecture,” said Michelle Munk, principal investigator for SCALPSS and acting chief architect for NASA’s Space Technology Mission Directorate at NASA Headquarters. “It’s all part of an integrated engineering problem.”
      Under Artemis, NASA intends to collaborate with commercial and international partners to establish the first long-term presence on the Moon. On this Commercial Lunar Payload Services (CLPS) initiative delivery, SCALPSS 1.0 is purely focused on how the lander alters the surface of the Moon during landing. It will begin capturing imagery from before the time the lander’s plume begins interacting with the surface until after the landing is complete.
      The final images will be gathered on a small onboard data storage unit before being sent to the lander for downlink back to Earth. The team will likely need at least a couple of months to process the images, verify the data, and generate the 3D digital elevation maps of the surface. The expected depression they reveal probably won’t be very deep — not this time, anyway.
      “Even if you look at the old Apollo images — and the Apollo crewed landers were larger than these new robotic landers — you have to look really closely to see where the erosion took place,” said Rob Maddock, SCALPSS project manager at Langley. “We’re anticipating something on the order of centimeters deep — maybe an inch. It really depends on the landing site and how deep the regolith is and where the bedrock is.”
      But this is a chance for researchers to see how well SCALPSS will work as the U.S. advances into a future where Human Landing System-class spacecraft will start making trips to the Moon.
      “Those are going to be much larger than even Apollo. Those are pretty large engines, and they could conceivably dig some good holes,” said Maddock. “So that’s what we’re doing. We’re collecting data we can use to validate the models that are predicting what will happen.”
      SCALPSS 1.1, which will feature two additional cameras, is scheduled to fly on another CLPS delivery — Firefly Aerospace’s Blue Ghost — later this year. The extra cameras are optimized to take images at a higher altitude, prior to the expected onset of plume-surface interaction, and provide a more accurate before-and-after comparison.
      SCALPSS 1.0 was funded by NASA’s Science Mission Directorate through the NASA-Provided Lunar Payloads Program. The SCALPSS 1.1 project is funded by the Space Technology Mission Directorate’s Game Changing Development Program.
      NASA is working with several American companies to deliver science and technology to the lunar surface through the CLPS initiative.
      These companies, ranging in size, bid on delivering payloads for NASA. This includes everything from payload integration and operations, to launching from Earth and landing on the surface of the Moon.
      Joe Atkinson
      NASA Langley Research Center
      Details
      Last Updated: Feb 02, 2024
      Related Terms: Commercial Lunar Payload Services (CLPS), Langley Research Center
    • By NASA
      1 min read
      David Scarborough
      Auburn University
      Professor Scarborough will develop and implement tools to extract critical data from experimental measurements of plume-surface interaction (PSI) to identify and classify dominant regimes, develop physics-based, semi-empirical models to predict PSI phenomena, and quantify the uncertainties. The team will adapt and apply state-of-the-art image processing techniques, such as edge detection and 3D stereo reconstruction to extract cratering dynamics and particle tracking velocimetry to extract ejecta dynamics, and will use supervised machine learning algorithms to identify patterns. The models developed will establish a relationship between crater geometry and ejecta dynamics, including quantified uncertainties.
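      To give a flavor of the image-processing step, the sketch below applies Canny edge detection and contour extraction with OpenCV to outline a crater boundary in a single frame. The file name and thresholds are placeholders, and the team’s actual pipeline (including 3D stereo reconstruction and particle tracking velocimetry) goes well beyond this.

```python
import cv2

# Hypothetical frame from a plume-surface interaction experiment.
frame = cv2.imread("psi_frame.png", cv2.IMREAD_GRAYSCALE)

# Smooth sensor noise, then detect intensity edges; thresholds are illustrative.
blurred = cv2.GaussianBlur(frame, (5, 5), 0)
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

# Trace contours along the detected edges and keep the largest one,
# treated here as a stand-in for the evolving crater rim.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
crater = max(contours, key=cv2.contourArea)

# Simple geometric descriptors that could feed a regime-classification model.
area_px = cv2.contourArea(crater)
(x, y), radius_px = cv2.minEnclosingCircle(crater)
print(area_px, radius_px)
```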
      Back to ESI 2023