
Recommended Posts

  • Publishers
Posted

5 min read

Preparations for Next Moonwalk Simulations Underway (and Underwater)

This artist’s concept shows astronauts working on the Moon alongside different technology systems. The Data & Reasoning Fabric technology could help these systems operate in harmony, supporting the astronauts and ground control on Earth.
Credit: NASA

Imagine your car is in conversation with other traffic and road signals as you travel. Those conversations help your car anticipate actions you can’t see: the sudden slowing of a truck as it begins to turn ahead of you, or an obscured traffic signal turning red. Meanwhile, this system has plotted a course that will drive you toward a station to recharge or refuel, while a conversation with a weather service prepares your windshield wipers and brakes for the rain ahead.

This trip requires a lot of communication among systems owned by different companies, government agencies, and other organizations. How might these different entities – each with their own proprietary technology – share data safely in real time to make your trip safe, efficient, and enjoyable?

Technologists at NASA’s Ames Research Center in California’s Silicon Valley created a framework called Data & Reasoning Fabric (DRF), a set of software infrastructure, tools, protocols, governance, and policies that allow safe, secure data sharing and logical prediction-making across different operators and machines. Originally developed with a focus on providing autonomous aviation drones with decision-making capabilities, DRF is now being explored for other applications.

This means that one day, DRF-informed technology could allow your car to receive traffic data safely and securely from nearby stoplights and share data with other vehicles on the road. In this scenario, DRF is the choreographer of a complex dance of moving objects, ensuring each moves seamlessly in relation to the others toward a shared goal. The system is designed to create an integrated environment, combining data from systems that would otherwise be unable to interact with each other.
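NASA has not published DRF’s interfaces, so the short Python sketch below is purely illustrative: every name in it (DataFabric, Policy, the operator IDs) is hypothetical. It shows the kind of policy-governed, deny-by-default data sharing the article describes, with a city traffic department publishing a stoplight’s state and a vehicle operator receiving it only because a governance policy permits that pairing.

```python
# Purely illustrative: NASA has not published a DRF API, so DataFabric, Policy,
# and every operator name here are hypothetical. The sketch shows the core idea
# of policy-governed data sharing between otherwise separate operators.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Policy:
    """Governance rule: which operators may receive data on a topic."""
    topic: str
    allowed_operators: set[str]

@dataclass
class DataFabric:
    policies: dict[str, Policy] = field(default_factory=dict)
    subscribers: dict[str, list[tuple[str, Callable[[dict], None]]]] = field(default_factory=dict)

    def register_policy(self, policy: Policy) -> None:
        self.policies[policy.topic] = policy

    def subscribe(self, operator: str, topic: str, handler: Callable[[dict], None]) -> None:
        self.subscribers.setdefault(topic, []).append((operator, handler))

    def publish(self, publisher: str, topic: str, data: dict) -> None:
        # Deny by default: deliver only where a policy explicitly allows it.
        policy = self.policies.get(topic)
        for operator, handler in self.subscribers.get(topic, []):
            if policy and operator in policy.allowed_operators:
                handler(data)

# Usage: a city stoplight shares its state with one approved vehicle operator.
fabric = DataFabric()
fabric.register_policy(Policy("intersection/5th-main/signal", {"acme_vehicles"}))
fabric.subscribe("acme_vehicles", "intersection/5th-main/signal",
                 lambda d: print(f"car slows: light is {d['state']}"))
fabric.publish("city_traffic_dept", "intersection/5th-main/signal",
               {"state": "red", "seconds_to_change": 12})
```

Refusing delivery unless a policy explicitly allows it mirrors the article’s emphasis on safe, secure sharing between systems that were never designed to talk to each other.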

“DRF is built to be used behind the scenes,” said David Alfano, chief of the Intelligent Systems Division at Ames. “Companies are developing autonomous technology, but their systems aren’t designed to work with technology from competitors. The DRF technology bridges that gap, organizing these systems to work together in harmony.”

Traffic enhancements are just one use case for this innovative system. The technology could enhance how we use autonomy to support human needs on Earth, in the air, and even on the Moon.

Supporting Complex Logistics

To illustrate the technology’s impact, the DRF team worked with the city of Phoenix on an aviation solution to improve transportation of critical medical supplies from urban areas out to rural communities with limited access to these resources. An autonomous system identified where supplies were needed and directed a drone to pick up and transport supplies quickly and safely.

“All the pieces need to come together, which takes a lot of effort. The DRF technology provides a framework where suppliers, medical centers, and drone operators can work together efficiently,” said Moustafa Abdelbaky, senior computer scientist at Ames. “The goal isn’t to remove human involvement, but help humans achieve more.”
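The article does not describe DRF’s actual scheduling logic, so the following is only a toy sketch with invented request and drone records: a greedy dispatcher that assigns each supply request, most urgent first, to the nearest idle drone once suppliers, medical centers, and drone operators expose their data through a shared framework.

```python
# Toy sketch only: the article does not describe DRF's scheduling internals.
# All request and drone records below are invented for illustration.
import math

def dispatch(requests: list[dict], drones: list[dict]) -> list[tuple[str, str]]:
    """Greedily assign each supply request to the nearest idle drone."""
    assignments = []
    idle = {d["id"]: d for d in drones}
    for req in sorted(requests, key=lambda r: r["priority"]):  # urgent first
        if not idle:
            break  # no drones left; remaining requests wait
        nearest = min(idle.values(), key=lambda d: math.dist(d["pos"], req["pos"]))
        assignments.append((req["id"], nearest["id"]))
        del idle[nearest["id"]]  # drone is now busy
    return assignments

requests = [{"id": "clinic-7:insulin", "pos": (33.6, -112.4), "priority": 1}]
drones = [{"id": "op-A/drone-3", "pos": (33.4, -112.1)},
          {"id": "op-B/drone-9", "pos": (34.0, -113.0)}]
print(dispatch(requests, drones))  # -> [('clinic-7:insulin', 'op-A/drone-3')]
```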

The DRF technology is part of a larger effort at Ames to develop concepts that enable autonomous operations and integrate them into the public and commercial sectors to create safer, more efficient environments.

“At NASA, we’re always learning something. There’s a silver lining when one project ends: you can identify a new lesson learned, a new application, or a new economic opportunity to continue and scale that work,” said Supreet Kaur, lead systems engineer at Ames. “And because we leverage all of the knowledge we’ve gained through these experiments, we are able to make future research more robust.”

Choreographed Autonomy

Industries like modern mining involve a variety of autonomous and advanced vehicles and machinery, but these systems often cannot communicate well enough to operate safely in the same area. The DRF technology’s “choreography” might help them work together, improving efficiency. Researchers met with a commercial mining company to learn which issues it struggles with when using autonomous equipment and to identify where DRF might provide future solutions.

“If an autonomous drill is developed by one company, but the haul trucks are developed by another, those two machines are dancing to two different sets of music. Right now, they need to be kept apart manually for safety,” said Johnathan Stock, chief scientist for innovation at the Ames Intelligent Systems Division. “The DRF technology can harmonize their autonomous work so these mining companies can use autonomy across the board to create a safer, more effective enterprise.”
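DRF’s real coordination protocol is not public. As a stand-in, this sketch shows one simple form such choreography could take, with all names invented for illustration: each machine must claim an exclusive work zone before moving, and a claim that overlaps another vendor’s machine is refused, replacing the manual separation Stock describes.

```python
# Hypothetical sketch: DRF's actual protocol is not public. A toy
# "choreographer" grants machines exclusive claims on work zones, one simple
# way to keep an autonomous drill and haul trucks from different vendors apart.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Zone:
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def overlaps(self, other: "Zone") -> bool:
        return (self.x_min < other.x_max and other.x_min < self.x_max and
                self.y_min < other.y_max and other.y_min < self.y_max)

@dataclass
class Choreographer:
    claims: dict[str, Zone] = field(default_factory=dict)

    def request_zone(self, machine: str, zone: Zone) -> bool:
        """Grant the zone only if it overlaps no other machine's claim."""
        for owner, claimed in self.claims.items():
            if owner != machine and zone.overlaps(claimed):
                return False  # conflict: caller must wait or replan
        self.claims[machine] = zone
        return True

choreo = Choreographer()
print(choreo.request_zone("vendorA/drill-1", Zone(0, 10, 0, 10)))   # True
print(choreo.request_zone("vendorB/haul-7", Zone(8, 20, 5, 15)))    # False: overlaps the drill
print(choreo.request_zone("vendorB/haul-7", Zone(12, 25, 0, 10)))   # True: clear of the drill
```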

Further testing of DRF on equipment like that used in mines could take place at the NASA Ames Roverscape, an outdoor test area with obstacles such as slopes and rocks where the technology’s choreography could be put through its paces.

Stock also envisions DRF improving operations on the Moon. Autonomous vehicles could transport materials, drill, and excavate, while launch vehicles come and go. These operations will likely include systems from different companies or industries and could be choreographed by DRF.

As autonomous systems and technologies increase across markets, on Earth, in orbit, and on the Moon, DRF researchers are ready to step onto the dance floor to make sure everything runs smoothly.

“When everyone’s dancing to the same tune, things run seamlessly, and more is possible.”

Details

Last Updated: Mar 20, 2025

View the full article


  • Similar Topics

    • By NASA
Credit: NASA
NASA has awarded a contract to MacLean Engineering & Applied Technologies, LLC of Houston to provide simulation and advanced software services to the agency.
The Simulation and Advanced Software Services II (SASS II) contract includes services from Oct. 1, 2025, through Sept. 30, 2030, with a maximum potential value not to exceed $150 million. The contract is a single-award, indefinite-delivery/indefinite-quantity contract with the capability to issue cost-plus-fixed-fee task orders and firm-fixed-price task orders.
Under the five-year SASS II contract, the awardee is tasked to provide simulation and software services for space-based vehicle models and robotic manipulator systems; human biomechanical representations for analysis and development of countermeasures devices; guidance, navigation, and control of space-based vehicles for all flight phases; and space-based vehicle on-board computer systems simulations of flight software systems. Responsibilities also include astronomical object surface interaction simulation of space-based vehicles, graphics support for simulation visualization and engineering analysis, and ground-based and onboard systems to support human-in-the-loop training.
      Major subcontractors include Tietronix Software Inc. in Houston and VEDO Systems, LLC, in League City, Texas.
      For information about NASA and agency programs, visit:
      https://www.nasa.gov/
      -end-
      Tiernan Doyle
      Headquarters, Washington
      202-358-1600
      tiernan.doyle@nasa.gov
      Chelsey Ballarte
      Johnson Space Center, Houston
      281-483-5111
      Chelsey.n.ballarte@nasa.gov
Details
Last Updated: Jul 02, 2025
Location: NASA Headquarters
Related Terms: Technology, Johnson Space Center
View the full article
    • By NASA
      On June 11, NASA’s LRO (Lunar Reconnaissance Orbiter) captured photos of the site where the ispace Mission 2 SMBC x HAKUTO-R Venture Moon (RESILIENCE) lunar lander experienced a hard landing on June 5, 2025, UTC.
RESILIENCE lunar lander impact site, as seen by NASA’s Lunar Reconnaissance Orbiter Camera (LROC) on June 11, 2025. The lander created a dark smudge surrounded by a subtle bright halo. Credit: NASA/Goddard/Arizona State University
RESILIENCE was launched on Jan. 15 on a privately funded spacecraft.
      LRO’s right Narrow Angle Camera (one in a suite of cameras known as LROC) captured the images featured here from about 50 miles above the surface of Mare Frigoris, a volcanic region interspersed with large-scale faults known as wrinkle ridges.
      The dark smudge visible above the arrow in the photo formed as the vehicle impacted the surface, kicking up regolith — the rock and dust that make up Moon “soil.” The faint bright halo encircling the site resulted from low-angle regolith particles scouring the delicate surface.
This animation shows the RESILIENCE site before and after the impact. In the image, north is up. Looking from west to east, or left to right, the area pictured covers 2 miles. Credit: NASA/Goddard/Arizona State University
LRO is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland, for the Science Mission Directorate at NASA Headquarters in Washington. Launched on June 18, 2009, LRO has collected a treasure trove of data with its seven powerful instruments, making an invaluable contribution to our knowledge about the Moon. NASA is returning to the Moon with commercial and international partners to expand human presence in space and bring back new knowledge and opportunities.
      More on this story from Arizona State University’s LRO Camera website
      Media Contact
      Karen Fox / Molly Wasser
      Headquarters, Washington
      202-358-1600
      karen.c.fox@nasa.gov / molly.l.wasser@nasa.gov

      Lonnie Shekhtman
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      lonnie.shekhtman@nasa.gov
Details
Last Updated: Jun 20, 2025
Editor: Madison Olson
Contact: Molly Wasser, molly.l.wasser@nasa.gov
Location: Goddard Space Flight Center
Related Terms: Lunar Reconnaissance Orbiter (LRO), Earth's Moon
View the full article
    • By USH
      The photograph was captured by the Mast Camera (Mastcam) aboard NASA’s Curiosity rover on Sol 3551 (August 2, 2022, at 20:43:28 UTC). 

What stands out in the image are two objects that appear strikingly out of place amid the natural Martian landscape of rocks and boulders. Their sharp edges, right angles, flat surfaces, and geometric symmetry suggest they may have been shaped by advanced cutting tools rather than natural erosion. 

      Could these ancient remnants be part of a destroyed structure or sculpture? If so, they may serve as yet another piece of evidence pointing to the possibility that Mars was once home to an intelligent civilization, perhaps even the advanced humanoid beings who, according to some theories, fled the catastrophic destruction of planet Maldek and sought refuge on the Red Planet. 
Objects discovered by Jean Ward. Watch Jean Ward's YouTube video on this topic: Here. See original NASA source: Here.
      View the full article
    • By European Space Agency
      ESA Impact: Pick of our spring space snaps

      View the full article
    • By NASA
NASA named Stanford University in California the winner of the Lunar Autonomy Challenge, a six-month competition in which U.S. college and university student teams virtually mapped and explored the lunar surface using a digital twin of NASA’s In-Situ Resource Utilization Pilot Excavator (IPEx). 
      The winning team successfully demonstrated the design and functionality of their autonomous agent, or software that performs specified actions without human intervention. Their agent autonomously navigated the IPEx digital twin in the virtual lunar environment, while accurately mapping the surface, correctly identifying obstacles, and effectively managing available power.
Lunar simulation developed by the Lunar Autonomy Challenge’s first place team from Stanford University. Credit: Stanford University’s NAV Lab team
Team photo of the NAV Lab Lunar Autonomy Challenge team from Stanford University. Credit: Stanford University’s NAV Lab team
“The Lunar Autonomy Challenge has been a truly unique experience. The challenge provided the opportunity to develop and test methods in a highly realistic simulation environment.”
Adam Dai
      Lunar Autonomy Challenge team lead, Stanford University

      Dai added, “It pushed us to find solutions robust to the harsh conditions of the lunar surface. I learned so much through the challenge, both about new ideas and methods, as well as through deepening my understanding of core methods across the autonomy stack (perception, localization, mapping, planning). I also very much enjoyed working together with my team to brainstorm different approaches and strategies and solve tangible problems observed in the simulation.” 
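The teams’ code is not reproduced in the article, but the stack Dai names can be outlined as a skeletal loop. In this illustrative Python sketch every function is a stub standing in for real perception, localization, mapping, and planning algorithms, with a battery check of the kind the challenge scored.

```python
# Illustrative stub of an autonomy-stack loop; none of this is the Stanford
# team's actual code. Each stage is a placeholder for a real algorithm.
def perceive(sensors):
    return {"obstacles": sensors.get("rocks", [])}

def localize(pose, observations):
    return pose  # stand-in for a filter-based pose update

def update_map(grid, observations):
    for cell in observations["obstacles"]:
        grid[cell] = "obstacle"  # mark occupied cells
    return grid

def plan(pose, goal, grid):
    # Naive planner: head straight for the goal unless the goal cell is blocked.
    return [goal] if goal not in grid else []

def autonomy_step(pose, sensors, goal, grid, battery):
    if battery < 0.15:  # manage power before anything else
        return pose, grid, "return_to_charger"
    observations = perceive(sensors)
    pose = localize(pose, observations)
    grid = update_map(grid, observations)
    path = plan(pose, goal, grid)
    return pose, grid, path[0] if path else "hold"

pose, grid = (0, 0), {}
print(autonomy_step(pose, {"rocks": [(2, 3)]}, goal=(5, 5), grid=grid, battery=0.8))
# -> ((0, 0), {(2, 3): 'obstacle'}, (5, 5))
```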
      The challenge offered 31 teams a valuable opportunity to gain experience in software development, autonomy, and machine learning using cutting-edge NASA lunar technology. Participants also applied essential skills common to nearly every engineering discipline, including technical writing, collaborative teamwork, and project management.
      The Lunar Autonomy Challenge supports NASA’s Lunar Surface Innovation Initiative (LSII), which is part of the Space Technology Mission Directorate. The LSII aims to accelerate technology development and pursue results that will provide essential infrastructure for lunar exploration by collaborating with industry, academia, and other government agencies.
“The work displayed by all of these teams has been impressive, and the solutions they have developed are beneficial to advancing lunar and Mars surface technologies as we prepare for increasingly complex missions farther from home.” 
      Niki Werkheiser
      Director of Technology Maturation and LSII lead, NASA Headquarters
      “To succeed, we need input from everyone — every idea counts to propel our goals forward. It is very rewarding to see these students and software developers contributing their skills to future lunar and Mars missions,” Werkheiser added.  
      Through the Lunar Autonomy Challenge, NASA collaborated with the Johns Hopkins Applied Physics Laboratory, Caterpillar Inc., and Embodied AI. Each team contributed unique expertise and tools necessary to make the challenge a success.
      The Applied Physics Laboratory managed the challenge for NASA. As a systems integrator for LSII, they provided expertise to streamline rigor and engineering discipline across efforts, ensuring the development of successful, efficient, and cost-effective missions — backed by the world’s largest cohort of lunar scientists. 
Caterpillar Inc. is known for its construction and excavation equipment and operates a large fleet of autonomous haul trucks. The company has also worked with NASA for more than 20 years on a variety of technologies, including autonomy, 3D printing, robotics, and simulators, and continues to collaborate with NASA on technologies that support the agency’s mission objectives and provide value to the mining and construction industries. 
Embodied AI collaborated with Caterpillar to integrate the simulation into the open-source CARLA driving environment used for the challenge. For the Lunar Autonomy Challenge, the normally available digital assets of the CARLA simulation platform, such as urban layouts, buildings, and vehicles, were replaced by an IPEx “Digital Twin” and lunar environmental models.
      “This collaboration is a great example of how the government, large companies, small businesses, and research institutions can thoughtfully leverage each other’s different, but complementary, strengths,” Werkheiser added. “By substantially modernizing existing tools, we can turn today’s novel technologies into tomorrow’s institutional capabilities for more efficient and effective space exploration, while also stimulating innovation and economic growth on Earth.”

      FINALIST TEAMS
      First Place
      NAV Lab team
      Stanford University, Stanford, California


      Second Place
      MAPLE (MIT Autonomous Pathfinding for Lunar Exploration) team
Massachusetts Institute of Technology, Cambridge, Massachusetts


      Third Place
      Moonlight team
Carnegie Mellon University, Pittsburgh, Pennsylvania
      OTHER COMPETING TEAMS
Lunar Explorers
Arizona State University, Tempe, Arizona

AIWVU
West Virginia University, Morgantown, West Virginia

Stellar Sparks
California Polytechnic Institute Pomona, Pomona, California

LunatiX
Johns Hopkins University Whiting School of Engineering, Baltimore

CARLA CSU
California State University, Stanislaus, Turlock, California

Rose-Hulman
Rose-Hulman Institute of Technology, Terre Haute, Indiana

Lunar Pathfinders
American Public University System, Charles Town, West Virginia

Lunar Autonomy Challenge digital simulation of lunar surface activity using a digital twin of NASA’s ISRU Pilot Excavator. Credit: Johns Hopkins Applied Physics Laboratory
      View the full article
