NASA’s Laser Navigation Tech Enables Commercial Lunar Exploration


5 Min Read


Navigation Doppler Lidar is a guidance system that uses laser pulses to precisely measure velocity and distance. NASA will demonstrate NDL’s capabilities in the lunar environment during the IM-1 mission.
Credits: NASA / David C. Bowman

Later this month, NASA’s commercial lunar delivery services provider Intuitive Machines will launch its Nova-C lunar lander carrying several NASA science and technology payloads, including the Navigation Doppler Lidar (NDL). This innovative guidance system, developed by NASA’s Langley Research Center in Hampton, Virginia, under the agency’s Space Technology Mission Directorate (STMD), can potentially revolutionize landing spacecraft on extraterrestrial worlds.

The NDL technology is a NASA payload on this Intuitive Machines Commercial Lunar Payload Services (CLPS) delivery: NASA will demonstrate NDL’s capabilities in the lunar environment during the mission, but the data is not considered mission-critical for the successful landing of Nova-C, as Intuitive Machines has its own navigation and landing systems.

The Artemis missions will take humans back to the Moon, and Navigation Doppler Lidar will help ensure a safe landing for everyone onboard. NDL Chief Engineer Glenn Hines explains how lasers will relieve astronauts of some of the burdens of making safe, precise landings on the Moon.

The NDL story started almost 20 years ago, when Dr. Farzin Amzajerdian, NDL project manager at NASA Langley, began searching for a more precise way to land spacecraft on Mars. In the late 1990s and early 2000s, attempts to land rovers on the Martian surface faced significant challenges.

Radar was inherently imprecise for this application. Radio waves cover a large area on the ground, meaning smaller craters and boulders that are commonly found on the Martian surface could ‘hide’ from detection and cause unexpected hazards for landers.

“The landers needed the radar sensor to tell them how far they were off the ground and how fast they were moving so they could time their parachute deployment,” said Amzajerdian. “Too early or too late, the lander would miss its target or crash into the surface.”

Radar also couldn’t measure velocity and range independently of one another, an important limitation, according to Aram Gragossian, electro-optics lead for NDL at NASA Langley, who joined the team about six years ago.

“If you go over a steep slope, the range changes very quickly, but that doesn’t mean your velocity has changed,” he said. “So if you just feed that information back to your system, it may cause catastrophic reactions.”

Amzajerdian knew about this problem, and he knew how to fix it.

“Why not use a lidar instead of a radar?” he asked.

LiDAR, which stands for light detection and ranging, is a technology that uses visible or infrared light the same way radar uses radio waves. Lidar sends laser pulses to a target, which reflects some of that light back onto a detector. As the instrument moves in relation to its target, the change in frequency of the returning signal – also known as the Doppler effect – allows the lidar to measure velocity directly and precisely. Distance is measured based on the travel time of the light to the target and back.
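The two measurements described above reduce to simple formulas: distance comes from the round-trip travel time of the light, and line-of-sight velocity comes from the Doppler shift of the returned signal. A minimal sketch in Python, with illustrative numbers only (the 1.55-micrometer wavelength and the example frequency shift are assumptions for the sketch, not NDL’s actual parameters):

```python
# Illustrative sketch of the two basic lidar measurements:
# range from round-trip light travel time, and velocity from
# the Doppler frequency shift of the reflected beam.

C = 299_792_458.0  # speed of light, m/s

def range_from_travel_time(round_trip_s: float) -> float:
    """Distance to the target: the light covers the path twice."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(freq_shift_hz: float, wavelength_m: float) -> float:
    """Line-of-sight velocity from the Doppler shift.

    For a beam reflected off the target, the observed shift is
    2 * v / wavelength, so v = shift * wavelength / 2.
    """
    return freq_shift_hz * wavelength_m / 2.0

# A return arriving 20 microseconds after transmission: roughly 3 km away.
print(range_from_travel_time(20e-6))        # ~2997.9 m
# An assumed 1.29 MHz shift at an assumed 1.55 um wavelength: ~1 m/s.
print(velocity_from_doppler(1.29e6, 1.55e-6))
```

Because the Doppler shift depends only on velocity and the travel time only on distance, the two quantities come out independently, which is exactly the property radar lacked in this application.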

Lidar offered several advantages over radar, notably the fact that a laser transmits a pencil beam of light that can give a more precise and accurate measurement.

In 2004, Amzajerdian proposed NDL as a concept to the Mars Science Laboratory team. In 2005, he and his team received funding from Langley to put together a proof of concept. Then, in 2007, they received funding to build a prototype and test it on a helicopter. This is when Langley’s Dr. Glenn Hines joined NDL — first as electronics lead and now as chief engineer.

Since then, Amzajerdian, Hines, and numerous other team members have worked tirelessly to ensure NDL’s success. 

Hines credits the various NASA personnel who have continued to advocate for NDL. “In almost everything in life, you’ve got to have a champion,” Hines said, “somebody in your corner saying, ‘Look, what you’re doing is good. This has credibility.’ ”

The Intuitive Machines delivery is just the beginning of the NDL story; a next-generation system is already in the works. The team has developed a companion sensor to NDL, a multi-functional Flash Lidar camera. Flash Lidar is a 3D camera technology that surveys the surrounding terrain — even in complete darkness. When combined with NDL, Flash Lidar will allow spacecraft to land “anywhere, anytime.”

Other future versions of NDL could have uses outside the tricky business of landing on extraterrestrial surfaces. In fact, they may have uses in a very terrestrial setting, like helping self-driving cars navigate local streets and highways. 

Looking at the history and trajectory of NDL, one thing is certain: The initial journey to the Moon will be the culmination of decades of hard work, perseverance, determination, and a steadfast belief in the project across the team, but held most fervently by NDL’s champions, Amzajerdian and Hines.

NDL was NASA’s Invention of the Year in 2022. Four programs within STMD contributed to NDL’s development: Flight Opportunities, Technology Transfer, Small Business Innovation Research & Small Business Technology Transfer, and Game Changing Development.

NASA is working with multiple CLPS vendors to establish a regular cadence of payload deliveries to the Moon to perform experiments, test technologies, and demonstrate capabilities to help NASA explore the lunar surface. Payloads delivered through CLPS will help NASA advance capabilities for science, technology, and exploration on the Moon.

Simone Williams
NASA Langley Research Center


Details

Last Updated
Feb 05, 2024


