
NASA’s Laser Navigation Tech Enables Commercial Lunar Exploration


5 Min Read


Navigation Doppler Lidar is a guidance system that uses laser pulses to precisely measure velocity and distance. NASA will demonstrate NDL’s capabilities in the lunar environment during the IM-1 mission.
Credits: NASA / David C. Bowman

Later this month, NASA’s commercial lunar delivery services provider Intuitive Machines will launch its Nova-C lunar lander carrying several NASA science and technology payloads, including the Navigation Doppler Lidar (NDL). This innovative guidance system, developed by NASA’s Langley Research Center in Hampton, Virginia, under the agency’s Space Technology Mission Directorate (STMD), could revolutionize how spacecraft land on extraterrestrial worlds.

The NDL technology is a NASA payload on this Intuitive Machines Commercial Lunar Payload Services (CLPS) delivery: NASA will demonstrate NDL’s capabilities in the lunar environment during the mission, but its data is not considered mission-critical to a successful Nova-C landing, since Intuitive Machines has its own navigation and landing systems.

NASA’s Artemis missions will take humans back to the Moon, and Navigation Doppler Lidar will help ensure a safe landing for everyone on board. NDL Chief Engineer Glenn Hines explains how lasers will relieve astronauts of some of the burdens of making safe, precise landings on the Moon.

The NDL story started almost 20 years ago, when Dr. Farzin Amzajerdian, NDL project manager at NASA Langley, made a breakthrough: a more precise way to land rovers on Mars. In the late 1990s and early 2000s, attempts to land rovers on the Martian surface had run into significant challenges.

Radar was inherently imprecise for this application. Radio waves cover a large area on the ground, meaning smaller craters and boulders that are commonly found on the Martian surface could ‘hide’ from detection and cause unexpected hazards for landers.

“The landers needed the radar sensor to tell them how far they were off the ground and how fast they were moving so they could time their parachute deployment,” said Amzajerdian. “Too early or too late, the lander would miss its target or crash into the surface.”

Radio waves also couldn’t measure velocity and range independently of one another, which is important, according to Aram Gragossian, electro-optics lead for NDL at NASA Langley, who joined the team about six years ago.

“If you go over a steep slope, the range changes very quickly, but that doesn’t mean your velocity has changed,” he said. “So if you just feed that information back to your system, it may cause catastrophic reactions.”

Amzajerdian knew about this problem, and he knew how to fix it.

“Why not use a lidar instead of a radar?” he asked.

Lidar, which stands for light detection and ranging, is a technology that uses visible or infrared light the same way radar uses radio waves. Lidar sends laser pulses to a target, which reflects some of that light back onto a detector. As the instrument moves in relation to its target, the change in frequency of the returning signal – also known as the Doppler effect – allows the lidar to measure velocity directly and precisely. Distance is measured based on the travel time of the light to the target and back.
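
As a rough illustration of those two relationships, the Python sketch below computes line-of-sight velocity from the Doppler frequency shift and range from the pulse’s round-trip time. The wavelength and sample numbers are illustrative assumptions for a generic near-infrared lidar, not NDL’s actual design values.

# Minimal sketch of the two measurements a Doppler lidar makes.
# Wavelength and sample values are illustrative assumptions, not NDL's design parameters.

C = 299_792_458.0      # speed of light, m/s
WAVELENGTH = 1.55e-6   # assumed near-infrared laser wavelength, m

def velocity_from_doppler(freq_shift_hz):
    """Line-of-sight velocity from the Doppler shift of the returned light.
    For a lidar measuring its own motion relative to the ground, the shift is
    2 * v / wavelength, so v = shift * wavelength / 2."""
    return freq_shift_hz * WAVELENGTH / 2.0

def range_from_round_trip(round_trip_time_s):
    """Distance from the round-trip travel time of the laser pulse."""
    return C * round_trip_time_s / 2.0

# Example: a 2.58 MHz Doppler shift at 1.55 micrometers is about 2 m/s of closing
# speed, and a 6.67 microsecond round trip corresponds to roughly 1 km of range.
print(velocity_from_doppler(2.58e6))   # ~2.0 m/s
print(range_from_round_trip(6.67e-6))  # ~1000 m

Because the velocity comes from the frequency shift and the range from the travel time, the two quantities are measured independently, which is the property Gragossian describes below.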

Lidar offered several advantages over radar, notably that a laser transmits a narrow, pencil-like beam of light that can give a more precise and accurate measurement.

In 2004, Amzajerdian proposed NDL as a concept to the Mars Science Laboratory team. In 2005, he and his team received funding from Langley to put together a proof of concept. Then, in 2007, they received funding to build a prototype and test it aboard a helicopter. This is when Langley’s Dr. Glenn Hines joined NDL — first as electronic lead and now as chief engineer.

Since then, Amzajerdian, Hines, and numerous other team members have worked tirelessly to ensure NDL’s success. 

Hines credits the various NASA personnel who have continued to advocate for NDL. “In almost everything in life, you’ve got to have a champion,” Hines said, “somebody in your corner saying, ‘Look, what you’re doing is good. This has credibility.’ ”

The Intuitive Machines delivery is just the beginning of the NDL story; a next-generation system is already in the works. The team has developed a companion sensor to NDL, a multi-functional Flash Lidar camera. Flash Lidar is a 3D camera technology that surveys the surrounding terrain — even in complete darkness. When combined with NDL, Flash Lidar will allow spacecraft to land “anywhere, anytime.”

Other future versions of NDL could have uses outside the tricky business of landing on extraterrestrial surfaces. In fact, they may have uses in a very terrestrial setting, like helping self-driving cars navigate local streets and highways. 

Looking at the history and trajectory of NDL, one thing is certain: the initial journey to the Moon will be the culmination of decades of hard work, perseverance, determination, and a steadfast belief in the project shared across the team and held most fervently by NDL’s champions, Amzajerdian and Hines.

NDL was NASA’s Invention of the Year in 2022. Four programs within STMD contributed to NDL’s development: Flight Opportunities, Technology Transfer, Small Business Innovation Research & Small Business Technology Transfer, and Game Changing Development.

NASA is working with multiple CLPS vendors to establish a regular cadence of payload deliveries to the Moon to perform experiments, test technologies, and demonstrate capabilities to help NASA explore the lunar surface. Payloads delivered through CLPS will help NASA advance capabilities for science, technology, and exploration on the Moon.

Simone Williams
NASA Langley Research Center


Details

Last Updated
Feb 05, 2024

View the full article


  • Similar Topics

    • By NASA
      Building a Lunar Network: Johnson Tests Wireless Technologies for the Moon (2 Min Read)
      From left, Johnson Exploration Wireless Laboratory (JEWL) Software Lead William Dell; Lunar 3GPP Principal Investigator Raymond Wagner; JEWL intern Harlan Phillips; and JEWL Lab Manager Chatwin Lansdowne.
      Credits: Nevada Space Proving Grounds (NSPG)
      NASA engineers are strapping on backpacks loaded with radios, cameras, and antennas to test technology that might someday keep explorers connected on the lunar surface. Their mission: test how astronauts on the Moon will stay connected during Artemis spacewalks using 3GPP (LTE/4G and 5G) and Wi-Fi technologies.
      It’s exciting to bring lunar spacewalks into the 21st century with the immersive, high-definition experience that will make people feel like they’re right there with the astronauts.
      Raymond Wagner
      NASA’s Lunar 3GPP Project Principal Investigator
      A NASA engineer tests a backpack-mounted wireless communications system in the Nevada desert, simulating how astronauts will stay connected during Artemis lunar spacewalks.
      Credits: NSPG
      With Artemis, NASA will establish a long-term presence at the Moon, opening more of the lunar surface to exploration than ever before. This growth of lunar activity will require astronauts to communicate seamlessly with each other and with science teams back on Earth.
      “We’re working out what the software that uses these networks needs to look like,” said Raymond Wagner, principal investigator in NASA’s Lunar 3GPP project and member of Johnson Space Center’s Exploration Wireless Laboratory (JEWL) in Houston. “We’re prototyping it with commercial off-the-shelf hardware and open-source software to show what pieces are needed and how they interact.” 
      Carrying a prototype wireless network pack, a NASA engineer helps test wireless 4G and 5G technologies that could one day keep Artemis astronauts connected on the Moon.
      Credits: NSPG
      The next big step comes with Artemis III, which will land a crew on the Moon and carry a 4G/LTE demonstration to stream video and audio from the astronauts on the lunar surface.
       The vision goes further. “Right now the lander or rover will host the network,” Wagner said. “But if we go to the Moon to stay, we may eventually want actual cell towers. The spacesuit itself is already becoming the astronaut’s cell phone, and rovers could act as mobile hotspots. Altogether, these will be the building blocks of communication on the Moon.” 
      Team members from NASA’s Avionics Systems Laboratory at Johnson Space Center in Houston.
      Credits: NASA/Sumer Loggins
      Back at Johnson, teams are simulating lunar spacewalks, streaming video, audio, and telemetry over a private 5G network to a mock mission control. The work helps engineers refine how future systems will perform in challenging environments. Craters, lunar regolith, and other terrain features all affect how radio signals travel — lessons that will also carry over to Mars.
      For Wagner, the project is about shaping how humanity experiences the next era of exploration. “We’re aiming for true HD on the Moon,” he said. “It’s going to be pretty mind-blowing.” 
      About the Author
      Sumer Loggins

      Details
      Last Updated
      Sep 18, 2025
      View the full article
    • By NASA
      NASA’s Northrop Grumman Commercial Resupply Services 23 Installation
    • By NASA
      NASA’s Northrop Grumman Commercial Resupply Services 23 Rendezvous and Capture
    • By NASA
      NASA’s Northrop Grumman Commercial Resupply Services 23 Launch
    • By NASA
      Honolulu is pictured here beside a calm sea in 2017. A JPL technology recently detected and confirmed a tsunami up to 45 minutes prior to detection by tide gauges in Hawaii, and it estimated the speed of the wave to be over 580 miles per hour (260 meters per second) near the coast.
      Credits: NASA/JPL-Caltech
      A massive earthquake and subsequent tsunami off Russia in late July tested an experimental detection system that had deployed a critical component just the day before.
      A recent tsunami triggered by a magnitude 8.8 earthquake off Russia’s Kamchatka Peninsula sent pressure waves to the upper layer of the atmosphere, NASA scientists have reported. While the tsunami did not wreak widespread damage, it was an early test for a detection system being developed at the agency’s Jet Propulsion Laboratory in Southern California.
      Called GUARDIAN (GNSS Upper Atmospheric Real-time Disaster Information and Alert Network), the experimental technology “functioned to its full extent,” said Camille Martire, one of its developers at JPL. The system flagged distortions in the atmosphere and issued notifications to subscribed subject matter experts in as little as 20 minutes after the quake. It confirmed signs of the approaching tsunami about 30 to 40 minutes before waves made landfall in Hawaii and sites across the Pacific on July 29 (local time).
      “Those extra minutes of knowing something is coming could make a real difference when it comes to warning communities in the path,” said JPL scientist Siddharth Krishnamoorthy.
      Near-real-time outputs from GUARDIAN must be interpreted by experts trained to identify the signs of tsunamis. But already it’s one of the fastest monitoring tools of its kind: Within about 10 minutes of receiving data, it can produce a snapshot of a tsunami’s rumble reaching the upper atmosphere.
      The dots in this graph indicate wave disturbances in the ionosphere as measured between ground stations and navigation satellites. The initial spike shows the acoustic wave coming from the epicenter of the July 29 quake that caused the tsunami; the red squiggle shows the gravity wave the tsunami generated.
      Credits: NASA/JPL-Caltech
      The goal of GUARDIAN is to augment existing early warning systems. A key question after a major undersea earthquake is whether a tsunami was generated. Today, forecasters use seismic data as a proxy to predict if and where a tsunami could occur, and they rely on sea-based instruments to confirm that a tsunami is passing by. Deep-ocean pressure sensors remain the gold standard when it comes to sizing up waves, but they are expensive and sparsely distributed.
      “NASA’s GUARDIAN can help fill the gaps,” said Christopher Moore, director of the National Oceanic and Atmospheric Administration Center for Tsunami Research. “It provides one more piece of information, one more valuable data point, that can help us determine, yes, we need to make the call to evacuate.”
      Moore noted that GUARDIAN adds a unique perspective: It’s able to sense sea surface motion from high above Earth, globally and in near-real-time.
      Bill Fry, chair of the United Nations technical working group responsible for tsunami early warning in the Pacific, said GUARDIAN is part of a technological “paradigm shift.” By directly observing ocean dynamics from space, “GUARDIAN is absolutely something that we in the early warning community are looking for to help underpin next generation forecasting.”
      How GUARDIAN works
      GUARDIAN takes advantage of tsunami physics. During a tsunami, many square miles of the ocean surface can rise and fall nearly in unison. This displaces a significant amount of air above it, sending low-frequency sound and gravity waves speeding upwards toward space. The waves interact with the charged particles of the upper atmosphere — the ionosphere — where they slightly distort the radio signals coming down to scientific ground stations of GPS and other positioning and timing satellites. These satellites are known collectively as the Global Navigation Satellite System (GNSS).
      While GNSS processing methods on Earth correct for such distortions, GUARDIAN uses them as clues.
      The software scours a trove of data transmitted to more than 350 continuously operating GNSS ground stations around the world. It can potentially identify evidence of a tsunami up to about 745 miles (1,200 kilometers) from a given station. In ideal situations, vulnerable coastal communities near a GNSS station could know when a tsunami was heading their way and authorities would have as much as 1 hour and 20 minutes to evacuate the low-lying areas, thereby saving countless lives and property.
      Key to this effort is the network of GNSS stations around the world supported by NASA’s Space Geodesy Project and Global GNSS Network, as well as JPL’s Global Differential GPS network that transmits the data in real time.
      The Kamchatka event offered a timely case study for GUARDIAN. A day before the quake off Russia’s northeast coast, the team had deployed two new elements that were years in the making: an artificial intelligence to mine signals of interest and an accompanying prototype messaging system.
      Both were put to the test when one of the strongest earthquakes ever recorded spawned a tsunami traveling hundreds of miles per hour across the Pacific Ocean. Having been trained to spot the kinds of atmospheric distortions caused by a tsunami, GUARDIAN flagged the signals for human review and notified subscribed subject matter experts.
      Notably, tsunamis are most often caused by large undersea earthquakes, but not always. Volcanic eruptions, underwater landslides, and certain weather conditions in some geographic locations can all produce dangerous waves. An advantage of GUARDIAN is that it doesn’t require information on what caused a tsunami; rather, it can detect that one was generated and then can alert the authorities to help minimize the loss of life and property. 
      While there’s no silver bullet to stop a tsunami from making landfall, “GUARDIAN has real potential to help by providing open access to this data,” said Adrienne Moseley, co-director of the Joint Australian Tsunami Warning Centre. “Tsunamis don’t respect national boundaries. We need to be able to share data around the whole region to be able to make assessments about the threat for all exposed coastlines.”
      To learn more about GUARDIAN, visit:
      https://guardian.jpl.nasa.gov
      News Media Contacts
      Jane J. Lee / Andrew Wang
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-379-6874 / 818-354-0307
      jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov 
      Written by Sally Younger
      2025-117
      View the full article