Patrol robot dog seems to be becoming a reality
-
Similar Topics
-
By NASA
If you design a new tool for use on Earth, it is easy to test and practice using that tool in its intended environment. But what if that tool is destined for lunar orbit or will be used by astronauts on the surface of the Moon?
NASA’s Simulation and Graphics Branch can help with that. Based at Johnson Space Center in Houston, the branch’s high-fidelity, real-time graphical simulations support in-depth engineering analyses and crew training, ensuring the safety, efficiency, and success of complex space endeavors before execution. The team manages multiple facilities that provide these simulations, including the Prototype Immersive Technologies (PIT) Lab, Virtual Reality Training Lab, and the Systems Engineering Simulator (SES).
Lee Bingham is an aerospace engineer on the simulation and graphics team. His work includes developing simulations and visualizations for the NASA Exploration Systems Simulations team and providing technical guidance on simulation and graphics integration for branch-managed facilities. He also leads the branch’s human-in-the-loop Test Sim and Graphics Team, the Digital Lunar Exploration Sites Unreal Simulation Tool (DUST), and the Lunar Surface Mixed-Reality with the Active Response Gravity Offload System (ARGOS) projects.
Lee Bingham demonstrates a spacewalk simulator for the Gateway lunar space station during NASA’s Tech Day on Capitol Hill in Washington, D.C. Image courtesy of Lee Bingham

Bingham is particularly proud of his contributions to DUST, which provides a 3D visualization of the Moon’s South Pole and received Johnson’s Exceptional Software of the Year Award in 2024. “It was designed for use as an early reference to enable candidate vendors to perform initial studies of the lunar terrain and lighting in support of the Strategy and Architecture Office, human landing system, and the Extravehicular Activity and Human Surface Mobility Program,” Bingham explained. DUST has supported several human-in-the-loop studies for NASA. It has also been shared with external collaborators and made available to the public through the NASA Software Catalog.
Bingham has kept busy during his nearly nine years at Johnson and said learning to manage and balance support for multiple projects and customers was very challenging at first. “I would say ‘yes’ to pretty much anything anyone asked me to do and would end up burning myself out by working extra-long hours to meet milestones and deliverables,” he said. “It has been important to maintain a good work-life balance and avoid overcommitting myself while meeting demanding expectations.”
Lee Bingham tests the Lunar Surface Mixed Reality and Active Response Gravity Offload System trainer at Johnson Space Center. Image courtesy of Lee Bingham

Bingham has also learned the importance of teamwork and collaboration. “You can’t be an expert at everything or do everything yourself,” he said. “Develop your skills, practice them regularly, and master them over time but be willing to ask for help and advice. And be sure to recognize and acknowledge your coworkers and teammates when they go above and beyond or achieve something remarkable.”
Lee Bingham (left) demonstrates a lunar rover simulator for Apollo 16 Lunar Module Pilot Charlie Duke. Image courtesy of Lee Bingham

He hopes that the Artemis Generation will be motivated to tackle difficult challenges and further NASA’s mission to benefit humanity. “Be sure to learn from those who came before you, but be bold and unafraid to innovate,” he advised.
-
By European Space Agency
At the European Space Agency’s technical heart in the Netherlands, engineers have spent the last five months unboxing and testing elements of Europe’s next space science mission. With the two main parts now joined together, Smile is well on its way to being ready to launch by the end of 2025.
-
By NASA
Tess Caswell, a stand-in crew member for the Artemis III Virtual Reality Mini-Simulation, executes a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. The simulation was a test of using VR as a training method for flight controllers and science teams’ collaboration on science-focused traverses on the lunar surface. Credit: NASA/Robert Markowitz

When astronauts walk on the Moon, they’ll serve as the eyes, hands, and boots-on-the-ground interpreters supporting the broader teams of scientists on Earth. NASA is leveraging virtual reality to provide high-fidelity, cost-effective support to prepare crew members, flight control teams, and science teams for a return to the Moon through its Artemis campaign.
The Artemis III Geology Team, led by principal investigator Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, participated in an Artemis III Surface Extra-Vehicular VR Mini-Simulation, or “sim,” at NASA’s Johnson Space Center in Houston in the fall of 2024. The sim brought together science teams and flight directors and controllers from Mission Control to carry out science-focused moonwalks and test the way the teams communicate with each other and the astronauts.
“There are two worlds colliding,” said Dr. Matthew Miller, co-lead for the simulation and exploration engineer, Amentum/JETSII contract with NASA. “There is the operational world and the scientific world, and they are becoming one.”
NASA mission training can include field tests covering areas from navigation and communication to astronaut physical and psychological workloads. Many of these tests take place in remote locations and can require up to a year to plan and large teams to execute. VR may provide an additional option for training that can be planned and executed more quickly to keep up with the demands of preparing to land on the Moon in an environment where time, budgets, and travel resources are limited.
“VR helps us break down some of those limitations and allows us to do more immersive, high-fidelity training without having to go into the field. It provides us with a lot of different, and significantly more, training opportunities.”

— Bri Sparks, NASA co-lead for the simulation and Extra Vehicular Activity Extended Reality team at Johnson
Field testing won’t be going away. Nothing can fully replace the experience crew members gain by being in an environment that puts literal rocks in their hands and includes the physical challenges that come with moonwalks, but VR has competitive advantages.
The virtual environment used in the Artemis III VR Mini-Sim was built using actual lunar surface data from one of the Artemis III candidate regions. This allowed the science team to focus on Artemis III science objectives and traverse planning directly applicable to the Moon. Eddie Paddock, engineering VR technical discipline lead at NASA Johnson, and his team used data from NASA’s Lunar Reconnaissance Orbiter and planet position and velocity over time to develop a virtual software representation of a site within the Nobile Rim 1 region near the south pole of the Moon. Two stand-in crew members performed moonwalk traverses in virtual reality in the Prototype Immersive Technology lab at Johnson, and streamed suit-mounted virtual video camera views, hand-held virtual camera imagery, and audio to another location where flight controllers and science support teams simulated ground communications.
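The terrain step described above, turning orbital elevation data into a navigable 3D surface, can be illustrated with a minimal sketch. This is not the team’s actual pipeline: the tiny elevation grid and 5-meter spacing below are made-up stand-ins for real Lunar Reconnaissance Orbiter products, and a production renderer would add textures, lighting, and level-of-detail handling.

```python
import numpy as np

def heightmap_to_mesh(dem, spacing_m):
    """Convert a 2D elevation grid (meters) into vertex and triangle
    arrays of the kind a real-time renderer consumes.

    dem: (rows, cols) array of elevations; spacing_m: grid cell size.
    """
    rows, cols = dem.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    # One vertex per grid cell: (x, y, z) in meters.
    verts = np.column_stack([
        xs.ravel() * spacing_m,
        ys.ravel() * spacing_m,
        dem.ravel(),
    ])
    # Two triangles per grid square, indexed into the vertex array.
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            tris.append((i, i + 1, i + cols))
            tris.append((i + 1, i + cols + 1, i + cols))
    return verts, np.array(tris)

# Tiny synthetic 3x3 DEM standing in for real lunar elevation data.
dem = np.array([[0.0, 1.0, 0.5],
                [0.2, 1.5, 0.8],
                [0.1, 0.9, 0.4]])
verts, tris = heightmap_to_mesh(dem, spacing_m=5.0)
print(verts.shape, tris.shape)  # (9, 3) (8, 3)
```

A 3x3 grid yields nine vertices and eight triangles; the same indexing scales directly to the multi-kilometer DEMs used for a site like Nobile Rim 1.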
A screen capture of a virtual reality view during the Artemis III VR Mini-Simulation. The lunar surface virtual environment was built using actual lunar surface data from one of the Artemis III candidate regions. Credit: Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston

The crew stand-ins were immersed in the lunar environment and could then share the experience with the science and flight control teams. That quick and direct feedback could prove critical to the science and flight control teams as they work to build cohesive teams despite very different approaches to their work.
The flight operations team and the science team are learning how to work together and speak a shared language. Both teams are pivotal parts of the overall mission operations. The flight control team focuses on maintaining crew and vehicle safety and minimizing risk as much as possible. The science team, as Miller explains, is “relentlessly thirsty” for as much science as possible. Training sessions like this simulation allow the teams to hone their relationships and processes.
Members of the Artemis III Geology Team and science support team work in a mock Science Evaluation Room during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Video feeds from the stand-in crew members’ VR headsets allow the science team to follow, assess, and direct moonwalks and science activities. Credit: NASA/Robert Markowitz

Denevi described the flight control team as a “well-oiled machine” and praised their dedication to getting it right for the science team. Many members of the flight control team have participated in field and classroom training to learn more about geology and better understand the science objectives for Artemis.
“They have invested a lot of their own effort into understanding the science background and science objectives, and the science team really appreciates that and wants to make sure they are also learning to operate in the best way we can to support the flight control team, because there’s a lot for us to learn as well,” Denevi said. “It’s a joy to get to share the science with them and have them be excited to help us implement it all.”
Artemis III Geology Team lead Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, left, Artemis III Geology Team member Dr. Jose Hurtado, University of Texas at El Paso, and simulation co-lead Bri Sparks work together during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz

This simulation, Sparks said, was just the beginning for how virtual reality could supplement training opportunities for Artemis science. In the future, using mixed reality could help take the experience to the next level, allowing crew members to be fully immersed in the virtual environment while interacting with real objects they can hold in their hands. Now that the Nobile Rim 1 landing site is built in VR, it can continue to be improved and used for crew training, something that Sparks said can’t be done with field training on Earth.
While “virtual” was part of the title for this exercise, its applications are very real.
“We are uncovering a lot of things that people probably had in the back of their head as something we’d need to deal with in the future,” Miller said. “But guess what? The future is now. This is now.”
Test subject crew members for the Artemis III Virtual Reality Mini-Simulation, including Grier Wilt, left, and Tess Caswell, center, execute a moonwalk in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz

Grier Wilt, left, and Tess Caswell, crew stand-ins for the Artemis III Virtual Reality Mini-Simulation, execute a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz

Engineering VR technical discipline lead Eddie Paddock works with team members to facilitate the virtual reality components of the Artemis III Virtual Reality Mini-Simulation in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: Robert Markowitz

Flight director Paul Konyha follows moonwalk activities during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
Rachel Barry
NASA’s Johnson Space Center
-
By NASA
2025 Aviation Weather Mission: Civil Air Patrol Cadets Help Scientists Study the Atmosphere with GLOBE Clouds
The Science Activation Program’s NASA Earth Science Education Collaborative (NESEC) is working alongside the Civil Air Patrol (CAP) to launch the 2025 Aviation Weather Mission. The mission will engage cadets (students ages 11-20) and senior members to collect aviation-relevant observations, including airport conditions, Global Learning and Observations to Benefit the Environment (GLOBE) cloud observations, commercial aircraft information (including registration number and altitude), and satellite collocations provided by the NASA GLOBE Clouds team at NASA Langley Research Center. The mission builds on a highly successful collaboration between NESEC and CAP during the partial and total solar eclipses in 2023 and 2024, when cadets and senior members collected cloud, air temperature, and land cover observations, engaging over 400 teams with over 3,000 cadets and over 1,000 senior members in every state, Washington, DC, and Puerto Rico.
The 2025 Aviation Weather Mission will take place from April through July 2025, collecting observations over two 4-hour periods while practicing additional skills, such as flight tracking, orienteering, and data management. So far, over 3,000 cadets in 46 wings (states) have signed up to participate.
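The satellite collocation mentioned above, pairing a ground observation with a satellite pass close in time, can be sketched as a simple time-window match. The timestamps and the 20-minute window below are illustrative assumptions, not GLOBE’s actual matching criteria, which also account for satellite geometry and location.

```python
from datetime import datetime, timedelta

def collocate(observation_time, overpass_times, window_minutes=20):
    """Return satellite overpasses within +/- window_minutes of a
    ground observation, closest match first."""
    window = timedelta(minutes=window_minutes)
    matches = [t for t in overpass_times
               if abs(t - observation_time) <= window]
    return sorted(matches, key=lambda t: abs(t - observation_time))

# Hypothetical cadet cloud observation and overpass schedule.
obs = datetime(2025, 5, 10, 15, 5)
overpasses = [datetime(2025, 5, 10, 14, 30),
              datetime(2025, 5, 10, 15, 12),
              datetime(2025, 5, 10, 16, 40)]
print(collocate(obs, overpasses))  # only the 15:12 pass is within 20 minutes
```

The value of collocation is that a cadet’s ground-level cloud report and the satellite’s top-down view describe the same sky at nearly the same moment, so the two can be compared directly.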
Science Activation recently showed support for this mission through a letter of collaboration sent to CAP Major General Regena Aye in early February. NASA GLOBE Clouds and GLOBE Observer are part of the NASA Earth Science Education Collaborative (NESEC), which is led by the Institute for Global Environmental Strategies (IGES) and supported by NASA under cooperative agreement award number NNX16AE28A. NESEC is part of NASA’s Science Activation Portfolio. Learn more about how Science Activation connects NASA science experts, real content, and experiences with community leaders to do science in ways that activate minds and promote deeper understanding of our world and beyond: https://science.nasa.gov/learn
Cadets from the Virginia wing making cloud observations as they prepare for the 2025 Aviation Weather Mission.
Details

Last Updated: Mar 04, 2025
Editor: NASA Climate Editorial Team
Location: NASA Langley Research Center
Related Terms: Science Activation, Clouds, Opportunities For Students to Get Involved, Weather and Atmospheric Dynamics
-
By NASA
Blue tentacle-like arms attached to an Astrobee free-flying robot grab onto a “capture cube” in this image from Feb. 4, 2025. The experimental grippers demonstrated autonomous detection and capture techniques that may be used to remove space debris and service satellites in low Earth orbit.
The Astrobee system was designed and built at NASA’s Ames Research Center in Silicon Valley for use inside the International Space Station. The system consists of three cube-shaped robots (named Bumble, Honey, and Queen), software, and a docking station used for recharging. The robots use electric fans as a propulsion system that allows them to fly freely through the microgravity environment of the station. Cameras and sensors help them to “see” and navigate their surroundings. The robots also carry a perching arm that allows them to grasp station handrails to conserve energy or to grab and hold items.
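The fan propulsion described above amounts to applying small, bounded forces to a free-floating body. A toy one-dimensional sketch, not Astrobee’s actual flight software: a proportional-derivative controller nudges a free-flyer toward a target position with saturated thrust. The mass, gains, and thrust limit are assumed numbers chosen only to illustrate the control idea.

```python
def fly_to(target_m, mass_kg=9.0, kp=0.5, kd=3.0,
           max_thrust_n=0.3, dt=0.1, steps=2000):
    """Simulate a free-floating robot driven toward target_m by a
    PD controller with limited fan thrust. Returns final position (m)."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        # PD control law, clamped to what small electric fans can produce.
        force = kp * (target_m - pos) - kd * vel
        force = max(-max_thrust_n, min(max_thrust_n, force))
        # In microgravity there is no gravity or drag to fight: the only
        # acceleration is thrust divided by mass.
        vel += (force / mass_kg) * dt
        pos += vel * dt
    return pos

print(round(fly_to(2.0), 2))  # settles near the 2.0 m target
```

Because nothing resists motion in microgravity, even a fraction of a newton is enough to reposition a several-kilogram robot; the derivative term is what keeps it from drifting past the target.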
Image credit: NASA/Suni Williams