Optical Fiber Production
-
Similar Topics
-
By NASA
NASA’s Synthetic Biology Project is turning to the 3D printing experts in the GrabCAD community for ideas and designs that could lead to the ability to reuse and recycle small-scale bioreactors, reducing the mass and volume requirements for deep space missions. Ideally, designs could be printed on a 3D printer using recyclable plastics, or made from cleanable, reusable materials.
Award: $7,000 in total prizes
Open Date: December 2, 2024
Close Date: February 24, 2025
For more information, visit: https://grabcad.com/challenges/3d-printable-bioreactor-for-deep-space-food-production
View the full article
-
By European Space Agency
At the International Astronautical Congress (IAC) in Milan this week, ESA signed a contract for Element #1, the first phase of the HydRON Demonstration System. HydRON, which stands for High thRoughput Optical Network, is set to transform the way data-collecting satellites communicate, using laser technology that will allow satellites to connect with each other and ground networks much faster.
View the full article
-
By NASA
NASA is preparing space at the agency’s Kennedy Space Center in Florida for upcoming assembly activities of the SLS (Space Launch System) rocket core stage for future Artemis missions, beginning with Artemis III.
Teams are currently outfitting the assembly building’s High Bay 2 for future vertical assembly of the rocket stage that will help power NASA’s Artemis campaign to the Moon. During Apollo, High Bay 2, one of four high bays inside the Vehicle Assembly Building, was used to stack the Saturn V rocket. During the Space Shuttle Program, the high bay was used for external tank checkout and storage and as a contingency storage area for the shuttle.
Technicians are building tooling in High Bay 2 at NASA Kennedy that will allow NASA and Boeing, the SLS core stage lead contractor, to vertically integrate the core stage. Michigan-based Futuramic is constructing the tooling that will hold the core stage in a vertical position while teams integrate the SLS rocket’s engine section and four RS-25 engines to finish assembly of the rocket stage. Vertical integration will streamline final production efforts, offering technicians 360-degree access to the stage both internally and externally.
“The High Bay 2 area at NASA Kennedy is critical for work as SLS transitions from a developmental to operational model,” said Chad Bryant, deputy manager of the SLS Stages Office. “While teams are stacking and preparing the SLS rocket for launch of one Artemis mission, the SLS core stage for another Artemis mission will be taking shape just across the aisleway.”
Under the new assembly model beginning with Artemis III, all the major structures for the SLS core stage will continue to be fully produced and manufactured at NASA’s Michoud Assembly Facility in New Orleans. Upon completion of manufacturing and thermal protection system application, the engine section will be shipped to NASA Kennedy for final outfitting. Later, the top sections of the core stage – the forward skirt, intertank, liquid oxygen tank, and liquid hydrogen tank – will be outfitted and joined at NASA Michoud and shipped to NASA Kennedy for final assembly.
The fully assembled core stage for Artemis II arrived at Kennedy on July 23. NASA’s Pegasus barge delivered the SLS engine section for Artemis III to Kennedy in December 2022. Teams at NASA Michoud are outfitting the remaining core stage elements and preparing to horizontally join them. The four RS-25 engines for the Artemis III mission are complete at NASA’s Stennis Space Center in Bay St. Louis, Mississippi, and will be transported to NASA Kennedy in 2025. Major core stage and exploration upper stage structures are in work at NASA Michoud for Artemis IV and beyond.
NASA is working to land the first woman, first person of color, and its first international partner astronaut on the Moon under Artemis. SLS is part of NASA’s backbone for deep space exploration, along with the Orion spacecraft, supporting ground systems, advanced spacesuits and rovers, the Gateway in orbit around the Moon, and commercial human landing systems. SLS is the only rocket that can send Orion, astronauts, and supplies to the Moon in a single launch.
News Media Contact
Jonathan Deal
Marshall Space Flight Center
Huntsville, Ala.
256-544-0034
View the full article
-
By NASA
5 Min Read NASA Optical Navigation Tech Could Streamline Planetary Exploration
Optical navigation technology could help astronauts and robots find their way using data from cameras and other sensors. Credits: NASA
As astronauts and rovers explore uncharted worlds, finding new ways of navigating these bodies is essential in the absence of traditional navigation systems like GPS. Optical navigation, which relies on data from cameras and other sensors, can help spacecraft, and in some cases astronauts themselves, find their way in areas that would be difficult to navigate with the naked eye. Three NASA researchers are pushing optical navigation technology further with cutting-edge advancements in 3D environment modeling, navigation using photography, and deep learning image analysis.
In a dim, barren landscape like the surface of the Moon, it can be easy to get lost. With few discernible landmarks to navigate with the naked eye, astronauts and rovers must rely on other means to plot a course.
As NASA pursues its Moon to Mars missions, encompassing exploration of the lunar surface and the first steps on the Red Planet, finding novel and efficient ways of navigating these new terrains will be essential. That’s where optical navigation comes in — a technology that helps map out new areas using sensor data.
NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is a leading developer of optical navigation technology. For example, GIANT (the Goddard Image Analysis and Navigation Tool) helped guide the OSIRIS-REx mission to a safe sample collection at asteroid Bennu by generating 3D maps of the surface and calculating precise distances to targets.
Now, three research teams at Goddard are pushing optical navigation technology even further.
Virtual World Development
Chris Gnam, an intern at NASA Goddard, leads development of a modeling engine called Vira that already renders large 3D environments about 100 times faster than GIANT. These digital environments can be used to evaluate potential landing areas, simulate solar radiation, and more.
While consumer-grade graphics engines, like those used for video game development, quickly render large environments, most cannot provide the detail necessary for scientific analysis. For scientists planning a planetary landing, every detail is critical.
Vira can quickly and efficiently render an environment in great detail. Credit: NASA
“Vira combines the speed and efficiency of consumer graphics modelers with the scientific accuracy of GIANT,” Gnam said. “This tool will allow scientists to quickly model complex environments like planetary surfaces.”
The Vira modeling engine is being used to assist with the development of LuNaMaps (Lunar Navigation Maps). This project seeks to improve the quality of maps of the lunar South Pole region which are a key exploration target of NASA’s Artemis missions.
Vira also uses ray tracing to model how light will behave in a simulated environment. While ray tracing is often used in video game development, Vira utilizes it to model solar radiation pressure, which refers to changes in momentum to a spacecraft caused by sunlight.
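For illustration, here is a minimal Python sketch of the physics behind solar radiation pressure as the article describes it: sunlight carries momentum, so the force on a surface scales with the solar flux, the illuminated area, and the incidence angle. This is not Vira's ray-traced implementation; the flat-plate model, constants, and function name are simplifying assumptions for the example.

```python
# Hypothetical sketch: solar radiation pressure on a flat surface element.
# NOT Vira's implementation -- only illustrates how sunlight imparts momentum.

import math

SOLAR_CONSTANT = 1361.0          # W/m^2 at 1 au (approximate)
SPEED_OF_LIGHT = 299_792_458.0   # m/s

def srp_force(area_m2: float, incidence_deg: float, reflectivity: float = 0.0) -> float:
    """Approximate force (N) from sunlight on a flat plate.

    area_m2        -- illuminated area of the plate
    incidence_deg  -- angle between the plate normal and the Sun direction
    reflectivity   -- 0 for a perfect absorber, 1 for a perfect specular reflector
    """
    cos_i = math.cos(math.radians(incidence_deg))
    if cos_i <= 0.0:
        return 0.0  # plate faces away from the Sun
    pressure = SOLAR_CONSTANT / SPEED_OF_LIGHT  # ~4.5e-6 N/m^2
    return pressure * area_m2 * cos_i * (1.0 + reflectivity)

# Example: a 10 m^2 absorbing panel facing the Sun directly
print(f"{srp_force(10.0, 0.0):.2e} N")  # on the order of 5e-5 N
```

Forces this small still matter: integrated over months of flight they can shift a spacecraft's trajectory by kilometers, which is why a scientific renderer models them rather than ignoring them.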
Vira can accurately render indirect lighting, which is when an area is still lit up even though it is not directly facing a light source. Credit: NASA
Find Your Way with a Photo
Another team at Goddard is developing a tool to enable navigation based on images of the horizon. Andrew Liounis, an optical navigation product design lead, leads the team, working alongside NASA Interns Andrew Tennenbaum and Will Driessen, as well as Alvin Yew, the gas processing lead for NASA’s DAVINCI mission.
An astronaut or rover using this algorithm could take one picture of the horizon, which the program would compare to a map of the explored area. The algorithm would then output the estimated location of where the photo was taken.
Using one photo, the algorithm can estimate a location to within hundreds of feet. Current work aims to show that, with two or more pictures, the algorithm can pinpoint the location to within tens of feet.
“We take the data points from the image and compare them to the data points on a map of the area,” Liounis explained. “It’s almost like how GPS uses triangulation, but instead of having multiple observers to triangulate one object, you have multiple observations from a single observer, so we’re figuring out where the lines of sight intersect.”
This type of technology could be useful for lunar exploration, where it is difficult to rely on GPS signals for location determination.
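As a rough illustration of the matching idea Liounis describes, the sketch below compares the horizon elevation angles measured from a single photo against the horizon profile predicted from each candidate cell of a terrain map, and returns the best-matching cell. It is not the Goddard team's algorithm; the digital elevation map, brute-force grid search, and function names are assumptions made for the example.

```python
# Illustrative sketch only -- not the Goddard team's horizon-navigation algorithm.
import numpy as np

def horizon_profile(dem: np.ndarray, row: int, col: int, cell_size: float,
                    n_azimuths: int = 36, max_range: int = 60) -> np.ndarray:
    """Predicted horizon elevation angle (radians) in each azimuth bin, as seen
    from DEM cell (row, col). A flat horizon (0 rad) is the baseline."""
    angles = np.zeros(n_azimuths)
    h0 = dem[row, col]
    for k, az in enumerate(np.linspace(0.0, 2.0 * np.pi, n_azimuths, endpoint=False)):
        dr, dc = -np.cos(az), np.sin(az)   # az = 0 points "north" (up the grid)
        for step in range(1, max_range):
            r, c = int(round(row + dr * step)), int(round(col + dc * step))
            if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
                break
            angles[k] = max(angles[k], np.arctan2(dem[r, c] - h0, step * cell_size))
    return angles

def locate(observed: np.ndarray, dem: np.ndarray, cell_size: float):
    """Brute-force search: return the map cell whose predicted horizon
    best matches the profile measured from the photo."""
    best, best_err = None, np.inf
    for r in range(dem.shape[0]):
        for c in range(dem.shape[1]):
            err = np.mean((horizon_profile(dem, r, c, cell_size) - observed) ** 2)
            if err < best_err:
                best, best_err = (r, c), err
    return best

# Example: recover an observer's position on a small synthetic 30 x 30 elevation map
rng = np.random.default_rng(0)
dem = rng.normal(0.0, 5.0, size=(30, 30)).cumsum(axis=0)   # rolling synthetic terrain
true_pos = (18, 12)
photo_profile = horizon_profile(dem, *true_pos, cell_size=10.0)
print(locate(photo_profile, dem, cell_size=10.0))           # expect (18, 12)
```

A second photo taken a known distance away would add a second set of line-of-sight constraints, which is what narrows the estimate from hundreds of feet toward tens of feet.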
A Visual Perception Algorithm to Detect Craters
To automate optical navigation and visual perception processes, Goddard intern Timothy Chase is developing a programming tool called GAVIN (Goddard AI Verification and Integration) Tool Suite.
This tool helps build deep learning models, a type of machine learning algorithm that is trained to process inputs like a human brain. In addition to developing the tool itself, Chase and his team are building a deep learning algorithm using GAVIN that will identify craters in poorly lit areas, such as the Moon.
“As we’re developing GAVIN, we want to test it out,” Chase explained. “This model that will identify craters in low-light bodies will not only help us learn how to improve GAVIN, but it will also prove useful for missions like Artemis, which will see astronauts exploring the Moon’s south pole region — a dark area with large craters — for the first time.”
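For a sense of what such a model looks like in code, here is a minimal PyTorch sketch of a crater-versus-background patch classifier. It is not GAVIN or the team's model; the architecture, patch size, and class labels are illustrative assumptions.

```python
# Minimal PyTorch sketch of a crater-vs-background patch classifier.
# NOT the GAVIN tool suite -- only illustrates the kind of deep learning
# model the article describes (a network trained to find craters in imagery).

import torch
import torch.nn as nn

class CraterPatchNet(nn.Module):
    """Classify small grayscale image patches as crater (1) or background (0)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),                    # logits: background, crater
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of eight 64x64 single-channel patches (e.g. lunar imagery)
model = CraterPatchNet()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 2])
```

In practice such a network would be trained on labeled patches, and the hard part for poorly lit polar terrain is assembling training data where shadows hide most of a crater's rim, which is exactly the case the team wants the model to handle.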
As NASA continues to explore previously uncharted areas of our solar system, technologies like these could help make planetary exploration at least a little bit simpler. Whether by developing detailed 3D maps of new worlds, navigating with photos, or building deep learning algorithms, the work of these teams could bring the ease of Earth navigation to new worlds.
By Matthew Kaufman
NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated: Aug 07, 2024
Editor: Rob Garner
Contact: Rob Garner, rob.garner@nasa.gov
Location: Goddard Space Flight Center
Related Terms: Goddard Technology, Artificial Intelligence (AI), Goddard Space Flight Center, Technology
Explore More
4 min read – NASA Improves GIANT Optical Navigation Technology for Future Missions: Goddard's GIANT optical navigation software helped guide the OSIRIS-REx mission to the asteroid Bennu. Today… (Article, 10 months ago)
4 min read – Space Station Research Contributes to Navigation Systems for Moon Voyages (Article, 2 years ago)
5 min read – NASA, Industry Improve Lidars for Exploration, Science: NASA engineers will test a suite of new laser technologies from an aircraft this summer… (Article, 5 months ago)
View the full article
-
By NASA
Technological innovations make headlines every day, and NASA’s In Space Production Applications (InSPA) portfolio of awards is driving these innovations into the future. InSPA awards help U.S. companies demonstrate in-space manufacturing of their products and move them to market, propelling U.S. industry toward a sustainable, scalable, and profitable non-NASA demand for services and products manufactured in the microgravity environment of low Earth orbit for use on Earth.
Latest News:
A Meta-Analysis of Semiconductor Materials Fabricated in Microgravity (June 26, 2024)
ISSRDC Announces “Steps to Space” Session to Educate Future Researchers (June 27, 2024)
Innovation in Focus: Technology Development (June 13, 2024)
Optical Fiber Production – Science in Space: March 2024 (March 25, 2024)
ISS National Lab Releases In Space Production Applications Funding Opportunity (March 6, 2024)
NASA Aims to Boost In Space Production Applications (May 15, 2023)
White Paper: The Benefits of Semiconductor Manufacturing in Low-Earth Orbit for Terrestrial Use (November 9, 2023)
Station InSPA: The Next Industrial Revolution? (September 1, 2022)
Keep Exploring: Discover More Topics
In Space Production Applications
Low Earth Orbit Economy
Opportunities and Information for Researchers
Latest News from Space Station Research
View the full article