Slime Mold Simulations Used to Map the Dark Matter Holding the Universe Together
-
Similar Topics
-
By NASA
How NASA’s SPHEREx Mission Will Share Its All-Sky Map With the World
NASA’s SPHEREx mission will map the entire sky in 102 different wavelengths, or colors, of infrared light. This image of the Vela Molecular Ridge was captured by SPHEREx and is part of the mission’s first-ever public data release. The yellow patch on the right side of the image is a cloud of interstellar gas and dust that glows in some infrared colors due to radiation from nearby stars. NASA/JPL-Caltech
NASA’s newest astrophysics space telescope launched in March on a mission to create an all-sky map of the universe. Now settled into low-Earth orbit, SPHEREx (Spectro-Photometer for the History of the Universe, Epoch of Reionization, and Ices Explorer) has begun delivering its sky survey data to a public archive on a weekly basis, allowing anyone to use the data to probe the secrets of the cosmos.
“Because we’re looking at everything in the whole sky, almost every area of astronomy can be addressed by SPHEREx data,” said Rachel Akeson, the lead for the SPHEREx Science Data Center at IPAC. IPAC is a science and data center for astrophysics and planetary science at Caltech in Pasadena, California.
Other missions, like NASA’s now-retired WISE (Wide-field Infrared Survey Explorer), have also mapped the entire sky. SPHEREx builds on this legacy by observing in 102 infrared wavelengths, compared to WISE’s four wavelength bands.
By putting the many wavelength bands of SPHEREx data together, scientists can identify the signatures of specific molecules with a technique known as spectroscopy. The mission’s science team will use this method to study the distribution of frozen water and organic molecules — the “building blocks of life” — in the Milky Way.
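To see how combining bands amounts to spectroscopy, consider a simple band-depth measurement: an ice or organic absorption feature shows up as a flux deficit at its wavelength relative to a continuum interpolated from neighboring bands. The sketch below is illustrative only, with made-up wavelengths and flux values; it is not SPHEREx pipeline code.

```python
# Illustrative band-depth estimate for an absorption feature (e.g., water ice
# near 3.0 microns): compare the flux in the feature band to a continuum
# interpolated linearly between two flanking bands. All numbers are made up.

def band_depth(bands, feature_um, cont_short_um, cont_long_um):
    """bands: dict mapping wavelength in microns -> measured flux (arbitrary units)."""
    f_short, f_long = bands[cont_short_um], bands[cont_long_um]
    frac = (feature_um - cont_short_um) / (cont_long_um - cont_short_um)
    continuum = f_short + frac * (f_long - f_short)   # interpolated continuum flux
    # 0 means no absorption; values approaching 1 mean a deep feature.
    return 1.0 - bands[feature_um] / continuum

fluxes = {2.5: 1.00, 3.0: 0.62, 3.6: 0.95}            # hypothetical measurements
print(f"band depth at 3.0 um: {band_depth(fluxes, 3.0, 2.5, 3.6):.2f}")
```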
This animation shows how NASA’s SPHEREx observatory will map the entire sky — a process it will complete four times over its two-year mission. The telescope will observe every point in the sky in 102 different infrared wavelengths, more than any other all-sky survey. SPHEREx’s openly available data will enable a wide variety of astronomical studies. Credit: NASA/JPL-Caltech
The SPHEREx science team will also use the mission’s data to study the physics that drove the universe’s expansion following the big bang, and to measure the amount of light emitted by all the galaxies in the universe over time. Releasing SPHEREx data in a public archive encourages far more astronomical studies than the team could do on their own.
“By making the data public, we enable the whole astronomy community to use SPHEREx data to work on all these other areas of science,” Akeson said.
NASA is committed to the sharing of scientific data, promoting transparency and efficiency in scientific research. In line with this commitment, data from SPHEREx appears in the public archive within 60 days after the telescope collects each observation. The short delay allows the SPHEREx team to process the raw data to remove or flag artifacts, account for detector effects, and align the images to the correct astronomical coordinates.
The team publishes the procedures they used to process the data alongside the actual data products. “We want enough information in those files that people can do their own research,” Akeson said.
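In practice, being aligned to the correct astronomical coordinates means a user can map any pixel in a released image to a position on the sky. Assuming the products are standard FITS images carrying World Coordinate System (WCS) headers, the usual convention for such archives (the exact SPHEREx file layout is not described here), a minimal sketch with astropy looks like this:

```python
# Sketch: read a calibrated image and convert a pixel position to sky coordinates
# using its WCS header. The filename and the extension index are placeholders,
# not actual SPHEREx product conventions.
from astropy.io import fits
from astropy.wcs import WCS

with fits.open("spherex_image.fits") as hdul:   # placeholder filename
    image = hdul[1].data                        # assumed image extension
    wcs = WCS(hdul[1].header)

# Right ascension and declination of the pixel at column 250, row 100.
sky = wcs.pixel_to_world(250, 100)
print(sky.ra.deg, sky.dec.deg)
```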
One of the early test images captured by NASA’s SPHEREx mission in April 2025. This image shows a section of sky in one infrared wavelength, or color, that is invisible to the human eye but is represented here in a visible color. This particular wavelength (3.29 microns) reveals a cloud of dust made of a molecule similar to soot or smoke. NASA/JPL-Caltech
This image from NASA’s SPHEREx shows the same region of space in a different infrared wavelength (0.98 microns), once again represented by a color that is visible to the human eye. The dust cloud has vanished because the molecules that make up the dust — polycyclic aromatic hydrocarbons — do not radiate light in this color. NASA/JPL-Caltech
During its two-year prime mission, SPHEREx will survey the entire sky twice a year, creating four all-sky maps. After the mission reaches the one-year mark, the team plans to release a map of the whole sky at all 102 wavelengths.
In addition to the science enabled by SPHEREx itself, the telescope unlocks an even greater range of astronomical studies when paired with other missions. Data from SPHEREx can be used to identify interesting targets for further study by NASA’s James Webb Space Telescope, refine exoplanet parameters collected from NASA’s TESS (Transiting Exoplanet Survey Satellite), and study the properties of dark matter and dark energy along with ESA’s (European Space Agency’s) Euclid mission and NASA’s upcoming Nancy Grace Roman Space Telescope.
The SPHEREx mission’s all-sky survey will complement data from other NASA space telescopes. SPHEREx is illustrated second from the right. The other telescope illustrations are, from left to right: the Hubble Space Telescope, the retired Spitzer Space Telescope, the retired WISE/NEOWISE mission, the James Webb Space Telescope, and the upcoming Nancy Grace Roman Space Telescope. NASA/JPL-Caltech
The IPAC archive that hosts SPHEREx data, IRSA (NASA/IPAC Infrared Science Archive), also hosts pointed observations and all-sky maps at a variety of wavelengths from previous missions. The large amount of data available through IRSA gives users a comprehensive view of the astronomical objects they want to study.
“SPHEREx is part of the entire legacy of NASA space surveys,” said IRSA Science Lead Vandana Desai. “People are going to use the data in all kinds of ways that we can’t imagine.”
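The archive can also be queried programmatically. The sketch below uses astroquery’s IRSA interface; the catalog name is a placeholder, since the actual SPHEREx table identifiers are not listed here and should be looked up on IRSA.

```python
# Sketch of a programmatic IRSA cone search with astroquery. The module shown
# (astroquery.ipac.irsa) is real; the catalog name below is a placeholder.
import astropy.units as u
from astropy.coordinates import SkyCoord
from astroquery.ipac.irsa import Irsa

position = SkyCoord(ra=150.1 * u.deg, dec=2.2 * u.deg, frame="icrs")
table = Irsa.query_region(position,
                          catalog="spherex_placeholder_catalog",  # placeholder name
                          spatial="Cone",
                          radius=30 * u.arcsec)
print(table.colnames)
```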
NASA’s Office of the Chief Science Data Officer leads open science efforts for the agency. Public sharing of scientific data, tools, research, and software maximizes the impact of NASA’s science missions. To learn more about NASA’s commitment to transparency and reproducibility of scientific research, visit science.nasa.gov/open-science. To get more stories about the impact of NASA’s science data delivered directly to your inbox, sign up for the NASA Open Science newsletter.
By Lauren Leese
Web Content Strategist for the Office of the Chief Science Data Officer
More About SPHEREx
The SPHEREx mission is managed by NASA’s Jet Propulsion Laboratory for the agency’s Astrophysics Division within the Science Mission Directorate at NASA Headquarters. BAE Systems in Boulder, Colorado, built the telescope and the spacecraft bus. The science analysis of the SPHEREx data will be conducted by a team of scientists located at 10 institutions in the U.S., two in South Korea, and one in Taiwan. Caltech in Pasadena managed and integrated the instrument. The mission’s principal investigator is based at Caltech with a joint JPL appointment. Data will be processed and archived at IPAC at Caltech. The SPHEREx dataset will be publicly available at the NASA-IPAC Infrared Science Archive. Caltech manages JPL for NASA.
To learn more about SPHEREx, visit:
https://nasa.gov/SPHEREx
Media Contacts
Calla Cofield
Jet Propulsion Laboratory, Pasadena, Calif.
626-808-2469
calla.e.cofield@jpl.nasa.gov
Amanda Adams
Office of the Chief Science Data Officer
256-683-6661
amanda.m.adams@nasa.gov
-
By European Space Agency
Astronomers have discovered a huge filament of hot gas bridging four galaxy clusters. Ten times as massive as our galaxy, the thread could contain some of the Universe’s ‘missing’ matter, addressing a decades-long mystery.
-
By NASA
A funky effect Einstein predicted, known as gravitational lensing — when a foreground galaxy magnifies more distant galaxies behind it — will soon become common when NASA’s Nancy Grace Roman Space Telescope begins science operations in 2027 and produces vast surveys of the cosmos.
This image shows a simulated observation from NASA’s Nancy Grace Roman Space Telescope with an overlay of its Wide Field Instrument’s field of view. More than 20 gravitational lenses, with examples shown at left and right, are expected to pop out in every one of Roman’s vast observations. A journal paper led by Bryce Wedig, a graduate student at Washington University in St. Louis, Missouri, estimates that of those Roman detects, about 500 from the telescope’s High-Latitude Wide-Area Survey will be suitable for dark matter studies. By examining such a large population of gravitational lenses, the researchers hope to learn a lot more about the mysterious nature of dark matter. Credit: NASA, Bryce Wedig (Washington University), Tansu Daylan (Washington University), Joseph DePasquale (STScI)
A particular subset of gravitational lenses, known as strong lenses, is the focus of a new paper published in the Astrophysical Journal led by Bryce Wedig, a graduate student at Washington University in St. Louis. The research team has calculated that over 160,000 gravitational lenses, including hundreds suitable for this study, are expected to pop up in Roman’s vast images. Each Roman image will be 200 times larger than infrared snapshots from NASA’s Hubble Space Telescope, and its upcoming “wealth” of lenses will vastly outpace the hundreds studied by Hubble to date.
Roman will conduct three core surveys, providing expansive views of the universe. This science team’s work is based on a previous version of Roman’s now fully defined High-Latitude Wide-Area Survey. The researchers are working on a follow-up paper that will align with the final survey’s specifications to fully support the research community.
“The current sample size of these objects from other telescopes is fairly small because we’re relying on two galaxies to be lined up nearly perfectly along our line of sight,” Wedig said. “Other telescopes are either limited to a smaller field of view or less precise observations, making gravitational lenses harder to detect.”
Gravitational lenses are made up of at least two cosmic objects. In some cases, a single foreground galaxy has enough mass to act like a lens, magnifying a galaxy that is almost perfectly behind it. Light from the background galaxy curves around the foreground galaxy along more than one path, appearing in observations as warped arcs and crescents. Of the 160,000 lensed galaxies Roman may identify, the team expects to narrow that down to about 500 that are suitable for studying the structure of dark matter at scales smaller than those galaxies.
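For context, the characteristic angular scale of that image splitting is the Einstein radius. The standard point-mass lensing relation below is textbook background rather than a result from the new paper:

\[
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_{l}\,D_{s}}}
\]

Here \(M\) is the mass of the foreground lens and \(D_{l}\), \(D_{s}\), and \(D_{ls}\) are angular-diameter distances to the lens, to the source, and between the two. Arcs and duplicate images form near \(\theta_E\), which is why measuring their positions constrains the mass, including the dark matter, enclosed by the lens.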
“Roman will not only significantly increase our sample size — its sharp, high-resolution images will also allow us to discover gravitational lenses that appear smaller on the sky,” said Tansu Daylan, the principal investigator of the science team conducting this research program. Daylan is an assistant professor and a faculty fellow at the McDonnell Center for the Space Sciences at Washington University in St. Louis. “Ultimately, both the alignment and the brightness of the background galaxies need to meet a certain threshold so we can characterize the dark matter within the foreground galaxies.”
This video shows how a background galaxy’s light is lensed or magnified by a massive foreground galaxy, seen at center, before reaching NASA’s Roman Space Telescope. Light from the background galaxy is distorted, curving around the foreground galaxy and appearing more than once as warped arcs and crescents. Researchers studying these objects, known as gravitational lenses, can better characterize the mass of the foreground galaxy, which offers clues about the particle nature of dark matter. Credit: NASA, Joseph Olmsted (STScI)
What Is Dark Matter?
Not all mass in galaxies is made up of objects we can see, like star clusters. A significant fraction of a galaxy’s mass is made up of dark matter, so called because it doesn’t emit, reflect, or absorb light. Dark matter does, however, possess mass, and like anything else with mass, it can cause gravitational lensing.
When the gravity of a foreground galaxy bends the path of a background galaxy’s light, its light is routed onto multiple paths. “This effect produces multiple images of the background galaxy that are magnified and distorted differently,” Daylan said. These “duplicates” are a huge advantage for researchers — they allow multiple measurements of the lensing galaxy’s mass distribution, ensuring that the resulting measurement is far more precise.
Roman’s 300-megapixel camera, known as its Wide Field Instrument, will allow researchers to accurately determine the bending of the background galaxies’ light by as little as 50 milliarcseconds, which is like measuring the diameter of a human hair from the distance of more than two and a half American football fields or soccer pitches.
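As a rough check on that comparison, take a typical human-hair diameter of about 70 micrometers (an assumed value, not one given in the article). The small-angle relation then gives

\[
\theta = 50\ \mathrm{mas} \approx 2.4\times10^{-7}\ \mathrm{rad},
\qquad
D \approx \frac{d}{\theta} \approx \frac{7\times10^{-5}\ \mathrm{m}}{2.4\times10^{-7}} \approx 290\ \mathrm{m},
\]

or roughly 2.6 American football fields of about 110 meters each, consistent with the article’s figure of more than two and a half.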
The amount of gravitational lensing that the background light experiences depends on the intervening mass. Less massive clumps of dark matter cause smaller distortions. As a result, if researchers are able to measure tinier amounts of bending, they can detect and characterize smaller, less massive dark matter structures — the types of structures that gradually merged over time to build up the galaxies we see today.
With Roman, the team will accumulate overwhelming statistics about the size and structures of early galaxies. “Finding gravitational lenses and being able to detect clumps of dark matter in them is a game of tiny odds. With Roman, we can cast a wide net and expect to get lucky often,” Wedig said. “We won’t see dark matter in the images — it’s invisible — but we can measure its effects.”
“Ultimately, the question we’re trying to address is: What particle or particles constitute dark matter?” Daylan added. “While some properties of dark matter are known, we essentially have no idea what makes up dark matter. Roman will help us to distinguish how dark matter is distributed on small scales and, hence, its particle nature.”
Preparations Continue
Before Roman launches, the team will also search for more candidates in observations from ESA’s (the European Space Agency’s) Euclid mission and the upcoming ground-based Vera C. Rubin Observatory in Chile, which will begin its full-scale operations in a few weeks. Once Roman’s infrared images are in hand, the researchers will combine them with complementary visible light images from Euclid, Rubin, and Hubble to maximize what’s known about these galaxies.
“We will push the limits of what we can observe, and use every gravitational lens we detect with Roman to pin down the particle nature of dark matter,” Daylan said.
The Nancy Grace Roman Space Telescope is managed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, with participation by NASA’s Jet Propulsion Laboratory in Southern California; Caltech/IPAC in Pasadena, California; the Space Telescope Science Institute in Baltimore; and a science team comprising scientists from various research institutions. The primary industrial partners are BAE Systems, Inc. in Boulder, Colorado; L3Harris Technologies in Melbourne, Florida; and Teledyne Scientific & Imaging in Thousand Oaks, California.
By Claire Blome
Space Telescope Science Institute, Baltimore, Md.
-
By NASA
What do music ensembles and human spaceflight have in common? They require the harmonization of different elements to create an inspiring opus.
NASA’s Paige Whittington has experience with both.
As a principal flutist for Purdue University’s Wind Ensemble, Whittington helped fellow flutists play beautiful music together while pursuing her graduate degree. Now, as a space exploration simulation architect at Johnson Space Center in Houston, she strives for a cross-team harmony that can inform the agency’s Moon to Mars exploration approach.
“Simulation often sits at the intersection of several teams because we integrate various designs and mission requirements,” she said. “We have to learn how to best fit those teams and their priorities together to enable cutting-edge human exploration.”
Official NASA portrait of Paige Whittington. NASA/Josh Valcarcel
Whittington is part of the NASA Exploration Systems Simulations (NExSyS) team, which develops physics-based simulations to evaluate various vehicles and mission concepts. Her role includes working with lunar and Mars architecture teams within NASA’s Strategy and Architecture Office to assess current and potential future elements of vehicle design, logistics, and planning.
“Our simulations help inform engineers, astronauts, and managers about the new, challenging environments that await us on the Moon and Mars,” she said.
One of the most challenging and rewarding projects she is working on is the Artemis Distributed Simulation. “NExSyS develops and maintains several individual simulations such as rovers, landers, and habitats. However, human exploration on other planetary bodies requires careful integration and coordination of these individual pieces,” she explained.
The distributed simulation brings those pieces together to enable agency teams to envision a complete Artemis mission to the lunar surface. Different elements can be added or removed to create a wide variety of scenarios. The simulation can run automatically with predetermined settings or be responsive to real-time and randomized changes. Participants can operate the team’s video walls, mock-up mission control console, virtual reality platforms, and lander piloting facility to interact together within the chosen Artemis mission scenario.
Paige Whittington standing in front of the Video Wall used for human-in-the-loop simulations located inside the Systems Engineering Simulator facility at NASA’s Johnson Space Center. Image courtesy of Paige Whittington
“I am very proud to know that the simulations I help develop have impacted some of the decisions being made by NASA’s architecture teams,” she said.
She is excited to take on a new responsibility, as well. Whittington recently became project manager of the JSC Engineering Orbital Dynamics software package. Also known as JEOD, this open-source tool was created by NASA to model spacecraft trajectories, such as proposed flight paths for a lunar lander. JEOD calculates gravitational and other environmental forces acting on spacecraft to simulate the position and orientation of those vehicles over time, whether they are orbiting a cosmic body or traveling between planets.
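To make that concrete, the sketch below shows the core calculation such a tool performs: integrating gravitational acceleration to advance a spacecraft’s position and velocity step by step. It is a minimal stand-in for illustration only, not JEOD code or its API, and it omits attitude dynamics and the many perturbations (drag, third bodies, non-spherical gravity) a real package models.

```python
# Minimal two-body propagation sketch: advance a spacecraft state around the Moon
# with a fourth-order Runge-Kutta integrator. Illustrative only -- not JEOD.
import math

MU_MOON = 4.9028e12  # m^3/s^2, gravitational parameter of the Moon

def accel(pos):
    """Point-mass gravitational acceleration (m/s^2) at position pos (meters)."""
    r = math.sqrt(sum(x * x for x in pos))
    return [-MU_MOON * x / r**3 for x in pos]

def rk4_step(pos, vel, dt):
    """Advance (position, velocity) by one Runge-Kutta step of size dt seconds."""
    def deriv(p, v):
        return v, accel(p)

    k1p, k1v = deriv(pos, vel)
    k2p, k2v = deriv([p + 0.5 * dt * d for p, d in zip(pos, k1p)],
                     [v + 0.5 * dt * d for v, d in zip(vel, k1v)])
    k3p, k3v = deriv([p + 0.5 * dt * d for p, d in zip(pos, k2p)],
                     [v + 0.5 * dt * d for v, d in zip(vel, k2v)])
    k4p, k4v = deriv([p + dt * d for p, d in zip(pos, k3p)],
                     [v + dt * d for v, d in zip(vel, k3v)])
    new_pos = [p + dt / 6.0 * (a + 2 * b + 2 * c + d)
               for p, a, b, c, d in zip(pos, k1p, k2p, k3p, k4p)]
    new_vel = [v + dt / 6.0 * (a + 2 * b + 2 * c + d)
               for v, a, b, c, d in zip(vel, k1v, k2v, k3v, k4v)]
    return new_pos, new_vel

# Circular orbit 100 km above the lunar surface (radius ~1837 km), propagated
# for ten minutes at a one-second step.
r0 = 1.837e6
pos, vel = [r0, 0.0, 0.0], [0.0, math.sqrt(MU_MOON / r0), 0.0]
for _ in range(600):
    pos, vel = rk4_step(pos, vel, 1.0)
print(pos, vel)
```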
Whittington’s family moved frequently during her childhood, calling five different states home as she grew up. Their time in Florida would have a life-long impact.
“My parents drove me and my sister across the state to visit NASA’s Kennedy Space Center. It was mesmerizing, awe-inspiring, and seemingly a whole different world from where my 8-year-old self thought I was living,” she said. Her love of space never waned, and a high school physics teacher encouraged her to study aerospace engineering in college. “That was the turning point when I realized space exploration didn’t have to stay in my dreams – it was a career field I could actually work in.”
Whittington took her teacher’s advice, earning a bachelor’s degree in aerospace engineering from the University of Texas at Austin. She also completed two internships at Johnson through the Universities Space Research Association and interned with a NASA contractor after graduation. While pursuing a master’s degree in Aeronautics and Astronautics at Purdue, Whittington was accepted to NASA’s Pathways Program and did two rotations with the Simulation and Graphics Branch before joining the team as a full-time employee in June 2022.
Paige Whittington celebrating the launch of Artemis I at Johnson Space Center in 2022. Image courtesy of Paige Whittington
Whittington has learned several key lessons during her five years with NASA, including the essential part open, regular communication plays in understanding an individual’s or team’s core needs and limitations. She also stressed the importance of adaptability.
“The path that you planned for may not be the path you end up choosing. But that planning enabled you to be who you are now and to make different choices,” she said. “I did not anticipate working in simulations when I started my aerospace engineering degree, but I took the opportunity when it was presented, and I am so happy that I did.”
-
By NASA
NASA has named a team from Stanford University in California the winner of the Lunar Autonomy Challenge, a six-month competition in which U.S. college and university student teams virtually mapped and explored a simulated lunar environment using a digital twin of NASA’s In-Situ Resource Utilization Pilot Excavator (IPEx).
The winning team successfully demonstrated the design and functionality of their autonomous agent, or software that performs specified actions without human intervention. Their agent autonomously navigated the IPEx digital twin in the virtual lunar environment, while accurately mapping the surface, correctly identifying obstacles, and effectively managing available power.
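Conceptually, such an agent runs a sense-map-act loop under a power budget. The toy sketch below illustrates only that structure; every name in it is hypothetical and it is unrelated to the challenge’s actual simulator interface, which was built on the CARLA platform and the IPEx digital twin.

```python
# Toy sense-map-act loop with a power budget. Purely illustrative; not the
# Lunar Autonomy Challenge interface.
import random

class ToySim:
    """Stand-in environment: a 5x5 grid with a few randomly placed obstacles."""
    def __init__(self):
        self.battery = 100.0
        self.obstacles = {(random.randint(0, 4), random.randint(0, 4)) for _ in range(3)}

    def sense(self, cell):
        self.battery -= 1.0  # sensing and moving cost power
        return "obstacle" if cell in self.obstacles else "free"

def run_agent(sim, min_battery=20.0):
    """Map every cell, stopping early if the power budget is exhausted."""
    surface_map = {}
    for x in range(5):
        for y in range(5):
            if sim.battery < min_battery:            # manage power: halt before running out
                return surface_map
            surface_map[(x, y)] = sim.sense((x, y))  # sense and record each cell
    return surface_map

lunar_map = run_agent(ToySim())
print(f"mapped {len(lunar_map)} cells, "
      f"{sum(v == 'obstacle' for v in lunar_map.values())} obstacle cells found")
```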
Lunar simulation developed by the Lunar Autonomy Challenge’s first-place team from Stanford University. Credit: Stanford University’s NAV Lab team
Team photo of the NAV Lab Lunar Autonomy Challenge team from Stanford University. Credit: Stanford University’s NAV Lab team
“The Lunar Autonomy Challenge has been a truly unique experience. The challenge provided the opportunity to develop and test methods in a highly realistic simulation environment.”
Adam Dai
Lunar Autonomy Challenge team lead, Stanford University
Dai added, “It pushed us to find solutions robust to the harsh conditions of the lunar surface. I learned so much through the challenge, both about new ideas and methods, as well as through deepening my understanding of core methods across the autonomy stack (perception, localization, mapping, planning). I also very much enjoyed working together with my team to brainstorm different approaches and strategies and solve tangible problems observed in the simulation.”
The challenge offered 31 teams a valuable opportunity to gain experience in software development, autonomy, and machine learning using cutting-edge NASA lunar technology. Participants also applied essential skills common to nearly every engineering discipline, including technical writing, collaborative teamwork, and project management.
The Lunar Autonomy Challenge supports NASA’s Lunar Surface Innovation Initiative (LSII), which is part of the Space Technology Mission Directorate. The LSII aims to accelerate technology development and pursue results that will provide essential infrastructure for lunar exploration by collaborating with industry, academia, and other government agencies.
“The work displayed by all of these teams has been impressive, and the solutions they have developed are beneficial to advancing lunar and Mars surface technologies as we prepare for increasingly complex missions farther from home.”
Niki Werkheiser
Director of Technology Maturation and LSII lead, NASA Headquarters
“To succeed, we need input from everyone — every idea counts to propel our goals forward. It is very rewarding to see these students and software developers contributing their skills to future lunar and Mars missions,” Werkheiser added.
Through the Lunar Autonomy Challenge, NASA collaborated with the Johns Hopkins Applied Physics Laboratory, Caterpillar Inc., and Embodied AI. Each team contributed unique expertise and tools necessary to make the challenge a success.
The Applied Physics Laboratory managed the challenge for NASA. As a systems integrator for LSII, they provided expertise to streamline rigor and engineering discipline across efforts, ensuring the development of successful, efficient, and cost-effective missions — backed by the world’s largest cohort of lunar scientists.
Caterpillar Inc. is known for its construction and excavation equipment and operates a large fleet of autonomous haul trucks. The company has also worked with NASA for more than 20 years on a variety of technologies, including autonomy, 3D printing, robotics, and simulators, and it continues to collaborate with NASA on technologies that support the agency’s mission objectives and provide value to the mining and construction industries.
Embodied AI collaborated with Caterpillar to integrate the simulation into the open-source driving environment used for the challenge. For the Lunar Autonomy Challenge, the normally available digital assets of the CARLA simulation platform, such as urban layouts, buildings, and vehicles, were replaced by an IPEx “Digital Twin” and lunar environmental models.
“This collaboration is a great example of how the government, large companies, small businesses, and research institutions can thoughtfully leverage each other’s different, but complementary, strengths,” Werkheiser added. “By substantially modernizing existing tools, we can turn today’s novel technologies into tomorrow’s institutional capabilities for more efficient and effective space exploration, while also stimulating innovation and economic growth on Earth.”
FINALIST TEAMS
First Place
NAV Lab team
Stanford University, Stanford, California
Second Place
MAPLE (MIT Autonomous Pathfinding for Lunar Exploration) team
Massachusetts Institute of Technology, Cambridge, Massachusetts
Third Place
Moonlight team
Carnegie Mellon University, Pittsburgh, Pennsylvania
OTHER COMPETING TEAMS
Lunar Explorers
Arizona State University, Tempe, Arizona
AIWVU
West Virginia University, Morgantown, West Virginia
Stellar Sparks
California Polytechnic Institute Pomona, Pomona, California
LunatiX
Johns Hopkins University Whiting School of Engineering, Baltimore
CARLA CSU
California State University, Stanislaus, Turlock, California
Rose-Hulman
Rose-Hulman Institute of Technology, Terre Haute, Indiana
Lunar Pathfinders
American Public University System, Charles Town, West Virginia
Lunar Autonomy Challenge digital simulation of lunar surface activity using a digital twin of NASA’s ISRU Pilot Excavator. Johns Hopkins Applied Physics Laboratory