By USH
Some time ago, while visiting the Grand Canyon in Arizona, a photographer captured several short video clips of the landscape. In one of those clips, an unusual anomaly was discovered.
The original footage is only 1.9 seconds long, but within that moment, something remarkable was caught on camera. An unidentified aerial phenomenon (UAP) flashed across the frame, visible for less than a second and noticeable only when the video was paused and analyzed frame by frame.
The object was moving at an astonishing speed, covering an estimated two to three miles in under a second, far beyond the capabilities of any conventional aircraft, drone, or helicopter.
This isn’t the first time such anomalous flying objects have been observed. Their characteristics defy comparison with known aerial technology.
Some skeptics have proposed that the object might have been a rock thrown into the canyon from behind the camera. However, that explanation seems unlikely. Most people can only throw objects at speeds of 10 to 20 meters per second (approximately 22 to 45 mph). The velocity of this object far exceeded that range, and its near-invisibility in the unedited video suggests it was moving much faster.
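For scale, here is a quick back-of-the-envelope conversion (a rough Python sketch, not a measurement; it simply takes the article's two-to-three-miles figure over a full second, so it is a lower bound on the implied speed):

```python
# Back-of-the-envelope speed check for the Grand Canyon clip. The distance
# figures are the article's estimates, not measured values.

MILE_M = 1609.34          # meters per statute mile
MPH_PER_MPS = 2.23694     # miles per hour per meter per second

# Article's estimate: the object covers 2-3 miles in under one second.
object_speed_mps = [2 * MILE_M / 1.0, 3 * MILE_M / 1.0]   # ~3,219-4,828 m/s

# Typical human throwing speed cited in the article.
throw_speed_mps = [10.0, 20.0]                            # ~22-45 mph

print(f"Object: {object_speed_mps[0]:.0f}-{object_speed_mps[1]:.0f} m/s "
      f"({object_speed_mps[0]*MPH_PER_MPS:.0f}-{object_speed_mps[1]*MPH_PER_MPS:.0f} mph)")
print(f"Throw:  {throw_speed_mps[0]:.0f}-{throw_speed_mps[1]:.0f} m/s "
      f"({throw_speed_mps[0]*MPH_PER_MPS:.0f}-{throw_speed_mps[1]*MPH_PER_MPS:.0f} mph)")
# The claimed object speed is roughly 160-480 times a strong throw.
```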
View the full article
-
By NASA
5 min read
NASA Launching Rockets Into Radio-Disrupting Clouds
NASA is launching rockets from a remote Pacific island to study mysterious, high-altitude cloud-like structures that can disrupt critical communication systems. The mission, called Sporadic-E ElectroDynamics, or SEED, opens its three-week launch window from Kwajalein Atoll in the Marshall Islands on Friday, June 13.
The atmospheric features SEED is studying are known as Sporadic-E layers, and they create a host of problems for radio communications. When they are present, air traffic controllers and marine radio users may pick up signals from unusually distant regions, mistaking them for nearby sources. Military operators using radar to see beyond the horizon may detect false targets — nicknamed “ghosts” — or receive garbled signals that are tricky to decipher. Sporadic-E layers are constantly forming, moving, and dissipating, so these disruptions can be difficult to anticipate.
An animated illustration depicts Sporadic-E layers forming in the lower portions of the ionosphere, causing radio signals to reflect back to Earth before reaching higher layers of the ionosphere. Credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab
Sporadic-E layers form in the ionosphere, a layer of Earth’s atmosphere that stretches from about 40 to 600 miles (60 to 1,000 kilometers) above sea level. Home to the International Space Station and most Earth-orbiting satellites, the ionosphere is also where we see the greatest impacts of space weather. Primarily driven by the Sun, space weather causes myriad problems for our communications with satellites and between ground systems. A better understanding of the ionosphere is key to keeping critical infrastructure running smoothly.
The ionosphere is named for the charged particles, or ions, that reside there. Some of these ions come from meteors, which burn up in the atmosphere and leave traces of ionized iron, magnesium, calcium, sodium, and potassium suspended in the sky. These “heavy metals” are more massive than the ionosphere’s typical residents and tend to sink to lower altitudes, below 90 miles (140 kilometers). Occasionally, they clump together to create dense clusters known as Sporadic-E layers.
The Perseids meteor shower peaks in mid-August. Meteors like these can deposit metals into Earth’s ionosphere that can help create cloud-like structures called Sporadic-E layers. Credit: NASA/Preston Dyches
“These Sporadic-E layers are not visible to the naked eye, and can only be seen by radars. In the radar plots, some layers appear like patchy and puffy clouds, while others spread out, similar to an overcast sky, which we call a blanketing Sporadic-E layer,” said Aroh Barjatya, the SEED mission’s principal investigator and a professor of engineering physics at Embry-Riddle Aeronautical University in Daytona Beach, Florida. The SEED team includes scientists from Embry-Riddle, Boston College in Massachusetts, and Clemson University in South Carolina.
“There’s a lot of interest in predicting these layers and understanding their dynamics because of how they interfere with communications,” Barjatya said.
A Mystery at the Equator
Scientists can explain Sporadic-E layers when they form at midlatitudes but not when they appear close to Earth’s equator — such as near Kwajalein Atoll, where the SEED mission will launch.
In the Northern and Southern Hemispheres, Sporadic-E layers can be thought of as particle traffic jams.
Think of ions in the atmosphere as miniature cars traveling single file in lanes defined by Earth’s magnetic field lines. These lanes connect Earth end to end — emerging near the South Pole, bowing around the equator, and plunging back into the North Pole.
A conceptual animation shows Earth’s magnetic field. The blue lines radiating from Earth represent the magnetic field lines that charged particles travel along. Credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab
At Earth’s midlatitudes, the field lines angle toward the ground, descending through atmospheric layers with varying wind speeds and directions. As the ions pass through these layers, they experience wind shear — turbulent gusts that cause their orderly line to clump together. These particle pileups form Sporadic-E layers.
But near the magnetic equator, this explanation doesn’t work. There, Earth’s magnetic field lines run parallel to the surface and do not intersect atmospheric layers with differing winds, so Sporadic-E layers shouldn’t form. Yet, they do — though less frequently.
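To make the mid-latitude wind-shear picture more concrete, here is a deliberately simplified toy model (not the SEED team's physics code): ions are given a vertical drift that converges on a shear null, and an initially spread-out population collapses into a thin layer. The altitudes, drift strength, and time step below are arbitrary illustrative values.

```python
import numpy as np

# Toy illustration only: give ions a vertical drift w(z) = -k * (z - z0) that
# converges on a wind-shear null at altitude z0, and an initially spread-out
# ion population piles up into a thin, dense layer.

np.random.seed(0)
k, z0 = 0.05, 100.0          # drift strength (1/s) and shear-null altitude (km), arbitrary
dt, steps = 1.0, 600         # time step (s) and number of steps

z = np.random.uniform(90.0, 110.0, 10_000)   # ion altitudes spread over 20 km
spread_before = 2 * z.std()

for _ in range(steps):
    z += -k * (z - z0) * dt                  # drift toward the shear null

hist, edges = np.histogram(z, bins=40, range=(90.0, 110.0))
print(f"layer centered near {edges[hist.argmax()]:.1f} km; "
      f"thickness ~{2 * z.std():.3f} km (was ~{spread_before:.1f} km)")
```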
“We’re launching from the closest place NASA can to the magnetic equator,” Barjatya said, “to study the physics that existing theory doesn’t fully explain.”
Taking to the Skies
To investigate, Barjatya developed SEED to study low-latitude Sporadic-E layers from the inside. The mission relies on sounding rockets — uncrewed suborbital spacecraft carrying scientific instruments. Their flights last only a few minutes but can be launched precisely at fleeting targets.
Beginning the night of June 13, Barjatya and his team will monitor ALTAIR (ARPA Long-Range Tracking and Instrumentation Radar), a high-powered, ground-based radar system at the launch site, for signs of developing Sporadic-E layers. When conditions are right, Barjatya will give the launch command. A few minutes later, the rocket will be in flight.
The SEED science team and mission management team in front of the ARPA Long-Range Tracking and Instrumentation Radar (ALTAIR). The SEED team will use ALTAIR to monitor the ionosphere for signs of Sporadic-E layers and time the launch. Credit: U.S. Army Space and Missile Defense Command
On ascent, the rocket will release colorful vapor tracers. Ground-based cameras will track the tracers to measure wind patterns in three dimensions. Once inside the Sporadic-E layer, the rocket will deploy four subpayloads — miniature detectors that will measure particle density and magnetic field strength at multiple points. The data will be transmitted back to the ground as the rocket descends.
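The wind measurement relies on sighting the glowing tracers from multiple ground stations. A minimal sketch of the underlying triangulation geometry, assuming two hypothetical camera sites and idealized line-of-sight vectors (the mission's actual data-reduction pipeline is more sophisticated):

```python
import numpy as np

# Two ground cameras at known sites each report a unit line-of-sight vector
# toward a vapor-tracer puff. The puff's 3D position is estimated as the point
# minimizing the summed squared distance to both rays (two-view triangulation).

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of rays x = p_i + t_i * d_i (d_i unit vectors)."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for p, d in ((p1, d1), (p2, d2)):
        P = np.eye(3) - np.outer(d, d)      # projector onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Hypothetical geometry: tracer at 100 km altitude, camera sites 50 km apart.
truth = np.array([10.0, 20.0, 100.0])                      # km
cam1, cam2 = np.array([0.0, 0.0, 0.0]), np.array([50.0, 0.0, 0.0])
los1 = (truth - cam1) / np.linalg.norm(truth - cam1)
los2 = (truth - cam2) / np.linalg.norm(truth - cam2)

print(triangulate(cam1, los1, cam2, los2))                 # ~[10, 20, 100]
```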
On another night during the launch window, the team will launch a second, nearly identical rocket to collect additional data under potentially different conditions.
Barjatya and his team will use the data to improve computer models of the ionosphere, aiming to explain how Sporadic-E layers form so close to the equator.
“Sporadic-E layers are part of a much larger, more complicated physical system that is home to space-based assets we rely on every day,” Barjatya said. “This launch gets us closer to understanding another key piece of Earth’s interface to space.”
By Miles Hatfield
NASA’s Goddard Space Flight Center, Greenbelt, Md.
View the full article
-
By NASA
NASA astronaut Franklin Chang-Diaz works with a grapple fixture during a June 2002 spacewalk outside of the International Space Station. He was partnered with CNES (Centre National d’Etudes Spatiales) astronaut Philippe Perrin for the spacewalk – one of three that occurred during the STS-111 mission. Chang-Diaz was part of NASA’s ninth class of astronaut candidates. He became the first Hispanic American to fly in space.
Image credit: NASA
View the full article
-
By NASA
This NASA/ESA Hubble Space Telescope image features a sparkling cloudscape from one of the Milky Way’s galactic neighbors, a dwarf galaxy called the Large Magellanic Cloud. Located 160,000 light-years away in the constellations Dorado and Mensa, the Large Magellanic Cloud is the largest of the Milky Way’s many small satellite galaxies.
This view of dusty gas clouds in the Large Magellanic Cloud is possible thanks to Hubble’s cameras, such as the Wide Field Camera 3 (WFC3) that collected the observations for this image. WFC3 holds a variety of filters, and each lets through specific wavelengths, or colors, of light. This image combines observations made with five different filters, including some that capture ultraviolet and infrared light that the human eye cannot see.
The wispy gas clouds in this image resemble brightly colored cotton candy. When viewing such a vividly colored cosmic scene, it is natural to wonder whether the colors are ‘real’. After all, Hubble, with its 7.8-foot-wide (2.4 m) mirror and advanced scientific instruments, doesn’t bear resemblance to a typical camera! When image-processing specialists combine raw filtered data into a multi-colored image like this one, they assign a color to each filter. Visible-light observations typically correspond to the color that the filter allows through. Shorter wavelengths of light such as ultraviolet are usually assigned blue or purple, while longer wavelengths like infrared are typically red.
This color scheme closely represents reality while adding new information from the portions of the electromagnetic spectrum that humans cannot see. However, there are endless possible color combinations that can be employed to achieve an especially aesthetically pleasing or scientifically insightful image.
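A minimal sketch of that chromatic-ordering idea, using made-up stand-in exposures rather than real Hubble data: each filter is assigned a display color according to its wavelength (shortest toward blue, longest toward red), and the colored layers are summed into one composite.

```python
import numpy as np

# Illustrative only: combine three hypothetical filtered exposures into an RGB
# composite by assigning each filter a color based on its wavelength.

rng = np.random.default_rng(1)
h, w = 64, 64
filters = {              # wavelength in nm -> simulated grayscale exposure
    275: rng.random((h, w)),   # ultraviolet
    555: rng.random((h, w)),   # visible
    814: rng.random((h, w)),   # near-infrared
}

def wavelength_to_rgb(nm):
    """Crude mapping: short wavelengths toward blue, long toward red."""
    if nm < 450:
        return np.array([0.2, 0.2, 1.0])
    if nm < 600:
        return np.array([0.2, 1.0, 0.2])
    return np.array([1.0, 0.2, 0.2])

composite = np.zeros((h, w, 3))
for nm, exposure in filters.items():
    composite += exposure[..., None] * wavelength_to_rgb(nm)

composite /= composite.max()     # normalize for display
print(composite.shape, composite.min(), composite.max())
```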
Learn how Hubble images are taken and processed.
Text credit: ESA/Hubble
Image credit: ESA/Hubble & NASA, C. Murray
View the full article
-
By NASA
6 min read
Advancing new hazard detection and precision landing technologies to help future space missions successfully achieve safe and soft landings is a critical area of space research and development, particularly for future crewed missions. To support this, NASA’s Space Technology Mission Directorate (STMD) is pursuing a regular cadence of flight testing on a variety of vehicles, helping researchers rapidly advance these critical systems for missions to the Moon, Mars, and beyond.
“These flight tests directly address some of NASA’s highest-ranked technology needs, or shortfalls, ranging from advanced guidance algorithms and terrain-relative navigation to lidar- and optical-based hazard detection and mapping,” said Dr. John M. Carson III, STMD technical integration manager for precision landing, who is based at NASA’s Johnson Space Center in Houston.
Since the beginning of this year, STMD has supported flight testing of four precision landing and hazard detection technologies from many sectors, including NASA, universities, and commercial industry. These cutting-edge solutions have flown aboard a suborbital rocket system, a high-speed jet, a helicopter, and a rocket-powered lander testbed. That’s four precision landing technologies tested on four different flight vehicles in four months.
“By flight testing these technologies on Earth in spaceflight-relevant trajectories and velocities, we’re demonstrating their capabilities and validating them with real data for transitioning technologies from the lab into mission applications,” said Dr. Carson. “This work also signals to industry and other partners that these capabilities are ready to push beyond NASA and academia and into the next generation of Moon and Mars landers.”
The following NASA-supported flight tests took place between February and May:
Suborbital Rocket Test of Vision-Based Navigation System
Identifying landmarks to calculate accurate navigation solutions is a key function of Draper’s Multi-Environment Navigator (DMEN), a vision-based navigation and hazard detection technology designed to improve safety and precision of lunar landings.
Aboard Blue Origin’s New Shepard reusable suborbital rocket system, DMEN collected real-world data and validated its algorithms to advance it for use during the delivery of three NASA payloads as part of NASA’s Commercial Lunar Payload Services (CLPS) initiative. On Feb. 4, DMEN performed the latest in a series of tests supported by NASA’s Flight Opportunities program, which is managed at NASA’s Armstrong Flight Research Center in Edwards, California.
During the February flight, which enabled testing at rocket speeds on ascent and descent, DMEN scanned the Earth below, identifying landmarks to calculate an accurate navigation solution. The technology achieved accuracy levels that helped Draper advance it for use in terrain-relative navigation, which is a key element of landing on other planets.
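Draper has not published DMEN's internals, but the general terrain-relative-navigation idea — matching landmarks seen by the vehicle against a map and solving for the pose that best aligns them — can be sketched with a standard Kabsch/Procrustes fit on hypothetical 2D data:

```python
import numpy as np

# Illustrative terrain-relative-navigation sketch (not Draper's DMEN algorithm):
# landmarks detected in the vehicle frame are matched to a map, and the rigid
# transform (heading + position) aligning the two point sets is recovered.

def nav_fix(body_pts, map_pts):
    """Return R, t such that map ~= body @ R.T + t (Kabsch/Procrustes)."""
    bc, mc = body_pts.mean(0), map_pts.mean(0)
    H = (body_pts - bc).T @ (map_pts - mc)          # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mc - R @ bc
    return R, t

# Hypothetical data: true pose is a 30-degree heading at position (120, -45) m.
theta, t_true = np.deg2rad(30), np.array([120.0, -45.0])
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
map_landmarks = np.random.default_rng(2).uniform(-500, 500, (6, 2))
body_landmarks = (map_landmarks - t_true) @ R_true   # what the camera "sees"

R_est, t_est = nav_fix(body_landmarks, map_landmarks)
print(np.round(t_est, 1))                            # ~[120, -45]
```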
New Shepard booster lands during the flight test on February 4, 2025. Credit: Blue Origin
High-Speed Jet Tests of Lidar-Based Navigation
Several highly dynamic maneuvers and flight paths put Psionic’s Space Navigation Doppler Lidar (PSNDL) to the test while it collected navigation data at various altitudes, velocities, and orientations.
Psionic licensed NASA’s Navigation Doppler Lidar technology developed at Langley Research Center in Hampton, Virginia, and created its own miniaturized system with improved functionality and component redundancies, making it more rugged for spaceflight. In February, PSNDL along with a full navigation sensor suite was mounted aboard an F/A-18 Hornet aircraft and underwent flight testing at NASA Armstrong.
The aircraft followed a variety of flight paths over several days, including a large figure-eight loop and several highly dynamic maneuvers over Death Valley, California. During these flights, PSNDL collected navigation data relevant for lunar and Mars entry and descent.
The high-speed flight tests demonstrated the sensor’s accuracy and navigation precision in challenging conditions, helping prepare the technology to land robots and astronauts on the Moon and Mars. These recent tests complemented previous Flight Opportunities-supported testing aboard a lander testbed to advance earlier versions of their PSNDL prototypes.
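The core Doppler-lidar principle behind such sensors can be sketched simply (a generic illustration, not Psionic's design): each beam's frequency shift gives a line-of-sight speed, and three beams along different directions pin down the full velocity vector. The wavelength and beam geometry below are assumed values.

```python
import numpy as np

# Each laser beam measures a Doppler shift f_d, giving line-of-sight speed
# v_los = lambda * f_d / 2. Three non-coplanar beams yield the 3D velocity.

WAVELENGTH = 1.55e-6        # m, a common fiber-laser wavelength (assumed)

beams = np.array([          # hypothetical beam unit vectors in the body frame
    [ 0.26,  0.00, -0.97],
    [-0.13,  0.22, -0.97],
    [-0.13, -0.22, -0.97],
])
beams /= np.linalg.norm(beams, axis=1, keepdims=True)

v_true = np.array([3.0, -1.5, -40.0])            # m/s, descending vehicle
f_shift = 2.0 * (beams @ v_true) / WAVELENGTH    # Hz, what each beam reports

v_los = WAVELENGTH * f_shift / 2.0               # back to line-of-sight speeds
v_est = np.linalg.solve(beams, v_los)            # recover the velocity vector
print(np.round(v_est, 2))                        # ~[3.0, -1.5, -40.0]
```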
The Psionic Space Navigation Doppler Lidar (PSNDL) system is installed in a pod located under the right wing of a NASA F/A-18 research aircraft for flight testing above Death Valley near NASA’s Armstrong Flight Research Center in Edwards, California, in February 2025. Credit: NASA
Helicopter Tests of Real-Time Mapping Lidar
Researchers at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, developed a state-of-the-art Hazard Detection Lidar (HDL) sensor system to quickly map the surface from a vehicle descending at high speed to find safe landing sites in challenging locations, such as Europa (one of Jupiter’s moons), our own Moon, Mars, and other planetary bodies throughout the solar system. The HDL-scanning lidar generates three-dimensional digital elevation maps in real time, processing approximately 15 million laser measurements and mapping two football fields’ worth of terrain in only two seconds.
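A heavily simplified sketch of the underlying idea (not the HDL flight software): bin simulated lidar returns into a digital elevation map, then flag cells whose local slope exceeds a lander's tolerance. The grid size, noise level, and the 10-degree threshold are arbitrary.

```python
import numpy as np

# Bin lidar returns (x, y, z) into a digital elevation map (DEM), then flag
# grid cells whose local slope exceeds an assumed lander tilt tolerance.

rng = np.random.default_rng(3)
n, size, cell = 200_000, 100.0, 1.0               # returns, field size (m), cell (m)
x, y = rng.uniform(0, size, n), rng.uniform(0, size, n)
z = 0.02 * x + rng.normal(0.0, 0.02, n)           # gentle tilt + sensor noise
z[(x - 70)**2 + (y - 30)**2 < 25] += 1.5          # a boulder-sized obstacle

ncell = int(size / cell)
ix = np.clip((x / cell).astype(int), 0, ncell - 1)
iy = np.clip((y / cell).astype(int), 0, ncell - 1)

dem = np.full((ncell, ncell), -np.inf)
np.maximum.at(dem, (ix, iy), z)                   # highest return in each cell

gx, gy = np.gradient(dem, cell)                   # local surface gradients
slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
hazard = slope_deg > 10.0                         # e.g. >10 deg tilt is unsafe
print(f"{hazard.mean():.1%} of cells flagged as hazardous")
```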
In mid-March, researchers tested the HDL from a helicopter at NASA’s Kennedy Space Center in Florida, with flights over a lunar-like test field with rocks and craters. The HDL collected numerous scans from several different altitudes and view angles to simulate a range of landing scenarios, generating real-time maps. Preliminary reviews of the data show excellent performance of the HDL system.
The HDL is a component of NASA’s Safe and Precise Landing – Integrated Capabilities Evolution (SPLICE) technology suite. The SPLICE descent and landing system integrates multiple component technologies, such as avionics, sensors, and algorithms, to enable landing in hard-to-reach areas of high scientific interest. The HDL team is also continuing to test and further improve the sensor for future flight opportunities and commercial applications.
NASA’s Hazard Detection Lidar field test team at Kennedy Space Center’s Shuttle Landing Facility in Florida in March 2025.
Lander Tests of Powered-Descent Guidance Software
Providing pinpoint landing guidance capability with minimum propellant usage, the San Diego State University (SDSU) powered-descent guidance algorithms seek to improve autonomous spacecraft precision landing and hazard avoidance. During a series of flight tests in April and May, supported by NASA’s Flight Opportunities program, the university’s software was integrated into Astrobotic’s Xodiac suborbital rocket-powered lander via hardware developed by Falcon ExoDynamics as part of NASA TechLeap Prize’s Nighttime Precision Landing Challenge.
The SDSU algorithms aim to improve landing capabilities by expanding the flexibility and trajectory-shaping ability and enhancing the propellant efficiency of powered-descent guidance systems. They have the potential for infusion into human and robotic missions to the Moon as well as high-mass Mars missions.
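The SDSU algorithms themselves are not public, but the flavor of powered-descent guidance they refine can be illustrated with the classical zero-effort-miss/zero-effort-velocity (ZEM/ZEV) law, shown here as a rough sketch with assumed lunar-like numbers:

```python
import numpy as np

# Generic ZEM/ZEV powered-descent guidance sketch (not the SDSU algorithms):
# command thrust acceleration toward a target landing position and velocity.

g = np.array([0.0, 0.0, -1.62])          # m/s^2, lunar gravity (assumed scenario)
r = np.array([2000.0, 500.0, 1500.0])    # m, initial position
v = np.array([-60.0, -10.0, -40.0])      # m/s, initial velocity
r_tgt = np.array([0.0, 0.0, 0.0])        # land at the origin
v_tgt = np.array([0.0, 0.0, -1.0])       # touch down at 1 m/s descent rate

t_go, dt = 60.0, 0.1                     # s, time-to-go and integration step
while t_go > dt:
    zem = r_tgt - (r + v * t_go + 0.5 * g * t_go**2)   # zero-effort miss
    zev = v_tgt - (v + g * t_go)                        # zero-effort velocity error
    a_cmd = 6.0 * zem / t_go**2 - 2.0 * zev / t_go      # ZEM/ZEV guidance law
    v += (a_cmd + g) * dt
    r += v * dt
    t_go -= dt

print("final position:", np.round(r, 1), " final velocity:", np.round(v, 2))
```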
As part of a series of tethered and free-flight tests in April and May 2025, algorithms developed by San Diego State University guided the descent of the Xodiac lander testbed vehicle. Credit: Astrobotic
By advancing these and other important navigation, precision landing, and hazard detection technologies with frequent flight tests, NASA’s Space Technology Mission Directorate is prioritizing safe and successful touchdowns in challenging planetary environments for future space missions.
Learn more: https://www.nasa.gov/space-technology-mission-directorate/
By: Lee Ann Obringer
NASA’s Flight Opportunities program
View the full article
-