David Morris and his son were walking along a beach near Gillan in England when they noticed a huge ship apparently floating in the sky. Although they were stunned by the strange sight, they say they had seen this phenomenon before. 


Credit image: David Morris.

It turns out they had witnessed a so-called superior mirage, an optical illusion caused by the weather condition known as a temperature inversion, in which a layer of warm air sits above colder air near the surface. 
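A rough way to see why an inversion lifts the image: the refractive index of air falls as temperature rises, so with warm air sitting above cold air the index decreases with height and light rays curve downward toward the denser air, making a distant ship appear above the horizon. A minimal sketch, using a simple textbook approximation for the refractive index of air (n ≈ 1 + 7.86e-4·P/T with pressure in kPa and temperature in kelvin; the coefficient is an approximation, not a precise standard value):

```python
def air_refractive_index(pressure_kpa, temp_k):
    # Simple approximation for visible light: n ~ 1 + 7.86e-4 * P / T
    # (P in kilopascals, T in kelvin). Warmer air -> lower index.
    return 1.0 + 7.86e-4 * pressure_kpa / temp_k

# Temperature inversion: cold air at the sea surface, warmer air above.
n_cold_surface = air_refractive_index(101.3, 283.0)  # ~10 C near the water
n_warm_aloft = air_refractive_index(101.3, 293.0)    # ~20 C inversion layer

# The index decreases with height, so rays bend downward and the
# image of the ship is displaced upward -- a superior mirage.
assert n_warm_aloft < n_cold_surface
```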

Although most of these sightings are optical illusions, not all such appearances can be explained. Apparitions and similar unexplained events are often interpreted as "holographic" displays from a parallel universe or as glitches in reality. 

Glitches in reality might be among the weirdest phenomena reported today. Also known as 'glitches in the matrix,' these unusual events are often cited as evidence for the theory that our existence may be nothing more than part of a computer simulation, a simulated reality that keeps humanity unaware that intelligent machines have actually taken over the world. In other words, the plot of the movie The Matrix is our reality.

 

View the full article


  • Similar Topics

    • By European Space Agency
      Video: 00:15:30 Meet Arnaud Prost—aerospace engineer, professional diver, and member of ESA’s Astronaut Reserve. From flying aircraft to getting a taste of spacewalk simulation, his passion for exploration knows no bounds.
      In this miniseries, we take you on a journey through the ESA Astronaut Reserve, diving into the first part of their Astronaut Reserve Training (ART) at the European Astronaut Centre (EAC) near Cologne, Germany. Our “ARTists” are immersing themselves in everything from ESA and the International Space Station programme to the European space industry and institutions. They’re gaining hands-on experience in technical skills like spacecraft systems and robotics, alongside human behaviour, scientific lessons, scuba diving, and survival training.
      ESA’s Astronaut Reserve Training programme is all about building Europe’s next generation of space explorers—preparing them for the opportunities of future missions in Earth orbit and beyond.
      This interview was recorded in November 2024. 
      You can listen to this episode on all major podcast platforms.
      Keep exploring with ESA Explores!
      Learn more about Arnaud’s PANGAEA training here.
      View the full article
    • By NASA
      2 min read
      Preparations for Next Moonwalk Simulations Underway (and Underwater)
      A NASA F/A-18 research aircraft flies above California near NASA’s Armstrong Flight Research Center in Edwards, California, testing a commercial precision landing technology for future space missions. The Psionic Space Navigation Doppler Lidar (PSNDL) system is installed in a pod located under the right wing of the aircraft. Credit: NASA
      Nestled in a pod under an F/A-18 Hornet aircraft wing, flying above California, and traveling up to the speed of sound, NASA put a commercial sensor technology to the test. The flight tests demonstrated the sensor accuracy and navigation precision in challenging conditions, helping prepare the technology to land robots and astronauts on the Moon and Mars. 
      The Psionic Space Navigation Doppler Lidar (PSNDL) system is rooted in NASA technology that Psionic, Inc. of Hampton, Virginia, licensed and further developed. They miniaturized the NASA technology, added further functionality, and incorporated component redundancies that make it more rugged for spaceflight. The PSNDL navigation system also includes cameras and an inertial measurement unit to make it a complete navigation system capable of accurately determining a vehicle’s position and velocity for precision landing and other spaceflight applications. 
      NASA engineers and technicians install the Psionic Space Navigation Doppler Lidar (PSNDL) system into a testing pod on a NASA F/A-18 research aircraft ahead of February 2025 flight tests at NASA’s Armstrong Flight Research Center in Edwards, California. Credit: NASA
      The aircraft departed from NASA’s Armstrong Flight Research Center in Edwards, California, and conducted a variety of flight paths over several days in February 2025. It flew a large figure-8 loop and conducted several highly dynamic maneuvers over Death Valley, California, to collect navigation data at various altitudes, velocities, and orientations relevant for lunar and Mars entry and descent. Refurbished for these tests, the NASA F/A-18 pod can support critical data collection for other technologies and users at a low cost. 
      Doppler Lidar sensors provide a highly accurate measurement of speed by measuring the frequency shift between the laser light emitted from the sensor and the light reflected from the ground. Lidar is extremely useful in sunlight-challenged areas that may have long shadows and stark contrasts, such as the lunar South Pole. Pairing PSNDL with cameras adds the ability to visually compare pictures with surface reconnaissance maps of rocky terrain and navigate to landing at interesting locations on Mars. All the data is fed into a computer that makes quick, real-time decisions to enable precise touchdowns at safe locations. 
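As a sketch of the arithmetic behind a Doppler lidar speed measurement: for light reflected off a surface, the round trip doubles the Doppler shift, so Δf = 2v/λ and the radial velocity is v = Δf·λ/2. The 1550 nm wavelength and 1 MHz shift below are illustrative assumptions, not published PSNDL parameters:

```python
def radial_velocity(freq_shift_hz, wavelength_m=1.55e-6):
    # Round-trip Doppler shift for reflected light: delta_f = 2 * v / lambda,
    # so the radial velocity is v = delta_f * lambda / 2.
    return freq_shift_hz * wavelength_m / 2.0

# A 1 MHz measured shift at an assumed 1550 nm lidar wavelength:
v = radial_velocity(1.0e6)  # 0.775 m/s toward (or away from) the ground
```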
      Psionic Space Navigation Doppler Lidar (PSNDL) system installed in a testing pod on a NASA F/A-18 research aircraft ahead of February 2025 flight tests at NASA’s Armstrong Flight Research Center in Edwards, California. Credit: NASA
      Since licensing NDL in 2016, Psionic has received funding and development support from NASA’s Space Technology Mission Directorate through its Small Business Innovative Research program and Tipping Point initiative. The company has also tested PSNDL prototypes on suborbital vehicles via the Flight Opportunities program. In 2024, onboard a commercial lunar lander, NASA successfully demonstrated the predecessor NDL system developed by the agency’s Langley Research Center in Hampton, Virginia. 
      Explore More
      4 min read NASA Starling and SpaceX Starlink Improve Space Traffic Coordination
      Article 10 mins ago 6 min read How NASA’s Perseverance Is Helping Prepare Astronauts for Mars
      Article 36 mins ago 2 min read NASA Cloud Software Helps Companies Find their Place in Space 
      Article 20 hours ago
      Details
      Last Updated: Mar 26, 2025
      Editor: Loura Hall
      Related Terms
      Armstrong Flight Research Center Game Changing Development Program Space Communications Technology Space Technology Mission Directorate Technology Technology for Living in Space Technology for Space Travel View the full article
    • By NASA
      If you design a new tool for use on Earth, it is easy to test and practice using that tool in its intended environment. But what if that tool is destined for lunar orbit or will be used by astronauts on the surface of the Moon?

      NASA’s Simulation and Graphics Branch can help with that. Based at Johnson Space Center in Houston, the branch’s high-fidelity, real-time graphical simulations support in-depth engineering analyses and crew training, ensuring the safety, efficiency, and success of complex space endeavors before execution. The team manages multiple facilities that provide these simulations, including the Prototype Immersive Technologies (PIT) Lab, Virtual Reality Training Lab, and the Systems Engineering Simulator (SES).

      Lee Bingham is an aerospace engineer on the simulation and graphics team. His work includes developing simulations and visualizations for the NASA Exploration Systems Simulations team and providing technical guidance on simulation and graphics integration for branch-managed facilities. He also leads the branch’s human-in-the-loop Test Sim and Graphics Team, the Digital Lunar Exploration Sites Unreal Simulation Tool (DUST), and the Lunar Surface Mixed-Reality with the Active Response Gravity Offload System (ARGOS) projects.

      Lee Bingham demonstrates a spacewalk simulator for the Gateway lunar space station during NASA’s Tech Day on Capitol Hill in Washington, D.C. Image courtesy of Lee Bingham
      Bingham is particularly proud of his contributions to DUST, which provides a 3D visualization of the Moon’s South Pole and received Johnson’s Exceptional Software of the Year Award in 2024. “It was designed for use as an early reference to enable candidate vendors to perform initial studies of the lunar terrain and lighting in support of the Strategy and Architecture Office, human landing system, and the Extravehicular Activity and Human Surface Mobility Program,” Bingham explained. DUST has supported several human-in-the-loop studies for NASA. It has also been shared with external collaborators and made available to the public through the NASA Software Catalog.  

      Bingham has kept busy during his nearly nine years at Johnson and said learning to manage and balance support for multiple projects and customers was very challenging at first. “I would say ‘yes’ to pretty much anything anyone asked me to do and would end up burning myself out by working extra-long hours to meet milestones and deliverables,” he said. “It has been important to maintain a good work-life balance and avoid overcommitting myself while meeting demanding expectations.”

      Lee Bingham tests the Lunar Surface Mixed Reality and Active Response Gravity Offload System trainer at Johnson Space Center. Image courtesy of Lee Bingham
      Bingham has also learned the importance of teamwork and collaboration. “You can’t be an expert at everything or do everything yourself,” he said. “Develop your skills, practice them regularly, and master them over time but be willing to ask for help and advice. And be sure to recognize and acknowledge your coworkers and teammates when they go above and beyond or achieve something remarkable.”

      Lee Bingham (left) demonstrates a lunar rover simulator for Apollo 16 Lunar Module Pilot Charlie Duke. Image courtesy of Lee Bingham
      He hopes that the Artemis Generation will be motivated to tackle difficult challenges and further NASA’s mission to benefit humanity. “Be sure to learn from those who came before you, but be bold and unafraid to innovate,” he advised.
      View the full article
    • By NASA
      Tess Caswell, a stand-in crew member for the Artemis III Virtual Reality Mini-Simulation, executes a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. The simulation was a test of using VR as a training method for flight controllers and science teams’ collaboration on science-focused traverses on the lunar surface. Credit: NASA/Robert Markowitz
      When astronauts walk on the Moon, they’ll serve as the eyes, hands, and boots-on-the-ground interpreters supporting the broader teams of scientists on Earth. NASA is leveraging virtual reality to provide high-fidelity, cost-effective support to prepare crew members, flight control teams, and science teams for a return to the Moon through its Artemis campaign.
      The Artemis III Geology Team, led by principal investigator Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, participated in an Artemis III Surface Extra-Vehicular VR Mini-Simulation, or “sim” at NASA’s Johnson Space Center in Houston in the fall of 2024. The sim brought together science teams and flight directors and controllers from Mission Control to carry out science-focused moonwalks and test the way the teams communicate with each other and the astronauts.
      “There are two worlds colliding,” said Dr. Matthew Miller, co-lead for the simulation and exploration engineer, Amentum/JETSII contract with NASA. “There is the operational world and the scientific world, and they are becoming one.”
      NASA mission training can include field tests covering areas from navigation and communication to astronaut physical and psychological workloads. Many of these tests take place in remote locations and can require up to a year to plan and large teams to execute. VR may provide an additional option for training that can be planned and executed more quickly to keep up with the demands of preparing to land on the Moon in an environment where time, budgets, and travel resources are limited.
      VR helps us break down some of those limitations and allows us to do more immersive, high-fidelity training without having to go into the field. It provides us with a lot of different, and significantly more, training opportunities.
      BRI SPARKS
      NASA co-lead for the simulation and Extra Vehicular Activity Extended Reality team at Johnson.
      Field testing won’t be going away. Nothing can fully replace the experience crew members gain by being in an environment that puts literal rocks in their hands and includes the physical challenges that come with moonwalks, but VR has competitive advantages.
      The virtual environment used in the Artemis III VR Mini-Sim was built using actual lunar surface data from one of the Artemis III candidate regions. This allowed the science team to focus on Artemis III science objectives and traverse planning directly applicable to the Moon. Eddie Paddock, engineering VR technical discipline lead at NASA Johnson, and his team used data from NASA’s Lunar Reconnaissance Orbiter and planet position and velocity over time to develop a virtual software representation of a site within the Nobile Rim 1 region near the south pole of the Moon. Two stand-in crew members performed moonwalk traverses in virtual reality in the Prototype Immersive Technology lab at Johnson, and streamed suit-mounted virtual video camera views, hand-held virtual camera imagery, and audio to another location where flight controllers and science support teams simulated ground communications.
      A screen capture of a virtual reality view during the Artemis III VR Mini-Simulation. The lunar surface virtual environment was built using actual lunar surface data from one of the Artemis III candidate regions. Credit: Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston.
      The crew stand-ins were immersed in the lunar environment and could then share the experience with the science and flight control teams. That quick and direct feedback could prove critical to the science and flight control teams as they work to build cohesive teams despite very different approaches to their work.
      The flight operations team and the science team are learning how to work together and speak a shared language. Both teams are pivotal parts of the overall mission operations. The flight control team focuses on maintaining crew and vehicle safety and minimizing risk as much as possible. The science team, as Miller explains, is “relentlessly thirsty” for as much science as possible. Training sessions like this simulation allow the teams to hone their relationships and processes.
      Members of the Artemis III Geology Team and science support team work in a mock Science Evaluation Room during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Video feeds from the stand-in crew members’ VR headsets allow the science team to follow, assess, and direct moonwalks and science activities. Credit: NASA/Robert Markowitz
      Denevi described the flight control team as a “well-oiled machine” and praised their dedication to getting it right for the science team. Many members of the flight control team have participated in field and classroom training to learn more about geology and better understand the science objectives for Artemis.
      “They have invested a lot of their own effort into understanding the science background and science objectives, and the science team really appreciates that and wants to make sure they are also learning to operate in the best way we can to support the flight control team, because there’s a lot for us to learn as well,” Denevi said. “It’s a joy to get to share the science with them and have them be excited to help us implement it all.”
      Artemis III Geology Team lead Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, left, Artemis III Geology Team member, Dr. Jose Hurtado, University of Texas at El Paso, and simulation co-lead, Bri Sparks, work together during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      This simulation, Sparks said, was just the beginning for how virtual reality could supplement training opportunities for Artemis science. In the future, using mixed reality could help take the experience to the next level, allowing crew members to be fully immersed in the virtual environment while interacting with real objects they can hold in their hands. Now that the Nobile Rim 1 landing site is built in VR, it can continue to be improved and used for crew training, something that Sparks said can’t be done with field training on Earth.
      While “virtual” was part of the title for this exercise, its applications are very real.
      “We are uncovering a lot of things that people probably had in the back of their head as something we’d need to deal with in the future,” Miller said. “But guess what? The future is now. This is now.”
      Test subject crew members for the Artemis III Virtual Reality Mini-Simulation, including Grier Wilt, left, and Tess Caswell, center, execute a moonwalk in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Grier Wilt, left, and Tess Caswell, crew stand-ins for the Artemis III Virtual Reality Mini-Simulation, execute a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
      Engineering VR technical discipline lead Eddie Paddock works with team members to facilitate the virtual reality components of the Artemis III Virtual Reality Mini-Simulation in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: Robert Markowitz
      Flight director Paul Konyha follows moonwalk activities during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz




      Rachel Barry
      NASA’s Johnson Space Center
      Keep Exploring Discover More Topics From NASA
      Astromaterials



      Artemis Science


      A Time Capsule The Moon is a 4.5-billion-year-old time capsule, pristinely preserved by the cold vacuum of space. It is…


      Lunar Craters


      Earth’s Moon is covered in craters. Lunar craters tell us the history not only of the Moon, but of our…


      Solar System


      View the full article
    • By NASA
      3 Min Read March’s Night Sky Notes: Messier Madness
      Showing a large portion of M66, this Hubble photo is a composite of images obtained at visible and infrared wavelengths. The images have been combined to represent the real colors of the galaxy. Credits:
      NASA, ESA and the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration; Acknowledgment: Davide De Martin and Robert Gendler by Kat Troche of the Astronomical Society of the Pacific
      What Are Messier Objects?
      During the 18th century, astronomer and comet hunter Charles Messier wanted to distinguish the ‘faint fuzzies’ he observed from any potential new comets. As a result, Messier cataloged 110 objects in the night sky, ranging from star clusters to galaxies to nebulae. These items are designated by the letter ‘M’ and a number. For example, the Orion Nebula is Messier 42 or M42, and the Pleiades are Messier 45 or M45. These are among the brightest ‘faint fuzzies’ we can see with modest backyard telescopes and some even with our eyes.
      Stargazers can catalog these items on evenings closest to the new moon. Some even go as far as having “Messier Marathons,” setting up their telescopes and binoculars in the darkest skies available to them, from sundown to sunrise, to catch as many as possible. Here are some items to look for this season:
      M44 in Cancer and M65 and 66 in Leo can be seen high in the evening sky 60 minutes after sunset. Stellarium Web
      Messier 44 in Cancer: The Beehive Cluster, also known as Praesepe, is an open star cluster in the heart of the Cancer constellation. Use Pollux in Gemini and Regulus in Leo as guide stars. A pair of binoculars is enough to view this and other open star clusters. If you have a telescope handy, pay a visit to two of the three galaxies that form the Leo Triplet – M65 and M66. These items can be seen one hour after sunset in dark skies.
      Locate M3 and M87 rising in the east after midnight. Stellarium Web
      Messier 3 in Canes Venatici: M3 is a globular cluster of 500,000 stars. Through a telescope, this object looks like a fuzzy, sparkly ball. You can resolve this cluster with an 8-inch telescope under moderately dark skies. You can find this star cluster by using the star Arcturus in the Boötes constellation as a guide.
      Messier 87 in Virgo: Located just outside of Markarian’s Chain, M87 is an elliptical galaxy that can be spotted during the late evening hours. While it is not possible to view the supermassive black hole at the core of this galaxy, you can see M87 and several other Messier-labeled galaxies in the Virgo Cluster using a medium-sized telescope.
      Locate M76 and M31 setting in the west, 60 minutes after sunset. Stellarium Web
      Plan Ahead
      When gearing up for a long stargazing session, there are several things to remember, such as equipment, location, and provisions:
      • Do you have enough layers to be outdoors for several hours? You would be surprised how cold it can get when sitting or standing still behind a telescope!
      • Are your batteries fully charged? If your telescope runs on power, be sure to charge everything before you leave home and pack any additional batteries for your cell phone. Most people use their mobile devices for astronomy apps, so their batteries may deplete faster. Cold weather can also impact battery life.
      • Determine the apparent magnitude of what you are trying to see and the limiting magnitude of your night sky. You can learn more about apparent and limiting magnitudes with our Check Your Sky Quality with Orion article.
      • When choosing a location to observe from, select an area you are familiar with and bring some friends! You can also connect with your local astronomy club to see if they are hosting any Messier Marathons. It’s always great to share the stars!
      You can see all 110 items and their locations with NASA’s Explore the Night Sky interactive map and the Hubble Messier Catalog, objects that have been imaged by the Hubble Space Telescope.
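As a back-of-the-envelope illustration of the apparent vs. limiting magnitude check described above (the magnitude values are illustrative, not taken from the article): each 5-magnitude step corresponds to a factor of 100 in brightness, and an object is only detectable when its apparent magnitude does not exceed your sky's limiting magnitude.

```python
def flux_ratio(m1, m2):
    # Each 5-magnitude step is a factor of 100 in brightness;
    # a lower magnitude means a brighter object.
    return 100.0 ** ((m2 - m1) / 5.0)

def is_visible(apparent_mag, limiting_mag):
    # Marginally detectable if the object is at least as bright
    # as the faintest star your sky conditions allow.
    return apparent_mag <= limiting_mag

# M44 at roughly magnitude 3.7 under a suburban sky with limit ~5.0:
is_visible(3.7, 5.0)   # visible
is_visible(8.6, 5.0)   # too faint for that sky; seek darker skies or optics
```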
      View the full article