  • Similar Topics

    • By NASA
      2 min read
      Sols 4454-4457: Getting Ready to Fill the Long Weekend with Science
      NASA’s Mars rover Curiosity acquired this image, which includes the pyramid-shaped rock at left (the science target dubbed “Pyramid Lake”), using its Left Navigation Camera. The rover acquired the image on sol 4452, or Martian day 4,452 of the Mars Science Laboratory mission, on Feb. 13, 2025, at 14:22:06 UTC. NASA/JPL-Caltech
      Earth planning date: Friday, Feb. 14, 2025
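      The caption’s pairing of sol 4452 with Feb. 13, 2025 can be sanity-checked by hand: a Martian sol lasts about 88,775.244 seconds, so multiplying the sol number by the sol length and adding it to the mission’s landing epoch gives the approximate Earth date. Here is a minimal Python sketch of that arithmetic; the sol-0 epoch used below is an assumption for illustration, not a value from this post.

      ```python
      from datetime import datetime, timedelta, timezone

      SOL_SECONDS = 88775.244  # one Martian sol: about 24 h 39 m 35 s of Earth time

      # Assumed epoch: MSL landing, around the start of sol 0 (Aug. 6, 2012, ~05:17 UTC).
      MSL_SOL0 = datetime(2012, 8, 6, 5, 17, tzinfo=timezone.utc)

      def sol_to_utc(sol: float) -> datetime:
          """Approximate Earth UTC time at the start of a given MSL sol."""
          return MSL_SOL0 + timedelta(seconds=sol * SOL_SECONDS)

      print(sol_to_utc(4452))  # lands within about a day of Feb. 13, 2025
      ```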
      Curiosity is continuing to make progress along the strategic route, traversing laterally across the sulfate-bearing (salt) unit toward the boxwork structures. The team celebrated the completion of another successful drive when we received the downlink this morning, and then we immediately got to work thinking about what’s next. There is a holiday in the United States on Monday, so instead of the typical three-sol weekend plan, we planned four sols, which will set us up to return to planning next Tuesday.
      The first sol of the plan focuses on remote sensing, and we’ll be taking several small Mastcam mosaics of features around the rover. One of my favorite targets the team picked is a delightfully pointy rock visible toward the left of the Navcam image shown above. The color images we’ll take with Mastcam will give us more information about the textures of this rock and potentially provide insight into the geologic forces that transformed it into this comical shape. The team chose what I think is a very appropriate name for this Martian pyramid-shaped target — “Pyramid Lake.” The terrestrial inspiration behind this name is a human-made reservoir (lake) near Los Angeles with a big (also human-made) pyramidal hill in it.
      On the second sol of the plan, we’ll use the instruments on Curiosity’s arm to collect data on rock targets at our feet, including “Strawberry Peak,” a bumpy piece of bedrock; “Lake Arrowhead,” a smooth piece of bedrock; and “Skyline Trail,” a dark float rock. ChemCam will also collect chemical data from Skyline Trail, “Big Tujunga” — which is similar to Strawberry Peak — and “Momyer.” We’ll also take the first part of a 360-degree color mosaic with Mastcam!
      In the third sol of the plan, we’ll complete the 360-degree mosaic and continue driving to the southwest along our strategic route. The fourth sol is pretty quiet, with some atmospheric observations and a ChemCam AEGIS. Atmospheric observations are additionally sprinkled throughout other sols of the plan. This time of year we are particularly interested in studying the clouds above Gale crater!
      I’m looking forward to the nice long weekend, and returning on Tuesday morning to see everything Curiosity accomplished.
      Written by Abigail Fraeman, Planetary Geologist at NASA’s Jet Propulsion Laboratory
      View the full article
    • By NASA
      An image of a coastal marshland combines aerial and satellite views in a technique similar to hyperspectral imaging. Combining data from multiple sources gives scientists information that can support environmental management. John Moisan
      When it comes to making real-time decisions about unfamiliar data – say, choosing a path to hike up a mountain you’ve never scaled before – existing artificial intelligence and machine learning tech doesn’t come close to measuring up to human skill. That’s why NASA scientist John Moisan is developing an AI “eye.”
      Oceanographer John Moisan. NASA
      Moisan, an oceanographer at NASA’s Wallops Flight Facility near Chincoteague, Virginia, said AI will direct his A-Eye, a movable sensor. After analyzing images, his AI would not just find known patterns in new data, but also steer the sensor to observe and discover new features or biological processes.
      “A truly intelligent machine needs to be able to recognize when it is faced with something truly new and worthy of further observation,” Moisan said. “Most AI applications are mapping applications trained with familiar data to recognize patterns in new data. How do you teach a machine to recognize something it doesn’t understand, stop and say ‘What was that? Let’s take a closer look.’ That’s discovery.”
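      One common way to give a machine that “stop and take a closer look” reflex is novelty detection: fit a model to familiar data, then score each new sample by how poorly the model explains it. The Python sketch below uses PCA reconstruction error over feature vectors; it is an illustrative stand-in under assumed data shapes and thresholds, not the A-Eye’s actual algorithm.

      ```python
      import numpy as np

      def fit_pca(familiar, n_components=8):
          """Fit a PCA basis to familiar samples (rows = samples, cols = features)."""
          mean = familiar.mean(axis=0)
          _, _, vt = np.linalg.svd(familiar - mean, full_matrices=False)
          return mean, vt[:n_components]

      def novelty_scores(samples, mean, basis):
          """Reconstruction error: high means the familiar basis can't explain it."""
          centered = samples - mean
          residual = centered - centered @ basis.T @ basis
          return np.linalg.norm(residual, axis=1)

      rng = np.random.default_rng(0)
      familiar = rng.normal(size=(500, 32))   # stand-in for known spectral pixels
      new = rng.normal(size=(10, 32))
      new[0] += 5.0                           # one injected never-seen-before sample

      mean, basis = fit_pca(familiar)
      baseline = novelty_scores(familiar, mean, basis)
      threshold = baseline.mean() + 3 * baseline.std()
      print(np.where(novelty_scores(new, mean, basis) > threshold)[0])  # -> [0]
      ```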
      Finding and identifying new patterns in complex data is still the domain of human scientists, and how humans see plays a large part, said Goddard AI expert James MacKinnon. Scientists analyze large data sets by looking at visualizations that can help bring out relationships between different variables within the data.
      Infrared images like this one from a marsh area on the Maryland/Virginia Eastern Shore coastal barrier and back bay regions reveal clues to scientists about plant health, photosynthesis, and other conditions that affect vegetation and ecosystems. John Moisan
      It’s another story to train a computer to look at large data streams in real time to see those connections, MacKinnon said, especially when looking for correlations and inter-relationships that the computer hasn’t been trained to identify.
      Moisan intends first to set his A-Eye on interpreting images from Earth’s complex aquatic and coastal regions. He expects to reach that goal this year, training the AI using observations from prior flights over the Delmarva Peninsula. Follow-up funding would help him complete the optical pointing goal.
      “How do you pick out things that matter in a scan?” Moisan asked. “I want to be able to quickly point the A-Eye at something swept up in the scan, so that from a remote area we can get whatever we need to understand the environmental scene.” 
      Moisan’s on-board AI would scan the collected data in real-time to search for significant features, then steer an optical sensor to collect more detailed data in infrared and other frequencies. 
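      The “then steer an optical sensor” step can be sketched as simple geometry: map the flagged pixel’s position in the scan to pan/tilt offsets from the sensor’s boresight. Everything below – the field-of-view values and the linear small-angle mapping – is a hypothetical illustration, not the A-Eye’s pointing model.

      ```python
      def pixel_to_pointing(row, col, n_rows, n_cols,
                            fov_along_deg=30.0, fov_cross_deg=40.0):
          """Map a flagged pixel to (pan, tilt) offsets in degrees from boresight,
          assuming a linear mapping across an assumed rectangular field of view."""
          tilt = (row / (n_rows - 1) - 0.5) * fov_along_deg
          pan = (col / (n_cols - 1) - 0.5) * fov_cross_deg
          return pan, tilt

      print(pixel_to_pointing(120, 600, 512, 1024))  # about (3.46, -7.96)
      ```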
      Thinking machines may be set to play a larger role in future exploration of our universe. Sophisticated computers taught to recognize chemical signatures that could indicate life processes, or landscape features like lava flows or craters, could increase the value of science data returned from lunar or deep-space exploration.
      Today’s state-of-the-art AI is not quite ready to make mission-critical decisions, MacKinnon said.
      “You need some way to take a perception of a scene and turn that into a decision and that’s really hard,” he said. “The scary thing, to a scientist, is to throw away data that could be valuable. An AI might prioritize what data to send first or have an algorithm that can call attention to anomalies, but at the end of the day, it’s going to be a scientist looking at that data that results in discoveries.” 
      View the full article
    • By NASA
      5 min read
      February’s Night Sky Notes: How Can You Help Curb Light Pollution?
      Light pollution has long troubled astronomers, who generally shy away from deep-sky observing under full Moon skies. The natural light from a bright Moon floods the sky and hides views of the Milky Way, dim galaxies and nebulae, and shooting stars. In recent years, human-made light pollution has dramatically surpassed the interference of even a bright full Moon, and its effects are now noticeable to a great many people outside of the astronomical community. Harsh, bright white LED streetlights, while often more efficient and longer-lasting, can create unexpected problems for communities replacing their old street lamps. Some notable concerns are increased glare and light trespass, less restful sleep, and disturbed nocturnal wildlife patterns. There is increasing awareness of just how much light is too much light at night. You don’t need to give in to despair over encroaching light pollution; you can join efforts to measure it, educate others, and even help stop or reduce its effects in your community.
      Before and after pictures of replacement lighting at the 6th Street Bridge over the Los Angeles River. The second picture shows improvements in some aspects of light pollution, as light is no longer directed to the sides and upwards from the upgraded fixtures, reducing skyglow. However, it also shows the use of brighter, whiter LEDs, which is not generally ideal, along with increased light bounce-back from the road. City of Los Angeles
      Amateur astronomers and potential citizen scientists around the globe are invited to participate in the Globe at Night (GaN) program to measure light pollution. Measurements are taken by volunteers on a few scheduled days every month and submitted to the GaN database to help create a comprehensive map of light pollution and its change over time. GaN volunteers can take and submit measurements using methods ranging from low-tech naked-eye observations to high-tech sensors and smartphone apps.
      Globe at Night citizen scientists can use the following methods to measure light pollution and submit their results:
      • Their own smartphone camera and the dedicated app
      • Manual measurement using their own eyes and detailed charts of the constellations
      • A dedicated light pollution measurement device called a Sky Quality Meter (SQM)
      • The free GaN web app from any internet-connected device (which can also be used to submit measurements from an SQM or printed-out star charts)
      Night Sky Network members joined a telecon with Connie Walker of Globe at Night in 2014 and had a lively discussion about the program’s history and how they can participate. The audio of the telecon, a transcript, and links to additional resources can be found on their dedicated resource page.
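      For anyone curious what an SQM reading means: the meter reports sky brightness in magnitudes per square arcsecond, a logarithmic scale on which lower numbers mean a brighter sky and every 2.5-magnitude drop is a factor of 10 in brightness. Here is a small Python sketch of the conversion; the 22.0 reference value for a near-pristine site is a commonly quoted assumption, not a Globe at Night figure.

      ```python
      PRISTINE = 22.0  # assumed zenith brightness (mag/arcsec^2) of a near-pristine sky

      def times_brighter(sqm_reading, reference=PRISTINE):
          """How many times brighter the measured sky is than the reference."""
          return 10 ** ((reference - sqm_reading) / 2.5)

      print(f"{times_brighter(18.0):.0f}x")  # a bright suburban sky: ~40x pristine
      ```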
      Light pollution has been visible from space for a long time, but new LED lights are bright enough that they stand out from older street lights, even from orbit. The above photo was taken by astronaut Samantha Cristoforetti from the ISS cupola in 2015. The newly installed white LED lights in the center of the city of Milan are noticeably brighter than the lights in the surrounding neighborhoods. NASA/ESA
      DarkSky International has long been a champion in the fight against light pollution and a proponent of smart lighting design and policy. Their website (darksky.org) provides many resources to help amateur astronomers and other like-minded people show their communities the negative impacts of light pollution, and how smart lighting policies can not only bring the stars back to the night sky but also make streets safer by reducing glare. Communities and individuals find that they can save considerable sums of money when they light their streets and homes “smarter, not brighter,” using shielded, directional lighting, motion detectors, timers, and LED replacements with a color “temperature” chosen to avoid the harsh “pure white” glare that many new streetlamps possess. Their pages on community advocacy and on how to choose dark-sky-friendly lighting are extremely helpful and full of great information. There are even local DarkSky chapters in many communities, made up of passionate advocates for dark skies.
      DarkSky International has notably helped usher in “Dark Sky Places”, areas around the world that are protected from light pollution. “Dark Sky Parks”, in particular, provide visitors with incredible views of the Milky Way and are perfect places to spot the wonders of a meteor shower. These parks also perform a very important function, showing the wonders of a truly dark sky to many people who may never before have seen even a handful of stars in the sky, let alone the full, glorious spread of the Milky Way.
      More research than ever is being conducted into the negative effects of light pollution on human health and the environment. Watching nighttime light slowly increase in your neighborhood, combined with reading so much bad news, can indeed be disheartening! But as awareness of light pollution and its effects grows, more people want to be part of the solution. There is even an episode of PBS Kids’ SciGirls where the main characters help mitigate light pollution in their neighborhood!
      Astronomy clubs are uniquely situated to spread awareness of good lighting practices in their local communities and so help mitigate light pollution. Take inspiration from Tucson, Arizona, and other dark-sky-friendly communities that have adopted good lighting practices: Tucson reduced its skyglow by 7% after its citywide lighting conversion, proof that communities can bring the stars back with smart lighting choices.
      Originally posted by Dave Prosper: November 2018
      Last Updated by Kat Troche: January 2025
      View the full article
    • By European Space Agency
      Video: 00:01:20 Listen to the ESA/JAXA BepiColombo spacecraft as it flew past Mercury on 8 January 2025. This sixth and final flyby used the little planet's gravity to steer the spacecraft on course for entering orbit around Mercury in 2026. 
      What you can hear in the sonification soundtrack of this video are real spacecraft vibrations measured by the Italian Spring Accelerometer (ISA) instrument. The accelerometer data have been shifted in frequency to make them audible to human ears – one hour of measurements has been sped up into one minute of sound.
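      That one-hour-into-one-minute compression is itself the frequency shift: playing samples back 60 times faster multiplies every vibration frequency by 60. Below is a minimal Python sketch of the idea using synthetic data; the capture rate, test tone, and file name are assumptions for illustration, not ISA specifics.

      ```python
      import numpy as np
      import wave

      SPEEDUP = 60      # one hour of data -> one minute of sound
      CAPTURE_HZ = 100  # assumed accelerometer sampling rate (illustrative)

      # Stand-in for one hour of accelerometer samples: a quiet 0.5 Hz hum.
      t = np.arange(3600 * CAPTURE_HZ) / CAPTURE_HZ
      signal = 0.1 * np.sin(2 * np.pi * 0.5 * t)

      # Writing the samples at 60x their capture rate shifts 0.5 Hz up to 30 Hz.
      pcm = np.int16(signal / np.abs(signal).max() * 32767)
      with wave.open("bepi_sonified.wav", "wb") as w:
          w.setnchannels(1)
          w.setsampwidth(2)                     # 16-bit samples
          w.setframerate(CAPTURE_HZ * SPEEDUP)  # 6000 Hz playback rate
          w.writeframes(pcm.tobytes())          # 60 seconds of audio
      ```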
      BepiColombo is always shaking ever so slightly: fuel is slightly sloshing, the solar panels are vibrating at their natural frequency, heat pipes are pushing vapour through small tubes, and so forth. This creates the eerie underlying hum throughout the video.  
      But as BepiColombo gets closer to Mercury, ISA detects other forces acting on the spacecraft. Most scientifically interesting are the audible shocks that sound like short, soft bongs. These are caused by the spacecraft responding to entering and exiting Mercury's shadow, where the Sun's intense radiation is suddenly blocked. One of ISA's scientific goals is to monitor the changes in the ‘solar radiation pressure’ – a force caused by sunlight striking BepiColombo as it orbits the Sun and, eventually, Mercury. 
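      The force being monitored is small but easy to estimate: sunlight carries momentum, so a sunlit area A in solar flux S feels roughly F = (1 + ρ)·S·A/c, and the flux grows as 1/r² on the way in toward Mercury. A rough Python sketch follows; the ~50 m² cross-section and the reflectivity ρ are assumptions for illustration.

      ```python
      C = 299_792_458.0  # speed of light, m/s
      S_1AU = 1361.0     # solar flux at 1 au, W/m^2

      def radiation_force(area_m2, r_au, reflectivity=0.5):
          """Radiation-pressure force (N) on a sunlit flat panel (toy model)."""
          flux = S_1AU / r_au**2
          return (1 + reflectivity) * flux * area_m2 / C

      print(f"{radiation_force(50, 1.00) * 1e3:.2f} mN near Earth")    # ~0.34 mN
      print(f"{radiation_force(50, 0.39) * 1e3:.2f} mN near Mercury")  # ~2.24 mN
      ```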
      The loudest noises – an ominous ‘rumbling’ – are caused by the spacecraft's large solar panels rotating. The first rotation occurs in shadow at 00:17 in the video, while the second adjustment at 00:51 was also captured by one of the spacecraft’s monitoring cameras. 
      Faint sounds like wind being picked up in a phone call, which grow more audible around 30 seconds into the video, are caused by Mercury's gravitational field pulling the nearest and furthest parts of the spacecraft by different amounts. As the planet's gravity stretches the spacecraft ever so slightly, the spacecraft responds structurally. At the same time, the onboard reaction wheels change their speed to maintain the spacecraft's orientation, which you can hear as a frequency shift in the background.    
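      This stretching has a textbook form: across a spacecraft of end-to-end length L at distance r from a planet of mass M, the differential pull is about 2GML/r³ per unit mass. Plugging in Mercury and this flyby’s 295 km closest-approach altitude gives a feel for the size of the effect; the ~30 m spacecraft length is an assumption based on the solar-array scale, not an ESA figure.

      ```python
      G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
      M_MERCURY = 3.301e23  # mass of Mercury, kg
      R_MERCURY = 2440e3    # radius of Mercury, m

      def tidal_accel(r_m, length_m):
          """Differential (stretching) acceleration across a craft, in m/s^2."""
          return 2 * G * M_MERCURY * length_m / r_m**3

      r = R_MERCURY + 295e3  # closest approach: 295 km above the surface
      print(f"{tidal_accel(r, 30):.1e} m/s^2")  # ~6.5e-05: tiny, but measurable
      ```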
      This is the last time that many of these effects can be measured with BepiColombo's largest solar panels, which make the spacecraft more susceptible to vibrations. The spacecraft module carrying these panels will not enter orbit around Mercury with the mission's two orbiter spacecraft. 
      The video shows an accurate simulation of the spacecraft and its route past Mercury during the flyby, made with the SPICE-enhanced Cosmographia spacecraft visualisation tool. The inset that appears 38 seconds into the video shows real photographs taken by one of BepiColombo's monitoring cameras.
      Read more about BepiColombo's sixth Mercury flyby 
      Access the related broadcast quality video material.
      View the full article
    • By European Space Agency
      Video: 00:01:36 Fly over Mercury with BepiColombo for the final time during the mission’s epic expedition around the Sun. The ESA/JAXA spacecraft captured these images of the Solar System's smallest planet on 7 and 8 January 2025, before and during its sixth encounter with Mercury. This was its final planetary flyby until it enters orbit around the planet in late 2026.  
      The video begins with BepiColombo's approach to Mercury, showing images taken by onboard monitoring cameras 1 and 2 (M-CAM 1 and M-CAM 2) between 16:59 CET on 7 January and 01:45 CET on 8 January. During this time, the spacecraft moved from 106 019 to 42 513 km from Mercury's surface. The view from M-CAM 1 is along a 15-metre-long solar array, whereas M-CAM 2 images show an antenna and boom in the foreground. 
      After emerging into view from behind the solar array, Mercury appears to jump to the right. Both the spacecraft and its solar arrays rotated in preparation for passing through Mercury's cold, dark shadow.   
      For several hours after these first images were taken, the part of Mercury’s surface illuminated by the Sun was no longer visible from the M-CAMs. BepiColombo's closest approach to Mercury took place in darkness at 06:58:52 CET on 8 January, when it got as close as 295 km.  
      Shortly after re-emerging into the intense sunlight, the spacecraft peered down onto the planet's north pole, imaging several craters whose floors are in permanent shadow. The long shadows in this region are particularly striking on the floor of Prokofiev crater (the largest crater to the right of centre) – the central peak of that crater casts spiky shadows that exaggerate the shape of this mountain.  
      Next, we have a beautiful view of Mercury crossing the field of view from left to right, seen first by M-CAM 1 then by M-CAM 2 between 07:06 and 07:49 CET. These images showcase the planet's northern plains, which were smoothed over billions of years ago when massive amounts of runny lava flowed across Mercury's cratered surface.  
      The background music is The Hebrides overture, composed by Felix Mendelssohn in 1830 after being inspired by a visit to Fingal’s Cave, a sea cave created by ancient lava flows on the island of Staffa, Scotland. Similarly shaped by lava is Mercury’s Mendelssohn crater, one of the large craters visible passing from left to right above the solar array in M-CAM 1’s views, and at the very bottom of M-CAM 2’s views. Mendelssohn crater was flooded with lava after the impact that originally created it.
      The end of the video lingers on the final three close-up images that the M-CAMs will ever obtain of Mercury. The cameras will continue to operate until September 2026, fulfilling their role of monitoring various parts of the spacecraft. After that point, the spacecraft module carrying the M-CAMs will separate from BepiColombo's other two parts, ESA's Mercury Planetary Orbiter (MPO) and JAXA's Mercury Magnetospheric Orbiter (Mio). MPO’s much more powerful science cameras will take over from the M-CAMs, mapping Mercury over a range of colours in visible and infrared light.
      View the full article