
Out of Whack Planetary System Offers Clues to a Disturbed Past


HubbleSite



For just over a decade, astronomers have known that three Jupiter-type planets orbit the yellow-white star Upsilon Andromedae. But they have now discovered, to their surprise, that not all of the planets orbit this star in the same plane, the way the major planets in our solar system orbit the Sun. The orbits of two of the planets are inclined by 30 degrees with respect to each other; such a strange orientation has never before been seen in any other planetary system. Researchers say this surprising finding will affect theories of how planetary systems form and evolve: it suggests that violent events can disrupt planets' orbits after a planetary system forms. The discovery was made through joint observations with the Hubble Space Telescope, the giant Hobby-Eberly Telescope, and other ground-based telescopes.
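To put a number on "inclined with respect to each other": the mutual inclination of two orbits follows from each orbit's inclination and longitude of ascending node via a standard spherical-trigonometry identity. Below is a minimal sketch in Python; the sample orbital elements are illustrative only, not the measured Upsilon Andromedae values.

```python
import math

def mutual_inclination(i1_deg, node1_deg, i2_deg, node2_deg):
    """Angle between two orbital planes, given each plane's
    inclination i and longitude of ascending node (degrees)."""
    i1, n1 = math.radians(i1_deg), math.radians(node1_deg)
    i2, n2 = math.radians(i2_deg), math.radians(node2_deg)
    # cos(i_mut) = cos i1 cos i2 + sin i1 sin i2 cos(delta_node)
    cos_im = (math.cos(i1) * math.cos(i2)
              + math.sin(i1) * math.sin(i2) * math.cos(n1 - n2))
    return math.degrees(math.acos(cos_im))

# Illustrative elements only (not the published Upsilon Andromedae fit):
print(f"mutual inclination: {mutual_inclination(20, 240, 45, 280):.1f} deg")
```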

These findings were presented at a press conference today at the 216th meeting of the American Astronomical Society in Miami.


  • Similar Topics

    • By NASA
      The powerhouse of Gateway, NASA’s orbiting outpost around the Moon and a critical piece of infrastructure for Artemis, is in the midst of several electric propulsion system tests.
      The Power and Propulsion Element (PPE), being manufactured by Maxar Technologies, provides Gateway with power, high-rate communications, and propulsion for maneuvers around the Moon and for transit between different orbits. The PPE will be combined with the Habitation and Logistics Outpost (HALO) before the integrated spacecraft’s launch, targeted for late 2024 aboard a SpaceX Falcon Heavy. Together, these elements will serve as the hub for early Gateway crewed operations and various science and technology demonstrations as the full Gateway station is assembled around it in the coming years.
      In this image, PPE engineers successfully tested the integration of Aerojet Rocketdyne’s thruster with Maxar’s power processing unit and Xenon Flow Controller.
      Image Credit: NASA
    • By NASA
      Image caption: Sonifications of three images have been released to mark the 25th anniversary of Chandra’s “First Light” image. For Cassiopeia A, one of the first objects observed by Chandra, X-ray data from Chandra and infrared data from Webb have been translated into sounds, along with some Hubble data. The second image in the sonification trio, 30 Doradus, also contains Chandra and Webb data. NGC 6872 contains data from Chandra as well as an optical image from Hubble. Each of these datasets has been mapped to notes and sounds based on properties observed by these telescopes. Credit: NASA/CXC/SAO/K. Arcand, SYSTEM Sounds (M. Russo, A. Santaguida)
      A quarter of a century ago, NASA released the “first light” images from the agency’s Chandra X-ray Observatory. This introduction to the world of Chandra’s high-resolution X-ray imaging capabilities included an unprecedented view of Cassiopeia A, the remains of an exploded star located about 11,000 light-years from Earth. Over the years, Chandra’s views of Cassiopeia A have become some of the telescope’s best-known images.
      To mark the anniversary of this milestone, new sonifications of three images – including Cassiopeia A (Cas A) – are being released. Sonification is a process that translates astronomical data into sound, similar to how digital data are more routinely turned into images. This translation process preserves the science of the data from its original digital state but provides an alternative pathway to experiencing the data.
      This sonification of Cas A features data from Chandra as well as NASA’s James Webb, Hubble, and retired Spitzer space telescopes. The scan starts at the neutron star at the center of the remnant, marked by a triangle sound, and moves outward. Astronomers first saw this neutron star when Chandra’s inaugural observations were released 25 years ago this week. Chandra’s X-rays also reveal debris from the exploded star that is expanding outward into space. The brighter parts of the image are conveyed through louder volume and higher pitched sounds. X-ray data from Chandra are mapped to modified piano sounds, while infrared data from Webb and Spitzer, which detect warmed dust embedded in the hot gas, have been assigned to various string and brass instruments. Stars that Hubble detects are played with crotales, or small cymbals.
      Another new sonification features the spectacular cosmic vista of 30 Doradus, one of the largest and brightest regions of star formation close to the Milky Way. This sonification again combines X-rays from Chandra with infrared data from Webb. As the scan moves from left to right across the image, the volume heard again corresponds to the brightness seen. Light toward the top of the image is mapped to higher pitched notes. X-rays from Chandra, which reveal gas that has been superheated by shock waves generated by the winds from massive stars, are heard as airy synthesizer sounds. Meanwhile, Webb’s infrared data show cooler gas that provides the raw ingredients for future stars. These data are mapped to a range of sounds including soft, low musical pitches (red regions), a wind-like sound (white regions), piano-like synthesizer notes indicating very bright stars, and a rain-stick sound for stars in a central cluster.
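      The mappings described above (brightness to volume, vertical position to pitch, a left-to-right scan) are simple enough to prototype. Below is a minimal sketch in Python, not the SYSTEM Sounds pipeline, which layers per-dataset instrument samples; this version scans a grayscale array column by column and writes the result as a WAV file:

      ```python
      import numpy as np
      import wave

      RATE = 22050        # audio sample rate (Hz)
      COL_DUR = 0.05      # seconds of audio per image column

      def sonify(img, fmin=220.0, fmax=880.0):
          """Scan a 2D grayscale array left to right: each row gets a fixed
          pitch (top rows higher), each pixel's brightness sets loudness."""
          img = img / (img.max() or 1.0)
          nrows, _ = img.shape
          freqs = np.linspace(fmax, fmin, nrows)   # top of image = high pitch
          t = np.arange(int(RATE * COL_DUR)) / RATE
          chunks = [
              (col[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
              for col in img.T
          ]
          audio = np.concatenate(chunks)
          audio /= np.abs(audio).max() or 1.0
          return (audio * 32767).astype(np.int16)

      # Toy image: a bright diagonal streak, heard as a rising sweep.
      demo = np.zeros((32, 64))
      for c in range(64):
          demo[31 - c * 32 // 64, c] = 1.0

      with wave.open("scan.wav", "wb") as w:
          w.setnchannels(1)      # mono
          w.setsampwidth(2)      # 16-bit samples
          w.setframerate(RATE)
          w.writeframes(sonify(demo).tobytes())
      ```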
      The final member of this new sonification trio is NGC 6872, a large spiral galaxy with two elongated arms stretching to the upper right and lower left, seen in an optical light view from Hubble. Just to the upper left of NGC 6872 appears a smaller spiral galaxy. These two galaxies, each of which likely has a supermassive black hole at its center, are being drawn toward one another. As the scan sweeps clockwise from 12 o’clock, the brightness controls the volume, and light farther from the center of the image is mapped to higher-pitched notes. Chandra’s X-rays, rendered as a wind-like sound, show multimillion-degree gas that permeates the galaxies. Compact X-ray sources from background galaxies create bird-like chirps. In the Hubble data, the core of NGC 6872 is heard as a dark low drone, and the blue spiral arms (indicating active star formation) are audible as brighter, higher-pitched tones. The background galaxies are played as a soft pluck sound, while the bright foreground star is accompanied by a crash cymbal.
      More information about the NASA sonification project through Chandra, which is made in partnership with NASA’s Universe of Learning, can be found at https://chandra.si.edu/sound/. The collaboration was driven by visualization scientist Kimberly Arcand (CXC), astrophysicist Matt Russo, and musician Andrew Santaguida (both of the SYSTEM Sounds project), along with consultant Christine Malec.
      NASA’s Universe of Learning materials are based upon work supported by NASA under cooperative agreement award number NNX16AC65A to the Space Telescope Science Institute, working in partnership with Caltech/IPAC, Center for Astrophysics | Harvard & Smithsonian, and the Jet Propulsion Laboratory.
      More about Chandra
      Chandra, managed for NASA by Marshall in partnership with the CXC, is one of NASA’s Great Observatories, along with the Hubble Space Telescope and the now-retired Spitzer Space Telescope and Compton Gamma Ray Observatory. It was first proposed to NASA in 1976 by Riccardo Giacconi, recipient of the 2002 Nobel Prize in Physics for his contributions to X-ray astronomy, and Harvey Tananbaum, who would later become the first director of the Chandra X-ray Center. Chandra was named in honor of the late Nobel laureate Subrahmanyan Chandrasekhar, who earned the Nobel Prize in Physics in 1983 for his work explaining the structure and evolution of stars.
      Learn more about the Chandra X-ray Observatory and its mission here:
      https://www.nasa.gov/mission/chandra-x-ray-observatory/
      https://cxc.harvard.edu
      News Media Contact
      Lane Figueroa
      Marshall Space Flight Center, Huntsville, Alabama
      256-544-0034
      lane.e.figueroa@nasa.gov
      Details: Last Updated Sep 03, 2024
      Location: Marshall Space Flight Center
      Related Terms: Chandra X-Ray Observatory, Marshall Space Flight Center
      Explore More:
      • 5 min read: Cassiopeia A, Then the Cosmos: 25 Years of Chandra X-ray Science (1 week ago)
      • 9 min read: 25 Years Ago: STS-93, Launch of the Chandra X-Ray Observatory (1 month ago)
      • 5 min read: 25 Years On, Chandra Highlights Legacy of NASA Engineering Ingenuity (1 month ago)
    • By NASA
      “I didn’t always grow up knowing that I was going to be working for NASA. It was just the way my life unfolded, and I couldn’t be more grateful and lucky to have this opportunity to be here. I think hiking is what really got me into my passion for wanting to have this outdoors kind of career. I’ve always pursued environmental science and geology, and still at that point in time, I had no idea that I could apply that kind of science to outer space and work for NASA one day.
      “It wasn’t until I had these amazing mentors in front of me who were showing me, ‘Hey, what you’re doing, you can apply this to, for instance, Mars.’ And that’s what sparked my inspiration — [realizing] Mars had these ancient lakes and [wondering], ‘How can I use what I’m doing here on Earth to understand what was going on with those ancient lakes on Mars?’
      “I’m kind of lucky. It’s less of a job and more of an exciting career opportunity where I get to go out into the field and even hike for a good portion of [my workday]. For instance, I just got back from Iceland, where I was for 10 days. On these field trips, I’m in my comfort zone: wearing a flannel and winter hat, backpacking with my rock hammer and shovel, hiking for a few hours to pick up samples, and then coming back to analyze them in the lab. I couldn’t have written a better story: to continue doing the stuff I enjoyed as a child, and to be doing it now for NASA, is something I couldn’t have even dreamed of.
      “Hiking and being in the field is the fun part. But then I get to come back to the lab and compare it to what Martian rovers are doing. They’re our hikers, our pioneers, our explorers, our geologists who are collecting samples for us on other planets. It’s remarkable, often mind-blowing, to be able to work directly with our planetary geologists as well as the amazing people on the rover teams from around the globe to understand the surface of Mars and then, eventually, compare it to what I see in the field here on Earth.
      “So, I’m still that young boy at heart with my backpack and flannel on and headed out into the field.”
      – Dr. Michael Thorpe, Sedimentary and Planetary Geologist, NASA’s Goddard Space Flight Center
      Image Credit: Iceland Space Agency/Daniel Leeb
      Interviewer: NASA/Tahira Allen
      Check out some of our other Faces of NASA. 
    • By NASA
      NASA Optical Navigation Tech Could Streamline Planetary Exploration (5 min read)
      Image caption: Optical navigation technology could help astronauts and robots find their way using data from cameras and other sensors. Credit: NASA
      As astronauts and rovers explore uncharted worlds, finding new ways of navigating these bodies is essential in the absence of traditional navigation systems like GPS. Optical navigation, which relies on data from cameras and other sensors, can help spacecraft (and in some cases, astronauts themselves) find their way in areas that would be difficult to navigate with the naked eye. Three NASA researchers are pushing optical navigation tech further, making cutting-edge advances in 3D environment modeling, navigation using photography, and deep learning image analysis. In a dim, barren landscape like the surface of the Moon, it can be easy to get lost. With few discernible landmarks, astronauts and rovers must rely on other means to plot a course.
      As NASA pursues its Moon to Mars missions, encompassing exploration of the lunar surface and the first steps on the Red Planet, finding novel and efficient ways of navigating these new terrains will be essential. That’s where optical navigation comes in — a technology that helps map out new areas using sensor data.
      NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is a leading developer of optical navigation technology. For example, GIANT (the Goddard Image Analysis and Navigation Tool) helped guide the OSIRIS-REx mission to a safe sample collection at asteroid Bennu by generating 3D maps of the surface and calculating precise distances to targets.
      Now, three research teams at Goddard are pushing optical navigation technology even further.
      Virtual World Development
      Chris Gnam, an intern at NASA Goddard, leads development of a modeling engine called Vira that already renders large 3D environments about 100 times faster than GIANT. These digital environments can be used to evaluate potential landing areas, simulate solar radiation, and more.
      While consumer-grade graphics engines, like those used for video game development, quickly render large environments, most cannot provide the detail necessary for scientific analysis. For scientists planning a planetary landing, every detail is critical.
      Image caption: Vira can quickly and efficiently render an environment in great detail. Credit: NASA
      “Vira combines the speed and efficiency of consumer graphics modelers with the scientific accuracy of GIANT,” Gnam said. “This tool will allow scientists to quickly model complex environments like planetary surfaces.”
      The Vira modeling engine is being used to assist with the development of LuNaMaps (Lunar Navigation Maps). This project seeks to improve the quality of maps of the lunar South Pole region, a key exploration target of NASA’s Artemis missions.
      Vira also uses ray tracing to model how light behaves in a simulated environment. While ray tracing is often used in video game development, Vira uses it to model solar radiation pressure: the change in a spacecraft’s momentum caused by sunlight.
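      The article does not describe Vira’s internals, but the quantity being modeled is well defined: sunlight carries momentum, so the pressure on a sunlit surface is the solar flux divided by the speed of light. Here is a minimal flat-plate sketch (a cannonball-style idealization; a ray-traced model like Vira resolves geometry and self-shadowing facet by facet):

      ```python
      # Idealized solar radiation pressure on a flat plate (cannonball-style
      # approximation; ray-traced tools resolve geometry per facet).
      SOLAR_CONSTANT = 1361.0   # W/m^2, solar flux at 1 au
      C = 299_792_458.0         # m/s, speed of light

      def srp_acceleration(area_m2, mass_kg, cr=1.3, distance_au=1.0):
          """Acceleration from solar radiation pressure.
          cr is the reflectivity coefficient: 1.0 = perfect absorber,
          2.0 = perfect mirror facing the Sun head-on."""
          pressure = SOLAR_CONSTANT / (C * distance_au**2)  # N/m^2
          return pressure * cr * area_m2 / mass_kg          # m/s^2

      # Example: a 20 m^2, 1000 kg spacecraft at 1 au (values illustrative).
      print(f"{srp_acceleration(20.0, 1000.0):.2e} m/s^2")
      ```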
      Image caption: Vira can accurately render indirect lighting, where an area is lit even though it does not directly face a light source. Credit: NASA
      Find Your Way with a Photo
      Another team at Goddard is developing a tool to enable navigation based on images of the horizon. Andrew Liounis, an optical navigation product design lead, heads the team, working alongside NASA interns Andrew Tennenbaum and Will Driessen, as well as Alvin Yew, the gas processing lead for NASA’s DAVINCI mission.
      An astronaut or rover using this algorithm could take one picture of the horizon, which the program would compare to a map of the explored area. The algorithm would then output an estimate of where the photo was taken.
      Using one photo, the algorithm can estimate a location to within hundreds of feet. Current work aims to show that with two or more pictures, the algorithm can pinpoint a location to within tens of feet.
      “We take the data points from the image and compare them to the data points on a map of the area,” Liounis explained. “It’s almost like how GPS uses triangulation, but instead of having multiple observers to triangulate one object, you have multiple observations from a single observer, so we’re figuring out where the lines of sight intersect.”
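      The team’s code is not published here, but the intersecting-lines idea Liounis describes can be sketched. Assuming we already have compass bearings from the camera to two or more landmarks whose map coordinates are known, the position estimate is a least-squares intersection of the sight lines (all names and values below are illustrative):

      ```python
      import numpy as np

      def locate(landmarks, bearings_deg):
          """Least-squares intersection of sight lines.
          landmarks: (N, 2) map coordinates of features seen on the horizon.
          bearings_deg: azimuth from the observer to each landmark,
          measured clockwise from north (a compass bearing)."""
          P = np.asarray(landmarks, dtype=float)
          th = np.radians(bearings_deg)
          # Unit direction of each sight line in (east, north) coordinates.
          D = np.column_stack([np.sin(th), np.cos(th)])
          A = np.zeros((2, 2))
          b = np.zeros(2)
          for p, d in zip(P, D):
              # Projector onto the component perpendicular to the line;
              # the observer lies on each line, so M @ observer == M @ p.
              M = np.eye(2) - np.outer(d, d)
              A += M
              b += M @ p
          return np.linalg.solve(A, b)

      # Illustrative: observer at (100, 50) sees three mapped peaks.
      true_pos = np.array([100.0, 50.0])
      peaks = np.array([[300.0, 250.0], [-100.0, 300.0], [150.0, -400.0]])
      vecs = peaks - true_pos
      bearings = np.degrees(np.arctan2(vecs[:, 0], vecs[:, 1]))  # east, north
      print(locate(peaks, bearings))   # ~ [100. 50.]
      ```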
      This type of technology could be useful for lunar exploration, where it is difficult to rely on GPS signals for location determination.
      A Visual Perception Algorithm to Detect Craters
      To automate optical navigation and visual perception processes, Goddard intern Timothy Chase is developing a programming tool called the GAVIN (Goddard AI Verification and Integration) Tool Suite.
      This tool helps build deep learning models, a type of machine learning algorithm that is trained to process inputs like a human brain. In addition to developing the tool itself, Chase and his team are building a deep learning algorithm using GAVIN that will identify craters in poorly lit areas, such as the Moon.
      “As we’re developing GAVIN, we want to test it out,” Chase explained. “This model that will identify craters in low-light bodies will not only help us learn how to improve GAVIN, but it will also prove useful for missions like Artemis, which will see astronauts exploring the Moon’s south pole region — a dark area with large craters — for the first time.”
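      GAVIN itself is not public in this article, so the following is only a generic illustration of the kind of model such a tool helps build: a small convolutional network that classifies grayscale terrain patches as crater or not-crater. The architecture, sizes, and data are invented for illustration (PyTorch):

      ```python
      import torch
      import torch.nn as nn

      class CraterNet(nn.Module):
          """Tiny CNN scoring a 64x64 grayscale terrain patch:
          output > 0 suggests a crater (binary logit)."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.head = nn.Linear(64 * 8 * 8, 1)   # 64x64 -> 8x8 after pooling

          def forward(self, x):
              return self.head(self.features(x).flatten(1))

      model = CraterNet()
      loss_fn = nn.BCEWithLogitsLoss()   # binary crater / not-crater
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)

      # One illustrative training step on random stand-in data.
      patches = torch.randn(8, 1, 64, 64)      # stand-in for lunar imagery
      labels = torch.randint(0, 2, (8, 1)).float()
      opt.zero_grad()
      loss = loss_fn(model(patches), labels)
      loss.backward()
      opt.step()
      print(f"loss: {loss.item():.3f}")
      ```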
      As NASA continues to explore previously uncharted areas of our solar system, technologies like these could help make planetary exploration at least a little bit simpler. Whether by developing detailed 3D maps of new worlds, navigating with photos, or building deep learning algorithms, the work of these teams could bring the ease of Earth navigation to new worlds.
      By Matthew Kaufman
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Details: Last Updated Aug 07, 2024
      Editor: Rob Garner
      Contact: Rob Garner, rob.garner@nasa.gov
      Location: Goddard Space Flight Center
      Related Terms: Goddard Technology, Artificial Intelligence (AI), Goddard Space Flight Center, Technology
      Explore More:
      • 4 min read: NASA Improves GIANT Optical Navigation Technology for Future Missions (10 months ago). Goddard's GIANT optical navigation software helped guide the OSIRIS-REx mission to the Asteroid Bennu. Today…
      • 4 min read: Space Station Research Contributes to Navigation Systems for Moon Voyages (2 years ago)
      • 5 min read: NASA, Industry Improve Lidars for Exploration, Science (5 months ago). NASA engineers will test a suite of new laser technologies from an aircraft this summer…
    • By NASA
      Image caption: Teams with NASA’s Exploration Ground Systems Program, in preparation for the agency’s Artemis II crewed mission to the Moon, conduct testing of four emergency egress baskets on the mobile launcher at Launch Complex 39B at NASA’s Kennedy Space Center in Florida in July 2024. In a pad abort emergency, the baskets allow astronauts and other pad personnel to escape quickly from the mobile launcher to the base of the pad, where emergency transport vehicles drive them to safety. Credit: NASA/Amanda Arrieta
      Since NASA began sending astronauts to space, the agency has relied on emergency systems for personnel to safely leave the launch pad and escape the hazard in the unlikely event of an emergency during the launch countdown.
      During the Mercury and Gemini programs, NASA used launch escape systems on spacecraft for the crew to safely evacuate if needed. Though these systems are still in use for spacecraft today, the emergency routes on the ground were updated starting with the Apollo missions to account for not only the crew, but all remaining personnel at the launch pad. 
      During Apollo, personnel relied on a ground-based emergency egress system – or emergency exit route – to allow for a quick and safe departure. Though the system has varied over time and different launch pads use different escape systems, the overall goal has stayed the same – quickly leave the launch pad and head to safety.  
      Beginning with Artemis II, the Exploration Ground Systems (EGS) Program at Kennedy Space Center in Florida will use a track cable that connects the mobile launcher to the perimeter area of the launch pad, down which four baskets, similar to ski-lift gondolas, can ride. Once they reach ground level, personnel board armored emergency response vehicles stationed there to be taken safely away from the launch pad to one of the triage sites at Kennedy.
      “We have four baskets that sit on the side of the mobile launcher tower at the same level as the crew access arm, the location where the crew enters the spacecraft,” said Amanda Arrieta, mobile launcher 1 senior element engineer for NASA’s EGS Program. “The intention is to provide another means of egress for the crew and the closeout crew in the event of an emergency. Each of these baskets will go down a wire. It’s a wire rope system that connects to the pad terminus, an area near the pad perimeter where the baskets will land after leaving the mobile launcher tower.” 
      Infographic: the route astronauts and personnel would take during an emergency abort situation. Credit: NASA
      The Artemis system works like this: personnel will exit the Orion spacecraft or the white room (depending on where teams are at the time of the emergency) into the crew access arm of the mobile launcher. Located at the 274-foot level, teams are approximately 375 feet above the ground. From there, they will ride the emergency egress baskets down the 1,335-foot-long cables to the launch pad perimeter, or pad terminus area. Each basket, similar in size to a small SUV, is designed to carry up to five people or a maximum weight of 1,500 pounds.
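      As a back-of-the-envelope check on those figures, assuming an idealized straight cable (the real cable sags and the ride is braked), the descent geometry works out as follows:

      ```python
      import math

      # Figures from the paragraph above; straight-cable idealization.
      height_ft = 375.0    # crew access arm height above the ground
      cable_ft = 1335.0    # length of the egress cable
      angle = math.degrees(math.asin(height_ft / cable_ft))
      run = math.sqrt(cable_ft**2 - height_ft**2)
      print(f"descent angle ~{angle:.1f} deg, horizontal run ~{run:.0f} ft")
      # -> descent angle ~16.3 deg, horizontal run ~1281 ft
      ```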
      Once teams have left the terminus area and arrive at the triage site location, emergency response crews are there to evaluate and take care of any personnel. 
      “When we send our crews to the pad during launch, their safety is always at the forefront of our minds. While it is very unlikely that we will need the emergency egress and pad abort systems, they are built and tested to ensure that if we do need them then they are ready to go,” said Charlie Blackwell-Thompson, Artemis launch director. “Our upcoming integrated ground systems training is about demonstrating the capability of the entire emergency egress response from the time an emergency condition is declared until we have the crews, both flight and ground, safely accounted for outside the hazardous area.”  
      For the agency’s Commercial Crew Program, SpaceX uses a slidewire cable with baskets that ride down the cable at the Launch Complex 39A pad. At Space Launch Complex 40, meanwhile, the team uses a deployable chute for its emergency egress system. Boeing and United Launch Alliance also use a slidewire at Space Launch Complex 41 at Cape Canaveral Space Force Station, but instead of baskets, the team deploys seats that ride down the slidewires, similar to riding a zip line.
      Artemis II will be NASA’s first mission with crew aboard the SLS (Space Launch System) rocket and Orion spacecraft, and it will also introduce several new ground systems, including the emergency egress system. Though no NASA mission to date has needed its ground-based emergency egress system during a launch countdown, those safety measures remain in place and are maintained as a top priority for the agency.