
Recommended Posts

Posted
"Past, present and future stars that can see Earth as a transiting exoplanet" is a new paper by Dr. Jackie Faherty and Dr. Lisa Kaltenegger. 


It reports that "In the search for life in the cosmos, transiting exoplanets are currently our best targets. With thousands already detected, our search is entering a new era of discovery with upcoming large telescopes that will look for signs of ‘life’ in the atmospheres of transiting worlds. 

Previous work has explored the zone from which Earth would be visible while transiting the Sun [1–4]. However, these studies considered only the current position of stars, and did not include their changing vantage point over time. 

Here we report that 1,715 stars within 100 parsecs from the Sun are in the right position to have spotted life on a transiting Earth since early human civilization." 
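For a sense of the geometry behind that restricted vantage point: the "Earth transit zone" is a thin band around the ecliptic, because a distant observer can watch Earth cross the Sun's disk only from within the strip that Earth's orbit sweeps across the Sun. A rough sketch of its width using standard solar-system values (not taken from the paper, which uses a slightly stricter criterion giving about 0.264 degrees):

```python
import math

# Half-width of the Earth transit zone: an observer must lie within the
# narrow band around the ecliptic from which Earth appears to cross the
# Sun's disk. Standard values, for illustration only.
R_sun = 6.957e8   # solar radius, m
au = 1.496e11     # Earth-Sun distance, m

half_width_deg = math.degrees(math.asin(R_sun / au))
print(f"Earth transit zone half-width: {half_width_deg:.3f} deg")  # ~0.266
```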

With 1,715 stars in the right position to have spotted a transiting Earth, it is not unthinkable that we, a civilization that has not even reached Type I on the Kardashev scale, are being watched by far more advanced Type II or Type III civilizations, which could use vantage points around those stars to monitor Earth without our having any way of detecting them. 

That we have not yet made contact with any aliens may be because radio waves are an outdated technology, which is probably right. Consider the possibility that we are being watched yet unable to interact with the watchers, simply because we have not yet discovered the preferred means of communication among Type II and Type III civilizations.

 

View the full article


  • Similar Topics

    • By NASA
      [Image: NASA’s Jet Propulsion Laboratory used radar data taken by ESA’s Sentinel-1A satellite before and after the 2015 eruption of the Calbuco volcano in Chile to create this interferogram showing land deformation. The color bands west of the volcano indicate land sinking. NISAR will produce similar images. Credit: ESA/NASA/JPL-Caltech]
      [Image: A SAR image — like ones NISAR will produce — shows land cover on Mount Okmok on Alaska’s Umnak Island. Created with data taken in August 2011 by NASA’s UAVSAR instrument, it is an example of polarimetry, which measures return waves’ orientation relative to that of transmitted signals. Credit: NASA/JPL-Caltech]
      [Image: Data from NASA’s Magellan spacecraft, which launched in 1989, was used to create this image of Crater Isabella, a 108-mile-wide (175-kilometer-wide) impact crater on Venus’ surface. NISAR will use the same basic SAR principles to measure properties and characteristics of Earth’s solid surfaces. Credit: NASA/JPL-Caltech]
      Set to launch within a few months, NISAR will use a technique called synthetic aperture radar to produce incredibly detailed maps of surface change on our planet.
      When NASA and the Indian Space Research Organization’s (ISRO) new Earth satellite NISAR (NASA-ISRO Synthetic Aperture Radar) launches in coming months, it will capture images of Earth’s surface so detailed they will show how much small plots of land and ice are moving, down to fractions of an inch. Imaging nearly all of Earth’s solid surfaces twice every 12 days, it will see the flex of Earth’s crust before and after natural disasters such as earthquakes; it will monitor the motion of glaciers and ice sheets; and it will track ecosystem changes, including forest growth and deforestation.  
      The mission’s extraordinary capabilities come from the technique noted in its name: synthetic aperture radar, or SAR. Pioneered by NASA for use in space, SAR combines multiple measurements, taken as a radar flies overhead, to sharpen the scene below. It works like conventional radar, which uses microwaves to detect distant surfaces and objects, but steps up the data processing to reveal properties and characteristics at high resolution.
      To get such detail without SAR, radar satellites would need antennas too enormous to launch, much less operate. At 39 feet (12 meters) wide when deployed, NISAR’s radar antenna reflector is as wide as a city bus is long. Yet it would have to be 12 miles (19 kilometers) in diameter for the mission’s L-band instrument, using traditional radar techniques, to image pixels of Earth down to 30 feet (10 meters) across.
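      Those figures follow from basic radar geometry. A rough check, assuming an orbit altitude of about 747 kilometers and an L-band wavelength near 24 centimeters (values not stated in this article):

```python
# Real-aperture radar resolution scales as range * wavelength / antenna size.
# Assumed values (not from the article): NISAR-like orbit and L-band wavelength.
wavelength = 0.24         # m, L-band (~1.26 GHz)
altitude = 747e3          # m, approximate orbit height
target_resolution = 10.0  # m, the 30-foot pixel quoted above

required_diameter = altitude * wavelength / target_resolution
print(f"Real-aperture antenna needed: {required_diameter / 1e3:.0f} km")  # ~18 km

# A strip-map SAR instead achieves an azimuth resolution of roughly half its
# physical antenna length, regardless of range.
antenna_length = 12.0     # m, NISAR's reflector width
print(f"SAR azimuth resolution: {antenna_length / 2:.0f} m")
```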
      Synthetic aperture radar “allows us to refine things very accurately,” said Charles Elachi, who led NASA spaceborne SAR missions before serving as director of NASA’s Jet Propulsion Laboratory in Southern California from 2001 to 2016. “The NISAR mission will open a whole new realm to learn about our planet as a dynamic system.”
      How SAR Works
      Elachi arrived at JPL in 1971 after graduating from Caltech, joining a group of engineers developing a radar to study Venus’ surface. Then, as now, radar’s allure was simple: It could collect measurements day and night and see through clouds. The team’s work led to the Magellan mission to Venus in 1989 and several NASA space shuttle radar missions.
      An orbiting radar operates on the same principles as one tracking planes at an airport. The spaceborne antenna emits microwave pulses toward Earth. When the pulses hit something — a volcanic cone, for example — they scatter. The antenna receives the signals that echo back, and the instrument measures their strength, their change in frequency, how long they took to return, and whether they bounced off multiple surfaces, such as buildings.
      This information can help detect the presence of an object or surface, its distance away, and its speed, but the resolution is too low to generate a clear picture. First conceived at Goodyear Aircraft Corp. in 1952, SAR addresses that issue.
      “It’s a technique to create high-resolution images from a low-resolution system,” said Paul Rosen, NISAR’s project scientist at JPL.
      As the radar travels, its antenna continuously transmits microwaves and receives echoes from the surface. Because the instrument is moving relative to Earth, there are slight changes in frequency in the return signals. Called the Doppler shift, it’s the same effect that causes a siren’s pitch to rise as a fire engine approaches, then fall as it departs.
      Computer processing of those signals is like a camera lens redirecting and focusing light to produce a sharp photograph. With SAR, the spacecraft’s path forms the “lens,” and the processing adjusts for the Doppler shifts, allowing the echoes to be aggregated into a single, focused image.
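      A toy sketch of that focusing step, with made-up parameters rather than NISAR's: the echoes from a single point target form a chirp in "slow time," and correlating them against a matching reference chirp collapses the energy into one sharp peak.

```python
import numpy as np

# Azimuth compression demo: as the radar flies past a point target, the
# changing range imposes a quadratic phase (a chirp) on the echoes. A matched
# filter built from that same chirp focuses the energy into a sharp peak.
wavelength = 0.24   # m, L-band
velocity = 7500.0   # m/s, platform speed
R0 = 750e3          # m, range at closest approach
prf = 1500.0        # Hz, pulse repetition frequency

t = np.arange(-0.5, 0.5, 1.0 / prf)                  # slow time, 1 s aperture
range_history = R0 + (velocity * t) ** 2 / (2 * R0)  # parabolic approximation
echoes = np.exp(-4j * np.pi * range_history / wavelength)

matched = np.conj(echoes)                            # reference chirp
focused = np.abs(np.fft.ifft(np.fft.fft(echoes) * np.fft.fft(matched)))
print(f"Peak-to-mean ratio after focusing: {focused.max() / focused.mean():.0f}")
```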
      Using SAR
      One type of SAR-based visualization is an interferogram, a composite of two images taken at separate times that reveals the differences by measuring the change in the delay of echoes. Though they may look like modern art to the untrained eye, the multicolor concentric bands of interferograms show how far land surfaces have moved: The closer the bands, the greater the motion. Seismologists use these visualizations to measure land deformation from earthquakes.
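      At its core, an interferogram is just the per-pixel phase difference between two co-registered complex SAR images. A minimal synthetic example (illustrative numbers, not real NISAR data):

```python
import numpy as np

# Minimal synthetic interferogram: the phase difference between two complex
# SAR images of the same ground, taken on different dates, reveals motion.
rng = np.random.default_rng(0)
shape = (256, 256)

# Pretend the ground sank up to 5 cm in a smooth bowl between the two passes.
y, x = np.mgrid[0:shape[0], 0:shape[1]]
subsidence = -0.05 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / 2000.0)

wavelength = 0.24                                    # m, L-band
phase_shift = 4 * np.pi * subsidence / wavelength    # two-way path change

slc1 = np.exp(1j * rng.uniform(0, 2 * np.pi, shape))  # random scene phase
slc2 = slc1 * np.exp(1j * phase_shift)                # same scene, displaced

interferogram = np.angle(slc2 * np.conj(slc1))  # wrapped phase, -pi to pi
# One full fringe (2*pi) corresponds to wavelength/2 = 12 cm of motion
# along the radar's line of sight.
```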
      Another type of SAR analysis, called polarimetry, measures the vertical or horizontal orientation of return waves relative to that of transmitted signals. Waves bouncing off linear structures like buildings tend to return in the same orientation, while those bouncing off irregular features, like tree canopies, return in another orientation. By mapping the differences and the strength of the return signals, researchers can identify an area’s land cover, which is useful for studying deforestation and flooding.
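      A minimal sketch of that idea with made-up backscatter values: volume scatterers like canopies depolarize the signal, so their cross-polarized return is relatively strong.

```python
import numpy as np

# Toy polarimetric discrimination: compare co-polarized (HH) and
# cross-polarized (HV) backscatter power. Volume scatterers such as tree
# canopies depolarize the signal; bare surfaces and buildings mostly do not.
hh = np.array([1.00, 0.80, 0.30])  # illustrative co-pol power: building, soil, forest
hv = np.array([0.05, 0.08, 0.25])  # illustrative cross-pol power, same pixels

for r in hv / hh:
    kind = "volume scattering (vegetation)" if r > 0.3 else "surface scattering"
    print(f"HV/HH = {r:.2f} -> {kind}")
```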
      Such analyses are examples of ways NISAR will help researchers better understand processes that affect billions of lives.
      “This mission packs in a wide range of science toward a common goal of studying our changing planet and the impacts of natural hazards,” said Deepak Putrevu, co-lead of the ISRO science team at the Space Applications Centre in Ahmedabad, India.
      Learn more about NISAR at:
      https://nisar.jpl.nasa.gov
      News Media Contacts
      Andrew Wang / Jane J. Lee
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-379-6874 / 818-354-0307
      andrew.wang@jpl.nasa.gov / jane.j.lee@jpl.nasa.gov
      2025-006
      View the full article
    • By European Space Agency
      Image: This Copernicus Sentinel-2 image captures the borders between North and South Dakota and Minnesota blanketed with snow and ice. View the full article
    • By NASA
      Pandora, NASA’s newest exoplanet mission, is one step closer to launch with the completion of the spacecraft bus, which provides the structure, power, and other systems that will enable the mission to carry out its work.
      [Video: Watch to learn more about NASA’s Pandora mission, which will revolutionize the study of exoplanet atmospheres. Credit: NASA’s Goddard Space Flight Center]
      “This is a huge milestone for us and keeps us on track for a launch in the fall,” said Elisa Quintana, Pandora’s principal investigator at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “The bus holds our instruments and handles navigation, data acquisition, and communication with Earth — it’s the brains of the spacecraft.”
      Pandora, a small satellite, will provide in-depth study of at least 20 known planets orbiting distant stars in order to determine the composition of their atmospheres — especially the presence of hazes, clouds, and water. This data will establish a firm foundation for interpreting measurements by NASA’s James Webb Space Telescope and future missions that will search for habitable worlds.
      [Image: Pandora’s spacecraft bus was photographed Jan. 10 within a thermal-vacuum testing chamber at Blue Canyon Technologies in Lafayette, Colorado. The bus provides the structure, power, and other systems that will enable the mission to help astronomers better separate stellar features from the spectra of transiting planets. Credit: NASA/Weston Maughan, BCT]
      “We see the presence of water as a critical aspect of habitability because water is essential to life as we know it,” said Goddard’s Ben Hord, a NASA Postdoctoral Program Fellow who discussed the mission at the 245th meeting of the American Astronomical Society in National Harbor, Maryland. “The problem with confirming its presence in exoplanet atmospheres is that variations in light from the host star can mask or mimic the signal of water. Separating these sources is where Pandora will shine.”
      Funded by NASA’s Astrophysics Pioneers program for small, ambitious missions, Pandora is a joint effort between Lawrence Livermore National Laboratory in California and NASA Goddard.
      “Pandora’s near-infrared detector is actually a spare developed for the Webb telescope, which right now is the observatory most sensitive to exoplanet atmospheres,” Hord added. “In turn, our observations will improve Webb’s ability to separate the star’s signals from those of the planet’s atmosphere, enabling Webb to make more precise atmospheric measurements.”
      Astronomers can sample an exoplanet’s atmosphere when it passes in front of its star as seen from our perspective, an event called a transit. Part of the star’s light skims the atmosphere before making its way to us. As it does, the light interacts with atmospheric substances, and their chemical fingerprints — dips in brightness at characteristic wavelengths — become imprinted in the light.
      But our telescopes see light from the entire star as well, not just what’s grazing the planet. Stellar surfaces aren’t uniform. They sport hotter, unusually bright regions called faculae and cooler, darker regions similar to sunspots, both of which grow, shrink, and change position as the star rotates.
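      The scales involved show why stellar variability matters so much: the atmosphere’s imprint is a small correction to an already small dip. A back-of-the-envelope sketch with illustrative values (not actual Pandora targets):

```python
# Rough scale of transit spectroscopy signals, with illustrative values.
R_sun = 6.957e8    # m
R_earth = 6.371e6  # m

R_star = 0.7 * R_sun      # assumed K-dwarf host
R_planet = 4.0 * R_earth  # assumed Neptune-size planet

depth = (R_planet / R_star) ** 2  # fraction of starlight blocked in transit
print(f"Transit depth: {depth * 1e6:.0f} ppm")  # ~2,700 ppm

# Extra dip from an atmosphere probed over a few scale heights H:
H = 200e3   # m, assumed atmospheric scale height
n = 3       # effective number of scale heights sampled
atmosphere = 2 * n * R_planet * H / R_star ** 2
print(f"Atmospheric signal: {atmosphere * 1e6:.0f} ppm")  # ~130 ppm
```

Starspots and faculae can shift the measured depth by amounts comparable to that atmospheric signal, which is why Pandora monitors the star and planet together.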
      [Image: An artist’s concept of the Pandora mission, seen here without the thermal blanketing that will protect the spacecraft, observing a star and its transiting exoplanet. Credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab]
      Using a novel all-aluminum, 45-centimeter-wide (17-inch) telescope, jointly developed by Livermore and Corning Specialty Materials in Keene, New Hampshire, Pandora’s detectors will capture each star’s visible brightness and near-infrared spectrum at the same time, while also obtaining the transiting planet’s near-infrared spectrum. This combined data will enable the science team to determine the properties of stellar surfaces and cleanly separate star and planetary signals.
      The observing strategy takes advantage of the mission’s ability to continuously observe its targets for extended periods, something flagship missions like Webb, which are in high demand, cannot regularly do.
      Over the course of its year-long prime mission, Pandora will observe at least 20 exoplanets 10 times, with each stare lasting a total of 24 hours. Each observation will include a transit, which is when the mission will capture the planet’s spectrum. 
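      That plan fits comfortably within the prime mission; a quick check of the arithmetic:

```python
# Sanity check on Pandora's stated observing plan.
targets, visits, hours_per_visit = 20, 10, 24
total_days = targets * visits * hours_per_visit / 24
print(f"Minimum total stare time: {total_days:.0f} days of a ~365-day mission")
```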
      Pandora is led by NASA’s Goddard Space Flight Center. Lawrence Livermore National Laboratory provides the mission’s project management and engineering. Pandora’s telescope was manufactured by Corning and developed collaboratively with Livermore, which also developed the imaging detector assemblies, the mission’s control electronics, and all supporting thermal and mechanical subsystems. The infrared sensor was provided by NASA Goddard. Blue Canyon Technologies provided the bus and is performing spacecraft assembly, integration, and environmental testing. NASA’s Ames Research Center in California’s Silicon Valley will perform the mission’s data processing. Pandora’s mission operations center is located at the University of Arizona, and a host of additional universities support the science team.

      Download high-resolution video and images from NASA’s Scientific Visualization Studio

      By Francis Reddy
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Media Contact:
      Claire Andreoli
      301-286-1940
      claire.andreoli@nasa.gov
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      View the full article
    • By NASA
      On April 21, 1972, NASA astronaut John W. Young, commander of the Apollo 16 mission, took a far-ultraviolet photo of Earth with an ultraviolet camera. Young’s original black-and-white picture was printed on Agfacontour professional film three times, with each exposure recording only one light level. The three light levels were then colored blue (dimmest), green (next brightest), and red (brightest), resulting in the enhanced-color image seen here.
      Dr. George Carruthers, a scientist at the Naval Research Laboratory, developed the ultraviolet camera – the first Moon-based observatory – for Apollo 16. Apollo 16 astronauts placed the observatory on the Moon in April 1972, where it sits today on the Moon’s Descartes highland region, in the shadow of the lunar module Orion.
      Image credit: NASA
      View the full article
    • By NASA
      [Image: An equal collaboration between NASA and the Indian Space Research Organisation, NISAR will offer unprecedented insights into Earth’s constantly changing land and ice surfaces using synthetic aperture radar technology. The spacecraft, depicted here in an artist’s concept, will launch from India. Credit: NASA/JPL-Caltech]
      A Q&A with the lead U.S. scientist of the mission, which will track changes in everything from wetlands to ice sheets to infrastructure damaged by natural disasters.
      The upcoming U.S.-India NISAR (NASA-ISRO Synthetic Aperture Radar) mission will observe Earth like no mission before, offering insights about our planet’s ever-changing surface.
      The NISAR mission is a first-of-a-kind dual-band radar satellite that will measure land deformation from earthquakes, landslides, and volcanoes, producing data for science and disaster response. It will track how much glaciers and ice sheets are advancing or retreating and it will monitor growth and loss of forests and wetlands for insights on the global carbon cycle.
      As diverse as NISAR’s impact will be, the mission’s winding path to launch — in a few months’ time — has also been remarkable. Paul Rosen, NISAR’s project scientist at NASA’s Jet Propulsion Laboratory in Southern California, has been there at every step. He recently discussed the mission and what sets it apart.
      [Image: NISAR Project Scientist Paul Rosen of NASA’s Jet Propulsion Laboratory first traveled to India in late 2011 to discuss collaboration with ISRO scientists on an Earth-observing radar mission. NASA and ISRO signed an agreement in 2014 to develop NISAR. Credit: NASA/JPL-Caltech]
      How will NISAR improve our understanding of Earth?
      The planet’s surfaces never stop changing — in some ways small and subtle, and in other ways monumental and sudden. With NISAR, we’ll measure that change roughly every week, with each pixel capturing an area about half the size of a tennis court. Taking imagery of nearly all Earth’s land and ice surfaces this frequently and at such a small scale — down to the centimeter — will help us put the pieces together into one coherent picture to create a story about the planet as a living system.
      What sets NISAR apart from other Earth missions?
      NISAR will be the first Earth-observing satellite with two kinds of radar — an L-band system with a 10-inch (25-centimeter) wavelength and an S-band system with a 4-inch (10-centimeter) wavelength.
      Whether microwaves reflect off or penetrate an object depends on their wavelength. Shorter wavelengths are more sensitive to smaller objects such as leaves and rough surfaces, whereas longer wavelengths interact more strongly with larger structures such as boulders and tree trunks.
      So NISAR’s two radar signals will react differently to some features on Earth’s surface. By taking advantage of what each signal is or isn’t sensitive to, researchers can study a broader range of features than they could with either radar on its own, observing the same features with different wavelengths.
      Is this new technology?
      The concept of a spaceborne synthetic aperture radar, or SAR, studying Earth’s processes dates to the 1970s, when NASA launched Seasat. Though the mission lasted only a few months, it produced first-of-a-kind images that changed the remote-sensing landscape for decades to come.
      It also drew me to JPL in 1981 as a college student: I spent two summers analyzing data from the mission. Seasat led to NASA’s Shuttle Imaging Radar program and later to the Shuttle Radar Topography Mission.
      What will happen to the data from the mission?
      Our data products will fit the needs of users across the mission’s science focus areas — ecosystems, cryosphere, and solid Earth — and will serve many uses beyond basic research, such as monitoring soil moisture and water resources.
      We’ll make the data easily accessible. Given the volume of the data, NASA decided that it would be processed and stored in the cloud, where it’ll be free to access.
      How did the ISRO partnership come about?
      We proposed DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice), an L-band satellite, following the 2007 Decadal Survey by the National Academy of Sciences. At the time, ISRO was exploring launching an S-band satellite. The two science teams proposed a dual-band mission, and in 2014 NASA and ISRO agreed to partner on NISAR.
      Since then, the agencies have been collaborating across more than 9,000 miles (14,500 kilometers) and 13 time zones. Hardware was built on different continents before being assembled in India to complete the satellite. It’s been a long journey — literally.
      More About NISAR
      The NISAR mission is an equal collaboration between NASA and ISRO and marks the first time the two agencies have cooperated on hardware development for an Earth-observing mission. Managed for the agency by Caltech, JPL leads the U.S. component of the project and is providing the mission’s L-band SAR. NASA is also providing the radar reflector antenna, the deployable boom, a high-rate communication subsystem for science data, GPS receivers, a solid-state recorder, and payload data subsystem.
      Space Applications Centre Ahmedabad, ISRO’s lead center for payload development, is providing the mission’s S-band SAR instrument and is responsible for its calibration, data processing, and development of science algorithms to address the scientific goals of the mission. U R Rao Satellite Centre in Bengaluru, which leads the ISRO components of the mission, is providing the spacecraft bus. The launch vehicle is from ISRO’s Vikram Sarabhai Space Centre, launch services are through ISRO’s Satish Dhawan Space Centre, and satellite mission operations are by ISRO Telemetry Tracking and Command Network. National Remote Sensing Centre in Hyderabad is primarily responsible for S-band data reception, operational products generation, and dissemination.
      To learn more about NISAR, visit:
      https://nisar.jpl.nasa.gov
      News Media Contacts
      Andrew Wang / Jane J. Lee
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-379-6874 / 818-354-0307
      andrew.wang@jpl.nasa.gov / jane.j.lee@jpl.nasa.gov
      2025-001
      View the full article