A critical failure that ended one mission has given rise to an unexpected and exciting new science opportunity. The Kepler spacecraft, known for finding thousands of planets orbiting other stars, has a new job as the K2 mission.
Like its predecessor, K2 detects the tiny, telltale dips in a star's brightness as an object transits (passes in front of) it, possibly revealing the presence of a planet. Searching close neighboring stars for near-Earth-sized planets, K2 is finding planets ripe for follow-up studies of their atmospheres and compositions. A step up from its predecessor, K2 is revealing new information on comets, asteroids, dwarf planets, ice giants and moons. It will also provide new insight into areas as diverse as the birth of new stars, how stars explode into spectacular supernovae, and even the evolution of black holes.
K2 is expanding the planet-hunting legacy and has ushered in entirely new opportunities in astrophysics research, yet this is only the beginning.
Searching Nearby for Signs of Life
Image credit: ESO/L. Calçada
Scientists are excited about the nearby multi-planet system known as K2-3. This planetary system, discovered by K2, is made of three super-Earth-sized planets orbiting a cool M-star (or red dwarf) 135 light-years away, which is relatively close in astronomical terms. To put that distance into perspective, if the Milky Way galaxy were scaled down to the size of the continental U.S., traveling those 135 light-years would be the equivalent of walking the three-mile-long Golden Gate Park in San Francisco, California. At this distance, our other powerful space investigators – the Hubble Space Telescope and the forthcoming James Webb Space Telescope (JWST) – could study the atmospheres of these worlds in search of chemical fingerprints that could be indicative of life. K2 expects to find a few hundred of these close-by, near-Earth-sized neighbors.
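As a rough check on that analogy (assuming a Milky Way diameter of about 100,000 light-years and a continental U.S. width of about 2,800 miles; neither figure appears in the original post):

$$135\ \text{light-years} \times \frac{2{,}800\ \text{miles}}{100{,}000\ \text{light-years}} \approx 3.8\ \text{miles}$$

which is right in the ballpark of the park's three-mile length.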
K2 won’t be alone in searching for nearby planets outside our solar system. Revving up for launch around 2017-2018, our Transiting Exoplanet Survey Satellite (TESS) plans to monitor 200,000 nearby stars for planets, with a focus on finding Earth- and super-Earth-sized planets.
The above image is an artist's rendering of Gliese 581, a planetary system representative of K2-3.
Neptune's Moon Dance
Movie credit: NASA Ames/SETI Institute/J. Rowe
Spying on our neighbors in our own solar system, K2 caught Neptune in a dance with its moons Triton and Nereid. On day 15 (day counter located in the top right-hand corner of the green frame) of the sped-up movie, Neptune appears, followed by its moon Triton, which looks small and faint. Keen-eyed observers can also spot Neptune's tiny moon Nereid at day 24. Neptune is not moving backward but appears to do so because of the changing position of the Kepler spacecraft as it orbits around the sun. A few fast-moving asteroids make cameo appearances in the movie, showing up as streaks across the K2 field of view. The red dots are a few of the stars K2 examines in its search for transiting planets outside of our solar system. An international team of astronomers is using these data to track Neptune’s weather and probe the planet’s internal structure by studying subtle brightness fluctuations that can only be observed with K2.
Dead Star Devours Planet
Image credit: CfA/Mark A. Garlick
K2 also caught a white dwarf – the dead core of an exploded star – vaporizing a nearby tiny rocky planet. Slowly the planet will disintegrate, leaving a dusting of metals on the surface of the star. This trail of debris blocks a tiny fraction of starlight from the vantage point of the spacecraft, producing an unusual, but vaguely familiar, pattern in the data. Recognizing the pattern, scientists further investigated the dwarf’s atmosphere to confirm their find. This discovery has helped validate a long-held theory that white dwarfs are capable of cannibalizing remnant planets that have survived within their solar systems.
Searching for Far Out Worlds
Image credit: NASA/JPL-Caltech
In April, the space-based K2 and ground-based observatories on five continents will participate in a global experiment in exoplanet observation, simultaneously monitoring the same region of sky toward the center of our galaxy to search for small planets – some as small as Earth – orbiting very far from their host star or, in some cases, orbiting no star at all. For this experiment, scientists will use gravitational microlensing – the phenomenon that occurs when the gravity of a foreground object focuses and magnifies the light from a distant background star.
The animation demonstrates the principles of microlensing. The observer on Earth sees the source (distant) star when the lens (closer) star and planet pass through the center of the image. The inset shows what may be seen through a ground-based telescope. The image brightens twice, indicating when the star and planet pass through the observatory's line of sight to the distant star.
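For readers who want the math behind that double brightening: the textbook point-lens magnification formula (a standard result, not specific to this campaign) relates the apparent brightening $A$ to the projected separation $u$ between the source and the lens, measured in units of the lens's Einstein radius:

$$A(u) = \frac{u^2 + 2}{u\sqrt{u^2 + 4}}$$

As the lens star drifts in front of the background star, $u$ shrinks and $A$ spikes; a planet orbiting the lens adds its own brief, second spike – exactly the pair of brightenings the animation shows.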
Our Opportunity rover is facing one of the greatest challenges of its 14½-year mission on the surface of Mars: a massive dust storm that has turned day to night. Opportunity is currently hunkered down near the center of a storm bigger than North America and Russia combined. The dust-induced darkness means the solar-powered rover can’t recharge its batteries.
This isn’t the first time Opportunity has had to wait out a massive storm. In 2007, a monthlong series of severe storms filled the Martian skies with dust. Power levels reached critical lows, but engineers nursed the rover back to health when sunlight returned.
Martian breezes proved a saving grace for the solar-powered Mars rovers in the past, sweeping away accumulated dust and enabling the rovers to recharge and get back to science. The images above show Opportunity in 2014: the one on the left is from January 2014, and the one on the right is from March 2014.
Back in 1971, scientists were eager for their first orbital views of Mars. But when Mariner 9 arrived in orbit, the Red Planet was engulfed by a global dust storm that hid most of the surface for a month. When the dust settled, geologists got detailed views of the Martian surface, including the first glimpses of ancient riverbeds carved into the dry and dusty landscape.
As bad as the massive storm sounds, Mars isn’t capable of generating the strong winds that stranded actor Matt Damon’s character on the Red Planet in the movie The Martian. Mars’ atmosphere is too thin and winds are more breezy than brutal. The chore of cleaning dusty solar panels to maintain power levels, however, could be a very real job for future human explorers.
Scientists know to expect big dust storms on Mars, but the rapid development of the current one is surprising. Decades of Mars observations show a pattern of regional dust storms arising in northern spring and summer. In most Martian years (which are nearly twice as long as Earth years), the storms stay regional and dissipate. But we’ve seen global dust storms in 1971, 1977, 1982, 1994, 2001 and 2007. The current storm season could last into 2019.
Dust is hard on machines, but can be a boon to science. A study of the 2007 storm published earlier this year suggests such storms play a role in the ongoing process of gas escaping from the top of Mars' atmosphere. That process long ago transformed wetter, warmer ancient Mars into today's arid, frozen planet. Three of our orbiters, the Curiosity rover and international partners are already in position to study the 2018 storm.
Mission controllers for the Mars InSight lander, due to land on Mars in November, will be closely monitoring the storm in case the spacecraft’s landing parameters need to be adjusted for safety.
Once on the Red Planet, InSight will use sophisticated geophysical instruments to delve deep beneath the surface of Mars, detecting the fingerprints of the processes of terrestrial planet formation, as well as measuring the planet's "vital signs": Its "pulse" (seismology), "temperature" (heat flow probe), and "reflexes" (precision tracking).
One saving grace of dust storms is that they can actually limit the extreme temperature swings experienced on the Martian surface. The same swirling dust that blocks out sunlight also absorbs heat, raising the ambient temperature surrounding Opportunity.
Track the storm and check the weather on Mars anytime.
A dust storm in the Sahara can change the skies in Miami and temperatures in the North Atlantic. Earth scientists keep close watch on our home planet’s dust storms, which can darken skies and alter Earth’s climate patterns.
At NASA, we’re not immune to the effects of climate change. The seas are rising at NASA coastal centers – the direct result of warming global temperatures caused by human activity. Several of our centers and facilities were built near the coast, where there aren’t as many neighbors, as a safety precaution. But now the tides have turned: as sea levels rise, these facilities are at greater risk of flooding and storms.
Global sea level is increasing every year by 3.3 millimeters, or just over an eighth of an inch, and the rate of rise is speeding up over time. The centers within range of rising waters are taking various approaches to protect against future damage.
Kennedy Space Center in Florida is the home of historic launchpad 39A, where Apollo astronauts first lifted off for their journey to the Moon. The launchpad is expected to flood periodically from now on.
Like Kennedy, Wallops Flight Facility on Wallops Island, Virginia, has its launchpads and buildings within a few hundred feet of the Atlantic Ocean. Both locations have resorted to replenishing the beaches with sand as a natural barrier to the sea.
Native vegetation is planted to help hold the sand in place, but it needs to be replenished every few years.
At the Langley Research Center in Hampton, Virginia, instead of building up the ground, we’re hardening buildings and moving operations to less flood-prone elevations. The center is bounded by two rivers and the Chesapeake Bay.
The effects of sea level rise extend far beyond flooding during high tides. Higher seas can drive larger and more intense storm surges – the waves of water brought by tropical storms.
In 2017, Hurricane Harvey brought flooding to the astronaut training facility at Johnson Space Center in Houston, Texas. Since then, we have installed flood-resistant doors, increased water intake systems, and raised guard shacks to prevent interruptions to operations, which include astronaut training and mission control.
The only facility we have that already sits below sea level is Michoud Assembly Facility in New Orleans. Onsite pumping systems protected the 43-acre building, which has housed Saturn rockets and the Space Launch System, from Hurricane Katrina. Since then, we’ve reinforced the pumping system so it can handle twice the water.
Ames Research Center in Silicon Valley is going one step further, gradually relocating farther south and several feet higher in elevation to avoid the rising waters of San Francisco Bay.
Understanding how fast and where seas will rise is crucial to adapting our lives to our changing planet.
We have a long-standing history of tracking sea level rise through satellites like TOPEX/Poseidon and the Jason series, working alongside partner agencies from the United States and other countries.
We just launched the Sentinel-6 Michael Freilich satellite—a U.S.-European partnership—which will use electromagnetic signals bouncing off Earth’s surface to make some of the most accurate measurements of sea levels to date.
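At its heart, that measurement is radar altimetry: the satellite times a radar pulse's round trip to the sea surface and back. In simplified form (ignoring the atmospheric and sea-state corrections an operational mission applies):

$$\text{range} = \frac{c\,\Delta t}{2}, \qquad \text{sea surface height} \approx h_{\text{orbit}} - \text{range}$$

where $\Delta t$ is the measured round-trip time, $c$ is the speed of light, and $h_{\text{orbit}}$ is the satellite's precisely tracked orbital altitude.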
Parachutes are a key part of the landing system for many of our spacecraft, but before we send them into orbit — or beyond — we have to make sure that they’re going to work as designed. One important component of testing is video that captures every millisecond as the chute opens, to see if it’s working and, if not, what went wrong.
Integrated Design Tools built a camera for us that could do just that: rugged and compact, it can film up to 1,000 frames per second and back up all that data almost as fast. Now that same technology is being used to record crash tests, helping ensure that we’re all safer on the roads.
We often use laser-imaging technology, or lidar, on missions in outer space. Thanks to lidar, snow was discovered on Mars, and the technology will soon help us collect a sample from an asteroid to bring home to Earth.
To do all that, we’ve helped make smaller, more rugged, and more powerful lidar devices, which have proven useful here on Earth in a lot of ways, including for archaeologists. Lidar scans can strip away the trees and bushes to show the bare earth—offering clues to help find bones, fossils, and human artifacts hidden beneath the surface.
A screw is a screw, right? Or is it?
When we were building the Space Shuttle, we needed a screw that wouldn’t loosen during the intense vibrations of launch. An advanced screw threading called Spiralock, invented by the Holmes Tool Company and extensively tested at Goddard Space Flight Center, was the answer.
Now it’s being used in golf clubs, too. Cobra Puma Golf built a new driver with a spaceport door (designed to model the International Space Station observatory) that allows the final weight to be precisely calibrated by inserting a tungsten weight before the door is screwed on.
And to ensure that spaceport door doesn’t pop off, Cobra Puma Golf turned to the high-tech threading that had served the Space Shuttle so well.
Neurosurgery tools need to be as precise as possible.
One important tool, bipolar forceps, uses electricity to cut and cauterize tissue. But electricity produces waste heat, and to avoid singeing healthy brain tissue, Thermacore Inc. used a technology we’ve been relying on since the early days of spaceflight: heat pipes. The company, which built its expertise in part through work it has done for us over more than 30 years, created a mini heat pipe for bipolar forceps.
The result means surgery can be done more quickly, more precisely and, most importantly, more safely.
The Ares 1 rocket, originally designed to launch crewed missions to the moon and ultimately Mars, had a dangerous vibration problem, and the usual solutions were way too bulky to work on a launch vehicle.
Our engineers came up with a brand new technology that used the liquid fuel already in the rocket to get rid of the vibrations. And, it turns out, it works just as well with any liquid—and not just on rockets.
An adapted version is already installed on a building in Brooklyn and could soon be keeping skyscrapers and bridges from being destroyed during earthquakes.
When excess fertilizer washes away into groundwater, it’s called nutrient runoff, and it’s a big problem for the environment. It’s also a problem for farmers, who are paying for fertilizer the plant never uses.
Ed Rosenthal, founder of a fertilizer company called Florikan, had an idea to fix both problems at once: coating the fertilizer in special polymers to control how quickly the nutrient dissolves in water, so the plant gets just the right amount at just the right time.
Our researchers helped him perfect the formula, and the award-winning fertilizer is now used around the world — and in space.
The sensor that records your selfies was originally designed for something very different: space photography.
Eric Fossum, an engineer at NASA’s Jet Propulsion Laboratory, invented it in the 1990s, using technology called complementary metal-oxide semiconductors, or CMOS. The technology had been used for decades in computers, but Fossum was the first person to successfully adapt it for taking pictures.
As a bonus, he was able to integrate all the other electronics a camera needs onto the same computer chip, resulting in an ultra-compact, energy-efficient, and very reliable imager. Perfect for sending to Mars or, you know, snapping a pic of your meal.
To learn about NASA spinoffs, visit: https://spinoff.nasa.gov/index.html
The satellite was little, just the size of a small refrigerator. It was only supposed to last one year, and it was constructed and operated on a shoestring budget. Yet it persisted.
After 17 years of operation, more than 1,500 research papers generated and 180,000 images captured, one of NASA’s pathfinder Earth satellites for testing new satellite technologies and concepts comes to an end on March 30, 2017. The Earth Observing-1 (EO-1) satellite will be powered off on that date but will not reenter Earth’s atmosphere until 2056.
“The Earth Observing-1 satellite is like The Little Engine That Could,” said Betsy Middleton, project scientist for the satellite at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.
To celebrate the mission, we’re highlighting some of EO-1’s notable contributions to scientific research, spaceflight advancements and society.
This animation shifts between an image of flooding at the Arkansas and Mississippi rivers on January 12, 2016, captured by EO-1’s Advanced Land Imager (ALI), and an image of the rivers at normal levels on February 14, 2015, taken by the Operational Land Imager on Landsat 8. Credit: NASA’s Earth Observatory
EO-1 carried the Advanced Land Imager (ALI), which improved observations of forest cover, crops, coastal waters and small particles in the air known as aerosols. These improvements allowed researchers to identify smaller features on a local scale, such as floods and landslides – capabilities that proved especially useful for disaster support.
On the night of Sept. 6, 2014, EO-1’s Hyperion observed the ongoing eruption at Holuhraun, Iceland, as shown in the above image. Though partially covered by clouds, the scene shows the extent of the lava flows that had been erupting.
EO-1’s other key instrument, Hyperion, provided an even greater level of detail in measuring the chemical constituents of Earth’s surface – akin to going from the black-and-white television of the 1940s to the high-definition color televisions of today. Hyperion’s level of sophistication doesn’t just show that plants are present; it can actually differentiate between corn, sorghum and many other species and ecosystems. Scientists and forest managers used these data, for instance, to explore remote terrain or to take stock of smoke and other chemical constituents during volcanic eruptions, and to track how they change through time.
EO-1 was one of the first satellites to capture the scene after the World Trade Center attacks (pictured above) and the flooding in New Orleans after Hurricane Katrina. EO-1 also observed the toxic sludge in western Hungary in October 2010 and a large methane leak in southern California in October 2015. All of these events, for which EO-1 provided quick, high-quality satellite imagery, were covered in major news outlets. All of these scenes were also captured because of user requests. EO-1 had the capability of being user-driven, meaning the public could submit a request to the team for where they wanted the satellite to gather data along its fixed orbits.
This image shows toxic sludge (red-orange streak) running west from an aluminum oxide plant in western Hungary after a wall broke on October 4, 2010, allowing the sludge to spill from the factory. This image was taken by EO-1’s Advanced Land Imager on October 9, 2010. Credit: NASA’s Earth Observatory
This image of volcanic activity on Antarctica’s Mount Erebus on May 7, 2004, was taken by EO-1’s Advanced Land Imager after the satellite sensed thermal emissions from the volcano. The satellite gave itself new orders to take another image several hours later. Credit: NASA’s Earth Observatory
EO-1 was among the first satellites to be programmed with a form of artificial intelligence software, allowing the satellite to make decisions based on the data it collects. For instance, if a scientist commanded EO-1 to take a picture of an erupting volcano, the software could decide to automatically take a follow-up image the next time it passed overhead. The Autonomous Sciencecraft Experiment software was developed by NASA’s Jet Propulsion Laboratory in Pasadena, California, and was uploaded to EO-1 three years after it launched.
This image of Nassau, Bahamas, was taken by EO-1’s Advanced Land Imager on Oct. 8, 2016, shortly after Hurricane Matthew hit. European, Japanese, Canadian, and Italian Space Agency members of the international coalition Committee on Earth Observation Satellites used their respective satellites to take images over the Caribbean islands and the U.S. Southeast coastline during Hurricane Matthew. Images were used to make flood maps in response to requests from disaster management agencies in Haiti, Dominican Republic, St. Martin, Bahamas, and the U.S. Federal Emergency Management Agency.
The artificial intelligence software also allows a group of satellites and ground sensors to communicate and coordinate with one another with no manual prompting. In this arrangement, called a "sensor web", if one satellite viewed an interesting scene, it could alert other satellites on the network to collect data during their passes over the same area. Together, they observe and downlink data from the scene more quickly than if they waited for human orders. NASA's SensorWeb software reduces the wait time for data from weeks to days or hours, which is especially helpful for emergency responders.
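To make that coordination pattern concrete, here is a minimal, purely illustrative Python sketch. This is not NASA's actual SensorWeb software; every class, method and scene name here is hypothetical, invented only to show the alert-and-follow-up idea described above.

```python
# Illustrative sketch of a "sensor web" -- NOT NASA's SensorWeb software.
from dataclasses import dataclass, field


@dataclass
class Satellite:
    """One member of the network; it queues follow-up observations."""
    name: str
    tasking_queue: list = field(default_factory=list)

    def detect_event(self, scene: str, network: "SensorWeb") -> None:
        # A satellite that spots an interesting scene alerts the network...
        print(f"{self.name} detected an event at {scene}")
        network.broadcast(scene, source=self)

    def receive_alert(self, scene: str) -> None:
        # ...and each peer queues a follow-up for its next pass overhead,
        # with no human in the loop.
        self.tasking_queue.append(scene)
        print(f"{self.name} queued a follow-up observation of {scene}")


@dataclass
class SensorWeb:
    """Relays one member's detection to all the other members."""
    members: list = field(default_factory=list)

    def broadcast(self, scene: str, source: Satellite) -> None:
        for sat in self.members:
            if sat is not source:
                sat.receive_alert(scene)


# Usage: a single detection automatically tasks every other member.
web = SensorWeb()
web.members.extend([Satellite("EO-1"), Satellite("Landsat 7")])
web.members[0].detect_event("Mount Erebus eruption", web)
```

The point of the pattern is that tasking flows peer to peer instead of through mission control, which is where the weeks-to-hours speedup comes from.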
This animation shows images of the Rodeo-Chediski fire taken one minute apart on July 7, 2002, by Landsat 7 (burned areas in red) and EO-1 (burned areas in purple). This precision formation flying allowed EO-1 to directly compare the data and performance from its land imager against the Landsat 7 ETM+. EO-1’s most important technology goal was to test ALI for future Landsat satellites, a goal accomplished on Landsat 8. Credit: NASA’s Goddard Space Flight Center
EO-1 was a pioneer in precision “formation flying” that kept it orbiting Earth exactly one minute behind the Landsat 7 satellite, already in orbit. Before EO-1, no satellite had flown that close to another satellite in the same orbit. EO-1 used formation flying to run a side-by-side comparison of its onboard ALI with Landsat 7’s operational imager. Today, many satellites that measure different characteristics of Earth, including the five satellites in NASA’s A-Train, are positioned within seconds to minutes of one another to make near-simultaneous observations of the surface.
For more information on EO-1’s major accomplishments, visit: https://www.nasa.gov/feature/goddard/2017/celebrating-17-years-of-nasa-s-little-earth-satellite-that-could
This month hosts the best meteor shower of the year and the brightest stars in familiar constellations.
The Geminids peak on the morning of the 14th and are active from December 4th through the 17th. The peak lasts for a full 24 hours, meaning more worldwide meteor watchers will get to see this spectacle.
Expect to see up to 120 meteors per hour between midnight and 4 a.m., but only under a dark sky. You'll see fewer after moonrise at 3:30 a.m. local time.
In the southern hemisphere, you won't see as many, perhaps 10-20 per hour, because the radiant never rises above the horizon.
Take a moment to enjoy the circle of constellations and their brightest stars around Gemini this month.
Find yellow Capella in the constellation Auriga.
Next, going clockwise, find Taurus at 1 o'clock, with bright reddish Aldebaran and the Pleiades.
At 2 o'clock is familiar Orion, with red Betelgeuse, blue-white Rigel, and the three famous belt stars between them.
Next comes Leo, with its white lionhearted star, Regulus, at 7 o'clock.
Another familiar constellation, Ursa Major, completes the view at 9 o'clock.
There's a second meteor shower in December: the Ursids, radiating from Ursa Minor, the Little Dipper. If December 22nd and the morning of December 23rd are clear where you are, have a look at the Little Dipper's bowl, and you might see about ten meteors per hour. For more, watch the full What’s Up for December video.
There are so many sights to see in the sky. To stay informed, subscribe to our What’s Up video series on Facebook.
On July 23, 1999, the Space Shuttle Columbia blasted off from the Kennedy Space Center carrying the Chandra X-ray Observatory. In the two decades that have passed, Chandra’s powerful and unique X-ray eyes have contributed to a revolution in our understanding of the cosmos.
Check out Chandra’s 20th anniversary page to see how they are celebrating.
From 2009 through 2019, our Operation IceBridge flew planes above the Arctic, Antarctic and Alaska, measuring the height, depth, thickness, flow and change of sea ice, glaciers and ice sheets.
IceBridge was designed to “bridge” the years between NASA’s two Ice, Cloud, and land Elevation Satellites, ICESat and ICESat-2. IceBridge made its final polar flight in November 2019, one year after ICESat-2’s successful launch.
A lot of amazing science happens in a decade of fundamentally changing the way we see ice. Here, in chronological order, are 10 of IceBridge’s most significant and exciting achievements.
The first ICESat monitored ice, clouds, atmospheric particles and vegetation globally beginning in 2003. As ICESat neared the end of its life, we made plans to keep measuring ice elevation with aircraft until ICESat-2’s launch.
ICESat finished its service in August 2009, leaving IceBridge in charge of polar ice tracking for the next decade.
To measure how thick sea ice is, we first have to know how much snow is accumulated on top of the ice. Using a snow radar instrument, IceBridge gathered the first widespread data set of snow thickness on top of both Arctic and Antarctic sea ice.
IceBridge mapped hundreds of miles of grounding lines in both Antarctica and Greenland. Grounding lines mark where a glacier’s bottom loses contact with the bedrock and begins floating on seawater. When the bedrock behind a grounding line sits lower than the grounding line itself, the glacier is more likely to retreat in the future.
The team mapped 200 glaciers along Greenland’s coast, as well as coastal areas, the interior of the Greenland Ice Sheet and high-priority areas in Antarctica.
While flying Antarctica in 2011, IceBridge scientists spotted a massive crack in Pine Island Glacier, one of the fastest-changing glaciers on the continent. The crack produced a new iceberg that October.
Pine Island has grown thinner and more unstable in recent decades, spawning new icebergs almost every year. IceBridge watched for cracks that could lead to icebergs and mapped features like the deep water channel underneath Pine Island Glacier, which may bring warm water to its underside and make it melt faster.
Using surface elevation, ice thickness and bedrock topography data from ICESat, IceBridge and international partners, the British Antarctic Survey created an updated map of the bedrock beneath Antarctic ice.
Taking gravity and magnetic measurements helps scientists understand what kind of rock lies below the ice sheet. Soft rock and meltwater make ice flow faster, while hard rock makes it harder for the ice to flow quickly.
IceBridge’s airborne radar data helped map the bedrock underneath the Greenland Ice Sheet, revealing a previously unknown canyon more than 400 miles long and up to a half mile deep slicing through the northern half of the country.
The “grand canyon” of Greenland may have once been a river system, and today likely transports meltwater from Greenland’s interior to the Arctic Ocean.
After mapping the bedrock under the Greenland Ice Sheet, scientists turned their attention to the middle layers of the ice. Using both ice-penetrating radar and ice samples taken in the field, IceBridge created the first map of the ice sheet’s many layers, formed as thousands of years of snowfall compacted downward into ice.
Making the 3D map of Greenland’s ice layers gave us clues as to how the ice sheet has warmed in the past, and where it may be frozen to bedrock or slowly melting instead.
ICESat-2 launched on September 15, 2018, rocketing IceBridge into the final phase of its mission: Connecting ICESat and ICESat-2.
IceBridge continued flying after ICESat-2’s launch, working to verify the new satellite’s measurements. By conducting precise underflights, where planes traced the satellite’s orbit lines and took the same measurements at nearly the same time, the science teams could compare results and make sure ICESat-2’s instruments were functioning properly.
Using IceBridge data, an international team of scientists found an impact crater left by a meteor strike thousands of years in the past. The crater is larger than the city of Washington, D.C., and was likely created by a meteor more than half a mile wide.
In 2019, IceBridge continued flying in support of ICESat-2 for its Arctic and Antarctic campaigns. The hundreds of terabytes of data the team collected over the decade will fuel science for years to come.
IceBridge finished its last polar flight on November 20, 2019. The team will complete one more set of Alaska flights in 2020.
We only have one universe. That’s usually plenty – it’s pretty big after all! But there are some things scientists can’t do with our real universe that they can do if they build new ones using computers.
The universes they create aren’t real, but they’re important tools to help us understand the cosmos. Two teams of scientists recently created a couple of these simulations to help us learn how our Nancy Grace Roman Space Telescope will unveil the universe’s distant past and give us a glimpse of possible futures.
Caution: you are now entering a cosmic construction zone (no hard hat required)!
This simulated Roman deep field image, containing hundreds of thousands of galaxies, represents just 1.3 percent of the synthetic survey, which is itself just one percent of Roman's planned survey. The full simulation is available here. The galaxies are color coded – redder ones are farther away, and whiter ones are nearer. The simulation showcases Roman’s power to conduct large, deep surveys and study the universe statistically in ways that aren’t possible with current telescopes.
One Roman simulation is helping scientists plan how to study cosmic evolution by teaming up with other telescopes, like the Vera C. Rubin Observatory. It’s based on galaxy and dark matter models combined with real data from other telescopes. It envisions a big patch of the sky Roman will survey when it launches by 2027. Scientists are exploring the simulation to make observation plans so Roman will help us learn as much as possible. It’s a sneak peek at what we could figure out about how and why our universe has changed dramatically across cosmic epochs.
This video begins by showing the most distant galaxies in the simulated deep field image in red. As it zooms out, layers of nearer (yellow and white) galaxies are added to the frame. By studying different cosmic epochs, Roman will be able to trace the universe's expansion history, study how galaxies developed over time, and much more.
As part of the real future survey, Roman will study the structure and evolution of the universe, map dark matter – an invisible substance detectable only by seeing its gravitational effects on visible matter – and distinguish between the leading theories that attempt to explain why the expansion of the universe is speeding up. It will do it by traveling back in time…well, sort of.
Looking way out into space is kind of like using a time machine. That’s because the light emitted by distant galaxies takes longer to reach us than light from ones that are nearby. When we look at farther galaxies, we see the universe as it was when their light was emitted. That can help us see billions of years into the past. Comparing what the universe was like at different ages will help astronomers piece together the way it has transformed over time.
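As a back-of-the-envelope picture (ignoring the expansion of the universe, which complicates the distance-time relationship over billions of light-years):

$$t_{\text{lookback}} \approx \frac{d}{c}$$

so light from a galaxy one billion light-years away left it roughly one billion years ago, and we see that galaxy as it was back then.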
This animation shows the type of science that astronomers will be able to do with future Roman deep field observations. The gravity of intervening galaxy clusters and dark matter can lens the light from farther objects, warping their appearance as shown in the animation. By studying the distorted light, astronomers can study elusive dark matter, which can only be measured indirectly through its gravitational effects on visible matter. As a bonus, this lensing also makes it easier to see the most distant galaxies whose light they magnify.
The simulation demonstrates how Roman will see even farther back in time thanks to natural magnifying glasses in space. Huge clusters of galaxies are so massive that they warp the fabric of space-time, kind of like how a bowling ball creates a well when placed on a trampoline. When light from more distant galaxies passes close to a galaxy cluster, it follows the curved space-time and bends around the cluster. That lenses the light, producing brighter, distorted images of the farther galaxies.
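The bending itself is the classic general relativity result (a standard formula, not specific to Roman): light passing a mass $M$ at a closest approach distance $b$ is deflected by an angle

$$\alpha = \frac{4GM}{c^2 b}$$

so the more massive the cluster, and the closer the light skims past it, the stronger the lensing effect.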
Roman will be sensitive enough to use this phenomenon to see how even small masses, like clumps of dark matter, warp the appearance of distant galaxies. That will help narrow down the candidates for what dark matter could be made of.
In this simulated view of the deep cosmos, each dot represents a galaxy. The three small squares show Hubble's field of view, and each reveals a different region of the synthetic universe. Roman will be able to quickly survey an area as large as the whole zoomed-out image, which will give us a glimpse of the universe’s largest structures.
A separate simulation shows what Roman might expect to see across more than 10 billion years of cosmic history. It’s based on a galaxy formation model that represents our current understanding of how the universe works. That means that Roman can put that model to the test when it delivers real observations, since astronomers can compare what they expected to see with what’s really out there.
In this side view of the simulated universe, each dot represents a galaxy whose size and brightness corresponds to its mass. Slices from different epochs illustrate how Roman will be able to view the universe across cosmic history. Astronomers will use such observations to piece together how cosmic evolution led to the web-like structure we see today.
This simulation also shows how Roman will help us learn how extremely large structures in the cosmos were constructed over time. For hundreds of millions of years after the universe was born, it was filled with a sea of charged particles that was almost completely uniform. Today, billions of years later, there are galaxies and galaxy clusters glowing in clumps along invisible threads of dark matter that extend hundreds of millions of light-years. Vast “cosmic voids” are found in between all the shining strands.
Astronomers have connected some of the dots between the universe’s early days and today, but it’s been difficult to see the big picture. Roman’s broad view of space will help us quickly see the universe’s web-like structure for the first time. That’s something that would take Hubble or Webb decades to do! Scientists will also use Roman to view different slices of the universe and piece together all the snapshots in time. We’re looking forward to learning how the cosmos grew and developed to its present state and finding clues about its ultimate fate.
This image, containing millions of simulated galaxies strewn across space and time, shows the areas Hubble (white) and Roman (yellow) can capture in a single snapshot. It would take Hubble about 85 years to map the entire region shown in the image at the same depth, but Roman could do it in just 63 days. Roman’s larger view and fast survey speeds will unveil the evolving universe in ways that have never been possible before.
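Those two figures imply the speedup directly:

$$\frac{85\ \text{years} \times 365\ \text{days/year}}{63\ \text{days}} \approx 490$$

so Roman would map that region roughly 500 times faster than Hubble at the same depth.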
Roman will explore the cosmos as no telescope ever has before, combining a panoramic view of the universe with a vantage point in space. Each picture it sends back will let us see areas that are at least a hundred times larger than our Hubble or James Webb space telescopes can see at one time. Astronomers will study them to learn more about how galaxies were constructed, dark matter, and much more.
The simulations are much more than just pretty pictures – they’re important stepping stones that forecast what we can expect to see with Roman. We’ve never had a view like Roman’s before, so having a preview helps make sure we can make the most of this incredible mission when it launches.
Learn more about the exciting science this mission will investigate on Twitter and Facebook.