Jackie Faherty at Hayden Planetarium

Jackie Faherty using OpenSpace software to present an early release of Gaia data to an audience in New York’s Hayden Planetarium.


For the past three weeks in this space, we’ve been ruminating on the nature and role of planetariums. Planetarium presenter Ethan Scardina kicked things off with pretty strong opinions on the merits of digital planetariums. Morrison Planetarium Assistant Director Bing Quock stepped in to defend the more traditional “opto-mechanical” planetariums, especially the one built by the Academy in the early 1950s. And last week, Manager of Planetarium Engineering Dan Tell took things a little more meta by emphasizing the experience over the technology. Now I’m chiming in with a somewhat different perspective—breaking news on how planetariums can support modern astronomical research.

Back in the day, scientists actually used opto-mechanical planetariums for research. Right here at (the original) Morrison Planetarium in the early 1960s, William Hamilton III studied bobolink navigation for his postdoctoral work. And a decade later, Cornell University researcher Stephen Emlen furthered work in bird migration, confirming that indigo buntings could navigate by the stars—and using planetarium technology to trick birds into believing that Betelgeuse was the north star, among other things. And more recently, opto-mechanical planetariums have been used to reconstruct indigenous knowledge of celestial navigation by Polynesians.

What modern, digital planetariums offer is the opportunity to integrate astronomical data sets into their immersive spaces. Manipulation of these data can take place in real time, enabling researchers to visualize complex astronomical datasets and search for meaningful patterns. Sure, astronomers can do this with their laptops, too, but we believe that planetariums offer a different, more powerful perspective—and enable collaboration as well. One can imagine a group of astronomers sitting under the same dome, discussing and iterating on ideas while manipulating data in an immersive environment.

In just the past few years, as we’ve hosted lecturers and created shows for Morrison Planetarium, we’ve received anecdotal feedback that our immersive visualizations offer new perspectives on familiar data. Planetary scientists have gathered in our dome to take virtual flights over the surfaces of the Moon and Mars, with visuals driven by data from the latest NASA missions—and many have commented that they see relationships that they have missed on their wimpy little computer monitors. (Bow to the might of the immersive dome display!) And when we worked with data from a computational simulation done under the guidance of (infamous Pluto Killer) Mike Brown, the resulting fulldome planetarium visualization prompted him to comment that he’d noticed things in the simulation that he’d never seen before.

What has remained elusive is a means by which to get astronomers and planetary scientists—who are conversant in the analytical software tools of their trade—more familiar with the real-time software used in planetariums. A small but meaningful breakthrough came this past summer at a gathering entitled “Astrographics: Interactive Data-Driven Journeys through Space,” held in Dagstuhl, Germany, at the end of June. Programmers took a first pass at creating connections between two powerful open-source tools: glue, an analytical tool astronomers (and others) use to bring together and analyze multiple spatial data sets, and OpenSpace, a NASA-funded three-dimensional visualization platform that runs on everything from laptops to planetarium domes.
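To give a flavor of the glue side of that bridge (the OpenSpace connection from the Dagstuhl summit isn’t shown here), the sketch below uses glue’s Python API to load two hypothetical catalogs and link their sky coordinates so that a selection in one propagates to the other. The catalog names, column names, and values are made up purely for illustration.

```python
# A minimal sketch of glue's Python API: load two hypothetical catalogs and
# link their RA/Dec columns so selections propagate between them. This is an
# illustration only, not the glue-OpenSpace bridge prototyped at Dagstuhl.
from glue.core import Data, DataCollection
from glue.core.link_helpers import LinkSame

# Two toy catalogs with made-up values; real work would load FITS/CSV tables.
stars = Data(label='gaia_subset',
             ra=[10.10, 10.22, 10.35],
             dec=[-5.01, -5.12, -5.08],
             parallax=[2.3, 1.8, 0.9])
targets = Data(label='followup_targets',
               ra=[10.22], dec=[-5.12])

dc = DataCollection([stars, targets])

# Declare that the RA and Dec columns in both datasets describe the same
# quantities, so a subset defined on one catalog maps onto the other.
dc.add_link(LinkSame(stars.id['ra'], targets.id['ra']))
dc.add_link(LinkSame(stars.id['dec'], targets.id['dec']))

# From here, a viewer -- glue's desktop application, or a bridge out to a
# fulldome platform such as OpenSpace -- could consume this collection and
# render linked selections in real time.
```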

Not bad for a two-and-a-half-day summit! And it came at a perfect time.

Once a decade, the professional astronomical community takes a step back from the day-to-day to evaluate where astronomy is going, and how the community can make the most effective decisions over the next 10 years. We call this process the decadal survey, and while much of the attention is focused on the really ginormous projects—such as what missions NASA should invest in or where to build the next ridiculously huge telescope—the process also offers the opportunity to submit “white papers” that highlight more modest (but nonetheless strategic) concepts, bringing new and innovative ideas into the bigger picture. The deadline for this decade’s white papers? July 10th, just a few weeks after the gathering in Germany…

A few of us who attended the Dagstuhl summit, led by Jackie Faherty from the American Museum of Natural History in New York, put our heads together to assemble something in time for the deadline. The resulting white paper, entitled “IDEAS: Immersive Dome Experiences for Accelerating Science” (because every proposal needs a good acronym), got submitted right on time, although there’s a long process ahead to see how much traction it gets.

Regardless, the idea (or rather IDEAS) is out there. And with the first few baby steps taken by programmers at a small summit in Germany, maybe we’ll see progress on tighter integration of these tools.

For me personally, this represents an opportunity to achieve something that’s been on my mind for decades. When I first started working in an early digital planetarium—moonlighting as a graduate student at Rice University—I spent my off hours bringing three-dimensional data into the system, trying to figure out how to show the time evolution of, for example, galaxies colliding, and generally fiddling with software that offered certain advantages over the tools I was using to do astronomical research. So I hope our community can figure out how to take planetariums into a new, complementary paradigm, supporting research as much as research supports our educational efforts.

About the Planetarian


Ryan Wyatt assumed his role as Senior Director of Morrison Planetarium and Science Visualization at the California Academy of Sciences in April 2007. He has written and directed the Academy’s six award-winning fulldome video planetarium programs: Fragile Planet (2008), Life: A Cosmic Story (2010), Earthquake (2012), Habitat Earth (2015), Incoming! (2016), and Expedition Reef (2018). All six shows are science documentaries that rely on visualization to tell their stories, but topics range from astronomy to geology, ecosystem science, and conservation. Trained as an astronomer, Wyatt has worked in the planetarium field since 1991; prior to arriving in San Francisco, he worked for six years as Science Visualizer at the American Museum of Natural History in New York City. Wyatt is cofounder and vice president of Immersive Media Entertainment, Research, Science, and Art (IMERSA), a professional organization dedicated to advancing the art and technology of immersive digital experiences. He served as co-chair of the 2019 Gordon Research Conference on Visualization in Science and Education (GRC/VSE) and as vice co-chair of the 2017 GRC/VSE.
