Eyes on the Aurora, Part 3: Exploring Over a Thousand Nights of Aurora on Your Phone

Guest post by Jeremy Kuzub

Attending AGU 2020? Jeremy will be presenting Keogramist as a poster in The MacGyver Session: The Place for Novel, Exciting, Self-Made, Hacked, or Improved Sensors and Software Solutions to Understand Space Weather eLightning on December 15, 2020 at 6:00 AKT/7:00 PT/8:00 MT/9:00 CT/10:00 ET/15:00 UTC. There will be a Q&A element, so bring any questions for Jeremy!

Not attending this year? The information on his poster is publicly available here.


An image of three smartphones shows different pages of the Keogramist mobile site

Figure 1. Exploring all-sky camera video from AuroraMAX using keograms and the immersive first-person viewer at Keogramist.com

This is it, the final part of our three-part tour of how researchers and citizen scientists observe and catalog the aurora!

In Part 1 we looked at all-sky cameras, which are specially designed to capture the entire sky in every frame and record thousands of images a night for months and years on end.

An animated gif shows green aurora playing across and slightly distorted by a fish-eye lens that shows the entire sky.

Figure 2. An all-sky camera’s view of the night sky. Its “fish-eye” lens sees the entire sky from horizon-to-horizon. The northern horizon is at the top of the image, southern at the bottom. (Illustration by author, data from AuroraMAX played faster than real-time)


In Part 2, we looked at “keograms”, a special data visualization that compresses an entire night’s all-sky camera recording into a single image, like a seismograph of the sky, so that specific events can be identified at a glance.

An animation shows how the center slices of each moment in a moving all-sky camera are placed next to each other to create a keogram

Figure 3. Creating a keogram from an all-sky camera timelapse video. The center column of pixels is sampled from each frame, then stacked left-to-right to make a keogram image of the entire night’s activity. (Animation by author, data from AuroraMAX)
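To make that recipe concrete, here is a minimal sketch of the center-column stacking in Python, using the OpenCV and NumPy libraries. It is illustrative only, not the actual Keogramist code, and the video file name is a placeholder:

```python
# Minimal keogram sketch: stack the center pixel column of every
# frame of an all-sky timelapse, left to right. Illustrative only;
# the file name below is a placeholder.
import cv2
import numpy as np

def make_keogram(video_path):
    cap = cv2.VideoCapture(video_path)
    columns = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        center_x = frame.shape[1] // 2
        columns.append(frame[:, center_x, :])  # one north-south slice per moment
    cap.release()
    # Side by side, the slices span the whole night: time runs left to right.
    return np.stack(columns, axis=1)

keogram = make_keogram("auroramax_2019-03-01.mp4")
cv2.imwrite("keogram_2019-03-01.png", keogram)
```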

In Part 3, we’ll look at how a project created by a citizen scientist used these data sources to make a web app that enhances the explorability and immersiveness of all-sky camera imagery.

Aurora observation by citizen scientists

Researchers are not the only ones looking at all-sky cameras and keograms. Aurora research has long been a collaborative effort with citizen scientists: consider the worldwide Visual Observer volunteers during the International Geophysical Year of 1957–58 [1], or the members of the Alberta Aurora Chasers who first identified the STEVE phenomenon in 2016 [2]. As a continued part of this collaboration, many all-sky cameras are accessible online.

The AuroraMAX all-sky camera and archive

One such camera is the AuroraMAX camera in Yellowknife, NWT, maintained by the University of Calgary’s Auroral Imaging Group in cooperation with Astronomy North and the Canadian Space Agency. This is an uncalibrated full-color all-sky camera that provides real-time images approximately every 5 seconds. The AuroraMAX website also features a deep archive of nightly time-lapse videos. For example, here is the night of March 1, 2019, which had active aurora conditions and a clear sky:

A circular image labeled “AuroraMAX” shows waves of aurora somewhat distorted and bulging toward the center.

Figure 4. A single frame from one night’s timelapse video in the AuroraMAX all-sky camera video archive, from the night of March 1, 2019.

The challenge: lots of nights, lots of data

With such a wealth of time-lapse imagery, a big challenge is simply finding your way around the archive. If you don’t know which specific night you are looking for, how do you find moments of aurora activity and filter out nights of cloudy weather? How do you find aurora overhead or on the horizon, recurring patterns throughout the night, or rarer events like red aurora? I spent a lot of time searching through this video archive and felt that there was an opportunity to help people navigate and experience the years of timelapse videos in a new way.

A project created by a citizen scientist: Keogramist

A screenshot of the Keogramist webpage

Figure 5. The Keogramist web app landing page introduces users to the concept of keograms, Kp Index, and the first-person perspective.

As a citizen scientist with a background in computer vision and software development, I envisioned a web app that would help users explore years of AuroraMAX all-sky camera video using keograms, and watch any night’s aurora activity from a first-person view, as if they were there in Yellowknife. In other words, I wanted to take hundreds of hours of video and provide both a big-picture view and a very personal experience for the viewer, using the same data set.

Why on the Web and on Phones?

I wanted this project to reach as many people as possible – this meant it had to work on the devices people use most often to access the internet, which for many is their smartphone [4]. The app had to work on desktops, laptops, tablets, and smartphones, so it had to adapt to whatever screen was available.

On smartphones, apps often need to be installed from an app store. However, modern web browsers are very powerful, and a standard called “Progressive Web Applications” [5] allows a website to offer similar functionality to a downloaded app, without the commitment and hassle of installation.

Seeing hundreds of nights at a glance using a keogram bookshelf

The first task was to make over a thousand nights of video easy to triage at a glance. I wanted users to be able to find the most interesting nights throughout the year and the best aurora moments during those nights, which ebb and flow on the scale of minutes. They should be able to see if any given night was cloudy, had low contrast due to moonlight, or if the all-sky camera dome was obscured by snow. This was a perfect job for keograms, arranged chronologically and split by year:

Keogram images appear in a vertical chart

Figure 6. A stack of keograms as an index of aurora activity. The interface is like a bookshelf, where each shelf is a single night from dusk till dawn. The keograms are aligned by the start and end times of the associated timelapse videos, so that recurring patterns in evening and morning activity can be compared.

The keograms were created with a Python program that processed every frame of each night’s archived timelapse video to produce the keogram image files. Another program used machine vision to extract the corresponding timestamp from the corner of each video frame and stored descriptions of each night’s observations in a “metadata” file that could be used to sort the keograms and timelapse videos.
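As a rough illustration of that second step, the sketch below crops an assumed timestamp region from the first frame and reads it with the pytesseract OCR library, then writes a small metadata file. The region, file names, and fields are assumptions for illustration; the actual pipeline may differ:

```python
# Illustrative sketch: read a burned-in timestamp from a video frame
# and record simple per-night metadata. The corner region, file names,
# and metadata fields are assumptions, not the actual pipeline.
import json
import cv2
import pytesseract

def read_corner_timestamp(frame):
    corner = frame[-40:, :260]                     # assumed timestamp region
    gray = cv2.cvtColor(corner, cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray).strip()

cap = cv2.VideoCapture("auroramax_2019-03-01.mp4")
ok, first_frame = cap.read()
metadata = {
    "night": "2019-03-01",
    "start_time": read_corner_timestamp(first_frame),
    "frame_count": int(cap.get(cv2.CAP_PROP_FRAME_COUNT)),
}
cap.release()
with open("2019-03-01.json", "w") as f:
    json.dump(metadata, f, indent=2)
```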

Now that we had keogram images and context for them, it was time to organize them into an intuitive interface.

Correlating geomagnetic activity with visual aurora

Increased auroral activity is correlated with increased geomagnetic activity, as measured by magnetometers across the globe. Forecasts of this activity use a scale called the “Kp index” [3], analogous to the Richter scale for seismic activity (see this Aurorasaurus blog post). I wanted to align this data with the keogram images using a “heat-map” visualization strip that runs along the bottom of each keogram. This way, the visual aurora can be correlated with the magnetic activity in a more intuitive way.

A color-coded chart of Kp levels

Figure 7. Three broad ranges of activity type, each with strength levels: blues for ‘quiet’ periods, purples for ‘unsettled’, and oranges for ‘storm’ conditions.

This Kp data is available from NOAA’s space weather archive, and a Python program combined it with the keogram description data for use in the app interface. Citizen science often involves combining and processing data from multiple places, and programming in Python opens up a wide range of software tools to help with these tasks.
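To give a sense of what that combination step involves, here is a hedged sketch that turns a night’s 3-hourly Kp values into a heat-map strip. The band thresholds and colors are illustrative, not the app’s exact palette:

```python
# Sketch of rendering Kp values as a heat-map strip under a keogram.
# Thresholds and RGB colors are illustrative placeholders.
import numpy as np

def kp_color(kp):
    """Map a Kp value to an (R, G, B) band color."""
    if kp < 3:
        return (70, 130, 220)   # blues: 'quiet'
    elif kp < 5:
        return (150, 80, 200)   # purples: 'unsettled'
    return (240, 140, 40)       # oranges: 'storm'

def kp_strip(kp_values, width_px, height_px=12):
    """One colored segment per 3-hour Kp interval, stretched to width."""
    strip = np.zeros((height_px, width_px, 3), dtype=np.uint8)
    seg = width_px // len(kp_values)
    for i, kp in enumerate(kp_values):
        strip[:, i * seg:(i + 1) * seg] = kp_color(kp)
    return strip

# Example: a night spanning four Kp intervals, quiet through storm.
strip = kp_strip([2, 3, 5, 6], width_px=800)
```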

Making an Immersive Video Experience

When a user finds an intriguing keogram, they probably want to dive into the data to find out more. I wanted the app to make them feel like they were transported to Yellowknife under that night’s aurora. Practically speaking, this meant showing them a “first-person viewpoint” of the timelapse video from the AuroraMAX camera – turning the circular fish-eye perspective into a 360-degree virtual sky so they could look up and around with a mouse or a fingertip.

The approach I chose was a ‘virtual planetarium’ model. A planetarium is essentially a movie screen, but instead of a flat rectangle, the screen is the inside of a dome. Projecting video onto a dome creates a fully immersive first-person viewpoint because it surrounds the viewers just like the night sky. The projector system (one or more projectors) has to cover the whole dome with imagery to make for a convincing illusion.

Diagram of a projection and projector superimposed on an image of a domed planetarium

Figure 8. A planetarium works by reversing the path of light from the sky to the camera, so that the image is projected back through a fish-eye lens onto a dome, which acts as the sky. (Modern planetariums don’t use just one projector but an array of them; the final effect, however, is the same as a single central projector.)

In the app, I created a virtual dome surface in 3D space and “projected” the AuroraMAX timelapse video onto it. Every 3D environment has a virtual observer’s point of view, and I put this observer right in the center, so that the dome surrounds the viewer. This allows the user to look around in any direction and see what they would have seen had they actually been standing in the same spot as the AuroraMAX all-sky camera that night. There is some fine tuning needed as well: the dome’s geometry has to precisely “undo” the distortion of the AuroraMAX camera’s fish-eye lens. This level of graphics performance is possible because modern web browsers use the hardware-based graphics accelerators on their host devices via a standard called “WebGL”.
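In the app this mapping runs in JavaScript and WebGL, but the core geometry is compact enough to sketch in Python. The sketch assumes an ideal equidistant (“f-theta”) fish-eye projection with the zenith at the image center and north at the top; a real lens would need a calibrated curve:

```python
# Sketch of the dome "un-distortion" geometry: for a sky direction,
# compute which (u, v) point of the circular fish-eye image to sample.
# Assumes an ideal equidistant (f-theta) fish-eye lens; a real lens
# would need a calibrated mapping.
import numpy as np

def sky_direction_to_fisheye_uv(azimuth, elevation):
    """Azimuth/elevation in radians -> (u, v) with the zenith at
    (0.5, 0.5) and the horizon on a circle of radius 0.5."""
    zenith_angle = np.pi / 2 - elevation       # 0 overhead, pi/2 at horizon
    r = 0.5 * zenith_angle / (np.pi / 2)       # equidistant: radius grows linearly
    u = 0.5 + r * np.sin(azimuth)
    v = 0.5 - r * np.cos(azimuth)              # azimuth 0 (north) maps to the top
    return u, v

# Example: a dome vertex due east, 30 degrees above the horizon.
u, v = sky_direction_to_fisheye_uv(np.radians(90), np.radians(30))
```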

A gif shows the viewer "walking" into a virtual dome onto which is projected a very rapid aurora

Figure 9. Early prototype of projecting the all-sky camera video onto a virtual dome to create an immersive experience, using JavaScript 3D code in a web browser. Note that the timelapse video here runs at about 100 times real-time speed (1 frame every 5 seconds); it is slowed down in the final app.

Navigating the night

These timelapse videos projected overhead can look convincing, but I wanted the user to be able to jump to any part of the night to see highlights and interesting features. I decided to put the keogram right at the bottom of the view, like a dashboard, along with the local time. Clicking or tapping on any part of the keogram fast-forwards to that point in time. The video also had to be slowed down a bit, and other fine-tuning controls were needed, so these were exposed as “URL parameters” that advanced users can tweak.
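The seek itself is just a proportional mapping from click position to playback time; a tiny sketch with hypothetical names:

```python
# Sketch of the keogram seek logic: a click's horizontal position maps
# proportionally to a time in the night's timelapse. Names are hypothetical.
def keogram_click_to_video_time(click_x, keogram_width, video_duration_s):
    fraction_of_night = click_x / keogram_width
    return fraction_of_night * video_duration_s

# A click 3/4 of the way across an 800-pixel keogram of a 110-second
# timelapse jumps to late in the night.
t = keogram_click_to_video_time(600, 800, 110.0)   # -> 82.5 seconds
```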

An image shows aurora, trees, and a tent in snow, with a mouse cursor with directional arrows

Figure 10. Virtual trees hide the edge of the video, and a tent and campfire illumination give a little sense of story and presence.

Animated gif showing how a user can turn the camera toward the sky to focus on different areas

Figure 11. Looking straight up at the aurora overhead. Clicking and dragging (or touching and dragging) allows the user to look all around them as the night plays out. This is sped up a bit from the actual app.

A big part of understanding the rhythms and patterns of the aurora is not just time, but direction. Substorms and bursts of activity cause the main body of auroral activity to move north and south over a viewer, so I felt it was important to add a compass to the interface. The approach I chose was an unobtrusive “hula-hoop” that surrounds the user on the snow around them. All-sky cameras are usually aligned with magnetic north, so both true and magnetic directions are indicated.
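Drawing both direction sets comes down to rotating magnetic bearings by the local magnetic declination. A tiny sketch, using a placeholder declination value rather than a surveyed one:

```python
# Sketch of labeling both compass rings: convert a magnetic bearing to
# a true bearing using the local declination. The value below is a
# placeholder for illustration, not a surveyed figure for Yellowknife.
ASSUMED_DECLINATION_DEG = 15.0   # eastward declination, illustrative only

def magnetic_to_true(bearing_magnetic_deg, declination_deg=ASSUMED_DECLINATION_DEG):
    """True bearing = magnetic bearing + eastward declination."""
    return (bearing_magnetic_deg + declination_deg) % 360

print(magnetic_to_true(0))   # magnetic north plots at 15.0 degrees true
```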

An image looking at the ground, with a circle labeled W and N at right angles

Figure 12. Looking down at the compass “hula-hoop”. Both magnetic and true north directions are indicated. Hey look – there is your virtual tent, a reminder of the “virtual dome” technique of projecting the aurora video overhead.

“Cozy” details

Finally, a line of trees and snow, and a tent glowing warmly, make the experience more immersive, as if the viewer were out camping for the night. This is not just for looks – the edges of the dome have artifacts, like buildings and lights, that need to be concealed, and the user needs something to stand on! The tent also acts as a reference point when the user gets a bit dizzy from looking up and around.

Future work

There is still so much to explore with this wealth of data. For example, machine learning and pattern detection could be used to classify and rank auroral activity with users participating in the ranking process. A virtual camera with different lenses and controls could be added to allow users to take ‘photos’ of what they are seeing and understand aurora photography technique. Finally, more all-sky cameras could be added from other locations. Someone might even find a pattern or auroral behaviour that has never been classified before!

Challenge

For part 3, we want to link you to a few specific nights of aurora activity in the app, so you can see the variety of auroral patterns and activity:

  1. March 17, 2015 – The St. Patrick’s Day Storm – a Kp7/8 night with rare strong red aurora emissions from oxygen higher in the atmosphere
  2. August 22, 2017 – A moonless clear night of almost constant late-summer aurora activity from dusk till dawn, with the aurora overhead and violet emissions from nitrogen
  3. April 20, 2018 – the sky clears in time for a strong substorm that fills the sky overhead with coronal aurora. Just before dawn, another substorm creates some amazing coronal aurora patterns overhead
  4. January 5, 2017 – pulsating aurora fill the sky after local midnight
  5. October 30, 2016 – the bottom edge of the aurora curtain is bright magenta from nitrogen emissions. The playback speed is slowed down to really focus on this fast-moving feature

Jeremy’s Bio

Jeremy Kuzub is an interactive software simulation developer and aurora photographer originally from Alberta and now based in Ottawa, Canada. He has created an aurora visualization web app to explore years of AuroraMAX public outreach video of Yellowknife aurora, which can be viewed at keogramist.com, and he writes articles about aurora photography, equipment, and history at capturenorth.com.

More to Explore

Try the Keogramist web app right here

You can read more about all-sky cameras in part 1 of this article series: Eyes on the Aurora, Part 1: What is an All-Sky Camera?

You can read more about keograms in part 2 of this article series: Eyes on the Aurora, Part 2: What is a Keogram?

The famous paper “The Development of the Auroral Substorm” by Syun‐Ichi Akasofu
http://www.ss.ncu.edu.tw/~lyu/lecture_files_en/Lyu_Aurora/Ref_Papers_AuroraSubstorm/Akasofu_1964.pdf

The AuroraMAX website, an amazing resource with years of all-sky camera imagery
https://www.asc-csa.gc.ca/eng/astronomy/auroramax/default.asp

Astronomy North is a northern sky outreach and education website with aurora forecasts
http://astronomynorth.com/

References

  1. “Rockets, Radar, and Computers: The International Geophysical Year”, NOAA
    https://celebrating200years.noaa.gov/magazine/igy/welcome.html
  2. “Introducing Steve – A Newly Discovered Astronomical Phenomenon”, https://www.sciencealert.com/introducing-steve-a-newly-discovered-light-in-the-sky, retrieved November 2, 2020
  3. “What is Kp index?”, http://blog.aurorasaurus.org/?p=187, retrieved November 2, 2020
  4. “Mobile vs. Desktop Usage in 2019”, https://www.perficient.com/insights/research-hub/mobile-vs-desktop-usage-study, retrieved November 2, 2020
  5. “Introduction to Progressive Web Apps”,  https://developer.mozilla.org/en-US/docs/Web/Progressive_web_apps/Introduction, retrieved November 2, 2020

Acknowledgements

Special thanks to Dr. Liz MacDonald and Laura Brandt at Aurorasaurus, Dr. Eric Donovan, Dr. Emma Spanswick, and Darren Chaddock at the University of Calgary Auroral Imaging Group, Dr. Don Hampton at the University of Alaska Fairbanks Geophysical Institute, the Royal Astronomical Society of Canada, and the Alberta Aurora Chasers Group.

