Behind the Scenes

2021-11-16 / Rob Spearman / Technique

We recently created a short feature highlight video (above) to play at the Great Lakes Planetarium Conference in Kalamazoo, MI. I thought I would share a little of the background work to better explain what we were trying to show and how we did it.

First of all, let me just say that we were so excited to have everyone get together again in person after the long pandemic restrictions (Karrie attended, I did not). We wanted to show off our simulation software, Nightshade NG, in the host dome, but that had to be through a pre-recorded video. We particularly wanted to emphasize how easy it is to do everything LIVE, which is a bit of a challenge to communicate in... a pre-recorded video.

What we decided to do was record a script (StratoScript) using a gamepad for movement and an iPad running the Universal Console for turning features like planet orbits on and off and loading datasets. This would give a good sense of live use, and we could edit the script as needed and play it back to output 4k fulldome frames. We intentionally did not use traditional keyframe animation, since we were trying to demonstrate live flight.
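To give a sense of what that recording produces: a StratoScript file is just plain text, a list of timed commands, so the gamepad flight and the console button presses end up as lines you can read and edit afterward. The snippet below is only an illustrative sketch in the general StratoScript style; the exact command names, and how Nightshade NG records flight paths, may differ from what is shown here.

    # turn planet orbit lines on from the console
    flag planet_orbits on
    wait duration 2

    # select a target before flying toward it
    select planet Jupiter
    wait duration 5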

What went wrong?

OK, a few things went wrong with our plan! I was using a development version of Nightshade NG and discovered some gamepad recording bugs as we were working. Gamepad recording is a relatively new feature that we added to support live Domecasting. On the plus side, I could fix these problems and keep going. (These fixes have already been pushed out in a patch release.)

Next, it became apparent that trying to make an elegant and very short trip through the universe is a bit outside typical live use. People expect a demo video to look extremely polished, and they aren't sitting there engrossed in a conversation with the presenter. As they say, speed kills: I had to keep track of what Karrie wanted me to do next while flying around fairly quickly.

We ended up recording until we made a mistake, then recording again from that break point. I think we did three or four segments in total.

What went well?

After recording the script, we went through and optimized the timing for turning on data layers, labels, and so on. We also added the intro and outro screens.
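In practice that mostly meant editing the recorded wait values and nudging flag commands so a layer and its labels come on together. Again, this is just an illustrative sketch rather than lines from the actual script:

    # tighten a pause and bring labels on with the layer
    flag planet_orbits on
    flag planet_names on
    wait duration 3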

Then we rendered out 4k fulldome frames. From feedback we realized we needed to adjust our star settings, so we tweaked these and rendered again. Because everything was driven by the script, updates like that were very easy to make.

Karrie wanted the demo to be shorter, so I changed the framerate to be about 1/3 faster, which trims the runtime to roughly three quarters of the original. I did this at the render step, but it could also have been done at a video encoding step.

Final Thoughts

Overall, I think it gives a quick view of some of our feature highlights while also giving the feel of a live presenter. It is definitely not a pure recording, but neither is it a fully polished demo video.

One other thing you can hear in the video above is one of our soundscapes: adaptive music compositions we commissioned for our planetarium systems. As you fly around, the soundscape changes with you, adding a subtle extra dimension to your shows.

About the Author

Rob is President and co-founder of Digitalis. He splits his time between product development and management responsibilities.
