Computational Media Innovation Centre, Victoria University of Wellington
{cmic}@vuw.ac.nz
We present TeleFest, a novel system for live-streaming mixed reality 360° videos to online streaming services. TeleFest allows a producer to control multiple cameras in real time, giving viewers a choice of locations from which to experience the concert, while an intermediate software stack overlays virtual content with coherent illumination that matches the real-world footage.
TeleFest allows real-world concerts to be viewed immersively through a mixed reality 360° live stream controlled in real time by a producer. The system takes multiple 360° video streams from cameras placed around the stage and crowd and presents them to the producer, who can freely choose which stream is shown to the remote audience. The producer can also place and control virtual 3D content around the captured space in real time, adding effects such as weather, pyrotechnics, creatures, and other augmentations that enhance and react to the mood of the performance. These augmentations are realistically lit based on real-world lighting conditions detected from the incoming video streams, seamlessly blending them into their surroundings and making them appear to be a real part of the performance.
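As a rough illustration of the producer-side workflow described above, the following Python sketch models the state a producer manipulates: which camera feed is broadcast and which virtual effects are live. All names here are hypothetical; the actual interface is built as dockable Unity windows, as described below.

```python
from dataclasses import dataclass, field

@dataclass
class Augmentation:
    """A virtual effect placed by the producer (fireworks, weather, wildlife, ...)."""
    asset: str
    position: tuple            # world-space placement around the captured scene
    active: bool = True

@dataclass
class ProducerState:
    """Minimal model of the producer's live controls (hypothetical names)."""
    camera_feeds: list
    active_feed: int = 0
    augmentations: list = field(default_factory=list)

    def switch_feed(self, index: int) -> None:
        # Only the selected feed is composited with virtual content and streamed.
        self.active_feed = index

    def place(self, asset: str, position: tuple) -> None:
        self.augmentations.append(Augmentation(asset, position))

state = ProducerState(camera_feeds=["stage", "crowd-left", "crowd-right"])
state.switch_feed(1)                         # cut to a crowd camera
state.place("fireworks", (0.0, 12.0, 8.0))   # launch fireworks above the stage
```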
TeleFest was implemented using the Unity game engine. We built two main interfaces for camera control and asset management; both are broken into multiple windows so that we can take advantage of Unity's window tab system and rearrange the interface as desired. FFmpeg is used for all video- and audio-related tasks, including decoding the incoming video streams from the 360° cameras. Real-world lighting conditions are then detected from this video and used to coherently illuminate any introduced virtual content via the MR360 Unity plugin. The virtual content is then composited into the currently selected camera stream and rendered as an equirectangular texture, which is re-encoded using FFmpeg and streamed to YouTube.
The manual calibration process aligns the positions of virtual augmentations between multiple cameras. 3D points are found relative to each camera's real-world position; in this case we use the corners of the stage and the large display behind the performers. These points are then matched to the equirectangular image, independently for each camera.
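The calibration step above relies on relating 3D points to pixels in an equirectangular frame. The sketch below shows one standard way to do this mapping; the axis conventions (+Z forward, +Y up) and the absence of a per-camera rotation offset are assumptions, not details taken from the paper.

```python
import numpy as np

def world_point_to_equirect(point, cam_pos, width, height):
    """Project a 3D world-space point (e.g. a stage corner used for calibration)
    onto the equirectangular frame of a 360 camera located at cam_pos.
    Assumes the camera's forward axis is +Z and up is +Y."""
    d = np.asarray(point, dtype=float) - np.asarray(cam_pos, dtype=float)
    d /= np.linalg.norm(d)

    # Longitude (azimuth) and latitude (elevation) of the viewing direction.
    lon = np.arctan2(d[0], d[2])            # range (-pi, pi]
    lat = np.arcsin(np.clip(d[1], -1, 1))   # range [-pi/2, pi/2]

    # Longitude spans the full image width, latitude the full height.
    u = (lon / (2 * np.pi) + 0.5) * width
    v = (0.5 - lat / np.pi) * height
    return u, v

# Example: a stage corner 5 m in front and 2 m left of a camera at the origin,
# projected into a 3840x1920 equirectangular frame.
print(world_point_to_equirect((-2.0, 0.0, 5.0), (0.0, 0.0, 0.0), 3840, 1920))
```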
Virtual fireworks are let off to add emphasis to certain parts of the music. Virtual wildlife is introduced to add a serene feeling to the music; the creatures' red hue illustrates how the virtual lighting is based on real-world conditions, in this case the heavy use of red stage lights.
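To make the idea of lighting virtual content from the footage concrete, the sketch below estimates a rough key-light colour and direction from an equirectangular frame. This is only an illustrative simplification under assumed conventions; MR360 performs a more sophisticated analysis than picking bright pixels.

```python
import numpy as np

def estimate_key_light(equirect_rgb):
    """Rough key-light estimate from an equirectangular video frame:
    the mean colour of the brightest pixels (e.g. red stage lights) and
    the direction of the single brightest pixel."""
    h, w, _ = equirect_rgb.shape
    luma = equirect_rgb @ np.array([0.2126, 0.7152, 0.0722])

    # Colour: average of the top 1% brightest pixels.
    thresh = np.percentile(luma, 99)
    colour = equirect_rgb[luma >= thresh].mean(axis=0)

    # Direction: convert the brightest pixel back into a unit vector
    # (inverse of the equirectangular mapping used for calibration).
    y, x = np.unravel_index(np.argmax(luma), luma.shape)
    lon = (x / w - 0.5) * 2 * np.pi
    lat = (0.5 - y / h) * np.pi
    direction = np.array([np.cos(lat) * np.sin(lon),
                          np.sin(lat),
                          np.cos(lat) * np.cos(lon)])
    return colour, direction
```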
The TeleFest live stream lasted approximately three hours and consisted of several live musical performances of varying genres. It was watched by 1,908 online viewers, who could experience the performance from the crowd, from the stage, or via a curated experience controlled by a producer in real time that included camera switching and augmented content. The results of an online survey completed by virtual and physical attendees of the festival are presented, showing positive feedback for our setup and suggesting that the addition of virtual and immersive content to live events could lead to a more enjoyable experience for viewers.
@InProceedings{young2022telefest,
title = {TeleFest: Augmented Virtual Teleportation for Live Concerts},
author = {Young, Jacob and Thompson, Stephen and Downer, Holly and Allen, Benjamin and Pantidi, Nadia and Stoecklein, Lukas and Rhee, Taehyun},
publisher = {Association for Computing Machinery},
booktitle = {ACM International Conference on Interactive Media Experiences},
year = {2022}
}
This project was supported by Victoria University of Wellington, K-Festival, and the Entrepreneurial University Programme by the Tertiary Education Commission in New Zealand.