How WePlay Studios is Transforming the Future of Live Event Storytelling

May 2, 2024

The global esports market is booming and projected to reach $9.29 billion by 2032[1], as the active esports player count continues to expand worldwide and championship matches like the 2023 League of Legends World Championship Tournament draw in audiences topping 6.4 million viewers. While esports arena patrons contribute to these audience numbers, most viewers tune in via a live stream. With such a large global fanbase watching from afar, the pressure is on to create exceptional quality live productions, which is WePlay’s sweet spot. We recently sat down with the content production company’s Head of Virtual Production Aleksii Gutiantov to talk about their innovative live event production work, including a fully virtual venture it took on for The VTuber Awards 2023 that was supported by a ton of AJA gear. We’ve compiled key interview highlights below:

Tell us more about WePlay Studios.
We’re an award-winning content production company (previously known as WePlay Esports) that fuses gaming, technology, and storytelling to craft second-to-none viewer experiences. We’ve made our mark by organizing memorable gaming shows and esports tournament experiences for top-rated titles like Dota 2, CS:GO, Valorant, and Rocket League. Our efforts have earned us a Sports Emmy Award nomination and many other accolades, including NYX, Muse, and Reagan Video Awards. Presently, we operate in Europe and North America, with dual headquarters in Kyiv, Ukraine, and Los Angeles, California.

What kind of clients do you serve? 
Our focus is on producing creator-driven gaming shows, a venture bolstered by strategic partnerships with OTK streamers, who hold 7 percent of the entire Twitch gaming audience, and Grammy-winning music producer Larrance Dopson. These collaborations have led to additional work on projects like NFL Tuesday Night Gaming; the esports season for Genshin Impact; launch shows for new miHoYo game releases; and gaming shows with OTK, including the OTK Game Expo, Wheel of Chaos, and the Awards.

What makes WePlay unique? 
Storytelling and technological innovation drive every show we do, and we pride ourselves on creating iconic content that leaves a lasting viewer impression. We believe that while 80 percent of an esports broadcast might naturally focus on the game, the remaining 20 percent offers an opportunity to create a truly memorable and engaging audience experience. Unlike many esports events that follow a standard template with a familiar aesthetic and structure, WePlay's events feature distinct identities; we delve beyond the games to tap into additional audience interests and weave in compelling narratives that transform broadcasts into adventures. For our AniMajor event, we married the look of Dota 2 with anime, resulting in an exceptional quality production featuring anime-styled team intros, an opening ceremony teeming with Easter eggs, and bespoke content specifically crafted for the tournament.

Describe your history and current role with the company. 
I lead the Virtual Production Department, where I introduce innovations to WePlay and enhance our virtual production and augmented reality (AR) offering across live broadcasts and gaming shows. I joined WePlay in 2018, initially as a freelancer specializing in virtual production and augmented reality for broadcasts, and have since worked closely with Maksym Bilonogov and Yura Lazebnikov on a number of small-scale esports events. It was during these early collaborations that WePlay began to explore the use of virtual production techniques to create the added value of real-time game analytics and data-driven AR graphics for broadcasts.

This successful partnership eventually led to my promotion to head of the Virtual Production Division. Over the years, I've steered the integration of new virtual production technologies across our global studios, which has transformed the way we tell stories for live events. It’s enabled us to add AR extensions when filming practical set locations. Such advancements have made it possible for even the smallest analytical studios at minor events to present game statistics to our audience as a visually engaging, easily understandable story.

I've also devoted a lot of time to developing cost-efficient methods for creating render clusters to handle real-time content processing for broadcasts. This involves customizing our approach to meet unique venue demands, enhancing VP technology efficiency, and ensuring high-quality production amidst the fast-paced and challenging environment of live esports tournaments. My responsibilities today include ensuring smooth and continuous content production for a broad range of WePlay projects; overseeing the management, design, and optimization of technological routines within my Virtual Production Division; developing the department's vision and annual budget; handling procurement through tender procedures; and managing collaboration with foreign contractors. Part of my role also involves providing our team with training and support on all of our technology, including our AJA tools.

Are there any projects you can tell us about? 
The VTuber Awards 2023, hosted by Filian in partnership with talent management agency Mythic, comes to mind. It wasn’t just a production but a milestone technology achievement. The five-hour show blended physical production facilities with extensive engineering and design. While we’d previously incorporated AR into live productions, this show marked our first foray into a fully virtual event managed with virtual cues; it’s the most challenging technological endeavor I’ve ever taken on. I managed and coordinated everything remotely from Europe on my laptop, communicating with more than 16 team members over intercom and orchestrating eight days of non-stop pre-production to deliver the multi-hour broadcast.

What did the project involve? 
We facilitated real-time rendering of a fully virtual VTuber character into a live virtual production using Vicon technology with 20 witness cameras for comprehensive, full-body performance capture, including precise finger movements, combined with ARKit for facial mocap data streaming. Our initial ambition was to craft a virtual character that could harmonize with and stand out from the event stage's visual ambiance, and amplify it with dynamic lighting effects.
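
To make the data flow concrete, here is a minimal sketch of what a receiver for streamed facial capture data might look like. It assumes a simple JSON-over-UDP packet format; the port, packet layout, and apply_to_character hook are all hypothetical illustrations, not WePlay's actual pipeline (ARKit face capture is typically carried into the engine by dedicated tools such as Live Link).

```python
import json
import socket

# Hypothetical UDP listener for per-frame facial blendshape data.
# ARKit-style face capture produces ~52 blendshape coefficients in [0, 1];
# the packet format here is an assumption, not WePlay's actual protocol.
LISTEN_ADDR = ("0.0.0.0", 9000)  # port is illustrative

def apply_to_character(timecode, blendshapes):
    # In production this would drive the rigged face of the virtual
    # character inside the real-time engine, time-aligned with the
    # Vicon body-capture stream; here we just print one coefficient.
    print(timecode, blendshapes.get("jawOpen", 0.0))

def run_receiver():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    while True:
        payload, _ = sock.recvfrom(65535)
        # Assumed frame shape: {"timecode": "...", "blendshapes": {...}}
        frame = json.loads(payload)
        apply_to_character(frame["timecode"], frame["blendshapes"])

if __name__ == "__main__":
    run_receiver()
```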

We used AJA technology for the preview infrastructure. Due to the unique setup of our arena and its preview infrastructure requirements, we employed cross-converters to down-convert 12G-SDI signals and forward 3G-SDI signals to an AJA KUMO 3232 video router. Then, through AJA HD5DA SDI distribution amplifiers, we spread preview signals across all arena monitors. This configuration allowed for straightforward management of all preview signals via a user-friendly web interface, with pre-programmed salvo routing configurations for SDI signals giving us precise control over what our partners, talent, camera operators, motion capture team, and the entire production crew saw at any moment, regardless of the data source. We also used AJA ROI-DP DisplayPort to SDI Mini-Converters to bring duplicated computer monitor feeds into the broadcast pipeline, handling the DisplayPort to SDI conversion with region-of-interest scaling.
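
For a sense of how salvo-style routing can be driven programmatically, here is a rough sketch against a KUMO-style HTTP control interface. The eParamID_XPT_* parameter naming follows the pattern exposed by AJA's KUMO web control, but the router address, destination/source assignments, and salvo contents are illustrative; verify the exact parameter names against your router's firmware and manual.

```python
import requests

KUMO_IP = "192.168.1.50"  # illustrative router address

# A "salvo" here is just a saved set of destination <- source crosspoints.
PREVIEW_SALVO = {
    1: 12,  # destination 1 (talent monitors)    <- source 12 (program preview)
    2: 12,  # destination 2 (camera op monitors) <- same preview feed
    3: 7,   # destination 3 (mocap team)         <- source 7 (engine output)
}

def route(destination: int, source: int) -> None:
    # Set a single crosspoint via the router's HTTP parameter interface.
    requests.get(
        f"http://{KUMO_IP}/config",
        params={
            "action": "set",
            "paramid": f"eParamID_XPT_Destination{destination}_Status",
            "value": source,
        },
        timeout=2,
    )

def fire_salvo(salvo: dict) -> None:
    # Apply every crosspoint in the salvo in one pass.
    for dst, src in salvo.items():
        route(dst, src)

if __name__ == "__main__":
    fire_salvo(PREVIEW_SALVO)
```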

Tell us more about the result for viewers and what else you did to achieve it. 
The stream showcased a vast arena setting, but Filian's virtual character was effectively placed on a smaller stage encircled by screens: a digitally reconstructed version of WePlay's physical LA Arena. In reality, camera operators managed three cameras that were synced to virtual cameras, ensuring every physical pan, tilt, and focus pull translated directly into the virtual render environment. Operators on the physical set could also switch among various angles within the virtual stadium using iPads connected to virtual cameras, creating the illusion of a dozen cameras instead of three.
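
As a simplified illustration of that physical-to-virtual mapping, the sketch below mirrors tracked pan/tilt/focus data onto a virtual camera and jumps it between pre-placed mount points, the way an iPad-driven angle switch might. The types, coordinates, and names are all hypothetical stand-ins, not WePlay's actual tracking stack.

```python
from dataclasses import dataclass

@dataclass
class TrackingFrame:
    """One frame of data from a tracked physical camera head (illustrative)."""
    pan_deg: float
    tilt_deg: float
    focus_m: float  # focus distance in meters

@dataclass
class VirtualCamera:
    position: tuple  # mount point in hypothetical stadium coordinates
    yaw_deg: float = 0.0
    pitch_deg: float = 0.0
    focus_m: float = 10.0

# Three physical cameras, each mapped to several virtual mount points;
# switching the active mount makes three cameras feel like a dozen.
MOUNTS = {
    "cam1": [(0, -20, 3), (0, -40, 8)],
    "cam2": [(15, 0, 3), (30, 0, 12)],
    "cam3": [(-15, 0, 3), (-30, 0, 12)],
}

def apply_tracking(frame: TrackingFrame, cam: VirtualCamera) -> None:
    # Mirror every physical pan, tilt, and focus pull 1:1 in the render.
    cam.yaw_deg = frame.pan_deg
    cam.pitch_deg = frame.tilt_deg
    cam.focus_m = frame.focus_m

def switch_mount(cam: VirtualCamera, physical_id: str, mount_index: int) -> None:
    # An iPad UI would call something like this to jump the virtual
    # camera between pre-placed vantage points in the stadium.
    cam.position = MOUNTS[physical_id][mount_index]

# Example: track camera 1, then cut it to its second virtual vantage point.
cam = VirtualCamera(position=MOUNTS["cam1"][0])
apply_tracking(TrackingFrame(pan_deg=12.5, tilt_deg=-3.0, focus_m=6.2), cam)
switch_mount(cam, "cam1", 1)
```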

We linked the physical stage lights to corresponding virtual lights, which allowed us to manipulate the virtual stadium's lighting environment from the real-world lighting control console. Video playback was also integrated into the virtual world, with software for live event visuals connected to the virtual venue used to launch and control the graphics displayed on the virtual stage's screens. A critical part of our 12G-SDI broadcast pipeline involved AJA video I/O boards, specifically the KONA 5. These were instrumental in allowing us to receive 12G-SDI signals, integrate them into an Unreal Engine 5 environment, and composite the final output in SDI. It’s the best product on the market in my experience. The final SDI was forwarded to the AJA KUMO 3232-12G video router for availability across the entire broadcast pipeline, which provided unparalleled performance and reliability.
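
The light linking boils down to translating console output into virtual light parameters. Here is a toy example of mapping a DMX fixture's channel values to a virtual light's intensity and color; the four-channel fixture footprint and zero-based channel addressing are assumptions, since real patch layouts vary per console and fixture profile.

```python
def dmx_to_virtual_light(dmx_frame: bytes, base_channel: int) -> dict:
    """Map a 4-channel fixture (dimmer, R, G, B) to virtual light settings.

    The fixture footprint and channel addressing are assumptions; the
    returned dict would feed the engine-side light in a real pipeline.
    """
    dimmer = dmx_frame[base_channel] / 255.0
    r = dmx_frame[base_channel + 1] / 255.0
    g = dmx_frame[base_channel + 2] / 255.0
    b = dmx_frame[base_channel + 3] / 255.0
    return {"intensity": dimmer, "color": (r, g, b)}

# Example: a 512-byte DMX universe with our fixture patched at channel 0.
universe = bytearray(512)
universe[0:4] = bytes([204, 255, 128, 0])  # 80% dimmer, warm orange
print(dmx_to_virtual_light(bytes(universe), 0))
```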

What do you like most about your AJA gear? 
The KONA 5 allows us to leverage the remarkable power of Epic Games’ Unreal Engine to create a comprehensive virtual production hub capable of handling 12G-SDI workflows. Its user interface is simple to understand and control, even amidst the pressures of live production, and we love that we can preview last-minute changes in real time. This lets us fully harness the potential of AR technology, from camera tracking to motion capture and data-driven graphics, while ensuring flawless live virtual production broadcasts without technical mishaps in compositing. The card also enables us to produce UltraHD fill and key signals from a single card in all known formats; for 4K, we use Pixotope as a keyer with the same failover features we know from FHD workflows.

What’s more, KONA 5 offers up to four reconfigurable I/Os, from SD to 4K, along with support for AES/EBU, LTC, and RS-422/GPI. The card’s multi-channel hardware processing is key for transforming video from interlaced to progressive formats when we work in regions like Saudi Arabia or China, as it accelerates compute-heavy operations like motion-adaptive deinterlacing. It also lets us combine multiple video sources into a single output in Unreal Engine 5, up/down/cross scale, and mix/composite at all resolutions. These capabilities are essential for handling video content of any resolution and ensuring the final output meets broadcast quality standards.

Our Ki Pro Ultra 12G cluster also plays an important role in our setup. These recorders can capture four channels of simultaneous HD or a single channel of UltraHD, with the ability to swap out recording media on the fly, which is convenient and reliable for long-format live broadcasts. When client recordings require high-bitrate UltraHD materials for post, our Ki Pro Ultra 12Gs are indispensable, and they’ve enabled smooth operation throughout live events running more than twelve hours. The flexibility and reliability of the Ki Pro recorders ensured we could meet the high-quality recording standards the project demanded, without interruptions or compromises.

With our KUMO routers, it’s easy to build infrastructure for large remote and on-site productions and manage everything from a single web interface thousands of kilometers away from the matrix itself. They offer an exceptionally convenient user interface and the ability to save pre-programmed salvo routing configurations for SDI signals. I've been working with these products since 2017 on various projects, and they've never let me down.

Which technology trends are you following? 
We’re very interested in technology that drives innovation across the global virtual production market, from AI-driven background characters and landscapes to large, 360° LED volumes, and live motion capture, as well as cluster rendering, DMX-driven lighting in Unreal Engine, and various camera tracking solutions.

What’s on the horizon for WePlay? 
Virtual production is revolutionizing how we experience entertainment, offering audiences the chance to become part of the game scenes. This level of interactivity opens up exciting new possibilities for live entertainment genres, blurring the lines between viewers and the virtual worlds we create. WePlay is not just staying within the confines of the gaming industry; we're branching out to music and broader entertainment directions. We're currently in the early stages of planning and discussions for projects that straddle these new frontiers.

About AJA Video Systems 
Since 1993, AJA Video Systems has been a leading manufacturer of cutting-edge technology for the broadcast, cinema, proAV, and post production markets. The company develops a range of powerful, flexible video interface and conversion technologies, digital video recording solutions, and color management, streaming, and remote production tools. All AJA products are designed and manufactured at our facilities in Grass Valley, California, and sold through an extensive sales channel of resellers and systems integrators around the world. For further information, please see our website at www.aja.com.

 All trademarks and copyrights are the property of their respective owners. 

Media Contact:
Katie Weinberg
Raz Public Relations, LLC 
310-450-1482, aja@razpr.com