BSC 2023: Sony Exec on virtual production tools

Last weekend, the British Society of Cinematographers (BSC) hosted its Expo 2023 on February 17th and 18th. At the event, various suppliers of camera, lighting and film production solutions presented the latest audiovisual recording equipment.

Extended Reality (XR) technology is reaching numerous industries, from healthcare to education, and is now making inroads into the media and broadcast sector as a new vertical. Virtual production tools are an emerging talking point for many audiovisual capture teams.

Using virtual production tools, vendors are proving that XR technology has distinct use cases and applications outside of headset and smartphone-based solutions.

What is virtual production?

Virtual production solutions are emerging as a new tool for the film, media and broadcast markets. Many industry innovators are using immersive technology to replace or enhance cinematic techniques such as backdrops and in-vehicle footage.

Virtual production hardware allows on-set staff to use immersive solutions such as high-quality LED walls or volumes to create realistic environments for actors.

The technology enables media production specialists and department heads to save time and money with XR, for example by displaying real-time 3D (RT3D) environments as an immersive backdrop.

The actors and camera crew can see a volume display showing an RT3D background that reacts to camera movements to allow for a realistic, dynamic and flexible background solution.
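The geometry behind a camera-reactive backdrop can be illustrated with a minimal sketch. This is not Sony's implementation; the function name and numbers are illustrative assumptions. The idea is that the tracked camera's position and field of view determine which slice of the LED wall must render perspective-correct content (the "inner frustum"):

```python
import math

def inner_frustum_width(distance_m: float, hfov_deg: float) -> float:
    """Width of the wall region covered by the camera's horizontal
    field of view, for a camera facing a flat LED wall at the given
    distance. As the camera moves, this region is re-rendered with
    the correct perspective for the new pose."""
    return 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)

# A camera 4 m from the wall with a 60-degree horizontal FOV covers
# roughly a 4.6 m wide slice of the volume:
print(round(inner_frustum_width(4.0, 60.0), 2))  # 4.62
```

In a real volume, the tracking system feeds the full camera pose and lens data to the render engine every frame; this sketch only shows why the displayed region must change as the camera moves.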

Other immersive and virtual production tools are available from industry providers, ranging from MR studios to volumetric capture and AR-enhanced broadcasting. Augmented Reality (AR) visualizations are already a popular form of branded media delivery for broadcasters, and many sports broadcasting companies use AR to enhance programming.

Sony Volume Stages and virtual production tools

Sony is pushing into the virtual production market. The company produces a premium immersive production ecosystem that combines Sony's Crystal LED display technology with its Venice camera capture technology to create an RT3D background solution.


The combined hardware solutions use Unreal Engine to import and display an immersive environment behind a subject. The volume product and the Venice camera system work together to create a dynamic and responsive studio setting.

Sony recently opened a virtual production studio in Seine-Saint-Denis, north of Paris, France. The company founded the studio together with project partners Plateau Virtuel and Studios de France. The move allows film production companies to use a studio fully equipped with Sony's virtual production solutions.

The location hosts a 90m² Sony Crystal LED B-Series screen, fully optimized and ready to deliver unrivaled picture quality for producers and cinematographers.

At the time, Yasuharu Nomura, General Manager and VP of Business Division at Sony Corporation, said:

We are the only company in the world that offers both LED panels and cinema cameras. We know every technical specification and how best to use them. To maximize the potential of these two solutions, the engineering teams developed and designed both in close cooperation.

In addition, the collaboration between the three companies includes the establishment of a “laboratory” studio that encourages innovation in audiovisual technology.

Content Acquisition Solutions Specialist Daniel Listh spoke to XR Today at BSC Expo 2023 to showcase Sony’s virtual production ecosystem and the impact of immersive technologies on film, media and broadcast production solutions.

Introducing Sony’s XR filmmaking ecosystem

Daniel Listh: As a Sony brand, we’re definitely not among the cheapest out there, especially when it comes to our current range of display technology.

But we give you a premium product. We offer very low pixel pitches. A pixel pitch of 1.2mm or 1.5mm is incredibly low when it comes to building such a modular volume.

We have color accuracy that extends from the display through camera capture, creating harmony between what the people viewing the screen see and what the camera records, so you capture exactly what is seen with the eyes.


It also looks identical on camera, so you don’t have to change the color contrast in post[-production] too much because everything is done and believable on the set.

Because you have a small pixel pitch, sets can be much smaller, meaning people can have a smaller studio and participate in virtual production services.

You don’t need, as I like to say, a huge volume in which to fit spaceships. With our solutions you can opt for something much smaller and get closer to the subject, because with a 1.5mm or 1.2mm pixel pitch you can move much closer to the screen before you get the artifacts you usually want to avoid when creating in-camera VFX.
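The relationship between pixel pitch and camera distance can be illustrated with a common industry rule of thumb (an assumption here, not a Sony specification): the minimum comfortable viewing or shooting distance in metres is roughly the pixel pitch in millimetres.

```python
def min_distance_m(pixel_pitch_mm: float, safety_factor: float = 1.0) -> float:
    """Rule-of-thumb minimum distance at which individual pixels
    (and on-camera artifacts such as moire) stop being noticeable:
    roughly one metre per millimetre of pitch, times an optional
    safety factor for in-camera VFX work."""
    return pixel_pitch_mm * safety_factor

# A finer pitch lets the camera move closer to the LED wall:
print(min_distance_m(2.6))  # coarser wall: stay back roughly 2.6 m
print(min_distance_m(1.5))  # roughly 1.5 m
print(min_distance_m(1.2))  # roughly 1.2 m
```

This is why a lower pitch allows a smaller stage: the camera and subject can sit closer to the wall without the screen structure becoming visible in the shot.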

A flexible solution and audiovisual production ecosystem

Daniel Listh: It’s a very flexible solution, but it’s also a solution where you have to be careful with your investment because if you want to have a huge volume then our system is very expensive.

Of course, if you talk to us, we will find an ideal solution. We know we’re not the cheapest out there today, but we also know we have some of the best technology available.

When you go for something that is more expensive, you have less to worry about and generally have a smoother customer journey.

Thinking about customer feedback

Daniel Listh: In virtual production, especially in-camera VFX, people tend to see the benefits or ‘wow factor’ of walking into a volume. It’s easy to be amazed at how big and impressive it is.

For your workflow and for the director or actor, everyone needs to understand that, and it’s quite difficult to get everyone on the same page right away.


It [onboarding] might need to be repeated a few times before the penny drops for everyone, and that’s a really tough one, especially when you’re trying to get an investor on board.

How should my CFO understand this, and how can I get them to see the value it brings to production? You have to explain it in a way that everyone understands, even those who aren’t tech savvy, because it’s a very expensive path, especially if you get it wrong.

How can virtual production technology transform the film and broadcast industry?

Daniel Listh: You can use some of our smaller cameras and start working on virtual productions, be a lot more creative and get some unique opportunities that you couldn’t do before.

We have cameras that are very affordable and offer similar tools to those you would expect from a full flagship Venice camera.

People are very creative. It’s about getting the right tools into their hands so they can take on a project where they can show off their creativity and get recognition.

Technology is increasingly making it possible to stay flexible without spending a lot of money.

We also have an automation process. We’ve used AI to generate content, and it’s very interesting to see where that’s going. I don’t know exactly where it leads, but it seems to be in a very interesting place.

Once we identify the gaps, we can target and fix them. In some cases that means firmware or software engineering, a firmware update, so that people’s devices improve from version 1 to version 2.

Other things need new hardware, and we’re very excited to see what’s to come based on customer feedback and insights, and how we can improve with future hardware.

Limitations today are possibilities in the future.