Introduction and Terminology

This chapter gives a brief introduction to Viz Virtual Studio (VS). It is intended for readers who are not familiar with Virtual Studio and will introduce a few key concepts.

The term virtual studio generally refers to tools that simulate a physical television or movie studio. Viz Virtual Studio is a set of software and technologies for creating virtual studios.

A virtual studio is a television studio that seamlessly combines people or other real objects with computer-generated environments and objects in real time. A key point of a virtual studio is that the real camera can move in 3D space while the image of the virtual camera is rendered in real time from the same perspective. The virtual scene must adapt at every moment to the camera settings, such as zoom, pan, angle and travel. This is what differentiates a virtual studio from the traditional technique of chroma keying.
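As a minimal illustration of what "rendering from the same perspective" means, the Python sketch below turns tracked camera parameters (position, pan, tilt, and a field of view derived from the lens zoom) into the view and projection matrices of a virtual camera. This is not Vizrt code; all function names, units and the example values are hypothetical.

```python
import numpy as np

def rotation_pan_tilt(pan_deg: float, tilt_deg: float) -> np.ndarray:
    """3x3 rotation built from pan (yaw around Y) and tilt (pitch around X)."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    yaw = np.array([[ np.cos(p), 0, np.sin(p)],
                    [ 0,         1, 0        ],
                    [-np.sin(p), 0, np.cos(p)]])
    pitch = np.array([[1, 0,          0         ],
                      [0, np.cos(t), -np.sin(t)],
                      [0, np.sin(t),  np.cos(t)]])
    return yaw @ pitch

def view_matrix(position, pan_deg, tilt_deg) -> np.ndarray:
    """4x4 view matrix: inverse rotation and translation of the tracked camera."""
    R = rotation_pan_tilt(pan_deg, tilt_deg)
    V = np.eye(4)
    V[:3, :3] = R.T
    V[:3, 3] = -R.T @ np.asarray(position, dtype=float)
    return V

def projection_matrix(fov_y_deg, aspect, near=0.1, far=1000.0) -> np.ndarray:
    """Standard perspective projection from a vertical field of view."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    return np.array([
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ])

# Example tracked values for one video frame (hypothetical units):
cam_pos = [1.5, 1.7, 4.0]     # camera position in studio space, metres
pan, tilt = 12.0, -3.5        # degrees
fov_from_zoom = 42.0          # field of view derived from the lens zoom data

V = view_matrix(cam_pos, pan, tilt)
P = projection_matrix(fov_from_zoom, aspect=16 / 9)
# A renderer would draw the virtual scene with P @ V, updated every frame
# as new tracking data arrives, so the virtual and real perspectives match.
```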

A major difference between a virtual studio and the bluescreen special effects used in movies is that the computer graphics are rendered in real time, removing the need for post-production work and allowing the technique to be used in live television broadcasts. The virtual studio technique also differs from Augmented Reality (AR), which is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.

[Figures: Viz Virtual Studio overview illustrations]
Key Technologies and Procedures

The background (typically green) is masked out in real time and replaced with a virtual, software-created background. The physical objects in front (persons, desks, etc.) are merged with the virtual background, creating the illusion of physical presence.
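A minimal, non-Vizrt sketch of this masking and merging step, written with NumPy: pixels close to the key colour are treated as background and replaced by the rendered virtual image, while all other pixels keep the camera image. The function name, key colour and threshold are illustrative only; production keyers run per pixel on dedicated hardware or the GPU with far more sophisticated edge and spill handling.

```python
import numpy as np

def chroma_key_composite(camera_rgb: np.ndarray,
                         virtual_rgb: np.ndarray,
                         key_color=(0, 177, 64),
                         threshold=80.0) -> np.ndarray:
    """Replace pixels near the key colour with the virtual background.

    camera_rgb, virtual_rgb: float arrays of shape (H, W, 3) in [0, 255].
    """
    # Distance of every pixel from the key colour.
    distance = np.linalg.norm(camera_rgb - np.array(key_color, dtype=float), axis=-1)
    # alpha = 1 keeps the real foreground, alpha = 0 shows the virtual background.
    alpha = np.clip(distance / threshold, 0.0, 1.0)[..., None]
    return alpha * camera_rgb + (1.0 - alpha) * virtual_rgb

# Composite one test frame from matching camera and renderer images:
h, w = 1080, 1920
camera_frame = np.full((h, w, 3), (0, 177, 64), dtype=float)   # all-green test frame
virtual_frame = np.zeros((h, w, 3), dtype=float)               # rendered background
output = chroma_key_composite(camera_frame, virtual_frame)
```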

The background is created and prepared in advance, typically by designing it in Viz Artist or by importing 3D scenes into Viz Artist and saving them as scenes in the Graphic Hub.

The Viz Engine, a real-time compositing engine, is responsible for combining the physical and virtual objects and for displaying the results. Displaying the results can mean sending the signal to a display, to an IP stream, or downstream to a video mixer.

An important part of the virtual studio is camera tracking, which uses optical or mechanical measurements to create a live stream of data describing the perspective of the camera. Accurate camera and sensor tracking with as little delay as possible is the key challenge to solve in a virtual studio.
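To make the idea of a "live stream of data describing the perspective of the camera" concrete, here is a hedged Python sketch of what one tracking sample typically contains and how it might be forwarded over the network. The field names, JSON-over-UDP transport and port are hypothetical and are not the actual Vizrt or Tracking Hub protocol; the timestamp is included because aligning tracking data with video frames (delay compensation) is part of the challenge described above.

```python
import json
import socket
import time
from dataclasses import dataclass, asdict

@dataclass
class TrackingSample:
    """One measurement describing the camera's perspective (hypothetical format)."""
    timestamp: float   # capture time, used to align tracking data with video frames
    x: float           # camera position in studio space (metres)
    y: float
    z: float
    pan: float         # orientation in degrees
    tilt: float
    roll: float
    zoom: float        # raw lens encoder values; mapped to field of view later
    focus: float

def send_sample(sock: socket.socket, sample: TrackingSample,
                address=("127.0.0.1", 5000)) -> None:
    """Forward one sample as JSON over UDP (stand-in for a real tracking protocol)."""
    sock.sendto(json.dumps(asdict(sample)).encode("utf-8"), address)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sample = TrackingSample(time.time(), 1.5, 1.7, 4.0, 12.0, -3.5, 0.0, 0.42, 0.63)
send_sample(sock, sample)
```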

Software Components
  • Tracking Hub: a service process (runs as a console program without a GUI) responsible for receiving and forwarding measurements from cameras, trackers and other sensors. Must be kept running at all times.

  • Studio Manager: a GUI for setting up, defining and controlling a Tracking Hub and the Virtual Studio environment.

Viz Virtual Studio uses the Viz Engine to render the output signal, whereas a client application such as Viz Trio, Viz Mozart or Viz Opus is normally used to control the overall broadcast.

Documentary

If you are reading this on an Internet-connected device, you can view the Viz Virtual Studio documentary (see links below). See also Words and Terminology in the appendix chapter for common words and their meanings.

The Secrets of Virtual Set Production documentary is divided into five parts:

  1. What are virtual sets and why should broadcasters use them?

  2. Virtual sets versus augmented reality

  3. Essential Vizrt tools

  4. Virtual set tools

  5. Design your story