Ubisoft’s Rainbow Six Siege* is the latest release in a long line of hit games based on the late Tom Clancy’s 1998 novel. No ordinary game franchise, it has sold millions of copies across multiple platforms and continues to rank among the elite worldwide titles. During this round of development, Ubisoft set out to integrate Intel® RealSense™ technology in clever ways to bring new value to a hungry public, and with new advances in socialization and in-game features, they delivered.
The Rainbow Six franchise defined the tactical shooter genre, forcing players to form up in squads, where the focus is on positioning, timing, teamwork, and tactics rather than on sheer firepower from imaginary weapons. Communication is crucial, and the new installment puts a heavy emphasis on cooperation between players. Continual practice and adjustment of tactics can put teams over the top, so giving teams a way to watch replays of their training can allow them to refine their skills and fix mistakes.
Gaming Broadcast for Glory
As gaming evolves, it’s no longer enough to simply post a high score on a bulletin board somewhere—players now regularly upload personal videos that replay their victories and share the glory. The term “gaming broadcast” encapsulates that trend, and it’s a hot segment that took off when Twitch* started livestreaming popular tournaments.
Professionals can promote their upcoming tournaments, livecast their training sessions, provide real-time tips, and more. With approximately 100 million global views in a single month, Twitch offers a steady source of viewers and has an easy-to-use and well-developed monetization system. Twitch dominates the eSports industry, and as growth continues, viewers demand higher quality content. It was only natural that top developers such as Ubisoft would add value to that capability.
Figure 1: Rainbow Six Siege* is the latest title in a long-running and hugely successful franchise.
A growing segment of PC users now consumes more hours of live-streamed content than traditional TV, so breaking through and personalizing a stream is key. The people who generate this content like to overlay their physical presence as well: viewers watching gameplay online want to see more than just the game; they also want to see how the gamer reacts.
Razer Stargazer Webcam Pushing the Boundaries
The Razer Stargazer webcam is built on Intel® RealSense™ technology, which gives it the ability to easily cut away the background behind a talking head and incorporate just the person, without distractions. The most successful streamers are the ones with a professional look and feel, where the video background is removed so you see only the headshot. Without an Intel® RealSense™ camera, a streamer would need an expensive green-screen setup; removing that barrier to entry brings costs way down.
Within the Stargazer webcam, three cameras act as one: a 1080p high-definition camera, an infrared camera, and an infrared laser projector. They combine to “see” like the human eye, sensing depth and tracking human motion.
The new Stargazer webcam is the short-range camera to have. It improves on its predecessor in various respects, including:
- 60-percent improved range
- 5x reduction in standby power
- 2x improvement in gesture speed
- 8x improvement in depth and red-green-blue (RGB) latency
Developing for RealSense
RealSense is a combination of hardware and software: the camera, the driver, and the SDK. All three components are required to develop for RealSense. End users need only the camera and the driver, and can install just the RealSense runtime components instead of the full SDK. The SDK comprises many components designed for different applications; background removal is one of them, and in the SDK this component is known as 3D segmentation or user segmentation.
After the RealSense SDK is installed, an application can use the 3D segmentation component by invoking the PXCSenseManager::Enable3DSeg() function to enable it. The code snippet in Figure 2 illustrates this.
// Create a SenseManager instance
PXCSenseManager* pSenseManager = PXCSenseManager::CreateInstance();

// Enable the user segmentation module
pxcStatus result = pSenseManager->Enable3DSeg();

// Get a segmentation handler
PXC3DSeg* pSeg = pSenseManager->Query3DSeg();

// Initialize the pipeline
result = pSenseManager->Init();
Figure 2: Enable the 3D segmentation component by invoking PXCSenseManager::Enable3DSeg().
Once the 3D segmentation component is enabled, the RealSense runtime processes each image retrieved from the camera and removes its background. The RGBA data in the background-removed image can be accessed as illustrated in Figure 3. The alpha channel distinguishes pixels corresponding to the user from background pixels: background pixels have an alpha value of zero, whereas pixels corresponding to the user have an alpha value greater than zero. The RGBA data can be copied to a texture and subsequently overlaid onto an area in the game.
// Get the segmented image from the PXC3DSeg video module
result = pSenseManager->AcquireFrame(true);
PXCImage* segmented_image = pSeg->AcquireSegmentedImage();

// Acquire access to the RGBA data in the segmented image
PXCImage::ImageData segmented_image_data;
segmented_image->AcquireAccess(PXCImage::ACCESS_READ_WRITE,
                               PXCImage::PIXEL_FORMAT_RGB32,
                               &segmented_image_data);
...
// Release access, image and frame when done
segmented_image->ReleaseAccess(&segmented_image_data);
segmented_image->Release();
pSenseManager->ReleaseFrame();
Figure 3: Copy the 3D segmented frame to a texture to be rendered.
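To make the alpha-channel test concrete, here is a minimal standalone sketch of the compositing step. It is independent of the RealSense SDK; the packed 32-bit pixel layout (alpha in the high byte) and the helper name are assumptions for illustration only, since real applications would read the planes from PXCImage::ImageData and typically composite on the GPU.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Overlay a segmented RGBA frame onto a game frame of the same dimensions.
// A segmented pixel with zero alpha is background and is left untouched;
// alpha > 0 marks a user pixel, which is copied over the game frame.
// (Hypothetical helper; the alpha-in-high-byte layout is an assumption.)
void OverlaySegmentedFrame(const std::vector<uint32_t>& segmented,
                           std::vector<uint32_t>& game_frame)
{
    const size_t n = segmented.size() < game_frame.size()
                         ? segmented.size() : game_frame.size();
    for (size_t i = 0; i < n; ++i) {
        const uint8_t alpha = static_cast<uint8_t>((segmented[i] >> 24) & 0xFF);
        if (alpha > 0) {
            game_frame[i] = segmented[i]; // user pixel: show the streamer
        }
        // alpha == 0: background was removed; keep the game pixel
    }
}
```

The same per-pixel decision is what a shader would perform when the segmented frame is bound as a texture and blended over the game’s render target.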
Running 3D segmentation consumes CPU cycles, and it is rarely run in isolation; typically the system is also running a game. The video stream from the camera is processed by the 3D segmentation component and overlaid onto the game, and the composed frame is encoded before being streamed to Twitch. Sufficient computational headroom must be available on the system to yield a good experience; most scenarios will require a high-performance system with a quad-core CPU and a discrete GPU.
The 3D segmentation component has an adjustable performance setting that offers a way to trade off quality for performance. The application should let the user adjust it, just as a game offers adjustable graphics effects depending on the hardware it is running on. The setting is controlled with PXC3DSeg::SetFrameSkipInterval(pxcI32 nframes), which sets the number of incoming frames on which to skip processing.
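As a sketch of what a frame-skip interval means in practice (the function name and loop structure below are illustrative assumptions, not SDK code): with a skip interval of N, only every (N+1)-th frame goes through segmentation, and the most recent segmented result is reused for the skipped frames, freeing CPU cycles for the game and the encoder.

```cpp
#include <cstdint>

// Decide whether frame 'frame_index' should be run through 3D segmentation,
// given 'skip' frames to skip between processed frames.
// skip == 0 processes every frame; skip == 2 processes frames 0, 3, 6, ...
// (Hypothetical helper illustrating the skip-interval policy.)
bool ShouldSegmentFrame(uint64_t frame_index, int skip)
{
    return frame_index % static_cast<uint64_t>(skip + 1) == 0;
}
```

A streaming application might raise the interval when it detects dropped frames and lower it again when headroom returns, mirroring how games scale graphics quality to the hardware.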
The RealSense SDK and documentation are available on the Intel RealSense website. The SDK contains samples to illustrate different applications of the technology. RealSense requires a 4th generation Intel Core processor or later, Windows 8.1 64-bit or later, and a USB 3 port for the RealSense camera.
The Intel® RealSense™ Technology Evolution Continues
In Rainbow Six Siege, the drone camera used in the Siege scenario is upgraded to calculate and show distances and measurements inside rooms. Because Intel RealSense technology can measure distance with a 3D depth camera in the real world, Ubisoft applied that idea virtually inside the game. If an Intel RealSense camera is plugged into the computer you’re playing on, the game enables this extra feature within the playing area: it expands the drone surveillance camera’s functionality so that when you send in the drone, it reports the distances to characters and targets. That is an exclusive feature for Intel RealSense technology customers.
Figure 4: When Rainbow Six Siege* detects a Stargazer* webcam, the game’s drone displays room measurements. Adding your image into the game increases socialization and competition.
Virtual reality will be a strong growth area in 2016 and 2017, and Intel RealSense technology can help drive that potential. With the arrival of several head-mounted displays in the marketplace, the future is quickly taking shape. VR is all about the immersive experience, about the reality, and games enabled with RealSense can bring scanned characters and objects into virtual reality.