The ultra-compact yet powerful Intel® NUC computers offer both value and convenience. The Intel NUC 8 is a powerful workstation that meets most of my production and creative needs as a multimedia immersive artist. It is more than capable of creating interesting content for HD video, virtual reality (VR), augmented reality (AR), and immersive, 180-degree video-based projection known as fulldome.
The Intel NUC 8’s 4 x 4-inch size makes it easy to use in my three locations, each of which is set up differently:
- At home, I connect the Intel NUC 8 to a monitor, which shows the view from a head-mounted display (HMD).
- At the Vortex Immersion Media* dome in Los Angeles, I connect the Intel NUC 8 to a server mapping the fulldome.
- At Will Michaelsen’s Los Angeles studio, CutMod, I connect the Intel NUC 8 to a monitor and a ceiling projector. A projection on the wall shows the view from an HMD.
Figure 1. The ceiling projector (left); me in the HTC Vive*, in front of the wall being projected on (right). The Intel® NUC 8 is out of shot. This is from a VR project, Freedom Festival, that I am creating in Unreal Engine* on the Intel® NUC 8.
The Intel NUC 8 is sold as a bare-bones device, to which you add RAM, storage, and an operating system. But it is also available as a kit with these installed. I use the Intel NUC 8i7HVK with the following specs:
- Intel® Core™ i7-8809G processor, 3.10 GHz
- Memory up to 32 GB
- 64-bit operating system, x64-based processor
- 256 GB solid state drive
- Radeon* RX Vega M GH graphics
Installation and Configuration
Installing software on the Intel® NUC 8
Installing the HTC Vive* on the Intel NUC 8
The HTC Vive* works very well with the Intel NUC 8, with easy-to-follow instructions on Steam*. It requires only two ports: HDMI and USB. At home I put up the two base stations required for tracking, installed the drivers, and set the room space (the area in which you walk and are tracked).
At the CutMod studio, the Intel NUC 8 and HTC Vive work beautifully. I reset the room space for this much larger area, but otherwise it is easy to set up the HTC Vive in new environments. Changing the HTC Vive from one computer to another is also straightforward.
Installing Windows* Mixed Reality on the Intel NUC 8
The Intel NUC 8 came ready to install Windows* Mixed Reality. I’ve yet to create content for it, but plenty is available for download that works well.
Like the HTC Vive, it works in 360-degree VR, although it is easier to lift the HMD to check where you are. Setup is straightforward: the boundaries of the space must be outlined using the headset's sensors, but, unlike with the HTC Vive, no base stations are required for tracking.
Figure 2. The Intel® NUC 8 desktop, with the Windows* Mixed Reality app highlighted (left); Intel® NUC 8 monitor during installation (right).
Figure 3. Will Michaelsen and Ryan Legge testing Windows* Mixed Reality at CutMod (left to right).
VR Projects and the Intel® NUC 8
The Intel NUC 8 is ideal for VR games, such as those downloadable from Steam. It is also suited to creating VR and mixed reality projects.
Using the Intel NUC 8 for a VR Project in Unreal Engine for the HTC Vive
The Unreal Engine provides templates that make it easy to set up a VR project. The steps are outlined below.
This demo assumes you have a basic understanding of Unreal Engine.
Launch the newest version of Unreal Engine. I use version 4.20.0.
Figure 4. Starting page for Unreal Engine* online learning.
The Projects browser window will appear. Ignore any projects you already have that appear in this window and select New Project.
Figure 5. Select New Project to get started.
Select the Blueprint tab, then select the Virtual Reality icon. Note: there is no Virtual Reality icon under the C++ tab.
Select Desktop/Console, because this project is for the HTC Vive; otherwise, select Mobile/Tablet. On the same screen, select Maximum Quality and With Starter Content. (You can select No Content, but my steps use the supplied starter content.)
Select Create Project in the bottom-right corner.
Figure 6. Select Virtual Reality project and Desktop/Console settings.
When Unreal Engine reloads, your newly created project will open with the template screen. Select the Virtual Reality BP folder, and click Maps/Motion Controller Map.
Figure 7. VR template screen.
Your VR project is set up for building. The VR Pawn—your VR camera—is selected.
Figure 8. VR project begins to take shape.
From the Starter Content/Content/Props folder, drag the SM chair onto the set. You can add and import other assets from the starter content folders or from 3D packages such as Maya or Blender.
Figure 9. Dragging the SM chair onto the set.
Select Play > VR Preview to view the project in the HTC Vive and set multiplayer options.
Figure 10. Setting multiplayer options.
Wearing the HTC Vive, view the image on the preview screen.
Figure 11. Preview screen, showing the SM Chair asset.
Complex projects can be created in Unreal Engine on the Intel NUC 8. Below are rough starts of several of mine. I created assets in Maya on the Intel NUC 8 and imported them into Unreal Engine as FBX* files.
The first project, Freedom Festival, is VR via the Intel NUC 8.
Figure 12. Rough start of the VR project Freedom Festival, projected on the studio wall. (The center image is on a monitor.) The project depicts the grounds of a music festival. Music plays when an animated creature or abstract form is released from a cage.
The robot animation in Robots on Platforms utilizes motion capture by Noitom. I ran it through Autodesk MotionBuilder* and Maya, applying it to robot figures. I imported the result from Maya, as an FBX file, into Unreal Engine.
I still need to optimize it to improve the frame rate when I take it into VR on the Intel NUC 8, but I expect the frame-rate problems to be resolved by the time the project is packaged as a VR app.
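For context, a target frame rate translates directly into a per-frame rendering budget. This quick back-of-the-envelope sketch (plain Python, just arithmetic, not part of any Unreal Engine API) shows how much tighter the budget gets in VR:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

# A 60 FPS monitor allows roughly 16.7 ms per frame;
# a 90 FPS VR headset allows only about 11.1 ms,
# which is why optimization matters so much more in VR.
print(round(frame_budget_ms(60), 1))  # 16.7
print(round(frame_budget_ms(90), 1))  # 11.1
```

Every millisecond saved in optimization counts double in VR, since the whole scene must also be rendered once per eye.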
Figure 13. Rough start of the Unreal Engine project Robots on Platforms, shown on a monitor to which the Intel® NUC 8 is connected.
Using the Intel NUC 8 to Start a VR Project in TouchDesigner* for the HTC Vive
I use the visual development platform TouchDesigner in live performances and to render videos, including VR. The TouchDesigner website describes the program this way:
“TouchDesigner is a visual development platform that equips you with the tools you need to create stunning real time projects and rich user experiences. Whether you're creating interactive media systems, architectural projections, live music visuals, or simply rapid-prototyping your latest creative impulse, TouchDesigner is the platform that can do it all.”
A basic VR file, created by Jarrett Smith, product architect at Derivative, can be used as a starting point for your own project. Smith says:
“It’s designed to make it easy. It is set up so you don’t need to know any programming to make something interactive. It’s also set up for easily sharing assets, and supports drag and drop of assets (components) to and from your desktop. Some of the optimizations we put into the main TouchDesigner engine over the past year greatly benefit the VR system. It’s now possible to design systems that run at a solid 90 Hz.”
TouchDesigner includes a category of specialized VR-focused components that give you a head start on building interactive VR experiences for either the Vive* or Oculus* headsets. A special version of TouchDesigner was tested for this article. This version can be downloaded here.
The TDVR system provides preset components for both OpenVR/Vive and Oculus hardware headsets. According to Derivative, a wide range of optimizations have tuned the TouchDesigner engine for VR, making it possible to design systems that run at high frame rates, such as 90 FPS.
The TDVR system also includes an "in VR" workstation that lets you edit the VR experience using the TouchDesigner user interface directly in the VR world. To get you started, the system includes samples of interactivity; for example, a teapot changes color depending on your gaze in the HMD. If you don't want a teapot in your project, it is easily deleted or modified.
Make sure you have the latest version of TouchDesigner, and I recommend acquiring basic knowledge of the program before attempting a VR project. Those familiar with TouchDesigner know it is a node-based program with five categories of nodes (operators): TOP, COMP, MAT, DAT, and SOP. I will sometimes refer to these nodes as components.
To demonstrate how easy it is to take this VR demo and build your own project from it, I will add a piece of geometry (a geo COMP) and turn it into particles.
Figure 14. TouchDesigner* opening screen.
Figure 14 shows the opening screen when you open the VR file. I placed a red dot next to a geo COMP titled World, which holds all the components we will see in the virtual world. In the parameters of the node titled vrHMD, check that Render Settings/Device VR, Vive/OpenVR is selected. Next, double left-click the World component to enter it.
Figure 15. Creating a geo COMP.
In the World component, create a geo COMP. In its parameters, under Xform, translate it up two units on the Y axis. Note that, because you already have a geo COMP, this one will automatically be named geo1.
Figure 16. Translating the geo COMP.
To enter the geo1 COMP you created, double left-click it. Inside, you will see a torus1 SOP. Create an out SOP (because each new node of the same type is numbered one higher, it will be out1) and wire it to the torus1 SOP. Go back up one level to the geo1 COMP by typing "u" or zooming out.
Figure 17. Creating a large torus in the sky above the teapot.
In the parameters of the geo1 COMP, click Add Tag in the top-right corner. In the field that appears, type RenderGeo. A large torus will appear in the sky above the teapot. Congratulations: You have added geometry to your VR project! Put on your HTC Vive to view.
Figure 18. Creating a particle SOP.
Double left-click the geo1 COMP to go inside it. Create a particle SOP between the torus1 SOP and the out SOP. In the particle1 SOP parameters, under State/Particle Type, select Render as Point Sprites.
Figure 19. Creating a point sprite MAT.
Create a point sprite MAT and a material SOP. In the parameters of the point sprite1 MAT, set the constant point scale to six. Drag the point sprite1 MAT into the parameters material slot of the material1 SOP.
Figure 20. Creating a ramp TOP.
Create a ramp TOP. In the parameters, choose type circular. Choose the colors you want in the ramp graph. Drag the ramp TOP into the point sprite1 MAT Color Map parameter.
Figure 21. Adjusting the torus.
If you do not want the torus itself to appear in the VR scene, simply turn off the render button circle at the bottom of the torus SOP. Adjust the life span and birth of the particles in the parameters of the particle SOP. In the same parameters, under Forces, set the movement of the particles.
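TouchDesigner's particle SOP handles birth and life span internally, but the way the two parameters interact is easy to see in a toy model. This plain-Python sketch (the function name is my own, not TouchDesigner's API) shows that the steady-state particle count is simply birth rate times life span:

```python
def simulate_particles(frames, birth_per_frame, life_span_frames):
    """Toy particle system: each frame spawns `birth_per_frame`
    particles, and each particle lives `life_span_frames` frames.
    Returns how many particles are alive after `frames` frames."""
    ages = []
    for _ in range(frames):
        # Age every particle and drop those past their life span.
        ages = [a + 1 for a in ages if a + 1 < life_span_frames]
        # Birth this frame's new particles at age zero.
        ages.extend([0] * birth_per_frame)
    return len(ages)

# Once births and deaths balance, the population settles at
# birth_per_frame * life_span_frames particles.
print(simulate_particles(100, 5, 10))  # 50
```

This is why raising either parameter in the particle SOP makes the scene denser, and why both matter for performance.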
Figure 22. Preview screen.
There are many tools for creating interactivity in VR in the palette. The PDF that comes with the download details these tools.
Figure 23. Two views of the VR demo template from TouchDesigner* with the example particles I added: on the left, the monitor; on the right, projected on a wall.
Figure 24. More complex animations on TouchDesigner* running on the Intel® NUC 8.
I took old TouchDesigner animation files (not VR projects) onto the Intel NUC 8 and found they easily performed at 60 frames per second (see Figure 24).
Will Michaelsen and I hooked up Microsoft Kinect* so he could try his interactive TouchDesigner projects on the Intel NUC 8. They performed very well.
Figure 25. Will Michaelsen in front of two interactive projects, projected on the studio wall. He used the Microsoft Kinect*, but the Intel® RealSense™ camera would also facilitate the interactivity.
Using the Intel NUC 8 In a Fulldome
Fulldomes are immersive, 180-degree, video-based projection environments, used in settings from planetariums to temporary installations at conferences, performances, and festivals. Amid increasing interest in immersive entertainment, the number of permanent fulldomes worldwide has grown to more than 1,300. Unlike the isolation of a VR headset, they provide shared experiences for live audiences.
Different projection systems are used. One is a single, central projector with a fisheye lens. Its drawbacks are limited resolution and the projector obscuring the center of the dome.
More popular is a system using two or more projectors. Blending the edges of adjacent images covers the dome seamlessly. However, unless the projectors are precisely aligned, with matching brightness and color, the overlap between images can be noticeable.
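The overlap problem is usually solved in software with an edge-blend ramp: each projector's brightness is faded across the shared region so the combined output stays at full brightness. Here is a minimal sketch of one common approach, a cosine falloff (the function name and parameters are illustrative, not any particular mapping server's API):

```python
import math

def blend_weight(x, overlap_start, overlap_end):
    """Brightness weight for the left projector at horizontal
    position x. Ramps smoothly from 1 to 0 across the overlap;
    the right projector uses 1 - blend_weight(x, ...), so the
    two projectors always sum to full brightness."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return 0.5 * (1.0 + math.cos(math.pi * t))

# At the middle of the overlap, each projector contributes half.
print(round(blend_weight(0.9, 0.8, 1.0), 2))  # 0.5
```

The smooth cosine curve matters: a straight linear fade tends to leave a visible band at each end of the overlap, where the brightness gradient changes abruptly.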
Domes with multiple projectors use a variety of mapping systems to split the videos or images. When I create a video, it is in dome master form: a fisheye circle specifically for dome projection.
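A dome master maps every visible direction on the dome into that fisheye circle. This sketch shows one common convention, an equidistant fisheye, where the zenith lands at the center of the frame and the dome's rim lands on the circle's edge (the function name and the 4096-pixel default are my own illustrative assumptions, not a standard):

```python
import math

def dome_master_xy(azimuth_deg, elevation_deg, size=4096):
    """Map a view direction (azimuth and elevation, in degrees) to
    pixel coordinates in a square dome-master frame, using an
    equidistant fisheye: radius is proportional to the angle
    measured down from the zenith."""
    radius = (1.0 - elevation_deg / 90.0) * (size / 2.0)
    theta = math.radians(azimuth_deg)
    cx = cy = size / 2.0
    return (cx + radius * math.cos(theta), cy + radius * math.sin(theta))

# The zenith (elevation 90 degrees) projects to the center of the frame.
print(dome_master_xy(0, 90))  # (2048.0, 2048.0)
```

The dome's mapping server then slices this circle into the separate, warped images each projector needs.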
Figure 26. Dome projection.
Using the Intel NUC 8 for Projection in a Dome
The Intel NUC 8 can convey real-time output from programs such as TouchDesigner, Resolume, or Modulate* to the server that maps the dome. At the Vortex Immersion dome I use Resolume Arena 6.
The Vortex Immersion dome is mapped by three projectors that receive input from one server. The Intel NUC 8’s multiple ports are very handy: I connect it to the server via a capture card, using the mini display port with an active adapter. A USB 3 port amps up the power to the adapter. I connect one of the Intel NUC 8 USB ports to a keyboard, another to a mouse, and a third to a hard drive. I use an HDMI port to connect to a monitor, and one of the two ethernet ports to connect to the internet. I use the audio jack for the audio out.
Figure 27. Intel® NUC 8 rear ports.
Figure 28. Intel® NUC 8 front ports.
Figure 29. The Intel® NUC 8 (back monitor) works with the dome server (front monitor) to project video on the Vortex Immersion dome.
Figure 30. The Intel® NUC 8, running Resolume Arena 6 and connected to the server, projects video onto the dome.
Figure 31. Another example of the Intel® NUC 8 working with Resolume Arena 6 to project video onto the Vortex Immersion dome.
Using the Intel NUC 8 in Innovative Ways
Russ Haines, CTO/CEO of Stratus Systems, Inc. and a founder of Eye Vapor, suggests that instead of one server distributing video to multiple projectors, quality and resolution would be improved by connecting one Intel NUC 8 to each projector.
The Intel NUC 8, he says,
“lets us treat complex, high resolution maps with a commodity-level solution, instead of a very expensive, multi-server solution.”
Jeff Smith—Haines’s former partner at Eye Vapor—says:
“The new NUC, with a dedicated graphics processing unit (GPU), should enable all kinds of abilities [where] the original NUCs were limited.”
Will Michaelsen, founder and operator of CutMod, an experiential design studio based in Los Angeles, says:
“I was very pleased to see the NUC 8 can handle my Joy Displacement interactive art piece. My NUC6i7KYK struggled to hold a decent frame rate, but the discrete GPU of the NUC 8 enables smooth performance. I had resorted to a bulkier mini PC from a competing brand, but now consider the NUC a viable solution for installations. It fits nicely behind a flat screen and is less expensive than the competing PC. Furthermore, the USB 3 implementation is more robust on the NUC 8, allowing me to use Kinect V2. I’m excited to try NUC 8 with the new Intel® RealSense™ cameras — real-time point cloud video ideas abound!”
For years I wanted a powerful yet small computer with which I could easily travel, work, create, and display cutting edge graphics and immersive media. The Intel NUC 8 makes this possible. With only 16 GB of RAM—not the full 32 GB it can support—the performance is great.
The possibilities for the Intel NUC 8 have yet to be fully explored. I would like to configure it with an Intel RealSense camera, especially for fulldome and multiscreen projection mapping.
The Intel NUC 8’s compact size makes it ideal for commercial and art installations. This makes it useful for my work, but I can also see possibilities in robotics, especially teamed with the Intel RealSense camera.
Special thanks to everyone at Vortex Immersion, but especially to Kate McCallum, Ed Lantz, and Kalin Holmes at Vortex Immersion Media.
About the Author
Audri Phillips is a Los Angeles-based visual designer/art director, 3D animator, content creator, and immersive media specialist. Her experience includes over 25 years in the visual effects and entertainment industry, at studios such as Sony*, Rhythm & Hues*, Digital Domain*, Disney*, and DreamWorks* Animation. Audri is a member of the Intel Innovators group, writes online articles for Intel, is a member of Autodesk Developers Network, is a resident artist at Vortex Immersion Media, and is a consultant and researcher for the start-up patent technology company Stratus Systems. She is the creator, cofounder, and director of Robot Prayers, an immersive transmedia project that explores AI and our identities in a world where man and machine are melding.
Audri has a BFA from Carnegie Mellon University and has been an adjunct professor at universities in Los Angeles. She taught a fulldome masterclass in Shanghai, China, and spoke at and had her work shown at the VRTO* VR conference in Toronto, Canada as well as Siggraph*.