Are you among the many eLearning developers yearning to create immersive video for eLearning? Is a full-fledged virtual reality studio, where you could create digitally generated immersive environments, simply not in next year’s budget? Don’t give up!

A new kid on the video block, 360-degree video, offers developers an affordable alternative with a low entry threshold in terms of technical skills and knowledge. Paired with some inexpensive VR headsets, a 360-degree video can form the foundation of eLearning that will transport learners into compelling, immersive stories and environments.

A tight budget is no problem: Equipment to shoot, stitch together, and produce 360-degree video can be assembled for around $1,000! While the effect won’t be as dramatic as an elaborate, interactive virtual environment with a full cast of avatars, the impact on learners can still be impressive.

Packing the kit bag

Shooting 360-degree video requires at least two cameras or lenses. A high-end 360-degree setup might include six cameras. Equipment ranges from a basic camera setup that will produce good-quality video or stills for a few hundred dollars to professional-level gear that runs $5,000 or more.
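Why two cameras at minimum, and why as many as six? It comes down to how much of the scene each lens can see and how much overlap the stitching software needs between neighboring views. A rough back-of-the-envelope sketch (the lens angles and overlap figures below are illustrative assumptions, not specs for any particular rig):

```python
import math

def min_cameras(lens_fov_deg, overlap_deg):
    """Minimum number of cameras to cover a full 360-degree circle,
    given each lens's horizontal field of view and the overlap the
    stitching software needs between adjacent views."""
    usable = lens_fov_deg - overlap_deg  # unique coverage per camera
    return math.ceil(360 / usable)

# Two back-to-back wide fisheye lenses (assumed ~195 degrees each,
# a common consumer design) can cover the whole circle:
print(min_cameras(195, 15))   # -> 2

# Narrower ~90-degree lenses push you toward a six-camera rig:
print(min_cameras(90, 30))    # -> 6
```

The same trade-off explains the price range: two very wide lenses keep a rig cheap and compact, while multi-camera rigs buy higher resolution per view at the cost of money and stitching complexity.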

Ben Kreimer is a journalism technologist and drone journalism expert. In a recent webinar on 360-degree video journalism, Kreimer described assembling a basic two-lens kit based on a Samsung Gear 360 camera and Samsung Galaxy S7 phone for $1,060—including a tripod and memory cards. New and updated equipment appears on the market constantly, though, so pricing and other gear-related information is subject to change.

With anywhere from two to six cameras, a developer will end up with multiple video streams, which must be stitched together into the “surround” view because no single camera actually shoots in 360 degrees. That could change soon: Publicity for the Orah 4i camera, which might start shipping in early 2017, promises that the camera will shoot “live 360,” with stitching occurring in the camera in real time. No currently available equipment does that, according to Kreimer.

Motion is a key element of many 360-degree videos, such as video tours. These require that the videographer move the cameras through the scene. The videographer can hand-carry the cameras, but it’s common to use a vehicle—a cart or dolly. For aerial shots, you can rig a drone to carry the cameras. The cameras are generally controlled via a smartphone, laptop, or tablet. There is not (yet) a turnkey solution on the market for aerial 360-degree video, like a drone with built-in cameras, according to Kreimer, but some videographers have come up with solutions for mounting the camera rigs on drones; some of these mounts can even be “printed” with 3-D printers.

Stitching the pieces together

So, the video streams have been recorded; next up is editing and production—stitching the video streams together and adding logos and a soundtrack.

Most smartphone or computer apps that control the shooting can also manage the stitching. When the cameras are mounted on a moving dolly or hand-carried, the resulting video usually needs to be “stabilized” as part of the editing and production; otherwise, viewers might become motion-sick from wobbly video. Often, during editing, video producers will superimpose a still image, such as a logo, on the video. Editing software generally can adjust the edges of these added images to match the curve of the 360-degree video.
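The reason a flat logo has to be warped is that stitched 360-degree video is typically stored as an equirectangular frame, in which horizontal pixel position corresponds to yaw (left-right viewing angle) and vertical position to pitch (up-down). A minimal sketch of that coordinate mapping; the frame size and angles here are illustrative assumptions:

```python
def equirect_pixel(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw: -180..180, pitch: -90..90 degrees)
    to pixel coordinates in an equirectangular frame.
    Yaw 0 / pitch 0 lands at the center of the frame."""
    x = (yaw_deg / 360.0 + 0.5) * width
    y = (0.5 - pitch_deg / 180.0) * height
    return x, y

# Center of an assumed 4096x2048 frame = straight ahead, level:
print(equirect_pixel(0, 0, 4096, 2048))  # -> (2048.0, 1024.0)

# A logo anchored 45 degrees right and 30 degrees up lands here:
print(equirect_pixel(45, 30, 4096, 2048))
```

Because equal pixel steps cover more visual angle near the poles than at the equator, a rectangular logo placed high or low in the frame must be stretched along this mapping, which is the adjustment the editing software makes automatically.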

It’s also possible to add a soundtrack during editing. Developers who want to include audio from the scene—captured while recording the video—will need an external ambisonic mic. Ambisonics is a technique for producing full-sphere surround sound; an ambisonic mic will capture all the sound in the surrounding environment and, during playback, will “position” that sound correctly relative to the video. Typical stereo sound is not matched with—does not move with—the 360-degree video as people turn and move within the video. A stereo 360-degree mic will do a better job than a simple directional mic, but the sound will not be as well integrated with the video as ambisonic sound would be. A simpler option is to add a voice-over or other soundtrack to the completed video, though here too the sound will not move with the learner’s view the way ambisonic sound would.
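The “positioning” that ambisonics provides comes down to encoding each sound’s direction of arrival into a small set of channels that a player can later rotate to follow the viewer’s head. A rough sketch of first-order B-format encoding of a mono source (the direction values are illustrative; real pipelines also differ in channel ordering and normalization conventions):

```python
import math

def encode_b_format(sample, azimuth_deg, elevation_deg):
    """First-order ambisonic (B-format) encoding of one mono sample
    arriving from a given direction. W is the omnidirectional
    channel; X, Y, Z carry the directional information."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2)           # traditional -3 dB on W
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)
    return w, x, y, z

# A sound directly ahead (azimuth 0, elevation 0) puts all of its
# directional energy into the X channel:
w, x, y, z = encode_b_format(1.0, 0, 0)
print(round(x, 3), round(y, 3), round(z, 3))  # -> 1.0 0.0 0.0
```

Because direction lives in the X/Y/Z channels rather than in fixed left/right speakers, playback software can re-aim the sound field as the learner turns, which is exactly what plain stereo cannot do.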

Getting technical: Understanding parallax error

It is during the stitching process that eLearning developers working in 360 degrees encounter a technical challenge: parallax error. Parallax error is the difference in how an object appears when it is viewed from different lines of sight. To get an idea of how it looks, cover one eye, then the other, while looking at a stationary object that is right in front of you. It will appear to move. The closer an object is to the lens (or to your eyes), the greater the distortion. Parallax error can occur with 360-degree video when the videos from the different cameras—different visual perspectives—are stitched together. Compact camera rigs where the lenses are close together produce less parallax error than rigs with more distance between cameras or lenses.

Parallax error is responsible for a key difference between 360-degree and standard video: the inability to take close-up shots in 360. With a single camera, the videographer can zoom in to create shots from very close proximity. With 360-degree video, though, parallax error is generally very noticeable with objects that are closer than about five feet from the camera, according to Kreimer. This generally rules out close-up shots. In his webinar, Kreimer suggested that videographers test a camera setup by recording a person walking around it in concentric, narrowing circles. The resulting video will show the distance at which parallax errors begin to be noticeable.
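Why does the problem kick in at roughly five feet? The mismatch between two lenses can be estimated as an angular disparity of 2·atan(b / 2d), where b is the distance between the lenses (the baseline) and d is the distance to the object. A quick sketch using an assumed 6 cm baseline (an illustrative figure, not any particular camera’s spec):

```python
import math

def parallax_deg(baseline_m, distance_m):
    """Angular disparity (in degrees) between two lenses separated
    by baseline_m, both viewing an object distance_m away."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

baseline = 0.06  # assumed 6 cm between lens centers
for d in (0.5, 1.5, 5.0):  # 1.5 m is roughly five feet
    print(f"{d} m: {parallax_deg(baseline, d):.2f} degrees")
```

The disparity falls off quickly with distance, which is why distant objects stitch cleanly while anything close to the rig shows visible seams—and why compact rigs, with a smaller baseline, fare better.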

Videographers often exploit the parallax effect by “hiding” a dolly or tripod and microphones, if these are used. The stitching software looks for areas of overlap, and removes or corrects these areas by deleting pixels representing the mismatch of the different visual perspectives. Thus a mic or tripod can be “hidden” in the overlap area; it will essentially disappear during the stitching process. See this blog post by VideoStitch for a more detailed explanation with illustrations.

While not a fully immersive experience—viewers cannot interact with the environment in a 360-degree video as they can in a digitally generated virtual environment—360-degree video places the learner inside the story in a way that standard two-dimensional video or photography cannot accomplish. It’s a valuable addition to the eLearning toolkit for storytelling, simulations, virtual tours, and other types of visual experiences, as described in “Degrees of Immersion.” For more examples, see Matt Sparks’s “Metafocus” column in tomorrow’s Learning Solutions Magazine!