It’s Not a Dream: 360-Degree Video Puts Immersive Storytelling Within Reach

Are you among the many eLearning developers yearning to create immersive video for eLearning? Are you facing the prospect that a full-fledged virtual reality studio where you can create digitally generated immersive environments is not in next year’s budget? Don’t give up!

A new kid on the video block, 360-degree video, offers developers an affordable alternative with a low entry threshold in terms of technical skills and knowledge. Paired with some inexpensive VR headsets, a 360-degree video can form the foundation of eLearning that will transport learners into compelling, immersive stories and environments.

A tight budget is no problem: Equipment to shoot, stitch together, and produce 360-degree video can be assembled for around $1,000! While the effect won’t be as dramatic as an elaborate, interactive virtual environment with a full cast of avatars, the impact on learners can still be impressive.

Packing the kit bag

Shooting 360-degree video requires at least two cameras or lenses. A high-end 360-degree setup might include six cameras. Equipment ranges from a basic camera setup that will produce good-quality video or stills for a few hundred dollars to professional-level gear that runs $5,000 or more.

Ben Kreimer is a journalism technologist and drone journalism expert. In a recent webinar on 360-degree video journalism, Kreimer described assembling a basic two-lens kit based on a Samsung Gear 360 camera and Samsung Galaxy S7 phone for $1,060, including a tripod and memory cards. New and updated equipment appears on the market constantly, though, so pricing and other gear-related information is subject to change.

With anywhere from two to six cameras, a developer will end up with multiple video streams. The video must then be stitched together into the “surround” view; stitching is necessary because no single camera actually shoots in 360 degrees. That could change soon: Publicity for the Orah 4i camera, which might start shipping in early 2017, promises that the camera will shoot “live 360,” with stitching occurring in the camera in real time. No currently available equipment does that, according to Kreimer.
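To make the stitching step concrete, here is a minimal sketch of the underlying idea using OpenCV’s general-purpose panorama stitcher on a single frame from each camera. The file names are placeholders, and real 360-degree workflows rely on the camera vendor’s stitching app and equirectangular output rather than this generic routine.

```python
# Conceptual sketch: blend one frame from each camera into a single panorama.
# File names are hypothetical; production 360 stitching is done by the
# camera vendor's app, frame by frame, into an equirectangular video.
import cv2

frames = []
for path in ["camera_front.mp4", "camera_back.mp4"]:   # placeholder footage
    cap = cv2.VideoCapture(path)
    ok, frame = cap.read()          # grab the first frame only, for illustration
    cap.release()
    if ok:
        frames.append(frame)

# OpenCV finds the overlapping regions and blends them into one image.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("stitched_frame.jpg", panorama)
else:
    print("Stitching failed; the frames may not overlap enough:", status)
```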

Motion is a key element of many 360-degree videos, such as video tours. These require that the videographer move the cameras through the scene. The videographer can hand-carry the cameras, but it’s common to use a vehicle such as a cart or dolly. For aerial shots, you can rig a drone to carry the cameras. The cameras are generally controlled via a smartphone, laptop, or tablet. There is not (yet) a turnkey solution on the market for aerial 360-degree video, like a drone with built-in cameras, according to Kreimer, but some videographers have come up with solutions for mounting the camera rigs on drones; some of these mounts can even be “printed” with 3-D printers.

Stitching the pieces together

So, the video streams have been recorded; next up is editing and production: stitching the video streams together and adding logos and a soundtrack.

Most smartphone or computer apps that control the shooting can also manage the stitching. When the cameras are mounted on a moving dolly or hand-carried, the resulting video usually needs to be “stabilized” as part of the editing and production; otherwise, viewers might become motion-sick from wobbly video. Often, during editing, video producers will superimpose a still image, such as a logo, on the video. Editing software generally can adjust the edges of these added images to match the curve of the 360-degree video.
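The reason a flat logo needs its edges adjusted comes from the equirectangular projection most 360-degree video uses: the farther an overlay sits from the horizon line of the frame, the more it must be stretched horizontally to look straight in the headset. The sketch below illustrates that projection math only; it is not a description of any particular editor’s algorithm, and the latitudes are illustrative.

```python
# Why a flat logo looks warped in an equirectangular 360 frame: horizontal
# distance per degree shrinks with cos(latitude), so an overlay placed away
# from the horizon must be widened to appear undistorted in the headset.
import math

def horizontal_stretch(latitude_deg: float) -> float:
    """Factor by which a flat overlay must be widened at a given latitude."""
    return 1.0 / math.cos(math.radians(latitude_deg))

for lat in (0, 20, 40, 60):                 # illustrative positions in the frame
    print(f"latitude {lat:>2} deg: stretch x{horizontal_stretch(lat):.2f}")
# latitude  0 deg: stretch x1.00  (on the horizon, no correction needed)
# latitude 60 deg: stretch x2.00  (high in the frame, twice as wide)
```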

It’s also possible to add a soundtrack during editing. Developers who want to include audio from the scene, captured while recording the video, will need an external ambisonic mic. Ambisonics is a technique for producing full-sphere surround sound; an ambisonic mic will capture all the sound in the surrounding environment and, during playback, will “position” that sound correctly relative to the video. Typical stereo sound is not matched with the 360-degree video; it does not move with the view as people turn and look around within the video. A stereo 360-degree mic will do a better job than a simple directional mic, but the sound will not be as well integrated with the video as ambisonic sound would be. A simpler option would be to add a voice-over or other soundtrack to the completed video; again, that sound will not be integrated with the video the way ambisonic sound is.
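A rough sketch of why ambisonic sound can “move with” the picture: a first-order ambisonic (B-format) recording has four channels, W, X, Y, and Z, and a player can rotate the X and Y channels with a simple matrix as the viewer turns their head, so each sound stays anchored to its on-screen source. This is the textbook first-order rotation, not any specific player’s code, and the sign convention depends on the coordinate system assumed.

```python
# Sketch: keeping first-order ambisonic (B-format) audio anchored to the scene.
# When the viewer turns by `yaw`, the player counter-rotates X and Y; W (omni)
# and Z (height) are unaffected by a turn about the vertical axis.
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw_radians):
    """Rotate first-order B-format audio to compensate for head yaw.
    Sign convention (which direction counts as positive yaw) is an assumption."""
    c, s = np.cos(yaw_radians), np.sin(yaw_radians)
    x_rot = c * x + s * y
    y_rot = -s * x + c * y
    return w, x_rot, y_rot, z        # W and Z pass through unchanged

# Illustrative use with stand-in one-second buffers at 48 kHz.
n = 48000
w = np.random.randn(n) * 0.1
x, y, z = (np.zeros(n) for _ in range(3))
w2, x2, y2, z2 = rotate_bformat_yaw(w, x, y, z, yaw_radians=np.pi / 2)
```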

Getting technical: Understanding parallax error

It is during the stitching process that eLearning developers working in 360 degrees encounter a technical challenge: parallax error. Parallax error is the difference in how an object appears when it is viewed from different lines of sight. To get an idea of how it looks, cover one eye, then the other, while looking at a stationary object that is right in front of you. It will appear to move. The closer an object is to the lens (or to your eyes), the greater the distortion. Parallax error can occur with 360-degree video when the videos from the different cameras, each a different visual perspective, are stitched together. Compact camera rigs where the lenses are close together produce less parallax error than rigs with more distance between cameras or lenses.

Parallax error is responsible for a key difference between 360-degree and standard video: the inability to take close-up shots in 360. With a single camera, the videographer can zoom in to create shots from very close proximity. With 360-degree video, though, parallax error is generally very noticeable with objects that are closer than about five feet from the camera, according to Kreimer. This generally rules out close-up shots. In his webinar, Kreimer suggested that videographers test a camera setup by recording a person walking around it in concentric, narrowing circles. The resulting video will show the distance at which parallax errors begin to be noticeable.
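A quick back-of-the-envelope version of that five-foot rule: the angular disagreement between two lenses about where an object sits is roughly the lens separation divided by the object’s distance. The 6 cm separation below is an assumed, illustrative figure, not a measurement from Kreimer’s rig.

```python
# Back-of-the-envelope parallax estimate: two lenses a small distance apart
# disagree about an object's direction by roughly (separation / distance)
# radians. The 6 cm lens separation is an assumption for illustration.
import math

def parallax_degrees(lens_separation_m: float, object_distance_m: float) -> float:
    """Approximate angular mismatch between two lenses for a nearby object."""
    return math.degrees(lens_separation_m / object_distance_m)

SEPARATION = 0.06                     # ~6 cm between lenses (assumed)
for feet in (2, 5, 10, 30):
    meters = feet * 0.3048
    print(f"{feet:>2} ft away: ~{parallax_degrees(SEPARATION, meters):.1f} deg mismatch")
# At 2 ft the mismatch is several degrees (visible seams); by 30 ft it drops
# to a fraction of a degree, which stitching software can hide easily.
```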

Videographers often exploit the parallax effect by “hiding” a dolly or tripod and microphones, if these are used. The stitching software looks for areas of overlap and removes or corrects these areas by deleting pixels representing the mismatch of the different visual perspectives. Thus a mic or tripod can be “hidden” in the overlap area; it will essentially disappear during the stitching process. See this blog post by VideoStitch for a more detailed explanation with illustrations.

While not a fully immersive experience (viewers cannot interact with the environment in a 360-degree video as they can in a digitally generated virtual environment), 360-degree video places the learner inside the story in a way that standard two-dimensional video or photography cannot accomplish. It’s a valuable addition to the eLearning toolkit for storytelling, simulations, virtual tours, and other types of visual experiences, as described in “Degrees of Immersion.” For more examples, see Matt Sparks’s “Metafocus” column in tomorrow’s Learning Solutions Magazine!
