A New Take on Augmented Reality: Cydalion Navigation App Aids People with Low Vision

Cydalion exemplifies innovation: The new augmented reality app from Float departs from the common understanding of augmented reality, implements a new model of mobile eLearning-based performance support—and makes navigating through an unfamiliar environment a whole lot easier for millions of people who are blind or have low vision.

The pioneering app uses sound and vibration to augment users’ understanding of their environment and inform them of obstacles. Cydalion takes its name from Cedalion, a character from Greek mythology who stood on the shoulders of the blind hunter Orion to guide him.

The term augmented reality, or AR, often refers to a visual overlay of information on top of an individual’s view of the world; many people first heard about AR when the mobile game Pokémon Go exploded into popularity in July 2016. But augmented reality does not have to be visual.

Guild Master Chad Udell, managing director at Float and a member of the team that designed and developed Cydalion, offers an alternative definition of augmented reality: “All it needs to do is take in information about the real world and then add an overlay and display it in a different and new way.” Cydalion does exactly that. “The display is actually an audio overlay,” he said.

About 285 million people worldwide—including 6.8 million Americans—have a vision-related disability, according to 2014 data from the US Census and the World Health Organization. Less than 2 percent of American adults who are blind use guide dogs—Guiding Eyes for the Blind, in Yorktown Heights, New York, estimates that there are about 10,000 guide dog teams working in the US—meaning that millions of people in the US alone navigate using a cane or other assistive device. One drawback of a cane is that it doesn’t detect overhead obstacles, like a tree branch or a low-hanging light fixture, putting the user at risk of bumping into these obstacles. Cydalion aids users in detecting overhead obstacles as well as items that they might trip over or crash into, enabling them to move more freely through the world. When paired with bone-conduction or standard headphones that allow users to simultaneously hear Cydalion’s feedback and monitor environmental sound, Cydalion can unobtrusively increase users’ safety and confidence as they traverse crowds, traffic, and other constantly changing environments.

Cydalion runs on a Tango-compatible device, ideally worn or held at chest level. (Currently, the only Tango-compatible consumer device is the Lenovo Phab 2 Pro smartphone. A Tango developer kit is also available. Cydalion is available for purchase in the Google Play Store.)

Navigating by ear

Cydalion works with Google’s Tango technology, which uses a combination of hardware and software to map and “visualize” the user’s environment. Tango devices use multiple cameras and sensors, including a wide-angle “fisheye” camera and a depth camera, to get accurate images of three-dimensional objects and distinguish items from their backgrounds.

“[Tango has] additional software hooks to tap into new or different sensors that are on board these devices: some additional cameras; sensor technologies including infrared, depth perception; a really nice wide-angle fisheye lens; and a new and improved inertial measurement unit (IMU) chip, so it has really good, precise position data,” Udell said. “So the tablets, smartphones, etc., that are enabled with Tango sensors are much more aware of the environment that they operate in than the traditional devices.”

The sensors and cameras gather data points from around any object they detect, using a process called computer vision. Computer vision does not rely on GPS or any external signal to determine the device’s position relative to other objects in the environment. Instead, Tango uses cameras and sensors to bounce infrared beams off of objects and produce three-dimensional images of an object or environment. Computer vision uses algorithms to outline objects in an image and separate them—say, separating a box on the floor from the backdrop of the room’s walls or from other objects on the floor. It is used in autonomous vehicles, medical imaging—and now, as a navigational aid for low-vision or blind pedestrians.
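
To make the depth data concrete, here is a minimal sketch of the underlying geometry: a single depth-camera pixel back-projected into a 3-D point using the standard pinhole camera model. The class name, method names, and intrinsic values are illustrative assumptions for this article, not Tango’s actual API or calibration.

```java
// Minimal sketch: back-projecting a depth-camera pixel into a 3-D point
// using the pinhole camera model. The intrinsics below are illustrative
// values, not Tango's actual calibration or API.
public final class DepthToPoint {

    // Hypothetical camera intrinsics: focal lengths (fx, fy) and
    // principal point (cx, cy), in pixels.
    static final double FX = 520.0, FY = 520.0, CX = 320.0, CY = 240.0;

    /** Converts a pixel (u, v) with measured depth z (meters) into
     *  camera-frame coordinates {x, y, z}. */
    static double[] backProject(int u, int v, double z) {
        double x = (u - CX) * z / FX;
        double y = (v - CY) * z / FY;
        return new double[] { x, y, z };
    }

    public static void main(String[] args) {
        // A pixel left of image center, 1.5 m away, maps to a point
        // roughly 0.3 m to the camera's left.
        double[] p = backProject(220, 240, 1.5);
        System.out.printf("x=%.2f y=%.2f z=%.2f%n", p[0], p[1], p[2]);
    }
}
```

Repeating this for every depth pixel yields the “point cloud” described below.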

“We take that data and turn it into a ‘point cloud’ of the object and translate that point cloud into a nonvisual interface; we are using audio as a sensory substitution. Sight becomes tones, essentially,” Udell said, comparing navigation using this sound information to using echolocation or sonar.
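
As a rough illustration of the sensory-substitution idea Udell describes, the sketch below buckets a point cloud into left, center, and right zones and finds the nearest obstacle in each—values a real app could then render as distinct tones. All names and thresholds here are hypothetical, not Cydalion’s actual code.

```java
import java.util.List;

// Illustrative sketch (not Cydalion's actual code): bucket a point cloud
// into left/center/right zones and report the nearest obstacle in each,
// which an app could then render as a distinct tone per zone.
public final class PointCloudToTones {

    record Point(double x, double y, double z) {}      // camera frame, meters
    enum Zone { LEFT, CENTER, RIGHT }

    /** Returns the distance to the nearest point in each zone,
     *  or Double.POSITIVE_INFINITY if the zone is clear. */
    static double[] nearestByZone(List<Point> cloud) {
        double[] nearest = { Double.POSITIVE_INFINITY,
                             Double.POSITIVE_INFINITY,
                             Double.POSITIVE_INFINITY };
        for (Point p : cloud) {
            Zone z = p.x() < -0.3 ? Zone.LEFT
                   : p.x() >  0.3 ? Zone.RIGHT : Zone.CENTER;
            double d = Math.sqrt(p.x() * p.x() + p.y() * p.y() + p.z() * p.z());
            nearest[z.ordinal()] = Math.min(nearest[z.ordinal()], d);
        }
        return nearest;
    }

    public static void main(String[] args) {
        var cloud = List.of(new Point(-0.5, 0.0, 1.2),   // obstacle on the left
                            new Point( 0.1, 0.2, 2.5));  // obstacle ahead
        double[] d = nearestByZone(cloud);
        for (Zone z : Zone.values())
            System.out.printf("%s: %.1f m%n", z, d[z.ordinal()]);
    }
}
```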

Cydalion “has a library of different sounds for objects that are on your left or in front of you or on your right,” he said. Based on the position, height, and proximity of the object, the app plays a combination of sounds and provides haptic cues, letting the user know where potential obstacles are located. Haptic feedback is perceived using the sense of touch; Cydalion causes the phone to vibrate if the haptic option is turned on. Vibration or tone intensity and speed vary according to the proximity of the objects. Users can configure the vibration and tones. The user interface is customizable, too; users with low vision or colorblindness can choose a configuration that is easier for them to read.
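
One plausible way to vary feedback with proximity—purely an assumption, not Cydalion’s actual tuning—is a linear mapping from obstacle distance to pulse rate and intensity, as in this sketch:

```java
// Illustrative mapping (an assumption, not Cydalion's actual tuning):
// closer obstacles produce faster, stronger pulses. Distances are in
// meters; output is a pulse rate in Hz and an intensity from 0 to 1.
public final class ProximityFeedback {

    static final double MAX_RANGE = 4.0;  // ignore obstacles beyond this

    /** Linearly scales proximity into feedback parameters. */
    static double[] feedbackFor(double distanceMeters) {
        double clamped = Math.min(Math.max(distanceMeters, 0.0), MAX_RANGE);
        double closeness = 1.0 - clamped / MAX_RANGE;   // 0 far .. 1 touching
        double pulseRateHz = 1.0 + 7.0 * closeness;     // 1 Hz far, 8 Hz close
        double intensity = closeness;                   // drives tone volume or vibration
        return new double[] { pulseRateHz, intensity };
    }

    public static void main(String[] args) {
        for (double d : new double[] { 3.5, 2.0, 0.5 }) {
            double[] f = feedbackFor(d);
            System.out.printf("%.1f m -> %.1f Hz, intensity %.2f%n", d, f[0], f[1]);
        }
    }
}
```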

In addition to computer vision, Cydalion uses elements of machine learning, a type of artificial intelligence. The software can “understand” what it detects in the environment, including depth and distance, enabling Cydalion to respond appropriately.

Tango “solved” a problem common to many augmented reality apps: In Pokémon Go, for example, the Pokémon characters can “drift” as a player moves toward them, and the player has to keep finding them again. “Tango nullifies that drift so objects that are placed into a spot stay anchored in that spot,” Udell said. “It’s a technology known as area learning, and that’s what we actually use in Cydalion to establish some level of object permanence.” Thus Cydalion provides users with an accurate idea of where they are in relation to those objects, even when the users are moving.
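
The object-permanence idea can be illustrated with a simple 2-D sketch: anchors are stored once in a fixed world frame, and their positions relative to the user are recomputed from each new pose estimate rather than drifting along with the device. This is a simplification for illustration, not Tango’s area-learning API.

```java
// Minimal 2-D illustration of the "anchoring" idea behind area learning
// (a simplification, not Tango's API): obstacles are stored once in a
// fixed world frame, and their position relative to the user is
// recomputed from each new pose estimate instead of drifting with it.
public final class AnchorDemo {

    record Pose(double x, double y, double headingRad) {}  // device in world frame
    record Vec(double x, double y) {}

    /** Expresses a world-frame anchor in the device's local frame. */
    static Vec worldToLocal(Pose pose, Vec anchor) {
        double dx = anchor.x() - pose.x(), dy = anchor.y() - pose.y();
        double c = Math.cos(-pose.headingRad()), s = Math.sin(-pose.headingRad());
        return new Vec(c * dx - s * dy, s * dx + c * dy);
    }

    public static void main(String[] args) {
        Vec doorway = new Vec(2.0, 0.0);  // anchored once, in world coordinates
        // As the user walks forward, the anchor's local position updates
        // consistently instead of drifting.
        for (double step = 0.0; step <= 1.0; step += 0.5) {
            Vec local = worldToLocal(new Pose(step, 0.0, 0.0), doorway);
            System.out.printf("user at %.1f m -> doorway %.1f m ahead%n",
                              step, local.x());
        }
    }
}
```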

Machine learning offers future directions for Cydalion

Cydalion fits into Float’s overarching goal of seeking new directions in eLearning, Udell said. While it obviously doesn’t fit the typical model of an eLearning “course,” it is a form of performance support: “You’re providing just-in-time information to somebody,” he said. “They can use it; it affects their behavior; and the outcome is more successful on the other end of it all.”

Outside-the-box thinking is essential to creating innovative eLearning. “Float’s heritage is building mobile learning and, by extension, performance support applications,” Udell said. “We’ve had a significantly different view of what eLearning is and what it can be, what the possibilities of all these types of cool technologies are, and what can happen with them when they’re used in different and new ways.”

The company’s research led the team to examine augmented reality. Lots of people look at AR and think of ways to use it in gaming and entertainment, Udell said, but the “level of utility and usefulness in this space, especially in terms of solving real-world problems, serious problems, seemed to be somewhat lacking.”

When studying the AR platforms available to them as application developers, Udell said, they started to wonder: “If these devices can see so much about the world and understand so much about the spaces that are around us, why couldn’t we maybe try to translate that into something that people could benefit from, people that live with blindness or low vision?”

That was the genesis of the Cydalion concept, but, Udell said, the initial conversations weren’t grandiose: “Comically, one of the things that we first thought about was using AR in the Tango platform to assist people with sending text messages while they walked down the street. But if we could use it for something as frivolous and goofy as that, why couldn’t we try and apply it to something real and meaningful and useful?”

Tango and Cydalion are self-contained; the sensors do not require any web or data connection. Future features might rely on data libraries that users would download, though, which could require at least a temporary connection. Udell said that future versions of Cydalion might add “wayfinding tools” that would help users locate doors, for example, or provide navigation inside buildings—directions to a specific shop at a mall, a courtroom, or an office. This service, called micro-navigation, essentially picks up where GPS leaves off. Cydalion could use stored information from the user’s previous visits to a place, or data that other users have uploaded to a Cydalion data library. If a user flies into Chicago’s O’Hare International Airport, for example, the app could recognize the location and offer the user directions to a gate or a specific restaurant.

Another possible future feature would use technology that Float is already working on: recognizing faces or objects. Using machine learning, it’s possible to “train” an application to recognize specific individuals from photos; if Cydalion implemented this feature, the app could alert a user when those individuals were nearby. It’s also possible that the app could be “taught” to recognize categories of items, like chairs, by performing what is essentially high-powered pattern detection. With this information, Cydalion could help users find doors, trash cans, chairs, elevators—anything it had been taught to recognize. Those are long-term goals, according to Udell. “We do have the basics of those types of technology building blocks inside this application,” and the company is exploring the possibilities for future enhancements, Udell said. “The possibilities are amazing.”
