A recent conference on accessible games, #GAConf, highlighted some of the ways in which games present unique challenges to designers and developers seeking to be inclusive.
Learning Solutions Magazine has published several articles with pointers for creating accessible eLearning content (see sidebar). Those principles apply equally to games, but they’re only a starting point. Games have a number of characteristics that require additional design and development attention if they are to be accessible. This article presents five of the more significant—and addressable—issues.
“Accessibility from the Ground Up” series:
- Improve Engagement, Focus, and Comprehension with Closed Captions for eLearning Videos (1/5/17)
- Describe Visual Elements to Enhance eLearning Usability (2/23/17)
What makes games different?
Designers of eLearning content generally make assumptions about learners: that they will use a mouse, keyboard, or trackpad as an input device, for example, and that they will access the content on a computer, tablet, or smartphone. Inclusive eLearning provides keyboard equivalents for all touch- or mouse-controlled actions, closed captioning for audio content, and audio description of video. But most eLearning content has an advantage over game or immersive content: The designer knows where the learner’s attention will be focused. Games often plunge players into an environment where multiple things occur simultaneously. Immersive experiences share this problem with games and add a 360-degree environment to the mix. When a player can be looking anywhere, focusing on any of a number of items, characters, sounds, and sights, where do developers put the captions? When do they add visual cues? What do they describe?
An additional question is how players “input” actions. Many games use controllers other than a mouse or keyboard. While game platforms and controllers might be less of an issue in eLearning games than in commercial games, the question of input is still relevant.
A third key difference is player behavior. In much eLearning content, the learner’s response is predictable, likely limited to a few defined options or branching scenarios. In many challenging games, though, players have multiple—perhaps infinite—options for how to respond to rapidly unfolding situations. A player’s response often has a bearing on their character’s “survival” or success in the game.
Some changes to game design that can increase accessibility despite these challenges are simple and add little or nothing to development costs, while others might incur a larger cost in time and resources. Among the most common accessibility features requested, according to developers at #GAConf, are mappable input controls and high-contrast color schemes. Following close behind are subtitling, audio enhancements, and control over timed events. This article addresses each of these issues, offering suggestions for how developers can improve accessibility.
1. Input controls
Games and immersive experiences are often controlled by devices other than keyboards or mice. Basic accessibility mandates keyboard shortcuts for all mouse-driven activities and compatibility with screen readers, but learning games might need additional tweaking so that players with limited mobility can navigate and move game pieces or characters.
Providing players the ability to map controls onto different devices and decide which or how many buttons or keys to use greatly expands accessibility. Developers should not assume that everyone will play the game using the same sort of control. Some players can use only a single button; others might use four arrow keys; some might use a modified controller or joystick or require a voice input.
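One common way to support this flexibility is to have game code respond to abstract actions rather than physical keys, with a remapping layer in between. The sketch below illustrates the idea; the class, action names, and key names are illustrative assumptions, not part of any particular engine.

```python
# Minimal sketch of remappable input: game code queries abstract actions,
# never physical keys, so players can rebind any action to any input.
# All names here are illustrative, not taken from a specific engine.

DEFAULT_BINDINGS = {
    "move_left": ["ArrowLeft", "a"],
    "move_right": ["ArrowRight", "d"],
    "interact": ["Space", "Enter"],
}

class InputMapper:
    def __init__(self, bindings=None):
        # Copy the defaults so per-player rebinding never mutates them.
        self.bindings = {action: list(keys)
                         for action, keys in (bindings or DEFAULT_BINDINGS).items()}

    def rebind(self, action, keys):
        """Replace an action's inputs; single-switch play can map an action to one button."""
        self.bindings[action] = list(keys)

    def action_for(self, key):
        """Translate a raw key event into a game action (None if unbound)."""
        for action, keys in self.bindings.items():
            if key in keys:
                return action
        return None

mapper = InputMapper()
mapper.rebind("interact", ["b"])  # player remaps "interact" to a single button
```

Because the rest of the game only ever asks "which action was triggered?", the same code path serves a standard keyboard, a modified joystick, or a single-switch controller.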
While the specifics depend on the game and type of control needed, it’s important to offer more than one way for players to control game characters or input information. User tests at all stages of development should include lots of different control devices, as well as players who have various mobility impairments.
And remember: Input control goes beyond actual play and controllers; it includes access to user interface menus, intro screens, trailers, and tutorials. If gameplay is accessible but a player can’t actually reach the game screens, all that careful, inclusive design is for naught.
2. Color schemes
Specific combinations of colors can cause problems for people who are color-blind or have low vision. Remedies include choosing or offering high-contrast color palettes; avoiding problematic color schemes, such as those using a lot of red and green; and making it easy for players to change the color scheme. Those basic steps are the same for game content and other eLearning content.
One area of design that is especially relevant in games and interactive content is the use of colors to identify items or provide directional or navigational cues—for example, using red and green circles for “right” and “wrong” buttons or red and green arrows. Using different shapes or adding icons gets around the color issue by providing additional visual cues. These fixes aid many players, not only those who have impaired vision (Figure 1).
Other configurable options that enhance inclusivity are letting players choose the size and color of text, selecting clear fonts (or offering a choice of fonts), designing large, well-spaced hotspots and buttons, and adding clear visual cues that something is a button, link, or interactive element.
Figure 1: Different shapes and icons can provide additional visual cues besides color
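The principle behind Figure 1 is redundant encoding: no single channel, such as color, should carry the meaning on its own. A minimal sketch of the idea follows; the style tokens and function are hypothetical, for illustration only.

```python
# Sketch: encode feedback redundantly (color + icon + label) so that
# players who cannot distinguish red from green still get the message.
# The style tokens below are illustrative, not a real UI framework.

FEEDBACK_STYLES = {
    "correct": {"color": "#2E7D32", "icon": "check", "label": "Correct"},
    "incorrect": {"color": "#C62828", "icon": "cross", "label": "Incorrect"},
}

def feedback_cues(result):
    """Return every cue a feedback button should present for a result."""
    style = FEEDBACK_STYLES[result]
    return (style["color"], style["icon"], style["label"])
```

A player who cannot perceive the color difference still gets the icon shape and the text label, and sighted players get three reinforcing signals instead of one.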
3. Subtitling and captioning
Subtitles and captions are generally added with deaf or hard-of-hearing learners in mind. In practice, though, many learners and gamers use them even if they do not have impaired hearing. For that reason, a best practice is to make captions and subtitles highly configurable: allow players to choose size, color, font, and position, as well as features like drop shadows and letterboxing, which can make subtitles easier to see for some people. “Advanced” options include adding the speaker’s name or a symbol to indicate who is speaking, or varying the color or placement of the subtitle according to the speaker. Note that each of these features should be individually configurable, since some players will find them distracting.
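In code, "individually configurable" tends to mean a settings object with one toggle or value per feature, each with a sensible default. The shape below is a sketch under that assumption; the field names are illustrative.

```python
# Illustrative caption-settings object: every option is individually
# configurable, since some players find speaker labels or colors distracting.
from dataclasses import dataclass

@dataclass
class CaptionSettings:
    size_pt: int = 18
    font: str = "sans-serif"
    text_color: str = "#FFFFFF"
    background_box: bool = True       # letterbox-style backing for legibility
    drop_shadow: bool = False
    show_speaker_name: bool = False   # "advanced" option, off by default
    color_by_speaker: bool = False

    def render(self, speaker, line):
        """Format one caption line according to the current settings."""
        return f"{speaker}: {line}" if self.show_speaker_name else line

settings = CaptionSettings(show_speaker_name=True)
```

Defaulting the "advanced" options to off respects players who find them distracting, while a single checkbox turns each one on.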
A key difference with captioning or subtitling games versus instructional eLearning content is placement of the captions. On instructional content, ordinary videos, and most simulations, the developer knows where learners are looking, making placement of captions relatively straightforward. But with games where players move a character at will, there is no way to know where the character will be and what direction the player will be looking. In addition, many games use sounds as cues to the location of characters or items or even to indicate danger, potential bonuses, and more. Placing captions where the player will see them is a unique challenge of games and immersive environments. Presenting an equivalent experience to players who cannot hear ambient or locational sounds that provide crucial information is a second challenge.
Solutions suggested by Kari Hattner, a producer at Hangar 13 Games and a conference presenter, include:
- Add visual cues, such as lighting or color changes, to location-specific sound cues. It’s even possible to use visual cues for footsteps.
- Place dialogue subtitles and caption information in bubbles over characters’ heads so that a player will see who is speaking or where the sound originates.
- Represent all essential sounds with visual cues, including cues as to which direction a shot or thrown item is coming from, the splash of something or someone falling into water, etc.
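The directional cues in the last suggestion require translating a sound source's position into an on-screen indicator. The sketch below shows one way to do that in a 2-D world, mapping the angle from player to sound into an eight-way compass direction; the coordinate convention and function are illustrative assumptions.

```python
import math

# Sketch: convert a location-specific sound cue into a directional visual
# indicator, so players who can't hear the sound still know where it came
# from. Assumes a 2-D world with x increasing east and y increasing north.

def direction_of_sound(player_xy, sound_xy):
    """Return an 8-way compass direction ('N', 'NE', ...) from player to sound."""
    dx = sound_xy[0] - player_xy[0]
    dy = sound_xy[1] - player_xy[1]
    # atan2 gives the angle in degrees, with 0 = east, counter-clockwise.
    angle = math.degrees(math.atan2(dy, dx)) % 360
    sectors = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
    # Each 45-degree sector is centered on its compass direction.
    return sectors[int((angle + 22.5) % 360 // 45)]
```

The returned direction could drive an arrow, an edge-of-screen glow, or a lighting change, whichever fits the game's visual language.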
An additional consideration for players with impaired or no hearing in multiplayer games is communication between players. Enabling communication via text chat in addition to voice, and including visual ways to ping or signal other players, enhances inclusivity. These also aid players in groups where there are language barriers or when play is in a noisy environment.
4. Audio enhancement
An opposite challenge arises when considering players with low or no vision. Audio enhancement of visual cues is one answer. This generally includes audio description of an environment as well as of characters and objects, and might include voice-overs of instructions and menu items as well, particularly on platforms that are not compatible with screen readers. Distinct sounds can be paired with locations, characters, objects, or events so that blind players can track their location and know who and what is around them.
Developers can offer additional orientation in the form of “pingable” maps or a spoken “GPS” that players can activate to figure out their location. Players should be able to configure all sounds separately, adjusting the volume of dialogue separately from the volume of ambient sounds or turning on and off sound cues separately from audio description and dialogue.
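Separate configuration of sound categories is usually modeled as audio "buses": each category gets its own volume and on/off toggle, applied on top of a master level. The category names and class below are illustrative assumptions, not a specific audio API.

```python
# Sketch of independently configurable audio channels ("buses"): each
# category has its own volume and mute toggle, applied on top of a master
# level. Category names are illustrative.

class AudioBuses:
    CATEGORIES = ("dialogue", "ambient", "sound_cues", "audio_description")

    def __init__(self):
        self.master = 1.0
        self.volumes = {c: 1.0 for c in self.CATEGORIES}
        self.enabled = {c: True for c in self.CATEGORIES}

    def set_volume(self, category, level):
        # Clamp to the valid 0.0-1.0 range.
        self.volumes[category] = max(0.0, min(1.0, level))

    def toggle(self, category, on):
        self.enabled[category] = on

    def effective_volume(self, category):
        """Final playback level: 0 when muted, else master x category volume."""
        if not self.enabled[category]:
            return 0.0
        return self.master * self.volumes[category]

buses = AudioBuses()
buses.set_volume("ambient", 0.3)   # quiet ambience so dialogue stands out
buses.toggle("sound_cues", False)  # player turns off cue sounds entirely
```

With this structure, a blind player can boost audio description while lowering ambience, and a player who finds cue sounds distracting can mute just that bus.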
Like captioning, audio enhancement improves the experience of players outside the target group, aiding players who have low literacy or language barriers in addition to those with visual impairments.
5. Timed events
In gaming, a “quick-time event” is a limited opportunity to manipulate or control a character or item, often to gain a benefit or simply to stay “alive.” Because the window is brief and usually demands a fast response, these events pose challenges to many players, particularly those with limited mobility or vision and anyone with slow reflexes or reaction times. Eliminating quick-time events entirely is one option for inclusive play, but in some games that would impair the experience of all players. An alternative is to make quick-time events configurable or optional. Some game developers limit access to scoring or leaderboards when features such as quick-time events are adjusted or turned off.
Some players find any timed events, actions that require fast or repeated button presses, or interactive elements that require dragging and dropping or other manipulation tasks, inaccessible, particularly if speed factors into a player’s score. Again, building in configurable options—allowing players to have additional time or eliminating time constraints on these elements, for example—enhances inclusivity. The amount of configurability to provide depends on the target audience, the nature of the game, and the amount of development time and resources available.
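The configurable options described above can be as simple as a player-chosen time multiplier plus a switch that disables timers altogether. The functions below sketch that idea; the settings shape is an illustrative assumption.

```python
# Sketch of a configurable timed event: the time limit can be extended by
# a player-chosen multiplier or disabled entirely. The settings shape is
# an illustrative assumption, not a specific engine's API.

def time_allowed(base_seconds, time_multiplier=1.0, timers_enabled=True):
    """Seconds the player gets to respond; None means no time limit at all."""
    if not timers_enabled:
        return None
    return base_seconds * time_multiplier

def event_passed(response_seconds, base_seconds,
                 time_multiplier=1.0, timers_enabled=True):
    """True if the player's response beat the (possibly relaxed) limit."""
    limit = time_allowed(base_seconds, time_multiplier, timers_enabled)
    return limit is None or response_seconds <= limit
```

Routing every timed check through one settings-aware function like this keeps the accessibility options consistent across the whole game rather than patched event by event.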
Design with inclusivity in mind
The developers who attended #GAConf are highly motivated to create inclusive games that are accessible to a broad population of gamers—and even they acknowledge that not every game can be made accessible to every gamer. Even so, attention to inclusivity early in the design process can lead to small changes that improve the experience of all players.
Joshua Straub, editor-in-chief of DAGERS, a game journalism site for disabled gamers, said, “Not every game can be or has to be accessible to every single person,” but he encouraged developers to make sure that “when you choose to put a barrier in front of any player, you know why you are doing it.”
Some features, such as mappable controls, high-contrast color schemes, and configurable audio and text options, make an enormous difference for many players. “Accessibility equals flexibility, and flexibility sells games,” Straub said. “Meeting the needs of people with disabilities also meets the preferences of other gamers.”
Designing with inclusivity in mind is essential; things that are simple to build in, like alternative color schemes for low-vision or color-blind players, are complicated to retrofit. Developers who are looking for a place to start can take a look at the Game Accessibility Guidelines or the Includification guidelines published by the AbleGamers organization. These guidelines group accessibility features into levels—basic, intermediate, and advanced or good, better, best—and explain who benefits from each adaptation.