Joe Clark: Media access

Adding words to the screen

See also: Sidebar on audio description

Updated 2001.07.15

The MoPix project opens up first-run movies for deaf moviegoers

It’s been almost 20 years since captioning for deaf and hard-of-hearing viewers debuted on TV and more than a decade since closed-captioning – which requires a separate decoder to display the captions onscreen – made all kinds of television programming widely accessible. But what about first-run films? Until now, a deaf or hard-of-hearing moviegoer keen on seeing Forrest Gump or Four Weddings and a Funeral had to wait until the movie appeared on home video, and even then it might not be closed-captioned. Subtitled movies are of little benefit: They are rare at North American movie houses, and subtitles, intended for hearing people, don’t convey the information deaf and hard-of-hearing viewers need (for example, clear indication of who is speaking and notation of sound effects like ringing telephones and thunderclaps).

Now, researchers at the National Center for Accessible Media at WGBH – the Boston PBS Überstation whose Caption Center division has been turning out high-quality captions for two decades – have devised a gamut of prototype technologies to make first-run movies in cinemas accessible to deaf and hard-of-hearing viewers. NCAM’s Motion Picture Access Project, known in-house as MoPix, got its start with a modest U.S. government grant in 1992, when researchers began designing prototype systems for displaying captions in cinemas.

To be a success, a movie-captioning system faces stringent criteria. It has to be cheap and easy to use, legible, compact, and above all, unattractive to thieves. And, says NCAM director Larry Goldberg, a system would have to function virtually anywhere in the house “so you didn’t have to have a specialized ‘deaf section’ in the theatre.”

NCAM has conducted field tests of three different technologies for making cinemas accessible. The tests, which took place in Boston and Washington cinemas (including a massive Imax screen at the National Air and Space Museum), included hearing people in the audience, Goldberg explains, “because we wanted to see what hearing people would think if they were going to a theatre with these devices around them. Would these devices interfere with their pleasure?” (The “necessity” to avoid bothering finicky hearing viewers has forced designers of captioning systems around the world to come up with sophisticated, expensive techniques to keep captions out of hearing viewers’ sight. However, there is no solid research showing that most hearing people will always object to captioning, and no meaningful public trials of always-visible captioning have ever been held.)

The technologies MoPix engineers tested were:

  • A seatback display: Mounted in the back of each seat is a green fluorescent display (the kind you find on many supermarket cash registers) which displays the caption text. The display can be moved up or down to place it in a comfortable viewing area.
  • A rearview display: A large light-emitting diode (LED) screen is installed at the back of the cinema which displays caption text in reverse. Using a transparent but somewhat reflective plexiglas panel secured to the seat by a gooseneck stalk, a viewer can watch the film through the plexiglas and read the reflected captions at the same time.
  • Virtual Vision glasses: These high-tech gizmos, akin to a video Walkman, were originally designed to make TV viewing portable. Virtual Vision glasses incorporate a tiny TV screen whose image is reflected onto the front of the glasses, making the image appear to float in mid-air. Caption text is displayed on the “virtual” TV screen while the viewer watches the movie through the glasses.

All the technologies have pros and cons, and all received mixed reviews from deaf and hard-of-hearing users in the field tests. The seatback display is readable but potentially bothersome to people seated alongside or behind such a display (though directional filters can limit the spillover of light). The rearview system’s reflector panel is dirt cheap and nearly disposable, but the LED screen needed to drive it costs good money (over US$12,000). Virtual Vision glasses sell for about $400 each and start to feel heavy when worn for an entire two-hour movie.

Goldberg doesn’t expect to see real-world production versions of any of these technologies for another two years. Production versions may make use of Cine-text, a National Film Board of Canada system for synchronizing titles of all kinds to a film’s audio. Cine-text digitizes the film’s soundtrack (dialogue and all) and compares the stored version with the soundtrack as it is actually heard in a theatre. If some snippet of film is missing, or if a projectionist is slow at changing reels, the system automatically compensates by dropping or delaying the appropriate titles, according to Ed Zwaneveld of the NFB’s R&D division.
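The compensation logic Zwaneveld describes can be sketched in miniature. The code below is an illustrative assumption, not the NFB’s actual implementation: it estimates how far the live soundtrack has drifted from the stored reference by cross-correlating the two audio signals, then delays every pending title by that offset (or drops titles whose cue has already passed, as when a snippet of film is missing).

```python
import numpy as np

def estimate_offset(reference, live, sample_rate):
    """Estimate how far the live audio lags the stored reference,
    in seconds, by finding the lag that maximizes cross-correlation.
    A positive result means the live soundtrack is running behind."""
    corr = np.correlate(live, reference, mode="full")
    lag = int(corr.argmax()) - (len(reference) - 1)
    return lag / sample_rate

def resync_titles(titles, offset_seconds):
    """Shift each (cue_time, text) title by the measured offset,
    dropping titles whose cue has already gone by (negative time)."""
    return [(t + offset_seconds, text)
            for t, text in titles
            if t + offset_seconds >= 0.0]
```

For example, if the live feed lags the reference by half a second, `estimate_offset` returns roughly `0.5`, and `resync_titles` pushes every remaining title half a second later; a negative offset (projector running ahead, or footage missing) instead drops any title whose moment has passed.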

By combining MoPix captioning technologies and Cine-text – possibly supplemented by “audio description” for blind people (see sidebar) – it may someday be possible to run a feature film in its original version with captions for deaf viewers, subtitles for hearing viewers in a variety of languages, and narrations for blind viewers, with all those groups seated shoulder-to-shoulder in the same theatre at the same time. Talk about national unity!