A couple of my children were home from university recently. Just like when they were young, I suggested a movie night. Settle in on the couch, snuggle under your favourite blanket, watch trailers, argue about which movie to watch for thirty minutes, and then sit back and enjoy the chosen movie. Ahhh – the nostalgia of it all.
Everything went according to plan until the movie started. A couple of flicks on the remote, and subtitles were up on screen. I checked the language of the movie. Sure enough – it was in English. No foreign-movie subtitles required. I asked the kids about their health and whether they had developed any sudden hearing difficulties, and I received a negative response.
So why did we have subtitles turned on?
This was simply part of a modern trend. Seventy per cent of Gen Z viewers prefer to watch video content across all of their devices with closed captions! Who knew? I hadn't received the memo – or, for that matter, the captioned video telling me this was now a thing!
In 1979 the BBC in the UK became the first broadcaster to include closed captions (or subtitles – there is a difference). American networks followed in 1980 and Australia joined in 1982.
In 1980 there was no option to simply turn subtitles on or off. You needed to purchase a separate set-top decoder box to plug into your television – a purchase of around $1,300 in current terms. That was a lot to pay for the small number of programs that had pre-recorded captioning. Real-time captioning started in 1982 with only a few live programs, using stenotype operators who could type at speeds of over 225 words per minute.
In 1990 the US passed the Television Decoder Circuitry Act, which required television receivers (with screens 13 inches or larger) to have captioning circuitry integrated into the set by 1 July 1993. Now you could just click a button on your remote control to see captioning. For the programs that had it.
People often see logos with CC displayed (for Closed Captioning), but terms such as Open Captioning and Subtitles are also used. The terms have different meanings in different countries, but normally subtitles display the spoken words only and typically cannot be turned on or off. These are used, for example, with foreign-language films, where it is assumed you can hear the audio but can't understand the words. Captioning gives the words and also describes sounds (for example, a gunshot or a siren blaring), with the assumption that the person watching has hearing difficulties and needs help with the context of important sounds. Open Captioning is displayed at all times, while Closed Captioning can be turned on and off.
Live text is now mostly generated by voice-to-text technology, while pre-recorded shows often use the show's script turned into an .SRT file – a simple text file of captions and timecodes that is then matched with the video. You can sometimes pick up on this method when you watch a show with captioning where the spoken words have the same intent as the text but are obviously different. The actor made a mistake, but the meaning was the same, so the director has let it slide.
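For the curious, here is a sketch of what a fragment of an .SRT file looks like – the timings and dialogue are invented for illustration, not from any real show. Each caption has a sequence number, a start and end timecode, and the text to display:

    1
    00:00:12,500 --> 00:00:15,000
    Did you hear that?

    2
    00:00:15,200 --> 00:00:17,800
    [siren blaring in the distance]

The player simply shows each block of text on screen between its start and end times – which is why a caption with the wrong wording can sail past in perfect sync.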
Watching shows with captioning is now mainstream. Netflix has seen captioning use double since 2017, captions appear automatically in videos on YouTube and Facebook, and our smartphones can now live-caption our conversations.
Tell me if you are a captioning fan at ask@techtalk.digital
Mathew Dickerson