In our earlier post on accessibility services in 360° environments (here), we described how these services can enhance user experiences and make 360° videos more accessible.
In the EU-funded Immersive Accessibility (ImAc) project, subtitles are extended with guiding cues that support easier speaker identification within the 360° space. For instance, a cue can be an arrow shown to the left or right of the subtitle line, pointing toward the current speaker. User tests have shown that these cues are highly appreciated.
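As a rough illustration of how such a cue could be chosen (a hypothetical sketch, not the actual ImAc player code), the decision can be based on the angular difference between the viewer's current yaw and the speaker's position on the 360° sphere:

```python
def arrow_cue(speaker_yaw_deg: float, viewer_yaw_deg: float,
              fov_deg: float = 90.0) -> str:
    """Decide which guiding cue to show next to the subtitle.

    Hypothetical helper: yaw angles are in degrees and assumed to
    increase to the viewer's left (counter-clockwise around the
    vertical axis of the 360° scene).
    """
    # Signed angular difference, normalized to (-180, 180].
    diff = (speaker_yaw_deg - viewer_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= fov_deg / 2:
        return "none"      # speaker is within the current field of view
    return "left" if diff > 0 else "right"

print(arrow_cue(120.0, 0.0))   # speaker far to the viewer's left -> "left"
```

With a 90° field of view, a speaker within ±45° of the viewing direction needs no arrow; otherwise the sign of the normalized difference picks the side.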
In addition to the presentation of subtitles for end users, we were inspired to investigate whether the authoring of subtitles for 360° content can be enhanced as well. Traditionally, subtitles are edited and authored in a "regular" PC application, using mouse and keyboard to control the video and edit the subtitles. To support the above-mentioned guiding cues, the subtitle editor needs to set the speaker position in the 360° scene for each subtitle.
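A minimal way to model this extension is to attach a position on the 360° sphere to each subtitle cue. The structure below is purely illustrative; the actual ImAc tools work with standardized subtitle formats:

```python
from dataclasses import dataclass

@dataclass
class Subtitle:
    """One subtitle cue, extended with a speaker position.

    Hypothetical structure for illustration only. Longitude and
    latitude locate the speaker on the 360° sphere, in degrees.
    """
    start: float            # cue start time in seconds
    end: float              # cue end time in seconds
    text: str
    longitude: float = 0.0  # 0 = straight ahead, positive = to the left
    latitude: float = 0.0   # 0 = horizon, positive = up

cue = Subtitle(12.0, 15.5, "Hello there!", longitude=-70.0)
print(cue.longitude)  # -70.0
```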
For 360° content, it may feel more natural and intuitive if editors can produce subtitles from within the VR experience they are creating; this could benefit the production process and perhaps even the final product itself.
To test this, we developed a prototype targeting a very basic scenario with straightforward interaction: a subtitle editor watches the 360° video, including pre-prepared subtitles, through VR glasses. They can set the speaker position for each subtitle directly inside the 360° environment using a VR controller. Again, the idea is that the editor is closer to the final product, i.e. to the experience that the end user will have.
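Conceptually, setting the position with a VR controller boils down to converting the controller's pointing ray into coordinates on the 360° sphere. The following is a minimal sketch (not the actual prototype code), assuming a right-handed coordinate system with y up and -z straight ahead:

```python
import math

def ray_to_speaker_position(dx: float, dy: float, dz: float):
    """Convert a VR controller ray direction into a speaker position.

    Hypothetical helper: (dx, dy, dz) is the controller's direction
    vector in a right-handed system with y up and -z forward. Returns
    (longitude, latitude) in degrees, where positive longitude is to
    the viewer's left and positive latitude is above the horizon.
    """
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / length, dy / length, dz / length
    longitude = math.degrees(math.atan2(-dx, -dz))  # 0 = straight ahead
    latitude = math.degrees(math.asin(dy))          # 0 = horizon
    return longitude, latitude

lon, lat = ray_to_speaker_position(0.0, 0.0, -1.0)  # pointing straight ahead
```

The resulting pair could then be stored with the subtitle that is currently active when the editor confirms the position.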
The video below shows a short summary of the prototype.
We invited professional subtitle editors to evaluate the positioning process via the ImAc subtitle editing tool (running on a PC) and our VR prototype. The qualitative study comprised two expert groups, one coached by IRT and one by the Autonomous University of Barcelona (UAB). The experts tried out both tools for setting the speaker position, discussed how the methods performed, and gave suggestions for improvement.
The participants in these focus groups had no major problems with either method of setting the speaker position; both were described as intuitive. Although the participants in the IRT focus group found it hard to compare the tools directly, they found the VR environment easier to navigate and appreciated that the process requires fewer steps than the PC-based method. As well-established subtitle editing workflows are currently PC-based, a solution built on the ImAc editing tool would be easier to incorporate. Since our prototype does not yet support all features of a professional subtitle editing application, it was difficult to say whether a workflow entirely in VR would be equally productive. The UAB focus group could imagine synchronizing the PC and VR versions so that one person works on the PC and another in VR, each focusing on different aspects of editing.
The focus groups also made several suggestions for improvement, for example the ability to mark a subtitle for later revision. The participants at UAB suggested an option to save the position of specific speakers for re-use in similar scenes. The IRT focus group would like to be able to set the speaker position while the video is playing (currently the video stops when a speaker position must be set for a subtitle). They also stated that the tool should be expandable so that further relevant editing functions can be used directly in VR.
Interaction techniques like the one shown in our prototype bring many opportunities that encourage us to rethink production processes and open up new ways of content creation. Perhaps VR production environments can help make media more accessible to everyone.
If you want to know more, please get in touch with the authors.
The ImAc project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 761974.