Video Analysis & Coding
Qualitative video analysis is a key tool in UX research for reconstructing real usage processes in detail. Instead of relying solely on questionnaires or metrics, researchers analyze what users actually do, say, and show - for example, gaze behavior, hesitation, or unintended clicks. Particularly in usability tests and field observations, this method provides deep insight into cognitive processes and interaction problems.
Why Video Analysis Makes Sense
While live observation is often limited by its fleeting nature, video analysis allows individual sections of behavior to be observed repeatedly and with precise timing. Nonverbal reactions in particular, such as frowning while reading a label or withdrawing the hand after a click, are easier to capture on video and to discuss within the team.

Example: A user clicks on “Order” and immediately withdraws her hand. The subsequent analysis makes clear that she was unsure whether she had already completed a purchase - a typical UI friction point that would have gone unnoticed without video.
How Does Video Analysis Work?
A structured analysis process begins with selecting relevant recordings, such as specific test tasks or notable sessions. The sequences are then transcribed or described: not only spoken statements but also mouse movements, interactions, and reactions are noted, often supplemented by timestamps.
In the actual coding, individual events are then assigned to specific categories such as “confusion”, “successfully resolved”, “system feedback missing”, or “unclear wording”. Coding can proceed deductively (from predefined categories) or inductively (with categories derived from the material).
Example coding:
- 00:01:45: Hesitation before button click - user reads the button text three times
- 00:03:10: System feedback missing - no hint shown after clicking “Send”
- 00:05:30: Success - form filled out correctly and sent
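The same coded events can also be kept in a lightweight, machine-readable form, which makes later counting and filtering easier. A minimal sketch in Python; the field names and category labels are illustrative assumptions, not a fixed standard:

```python
# Minimal sketch: coded video events as structured records.
# Field names and category labels are illustrative assumptions.
from collections import Counter

coded_events = [
    {"timestamp": "00:01:45", "code": "hesitation",
     "note": "User reads the button text three times before clicking."},
    {"timestamp": "00:03:10", "code": "system_feedback_missing",
     "note": "No hint shown after clicking 'Send'."},
    {"timestamp": "00:05:30", "code": "success",
     "note": "Form filled out correctly and sent."},
]

# Count how often each code occurs in the session.
code_counts = Counter(event["code"] for event in coded_events)
print(code_counts.most_common())
```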
Typical Application Contexts
Video analysis is not limited to classic usability tests. It is also helpful in remote tests with recorded sessions and in field studies of real everyday use. For example, it can be used to investigate how commuters use a navigation app on the subway - including contextual factors such as a poor internet connection, background noise, or one-handed operation.

In exploratory studies, for example when introducing new technologies such as voice assistants or mixed reality applications, video analysis helps researchers understand spontaneous reactions and usage patterns that were not anticipated.
Tools and Platforms
Different tools are used depending on the depth of analysis and team structure:
- Lookback, PlaybookUX or UsabilityHub offer simple integrated recording and tagging functions.
- For more complex evaluations, Dovetail, Airtable, Taguette or scientific software such as MAXQDA, ATLAS.ti or ELAN are used.
- In smaller teams, a structured table in Excel or Google Sheets documenting timestamps, codes, and quotations is also sufficient (see the evaluation sketch after the practical tip below).
Practical tip: Color-coding categories (e.g. red = problem, green = success) makes patterns quick to spot - for example, when several participants drop out at the same point.
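Such a table can also be evaluated directly from a spreadsheet export. A minimal sketch with pandas, assuming a hypothetical export file coding.csv with the columns participant, timestamp, task_step, and code:

```python
# Minimal sketch: finding recurring problem points across participants.
# Assumes a hypothetical spreadsheet export "coding.csv" with the
# columns: participant, timestamp, task_step, code.
import pandas as pd

df = pd.read_csv("coding.csv")

# Keep only events coded as problems (the label "problem" is an assumption).
problems = df[df["code"] == "problem"]

# How many distinct participants hit a problem at each task step?
hotspots = problems.groupby("task_step")["participant"].nunique()

# Steps where several participants struggle are candidate friction points.
print(hotspots.sort_values(ascending=False))
```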
Ensuring Quality: Best Practices
Careful video analysis depends on clear categories and intersubjective traceability. It is advisable to develop a shared coding scheme and have several people code independently. The consistency of the coding across raters (interrater reliability) serves as an indicator of how robust the analysis is.
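Interrater reliability can be quantified, for example, with Cohen’s kappa. A minimal sketch using scikit-learn; the two label sequences are invented example data:

```python
# Minimal sketch: interrater reliability via Cohen's kappa.
# The two coders' label sequences are invented example data; in practice
# they would come from the same video segments coded independently.
from sklearn.metrics import cohen_kappa_score

coder_a = ["confusion", "success", "confusion", "feedback_missing", "success"]
coder_b = ["confusion", "success", "success", "feedback_missing", "success"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values close to 1 indicate high agreement
```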
Triangulation is also important: observations from the video should be combined with other sources such as interview statements, protocols, or metrics (e.g. task success rate) to obtain a complete picture.
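One way to combine both levels is to relate coded problems per participant to a quantitative outcome such as task success. A minimal sketch; all participant IDs, outcomes, and counts are invented for illustration:

```python
# Minimal sketch: triangulating video codes with a quantitative metric.
# Participant IDs, task outcomes, and problem counts are invented examples.
task_success = {"P1": True, "P2": False, "P3": True, "P4": False}
problem_counts = {"P1": 0, "P2": 3, "P3": 1, "P4": 4}

success_rate = sum(task_success.values()) / len(task_success)
print(f"Task success rate: {success_rate:.0%}")

# Do participants who failed the task also show more coded problems?
for pid, succeeded in task_success.items():
    outcome = "success" if succeeded else "failure"
    print(f"{pid}: {outcome} | coded problems: {problem_counts[pid]}")
```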
Data protection law must also be observed: video recordings may only be stored and used in anonymized form or with explicit consent. This should be communicated transparently, especially in remote contexts.
Conclusion
Video analysis and coding enable a precise, dense description of the user experience that goes far beyond pure metrics. Particularly for complex systems, sensitive tasks, or ambiguous UI moments, this method makes it possible to identify user-side problems in detail and on the basis of evidence, and to derive targeted optimizations from them.
Video Analysis in Usability Research
These articles examine methods for the visual and systematic evaluation of usability test videos, ranging from classic frameworks to collaborative tools and AI-supported coding approaches.
Visualizing Usability Testing Video Data
Introduces a visual scheme for coding and analyzing video data from usability tests to systematically capture emotions and thoughts.
Cho, S., Frost, A., Providenti, M., & Reichler, D. (2017). Visualizing usability testing video data. In International Conference on Design of Communication. https://doi.org/10.1145/3121113.3121237
CoUX: Collaborative Visual Analysis of Think-Aloud Usability Test Videos
AI-supported tool for annotated evaluation of think-aloud videos. Facilitates collaborative UX analysis through automatic problem indicators.
Jahangirzadeh Soure, E., Kuang, E., Fan, M., & Zhao, J. (2021). CoUX: Collaborative visual analysis of think-aloud usability test videos. IEEE Transactions on Visualization and Computer Graphics. https://doi.org/10.1109/TVCG.2021.3114822
Development of a Video Coding Scheme for Health Information Systems
Extension of classic usability coding with socio-technical variables (e.g. error communication, security behavior).
Discount Video Analysis for Usability Engineering
Proposal to automate coding (e.g. through sound recognition) to save time in iterative UX processes.
Chignell, M., Motoyama, T., & Melo, V. (1995). Discount video analysis for usability engineering. In Advances in Human Factors/Ergonomics. https://doi.org/10.1016/S0921-2647(06)80237-6
The Video Analysis Method: An Integrated Approach
Structured framework for linking video, transaction logs and usability metrics for deeper insights.
Prasse, M. J. (1990). The video analysis method: An integrated approach to usability assessment. In Proceedings of the Human Factors Society Annual Meeting. https://doi.org/10.1177/154193129003400436
Last modified: 17 June 2025