TY - GEN
T1 - Challenges in evaluating usability and user experience of reality-based interaction
AU - Christou, Georgios
AU - Law, Effie Lai-Chong
AU - Green, William
AU - Hornbæk, Kasper
PY - 2009
N2 - This workshop aims to further the understanding of the challenges in evaluating the usability and user experience of Reality-Based Interaction (RBI), and to identify effective practical responses to these challenges. The emergence of post-WIMP interfaces has led to new ways of interacting with technologies. However, there are still no integrated ways of evaluating the usability and user experience of these interfaces, so developers and designers are left to devise their own metrics and evaluation methods. This approach is problematic, in that the metrics used in each case may yield results that are neither valid nor meaningful. For this reason, the time is ripe to integrate the methods that have been developed for evaluating interfaces under the RBI umbrella. These measures and techniques will then be turned into a framework that enables designers of RBI interfaces to select appropriate existing methods and tools to systematically evaluate the usability and user experience of their prototypes and products. Reusing and adapting validated evaluation approaches not only avoids reinventing the wheel and wasting time but also further improves and consolidates these approaches. Such a framework will also provide a basis for comparing RBI interface designs across different application contexts.
KW - Evaluation
KW - Human-computer interaction
KW - Reality-based interaction
KW - Usability
KW - User experience
UR - http://www.scopus.com/inward/record.url?scp=70349182072&partnerID=8YFLogxK
DO - 10.1145/1520340.1520747
M3 - Conference contribution
AN - SCOPUS:70349182072
SN - 9781605582474
T3 - Conference on Human Factors in Computing Systems - Proceedings
SP - 4811
EP - 4814
BT - Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009
T2 - 27th International Conference Extended Abstracts on Human Factors in Computing Systems, CHI 2009
Y2 - 4 April 2009 through 9 April 2009
ER -