Evaluation beyond Usability: Validating Sustainable HCI Research

Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, Elaine M. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

16 Citations (Scopus)


The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other sub-disciplines in HCI that encounter similar challenges while evaluating their research.
Original language: English
Title of host publication: Proceedings of ACM CHI 2018 Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Number of pages: 15
ISBN (Print): 978-1-4503-5620-6
Publication status: Published - 21 Apr 2018
Event: ACM CHI 2018 Conference on Human Factors in Computing Systems (CHI 2018) - Montreal, Canada
Duration: 21 Apr 2018 - 26 Apr 2018


Conference: ACM CHI 2018 Conference on Human Factors in Computing Systems (CHI 2018)


Keywords:

  • Sustainable HCI
  • Sustainability
  • Evaluation
  • Validation


