Evaluation beyond Usability: Validating Sustainable HCI Research

Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, Elaine M. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

16 Citations (Scopus)

Abstract

The evaluation of research artefacts is an important step to validate research contributions. Sub-disciplines of HCI often pursue primary goals other than usability, such as Sustainable HCI (SHCI), HCI for development, or health and wellbeing. For such disciplines, established evaluation methods are not always appropriate or sufficient, and new conventions for identifying, discussing, and justifying suitable evaluation methods need to be established. In this paper, we revisit the purpose and goals of evaluation in HCI and SHCI, and elicit five key elements that can provide guidance to identifying evaluation methods for SHCI research. Our essay is meant as a starting point for discussing current and improving future evaluation practice in SHCI; we also believe it holds value for other sub-disciplines in HCI that encounter similar challenges while evaluating their research.
Original language: English
Title of host publication: Proceedings of ACM CHI 2018 Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Number of pages: 15
ISBN (Print): 978-1-4503-5620-6
DOIs:
Publication status: Published - 21 Apr 2018
Event: ACM CHI 2018 Conference on Human Factors in Computing Systems (CHI 2018) - Montreal, Canada
Duration: 21 Apr 2018 – 26 Apr 2018
https://chi2018.acm.org/

Conference

Conference: ACM CHI 2018 Conference on Human Factors in Computing Systems (CHI 2018)
Country/Territory: Canada
City: Montreal
Period: 21/04/18 – 26/04/18
Internet address: https://chi2018.acm.org/

Keywords

  • Sustainable HCI
  • Sustainability
  • Evaluation
  • Validation
