Validation of the Mobile Application Rating Scale (MARS)

Yannik Terhorst*, Paula Philippi, Lasse B. Sander, Dana Schultchen, Sarah Paganini, Marco Bardus, Karla Santo, Johannes Knitza, Gustavo C. Machado, Stephanie Schoeppe, Natalie Bauereiß, Alexandra Portenhauser, Matthias Domhardt, Benjamin Walter, Martin Krusche, Harald Baumeister, Eva Maria Messner

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)

Abstract

Background: Mobile health apps (MHA) have the potential to improve health care. The commercial MHA market is growing rapidly, but the content and quality of available MHA are largely unknown. Instruments for assessing the quality and content of MHA are therefore urgently needed. The Mobile Application Rating Scale (MARS) is one of the most widely used tools for evaluating the quality of MHA, yet only a few validation studies have investigated its metric quality, and no study has evaluated its construct and concurrent validity.

Objective: This study evaluates the construct validity, concurrent validity, reliability, and objectivity of the MARS.

Methods: Data were pooled from 15 international app quality reviews to evaluate the metric properties of the MARS. The MARS measures app quality across four dimensions: engagement, functionality, aesthetics, and information quality. Construct validity was evaluated by comparing competing measurement models using confirmatory factor analysis (CFA). Noncentrality (RMSEA), incremental (CFI, TLI), and residual (SRMR) fit indices were used to evaluate goodness of fit. As a measure of concurrent validity, correlations with another quality assessment tool (ENLIGHT) were investigated. Reliability was determined using McDonald's omega, and objectivity was assessed by intraclass correlation (ICC).

Results: In total, MARS ratings of 1,299 MHA covering 15 health domains were included. The CFA confirmed a bifactor model with a general factor and a specific factor for each dimension (RMSEA = 0.074, TLI = 0.922, CFI = 0.940, SRMR = 0.059). Reliability was good to excellent (omega = 0.79 to 0.93), objectivity was high (ICC = 0.82), and the MARS correlated with ENLIGHT (ps < .05).

Conclusion: The metric evaluation of the MARS demonstrated its suitability for quality assessment. As such, the MARS could be used to make the quality of MHA transparent to health care stakeholders and patients. Future studies could extend the present findings by investigating the retest reliability and predictive validity of the MARS.
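
Note: To make the Methods concrete, the following is a minimal Python sketch of how a bifactor CFA and an intraclass correlation of this kind could be specified, here using the semopy and pingouin packages. All item names, file names, and column names are hypothetical placeholders, the sketch is not the authors' analysis pipeline, and McDonald's omega (which can be derived from the standardized loadings) is omitted for brevity.

# Minimal, illustrative sketch only: item names (eng1..eng5, fun1..fun4,
# aes1..aes3, inf1..inf4), file names, and column names are hypothetical
# placeholders; this is not the authors' analysis code.
import pandas as pd
import semopy            # structural equation modelling (CFA)
import pingouin as pg    # intraclass correlation (objectivity)

items = pd.read_csv("mars_item_ratings.csv")  # one row per app, one column per MARS item

# Bifactor specification: a general app-quality factor (g) plus one specific
# factor per MARS dimension; all factors are kept orthogonal by fixing the
# latent covariances to zero (lavaan-style 0* syntax).
BIFACTOR = """
g   =~ eng1 + eng2 + eng3 + eng4 + eng5 + fun1 + fun2 + fun3 + fun4 + aes1 + aes2 + aes3 + inf1 + inf2 + inf3 + inf4
eng =~ eng1 + eng2 + eng3 + eng4 + eng5
fun =~ fun1 + fun2 + fun3 + fun4
aes =~ aes1 + aes2 + aes3
inf =~ inf1 + inf2 + inf3 + inf4
g ~~ 0*eng
g ~~ 0*fun
g ~~ 0*aes
g ~~ 0*inf
eng ~~ 0*fun
eng ~~ 0*aes
eng ~~ 0*inf
fun ~~ 0*aes
fun ~~ 0*inf
aes ~~ 0*inf
"""

model = semopy.Model(BIFACTOR)
model.fit(items)
print(semopy.calc_stats(model).T)   # reports RMSEA, CFI, TLI among other fit indices

# Objectivity: intraclass correlation across independent raters, computed from
# a long table with one row per (app, rater) pair and the total MARS score.
ratings = pd.read_csv("mars_rater_scores.csv")  # columns: app, rater, total_score
icc = pg.intraclass_corr(data=ratings, targets="app", raters="rater", ratings="total_score")
print(icc)

Competing specifications (for example, a correlated four-factor model without the general factor) can be written in the same syntax and compared on the same fit indices.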

Original language: English
Article number: e0241480
Journal: PLoS ONE
Volume: 15
Issue number: 11 November
DOIs
Publication status: Published - Nov 2020

Bibliographical note

Publisher Copyright:
© 2020 Terhorst et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

ASJC Scopus subject areas

  • General
