A systematic review finds that diagnostic reviews fail to incorporate quality despite available tools

P. Whiting, A.W.S. Rutjes, J. Dinnes, J.B. Reitsma, P.M.M. Bossuyt, J. Kleijnen

    Research output: Contribution to journal › Article › peer-review

    58 Citations (Scopus)

    Abstract

    To review existing quality assessment tools for diagnostic accuracy studies and to examine to what extent quality was assessed and incorporated in diagnostic systematic reviews. Electronic databases were searched for tools to assess the quality of studies of diagnostic accuracy or guides for conducting, reporting or interpreting such studies. The Database of Abstracts of Reviews of Effects (DARE; 1995-2001) was used to identify systematic reviews of diagnostic studies to examine the practice of quality assessment of primary studies. Ninety-one quality assessment tools were identified. Only two provided details of tool development, and only a small proportion provided any indication of the aspects of quality they aimed to assess. None of the tools had been systematically evaluated. We identified 114 systematic reviews, of which 58 (51%) had performed an explicit quality assessment and were further examined. The majority of reviews used more than one method of incorporating quality. Most tools to assess the quality of diagnostic accuracy studies do not start from a well-defined definition of quality. None has been systematically evaluated. The majority of existing systematic reviews fail to take differences in quality into account. Reviewers should consider quality as a possible source of heterogeneity.
    Original language: English
    Pages (from-to): 1-12
    Number of pages: 12
    Journal: Journal of Clinical Epidemiology
    Volume: 58
    Issue number: 1
    DOIs
    Publication status: Published - 1 Jan 2005
