Tify: a quality-based frame selection tool for improving the output of unstable biomedical imaging

Research output: Contribution to journal › Article

The ability to image biological tissues is critical to our understanding of a range of systems and processes. For in situ living tissue, such imaging is hampered by the innate mechanical properties of the tissue, which often makes it challenging to process large volumes of image data containing motion aberrations. Current tools generally require the provision of reference images and are unable to maintain temporal correlations within an image set. Here, we describe a tool, Tify, which can accurately predict a numerical quality score that agrees closely with human scoring, and which can analyse image sets in a manner that preserves temporal relationships. The tool uses regression-based techniques to link image statistics to image quality, based on user-provided scores from a sample of images. Scores calculated by the software correlate strongly with those provided by human users. We identified that, in most cases, users need to score only 20–30 frames for the software to score the remaining images accurately. Importantly, our results suggest that the software can reuse coefficients generated from consolidated image sets to process images without the need for additional manual scoring. Finally, the tool is able to use a frame-windowing technique to identify the highest-quality frame within a moving window, thus retaining macro-chronological connections between frames. In summary, Tify is able to successfully predict the quality of images in an image set based on a small number of sample scores provided by end-users. This software has the potential to improve the effectiveness of biological imaging techniques where motion artefacts, even in the presence of stabilisation, pose a significant problem.
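The approach described above — fitting a regression from per-frame image statistics to a handful of user scores, then selecting the best frame per moving window — can be sketched roughly as follows. This is an illustrative sketch only, not Tify's actual implementation: the function names, the choice of statistics (brightness, contrast, gradient energy), and the non-overlapping window are assumptions for the example.

```python
import numpy as np

def frame_stats(frame):
    # Illustrative per-frame statistics used as quality predictors;
    # the statistics Tify actually uses are described in the paper.
    gy, gx = np.gradient(frame.astype(float))
    return np.array([
        frame.mean(),               # brightness
        frame.std(),                # contrast
        (gx**2 + gy**2).mean(),     # gradient energy (sharpness proxy)
    ])

def design_matrix(frames):
    # Stack statistics for all frames, with an intercept column.
    stats = np.array([frame_stats(f) for f in frames])
    return np.hstack([np.ones((len(frames), 1)), stats])

def fit_quality_model(scored_frames, user_scores):
    # Least-squares regression linking statistics to user scores
    # (the paper reports that scoring ~20-30 frames suffices).
    X = design_matrix(scored_frames)
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(user_scores), rcond=None)
    return coeffs

def predict_scores(frames, coeffs):
    # Score every remaining frame with the fitted coefficients.
    return design_matrix(frames) @ coeffs

def select_best_per_window(scores, window=5):
    # Keep the index of the highest-scoring frame from each
    # non-overlapping window, preserving macro-chronological order.
    return [start + int(np.argmax(scores[start:start + window]))
            for start in range(0, len(scores), window)]
```

In this sketch a user would score a small sample of frames, call `fit_quality_model` on them, apply `predict_scores` to the full set, and then use `select_best_per_window` to retain one frame per window so the output sequence still follows the original timeline.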


Original language: English
Article number: e0213162
Number of pages: 17
Journal: PLoS ONE
Issue number: 3
Publication status: Published - 11 Mar 2019