Automated non-invasive cell counting in phase contrast microscopy with automated image analysis parameter selection

Rachel Flight, Gabriel Landini, Iain Styles, Richard Shelton, Michael Milward, Paul Cooper

Research output: Contribution to journal › Article › peer-review



Cell counting is commonly used to determine proliferation rates in cell cultures; for adherent cells it is often a ‘destructive’ process requiring disruption of the cell monolayer, which prevents following cell growth longitudinally. The process is also time consuming and consumes significant resources. In this study, a relatively inexpensive, rapid and widely applicable phase contrast microscopy based technique was developed that emulates the contrast changes that take place when bright field microscope images of epithelial cell cultures are defocused. Processing the resulting images produces an image that can be segmented using a global threshold; the number of cells is then deduced from the number of segmented regions, and these cell counts can be used to generate growth curves. The parameters of the method were tuned using the discrete mereotopological relations between ground-truth and processed images. Cell count accuracy was further improved by using linear discriminant analysis to identify spurious noise regions for removal.
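The core counting step described above (global threshold, then counting segmented regions) can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it assumes a preprocessed image in which cells appear as bright blobs, and it substitutes a simple minimum-area filter for the paper's LDA-based removal of spurious noise regions; the function name and `min_area` parameter are hypothetical.

```python
import numpy as np
from scipy import ndimage


def count_cells(image, threshold, min_area=5):
    """Count connected regions above a global threshold.

    Regions smaller than `min_area` pixels are discarded as noise
    (a simplified stand-in for the LDA-based noise-region removal
    described in the abstract).
    """
    # Global threshold gives a binary mask of candidate cell regions.
    mask = image > threshold
    # Label connected components in the mask.
    labels, n_regions = ndimage.label(mask)
    # Area (pixel count) of each labelled region.
    areas = ndimage.sum(mask, labels, index=np.arange(1, n_regions + 1))
    # The cell count is the number of regions surviving the area filter.
    return int(np.sum(areas >= min_area))


# Synthetic example: three 3x3 "cells" plus one isolated noise pixel.
img = np.zeros((20, 20))
img[2:5, 2:5] = 1.0
img[10:13, 10:13] = 1.0
img[2:5, 14:17] = 1.0
img[18, 18] = 1.0  # spurious single-pixel region, removed by the area filter
n = count_cells(img, threshold=0.5)  # → 3
```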
The proposed cell counting technique was validated by comparing its results with manual counts of cells in images, and was subsequently applied to generate growth curves for oral keratinocyte cultures supplemented with a range of concentrations of foetal calf serum. The approach has broad applicability and utility for researchers with standard laboratory imaging equipment.
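Once per-timepoint counts are available, a growth curve and a proliferation rate follow directly. A common approach (assumed here, not taken from the paper) is to fit exponential growth, N(t) = N₀·exp(rt), by a linear fit on log counts; the numbers below are illustrative only.

```python
import numpy as np


def growth_rate(times_h, counts):
    """Fit N(t) = N0 * exp(r * t) via a linear fit on log counts.

    Returns the growth rate r (per hour) and the doubling time (hours).
    """
    r, _log_n0 = np.polyfit(times_h, np.log(counts), 1)
    doubling_time = np.log(2) / r
    return r, doubling_time


# Illustrative counts that double every 24 hours.
times = np.array([0.0, 24.0, 48.0, 72.0])   # hours after seeding
counts = np.array([100, 200, 400, 800])      # cells per field
r, td = growth_rate(times, counts)           # td ≈ 24 h
```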
Original language: English
Journal: Journal of Microscopy
Early online date: 12 Jul 2018
Publication status: E-pub ahead of print - 12 Jul 2018


  • Phase contrast microscopy
  • Cell cultures
  • Growth curve

