Abstract
Water pollution is a widespread problem, with lakes, rivers, and oceans contaminated by increasing amounts of microplastics and other pollutants. Counting microplastics from microscope images is a laborious, time-consuming, and error-prone task, and the ability to automate their detection and counting would accelerate research and monitoring activities. This paper applies machine learning techniques to automatically segment and count microplastics in a given image under challenging, cluttered conditions. A U-Net neural network was trained to segment microplastics, and image post-processing techniques were then applied to count the microplastics and highlight their positions in the image. Different forms of skip connections from the encoder layers to the decoder layers were tested to assess their impact on the performance of the U-Net architecture. Our work shows that U-Net can achieve human-level performance in enumerating microplastics in cluttered images and that the standard skip-connection architecture is not necessarily optimal.
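The pipeline described in the abstract can be illustrated with a minimal sketch (not the authors' implementation, sketched here in PyTorch): a U-Net-style decoder block whose skip connection can be switched between concatenation, element-wise addition, or none, followed by a connected-component post-processing step that counts segmented particles. The channel sizes, the 0.5 threshold, and the `mode` names are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of (1) a decoder block with a configurable skip connection
# and (2) counting microplastics from a binary segmentation mask.
import torch
import torch.nn as nn
from scipy import ndimage


class DecoderBlock(nn.Module):
    """Upsample, optionally merge an encoder feature map, then convolve."""

    def __init__(self, in_ch: int, skip_ch: int, out_ch: int, mode: str = "concat"):
        super().__init__()
        self.mode = mode
        self.up = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        merged_ch = out_ch + skip_ch if mode == "concat" else out_ch
        self.conv = nn.Sequential(
            nn.Conv2d(merged_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        x = self.up(x)
        if self.mode == "concat":
            x = torch.cat([x, skip], dim=1)      # standard U-Net skip connection
        elif self.mode == "add":
            x = x + skip                          # assumes skip_ch == out_ch
        # mode == "none": ignore the encoder features entirely
        return self.conv(x)


def count_particles(prob_map: torch.Tensor, threshold: float = 0.5) -> int:
    """Post-processing: threshold the predicted probability map and count
    connected foreground regions as individual microplastics."""
    mask = prob_map.squeeze().cpu().numpy() > threshold
    _, num_objects = ndimage.label(mask)
    return num_objects
```

Here `mode="concat"` reproduces the usual U-Net skip connection, while the "add" and "none" settings stand in for the kind of alternative skip-connection forms the abstract refers to; the actual variants compared in the paper may differ.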

Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Cite this article
Lee, K.S., Chen, H.L., Ng, Y.S. et al. U-Net skip-connection architectures for the automated counting of microplastics. Neural Comput & Applic 34, 7283–7297 (2022). https://doi.org/10.1007/s00521-021-06876-w