
Automatic fish counting via a multi-scale dense residual network

Published in Multimedia Tools and Applications

Abstract

Existing fish counting methods estimate the number of fish through object detection or direct regression, and they struggle with images containing heavily occluded or very small fish. To address this problem, this paper adopts the idea of density regression and proposes a fish counting method based on a multi-scale dense residual network. A multi-scale attention mechanism is designed to improve the network's ability to extract features from fish of different sizes: learned weights discriminate among feature scales, which mitigates the problem of large variations in fish size. To generate a sharper fish density map, a dense residual module is constructed to fuse the shallow and deep features of the image, ensuring that the generated density map faithfully reflects the spatial distribution of the fish. Experimental results show that the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) of the density maps generated by the proposed method are within a reasonable range. Compared with existing methods, the proposed method reduces the mean absolute error by 21.8% and the mean square error by 22.8% under high density, and it also achieves good results in real scenes.
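As a concrete illustration of the density-regression idea summarized above, the following PyTorch sketch shows how a multi-scale attention block and a dense residual block can be combined into a counting network whose output is a single-channel density map; the estimated count is the sum of that map. This is a minimal sketch under assumed layer sizes and kernel choices, not the authors' implementation; the class names MultiScaleAttention, DenseResidualBlock and FishCounter, and all channel counts, are hypothetical.

# Illustrative sketch only (not the published code): multi-scale attention over
# parallel branches, dense residual fusion of shallow and deep features, and
# counting by integrating the predicted density map.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleAttention(nn.Module):
    """Extract features with 3x3/5x5/7x7 branches and learn a weight per branch."""
    def __init__(self, channels):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in (3, 5, 7)
        ])
        # Global pooling + 1x1 conv yields one attention logit per branch.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, len(self.branches), 1),
        )

    def forward(self, x):
        feats = torch.stack([F.relu(b(x)) for b in self.branches], dim=1)  # (B, 3, C, H, W)
        w = torch.softmax(self.attn(x), dim=1).unsqueeze(2)                # (B, 3, 1, 1, 1)
        return (w * feats).sum(dim=1)                                      # weighted fusion

class DenseResidualBlock(nn.Module):
    """Densely connect successive convolutions, then add a residual shortcut."""
    def __init__(self, channels, layers=3):
        super().__init__()
        self.convs = nn.ModuleList([
            nn.Conv2d(channels * (i + 1), channels, 3, padding=1) for i in range(layers)
        ])

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(F.relu(conv(torch.cat(feats, dim=1))))
        return feats[-1] + x  # fuse shallow and deep features via the shortcut

class FishCounter(nn.Module):
    """Front-end features -> multi-scale attention -> dense residual fusion -> density map."""
    def __init__(self, channels=64):
        super().__init__()
        self.frontend = nn.Sequential(nn.Conv2d(3, channels, 3, padding=1), nn.ReLU())
        self.msa = MultiScaleAttention(channels)
        self.drb = DenseResidualBlock(channels)
        self.head = nn.Conv2d(channels, 1, 1)  # single-channel density map

    def forward(self, x):
        d = self.head(self.drb(self.msa(self.frontend(x))))
        return F.relu(d)  # densities are non-negative

# The count is the integral (sum) of the predicted density map.
model = FishCounter()
density = model(torch.randn(1, 3, 256, 256))
print("estimated count:", density.sum().item())

Because the count is obtained by integrating the density map rather than by detecting individual fish, heavily occluded or very small fish still contribute to the estimate, which is the motivation for the density-regression formulation.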




Funding

The authors are grateful for collaborative funding support from the Humanities and Social Science Foundation of the Ministry of Education, China (21YJAZH077).

Author information


Corresponding authors

Correspondence to Rui-Sheng Jia or Hong-Mei Sun.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yu, JT., Jia, RS., Li, YC. et al. Automatic fish counting via a multi-scale dense residual network. Multimed Tools Appl 81, 17223–17243 (2022). https://doi.org/10.1007/s11042-022-12672-y

