
Modeling bio-inspired visual neural for detecting visual features of small- and wide-field moving targets synchronously from complex dynamic environments

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

The synchronous detection of visual features of small- and wide-field moving targets in complex dynamic environments has long been a challenge in the field of moving target detection. Fortunately, the visual system of the fly Drosophila can detect visual features of small- and wide-field moving targets synchronously in complex dynamic environments, and thus provides a good paradigm for this task; however, little literature comprehensively analyses and verifies this ability. In this paper, we present a bio-inspired computing model for detecting visual features of small- and wide-field moving targets synchronously. The model consists of three stages. First, visual stimuli are perceived and divided into parallel ON and OFF pathways. Then, a feedback mechanism and the full Hassenstein-Reichardt correlator are applied to the Medulla neurons. Finally, the Lobula Columnar 11 neuron is used to detect the visual feature of small-field moving targets, i.e., their position, while the Lobula Plate Tangential Cell is used to detect the visual feature of wide-field moving targets, i.e., their translational directional selectivity. Extensive experiments show that the proposed model can detect visual features of small- and wide-field moving targets synchronously. In addition, the proposed model improves the detection rate for small-field moving targets by 17.18% compared with the traditional bio-inspired computing model, and its effectiveness is further verified by comparison with conventional moving target detection methods. Moreover, the proposed model also effectively detects visual features of wide-field moving targets. The source code can be found at https://github.com/szhanghh/A-bio-inspired-visual-neural-computing-model.
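The Hassenstein-Reichardt correlator named in the second stage can be illustrated with a minimal sketch. This is not the authors' implementation (their full model adds ON/OFF separation, feedback, and the LC11/LPTC read-out stages); it only shows the core correlator idea: delay one photoreceptor signal with a low-pass filter, multiply it by the undelayed neighbouring signal, and subtract the mirror-symmetric arm so that the sign of the output encodes motion direction. The function names and the filter constant `tau` are illustrative choices, not taken from the paper.

```python
import numpy as np

def low_pass(signal, tau=3.0):
    """First-order low-pass filter acting as the HRC delay line."""
    out = np.zeros_like(signal, dtype=float)
    alpha = 1.0 / (1.0 + tau)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def hrc_response(left, right, tau=3.0):
    """Full Hassenstein-Reichardt correlator over two adjacent inputs.

    Correlates the delayed left signal with the undelayed right signal
    and subtracts the mirror-symmetric arm; a positive integrated output
    indicates left-to-right motion, a negative one right-to-left.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    return low_pass(left, tau) * right - left * low_pass(right, tau)

# A bright edge moving left-to-right: the right input sees a delayed
# copy of the left input, so the integrated response is positive.
t = np.arange(60)
left = (t >= 10).astype(float)
right = (t >= 15).astype(float)
print(hrc_response(left, right).sum() > 0)   # True (rightward motion)
print(hrc_response(right, left).sum() < 0)   # True (leftward motion)
```

Chaining many such pairs across an image row and summing their outputs is, roughly, what a wide-field direction-selective cell such as the LPTC integrates.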


Figures 1–15 (available in the full article)


Data availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.


Acknowledgements

This research was funded by the Natural Science Foundation of Jiangxi Province under grant No. 20232BAB202003.

Funding

Natural Science Foundation of Jiangxi Province, grant No. 20232BAB202003.

Author information

Authors and Affiliations

Authors

Contributions

Sheng Zhang: Conceptualization, Methodology, Writing - Original Draft. Ke Li: Software, Funding acquisition, Validation, Formal analysis. Dan Zhou: Software, Resources, Visualization, Project Administration. Jingjing Tang: Data curation, Writing - Reviewing and Editing, Supervision, Investigation.

Corresponding authors

Correspondence to Sheng Zhang or Ke Li.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Zhang, S., Li, K., Zhou, D. et al. Modeling bio-inspired visual neural for detecting visual features of small- and wide-field moving targets synchronously from complex dynamic environments. SIViP 18, 8881–8898 (2024). https://doi.org/10.1007/s11760-024-03515-4

