
Judgment Analysis in a Streaming Setting Incurring Logarithmic Space of the Annotators

  • Conference paper
  • First Online:
Pattern Recognition (ICPR 2024)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 15324))


Abstract

Judgment analysis in a crowdsourcing environment refers to seeking opinions from a large and diverse set of online annotators for various applications and reaching a consensus from them. Existing approaches to judgment analysis work only in the static scenario, where all the opinions are available beforehand. We aim to develop a generalized approach to judgment analysis in a streaming setting, where the questions are known in advance but the annotators' opinions arrive as a stream. This paper provides the first algorithm that can perform judgment analysis on crowdsourced opinions received in streams. We demonstrate the performance of the proposed approach on two datasets, achieving accuracy close to that of majority voting while requiring significantly less space: the space requirement is bounded by a logarithmic factor of the number of annotators.
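The paper's own algorithm is not reproduced on this page. As a minimal illustration of the streaming setting it describes, the classic Boyer–Moore majority vote sketch below processes each annotator's opinion in a single pass, keeping only one candidate label and one counter per question. This is a hypothetical stand-in, not the authors' method: it shows how a majority-style consensus can be tracked without storing all opinions (an exact per-label counter over n annotators would instead need O(log n) bits, which is the kind of logarithmic bound the abstract refers to).

```python
def streaming_majority(opinions):
    """Boyer-Moore majority vote over a stream of (question_id, label) pairs.

    Keeps one candidate label and one counter per question, independent of
    the number of annotators seen, and returns the surviving candidate per
    question (the exact majority label whenever a strict majority exists).
    """
    state = {}  # question_id -> [candidate_label, count]
    for qid, label in opinions:
        if qid not in state or state[qid][1] == 0:
            state[qid] = [label, 1]          # adopt a new candidate
        elif state[qid][0] == label:
            state[qid][1] += 1               # agreeing opinion reinforces it
        else:
            state[qid][1] -= 1               # disagreeing opinion cancels one vote
    return {qid: cand for qid, (cand, _) in state.items()}

# Example stream of crowdsourced opinions arriving one at a time.
stream = [("q1", "yes"), ("q1", "no"), ("q1", "yes"),
          ("q2", "b"), ("q1", "yes"), ("q2", "b")]
print(streaming_majority(stream))  # {'q1': 'yes', 'q2': 'b'}
```

Note that this sketch guarantees correctness only when a strict majority label exists per question; the paper's algorithm handles the more general judgment-analysis setting.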



Author information

Corresponding author

Correspondence to Malay Bhattacharyya.


Ethics declarations

Data and Code Availability

Both datasets analyzed in this paper, namely Fact Evaluation and Sentiment Analysis, along with the source code of the algorithms applied to them, are freely accessible from the GitHub repository: https://github.com/malaybhattacharyya/Judgment_Streaming_Logarithmic.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Patel, J., Mandal, S., Bhattacharyya, M. (2025). Judgment Analysis in a Streaming Setting Incurring Logarithmic Space of the Annotators. In: Antonacopoulos, A., Chaudhuri, S., Chellappa, R., Liu, CL., Bhattacharya, S., Pal, U. (eds) Pattern Recognition. ICPR 2024. Lecture Notes in Computer Science, vol 15324. Springer, Cham. https://doi.org/10.1007/978-3-031-78383-8_1


  • DOI: https://doi.org/10.1007/978-3-031-78383-8_1

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-78382-1

  • Online ISBN: 978-3-031-78383-8

  • eBook Packages: Computer Science, Computer Science (R0)
