
Parametric Sensitivity and Scalability of k-Winners-Take-All Networks with a Single State Variable and Infinity-Gain Activation Functions

  • Conference paper
Advances in Neural Networks - ISNN 2010 (ISNN 2010)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6063)

Abstract

In recent years, several k-winners-take-all (kWTA) neural networks have been developed based on a quadratic programming formulation. In particular, a continuous-time kWTA network with a single state variable and its discrete-time counterpart were developed recently. These kWTA networks have proven global convergence properties and simple architectures. Starting from the problem formulation, this paper reviews related existing kWTA networks and extends those with piecewise-linear activation functions to networks with high-gain activation functions. The paper then presents experimental results for the continuous-time and discrete-time kWTA networks with infinity-gain activation functions. The results show that these kWTA networks are parametrically robust and scalable with respect to problem size and convergence rate.

The work described in this paper was supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project no. CUHK417608E).
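To make the construction concrete: in this literature the kWTA operation over inputs u_1, ..., u_n is commonly posed as the quadratic program "minimize (1/2)x'x - u'x subject to e'x = k, 0 <= x_i <= 1", whose solution activates the k largest inputs, and the single-state-variable networks can be read as adjusting a scalar threshold y with outputs x_i = g(u_i - y). With an infinity-gain activation, g becomes the hard-limiting step function. The following Python sketch illustrates a generic discrete-time iteration of this form; it is an illustrative reconstruction, not the authors' network, and the step size eta, the initialization of y, and the stopping rule are assumptions.

    # A minimal sketch (assumptions noted), not the authors' implementation:
    # a discrete-time kWTA iteration with one scalar state variable y and an
    # infinity-gain (hard-limiting) activation. The step size eta, the
    # initialization of y, and the stopping rule are illustrative choices.
    import numpy as np

    def kwta_single_state(u, k, eta=0.05, max_iter=10000):
        """Select the k largest entries of u using one scalar state variable.

        y acts as a dynamic threshold; neuron i outputs g(u_i - y), where g
        is the step function (the infinity-gain limit of a piecewise-linear
        activation). The update drives the number of active outputs toward k.
        """
        u = np.asarray(u, dtype=float)
        y = u.mean()                      # assumed initial threshold
        x = (u > y).astype(float)
        for _ in range(max_iter):
            x = (u > y).astype(float)     # infinity-gain (step) activation
            err = x.sum() - k             # surplus (or deficit) of winners
            if err == 0:
                break                     # exactly k outputs are active
            y += eta * err                # too many winners: raise y; too few: lower y
        return x, y                       # a smaller eta may be needed if y oscillates

    # Example: pick the 3 largest of 8 inputs.
    u = [0.9, 0.1, 0.5, 0.7, 0.3, 0.8, 0.2, 0.6]
    x, y = kwta_single_state(u, k=3)
    print(x)  # 1.0 at the positions of 0.9, 0.8, 0.7

The scalar update raises the threshold when more than k outputs are active and lowers it otherwise, which is why this architecture needs only one state variable regardless of the problem dimension n.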





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, J., Guo, Z. (2010). Parametric Sensitivity and Scalability of k-Winners-Take-All Networks with a Single State Variable and Infinity-Gain Activation Functions. In: Zhang, L., Lu, BL., Kwok, J. (eds) Advances in Neural Networks - ISNN 2010. ISNN 2010. Lecture Notes in Computer Science, vol 6063. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13278-0_11

  • DOI: https://doi.org/10.1007/978-3-642-13278-0_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13277-3

  • Online ISBN: 978-3-642-13278-0
