Analysis on the Boltzmann Machine with Random Input Drifts in Activation Function | SpringerLink

Analysis on the Boltzmann Machine with Random Input Drifts in Activation Function

  • Conference paper
Neural Information Processing (ICONIP 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12534)


Abstract

The Boltzmann machine (BM) model is able to learn the probability distribution of input patterns. In analog realizations, however, thermal noise and the random offset voltages of amplifiers affect the behaviour of the neurons' activation function; these realization issues can be modelled as random input drifts. This paper analyzes the activation function and the state distribution of BMs under the random input drift model. Since the state of a neuron is determined by its activation function, random input drifts may change the behaviour of a BM. We show that the effect of random input drifts is equivalent to raising the temperature factor. Hence, from the Kullback–Leibler (KL) divergence perspective, we propose a compensation scheme to reduce the effect of random input drifts. In deriving the compensation scheme, we assume that the input drift follows a Gaussian distribution. Surprisingly, our simulations show that the proposed compensation scheme also works well for other drift distributions.
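The temperature-equivalence claim in the abstract can be illustrated numerically. The Python sketch below is not taken from the paper: it assumes a logistic activation sigmoid(u/T) for a neuron with net input u and temperature factor T, plus a zero-mean Gaussian drift added to u. The closed form T_eff = sqrt(T^2 + pi*sigma^2/8) follows from the standard probit-style approximation of the logistic-normal integral (cf. Haley [15]); the drift level sigma and the compensation by lowering the nominal temperature are assumptions made for this illustration, not necessarily the paper's exact scheme.

    import numpy as np

    rng = np.random.default_rng(0)

    def activation(u, T):
        # Logistic activation of a BM neuron at temperature T: P(state = 1 | net input u).
        return 1.0 / (1.0 + np.exp(-u / T))

    T = 1.0        # nominal temperature factor
    sigma = 0.8    # assumed std. dev. of the Gaussian input drift
    u = np.linspace(-6.0, 6.0, 121)

    # Monte Carlo estimate of the drift-averaged activation:
    # E_eps[activation(u + eps, T)], with eps ~ N(0, sigma^2).
    eps = rng.normal(0.0, sigma, size=(100_000, 1))
    drifted = activation(u + eps, T).mean(axis=0)

    # Probit-style approximation of the logistic-normal integral:
    # E[sigmoid((u + eps)/T)] ~= sigmoid(u / T_eff), T_eff = sqrt(T^2 + pi*sigma^2/8).
    # The drift therefore acts like a raised temperature, as the abstract states.
    T_eff = np.sqrt(T**2 + np.pi * sigma**2 / 8.0)
    print("effective temperature:", T_eff)  # ~1.119 > T
    print("max |MC - approx|:", np.max(np.abs(drifted - activation(u, T_eff))))

    # One possible compensation: run the machine at a lowered nominal temperature
    # T_c, chosen so the drift-inflated effective temperature lands back on the target T.
    T_c = np.sqrt(T**2 - np.pi * sigma**2 / 8.0)  # requires T^2 > pi*sigma^2/8
    compensated = activation(u + eps, T_c).mean(axis=0)
    print("max |compensated - clean|:", np.max(np.abs(compensated - activation(u, T))))

Under this approximation, the compensated drift-corrupted activation tracks the clean activation closely, mirroring the abstract's finding that a compensation derived for Gaussian drifts remains effective in simulation.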


References

  1. Ackley, D.H., Hinton, G.E., Sejnowski, T.J.: A learning algorithm for Boltzmann machines. Cogn. Sci. 9(1), 147–169 (1985)

  2. Hinton, G.E., Sejnowski, T.J., Ackley, D.H.: Boltzmann machines: constraint satisfaction networks that learn. Technical report, Carnegie-Mellon University, Pittsburgh (1984)

  3. Prager, R.W., Harrison, T.D., Fallside, F.: Boltzmann machines for speech recognition. Comput. Speech Lang. 1(1), 3–27 (1986)

  4. Ma, H.: Pattern recognition using Boltzmann machine. In: Proceedings of IEEE Southeastcon 1995: Visualize the Future, pp. 23–29 (1995)

  5. Tang, Y., Salakhutdinov, R., Hinton, G.: Robust Boltzmann machines for recognition and denoising. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 2264–2271 (2012)

  6. Lee, P.: Low noise amplifier selection guide for optimal noise performance. Analog Devices Application Note AN-940 (2009)

  7. He, J., Zhan, S., Chen, D., Geiger, R.L.: Analysis of static and dynamic random offset voltages in dynamic comparators. IEEE Trans. Circuits Syst. I Regul. Pap. 56(5), 911–919 (2009)

  8. Redoute, J.-M., Steyaert, M.: Measurement of EMI induced input offset voltage of an operational amplifier. Electron. Lett. 43(20), 1088–1090 (2007)

  9. Wang, H., Feng, R., Han, Z.F., Leung, C.S.: ADMM-based algorithm for training fault tolerant RBF networks and selecting centers. IEEE Trans. Neural Netw. Learn. Syst. 29(8), 3870–3878 (2018)

  10. Leung, C.S., Wan, W.Y., Feng, R.: A regularizer approach for RBF networks under the concurrent weight failure situation. IEEE Trans. Neural Netw. Learn. Syst. 28(6), 1360–1372 (2017)

  11. Xiao, Y., Feng, R., Leung, C.S., Sum, J.: Objective function and learning algorithm for the general node fault situation. IEEE Trans. Neural Netw. Learn. Syst. 27(4), 863–874 (2016)

  12. Chen, H., Murray, A.F.: Continuous restricted Boltzmann machine with an implementable training algorithm. IEE Proc. Vis. Image Signal Process. 150(3), 153–158 (2003)

  13. Chen, H., Fleury, P.C.D., Murray, A.F.: Continuous-valued probabilistic behavior in a VLSI generative model. IEEE Trans. Neural Netw. 17(3), 755–770 (2006)

  14. Sum, J., Leung, C.S.: Learning algorithm for Boltzmann machines with additive weight and bias noise. IEEE Trans. Neural Netw. Learn. Syst. 30(10), 3200–3204 (2019)

  15. Haley, D.C.: Estimation of the dosage mortality relationship when the dose is subject to error. Technical report, Applied Mathematics and Statistics Laboratory, Stanford University (1952)


Acknowledgement

The work presented in this paper was supported by a research grant from Taiwan MOST (No. 108-2221-E-005-036) and a research grant from City University of Hong Kong (9610431).

Author information


Corresponding author

Correspondence to Chi-Sing Leung.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Lu, W., Leung, C.S., Sum, J. (2020). Analysis on the Boltzmann Machine with Random Input Drifts in Activation Function. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds) Neural Information Processing. ICONIP 2020. Lecture Notes in Computer Science, vol. 12534. Springer, Cham. https://doi.org/10.1007/978-3-030-63836-8_14


  • DOI: https://doi.org/10.1007/978-3-030-63836-8_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-63835-1

  • Online ISBN: 978-3-030-63836-8

  • eBook Packages: Computer Science, Computer Science (R0)
