
Conditions for Unnecessary Logical Constraints in Kernel Machines

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning (ICANN 2019)

Abstract

A key property of support vector machines is that only a small portion of the training data, the so-called support vectors, is significant in determining the maximum-margin separating hyperplane in the feature space. Similarly, in the general scheme of learning from constraints, where possibly several constraints are considered, some of them may turn out to be unnecessary with respect to the learning optimization, even if they are active for a given optimal solution. In this paper we extend the notion of support vector to that of support constraint, and we provide criteria to determine which constraints can be removed from the learning problem while still yielding the same optimal solutions. In particular, we discuss the case of logical constraints expressed in Łukasiewicz logic, where both inferential and algebraic arguments can be applied. Some theoretical results characterizing the concept of an unnecessary constraint are proved and illustrated by means of examples.
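As background for the logical constraints the paper works with, the standard Łukasiewicz connectives map truth degrees in [0, 1] to truth degrees in [0, 1]. The following is a minimal generic sketch of these connectives, not the paper's learning code:

```python
# Standard Łukasiewicz connectives on truth degrees in [0, 1].
# Generic illustration of the logic, not code from the paper.

def luk_and(a, b):
    """Strong conjunction (t-norm): max(0, a + b - 1)."""
    return max(0.0, a + b - 1.0)

def luk_or(a, b):
    """Strong disjunction (t-conorm): min(1, a + b)."""
    return min(1.0, a + b)

def luk_not(a):
    """Negation: 1 - a."""
    return 1.0 - a

def luk_implies(a, b):
    """Residual implication: min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

# An implication is fully true (degree 1) exactly when the consequent is
# at least as true as the antecedent -- the basis for turning logical
# rules into constraints on the values of learned predicates.
print(luk_implies(0.25, 0.75))  # 1.0
print(luk_implies(0.75, 0.25))  # 0.5
```

Because every connective is built from sums, `min`, and `max`, the truth function of any formula is piecewise linear in the predicate values, which is what makes these constraints amenable to the quadratic-programming treatment cited in the references.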


Notes

  1. Predicates sharing the same domain may be approximated in the same RKHS by using the same kernel function.

  2. See e.g. [12] for more details on fuzzy logics.

  3. The number of linear pieces \(I_h\) depends on both the formula and the number of groundings used in that formula.
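As a hedged illustration of note 3 (a generic sketch, not the paper's implementation): the penalty associated with a grounded Łukasiewicz formula is piecewise linear, and each min/max introduced by a connective or an additional grounding can increase the number of linear pieces:

```python
# Illustration: Łukasiewicz truth functions are piecewise linear, so
# 1 - truth (a natural constraint penalty) is piecewise linear too.
# Function names here are illustrative, not from the paper.

def implies(a, b):
    # Łukasiewicz implication: min(1, 1 - a + b)
    return min(1.0, 1.0 - a + b)

def penalty_single(a, b):
    # 1 - (a -> b) = max(0, a - b): two linear pieces in (a, b).
    return 1.0 - implies(a, b)

def penalty_two_groundings(a1, b1, a2, b2):
    # Strong conjunction of two groundings of the same rule:
    # max(0, t1 + t2 - 1) with t_i = implies(a_i, b_i). The nested
    # min/max operations carve the domain into more linear pieces
    # than a single grounding does.
    t1, t2 = implies(a1, b1), implies(a2, b2)
    return 1.0 - max(0.0, t1 + t2 - 1.0)

print(penalty_single(0.75, 0.25))                   # 0.5: rule violated
print(penalty_single(0.25, 0.75))                   # 0.0: rule satisfied
print(penalty_two_groundings(0.75, 0.25, 0.75, 0.5))  # 0.75
```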

References

  1. Boser, B.E., Guyon, I.M., Vapnik, V.N.: A training algorithm for optimal margin classifiers. In: Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp. 144–152. ACM (1992)

  2. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)

  3. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20(3), 273–297 (1995). https://doi.org/10.1023/A:1022627411411

  4. Cumby, C.M., Roth, D.: On kernel methods for relational learning. In: Proceedings of the 20th International Conference on Machine Learning (ICML 2003), pp. 107–114 (2003)

  5. Diligenti, M., Gori, M., Maggini, M., Rigutini, L.: Bridging logic and kernel machines. Mach. Learn. 86(1), 57–88 (2012)

  6. Diligenti, M., Gori, M., Saccà, C.: Semantic-based regularization for learning and inference. Artif. Intell. 244, 143–165 (2015)

  7. Giannini, F., Diligenti, M., Gori, M., Maggini, M.: Learning Łukasiewicz logic fragments by quadratic programming. In: Ceci, M., Hollmén, J., Todorovski, L., Vens, C., Džeroski, S. (eds.) ECML PKDD 2017. LNCS (LNAI), vol. 10534, pp. 410–426. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-71249-9_25

  8. Giannini, F., Diligenti, M., Gori, M., Maggini, M.: On a convex logic fragment for learning and reasoning. IEEE Trans. Fuzzy Syst. 27(7), 1407–1416 (2018)

  9. Gnecco, G., Gori, M., Melacci, S., Sanguineti, M.: Foundations of support constraint machines. Neural Comput. 27(2), 388–480 (2015)

  10. Gori, M., Melacci, S.: Support constraint machines. In: Lu, B.-L., Zhang, L., Kwok, J. (eds.) ICONIP 2011. LNCS, vol. 7062, pp. 28–37. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-24955-6_4

  11. Gori, M., Melacci, S.: Constraint verification with kernel machines. IEEE Trans. Neural Networks Learn. Syst. 24(5), 825–831 (2013)

  12. Hájek, P.: Metamathematics of Fuzzy Logic. Trends in Logic, vol. 4, 1st edn. Springer, Dordrecht (1998). https://doi.org/10.1007/978-94-011-5300-3

  13. Hu, Z., Ma, X., Liu, Z., Hovy, E., Xing, E.: Harnessing deep neural networks with logic rules. arXiv preprint arXiv:1603.06318 (2016)

  14. Jung, J.H., O'Leary, D.P., Tits, A.L.: Adaptive constraint reduction for convex quadratic programming. Comput. Optim. Appl. 51(1), 125–157 (2012)

  15. Muggleton, S., Lodhi, H., Amini, A., Sternberg, M.J.E.: Support vector inductive logic programming. In: Hoffmann, A., Motoda, H., Scheffer, T. (eds.) DS 2005. LNCS (LNAI), vol. 3735, pp. 163–175. Springer, Heidelberg (2005). https://doi.org/10.1007/11563983_15

  16. Paulsen, V.I., Raghupathi, M.: An Introduction to the Theory of Reproducing Kernel Hilbert Spaces, vol. 152. Cambridge University Press, Cambridge (2016)

  17. Rockafellar, R.T., Wets, R.J.B.: Variational Analysis. Grundlehren der mathematischen Wissenschaften, vol. 317, 1st edn. Springer, Heidelberg (2009)

  18. Serafini, L., d'Avila Garcez, A.S.: Logic tensor networks: deep learning and logical reasoning from data and knowledge. arXiv preprint arXiv:1606.04422 (2016)

  19. Serafini, L., d'Avila Garcez, A.S.: Learning and reasoning with logic tensor networks. In: Adorni, G., Cagnoni, S., Gori, M., Maratea, M. (eds.) AI*IA 2016. LNCS (LNAI), vol. 10037, pp. 334–348. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-49130-1_25


Author information


Corresponding author

Correspondence to Francesco Giannini.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Giannini, F., Maggini, M. (2019). Conditions for Unnecessary Logical Constraints in Kernel Machines. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning. ICANN 2019. Lecture Notes in Computer Science, vol 11728. Springer, Cham. https://doi.org/10.1007/978-3-030-30484-3_49


  • DOI: https://doi.org/10.1007/978-3-030-30484-3_49

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30483-6

  • Online ISBN: 978-3-030-30484-3

  • eBook Packages: Computer Science (R0)
