Lazy Learning Algorithms for Problems with Many Binary Features and Classes

  • Conference paper
Progress in Artificial Intelligence — IBERAMIA 98 (IBERAMIA 1998)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1484)

Abstract

We have designed several new lazy learning algorithms for learning problems with many binary features and classes. This type of learning task arises in many machine learning applications but is of particular importance for machine learning of natural language. Besides pure instance-based learning we also consider prototype-based learning, which offers the major advantage of greatly reducing the memory and processing time required for classification. As an application for our learning algorithms we have chosen natural language database interfaces. In our interface architecture the machine learning module replaces an elaborate semantic analysis component. The learning task is to select the correct command class based on semantic features extracted from the user input. We use an existing German natural language interface to a production planning and control system as a case study for our evaluation and compare the results achieved by the different lazy learning algorithms.
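
To make the setting concrete, the sketch below illustrates the two flavours of lazy learning mentioned in the abstract on binary feature vectors: plain instance-based classification, which keeps every training example and returns the class of the most similar stored instance, and a prototype-based variant, which collapses each class into a single prototype vector and therefore needs far less memory and comparison work at classification time. This is only an illustrative Python sketch; the overlap similarity, the prototype threshold, and all names are assumptions made for the example, not the algorithms evaluated in the paper.

```python
# Illustrative sketch only: simple instance-based and prototype-based
# classification over binary feature vectors. The similarity measure
# (feature overlap) and the prototype threshold are assumptions made
# for this example, not the algorithms from the paper.

from collections import defaultdict


def overlap(a, b):
    """Count the features that are set to 1 in both binary vectors."""
    return sum(x & y for x, y in zip(a, b))


def classify_instance_based(query, training_set):
    """Return the class of the stored instance most similar to the query.

    training_set is a list of (binary feature vector, class label) pairs;
    every training example is kept in memory and compared at query time.
    """
    return max(training_set, key=lambda pair: overlap(query, pair[0]))[1]


def build_prototypes(training_set, threshold=0.5):
    """Collapse all instances of a class into one binary prototype vector.

    A feature enters the prototype if it occurs in at least `threshold`
    of the class's training instances (an illustrative choice).
    """
    sums, counts = defaultdict(list), defaultdict(int)
    for features, label in training_set:
        counts[label] += 1
        if not sums[label]:
            sums[label] = list(features)
        else:
            sums[label] = [s + f for s, f in zip(sums[label], features)]
    return {
        label: [1 if s / counts[label] >= threshold else 0 for s in vec]
        for label, vec in sums.items()
    }


def classify_prototype_based(query, prototypes):
    """Classify against one prototype per class instead of all instances."""
    return max(prototypes, key=lambda label: overlap(query, prototypes[label]))


if __name__ == "__main__":
    # Toy data: four binary semantic features, two hypothetical command classes.
    train = [
        ([1, 1, 0, 0], "show_order"),
        ([1, 0, 1, 0], "show_order"),
        ([0, 0, 1, 1], "delete_order"),
        ([0, 1, 1, 1], "delete_order"),
    ]
    query = [1, 1, 1, 0]
    print(classify_instance_based(query, train))                      # -> show_order
    print(classify_prototype_based(query, build_prototypes(train)))   # -> show_order
```

In the prototype-based variant only one vector per command class has to be stored and compared at classification time, which is where the memory and runtime savings mentioned in the abstract come from.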

Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Winiwarter, W. (1998). Lazy Learning Algorithms for Problems with Many Binary Features and Classes. In: Coelho, H. (eds) Progress in Artificial Intelligence — IBERAMIA 98. IBERAMIA 1998. Lecture Notes in Computer Science (LNAI), vol 1484. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49795-1_10

  • DOI: https://doi.org/10.1007/3-540-49795-1_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-64992-2

  • Online ISBN: 978-3-540-49795-0

  • eBook Packages: Springer Book Archive
