Distributed Prediction and Hierarchical Knowledge Discovery by ARTMAP Neural Networks | SpringerLink
Distributed Prediction and Hierarchical Knowledge Discovery by ARTMAP Neural Networks

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2773)

Abstract

Adaptive Resonance Theory (ART) neural networks model real-time prediction, search, learning, and recognition. ART networks function both as models of human cognitive information processing [1,2,3] and as neural systems for technology transfer [4]. A neural computation central to both the scientific and the technological analyses is the ART matching rule [5], which models the interaction between top-down expectation and bottom-up input, thereby creating a focus of attention which, in turn, determines the nature of coded memories.

Sites of early and ongoing transfer of ART-based technologies include industrial venues such as the Boeing Corporation [6] and government venues such as MIT Lincoln Laboratory [7]. A recent report on industrial uses of neural networks [8] states: “[The] Boeing ... Neural Information Retrieval System is probably still the largest-scale manufacturing application of neural networks. It uses [ART] to cluster binary templates of aeroplane parts in a complex hierarchical network that covers over 100,000 items, grouped into thousands of self-organised clusters. Claimed savings in manufacturing costs are in millions of dollars per annum.” At Lincoln Laboratory, a team led by Waxman developed an image mining system that incorporates several models of vision and recognition developed in the Boston University Department of Cognitive and Neural Systems (BU/CNS). Over the years a dozen CNS graduates (Aguilar, Baloch, Baxter, Bomberger, Cunningham, Fay, Gove, Ivey, Mehanian, Ross, Rubin, Streilein) have contributed to this effort, which is now located at Alphatech, Inc. Customers for BU/CNS neural network technologies have attributed their selection of ART over alternative systems to the model’s defining design principles.
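The matching rule can be sketched in its fuzzy ART form [9]: a bottom-up choice function ranks candidate categories, and resonance occurs only if the top-down expectation (the stored weight vector) matches the bottom-up input closely enough relative to a vigilance parameter. The following is a minimal illustrative sketch, not the published algorithm; the parameter values, the helper names, and the toy input are assumptions.

```python
import numpy as np

def fuzzy_and(a, b):
    """Component-wise minimum: the fuzzy intersection used by fuzzy ART."""
    return np.minimum(a, b)

def art_match(x, weights, rho=0.7, alpha=0.001):
    """One search cycle of the ART matching rule (fuzzy ART form).

    x       : complement-coded input in [0,1]^2M
    weights : list of category weight vectors (same dimension as x)
    rho     : vigilance -- minimum required match |x ^ w| / |x|
    alpha   : small choice parameter

    Returns the index of the first category whose top-down expectation
    matches the bottom-up input closely enough, or None (every committed
    node is reset, so the input is treated as novel).
    """
    # Bottom-up activation (choice function) for each committed node
    T = [fuzzy_and(x, w).sum() / (alpha + w.sum()) for w in weights]
    # Search nodes in order of decreasing activation
    for j in sorted(range(len(weights)), key=lambda j: -T[j]):
        match = fuzzy_and(x, weights[j]).sum() / x.sum()
        if match >= rho:   # resonance: attention focuses on x ^ w_j
            return j
    return None            # mismatch reset everywhere: novelty detected

# Complement coding, x = (a, 1 - a), keeps |x| constant across inputs
a = np.array([0.8, 0.2])
x = np.concatenate([a, 1.0 - a])
w = [np.concatenate([np.array([0.7, 0.1]), 1.0 - np.array([0.7, 0.1])])]
print(art_match(x, w))             # resonates with category 0 at rho = 0.7
print(art_match(x, w, rho=0.95))   # None: vigilance too high, novelty
```

Raising the vigilance rho forces finer categories: the same input that resonates at rho = 0.7 triggers a mismatch reset at rho = 0.95.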
In listing the advantages of its THOT® technology, for example, American Heuristics Corporation (AHC) cites several characteristic computational capabilities of this family of neural models, including fast on-line (one-pass) learning, “vigilant” detection of novel patterns, retention of rare patterns, improvement with experience, “weights [which] are understandable in real world terms,” and scalability (www.heuristics.com).

Design principles derived from scientific analyses and design constraints imposed by targeted applications have jointly guided the development of many variants of the basic networks, including fuzzy ARTMAP [9], ART-EMAP [10], ARTMAP-IC [11], Gaussian ARTMAP [12], and distributed ARTMAP [3,13]. Comparative analysis of these systems has led to the identification of a default ARTMAP network, which features simplicity of design and robust performance in many application domains [4,14]. Selection of one particular ARTMAP algorithm is intended to facilitate ongoing technology transfer. The default ARTMAP algorithm outlines a procedure for labeling an arbitrary number of output classes in a supervised learning problem. A critical aspect of this algorithm is the distributed nature of its internal code representation, which produces continuous-valued test set predictions distributed across output classes.

The character of their code representations, distributed vs. winner-take-all, is, in fact, a primary factor differentiating the various ARTMAP networks. The original models [9,15] employ winner-take-all coding during training and testing, as do many subsequent variations and the majority of ART systems that have been transferred to technology. ARTMAP variants with winner-take-all (WTA) coding and discrete target class predictions have, however, shown consistent deficits in labeling accuracy and post-processing adjustment capabilities.
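The contrast between winner-take-all and distributed test-set predictions can be illustrated with a small sketch. Here `category_act` stands for the activations of the committed coding nodes and `class_of` for the output class each node was mapped to during training; these names, and the simple normalization used, are illustrative assumptions rather than the default ARTMAP specification.

```python
import numpy as np

def predict(category_act, class_of, n_classes, distributed=True):
    """Test-set prediction from coding-node activations.

    category_act : activation of each committed coding node
    class_of     : output class each node was mapped to during training
    distributed  : if True, sum normalized activation into each node's
                   class, giving a continuous-valued prediction across
                   output classes; if False, use winner-take-all coding,
                   giving a single discrete label.
    """
    sigma = np.zeros(n_classes)
    if distributed:
        act = category_act / category_act.sum()   # distributed code
        for j, a in enumerate(act):
            sigma[class_of[j]] += a
    else:
        # WTA: only the maximally active node predicts
        sigma[class_of[int(np.argmax(category_act))]] = 1.0
    return sigma

act = np.array([0.5, 0.3, 0.2])   # three committed nodes
cls = [0, 1, 0]                   # their learned class labels
print(predict(act, cls, 2))                     # distributed: ~[0.7, 0.3]
print(predict(act, cls, 2, distributed=False))  # WTA: [1., 0.]
```

The distributed output preserves graded evidence for the second class; the WTA output discards it, which is one way to see why WTA variants lose labeling accuracy and post-processing flexibility.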
The talk will describe a recent application that relies on distributed code representations to exploit the ARTMAP capacity for one-to-many learning, which has enabled the development of self-organizing expert systems for multi-level object grouping, information fusion, and discovery of hierarchical knowledge structures. A pilot study has demonstrated the network’s ability to infer multi-level fused relationships among groups of objects in an image, without any supervised labeling of these relationships, thereby pointing to new methodologies for self-organizing knowledge discovery.
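One-to-many learning of the kind described here, in which a single input comes to predict labels at several hierarchical levels without later labels erasing earlier ones, can be caricatured with a toy associator. The class, its method names, and the example labels are all hypothetical; they stand in for the ARTMAP map field's ability to link one coding node to many output classes.

```python
from collections import defaultdict

class OneToManyMap:
    """Toy one-to-many associator in the spirit of ARTMAP's map field:
    repeated presentations of the same input with different labels add
    links instead of overwriting the earlier ones."""

    def __init__(self):
        self.links = defaultdict(set)

    def train(self, category, label):
        # Accumulate, never erase: one category can predict many labels
        self.links[category].add(label)

    def labels(self, category):
        return sorted(self.links[category])

# The same object, labeled at three hierarchical levels across trials
m = OneToManyMap()
for label in ["vehicle", "truck", "fire engine"]:
    m.train("object_17", label)
print(m.labels("object_17"))   # all three levels are retained
```

A WTA map, by contrast, would behave like `self.links[category] = {label}`, so each new training label would erase the previous level of the hierarchy.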


References

  1. Grossberg, S.: The link between brain, learning, attention, and consciousness. Consciousness and Cognition 8, 1–44 (1999), ftp://cns-ftp.bu.edu/pub/diana/Gro.concog98.ps.gz


  2. Grossberg, S.: How does the cerebral cortex work? Development, learning, attention, and 3D vision by laminar circuits of visual cortex. Behavioral and Cognitive Neuroscience Reviews (2003) (in press), http://www.cns.bu.edu/Profiles/Grossberg/Gro2003BCNR.pdf

  3. Carpenter, G.A.: Distributed learning, recognition, and prediction by ART and ARTMAP neural networks. Neural Networks 10, 1473–1494 (1997), http://cns.bu.edu/~gail/115_dART_NN_1997_.pdf


  4. Parsons, O., Carpenter, G.A.: ARTMAP neural networks for information fusion and data mining: map production and target recognition methodologies. Neural Networks 16 (2003), http://cns.bu.edu/~gail/ARTMAP_map_2003_.pdf

  5. Carpenter, G.A., Grossberg, S.: A massively parallel architecture for a self-organizing neural pattern recognition machine. Computer Vision, Graphics, and Image Processing 37, 54–115 (1987)


  6. Caudell, T.P., Smith, S.D.G., Escobedo, R., Anderson, M.: NIRS: Large scale ART 1 neural architectures for engineering design retrieval. Neural Networks 7, 1339–1350 (1994), http://cns.bu.edu/~gail/NIRS_Caudell_1994_.pdf


  7. Streilein, W., Waxman, A., Ross, W., Liu, F., Braun, M., Fay, D., Harmon, P., Read, C.H.: Fused multi-sensor image mining for feature foundation data. In: Proceedings of the 3rd International Conference on Information Fusion, Paris, vol. I (2000)


  8. Lisboa, P.: Industrial use of safety-related artificial neural networks. Contract Research Report 327/2001, Liverpool John Moores University (2001), http://www.hse.gov.uk/research/crr_pdf/2001/crr01327.pdf

  9. Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J.H., Rosen, D.B.: Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps. IEEE Transactions on Neural Networks 3, 698–713 (1992), http://cns.bu.edu/~gail/070_Fuzzy_ARTMAP_1992_.pdf


  10. Carpenter, G.A., Ross, W.D.: ART-EMAP: A neural network architecture for object recognition by evidence accumulation. IEEE Transactions on Neural Networks 6, 805–818 (1995), http://cns.bu.edu/~gail/097_ART-EMAP_1995_.pdf


  11. Carpenter, G.A., Markuzon, N.: ARTMAP-IC and medical diagnosis: Instance counting and inconsistent cases. Neural Networks 11, 323–336 (1998), http://cns.bu.edu/~gail/117_ARTMAP-IC_1998_.pdf


  12. Williamson, J.R.: Gaussian ARTMAP: A neural network for fast incremental learning of noisy multidimensional maps. Neural Networks 9, 881–897 (1996), http://cns.bu.edu/~gail/G-ART_Williamson_1998_.pdf


  13. Carpenter, G.A., Milenova, B.L., Noeske, B.W.: Distributed ARTMAP: a neural network for fast distributed supervised learning. Neural Networks 11, 793–813 (1998), http://cns.bu.edu/~gail/120_dARTMAP_1998_.pdf


  14. Carpenter, G.A.: Default ARTMAP. In: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2003 (2003), http://cns.bu.edu/~gail/Default_ARTMAP_2003_.pdf

  15. Carpenter, G.A., Grossberg, S., Reynolds, J.H.: ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network. Neural Networks 4, 565–588 (1991), http://cns.bu.edu/~gail/054_ARTMAP_1991_.pdf



Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Carpenter, G.A. (2003). Distributed Prediction and Hierarchical Knowledge Discovery by ARTMAP Neural Networks. In: Palade, V., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2003. Lecture Notes in Computer Science, vol. 2773. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45224-9_1

  • DOI: https://doi.org/10.1007/978-3-540-45224-9_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40803-1

  • Online ISBN: 978-3-540-45224-9
