Abstract
Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation networks and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N²) in normal regression methods such as back-propagation, is unnecessary. These networks address several problems that arise in QSAR modeling: choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since the evidence procedure provides an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on an effective number of network parameters (weights), switching off those that are not relevant; this effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation network. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, allowing the network to estimate the importance of each input. ARD ensures that irrelevant or highly correlated descriptors are neglected, and it identifies the variables most important for modeling the activity data.
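The ARD behavior described above can be sketched with scikit-learn's `ARDRegression`, a Bayesian linear model with a separate prior precision per weight, standing in here for the full neural-network case. The descriptor matrix and "activity" below are synthetic, invented purely for illustration: only the first two descriptors carry signal, and ARD drives the weights of the remaining, irrelevant descriptors to (essentially) zero.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))          # 10 synthetic descriptors
# Only the first two descriptors actually drive the "activity".
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

ard = ARDRegression()
ard.fit(X, y)

# Relevant inputs keep large weights; irrelevant ones are pruned toward 0.
print(np.round(ard.coef_, 2))
```

In a BRANN the same per-input hyperparameters sit over groups of first-layer weights, so an input whose hyperparameter grows large is effectively disconnected from the network.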
This chapter outlines the equations that define the BRANN method, together with a flowchart for producing a BRANN-QSAR model. Results from applying BRANNs to a number of data sets are illustrated and compared with those from other linear and nonlinear models.
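As a minimal sketch of the evidence machinery the chapter's equations describe, the function below (an illustrative helper, not the chapter's code) applies MacKay-style evidence updates to a Bayesian linear (ridge) model: the weight-prior precision `alpha` and noise precision `beta` are re-estimated from the data, and `gamma` is the effective number of well-determined parameters mentioned in the abstract. BRANNs apply the same updates to the weights of a nonlinear network rather than a linear model.

```python
import numpy as np

def evidence_ridge(X, y, n_iter=50):
    """MacKay-style evidence updates for a Bayesian linear (ridge) model.

    Returns the posterior mean weights m, hyperparameters alpha (weight
    prior precision) and beta (noise precision), and gamma, the effective
    number of well-determined parameters.
    """
    n, d = X.shape
    alpha, beta = 1.0, 1.0                      # crude initial guesses
    XtX, Xty = X.T @ X, X.T @ y
    eig = np.linalg.eigvalsh(XtX)               # eigenvalues of X^T X
    for _ in range(n_iter):
        A = alpha * np.eye(d) + beta * XtX      # posterior precision matrix
        m = beta * np.linalg.solve(A, Xty)      # posterior mean weights
        lam = beta * eig
        gamma = np.sum(lam / (lam + alpha))     # effective no. of parameters
        alpha = gamma / (m @ m)                 # re-estimate hyperparameters
        beta = (n - gamma) / np.sum((y - X @ m) ** 2)
    return m, alpha, beta, gamma

# Toy check on synthetic data: one informative descriptor, Gaussian noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(100)
m, alpha, beta, gamma = evidence_ridge(X, y)
```

Because alpha and beta are set by maximizing the evidence rather than by cross-validation, no separate validation set is consumed in choosing the amount of regularization, which is the practical point of the method.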
© 2008 Humana Press, a part of Springer Science + Business Media, LLC
Burden, F., Winkler, D. (2008). Bayesian Regularization of Neural Networks. In: Livingstone, D.J. (eds) Artificial Neural Networks. Methods in Molecular Biology™, vol 458. Humana Press. https://doi.org/10.1007/978-1-60327-101-1_3