Abstract
Background: The design of novel drugs is vital to combat fatal diseases such as Alzheimer’s. With rapid advances in computational methods, artificial intelligence (AI) techniques have been widely utilized in drug discovery. Because drug design is a protracted and resource-intensive process, extensive research is necessary to build predictive in-silico models for discovering new medications for Alzheimer’s. A thorough comparative analysis of such models is therefore required to expedite the discovery of new drugs.
Objective: In this study, the performance of machine learning (ML) and deep learning (DL) models in predicting the bioactivity of compounds for Alzheimer’s inhibition is assessed. Additionally, an interaction network is constructed to visualize the clusters (communities) within the bioactivity data.
Methods: The dataset was initially prepared from a public repository of bioactive compounds and then curated. Exploratory data analysis was performed to gain insights into the gathered data. A bioactivity interaction network was then constructed to detect communities and compute network metrics; a sketch of this step is shown below. Next, ML and DL models were built, and their hyperparameters were tuned to improve model performance. Finally, the metrics of all the models were compared to identify the best-performing model for bioactivity prediction.
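The following is a minimal sketch of the network-construction and community-detection step. The column names (molecule_id, target_id), the input file name, and the choice of NetworkX's greedy modularity algorithm are illustrative assumptions; the abstract does not specify the exact curation pipeline or community-detection method used in the study.

```python
# Hedged sketch: bioactivity interaction network and community detection.
# Column names, file name, and algorithm choice are assumptions.
import networkx as nx
import pandas as pd
from networkx.algorithms.community import greedy_modularity_communities

# Placeholder curated table with one row per compound-target bioactivity record.
df = pd.read_csv("curated_bioactivity.csv")

# Build an undirected interaction network: nodes are compounds and targets,
# edges connect each compound to the target it was assayed against.
G = nx.Graph()
G.add_edges_from(zip(df["molecule_id"], df["target_id"]))

# Detect communities by greedy modularity maximization (one possible choice).
communities = greedy_modularity_communities(G)
print(f"{len(communities)} communities detected")

# Network metrics of the kind reported in the study.
print("density:", nx.density(G))
print("average clustering coefficient:", nx.average_clustering(G))
```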
Results: The bioactivity network revealed the formation of three communities. The ML models were ranked by their error scores, and the five best models were hybridized to create a blended regressor (a sketch of this step follows). Subsequently, two DL models, namely a deep neural network (DNN) and a recurrent neural network with long short-term memory architecture (LSTM-RNN), were built. The analysis revealed that the LSTM-RNN outperformed all the models analyzed in this study.
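As an illustration of the model-hybridization step, the sketch below approximates a blended regressor with scikit-learn's VotingRegressor on placeholder data. The five constituent models, their hyperparameters, the feature matrix, and the blending scheme are assumptions for illustration only and are not the study's reported configuration.

```python
# Hedged sketch: blending five regressors into one ensemble.
# Placeholder data stands in for the compound descriptors and bioactivity values.
from sklearn.datasets import make_regression
from sklearn.ensemble import (ExtraTreesRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, VotingRegressor)
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

# Synthetic placeholder: 500 compounds, 100 descriptors, continuous bioactivity.
X, y = make_regression(n_samples=500, n_features=100, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Blend five base regressors by averaging their predictions.
blend = VotingRegressor([
    ("rf", RandomForestRegressor(n_estimators=300, random_state=42)),
    ("et", ExtraTreesRegressor(n_estimators=300, random_state=42)),
    ("gbr", GradientBoostingRegressor(random_state=42)),
    ("knn", KNeighborsRegressor()),
    ("ridge", Ridge()),
])
blend.fit(X_train, y_train)
print("MAE of blended regressor:", mean_absolute_error(y_test, blend.predict(X_test)))
```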
Conclusion: In summary, this study illustrates a bioactivity interaction network and proposes a DL technique for building robust models for the in-silico prediction of drug bioactivity against Alzheimer’s disease.
Keywords: Drug bioactivity prediction, Alzheimer’s disease, bioactivity network, machine learning, deep neural networks, regression, LSTM-RNN.