Classification of Cattle Behaviours Using Neck-Mounted Accelerometer-Equipped Collars and Convolutional Neural Networks
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
- Rumination—the animal regurgitates partially digested feed, which is re-chewed and re-swallowed; this further breaks down the feed and improves nutrient absorption.
- Eating—the animal is ingesting food from a feed source.
- Other—the animal is engaged in any activity that is neither rumination nor eating (a minimal labelling sketch follows this list).
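To illustrate how these three classes might be encoded for a window-based classifier, the sketch below segments a raw 3-axis accelerometer stream into fixed-length windows and assigns each window the majority per-sample label. The `make_windows` helper, the 10 Hz sampling rate, the 90 s window length and the random data are illustrative assumptions, not values taken from this paper.

```python
import numpy as np

# Hypothetical integer encoding for the three behaviour classes defined above.
LABELS = {"rumination": 0, "eating": 1, "other": 2}

def make_windows(accel, labels, fs=10, window_s=90):
    """Segment a (N, 3) accelerometer stream into non-overlapping windows.

    accel    -- (N, 3) array of x/y/z samples from the collar
    labels   -- length-N array of per-sample class indices (0, 1 or 2)
    fs       -- assumed sampling rate in Hz
    window_s -- assumed window length in seconds
    Returns (windows, window_labels) where each window gets its majority label.
    """
    win = fs * window_s
    n = (len(accel) // win) * win                  # drop the incomplete tail
    X = accel[:n].reshape(-1, win, 3)
    y_per_sample = labels[:n].reshape(-1, win)
    y = np.array([np.bincount(w, minlength=len(LABELS)).argmax() for w in y_per_sample])
    return X, y

# Toy usage with random data standing in for real collar readings.
rng = np.random.default_rng(0)
X, y = make_windows(rng.normal(size=(10_000, 3)), rng.integers(0, 3, size=10_000))
print(X.shape, y.shape)  # (11, 900, 3) (11,)
```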
4. Results and Discussion
4.1. CNN Design and Performance
4.1.1. Training and Validation
4.1.2. Hyper-Parameter Tuning
4.1.3. Window Lengths
4.2. Network Reduction
4.3. Practical Implementation on Low-Power Micro-Controllers
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Study | Cattle Behaviours |
---|---|
Abell et al. (2017) [18] | Lying, Mounting, Standing & Walking |
Benaissa et al. (2019) [13] | Feeding, Lying & Standing |
Benaissa et al. (2019) [19] | Feeding, Ruminating & Other |
Diosdado et al. (2015) [20] | Feeding, Lying, Standing & Transitions between Standing and Lying |
Dutta et al. (2015) [12] | Grazing, Resting, Ruminating, Scratching or urinating & Searching |
Gonzalez et al. (2015) [17] | Foraging, Resting, Ruminating, Traveling & Other active behaviors |
Hamilton et al. (2019) [10] | Rumination & Non-Rumination |
Kasfi et al. (2016) [21] | Grazing & Other |
Martiskainen et al. (2009) [22] | Feeding, Lame Walking, Lying, Lying down, Ruminating, Standing, Standing up & Walking normally |
Peng et al. (2019) [15] | Feeding, Head butt, Licking salt, Lying, Moving, Ruminating-Lying, Ruminating-Standing & Social licking |
Rahman et al. (2016) [16] | Chewing, Grazing, Resting-Lying, Resting-Standing, Ruminating-Lying or Sitting, Ruminating-Standing, Searching, Walking & Other |
Rahman et al. (2018) [23] | Grazing, Ruminating & Standing |
Robert et al. (2009) [24] | Lying, Standing & Walking |
Smith et al. (2016) [25] | Grazing, Resting, Ruminating & Walking |
Current Study | Eating, Rumination & Other |
References
- AHDB Dairy. AHDB Dairy Statistics. 2020. Available online: https://ahdb.org.uk/dairy (accessed on 12 October 2020).
- Michie, C.; Andonovic, I.; Gilroy, M.; Ross, D.; Duthie, C.A.; Nicol, L. Oestrus Detection in Free Roaming Beef Cattle. In Proceedings of the European Conference on Precision Livestock Farming—EC-PLF 2013, Posters, Leuven, Belgium, 10–12 September 2013. [Google Scholar]
- Fricke, P.M.; Carvalho, P.D.; Giordano, J.O.; Valenza, A.; Lopes, G.; Amundson, M.C. Expression and detection of estrus in dairy cows: The role of new technologies. Animal 2014. [Google Scholar] [CrossRef] [Green Version]
- Roelofs, J.B.; Van Erp-van der Kooij, E. Estrus detection tools and their applicability in cattle: Recent and perspectival situation. Anim. Reprod. 2015, 12, 498–504. [Google Scholar]
- Afimilk/NMR. Silent Herdsman/Better Performing Cows; NMR: Chippenham, UK, 2012. [Google Scholar]
- Stangaferro, M.; Wijma, R.; Caixeta, L.; Al-Abri, M.; Giordano, J. Use of rumination and activity monitoring for the identification of dairy cows with health disorders: Part III. Metritis. J. Dairy Sci. 2016. [Google Scholar] [CrossRef] [Green Version]
- Wolfger, B.; Timsit, E.; Pajor, E.A.; Cook, N.; Barkema, H.W.; Orsel, K. Technical note: Accuracy of an ear tag-attached accelerometer to monitor rumination and feeding behavior in feedlot cattle. J. Anim. Sci. 2015. [Google Scholar] [CrossRef] [Green Version]
- Bar, D.; Solomon, R. Rumination Collars: What Can They Tell Us. In Proceedings of the First North American Conference on Precision Dairy Management, Toronto, ON, Canada, 2–5 March 2010; p. 2. [Google Scholar]
- Pahl, C.; Hartung, E.; Mahlkow-Nerge, K.; Haeussermann, A. Feeding characteristics and rumination time of dairy cows around estrus. J. Dairy Sci. 2015. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hamilton, A.W.; Davison, C.; Tachtatzis, C.; Andonovic, I.; Michie, C.; Ferguson, H.J.; Somerville, L.; Jonsson, N.N. Identification of the rumination in cattle using support vector machines with motion-sensitive bolus sensors. Sensors 2019, 19, 1165. [Google Scholar] [CrossRef] [Green Version]
- Uberoi, E. UK Dairy Industry Statistics. In House of Commons: Brief Paper; House of Commons Library: London, UK, 2020; p. 10. [Google Scholar]
- Dutta, R.; Smith, D.; Rawnsley, R.; Bishop-Hurley, G.; Hills, J.; Timms, G.; Henry, D. Dynamic cattle behavioural classification using supervised ensemble classifiers. Comput. Electron. Agric. 2015, 111, 18–28. [Google Scholar] [CrossRef]
- Benaissa, S.; Tuyttens, F.A.M.; Plets, D.; de Pessemier, T.; Trogh, J.; Tanghe, E.; Martens, L.; Vandaele, L.; Van Nuffel, A.; Joseph, W.; et al. On the use of on-cow accelerometers for the classification of behaviours in dairy barns. Res. Vet. Sci. 2019, 125, 425–433. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Zehner, N.; Umstätter, C.; Niederhauser, J.J.; Schick, M. System specification and validation of a noseband pressure sensor for measurement of ruminating and eating behavior in stable-fed cows. Comput. Electron. Agric. 2017, 136, 31–41. [Google Scholar] [CrossRef]
- Peng, Y.; Kondo, N.; Fujiura, T.; Suzuki, T.; Wulandari; Yoshioka, H.; Itoyama, E. Classification of multiple cattle behavior patterns using a recurrent neural network with long short-term memory and inertial measurement units. Comput. Electron. Agric. 2019, 157, 247–253. [Google Scholar] [CrossRef]
- Rahman, A.; Smith, D.; Hills, J.; Bishop-Hurley, G.; Henry, D.; Rawnsley, R. A comparison of autoencoder and statistical features for cattle behaviour classification. In Proceedings of the 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 24–29 July 2016; pp. 2954–2960. [Google Scholar] [CrossRef]
- González, L.A.; Bishop-Hurley, G.J.; Handcock, R.N.; Crossman, C. Behavioral classification of data from collars containing motion sensors in grazing cattle. Comput. Electron. Agric. 2015. [Google Scholar] [CrossRef]
- Abell, K.M.; Theurer, M.E.; Larson, R.L.; White, B.J.; Hardin, D.K.; Randle, R.F. Predicting bull behavior events in a multiple-sire pasture with video analysis, accelerometers, and classification algorithms. Comput. Electron. Agric. 2017, 136, 221–227. [Google Scholar] [CrossRef]
- Benaissa, S.; Tuyttens, F.A.; Plets, D.; Cattrysse, H.; Martens, L.; Vandaele, L.; Joseph, W.; Sonck, B. Classification of ingestive-related cow behaviours using RumiWatch halter and neck-mounted accelerometers. Appl. Anim. Behav. Sci. 2019, 211, 9–16. [Google Scholar] [CrossRef] [Green Version]
- Diosdado, J.A.V.; Barker, Z.E.; Hodges, H.R.; Amory, J.R.; Croft, D.P.; Bell, N.J.; Codling, E.A. Classification of behaviour in housed dairy cows using an accelerometer-based activity monitoring system. Anim. Biotelemetry 2015, 3, 1–14. [Google Scholar]
- Kasfi, K.T.; Hellicar, A.; Rahman, A. Convolutional Neural Network for Time Series Cattle Behaviour Classification. In Proceedings of the Workshop on Time Series Analytics and Applications—TSAA ’16, Hobart, Tasmania, 5 December 2016; ACM Press: New York, NY, USA, 2016; pp. 8–12. [Google Scholar] [CrossRef]
- Martiskainen, P.; Järvinen, M.; Skön, J.K.; Tiirikainen, J.; Kolehmainen, M.; Mononen, J. Cow behaviour pattern recognition using a three-dimensional accelerometer and support vector machines. Appl. Anim. Behav. Sci. 2009, 119, 32–38. [Google Scholar] [CrossRef]
- Rahman, A.; Smith, D.V.; Little, B.; Ingham, A.B.; Greenwood, P.L.; Bishop-Hurley, G.J. Cattle behaviour classification from collar, halter, and ear tag sensors. Inf. Process. Agric. 2018. [Google Scholar] [CrossRef]
- Robert, B.; White, B.J.; Renter, D.G.; Larson, R.L. Evaluation of three-dimensional accelerometers to monitor and classify behavior patterns in cattle. Comput. Electron. Agric. 2009, 67, 80–84. [Google Scholar] [CrossRef]
- Smith, D.; Rahman, A.; Bishop-Hurley, G.J.; Hills, J.; Shahriar, S.; Henry, D.; Rawnsley, R. Behavior classification of cows fitted with motion collars: Decomposing multi-class classification into a set of binary problems. Comput. Electron. Agric. 2016, 131, 40–50. [Google Scholar] [CrossRef]
- ITIN+HOCH. RumiWatchSystem: Measurement System for Automatic Health Monitoring in Ruminants. 2014. Available online: https://www.rumiwatch.com/ (accessed on 12 October 2020).
- Poulopoulou, I.; Lambertz, C.; Gauly, M. Are automated sensors a reliable tool to estimate behavioural activities in grazing beef cattle? Appl. Anim. Behav. Sci. 2019, 216, 1–5. [Google Scholar] [CrossRef]
- Borchers, M.R.; Chang, Y.M.; Tsai, I.C.; Wadsworth, B.A.; Bewley, J.M. A validation of technologies monitoring dairy cow feeding, ruminating, and lying behaviors. J. Dairy Sci. 2016, 99, 7458–7466. [Google Scholar] [CrossRef]
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. arXiv 2015, arXiv:1502.03167. [Google Scholar]
- Glorot, X.; Bordes, A.; Bengio, Y. Deep sparse rectifier neural networks. In Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 11–13 April 2011; pp. 315–323. [Google Scholar]
- Loshchilov, I.; Hutter, F. Decoupled Weight Decay Regularization. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
- Smith, L.N. A disciplined approach to neural network hyper-parameters: Part 1—Learning rate, batch size, momentum, and weight decay. arXiv 2018, arXiv:cs.LG/1803.09820. [Google Scholar]
- Prechelt, L. Early stopping-but when. In Neural Networks: Tricks of the Trade; Springer: Berlin/Heidelberg, Germany, 1998; pp. 55–69. [Google Scholar]
- Waskom, M.; Botvinnik, O.; O’Kane, D.; Hobson, P.; Lukauskas, S.; Gemperline, D.C.; Qalieh, A. Mwaskom/Seaborn. 2020. Available online: https://zenodo.org/record/3767070#.YMQgCUwRWUl (accessed on 12 June 2021).
- Denton, E.L.; Zaremba, W.; Bruna, J.; LeCun, Y.; Fergus, R. Exploiting linear structure within convolutional networks for efficient evaluation. arXiv 2014, arXiv:1404.0736. [Google Scholar]
- Jaderberg, M.; Vedaldi, A.; Zisserman, A. Speeding up Convolutional Neural Networks with Low Rank Expansions. In Proceedings of the British Machine Vision Conference, Nottingham, UK, 1–5 September 2014. [Google Scholar] [CrossRef] [Green Version]
- Hinton, G.; Vinyals, O.; Dean, J. Distilling the knowledge in a neural network. arXiv 2015, arXiv:1503.02531. [Google Scholar]
- Han, S.; Mao, H.; Dally, W.J. Deep compression: Compressing deep neural networks with pruning, trained quantization and huffman coding. arXiv 2015, arXiv:1510.00149. [Google Scholar]
- Rastegari, M.; Ordonez, V.; Redmon, J.; Farhadi, A. Xnor-net: Imagenet classification using binary convolutional neural networks. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 525–542. [Google Scholar]
- Gong, Y.; Liu, L.; Yang, M.; Bourdev, L. Compressing deep convolutional networks using vector quantization. arXiv 2014, arXiv:1412.6115. [Google Scholar]
- Blalock, D.; Ortiz, J.J.G.; Frankle, J.; Guttag, J. What is the state of neural network pruning? arXiv 2020, arXiv:2003.03033. [Google Scholar]
- Han, S.; Pool, J.; Tran, J.; Dally, W.J. Learning both weights and connections for efficient neural networks. Adv. Neural Inf. Process. Syst. 2015, 2015, 1135–1143. [Google Scholar]
- Frankle, J.; Carbin, M. The lottery ticket hypothesis: Finding sparse, trainable neural networks. arXiv 2018, arXiv:1803.03635. [Google Scholar]
- LeCun, Y.; Denker, J.S.; Solla, S.A. Optimal brain damage. In Advances in Neural Information Processing Systems; Morgan Kaufmann: San Francisco, CA, USA, 1990; pp. 598–605. [Google Scholar]
- Hassibi, B.; Stork, D.G. Second order derivatives for network pruning: Optimal brain surgeon. In Advances in Neural Information Processing Systems; Morgan Kaufmann: San Francisco, CA, USA, 1993; pp. 164–171. [Google Scholar]
- Li, H.; Kadav, A.; Durdanovic, I.; Samet, H.; Graf, H.P. Pruning filters for efficient convnets. arXiv 2016, arXiv:1608.08710. [Google Scholar]
- He, Y.; Zhang, X.; Sun, J. Channel pruning for accelerating very deep neural networks. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 1389–1397. [Google Scholar]
- Luo, J.H.; Wu, J.; Lin, W. Thinet: A filter level pruning method for deep neural network compression. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 5058–5066. [Google Scholar]
- Liu, Z.; Li, J.; Shen, Z.; Huang, G.; Yan, S.; Zhang, C. Learning efficient convolutional networks through network slimming. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2736–2744. [Google Scholar]
- Zhu, M.; Gupta, S. To prune, or not to prune: Exploring the efficacy of pruning for model compression. arXiv 2017, arXiv:1710.01878. [Google Scholar]
- ST Microelectronics. UM2526: Introduction Getting Started with X-CUBE-AI Expansion Package for Artificial Intelligence (AI) UM2526 User Manual. 2020. Available online: https://www.st.com/resource/en/user_manual/dm00570145-getting-started-with-xcubeai-expansion-package-for-artificial-intelligence-ai-stmicroelectronics.pdf (accessed on 12 June 2021).
- Saft Batteries. LS14500 Datasheet. 2019. Available online: https://www.saftbatteries.com/products-solutions/products/ls-lsh-lsp/ (accessed on 3 October 2020).
- InvenSense. MPU-6000 and MPU-6050 Product Specification Revision 3.4; InvenSense Inc.: Sunnyvale, CA, USA, 2013. [Google Scholar]
Data set size is reported as Animals/Days/Hours; Acc, Pr, Re and F1 refer to the best-performing model/device.

Study | Algorithm | Device | Animals | Days | Hours | Ground Truth | No. of Behaviours | Acc | Pr | Re | F1 |
---|---|---|---|---|---|---|---|---|---|---|---|
Abell et al. (2017) [18] | DT, RT, RF | C/E/W A | 2 | 3 | - | V | 4 | 0.76–0.97 | 0.02–0.96 | 0.78–0.94 | - |
Benaissa et al. (2019) [13] | k-NN, Naive Bayes, SVM | C/P A | 16 | - | 96 | HO V | 3 | 0.99 | 0.96–0.99 | 0.96–1.00 | - |
Benaissa et al. (2019) [19] | SVM, DT | C A | 10 | 5 | 60 | HO | 3 | 0.93 | 0.88–0.98 | 0.85–0.92 | - |
Diosdado et al. (2015) [20] | DT, SVM, K-means, HMM | C A | 6 | 6 | 34 | HO | 4 | - | 0.55–0.98 | 0.77–0.98 | - |
Dutta et al. (2015) [12] | Ensemble | C A/M GPS | 24 | 10 | - | HO | 5 | 0.96 | - | 0.97 | 0.89 |
Gonzalez et al. (2015) [17] | DT | C A/M GPS | 58 | 31 | 43 | HO | 5 | - | - | 0.90 | - |
Hamilton et al. (2019) [10] | SVM | B A | 3 | 16 | 181.8 | C | 2 | - | 0.83 | 0.89 | 0.86 |
Kasfi et al. (2016) [21] | CNN | C A | 22 | 8 | - | HO | 2 | - | 0.82 | 0.89 | 0.84 |
Martiskainen et al. (2009) [22] | SVM | C A | 30 | 30 | 95.5 | V | 8 | 0.84–1.00 | 0.78 | 0.00–0.80 | - |
Peng et al. (2019) [15] | RNN/LSTM | C A/G/M | 6 | 7 | 420 | V | 8 | 0.88 | 0.88 | 0.88 | 0.88 |
Rahman et al. (2016) [16] | Autoencoder/SVM | C A | 22 | 8 | - | HO | 9 | - | 0.40–0.82 | 0.45–0.95 | 0.63–0.85 |
Rahman et al. (2018) [23] | RF | C/H/E A | - | - | - | HO V | 3 | - | - | - | 0.89–0.93 |
Robert et al. (2009) [24] | DT | P A | 15 | 21 | 11 | V | 3 | 0.98 | - | - | - |
Smith et al. (2016) [25] | Ensemble | C A | 24 | 8 | - | HO | 4 | - | 0.77–0.97 | 0.69–0.99 | 0.73–0.98 |
Current Study | CNN | C A | 18 | 62† | 3460 | H | 3 | - | 0.83 | 0.82 | 0.82 |
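For reference, the Acc, Pr, Re and F1 columns are assumed to follow the usual per-class definitions in terms of true/false positives and negatives (TP, FP, TN, FN):

$$
\mathrm{Acc} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\mathrm{Pr} = \frac{TP}{TP + FP}, \qquad
\mathrm{Re} = \frac{TP}{TP + FN}, \qquad
\mathrm{F1} = \frac{2\,\mathrm{Pr}\cdot\mathrm{Re}}{\mathrm{Pr} + \mathrm{Re}}
$$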
Pruned Filters | FP Precision | Precision | Recall | F1 Score | Params | Compression | Operations Speed-Up | Memory (kB) |
---|---|---|---|---|---|---|---|---|
0 | FP32 | 0.84 | 0.82 | 0.82 | 170,563 | - | - | 666.2 |
48 | FP32 | 0.83 | 0.82 | 0.82 | 11,923 | 14.30 | 13.3 | 46.6 |
60 | FP32 | 0.81 | 0.81 | 0.81 | 1063 | 160.45 | 125.7 | 4.1 |
0 | FP16 | 0.83 | 0.82 | 0.82 | 170,563 | - | - | 333.1 |
48 | FP16 | 0.83 | 0.82 | 0.82 | 11,923 | 14.30 | 13.3 | 23.3 |
60 | FP16 | 0.84 | 0.83 | 0.83 | 1063 | 160.45 | 125.7 | 2.0 |
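The Params, Compression and Memory (kB) columns are mutually consistent if one assumes each weight is stored at 4 bytes (FP32) or 2 bytes (FP16); the short check below reproduces the tabulated ratios and memory figures to within rounding. The script and the bytes-per-weight assumption are illustrative, not taken from the paper's deployment toolchain.

```python
# Approximate the Compression and Memory (kB) columns from the parameter counts,
# assuming 4 bytes per weight for FP32 and 2 bytes per weight for FP16.
baseline, pruned_48, pruned_60 = 170_563, 11_923, 1_063

for params in (baseline, pruned_48, pruned_60):
    compression = baseline / params    # 170563 / 11923 ≈ 14.3, 170563 / 1063 ≈ 160.5
    mem_fp32_kb = params * 4 / 1024    # 170563 weights * 4 B ≈ 666 kB
    mem_fp16_kb = params * 2 / 1024    # half the FP32 footprint, e.g. ≈ 333 kB
    print(f"{params:>7,d}  x{compression:7.2f}  "
          f"{mem_fp32_kb:7.1f} kB (FP32)  {mem_fp16_kb:7.1f} kB (FP16)")
```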
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).