Deep Learning-Based Behavior Recognition for Group-Housed Pigs: Advancing Livestock Management with Segmentation Techniques
The increasing demand for sustainable, welfare-oriented livestock management necessitates innovative solutions for behavior monitoring, particularly in group-housed settings, where challenges such as animal density and overlapping bodies hinder traditional observation methods. This study introduces a Convolutional Neural Network (CNN)-based model enhanced with segmentation techniques to accurately classify behaviors among group-housed pigs, a context in which individual monitoring is crucial for welfare assessment, disease prevention, and production efficiency. By leveraging segmentation, the model isolates individual pigs in video footage, overcoming occlusion issues and significantly improving classification accuracy. This approach not only advances the analysis of animal behavior in dense environments but also aligns with the principles of innovation, promoting the adoption of AI-driven monitoring solutions in livestock management. Among the models compared, YOLOv11m-augmentation achieved the highest mAP@0.5 of 0.969 and a precision of 0.925. This CNN- and segmentation-based method effectively identifies key behaviors, including eating, drinking, sleeping, and standing, with particularly high precision for the behaviors most indicative of animal welfare. This research contributes to sustainable livestock practices by offering a scalable, cost-effective technology for real-time welfare assessment, potentially reducing labor requirements, enhancing farm management decisions, and promoting animal health. The study’s findings underscore the potential of integrating innovation principles with AI in agriculture, presenting a viable pathway toward sustainable livestock management practices that balance productivity with animal welfare.
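To make the described pipeline concrete, the sketch below illustrates how a segmentation-capable YOLO model could be trained and applied to per-pig behavior classification along the lines of the abstract. It is a minimal sketch, not the authors' released code: it assumes the Ultralytics framework with a pretrained yolo11m-seg.pt checkpoint, a hypothetical dataset configuration pigs.yaml whose classes are the four reported behaviors (eating, drinking, sleeping, standing), a hypothetical video file pen_camera.mp4, and illustrative augmentation hyperparameters.

```python
# Minimal sketch, not the authors' released code: per-pig behavior recognition
# with an instance-segmentation YOLO model, assuming the Ultralytics framework.
# "pigs.yaml", "pen_camera.mp4", and all hyperparameters are hypothetical.
# Assumed behavior classes in pigs.yaml: eating, drinking, sleeping, standing.
from ultralytics import YOLO

# Start from a pretrained medium segmentation checkpoint and fine-tune with
# illustrative augmentation settings (horizontal flips, mosaic, HSV jitter).
model = YOLO("yolo11m-seg.pt")
model.train(data="pigs.yaml", epochs=100, imgsz=640, batch=16,
            fliplr=0.5, mosaic=1.0, hsv_v=0.4)

# Validation reports mAP@0.5 for boxes (metrics.box.map50) and for instance
# masks (metrics.seg.map50), the metric family quoted in the abstract.
metrics = model.val(data="pigs.yaml")
print("box mAP@0.5:", metrics.box.map50, "mask mAP@0.5:", metrics.seg.map50)

# Inference on pen footage: each detected instance carries a mask that isolates
# one pig despite overlap, plus a behavior class label and confidence score.
for result in model.predict("pen_camera.mp4", stream=True, conf=0.5):
    for box in result.boxes:
        behavior = result.names[int(box.cls)]
        print(f"pig at {box.xyxy[0].tolist()} -> {behavior} ({float(box.conf):.2f})")
```

Under these assumptions, each segmented instance yields one pig with an associated behavior label, so per-animal behavior frequencies can be aggregated over time for welfare monitoring even in dense, partially occluded pens.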