Abstract

The aim of this work is to develop a computer vision-based method for the automatic recognition of nursing interactions under commercial farm conditions, using spatial and temporal information about nursing behaviour. For spatial information extraction, the spatial distribution between the mother sow and her piglets during nursing was used to detect possible nursing episodes. Sows were segmented accurately by a fully convolutional network, and udder zones were calculated dynamically from the geometrical properties of the nursing sow and the piglet length. Spatial information about the piglets was extracted within these self-adaptive udder zones. For temporal information extraction, to distinguish behaviours similar to nursing, temporal motion information describing motion intensity and an occupation index was extracted from the optical flow of the udder zones. Six sows with 64 piglets were recorded on video, each on different days postpartum. From these recordings, 507 video episodes of two sows were selected as the training set and 502 episodes of another two sows were used as the test set. The accuracy, sensitivity and specificity reached 96.4%, 92.0% and 98.5%, respectively. In addition, our method was used to recognise nursing behaviour in four extended videos of the two remaining sows, where the accuracy reached 97.6%, with a sensitivity of 90.9% and a specificity of 99.2%. The results show that the proposed recognition method represents a significant step forward in automatically identifying nursing behaviour on commercial pig farms.
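As a rough illustration of the temporal step described above, the sketch below computes two features from dense optical flow restricted to an udder-zone mask: a mean motion intensity and an occupation index (here defined, as an assumption, as the fraction of zone pixels whose flow magnitude exceeds a small threshold). It uses OpenCV's Farneback dense optical flow; the exact feature definitions, flow algorithm, and thresholds used in the paper may differ.

```python
import cv2
import numpy as np


def udder_zone_motion_features(prev_gray, curr_gray, zone_mask):
    """Illustrative extraction of motion intensity and an occupation index
    from dense optical flow inside an udder-zone mask.

    prev_gray, curr_gray : consecutive greyscale frames (uint8, same shape)
    zone_mask            : boolean mask of the self-adaptive udder zone
    """
    # Dense optical flow between consecutive frames (Farneback method).
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Per-pixel flow magnitude, restricted to pixels inside the udder zone.
    magnitude = np.linalg.norm(flow, axis=2)
    zone_mag = magnitude[zone_mask]
    if zone_mag.size == 0:
        return 0.0, 0.0

    # Motion intensity: mean flow magnitude within the zone.
    motion_intensity = float(zone_mag.mean())

    # Occupation index: share of zone pixels that are "moving";
    # the 1 px/frame threshold is an illustrative choice, not from the paper.
    occupation_index = float((zone_mag > 1.0).mean())

    return motion_intensity, occupation_index
```

In a hypothetical pipeline, these two per-frame values would be aggregated over a candidate episode and used to separate genuine nursing (sustained, widespread piglet motion at the udder) from visually similar behaviours such as piglets merely resting near the sow.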