Fine-tuning faster region-based convolutional neural networks for detecting poultry feeding behaviors
Graphical Abstract
Abstract
Poultry feeding behaviors provide valuable information for system design and farm management. This study developed poultry feeding behavior detectors using the faster region-based convolutional neural network (faster R-CNN). Twenty 50-day-old Jingfen layer pullets were kept in four experimental compartments and could move freely between adjacent compartments. Four light colors (white, red, green, and blue) were supplied to create environmental variations for detector development. A camera was installed atop each compartment to capture images of the birds. Several hyperparameters were fine-tuned to determine the optimal configuration. Based on the trade-off between detection accuracy and processing speed, the following settings were used to develop the detector: an Inception V2 feature extractor, a model pre-trained on the Common Objects in Context (COCO) dataset, a fixed_shape_resizer of 600×600 pixels, a kernel stride of 8, 300 region proposals, and a dynamic learning rate. The final detector achieved 95.7% recall, 94.2% average precision, a 94.9% F1 score, a 23.5 mm root mean square error, and a processing speed of 8.3 fps, indicating decent performance for detecting poultry feeding behaviors. With the trained detector, the temporal and spatial feeding behaviors of individual birds can be successfully characterized. It is concluded that the faster R-CNN should be a useful tool for continuously monitoring poultry feeding behaviors in group settings.
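As a minimal sketch of the configuration summarized above, the snippet below shows how such settings might be expressed as a training pipeline for the TensorFlow Object Detection API, whose pipeline format uses the fixed_shape_resizer field named in the abstract; the checkpoint path, class count, and learning-rate schedule values are illustrative assumptions, not figures taken from the study.

```python
# Sketch only: expresses the reported Faster R-CNN settings as a TensorFlow
# Object Detection API pipeline config. Paths and learning-rate values are
# placeholders, not the authors' actual training parameters.
from google.protobuf import text_format
from object_detection.protos import pipeline_pb2

config = pipeline_pb2.TrainEvalPipelineConfig()
frcnn = config.model.faster_rcnn

frcnn.num_classes = 1                                     # assumed single "feeding bird" class
frcnn.image_resizer.fixed_shape_resizer.height = 600      # 600x600-pixel input
frcnn.image_resizer.fixed_shape_resizer.width = 600
frcnn.feature_extractor.type = "faster_rcnn_inception_v2" # Inception V2 feature extractor
frcnn.feature_extractor.first_stage_features_stride = 8   # kernel stride of 8
frcnn.first_stage_max_proposals = 300                     # 300 region proposals

train = config.train_config
train.fine_tune_checkpoint = "path/to/coco_pretrained/model.ckpt"  # COCO-pretrained weights (placeholder path)
train.fine_tune_checkpoint_type = "detection"

# A step-wise ("dynamic") learning-rate schedule; the step counts and rates are assumptions.
lr = train.optimizer.momentum_optimizer.learning_rate.manual_step_learning_rate
lr.initial_learning_rate = 2e-4
for step, rate in [(50000, 2e-5), (80000, 2e-6)]:
    pair = lr.schedule.add()
    pair.step = step
    pair.learning_rate = rate

# Write the assembled pipeline to a text-format config file for training.
with open("faster_rcnn_feeding_pipeline.config", "w") as f:
    f.write(text_format.MessageToString(config))
```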