Author ORCID Identifier

https://orcid.org/0000-0003-0141-7839

Date Available

4-25-2018

Year of Publication

2018

Document Type

Doctoral Dissertation

Degree Name

Doctor of Philosophy (PhD)

College

Agriculture, Food and Environment

Department/School/Program

Animal and Food Sciences

Advisor

Dr. Jeffrey M. Bewley

Abstract

Precision dairy monitoring is used to supplement or replace human observation of dairy cattle. This study examined the value dairy producers placed on disease alerts generated by a precision dairy monitoring technology. A secondary objective was to calculate the accuracy of technology-generated disease alerts compared against observed disease events. A final objective was to determine the economic viability of investing in a precision dairy monitoring technology for detecting estrus and diseases.

A year-long observational study was conducted on four Kentucky dairy farms. All lactating dairy cows were equipped with a neck and a leg tri-axial accelerometer. The technologies measured eating time, lying time, standing time, walking time, and activity (steps) in 15-min intervals throughout the day. A decrease of ≥ 30% from a cow’s 10-d moving behavioral mean generated an alert. Alerts were assessed by dairy producers for usefulness and by the author for accuracy. Finally, the raw data were analyzed with three machine-learning techniques: random forest, linear discriminant analysis, and principal component neural networks.
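The alert rule described above lends itself to a simple rolling-baseline computation. The sketch below is an illustration only, assuming daily per-cow behavior summaries in a pandas DataFrame with hypothetical column names (cow_id, date, eating_min, and so on); it is not the technology vendor's actual algorithm or data schema.

```python
import pandas as pd

def flag_alerts(daily: pd.DataFrame, drop: float = 0.30) -> pd.DataFrame:
    """Flag cow-days on which a behavior falls >= 30% below that cow's
    10-d moving mean of the same behavior (previous days only)."""
    out = daily.sort_values(["cow_id", "date"]).copy()
    behaviors = ["eating_min", "lying_min", "standing_min", "walking_min", "steps"]
    for col in behaviors:
        # 10-d moving mean of the cow's own history, excluding the current day
        baseline = out.groupby("cow_id")[col].transform(
            lambda s: s.shift(1).rolling(window=10, min_periods=10).mean()
        )
        out[f"{col}_alert"] = out[col] <= (1.0 - drop) * baseline
    return out
```

Under this reading, any single behavior crossing its threshold would raise an alert for that cow on that day.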

Through generalized linear mixed modeling analyses, dairy producers were found to utilize the alert list when ≤ 20 alerts occurred, when alerts occurred in cows at ≤ 60 d in lactation, and when alerts occurred during the week. The longer the system was in place, the less likely producers were to utilize alerts. This is likely because the alerts were not for a specific disease but rather informed the dairy producer that an issue might have occurred. The longer dairy producers were exposed to the technology, the more easily they decided which alerts were worth their attention.
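As a concrete analogue of that analysis, a binomial generalized linear mixed model with farm as a random effect could be fit as sketched below. All variable names (utilized, n_alerts, dim, weekday, days_installed, farm) are hypothetical, and the dissertation's actual model specification and software may differ.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# df: one row per alert, with a 0/1 outcome indicating whether the
# producer acted on (utilized) the alert.  Column names are illustrative.
df = pd.read_csv("alert_utilization.csv")

model = BinomialBayesMixedGLM.from_formula(
    "utilized ~ n_alerts + dim + weekday + days_installed",  # fixed effects
    {"farm": "0 + C(farm)"},                                 # random farm intercepts
    df,
)
result = model.fit_vb()  # variational Bayes fit
print(result.summary())
```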

Sensitivity, specificity, accuracy, and balanced accuracy were calculated by comparing disease alerts against reported disease events. Sensitivity ranged from 12 to 48%, specificity from 91 to 96%, accuracy from 90 to 96%, and balanced accuracy from 50 to 59%. The high number of false positives corresponded with the lack of usefulness producers reported. Machine-learning techniques improved sensitivity (66 to 86%) and balanced accuracy (48 to 85%). Specificity (24 to 89%) and accuracy (70 to 86%) decreased with the machine-learning techniques, but overall detection performance improved. Precision dairy monitoring technologies have the potential to detect behavior changes linked to disease events.
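The four detection metrics are conventionally computed from counts of true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN). A minimal sketch of those standard definitions:

```python
def detection_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard confusion-matrix metrics for alert-versus-event comparison."""
    sensitivity = tp / (tp + fn)               # share of disease events that triggered an alert
    specificity = tn / (tn + fp)               # share of healthy cow-days with no alert
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    balanced_accuracy = (sensitivity + specificity) / 2
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "accuracy": accuracy,
        "balanced_accuracy": balanced_accuracy,
    }
```

Because disease events are rare relative to healthy cow-days, accuracy can stay high even when sensitivity is low, which is why balanced accuracy is reported alongside it.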

A partial budget was created based on the reproduction, production, and early lactation removal rate of an average cow in a herd. The per-cow results were expanded to a 1,000-cow herd for sensitivity analyses. Four analyses were run: increased milk production from early disease detection, increased estrus detection rate, decreased early lactation removal from early disease detection, and all changes in combination. Economic profitability was determined through net present value, with a value ≥ $0 indicating a profitable investment. Each sensitivity analysis was run for 10,000 iterations, with key inputs randomly drawn from previously defined distributions. If either milk production or estrus detection was improved, net present value was ≥ $0 in 76 and 85% of iterations, respectively. However, reduced early lactation removal alone never resulted in a net present value ≥ $0. Investing in a precision dairy monitoring technology that improves estrus detection rate and early disease detection was a positive economic decision in most iterations.
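The structure of that stochastic analysis can be sketched as a Monte Carlo loop over a net present value calculation. Every number below (distributions, prices, technology cost, planning horizon, discount rate) is a made-up placeholder rather than the dissertation's partial-budget input; only the shape of the computation is illustrated.

```python
import numpy as np

rng = np.random.default_rng(42)

def npv(cash_flows, discount_rate):
    """Net present value of yearly incremental cash flows (year 0 = purchase)."""
    years = np.arange(len(cash_flows))
    return np.sum(cash_flows / (1.0 + discount_rate) ** years)

n_iter = 10_000
profitable = 0
for _ in range(n_iter):
    # Key inputs drawn from illustrative distributions (hypothetical values).
    milk_response = rng.triangular(0.0, 1.5, 3.0)   # kg/cow/day from earlier detection
    milk_price = rng.normal(0.38, 0.05)             # $/kg
    estrus_det_gain = rng.uniform(0.05, 0.20)       # added estrus detection rate
    value_per_pregnancy = rng.normal(275.0, 50.0)   # $ per additional pregnancy
    herd_size = 1_000
    tech_cost = herd_size * 100.0                   # up-front cost, hypothetical
    annual_benefit = herd_size * (
        milk_response * 305 * milk_price + estrus_det_gain * value_per_pregnancy
    )
    cash_flows = np.array([-tech_cost] + [annual_benefit] * 5)  # assumed 5-yr horizon
    if npv(cash_flows, discount_rate=0.08) >= 0.0:
        profitable += 1

print(f"NPV >= $0 in {profitable / n_iter:.0%} of iterations")
```

Reporting the share of iterations with net present value ≥ $0 mirrors how the 76 and 85% figures above are framed.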

Digital Object Identifier (DOI)

https://doi.org/10.13023/ETD.2018.081
