Objectives
Bovine Viral Diarrhoea (BVD) is an infectious disease of economic importance worldwide. The production losses in BVD-affected farms have motivated many national control programmes. In 2013 Animal Health Ireland (AHI) initiated the compulsory phase of the Irish national BVD eradication programme, based on ear-notch antigen testing of newborn calves. The objective of our study was to conduct a comparative epidemiological and economic assessment of the benefits and pitfalls of introducing herd-level serological testing (Option 2) as an alternative surveillance strategy to individual-animal virus testing (Option 1).
Materials and Methods
The comparative evaluation of surveillance strategies was performed with a spatially explicit herd-level simulation model, tailored to represent the farm-level and national dimensions of the Irish cattle sector. The model comprises approximately 100,000 individual cattle farms, including their location, size, management groups and breeding schedules, contiguity at land-parcel level, and the complete record of cattle movements between Irish farms, replicated by source, target and animal age. Farm behaviour related to the retention of persistently infected (PI) animals was also included. The model was calibrated against secondary patterns reported from AHI programme data. It was run for 5 years of BVD endemicity (corresponding to 2008-2012) followed by 10 years of a control programme (2013 onwards). The simulated measures applied Option 1 for 3 years, continuing (from 2016) with either Option 1 or Option 2; the latter included a temporary return to virus testing on farms with a BVD-positive diagnosis. Key model outputs included the number of farms with BVD, the number of calves detected as PI per year, the quantity of testing applied under the various control programmes, and an assessment of pathways leading to breakdown of BVD-negative farms, e.g. 'Trojan' cow movement or across-the-fence transmission.
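The structure of such a herd-level surveillance simulation can be illustrated with a minimal sketch. Everything below (herd count, transmission rule, detection probability, parameter values) is an assumption chosen for illustration only; it is not the calibrated AHI model described above.

```python
import random

def simulate(n_herds=1000, init_prev=0.1, spread=0.02,
             detect=0.95, retain_prob=0.0, years=10, seed=1):
    """Toy herd-level BVD model (illustrative only, not the AHI model).

    Each herd is either 'positive' (holds at least one PI animal) or
    negative. Each simulated year:
      - each positive herd may infect one randomly chosen herd with
        probability `spread` (a stand-in for 'Trojan' cow movement and
        across-the-fence transmission),
      - surveillance detects a herd's PIs with probability `detect`;
        detected PIs are removed unless the farmer retains them, which
        happens with probability `retain_prob`.
    Returns the number of positive herds at the end of each year.
    """
    rng = random.Random(seed)
    positive = set(rng.sample(range(n_herds), int(n_herds * init_prev)))
    history = []
    for _ in range(years):
        # transmission step
        new = {rng.randrange(n_herds)
               for _ in positive if rng.random() < spread}
        positive |= new
        # surveillance step: detection followed by removal or retention
        cleared = {h for h in positive
                   if rng.random() < detect and rng.random() >= retain_prob}
        positive -= cleared
        history.append(len(positive))
    return history

no_retention = simulate(retain_prob=0.0)
with_retention = simulate(retain_prob=0.5)
```

Under these toy assumptions, prevalence collapses quickly when every detected PI is removed, while retention of detected PIs keeps more herds positive for longer, mirroring the qualitative behaviour reported in the Results.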
Results
If retention of detected PI calves was excluded, both surveillance options resulted in BVD eradication, with only a slight delay associated with Option 2. Costs of diagnostic testing differed at the national level and were lower for Option 1, because a BVD breakdown in a herd using serological screening would require a return to tag testing for some years. The risk of farm breakdown declined as the programme progressed. There was a tipping point in the cost ratio, depending on farm-level prevalence, beyond which the expected future costs would favour Option 2. However, when the level of PI retention reported from AHI programme data was modelled, the simulation outcomes were less optimistic: eradication was delayed, with a doubling of the control time, or was not achieved within the 10-year time window of the simulation.
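The tipping point in the cost ratio can be sketched as a simple expected-cost comparison. All monetary values and the breakdown-cost structure below are hypothetical assumptions for illustration; they are not figures from the study.

```python
def cost_option1(calves, tag_cost):
    """Annual cost of tag-testing every newborn calf in one herd."""
    return calves * tag_cost

def cost_option2(calves, tag_cost, serology_cost,
                 breakdown_prob, return_years):
    """Expected annual cost of herd-level serology: the screen itself
    plus, with probability `breakdown_prob`, reverting to tag testing
    for `return_years` after a breakdown."""
    return serology_cost + breakdown_prob * return_years * calves * tag_cost

# hypothetical parameters: 40 calves/year, EUR 3 per tag test,
# EUR 60 per herd serological screen, 3-year return to tag testing
calves, tag, sero, ret_years = 40, 3.0, 60.0, 3

for p in (0.20, 0.10, 0.05):  # declining breakdown risk over time
    o1 = cost_option1(calves, tag)
    o2 = cost_option2(calves, tag, sero, p, ret_years)
    print(f"breakdown risk {p:.2f}: Option 1 = {o1:.0f}, Option 2 = {o2:.0f}")
```

With these assumed numbers, Option 2 becomes cheaper once the annual breakdown probability falls below (cost_option1 - serology_cost) / (return_years * calves * tag_cost) = 1/6, illustrating how a declining breakdown risk eventually tips the expected future costs in favour of serological surveillance.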
Conclusions
The retention of PI animals by some farmers hinders the effectiveness of the simulated BVD control programme by lengthening the time to eradication, increasing its cost and jeopardising control efforts on BVD-negative farms. In the absence of PI retention, it would in time become cost-effective to change from individual calf virus testing to a farm-level serological surveillance approach.
- Proceedings of the 29th World Buiatrics Congress, Dublin, Ireland, 3-8 July 2016 - Oral Communication and Poster Abstracts