University of the West of England: Using machine vision technology to monitor animal wellbeing
Today’s farmer is interested not only in their animals’ physical health, but also in their emotional wellbeing. We’re not pretending these animals are not being reared for food, but we all have a responsibility to ensure animals are content, happy and healthy throughout their lives, and healthier animals deliver higher yields. Animal welfare is also often at the forefront of customers’ minds.
Mel Smith, a professor at the Centre for Machine Vision (CMV), part of the Bristol Robotics Laboratory (BRL) at the University of the West of England, has been using existing technologies in innovative ways to monitor animal wellbeing.
Tagging a pig’s ear can cause the animal pain and distress. Tags can also be ripped off, and they get dirty. Mel has been working on a way of identifying pigs without touching them at all, using the latest ideas in deep learning. In a recent trial, a drinker was adapted and fitted with a motion-activated webcam, which takes thousands of pictures of the pigs’ faces every day, feeding a computer algorithm that successfully identifies each animal with 97% accuracy. But this goes beyond facial recognition. Mel believes his work shows that pigs reveal their emotional state through facial expression. Are they happy? Are they content? Are they nervous?
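The article does not describe the model itself, but a common way to build this kind of touch-free identification is to map each face image to an embedding vector with a trained network, then match new snapshots against a gallery of enrolled animals by similarity. A minimal sketch, with toy hand-made embeddings standing in for the CNN output:

```python
import numpy as np

def identify_pig(face_embedding, gallery):
    """Return the enrolled pig whose reference embedding is most
    similar (by cosine similarity) to the probe embedding."""
    best_id, best_score = None, -1.0
    for pig_id, ref in gallery.items():
        score = float(np.dot(face_embedding, ref) /
                      (np.linalg.norm(face_embedding) * np.linalg.norm(ref)))
        if score > best_score:
            best_id, best_score = pig_id, score
    return best_id, best_score

# Toy gallery: in a real system these vectors would come from a
# trained face-recognition network, not be written by hand.
gallery = {
    "pig_01": np.array([0.9, 0.1, 0.2]),
    "pig_02": np.array([0.1, 0.8, 0.5]),
}
probe = np.array([0.85, 0.15, 0.25])  # a new snapshot, similar to pig_01
pig, score = identify_pig(probe, gallery)
```

The gallery names and embedding sizes here are illustrative only; the trial’s actual architecture and matching rule are not stated in the article.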
“You can interrogate the neural network to ask it which parts of the image it’s using to tell whether it’s a happy face or not. It produces a heat map showing the areas of the face it’s using to assess happiness. For pigs’ faces, it is around the eyes, ears and the top of the snout which relate to expression.”
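One simple, model-agnostic way to produce the kind of heat map Mel describes is occlusion sensitivity: mask each patch of the image in turn and measure how much the network’s score drops, so regions whose masking hurts the prediction most light up. A minimal sketch, using a stand-in scoring function in place of the real trained network:

```python
import numpy as np

def occlusion_heatmap(image, score_fn, patch=2):
    """For each patch, record the drop in score when that patch is
    zeroed out; larger drops mark regions the scorer relies on."""
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 0.0
            heat[i // patch, j // patch] = base - score_fn(masked)
    return heat

# Stand-in "happiness" scorer that only looks at the top-left corner,
# as if the eyes/ears/snout features sat in that region.
def toy_score(img):
    return float(img[:2, :2].mean())

img = np.ones((4, 4))
heat = occlusion_heatmap(img, toy_score, patch=2)  # heat[0, 0] is largest
```

Only the top-left cell of the heat map shows a score drop, mirroring how, in the real system, the map concentrates around the eyes, ears and top of the snout.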
Mel has been collaborating with other researchers on the potential of taking other existing technologies and applying them in new ways. In one example, a system designed to analyse aggregate particles in the construction industry has found new uses in agriculture, checking the body condition score of livestock, a measure of health and welfare for animals. Mel explains that the system uses a camera which captures both a normal image and a 3D depth image. Mounted overhead, it records data as each cow walks underneath. “We’re looking at how bony the animal is – around its hindquarter, where you have its hook and pin bones. If they’re sticking through, they have a low body condition and if they’re nice and fat and rounded, they have a high body condition.”
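The article does not give the scoring algorithm, but the idea Mel describes can be illustrated crudely: in a depth profile taken across the hindquarters, prominent hook and pin bones appear as sharp ridges, while a well-conditioned animal presents a smooth, rounded surface. A hedged sketch using surface roughness (mean absolute second difference) as a boniness proxy, with synthetic profiles standing in for real depth data:

```python
import numpy as np

def boniness(depth_profile):
    """Mean absolute second difference of a 1-D depth profile:
    sharp ridges over hook/pin bones push the value up, while a
    smooth rounded surface keeps it low."""
    return float(np.abs(np.diff(depth_profile, n=2)).mean())

x = np.linspace(0, np.pi, 50)
rounded = np.sin(x)                       # smooth, well-conditioned back
bony = np.sin(x) + 0.3 * np.sin(12 * x)   # same shape with bony ridges
```

Here `boniness(bony)` comes out far higher than `boniness(rounded)`; a real system would calibrate such a measurement against vet-assigned body condition scores rather than use a raw roughness number.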
How did Innovate UK KTN help?
Whilst many of the technologies Mel works on are not new, they are being applied in novel ways, and Innovate UK KTN has played a key role in creating new opportunities and connecting Mel to the right AgriFood organisations.
“Innovate UK KTN has had a transformational impact on helping us to deploy our Machine Vision skills to collaborate with agriculture and food industry partners. We have really benefited from the ability to network through Innovate UK KTN. Their funding expertise and knowledge of the AgriFood industry have led us to many new innovation opportunities that we would not have identified ourselves. Several of these projects have resulted in products that are now reaching a commercial stage.”
Mel Smith, Professor at the Centre for Machine Vision
“It’s been great to work with Mel and his team to help them develop new collaborations, and apply the expertise they developed in other sectors to benefit the AgriFood industry. This is a great example of how bringing together people who would not usually meet can turn into powerful connections to drive positive change.”
Dr David Telford, Head of AgriFood, Innovate UK KTN
Happier animals are more productive and deliver higher yields, so there is a clear commercial advantage. Customers are also paying ever more attention to the way animals are treated. In an industry where profit margins are often very tight, new practices which promote efficiency or boost productivity are usually welcomed.