(ORDO NEWS) — Putting cute dogs in an MRI machine and watching their brains while they watch home videos might seem like a fun pastime in its own right. As a bonus, it can also be educational.
A group of scientists did just that, using machine learning to decode the visual processing going on in the brains of two dogs. They found a surprising difference between canine and human perception: dogs are far more attuned to actions than to who or what is performing them.
This could be an important piece of the canine cognition puzzle, helping us understand how dogs' brains prioritize what they see.
“Although our work is based on only two dogs, it provides evidence that these methods work in dogs,” says neuroscientist Erin Phillips, then at Emory University and now at Princeton.
“I hope this work will help other researchers apply these techniques to dogs, as well as other animal species, so that we can get more data and a deeper understanding of how different animal minds work.”
The study, Phillips noted, was conducted on two dogs, Daisy and Bhubo. The team filmed three 30-minute videos of dog-relevant content using a gimbal and a selfie stick.
These included dogs running around and people interacting with dogs, petting them or giving them treats. Other scenes showed cars passing by, people interacting with each other, a deer crossing a path, a cat in a house, and dogs walking on leashes.
Daisy and Bhubo were shown these films in three 30-minute sessions, for a total of 90 minutes, while they lay relaxed in the fMRI machine.
This remarkable feat was made possible by training methods developed by psychologist Gregory Berns, who performed the first MRI scan of a fully awake, unrestrained dog ten years ago.
So the researchers were able to scan the brains of Daisy and Bhubo as they sat in the machine, awake, alert and comfortable, watching home movies made especially for them. Sounds pretty nice, actually.
“They didn’t even need treats,” says Phillips. “It was funny because it’s serious science and a lot of time and effort went into it, but it all came down to these dogs watching videos of other dogs and people acting pretty stupid.”
The video data was segmented by timestamps into classifiers such as objects (e.g. dogs, people, vehicles or other animals) or actions (e.g. sniffing, eating or playing).
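The segmentation step described above amounts to assigning each moment of the recording the label of whatever annotation covers it. Here is a minimal sketch of that idea; the function name, the annotation format, and the example timestamps are illustrative assumptions, not the study's actual data or code.

```python
# Illustrative sketch: turn timestamped video annotations into one label per
# fMRI scan time. The annotation format (start, end, label) is an assumption.

def labels_for_scans(annotations, scan_times):
    """Assign each scan time the label of the annotation interval it falls in.

    annotations: list of (start_s, end_s, label) tuples.
    scan_times: acquisition times in seconds.
    Returns one label per scan time ('none' if no annotation covers it).
    """
    out = []
    for t in scan_times:
        label = "none"
        for start, end, tag in annotations:
            if start <= t < end:
                label = tag
                break
        out.append(label)
    return out

# Hypothetical annotations for a short stretch of video.
annotations = [
    (0.0, 12.0, "sniffing"),
    (12.0, 30.0, "eating"),
    (30.0, 45.0, "playing"),
]
print(labels_for_scans(annotations, [5.0, 20.0, 40.0, 50.0]))
# ['sniffing', 'eating', 'playing', 'none']
```

The resulting per-scan labels are what the brain-activity data can then be matched against.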
This information, together with data on the brain activity of the two dogs, was fed into a neural network called Ivis, designed to map the brain activity onto these classifiers.
Two humans also watched the videos while undergoing fMRI scans; their data was fed into Ivis as well.
The AI matched human brain data to classifiers with 99 percent accuracy, for both object and action classifiers. Ivis fared worse with the dogs: it did not work at all for object classifiers, but for actions it matched visual stimuli to brain activity with 75 to 88 percent accuracy.
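The decoding step can be pictured as training a classifier to recover a label from a pattern of brain activity, then measuring how often it gets the label right on held-out data. The sketch below is a deliberately simple stand-in, a nearest-centroid classifier on synthetic "activity" vectors, not the Ivis network the researchers actually used; all names and numbers in it are assumptions for illustration.

```python
# Toy decoding sketch: synthetic "brain activity" vectors, one characteristic
# pattern per action, plus noise. A nearest-centroid classifier stands in for
# the study's neural network, purely to illustrate the accuracy measurement.
import random

def centroids(X, y):
    """Compute the mean feature vector (centroid) for each label."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        s = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict(cents, x):
    """Return the label whose centroid is closest to x (squared distance)."""
    return min(cents, key=lambda lab: sum((a - b) ** 2 for a, b in zip(x, cents[lab])))

random.seed(0)
patterns = {"sniffing": [1, 0, 0], "eating": [0, 1, 0], "playing": [0, 0, 1]}
X, y = [], []
for label, p in patterns.items():
    for _ in range(50):  # 50 noisy samples per action
        X.append([v + random.gauss(0, 0.3) for v in p])
        y.append(label)

cents = centroids(X[::2], y[::2])  # train on even-indexed samples
test_X, test_y = X[1::2], y[1::2]  # evaluate on the held-out odd-indexed ones
acc = sum(predict(cents, x) == t for x, t in zip(test_X, test_y)) / len(test_y)
print(f"decoding accuracy: {acc:.2f}")
```

With well-separated patterns the toy accuracy is high; the interesting result in the study is that the same kind of measurement succeeded for actions in dogs but failed entirely for objects.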
“We humans are very object-oriented,” says Berns. “English has 10 times more nouns than verbs because we have a particular obsession with naming things. Dogs seem to be less concerned with who or what they see and more concerned with the action itself.”
Dogs, he added, perceive the world quite differently from humans. They distinguish only shades of what we see as the blue and yellow parts of the spectrum, but have a higher density of motion-sensitive visual receptors.
This may be because dogs need to be more alert to threats in their environment than humans, or it may reflect their reliance on other senses, or perhaps both. Humans are highly visually oriented, but a dog's strongest sense is smell, and a much larger portion of its brain is devoted to processing olfactory information.
Correlating brain activity with olfactory cues would be a more difficult experiment, but it could be just as instructive, as would further, more detailed studies of visual perception in dogs and, eventually, other animals.
“We’ve shown that we can track activity in a dog’s brain while it watches a video and, at least to a limited extent, reconstruct what it’s looking at,” says Berns. “The fact that we are able to do this is amazing.”