The word “visionary” gets tossed about freely when speaking about inventors. In the case of Aaron Seitz, professor of psychology at UCR, his work has less to do with his own vision than it does with everyone else’s. His pioneering research in perceptual learning — essentially, training the brain to better perceive sensory input — has led to real-world applications that improve eyesight. The eye-and-brain-training software Seitz has developed has worked well on those who already have excellent vision — notably the hitters on the UCR baseball team. The National Institutes of Health (NIH) recently awarded Seitz a five-year, $1.7 million grant to research potential therapies for low vision, including such conditions as lazy eye, cataracts and dry macular degeneration. He has also founded a company called Carrot Neurotechnology, which creates vision-training video games.
Better Batters Result from Brain-Training Research
Novel brain-training research by neuropsychologist Aaron Seitz significantly improved the vision of individual baseball players and may have added up to four or five games to the win column in UCR’s 2013 season.
One of Seitz’s key discoveries was the importance of rewards in perceptual learning. “We used food- and water-deprived human subjects, and they came and sat in front of a split-screen computer display, with one eye seeing a visual stimulus while the other eye was presented with a dim, subtle, noisy pattern. Then, whenever the stimulus appeared to an eye, we gave them a drop of water as a reward.” After doing this training for a period of time, the subjects learned to distinguish between the patterns that came with a reward and those that didn’t — without actually noticing the difference. “This proved that learning could take place without attention,” Seitz says.
“At the same time, there were other published papers showing that you could train people to improve their perceptual abilities by playing action video games. There’s evidence that people learn from those games, even if they’re not specifically designed for perceptual learning. I wanted to take the knowledge I’d gained from my basic research, get together with people who create video games, and make a custom game.”
Seitz started a company with Adam Goldberg and Simon Mathew, both veterans of the video game industry. They developed a prototype of a vision training game and started testing it. Seitz brought the concept to the athletics department at UCR, and the baseball team (whose game involves a very small ball moving very fast) volunteered. “With the baseball team, we trained all position players for about 30 days. They came in for 25-minute sessions four days a week. After 30 sessions, we tested them and found that their vision had improved by 31 percent. That is, they could read the letters on the eye chart from 31 percent farther away.”
In a paper published in February, Seitz and his co-authors — including his longtime collaborator, UCR post-doc Jenni Deveau — estimated that the Highlanders won five extra games as a result of their sharper vision. “They had fewer strikeouts, scored more runs, and showed improvement across some more esoteric statistics,” Seitz says. The paper makes it clear that the improved play was directly attributable to the vision training game, he adds.
The potential applications of this software extend far beyond the sports world. Seitz is starting a large study with the Riverside police department looking at how the vision training game can impact police skills, including shooting, driving and reading license plates. “We can also do studies with helicopter pilots, or with people who suffer from schizophrenia,” Seitz says. “We’ve had success in a normal lab approach, but when you’re using specialized software that only some computers run, you’re restricted in how far your studies can go. Once you’ve got something in an application that anyone can download, that opens up that many more studies you can do — under real-world conditions.”
More about the vision training game: The two primary exercises are “static search,” in which targets appear across the screen and the player must simply click on them, and “dynamic search,” in which the targets fade into view — with a sound cue — rather than appearing all at once. “It’s not as fun a game as I’d like it to be,” Seitz allows, “but it has all the key components, and it has been demonstrated to give rise to perceptual learning. Even so, it has an addictive element, and I find that when I start, I keep playing.”