The media have jumped all over the story about a study showing that playing shoot-'em-up videogames increases visual attention skills. Could gaming really be good for you?
Well, I don’t doubt that playing games can be a good thing in some cases, and I’m not looking to join any kind of down-with-gaming posse. But there seems to be a huge hole in this study that I haven’t yet seen pointed out.
If, for instance, you read the Wall Street Journal account, this is what you learn about the various skills that seemed to improve among people who played a lot of computer games:
“In one test, a small object flashed on a computer screen for 1/160th of a second, and the volunteer had to indicate where.”
“In another test, between one and 10 small objects flashed on the screen simultaneously for a fraction of a second, too short to count them individually.”
“In another test, the researchers had volunteers indicate the location of a solid triangle in a circle on a screen filled with distracting shapes…”
Notice that one phrase keeps recurring: on a screen. The researchers conducted all their visual tests on a computer screen.
As far as I can tell, this study managed to prove that if you spend many hours in front of a computer screen playing games, you will increase your ability to detect, identify, and remember objects on a computer screen.
That may have some real value in our society, which, after all, is increasingly operated via a computer screen. But it seems a far cry from proving the very different notion that playing computer games actually improves more general visual skills. I am not a student of the science of vision, but I know there are such things as peripheral vision and depth of field, and that a computer screen, far from testing the full capacity of the amazing human sense of sight, operates in only a very narrow range of that sense.