This paper describes the development and testing of a new evolutionary robotics research test bed. The test bed consists of a colony of small, computationally powerful mobile robots that use evolved neural network controllers and vision-based sensors to generate team game-playing behaviors. The vision-based sensors convert video images into range and object-color data, which large evolvable neural network controllers use to drive the mobile robots; the networks require 150 individual input connections to accommodate the processed video sensor data. Using evolutionary computing methods, the neural network controllers were evolved to play the competitive team game Capture the Flag with teams of mobile robots. Controllers were evolved in simulation and then transferred to real robots for physical verification. Sensor signals in the simulated environment are formatted to duplicate the processed real video sensor values rather than the raw video images. Controllers receive sensor signals and send actuator commands in the same format whether they drive physical robots in a real environment or simulated robot agents in an artificial environment, so evolved neural controllers can be transferred directly to the real mobile robots for testing and evaluation. Experimental results generated with this new evolutionary robotics research test bed are presented.
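The transfer mechanism described above hinges on a shared interface: the controller sees the same fixed-length processed-sensor vector and emits the same actuator-command format in simulation and on hardware. A minimal sketch of that idea follows; all class and function names (`Environment`, `SimulatedEnvironment`, `NeuralController`, `run_step`) are illustrative assumptions, not the authors' actual API, and the linear "network" is a placeholder for the evolved controllers.

```python
# Sketch of a shared sensor/actuator interface for sim-to-real transfer.
# Assumed names throughout; only the 150-input figure comes from the paper.
from abc import ABC, abstractmethod
import random

NUM_SENSOR_INPUTS = 150  # processed range/color values from the vision system


class Environment(ABC):
    """Either a physical robot or a simulator; both expose the same format."""

    @abstractmethod
    def read_sensors(self) -> list[float]:
        """Return NUM_SENSOR_INPUTS processed sensor values."""

    @abstractmethod
    def send_actuators(self, left: float, right: float) -> None:
        """Apply left/right drive commands."""


class SimulatedEnvironment(Environment):
    def read_sensors(self) -> list[float]:
        # The simulator emits values already formatted like the processed
        # video sensor data, never raw images.
        return [random.random() for _ in range(NUM_SENSOR_INPUTS)]

    def send_actuators(self, left: float, right: float) -> None:
        pass  # a real simulator would update the simulated robot pose here


class NeuralController:
    """Placeholder linear map; the paper's controllers are evolved networks."""

    def __init__(self, weights: list[float]):
        assert len(weights) == NUM_SENSOR_INPUTS
        self.weights = weights

    def act(self, sensors: list[float]) -> tuple[float, float]:
        drive = sum(w * s for w, s in zip(self.weights, sensors))
        return drive, -drive  # left/right wheel commands


def run_step(controller: NeuralController, env: Environment) -> None:
    # One control cycle: identical whether env is simulated or physical.
    sensors = env.read_sensors()
    left, right = controller.act(sensors)
    env.send_actuators(left, right)
```

Because `run_step` depends only on the abstract `Environment`, a controller evolved against `SimulatedEnvironment` could, in principle, drive a hardware-backed implementation of the same interface without modification, which is the direct-transfer property the abstract claims.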