I. Introduction
Robotic navigation requires learning new environments, or changes to a known environment, concurrently with real-time localization of the robot within that environment. This problem, known as Simultaneous Localization And Mapping (SLAM), typically demands mathematically complex computation and the fusion of input data across many sensory modalities. Prototypical SLAM implementations are thus often realized as over-sensored, power-hungry devices, an unsuitable paradigm for low-power mobile robotics applications, where algorithms must execute at low computational and energy cost, often in the presence of high noise. Animals, however, navigate new environments and learn salient features with extremely low energy costs and limited sensory information. Accordingly, neurally-inspired approaches to SLAM are promising candidates for robust performance under severe power constraints.