Pitonakova, Lenka, and Seth Bullock. “The robustness-fidelity trade-off in Grow When Required neural networks performing continuous novelty detection.” Neural Networks 122 (2020): 183-195.
Novelty detection allows robots to recognise unexpected data in their sensory field and can thus be utilised in applications such as reconnaissance, surveillance, and self-monitoring. We assess the suitability of Grow When Required Neural Networks (GWRNNs) for detecting novel features in a robot’s visual input in the context of randomised physics-based simulation environments. We compare, for the first time, several GWRNN architectures, including new Plastic architectures in which the number of activated input connections for individual neurons is adjusted dynamically as the robot senses a varying number of salient environmental features. The networks are studied in both one-shot and continuous novelty reporting tasks and we demonstrate that there is a trade-off, not unique to this type of novelty detector, between robustness and fidelity. Robustness is achieved through generalisation over the input space, which minimises the impact of network parameters on performance, whereas high fidelity results from learning detailed models of the input space and is especially important when a robot encounters multiple novelties consecutively or must detect that previously encountered objects have disappeared from the environment. We propose a number of improvements that could mitigate the robustness-fidelity trade-off and demonstrate one of them, in which localisation information is added to the input data stream being monitored.
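For readers unfamiliar with the underlying mechanism, the sketch below shows a Grow When Required network used as a novelty detector. It follows the standard GWR formulation (nodes with weight vectors and habituation counters; a new node is inserted when the best-matching node is both a poor match and already well trained) rather than the specific architectures evaluated in the paper, and every parameter name and default value here is an illustrative assumption, not a value from the paper.

```python
import numpy as np

class GWRNoveltyDetector:
    """Minimal sketch of a Grow When Required (GWR) network used as a
    novelty detector. Standard GWR formulation; parameter names and
    defaults are illustrative assumptions, not taken from the paper."""

    def __init__(self, dim, activity_threshold=0.8, firing_threshold=0.1,
                 eps_b=0.1, eps_n=0.01, tau_b=0.3, tau_n=0.1):
        self.a_T = activity_threshold  # minimum activity for a "known" input
        self.h_T = firing_threshold    # winner must be this habituated to grow
        self.eps_b, self.eps_n = eps_b, eps_n  # learning rates: winner, neighbours
        self.tau_b, self.tau_n = tau_b, tau_n  # habituation rates: winner, neighbours
        self.w = np.random.rand(2, dim)  # start with two random nodes
        self.h = np.ones(2)              # firing counters, 1 = fully rested
        self.edges = {(0, 1)}            # topological connections between nodes

    def _neighbours(self, i):
        return [b if a == i else a for a, b in self.edges if i in (a, b)]

    def step(self, x):
        """Present one input vector; return True if it is reported as novel."""
        d = np.linalg.norm(self.w - x, axis=1)
        s, t = (int(i) for i in np.argsort(d)[:2])  # best and second-best match
        activity = np.exp(-d[s])
        # Novel: the best match is poor AND that node is already well trained,
        # so the mismatch reflects unfamiliar input rather than under-training.
        novel = activity < self.a_T and self.h[s] < self.h_T
        if novel:
            # Grow: insert a node halfway between the winner and the input,
            # rewiring the winner/runner-up edge through the new node.
            r = len(self.w)
            self.w = np.vstack([self.w, (self.w[s] + x) / 2.0])
            self.h = np.append(self.h, 1.0)
            self.edges.discard((min(s, t), max(s, t)))
            self.edges |= {(min(s, r), max(s, r)), (min(t, r), max(t, r))}
        else:
            # Adapt: move the winner and its neighbours towards the input,
            # scaled by how "rested" they are (habituated nodes move less).
            self.w[s] += self.eps_b * self.h[s] * (x - self.w[s])
            for n in self._neighbours(s):
                self.w[n] += self.eps_n * self.h[n] * (x - self.w[n])
        # Habituate the winner and its neighbours (counters decay towards ~0).
        alpha = 1.05
        self.h[s] += self.tau_b * (alpha * (1.0 - self.h[s]) - 1.0)
        for n in self._neighbours(s):
            self.h[n] += self.tau_n * (alpha * (1.0 - self.h[n]) - 1.0)
        return novel
```

In a continuous reporting setting, step() would be called on the feature vector extracted from each camera frame. The localisation-based mitigation demonstrated in the paper would correspond, in this sketch, to concatenating a pose estimate onto each input (e.g. np.concatenate([features, pose])), so that the same visual feature observed in two different places occupies distinct regions of the monitored input space.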