1. Introduction to Neuromorphic Computing
The human brain is a highly efficient biological computer: roughly 86 billion neurons linked by trillions of specialized interconnections, operating on a remarkably low power budget. Because replicating the brain's complete functionality, including human senses and emotions, still eludes researchers, the field of neuromorphic engineering emerged.
This field combines present-day knowledge of neuroscience and machine learning to design and develop artificial neural systems inspired by biological neuronal properties. Brain-inspired architectures not only help replicate the brain's functionality but also achieve low power consumption, high computation speed, and a small footprint compared with classical computer architectures. Machine learning and artificial intelligence are reshaping everyday life by raising the sophistication of automated systems, and their versatility extends even to security and surveillance. With crime rates rising daily, the heavy manual intervention and long hours required to provide security have become impractical.
Hence, the need for automatic security and surveillance systems grows daily. Although many automated threat detection systems exist, they lack adaptiveness because most of their rules are hand-programmed: they train only on sample images rather than learning from the ground up to hunt a target. By incorporating the principles that govern the human brain's functionality, chiefly those governing human learning, cooperative behavioral learning can be achieved. Applying neuromorphic architectures together with artificial neural networks realizes these principles in practice. This chapter provides insight into adaptive security for real-time threat detection of both humans and infrastructure, using state-of-the-art algorithms combined with brain-inspired architectures.
1.1. Definition and Principles
Neuromorphic computing is a computing paradigm that processes information using energy-efficient, brain-inspired methods. It offers a promising route to implementing artificial intelligence algorithms and has attracted increasing attention in recent years across application areas, including real-time threat detection. Both machine-learning-based and conventional computer vision approaches learn from training data under supervised or unsupervised conditions, and, particularly in supervised learning, the high performance of an electronic computer comes at the cost of high energy consumption. The brain, in contrast, performs complex visual processing and learning on a modest energy budget, which makes brain-inspired vision systems attractive for the design of adaptive security systems. In a real-time threat detection environment, an adaptive design allows efficient detection and localization of suspicious events.

One key principle of the brain's vision system is event-based information processing driven by spikes generated by local intensity changes in the visual input. Neuromorphic vision systems adopt this principle and process information in a fundamentally different way than conventional frame-based or video-rate approaches; they are therefore known as frameless vision sensors. A frameless sensor provides an asynchronous stream of visual information in which a pixel produces data only when its intensity changes. This event-based data stream enables low-latency visual processing when suitable algorithms run on event-based hardware. Coupling this low-latency, event-based visual input to spike-based processing and neuron models is one of the central principles of neuromorphic computing hardware architectures.
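The event-generation principle just described can be sketched in a few lines: each pixel compares its current log intensity with the level at its last event and emits a signed event when the difference crosses a threshold. The function and threshold below are an illustrative delta-modulation model, not the datasheet behavior of any particular sensor.

```python
import math

def frames_to_events(frames, threshold=0.2):
    """Illustrative event-camera model: a pixel emits a +1/-1 event
    only when its log intensity changes by more than `threshold`
    since that pixel's last event."""
    log = [[[math.log1p(v) for v in row] for row in frame] for frame in frames]
    ref = [row[:] for row in log[0]]          # per-pixel reference level
    events = []                               # (t, row, col, polarity)
    for t in range(1, len(log)):
        for r, row in enumerate(log[t]):
            for c, v in enumerate(row):
                diff = v - ref[r][c]
                if abs(diff) >= threshold:
                    events.append((t, r, c, 1 if diff > 0 else -1))
                    ref[r][c] = v             # reset only the firing pixel
    return events

static = [[[1.0] * 4 for _ in range(4)] for _ in range(3)]
moving = [[row[:] for row in frame] for frame in static]
moving[2][1][1] = 5.0                         # one pixel brightens at t=2
print(frames_to_events(static))               # []
print(frames_to_events(moving))               # [(2, 1, 1, 1)]
```

A static scene produces no data at all, which is exactly the bandwidth and latency advantage over frame-based capture described above.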
Other brain-derived principles embodied in neuromorphic chips include ultra-low-power operation and inherent parallelism, which can be exploited to map spiking neural network algorithms efficiently. As a result, neuromorphic frameless vision sensors support signal processing and deep spiking neural network configurations in real time, while specialized extensions, such as on-chip training and testing of spiking neural networks, can be integrated into the same chip. Mixed-signal electronics combined with hardware neuron models allow neuromorphic chips to perform their calculations at orders of magnitude lower power per operation than general-purpose microprocessors. Finally, their tight sensor interface and spike-based architecture enable low-latency, low-power, data-driven visual and sensor fusion.
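The neuron models such chips implement are typically variants of the leaky integrate-and-fire unit. A minimal software sketch, with illustrative, chip-independent parameter values, is:

```python
def lif_spikes(input_current, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward rest, integrates input, and emits a spike (then resets)
    when it crosses threshold. Parameter values are illustrative."""
    v, spikes = v_reset, []
    for t, i_in in enumerate(input_current):
        v += dt * (-v / tau + i_in)        # leak + integrate
        if v >= v_thresh:
            spikes.append(t)               # emit a spike...
            v = v_reset                    # ...and reset the membrane
    return spikes

# A weak constant input spikes rarely; a strong one spikes often.
weak = lif_spikes([0.15] * 20)
strong = lif_spikes([0.6] * 20)
print(len(weak) < len(strong))             # True
```

Because the neuron does work only when input arrives and communicates only via spikes, silence costs almost nothing, which is the source of the per-operation energy advantage noted above.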
1.2. Comparison with Traditional Computing
Neuromorphic computing implements brain-inspired models in a hybrid hardware and software environment. A popular neuromorphic architecture connects neurons and synapses into networks of spiking elements that mimic the processing performed in biological brains. A key characteristic of neuromorphic computing is its tolerance of inexact computation, which makes neuromorphic hardware inherently more energy efficient than traditional microprocessor cores; its spike-based processing and weight-storage efficiency also stand out against digital counterparts. Two further advantages often associated with substantial energy reductions are suitability for operation at the edge and the ability to handle complex decision-making scenarios.
Before a neuromorphic processing unit can be considered worthwhile, it must provide a significant advantage over the incumbent CPU architecture, which has already been shown to handle the task at hand. Not only must the NPU match or exceed that baseline, but its input and output layers must also be positioned to feed the NPU and to handle model-specific data forms such as spike generation and spike readout. This opens a whole new world of computing that, given the current state of AI, remains heavily dependent on developer expertise.
2. Real-Time Threat Detection in Security Systems
Modern standalone security devices for surveillance, threat detection, and control are rapidly being replaced by networks of smart cameras, partly because the standalone units must be regularly recalibrated and break down easily in the hot, dusty conditions typical of arid and semi-arid areas, especially at border control facilities located near dust-attracting installations such as power plants. Recalibration requires trained human operators who maintain the surveillance system and regularly review video in the control center or via cloud analysis. Regulatory requirements demand that the units not 'forget' fixed patterns, whether through software emulation or hardware reconfiguration, even when stationary, while still distinguishing and correctly highlighting only newly arisen threats; at present, only a neural network within the system can fulfill this requirement, using spatial object recognition and adaptive learning. Traditional neural networks, however, lack the required speed of operation and the built-in redundancy to suppress errors.
2.1. Importance and Challenges
For military and first-responder applications, early detection of relevant signals is essential for survival and decision-making. The brain handles such tasks with uniquely low latency and power, yet researchers face immense barriers in designing hardware with similar capabilities. To bridge this gap, we harness the brain's innate blend of memory and computing, drawing on extensive expertise in neuromorphic integrated circuits, compressed sensing, grid cells, and spiking neural networks. We will create silicon neuromorphic circuits that emulate the fundamental brain circuitry responsible for the earliest spatiotemporal detection, especially in cluttered environments, akin to foreground-information extraction.
We will further investigate the physical realization of cortical networks in which the interplay of three archetypes of correlative cells is pivotal to learning and decision-making. Such capabilities do not arise automatically and are therefore still lacking in our proposed implementations, which include sensory epithelium, cortex, and sensor chips. Finally, we will use these findings to guide a realistic yet speculative development of fully neuromorphic cross-cortical interactions, based on the functional insights gained and the connectivity pattern inherent in the cortex. Such constructions will allow us to mimic emergent brain behaviors and contribute to the evolution of neuromorphic architectures that can interact fully with an embodiment.
2.2. Traditional Approaches vs. Neuromorphic Computing
Building adaptable and resilient security systems to protect military and civilian installations is challenging because of the myriad threat types, constantly changing threat environments, and limited resources. In practice, legacy threat detection systems suffer high rates of both false positives and false negatives. Threat detection can be made more adaptive by integrating hierarchically organized, brain-inspired processing layers, which together turn the threat-processing system into a trainable processor. Each hierarchical stage handles a distinct aspect of threat detection, progressing from a first-order analysis of distinctive attributes, through multi-level evaluations of progressively fused sensor data, to a continuous assessment of surrounding traffic. These adaptive systems incorporate brain-inspired processing nodes implemented with neural networks. Neuromorphic event-domain processing architectures are computationally efficient, and their scalable core designs yield composable processing modules that can be readily combined, increasing both processing capability and the achievable level of threat recognition.
The combination of neuro-inspired algorithms embedded in integrated circuits enables innovative kinematic and dynamic labeling, fuzzy cognitive mapping, and spatio-temporal neural functions that detect and classify threats that are external, internal, or at the first-order root level. By affording extensive reaction capability without expending excessive defensive resources until direct action is necessary, these collaborative layers evolve the neural processing stack into a highly efficient, trainable cortex. Rapid threat assessments can be made with minimal direct human intervention, while sustained long-term assessments can include explicit feedback for training the adaptable systems. Continual retraining of each processor in a system of ten cortical processors forms the core of an evolvable early-warning risk assessment system suitable for real-time surveillance in asynchronous environments.
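As a rough illustration of this hierarchical layering (not the actual system), the staged escalate-or-drop logic might look like the sketch below, where every stage name, scoring rule, and threshold is an invented placeholder:

```python
def make_pipeline(stages):
    """Each stage adds to a running score; an observation is dropped
    as soon as the cumulative score falls below that stage's threshold,
    so cheap checks filter most traffic before expensive ones run."""
    def assess(observation):
        score = 0.0
        for name, scorer, threshold in stages:
            score += scorer(observation)
            if score < threshold:
                return (name, "dropped", score)
        return ("final", "escalated", score)
    return assess

# Hypothetical stages: attribute analysis, fused-sensor evaluation,
# and surrounding-traffic context, mirroring the progression above.
stages = [
    ("attributes", lambda o: 1.0 if o.get("moving") else 0.0, 1.0),
    ("sensor_fusion", lambda o: o.get("radar", 0) + o.get("ir", 0), 2.0),
    ("traffic_context", lambda o: -1.0 if o.get("authorized") else 0.5, 2.5),
]
assess = make_pipeline(stages)
print(assess({"moving": False}))                        # dropped cheaply
print(assess({"moving": True, "radar": 1, "ir": 1})[1]) # escalated
```

The design choice worth noting is that early, cheap stages discard most observations, so the expensive later stages (and any human review) see only a small escalated fraction.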
3. Brain-Inspired Architectures in Security Systems
Interest is growing in applying neuromorphic computing architectures, devices designed to mimic the brain's structure either at the cellular level or through network layouts, to accelerate calculations that existing von Neumann architectures handle poorly. Most often used for tasks such as pattern recognition and clustering, brain-inspired hardware has been shown to solve visualization problems quickly and to estimate parameters efficiently through in-memory computation. This ability to compute approximate solutions rapidly suggests applying neural architectures to many problems poorly suited to purely von Neumann systems, and the technology is expected to have far-reaching impacts on computing and on everyday life. Given this promise, the growing interest in this quickly evolving field is unsurprising. At the same time, the large data volumes driven by the Internet of Things and cloud computing raise exposure to, and the potential consequences of, both intentional and unintended cyber attacks. Constantly evolving threats to information systems require adaptive security systems that operate in real time with brain-like learning capabilities. Here, neuromorphic computing stands out as a viable foundation for such systems because of the fundamental principles by which it represents data and performs computation. This chapter elaborates on the key ideas for designing a complete neuromorphic solution for adaptive security, drawing on a multidisciplinary perspective spanning neuromorphic computing, security, defense architectures, and their requirements.
3.1. Overview of Neuromorphic Hardware
In an endeavor to integrate more closely with field exercises of robotic platforms, we are investigating an innovative set of distributed sensor nodes as part of a neuromorphic computing architecture. Neuromorphic computing describes a rapidly growing field of hardware design dedicated to building specialized computer hardware that emulates the computational architecture of a biological neural system. Such hardware typically leverages the parallelism and energy efficiency of spiking neural networks to solve computationally expensive tasks, such as image recognition, signal processing, and autonomous robotics, while enabling real-time, low-energy operation. Neuromorphic hardware may significantly surpass conventional accelerators in energy efficiency, with projected lower energy requirements for training spiking neural networks.
Synchronous Boolean artificial intelligence accelerators are ill-suited to the decentralized operation of real-time robotic platforms because of their power requirements, latency, bandwidth limitations, and degrading energy efficiency in large-scale distributed embedded applications. Synchronous, centralized networking solutions for such distributed platforms likewise demand bandwidth, power, latency, and security resources that decentralized neuromorphic designs avoid. The hardware attributes of biological systems therefore merit emulation and opportunistic use within a broader scope of network-centric systems. Decentralized neuromorphic systems exploit low-latency, in situ biological inspiration, with self-destructing behavior when externalized, safeguarding the secure data communication provided by a homologous interconnection network and its associated neuromorphic operations. These neurocyber nodes base their perception-execution-communication processing cycles on mapping spiking neural networks onto organic thin-film vapor-phase molecular alloy sensors, while improving closed-loop embedded software development.
3.2. Spiking Neural Networks for Threat Detection
This section demonstrates a neuromorphic spiking neural network system that detects automobile intruders for campus security applications. The proposed threat detection system consists of a silicon retina optical pre-processor, a silicon cochlea auditory pre-processor, a lateral geniculate nucleus-encoded silicon spiking neural network, and a multi-layer polarity-based spiking neural network construction algorithm for discriminating threatening automobiles from non-threatening ones. We demonstrated the system in real time with a set of commercially available electric automobiles, including the potentially most threatening rebel buggies equipped with collision-protection features, with our security staff playing the role of potential intruders. Automobiles equipped with collision-protection features, the most maneuverable rebels, are harder to detect than other intruders during gate operations; although it usually does not take long, human security officers in such scenarios often need extra attentiveness and response time to double-check actual authorization for entry.
This work presents an energy-efficient, real-time neuromorphic hardware system capable of monitoring and differentiating autonomous ground automobiles. For validation, we present prototype rebels possessing collision-avoidance features, the vehicles most at risk of violating major rules and thus requiring extra evaluation. The proposed hardware consists of a silicon retina for real-time front-view visual processing, a cochlear index encoder for visually triggered audio pre-processing, and two hidden layers of spiking neural networks for feature extraction and classification. Combining the lateral geniculate nucleus stage for detecting moving visual objects, the audio-triggered visual processing unit, and a two-hidden-layer spiking neural network with 300 and 100 neurons, the system consumes on average 10.94 mJ per combined image-audio frame. It achieves a detection rate of almost 90% and sustains a frame rate of 20 Hz when traffic allows, with a response time that grows with the number of automobiles, which is appropriate for real-time control.
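As a loose, rate-coded approximation of the two-hidden-layer feedforward pass described above (the real system uses 300- and 100-neuron spiking layers; the tiny layers and random weights here are purely illustrative):

```python
import random

def snn_layer(spike_counts, weights, threshold):
    """Rate-coded stand-in for one spiking layer: each output neuron
    'fires' in proportion to how far its weighted input exceeds the
    threshold; sub-threshold neurons stay silent."""
    out = []
    for neuron_w in weights:
        drive = sum(w * s for w, s in zip(neuron_w, spike_counts))
        out.append(max(0, drive - threshold))
    return out

def classify(spike_counts, w1, w2, threshold=1.0):
    """Two-hidden-layer pass with a winner-take-all readout over the
    output neurons (one per class: threatening vs. non-threatening)."""
    h1 = snn_layer(spike_counts, w1, threshold)
    h2 = snn_layer(h1, w2, threshold)
    return h2.index(max(h2))

random.seed(0)
w1 = [[random.uniform(0, 1) for _ in range(4)] for _ in range(6)]  # toy sizes
w2 = [[random.uniform(0, 1) for _ in range(6)] for _ in range(2)]
label = classify([3, 0, 2, 1], w1, w2)
print(label in (0, 1))                      # True
```

In the real event-driven system the inputs would be spike trains from the retina and cochlea stages rather than precomputed spike counts; the rate-coded form is only meant to show the layer structure.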
4. Case Studies and Applications
In this section, we illustrate the advantages of neuromorphic computing for real-time threat detection and explain how adaptive security systems can be achieved with brain-inspired architectures. One example is the use of neuromorphic mean-field many-body theory for network security assessment. When a classical computer uses the Metropolis Monte Carlo algorithm to approximate the spatio-temporal mean field of Ising systems, the time ordering must discriminate fluctuation regions from bifurcation points in order to determine the instability of the equilibrium points. We used the TrueNorth neuromorphic system to accelerate this simulation. Using Dirac sampling to maximize the advantage of the quantum annealer, we accelerate and optimize decision-making by setting the bistability, hopping fields, and suboptimal connections of the TrueNorth system for deep computing. As the goal of the study was to use the characteristics of quantum annealing to help deep computing bridge the exponential-time gap, the isomorphism and degeneracy of the model were revealed by studying the mean-field proximity.
We calculate the decision-making ability of the sparsely connected Hopfield network and the energy landscapes projected by the scalar parameters in order to assess several types of security weakness, providing a closer inspection of, and quantitative correlation between, the quantum solutions obtained with this framework, the topological solutions, and the causes of several classes of security vulnerability, such as privacy breaches, breaches of trust, and digital disinformation, as well as effective defenses against them.
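The energy landscape referred to above is the standard Hopfield energy, E(s) = -(1/2) Σ_{i≠j} w_ij s_i s_j. A minimal sketch of the energy and one asynchronous update sweep, on a toy three-neuron network rather than the assessment framework itself:

```python
def hopfield_energy(state, weights):
    """Hopfield network energy E = -1/2 * sum_{i!=j} w_ij * s_i * s_j;
    stable memories sit at the minima of this landscape."""
    n = len(state)
    return -0.5 * sum(weights[i][j] * state[i] * state[j]
                      for i in range(n) for j in range(n) if i != j)

def hopfield_step(state, weights):
    """One asynchronous update sweep; the energy never increases."""
    s = list(state)
    for i in range(len(s)):
        h = sum(weights[i][j] * s[j] for j in range(len(s)) if j != i)
        s[i] = 1 if h >= 0 else -1
    return s

# Hebbian weights storing the pattern (+1, -1, +1); a noisy cue relaxes to it.
pattern = [1, -1, 1]
W = [[0 if i == j else pattern[i] * pattern[j] for j in range(3)]
     for i in range(3)]
noisy = [1, 1, 1]
settled = hopfield_step(noisy, W)
print(settled)                                           # [1, -1, 1]
print(hopfield_energy(settled, W) <= hopfield_energy(noisy, W))  # True
```

The descent of this energy under updates is the property the assessment exploits: equilibrium points and their (in)stability correspond to minima and saddle regions of the landscape.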
4.1. Neuromorphic Chips in Surveillance Systems
Given their high energy efficiency, low power consumption, and real-time processing ability, neuromorphic chips can be integrated efficiently into surveillance systems for audio and visual signal processing. In scenarios where security personnel must identify audio signals or samples immediately, such as police officers receiving alerts in their vehicles based on big-data analysis, neuromorphic chips can lower latency and energy consumption. In current surveillance system designs, raw data feeds from the sensors used for image capture and clip extraction are transferred to processors for signal processing before any subsequent analysis. This consumes significant time and energy and creates privacy exposure because public transfer networks are used.
If the sensors are instead integrated with preprocessors that make decisions locally at a given level of accuracy, only conceptual information needs to be transferred to upper-level processors. Neuromorphic chips can serve as preprocessors for low-level tasks such as sound and voice recognition or event detection; they can also launch a first round of active learning to automatically trawl new data and anchor influential samples. Then only the hidden categories and target features within captured clips that carry important information need be transferred to higher-level processors for additional analysis. Neuromorphic chips can therefore not only reduce transmission time and save energy significantly, but also preserve more privacy and protect surveillance data from malicious access.
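A minimal sketch of this local-decision filtering, with a hypothetical scoring function standing in for the on-chip model:

```python
def edge_filter(clips, score_fn, confidence=0.8):
    """Local-decision preprocessing sketch: score each captured clip
    on-device and transmit only a compact label for clips the node is
    confident about, instead of streaming raw data upstream."""
    transmitted = []
    for clip_id, clip in clips:
        label, score = score_fn(clip)
        if score >= confidence:
            transmitted.append((clip_id, label))   # metadata only
        # low-confidence clips could be queued for on-device learning
    return transmitted

# Hypothetical scorer: peak "loudness" above 10 means a glass-break event.
scorer = lambda clip: ("glass_break", 0.9) if max(clip) > 10 else ("quiet", 0.3)
clips = [("c1", [1, 2, 3]), ("c2", [4, 50, 2])]
print(edge_filter(clips, scorer))                  # [('c2', 'glass_break')]
```

Only the event label crosses the network; the raw samples never leave the node, which is where the privacy and bandwidth savings described above come from.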
4.2. Adaptive Security Systems in IoT Environments
Neuromorphic technology can now provide lower-power architectures for real-time edge devices under both normal and cyber-attack conditions in the IoT. It enables applications that infer the state of sensor data and respond to those inferences with brain-like speed and accuracy, offering a way to protect against cybersecurity threats. Neuromorphic devices respond faster, consume less power, and are less prone to overfitting than traditional deep learning models, which suits the continuous sensor data streams that IoT devices produce. In IoT edge devices, surveillance tasks are crucial for responding proactively to threats. Deploying neuromorphic chips alongside signal processors and control functions at the network edge could provide both the surveillance capability and the real-time response needed for adaptive electronic security systems. Such a chip integrates programmable learning with custom inference over classical signal processing. Brain-inspired algorithms respond very quickly to features in incoming data and make decisions in near real time; for the same reason, the device does not require a constant connection to a parent server in order to function.
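The kind of server-free, on-device inference described above can be illustrated with a tiny streaming anomaly monitor. The running-statistics method (Welford's algorithm) and the 4-sigma threshold are our illustrative choices, not a neuromorphic algorithm per se:

```python
class StreamAnomalyDetector:
    """On-device anomaly monitor: learns a sensor's normal range online
    (running mean/variance via Welford's algorithm) and flags readings
    far outside it, with no parent server required."""
    def __init__(self, sigmas=4.0, warmup=10):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.sigmas, self.warmup = sigmas, warmup

    def score(self, x):
        # Judge against the statistics learned so far...
        anomalous = False
        if self.n >= self.warmup:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.sigmas * std
        # ...then fold the reading into the running statistics.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = StreamAnomalyDetector()
readings = [20.0, 20.1, 19.9, 20.2, 19.8, 20.0, 20.1, 19.9, 20.0, 20.1, 90.0]
flags = [det.score(r) for r in readings]
print(flags)        # only the final, out-of-range reading is flagged
```

The detector judges each reading before updating its statistics, so a large outlier cannot inflate the learned variance and mask itself.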
5. Future Directions and Challenges
Given the wide array of threats and the constant evolution of attack strategies, combining heuristics, signature-based methods, and sophisticated artificial intelligence places heavy burdens on current and future security systems. Neuromorphic computing platforms can provide immense analog processing capability and self-adaptation for identifying complex patterns and anomalies, but they are still in their infancy by comparison. They are forecast to grow rapidly in capability and complexity, however, as large investments from governments and technology companies show. The key contribution of the work presented here is a security system built on biologically inspired mechanisms such as fast learning, low power at low latency, noise tolerance, and non-deterministic computation, as the threat environment shifts its requirements from the sheer amount of data processed to precision and accuracy. Given these capabilities, the path from neuromorphic vision technology to threat detection, and to security systems in general, is an extremely powerful one. The flexibility and programmability of neuromorphic architectures make it possible to implement algorithms such as linear threshold functions, edge and motion detectors, and attention pointers; algorithms such as deep belief networks, convolutional hidden Markov models, and event flows can be realized as well. All of these crucial vision building blocks can be further customized to extract security-relevant features, and we envision that they can evolve further in an unsupervised manner. The emerging understanding of neurons and neural networks with respect to security should be developed further so that neuromorphic networks can innately understand threats and improve with experience. Such systems should also exhibit ultra-low post-manufacture latencies.
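The linear threshold function mentioned above is the simplest of these building blocks: a unit that fires when a weighted sum of its inputs crosses a bias. A two-line sketch, with an AND gate as the classic example:

```python
def linear_threshold(weights, bias, inputs):
    """Linear threshold unit: outputs 1 iff w . x + b >= 0.
    This is the primitive from which larger detectors are composed."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias >= 0 else 0

# Two-input example: fires only when both inputs are active (an AND gate).
and_unit = lambda a, b: linear_threshold([1, 1], -1.5, [a, b])
print([and_unit(a, b) for a in (0, 1) for b in (0, 1)])   # [0, 0, 0, 1]
```

Edge and motion detectors are, at heart, arrays of such units with spatially or temporally structured weights.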
More importantly, since neuromorphic architectures can amplify exploitable hardware side channels, allowing an attacker to efficiently solve sophisticated security puzzles, security mechanisms against side channels should be provisioned, and provisioned by these architectures themselves.
5.1. Ethical Considerations in Neuromorphic Security Systems
While advances in neuromorphic technologies can solve some of the most critical computing problems and uncover potential as yet unrealized, it is important to accompany these advances with meaningful communication among the public, the technical and scientific communities, and policy- and decision-makers, to ensure development and deployment aligned with social and ethical values. Special consideration is warranted for applying neuromorphic learning architectures to adaptive security threats in physical systems. Neuromorphic architectures and techniques promise powerful innovations in real-time threat detection and other security-relevant applications, but they require careful consideration of societal values in these and other domains. Intellectual values require substantial rethinking as neuromorphic learning tools grow increasingly powerful, and opportunities exist to promote trust in their use.
Ethical consideration in neuromorphic security systems, as applied to adaptive security defense, spans five areas: system accuracy, system robustness, human intervention, system learning, and the direct application of force. Where neuromorphic processing makes a threat determination, high accuracy is critical. Development of neuromorphic learning systems for threat recognition and attack inhibition should demand that the technology scale to real-world applications and deliver security improvements in practical, well-behaved ways: no unwanted collateral damage, and respect for civil and privacy rights. Practical deployment will require continuing examination of system robustness. To address the full range of adaptive threats, and to support precise security responses where and when they are required, training data must accurately cover the full range of potential adversarial approaches. A capacity to preemptively defend against adaptive hardware attacks will make neuromorphic systems particularly able to protect themselves and other important security systems from malicious attack and misuse.
5.2. Scalability and Integration Issues
Neuromorphic informatics integrates the brain's native information-processing paradigm with advanced micro- and nanotechnologies, using minimal computational circuit elements to couple smart adaptive data processing with cognitive informatics. However, the complexity of the information handled by cognitive architectures continues to double, demanding capabilities many orders of magnitude beyond present technologies. Efficient spatiotemporal information representation and transformation are the essence of these systems. Large-scale neuromorphic systems require such processing in real time, but the scaling limits of present circuits prevent the realization of multi-billion-gate systems. In some scenarios, these components and their parallel interconnections can also nullify the intrinsic ultra-low-power and small-area benefits of emerging nanotechnologies.
Recently, supercomputer performance issues have been addressed by concurrent heterogeneous processing in hybrid designs that combine multicore general-purpose processors with many-core simple processors. However, their power consumption and chip area are very large. The dominance of floating-point operations in graphics applications can be reduced by using simpler custom elements combined with dedicated neuromorphic kernel libraries. Such application-specific hardware libraries, however, reduce chip programmability and require additional off-chip data-transfer bandwidth between the main processor and the kernel processors. The latency and power consumption of these parallel connections can also bottleneck real-time performance. None of these approaches scales well to advanced sub-45 nm accelerators, and none has been adopted for sensor networks or very-large-scale integration applications.